Tag Archives: Images
Terri Loewenthal uses special reflective optic lenses to project multiple landscapes in one frame, like this image taken in Lone Rock, Arizona.
A California transplant, Loewenthal intended her photo series to celebrate the spirit of the American West. Above, a sloping blue mountain in Lundy Canyon, California.
Loewenthal takes her Psychscapes images on camping trips—like this one to Granite Mountain, California—meaning she hikes carrying all her photo equipment. Shooting outdoors can get “pretty precarious” at times, she says.
What began as a California-focused project has expanded to include more states in the American Southwest. Above, a mountainous terrain captured in Peach Springs Canyon, Arizona.
Loewenthal has been taking pictures for the Psychscapes series for the past nine months, but she’s been planning the project for years. “Sometimes it fails,” she says. Above, a successful photograph taken in Tonopah, Nevada.
Loewenthal uses a Mamiya 645, a medium format camera that allows her to swap out the film back for digital depending on the type of image she wants to make. The result is images like this one taken in Whale Peak, California.
Landscape photography, like this image from Lassen, California, has been a change for Loewenthal, who usually takes portraits. “Working with people has been an exploratory process,” Loewenthal says. “You’re always aiming for the moment when you forget the camera is around. It’s the same when taking Psychscapes.”
Loewenthal says Psychscapes was inspired by the autonomy in painting, especially the ability to separate the color from the subject. Above, a dramatic shot from Yosemite, California.
A colorful sky in San Gabriel Peak, California. Loewenthal says this kind of photography is a “playful process.”
Loewenthal says the best Psychscapes images, like this one from Thunder Mountain, California, aren’t taken from a peak. “It’s nice when there’s a mix of far away landscape and nearby,” she says. “Just far away places are less interesting.”
Loewenthal plans to spend more time in Arizona this summer producing Psychscapes. Here’s a picture she made earlier this year in Diamond Peak, Arizona.
A mystical pool of water in Buck Creek, California.
Loewenthal uses filters as paint to color her photographs, like this rosy image from Buck Creek, California.
“To have a psychedelic experience is to free your mind from its normal constraints,” Loewenthal explains. “When I had the idea for these images, I was able to shift the colors of the natural world in my mind.” She took this photograph in Diamond Peak, California.
The deep learning method uses a process the company calls “image inpainting.” It can reconstruct images that are missing pixels, and can remove unwanted content from a photo and replace it with a computer-generated alternative.
“Our model can robustly handle holes of any shape, size, location, or distance from the image borders. Previous deep learning approaches have focused on rectangular regions located around the center of the image, and often rely on expensive post-processing,” the NVIDIA researchers stated in their research paper. “Further, our model gracefully handles holes of increasing size.”
For example, when you use Nvidia’s technology on a blank space where there is supposed to be a nose, instead of filling the space with its surroundings, the software adds a computer-generated nose.
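To see why that distinction matters, it helps to look at what the simplest kind of inpainting does. The sketch below is not NVIDIA’s learned partial-convolution model—it’s a naive diffusion fill that patches a hole purely from its surroundings, the exact approach the deep learning method improves on. The function name and parameters are illustrative assumptions.

```python
import numpy as np

def naive_inpaint(image, mask, iterations=200):
    """Fill masked pixels by repeatedly averaging their four neighbors.

    image: 2-D float array; mask: boolean array, True where pixels are missing.
    A toy diffusion fill -- it can only smear in surrounding colors, so it
    could never reconstruct a missing nose the way a learned model can.
    """
    out = image.copy()
    out[mask] = 0.0  # zero out the hole before iterating
    for _ in range(iterations):
        # Mean of the four neighbors; edges handled by replicate-padding.
        padded = np.pad(out, 1, mode="edge")
        neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
        out[mask] = neighbors[mask]  # only missing pixels are updated
    return out

# A flat gray image with a square hole fills back in from its borders.
img = np.full((32, 32), 0.5)
hole = np.zeros_like(img, dtype=bool)
hole[12:20, 12:20] = True
restored = naive_inpaint(img, hole)
```

On a uniform background this works fine; on structured content like a face, diffusion produces a blur, which is the gap NVIDIA’s approach is meant to close.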
The company hasn’t offered a timeline for when this technology could be released, but it does see it someday being implemented in current photo-editing software.
Nvidia, founded in 1993, makes graphics chips found in computers, video games, and even self-driving cars. The company currently dominates the AI scene, and it made its Fortune 500 debut last year at No. 387. Cofounder and CEO Jen-Hsun “Jensen” Huang was also named Fortune’s Businessperson of the Year for 2017.
Watch the video from Nvidia about this state-of-the-art technique below.
When Google launched its Pixel 2 flagship smartphone last year, it included something of a surprise: A co-processor called Pixel Visual Core, the company’s first homegrown, consumer-facing piece of silicon. And while that feels like a momentous foray, the co-processor has lain dormant for months. Monday, Pixel Visual Core goes to work.
As it turns out—and as Google had nodded at previously—the hidden chip inside every Pixel serves a narrow but critical purpose. It will use its eight custom cores, its ability to crunch 3 trillion operations per second, all in the service of making your photos look better. Specifically, the photos you take through third-party apps like Instagram, WhatsApp, and Snapchat.
Those are the three partners on board for the Pixel Visual Core switch-flipping; since it’s open to all developers, more will presumably follow. They’ll all gain the power to produce Google’s HDR+ images, photos that rely on a series of post-processing tricks to make images shot with the Pixel appear more balanced and lifelike. Photos taken with the Pixel Camera app have already benefited from HDR+ powers since launch—that’s one reason Pixel 2 earned the highest marks yet given to a smartphone by industry-standard photo-rater DxOMark. But Pixel Visual Core will extend the feature to the streams, feeds, and snaps of Pixel owners as well, after an update that will roll out early this week.
To understand why Google would devote its first homemade smartphone processor to a relatively narrow function—not just photography, but HDR+ specifically—it helps to understand the importance of HDR+ to the Pixel’s photo prowess. For starters, it’s not the HDR you’re used to.
“HDR+ actually works shockingly differently,” says Isaac Reynolds, product manager for Pixel Camera. Where HDR essentially tries to combine three or so simultaneous exposures for the best result, HDR+ takes up to 10 identical underexposed shots. “We take them all and chop them into little bits, and line them on top of one another, and average the image together,” says Reynolds, who ticks off reduced noise and improved color as just two of the benefits.
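Stripped of the alignment and per-tile merging Reynolds describes, the core idea reduces to something you can simulate in a few lines: averaging N noisy frames of the same scene cuts random sensor noise by roughly the square root of N. This is only a sketch of the statistical principle, not Google’s pipeline; the scene values and noise level are made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a burst of 10 identical underexposed frames with sensor noise.
scene = np.full((64, 64), 0.2)  # "true" underexposed brightness
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(10)]

# With a static, pre-aligned scene, merging is just a mean across frames.
merged = np.mean(burst, axis=0)

# Averaging N frames cuts random noise by about sqrt(N) -- here ~3x.
single_noise = np.std(burst[0] - scene)  # roughly 0.05
merged_noise = np.std(merged - scene)    # roughly 0.016
print(single_noise, merged_noise)
```

The chopping-into-tiles step exists so that moving subjects can be aligned piece by piece before this averaging happens; without it, a static-scene mean like the one above would ghost anything in motion.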
That’s not just hype, or at least not entirely. HDR+ really does have tangible benefits—especially in Google’s implementation.
“HDR+ technology is a very good technology for noise and data preservation. This removes the noise in the picture,” says Hervé Macudzinski, manager of DxOMark.com. “That enables Google to provide a nice picture with low-level noise and high-level detail.”
You can see an example of what that means in the below before-and-after shots, with the usual caveat that Google provided them, and your own experience may vary.
The various benefits of HDR+ are also more or less pronounced depending on the conditions of the shot you’re taking. It especially helps bring clarity to low-light images, or gives an assist if you for some reason take a portrait with the sun at the subject’s back.
Google’s not the only company capable of this particular trick, but its execution clearly stands apart.
“The HDR+ is very impressive because they did something very efficient,” says Macudzinski. “If you want to do that, it’s going to be optimized and very powerful.”
Pixel Visual Core will also power two related photographic enhancements: RAISR, a technique to sharpen zoomed-in shots, and Zero Shutter Lag, which is exactly what it sounds like.
Until now, these optimizations have been off limits for third-party developers. Photos taken within the Instagram app, for instance, look a bit muddled compared to those taken with the Pixel’s native camera app. Which is where Pixel Visual Core comes in.
Sharing the Wealth
The primary benefit of the Pixel Visual Core, now that it’s on? You still won’t even notice it, says Ofer Shacham, the chip’s engineering manager.
“If we look at HDR+ as a key benchmark for us, it gives us the ability to run five times faster than anything else in existence, while consuming about 1/10th of the energy of the battery. We can put it under the hood,” says Shacham. “We basically hide it. That’s what enables every developer to use it, while not consuming energy from the battery, and even better, reducing the energy consumption from the battery while those applications take pictures.”
That also hints at why Google decided to go it alone with Pixel Visual Core, rather than rely on the powerful Snapdragon 835 processor that handles the bulk of the Pixel 2’s computational needs. The Pixel Visual Core offers not just customization, but flexibility.
“Google in a sense is a software and algorithm company,” says Shacham. “We want something that allows us to rapidly innovate, rapidly change the algorithm, rapidly improve it.”
To that end, the Pixel Visual Core is also programmable. That means while it works primarily in service of HDR+ today, it could go toward making other applications zip in the future, a possibility that Shacham acknowledges, while declining to go into detail on what sorts of use cases Google envisions.
More broadly, though, the Pixel Visual Core represents Google’s first foray into an increasingly common trend of smartphone manufacturers rolling their own silicon, giving the company tighter control over its product and weaning it off of chip giant Qualcomm.
“I think it’s significant in that, first off, Google is an advertising company, who is also an operating system provider, and they are going more deeply vertical in what they’re doing by adding semiconductor features to enhance the experience,” says Patrick Moorhead, president of Moor Insights & Strategy. “Any time somebody in software gets into hardware, interesting things happen—as in interesting really good, or interesting really bad.”
It would also make sense, Moorhead says, for Google to extend its processor plans beyond Pixel Visual Core. Microsoft uses a custom system-on-a-chip for the Xbox. Apple’s A series SoC has contributed greatly to the iPhone’s dominance. And with Google having poached a key Apple chip designer last summer, it seems unlikely that an HDR+ coprocessor is the end of the line.
For now, though, Pixel 2 owners can look forward to adding an HDR+ veneer to their social media pics—while waiting for Google’s broader ambitions to come more fully into focus.
A fresh batch of images straight from the New Horizons downlink give us just what we’ve been waiting for: color views of Pluto! Ridiculously high resolution detail! Strange new snakeskin textures! Plus a first look at how methane is involved in shaping these crazy ice landscapes.