For the first time in human history, an entire generation has no memory of life before exceptional photographs could be taken by anyone. We are now more attuned to the quality of photography, and more literate in what that means (and skilled in retouching), than ever before. Photographic literacy is a form of communication, and even the smallest children are adept. What’s about to happen next, though, is that this retouching and tweaking skill set will likely become irrelevant.
The newest iPhones already showcase how the camera “knows” best (as did previous generations of the Pixel), but now we’re seeing truly mind-blowing capability on Google’s Pixel 3 and Pixel 3 XL. Both phones retain the same 12.2MP sensor with 28mm equivalent lens—hardware that debuted on the Pixel 2 and 2 XL. However, Google added dual front-facing cameras for selfies, including a very wide 19mm equivalent that lets the shooter frame a shot using the screen and zoom out (which switches to the wider lens) so that several people can fit in a shot composed at arm’s length.
But the bigger breakthrough is that, like Apple, Google is using computational photography to shoot multiple shots at once, then welding highlights, blacks, and color into a single, usable image. Even Top Shot—a feature that shows you the best shot if you just hold down the shutter—is busily analyzing every single image to ensure that it picks the crispest capture, fusing up to 15 shots into a single HDR+ image and then yielding “The One” (rather than several images) with the truest color, sharpest focus, and so on.
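Google hasn’t published the internals of Top Shot or HDR+, but the core idea—score every frame in a burst for sharpness, then fuse the frames with the crispest contributing most—can be sketched in a few lines. Everything here (the Laplacian sharpness score, the weighted average) is an illustrative stand-in, not Google’s actual pipeline:

```python
import numpy as np

def sharpness(frame):
    # Variance of a simple Laplacian response: higher = crisper detail.
    lap = (-4 * frame
           + np.roll(frame, 1, 0) + np.roll(frame, -1, 0)
           + np.roll(frame, 1, 1) + np.roll(frame, -1, 1))
    return lap.var()

def fuse_burst(frames):
    # Weight each frame by its sharpness, then average: blurry frames
    # (camera shake, subject motion) contribute less to the fused image.
    weights = np.array([sharpness(f) for f in frames])
    weights = weights / weights.sum()
    fused = np.tensordot(weights, np.stack(frames), axes=1)
    # Return the fused image plus the index of "The One" (the crispest frame).
    return fused, int(np.argmax(weights))
```

The real system also has to handle alignment, ghosting, and faces—but the shape of the problem (rank the burst, fuse it, surface one winner) is the same.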
Shoot while holding down the shutter with “motion enabled,” and the Pixel 3 can generate an animated, shareable GIF of the sequence. This is slightly different from Apple’s Live Photos; Pixel 3’s motion shots look like animations—and they’re more interesting as a result.
During testing, we got the best results with HDR set to auto: rather than a single RAW image, you get a very balanced JPG, with color saturation that’s beautiful but never over-boosted or artificial looking.
Photographing people is fun with this phone, too, because Top Shot captures its series while “looking” for smiles and expressions. Further, it also creates the effect of fill flash (without actually triggering the phone’s flash) to “warm up” a subject’s skin tone, regardless of the light in the room. Since so much indoor lighting these days is harsh, the effect is to reduce the blue cast that makes human skin look cadaver-esque.
Portrait mode on the Pixel 3 (much like on iPhone Xs and Xs Max) allows you to adjust the blurriness of the background after you snap the image. Unique to Pixel 3, you can also change the background to black and white. It’s an interesting look that, especially with a low-light background, is less odd-looking than Apple’s Stage Lighting.
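Adjusting the blur after the fact works because the phone stores a depth estimate alongside the photo, and the blur is re-rendered from it on demand. A toy version—box blur plus a per-pixel depth mask, with `strength` as the hypothetical post-capture slider—looks like this (the real pipeline uses learned depth and far better blur kernels):

```python
import numpy as np

def portrait_blur(image, depth, focus_depth, strength):
    # Box-blur the whole image, then blend per pixel: the farther a
    # pixel's depth is from the subject's, the more of the blurred
    # version shows through. `strength` is the after-the-fact slider.
    blurred = image.copy().astype(float)
    for axis in (0, 1):
        blurred = (blurred + np.roll(blurred, 1, axis)
                   + np.roll(blurred, -1, axis)) / 3.0
    mask = np.clip(np.abs(depth - focus_depth) * strength, 0.0, 1.0)
    return (1 - mask) * image + mask * blurred
```

Because the depth map is kept, calling this again with a new `strength` re-renders the bokeh without retaking the shot—which is exactly what the post-capture slider does.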
While we noted the ability to shoot non-human subjects using Portrait Mode with the new iPhones, Google’s Pixel 3 also allows this, and if anything it felt more versatile, locking focus more readily for this application.
You can see the power of this tool from our shots of apples being crated at a local orchard, where the bokeh effect of Portrait Mode really brings the right mood to the subject. Without the blur, the shot’s “agricultural.” With it, it’s painterly and autumnal.
We also took a few snaps using Portrait Mode at our local rehab garage, where the lot’s always full of interesting near-junked cars with at least two wheels in the grave. With these, you can really see the power of Portrait Mode to bring the subject of the shot forward and blur the background. We found, repeatedly, that Portrait Mode wasn’t tripped up trying to identify the subject. Even with Apple’s improved tech on the iPhones, the Pixel 3 was frequently, simply better.
While it’s still just in beta, the latest version of Google’s Camera app—which we downloaded to shoot in Night Mode—is quite remarkable. The sample above was snapped at 9PM, and the stars in the sky are incredibly clear. We used a tripod to stabilize the phone, but what’s interesting about this shot is that the camera’s using computational tech here too. Beyond just taking long exposures, it’s also merging these shots, shifting pixels between frames so the details align correctly. You can do this with a series of night exposures taken with a “real” camera, but that requires a lot more effort after the fact. What we hope to see coming soon, then, is an astrophotography mode, where the fusion is centered around keeping the night sky dark and only brightening the piercingly bright objects. If you could take a series of two-second exposures and sandwich these, the effect might be astonishing.
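The “merge and align” step described above is, at its simplest, registration plus averaging: find the shift that lines each exposure up with a reference frame, then average the aligned stack so the stars add up while sensor noise cancels. This brute-force integer-shift sketch is a toy stand-in for the camera’s (sub-pixel, tile-based) alignment:

```python
import numpy as np

def best_shift(ref, frame, max_shift=2):
    # Brute-force search for the integer (dy, dx) that best registers
    # `frame` onto `ref`, by minimizing mean squared error.
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.mean((ref - np.roll(frame, (dy, dx), (0, 1))) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def stack_exposures(frames):
    # Align every frame to the first, then average: real detail (stars)
    # adds up coherently while random sensor noise averages toward zero.
    ref = frames[0]
    aligned = [ref] + [np.roll(f, best_shift(ref, f), (0, 1))
                       for f in frames[1:]]
    return np.mean(aligned, axis=0)
```

Doing the same thing with a “real” camera means registering and stacking the frames yourself in post—which is exactly the extra effort the phone is automating away.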
Then there’s video. We mounted the Pixel 3 on a bike helmet and then took to a muddy two-track, purposely filming into the sun. Despite the speed, the motion, and the rough terrain, the video is entirely free of the warping effects that typically occur while shooting moving footage with a cellphone, and it also mutes vibration that might otherwise make this footage seasick-inducing to view.
Not only is the stabilization astonishing (and this is just 1920 x 1080, not 4K), so is the color accuracy, and the balance of light and shadow, even with frequent lens flare as the sun winks in and out of the trees. It may not be as rugged as a GoPro (its IP68 rating means it can withstand a dunk in about ten feet of water for a half hour), but there are plenty of cases that would armor your Pixel 3 for harsher environments.
And given that Google says Pixel 3 photos and video get free online storage through 2022, there’s also an incentive to shoot more.
Images by Michael Frank