“The latest Pixel 4 flagship phones from Google include several camera improvements thanks to computational photography”
Google launched the Pixel 4 and Pixel 4 XL at an event in New York earlier this week. The audience wasn’t really surprised, since pretty much everything about the devices had already leaked online. However, the company did have a few camera features it was proud of. Ever since the first Pixel, Google has focused on two things: the camera and the software. The previous Pixel 3 and 3a series were touted as some of the best camera smartphones ever, and that trend continues with the Pixel 4 this year.
With the Pixel 4 series, Google has included two rear cameras and added four significant new camera features. More than just sensors, the Pixel cameras are about computational photography and the usual Google magic. Let’s take a look at the four Pixel 4 camera features you definitely can’t miss.
Dual Exposure controls
The Pixel 4 and Pixel 4 XL come with a feature called Dual Exposure, which Google says is a first for smartphones. Dual Exposure gives users separate controls for two exposures, which can be used to create better-looking images: one slider for highlights and one for shadows. The shadows slider doesn’t change the exposure itself; instead, it adjusts the tone and mostly affects the darker parts of the scene.
This is a cool new feature that should let users capture some amazing silhouettes and artistic-looking images from the Pixel 4, and it is exclusive to the new phones.
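Google hasn’t published how the sliders are implemented. As a rough illustration of the idea, though, the shadows slider can be modelled as a tone-curve tweak that lifts dark pixels far more than bright ones, while the highlights slider behaves like a plain exposure gain. The function name and the gamma-style curve below are assumptions for illustration only:

```python
def dual_exposure(pixel, exposure_gain=1.0, shadow_lift=0.0):
    """Illustrative (not Google's) model of the two sliders.

    pixel: brightness in [0, 1]
    exposure_gain: highlights slider -- scales overall brightness
    shadow_lift: shadows slider in [0, 1) -- bends the tone curve
    """
    v = min(pixel * exposure_gain, 1.0)
    # A gamma-style curve lifts dark values proportionally more
    # than bright ones, leaving highlights largely untouched.
    return v ** (1.0 - shadow_lift)
```

With `shadow_lift=0.5`, a dark pixel at 0.04 rises to 0.2 (five times brighter) while a bright pixel at 0.81 only moves to 0.9, which matches the behaviour described above: the tone of the shadows changes without a change in exposure.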
Astrophotography
This has to be one of the highlights of the new Pixel 4 smartphones. Technically, it is more of a software feature than a camera one, though the Pixel does have a slightly improved camera and the other hardware needed to make it work well. With astrophotography, users can capture amazing photos of the night sky, including stars, the Milky Way, and more. The phone can now take longer exposures of the sky via Night Sight. The mode turns on automatically, but only when you’re in a really dark environment.
Google recommends that the phone be placed on a rock or a tripod. The feature captures 16-second exposures and stitches 15 such frames together into the final image, so an astrophotography shot takes around four minutes to finish. The sample photos Google showed off are nothing short of impressive.
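Google’s actual pipeline also aligns the frames and rejects motion between them, which the sketch below skips. But the core arithmetic is simple: average-stacking many short exposures cancels random sensor noise, while the numbers quoted above account for the total capture time. All names here are hypothetical:

```python
def stack_frames(frames):
    """Average-stack aligned exposures (hypothetical helper).

    Each frame is a flat list of pixel values; averaging N frames
    reduces random sensor noise by roughly a factor of sqrt(N).
    """
    n = len(frames)
    return [sum(pixels) / n for pixels in zip(*frames)]

# 15 frames of 16 seconds each account for the quoted capture time.
total_seconds = 15 * 16  # 240 s, i.e. around 4 minutes
```

Splitting one long exposure into fifteen short ones is also what keeps the stars from streaking across the frame during a four-minute capture.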
Improved Portrait mode
The Google Pixel 4 finally gets a second rear camera: a 16-megapixel telephoto sensor, which has helped improve portrait mode on the new devices. Google has also improved the machine learning behind its dual-pixel PDAF technology, which helps the camera distinguish hair and animal fur better than before. Portrait mode can now handle subjects that are further away, and the bokeh in images has a more SLR-like look.
Live HDR+
Google has also added a new Live HDR+ mode to the camera software on the Pixel 4 devices. It lets the user see a live HDR preview in the viewfinder before taking the shot, so they can judge the lighting and dynamic range in the frame and capture better photos. This is a nifty feature that should improve the overall camera experience.
Other improvements
Apart from these main features, the Pixel 4 camera gets machine-learning-based white balancing in all camera modes; on the Pixel 3, this was only available in Night Sight. It should produce better colours in difficult lighting conditions. Google also highlighted the improved Super Res Zoom feature, which now utilises the telephoto sensor to improve digital zoom. The company recommends pinching to zoom while shooting instead of cropping after taking the picture, and promises crisp images even at 8x digital zoom.