Google made its newest smartphones official today, unveiling the much-leaked Pixel 4a 5G and Pixel 5. Both smartphones get the same, improved cameras despite a $200 price difference between the models, which is great news for people who come to Google specifically for its excellent mobile camera tech. Here’s an overview of what Google did with the new and improved Pixel cameras in terms of both hardware and software.
The biggest physical change to the new Pixel phones is the addition of an ultrawide lens to the camera array on the back. This wide-angle field of view lets you capture a significantly larger perspective, which is great for large group shots and landscapes. It was also one of the features Apple added to the most recent iPhone that Google fans had been hoping to see on their Pixel devices.
Here’s an example of the additional coverage you’re getting (roughly, since the first shot likely wasn’t actually taken on a Pixel):
HDR+ with bracketing
The HDR+ feature of Google’s Pixel phones is also very popular with users, letting people get better lighting in their photos without having to composite images after the fact to adjust exposure in different parts of the scene. Google has upgraded HDR+ by combining its own machine-learning-powered techniques with traditional, old-school exposure bracketing, for what the company says is a better final product.
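Exposure bracketing itself is a long-established technique: capture the same scene at several exposure times, then favor the well-exposed pixels from each frame when merging. Here’s a minimal sketch of that merge step in Python; Google’s actual HDR+ pipeline is far more sophisticated, and the `merge_brackets` helper and its Gaussian mid-tone weighting are illustrative assumptions, not Google’s code:

```python
import numpy as np

def merge_brackets(frames, exposures):
    """Merge exposure-bracketed frames into one HDR-like image.

    frames: list of same-shape float arrays with values in [0, 1]
    exposures: relative exposure time for each frame
    """
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    # Weight each pixel by how well-exposed it is: mid-tone values near
    # 0.5 get high weight, clipped shadows and highlights get low weight.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-8
               for f in frames]
    total = sum(weights)
    # Scale each frame to a common radiance estimate (value / exposure)
    # before taking the weighted average across the bracket.
    radiance = sum(w * (f / e) for w, f, e in zip(weights, frames, exposures))
    return radiance / total
```

A short and a long exposure of the same scene would then each contribute most strongly in the regions where they are properly exposed.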
Night Sight in portrait mode
Portrait mode has been popular since its introduction on smartphones, and it has improved over time to give people a more accurate depth effect with artificial background blur. With this generation of devices, Google has added the ability to use portrait mode with its Night Sight feature, meaning you can get that depth effect even when you’re using Google’s software trickery to brighten a dark scene, for clear, noise-free results like the shot below.
Another portrait mode addition is Portrait Light, which lets you apply a customizable lighting effect to do things like counteract deep shadows or washed-out portions of the image. It works similarly to Apple’s studio lighting effects in portrait mode on iOS, but it looks to be considerably more customizable, and potentially more powerful thanks to Google’s AI tech on the Pixel devices – though we’ll have to get them in for testing to know for sure.
New stabilization for video, including Cinematic Pan
Finally, there are three new stabilization modes for filming video on the new Pixels – Locked, Active and Cinematic Pan. Google said during its event that these were built by studying tutorials on YouTube as well as the work of Hollywood cinematographers. Cinematic Pan looks like potentially the most fun for YouTubers, since it gives that silky-smooth, slowed-down effect (it’s half actual speed) that makes footage look straight out of a film travelogue.
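That half-speed playback can be pictured as stretching the frame timeline: between every pair of captured frames, insert an in-between frame, so the clip plays back over twice the duration. A minimal sketch assuming simple linear blending – real implementations use far better optical-flow interpolation plus stabilization, and the `cinematic_slowdown` helper here is purely illustrative:

```python
import numpy as np

def cinematic_slowdown(frames):
    """Stretch a clip to roughly half speed by inserting a blended
    frame between each pair of originals (naive linear interpolation
    as a stand-in for optical-flow frame interpolation)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append((a + b) / 2.0)  # synthetic in-between frame
    out.append(frames[-1])
    return out
```

An n-frame clip becomes 2n − 1 frames, so at the original frame rate it plays back at about half speed.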
Source: TechCrunch