Night mode: Google explains how it works for astrophotography
The Night Mode introduced with Google’s Pixel 3 (in Google Camera) has since been improved on the Pixel 4. Google has now explained how these images are captured, particularly when it comes to astrophotography.
Introduced with the Pixel 3 and improved on the Pixel 4, Google’s Night Mode (Night Sight) lets you take photographs that are then processed by the software with the help of artificial intelligence. On the latest smartphones this mode can also be used for astrophotography, and Google has now explained in more detail what happens behind the scenes.
Google Camera’s Night Mode takes several photos in sequence, reducing the exposure time each one needs. The photographs are analyzed by the AI and aligned to compensate for involuntary movements; furthermore, by “joining” the various shots it is possible to reduce noise and increase the detail of the image.
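The noise-reduction part of this merging step can be illustrated with a minimal sketch: averaging N aligned frames shrinks random noise by roughly the square root of N. This is a toy illustration, not Google’s actual pipeline (which also handles alignment and motion rejection); all names and values below are made up.

```python
import numpy as np

def stack_frames(frames):
    """Average a burst of already-aligned frames to reduce noise.

    With N independent exposures, the noise standard deviation drops
    roughly by a factor of sqrt(N).
    """
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Hypothetical burst: 15 noisy captures of the same flat gray scene.
rng = np.random.default_rng(0)
true_scene = np.full((4, 4), 100.0)
frames = [true_scene + rng.normal(0.0, 10.0, true_scene.shape)
          for _ in range(15)]
merged = stack_frames(frames)
# The averaged frame sits much closer to the true scene than any
# single noisy frame does.
```

In the real pipeline the frames must first be aligned (and moving objects handled), which is where the AI analysis described above comes in; plain averaging only works once the frames agree pixel for pixel.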
The Night Mode of Pixel smartphones and astrophotography
For astrophotography, Night Mode works in a similar way. Since even more light must be captured, the exposure time of the frames captured in sequence is increased. This brings the limitation of needing a tripod or some other kind of stand to avoid blurry photographs.
Considering the focal length of smartphones like the Pixel that support this mode, the engineers chose a maximum of 16 seconds per shot over 15 shots (capturing about 250 times more light than a classic shot). This keeps the stars point-like rather than letting them stretch into trails during the capture.
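A quick back-of-the-envelope check of those numbers: light gathered is proportional to total exposure time, and 15 shots of 16 seconds give 240 seconds in total. The 1-second baseline below is an assumption for illustration, not a figure from the article.

```python
# Sanity-check the "about 250 times more light" figure.
frames = 15
exposure_per_frame_s = 16
reference_exposure_s = 1  # assumed "classic" night shot; not from the article

total_exposure_s = frames * exposure_per_frame_s
light_ratio = total_exposure_s / reference_exposure_s
print(total_exposure_s, light_ratio)  # 240 seconds total, ~240x the light
```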
To correct hot pixels, each image is first analyzed and then compared with the others: when a pixel has a value outside the average of its counterparts, it is corrected. The AI also reduces the brightness of the sky to make the scene more realistic and representative of the actual view.
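The hot-pixel idea can be sketched as a simple outlier test across the burst: a pixel that disagrees strongly with the same position in the other frames gets replaced by the per-position median. The threshold and function names here are hypothetical, not Google’s actual parameters.

```python
import numpy as np

def fix_hot_pixels(frames, max_dev=50.0):
    """Replace pixels that deviate strongly from the burst median.

    max_dev is a made-up threshold for illustration only.
    """
    stack = np.stack(frames).astype(np.float64)
    median = np.median(stack, axis=0)          # per-position value over the burst
    hot = np.abs(stack - median) > max_dev     # pixels far from every other frame
    stack[hot] = np.broadcast_to(median, stack.shape)[hot]
    return stack

# Hypothetical burst of 5 dark frames with one stuck-bright pixel.
frames = [np.full((3, 3), 10.0) for _ in range(5)]
frames[2][1, 1] = 255.0                        # simulated hot pixel
cleaned = fix_hot_pixels(frames)
# cleaned[2][1, 1] is back at the median value of 10
```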
Night Mode also lets you see the long-exposure image while it is being captured, so you can tell whether corrections are needed for the next attempt. Thanks to a neural network trained on over 100,000 images of various types of sky, the app recognizes what “is sky” from what is not.
This ability to recognize the sky makes it possible to highlight details such as clouds, stars, the Milky Way, and anything else the user would like to see at its best.
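Once such a sky mask exists, per-region processing becomes straightforward: for example, the sky-darkening mentioned earlier only needs to touch the masked pixels. In Night Sight the mask comes from the neural network; in this sketch it is simply given as a boolean array, and the 0.7 factor is an invented value.

```python
import numpy as np

def tone_down_sky(image, sky_mask, factor=0.7):
    """Darken only the pixels flagged as sky by a segmentation mask."""
    out = image.astype(np.float64).copy()
    out[sky_mask] *= factor
    return out

image = np.array([[200.0, 200.0],   # top row: bright sky
                  [50.0, 50.0]])    # bottom row: ground
mask = np.array([[True, True],
                 [False, False]])
result = tone_down_sky(image, mask)
# sky pixels drop to 140 while ground pixels stay at 50
```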
On the Pixel 4, a dedicated autofocus routine is also performed for these low-light conditions: two photographs are taken (with exposure times of up to 1 second each) to determine the focus automatically. If even this system does not work, Night Mode focuses at infinity.
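The fallback logic described above can be sketched as: score each autofocus frame by some sharpness measure, keep the best, and fall back to infinity when nothing is sharp enough. The function name, score dictionary, and 0.2 threshold are all hypothetical, not Google’s implementation.

```python
def pick_focus(scores, min_contrast=0.2):
    """Choose the focus candidate with the best contrast score.

    scores maps a focus setting to a contrast measure of its autofocus
    frame; if no candidate clears min_contrast (a made-up threshold),
    fall back to focusing at infinity.
    """
    best = max(scores, key=scores.get) if scores else None
    if best is None or scores[best] < min_contrast:
        return "infinity"
    return best

# Two hypothetical autofocus frames (up to 1 s exposure each), scored:
print(pick_focus({"near": 0.05, "far": 0.6}))   # prints "far"
print(pick_focus({"near": 0.05, "far": 0.1}))   # prints "infinity"
```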
One problem Google’s engineers have not yet managed to solve is obtaining well-defined photographs in the presence of lampposts or other particularly bright objects. The gap between the low brightness of the sky and a much brighter object in the scene remains a difficult case.