    Google explains how the Live HDR+ feature on the Pixel 4 and 4a works

    The Google Pixel 4 and 4a support Live HDR+ and Dual Exposure Controls, features first introduced with the Pixel 4 (note that these won't be backported to older devices). The tech giant published a detailed blog post explaining how the two features work.

    First, what is Live HDR+? It shows a real-time preview of what the final HDR+ photo will look like. Note that this is just a preview, generated with a different, faster algorithm. The real HDR+ processing captures 3-15 underexposed photos (underexposing protects the highlights, while merging multiple frames keeps the noise down), aligns them and merges them.
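
    To illustrate just the merge step, here is a minimal Python sketch. It assumes the burst frames are already aligned and simply averages them, which is a stand-in for HDR+'s far more sophisticated robust merging; all names and numbers here are illustrative.

    import numpy as np

    def merge_burst(frames):
        # frames: list of HxW (or HxWx3) float arrays, already aligned.
        stack = np.stack(frames, axis=0).astype(np.float32)
        return stack.mean(axis=0)

    # Example: 8 noisy, underexposed captures of the same (static) scene.
    rng = np.random.default_rng(0)
    clean = np.full((120, 160), 0.12, dtype=np.float32)   # a dark, clean scene
    burst = [clean + rng.normal(0, 0.03, clean.shape) for _ in range(8)]
    merged = merge_burst(burst)
    print("noise std before merge:", np.std(burst[0] - clean))
    print("noise std after merge: ", np.std(merged - clean))

    Averaging N frames cuts the noise by roughly a factor of sqrt(N), which is why capturing several short, underexposed frames and merging them works better than a single longer exposure that would clip the highlights.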

    After the merge, tone mapping is applied to produce the final photo, with details visible in both the highlights and the shadows. To achieve that, the phone computes a 2D histogram, which is an interesting visualization in its own right.
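
    As a rough illustration of why a 2D histogram is a natural way to look at this (my reading of it, not Google's exact definition): because the tone mapping is local, a single input brightness can end up at many different output brightnesses, so the input-to-output relationship is a 2D distribution rather than a single curve. The toy local_tone_map() below is a made-up stand-in, not Google's mapping.

    import numpy as np

    def local_tone_map(lum, strength):
        # Toy spatially varying tone map: lift shadows more where 'strength' is high.
        gamma = 1.0 / (1.0 + strength)          # per-pixel gamma < 1 brightens
        return np.clip(lum, 0.0, 1.0) ** gamma

    rng = np.random.default_rng(1)
    lum_in = rng.random((240, 320)).astype(np.float32)     # fake merged luminance
    strength = rng.random((240, 320)).astype(np.float32)   # varies across the frame
    lum_out = local_tone_map(lum_in, strength)

    # 64x64 grid of counts: input brightness on one axis, output brightness on the other.
    hist2d, _, _ = np.histogram2d(lum_in.ravel(), lum_out.ravel(),
                                  bins=64, range=[[0, 1], [0, 1]])
    print(hist2d.shape)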

    However, current mobile chipsets don't have the computational power to do that 30 times per second. Instead, a dash of AI is used: the image is sliced into small tiles and the AI predicts a tone mapping for each of them. Every pixel in the viewfinder is then computed as a combination of the tone maps from its nearest tiles.
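
    Below is a minimal sketch of that tile-and-interpolate idea. It is not Google's actual machine-learning model; the per-tile "prediction" is faked with random monotonic tone curves, and each pixel blends the curves of its four nearest tiles bilinearly. Tile size and curve resolution are arbitrary choices for the example.

    import numpy as np

    TILE = 32            # tile size in pixels (illustrative)
    CURVE_BINS = 16      # control points per predicted tone curve

    def fake_predict_curves(tiles_y, tiles_x):
        # Stand-in for the ML model: one random monotonic tone curve per tile.
        rng = np.random.default_rng(42)
        curves = np.sort(rng.random((tiles_y, tiles_x, CURVE_BINS)), axis=-1)
        curves[..., 0], curves[..., -1] = 0.0, 1.0
        return curves

    def apply_tone_curves(lum, curves):
        h, w = lum.shape
        tiles_y, tiles_x, _ = curves.shape
        # Continuous tile coordinates of each pixel (tile centers sit at .5 offsets).
        ty = np.clip(np.arange(h) / TILE - 0.5, 0, tiles_y - 1)
        tx = np.clip(np.arange(w) / TILE - 0.5, 0, tiles_x - 1)
        ty, tx = np.meshgrid(ty, tx, indexing="ij")
        y0, x0 = np.floor(ty).astype(int), np.floor(tx).astype(int)
        y1, x1 = np.minimum(y0 + 1, tiles_y - 1), np.minimum(x0 + 1, tiles_x - 1)
        wy, wx = ty - y0, tx - x0

        def lut(yi, xi):
            # Apply the tone curve of tile (yi, xi) to every pixel's luminance.
            pos = lum * (CURVE_BINS - 1)
            lo = np.floor(pos).astype(int)
            hi = np.minimum(lo + 1, CURVE_BINS - 1)
            frac = pos - lo
            c = curves[yi, xi]                                   # (h, w, CURVE_BINS)
            lo_v = np.take_along_axis(c, lo[..., None], -1)[..., 0]
            hi_v = np.take_along_axis(c, hi[..., None], -1)[..., 0]
            return lo_v * (1 - frac) + hi_v * frac

        # Bilinear blend of the four nearest tiles' tone curves, per pixel.
        return ((1 - wy) * (1 - wx) * lut(y0, x0) + (1 - wy) * wx * lut(y0, x1)
                + wy * (1 - wx) * lut(y1, x0) + wy * wx * lut(y1, x1))

    lum = np.random.default_rng(0).random((128, 192)).astype(np.float32)
    curves = fake_predict_curves(128 // TILE, 192 // TILE)
    out = apply_tone_curves(lum, curves)
    print(out.shape, float(out.min()), float(out.max()))

    Blending curves between neighbouring tiles is what keeps the preview free of visible tile seams while still letting the tone mapping vary across the frame.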

    Here’s a comparison between the predicted image and the actual Live HDR+ result. It doesn’t get it quite right, but it looks pretty close (especially since you’ll be viewing this on the phone’s screen).

    (Image: the predicted Live HDR+ viewfinder preview next to the actual HDR+ result)

    Balancing highlights and shadows is done automatically by Live HDR+. The Dual Exposure sliders give you manual control over the process, so you can get the desired look for your photo in camera. Traditionally, this is something you would do afterwards by processing the RAW file.

    Here is how the two controls behave. When the user taps on the Live HDR+ viewfinder, two sliders appear. The “Brightness” slider works like traditional exposure compensation, changing the overall exposure. It can be used to recover more detail in bright skies, or to intentionally blow out the background and make the subject more visible. The “Shadows” slider affects only the dark areas; it works by changing the tone mapping, not the exposure. This slider is most useful for high-contrast scenes, letting the user boost shadows to reveal details, or suppress them to create a silhouette.
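
    Here is a toy Python sketch of what the two sliders conceptually do. The parameter names and curves are assumptions for illustration, not Google's implementation: “Brightness” scales the exposure of the whole frame in linear light, while “Shadows” reshapes only the dark part of the tone curve.

    import numpy as np

    def apply_dual_exposure(linear, brightness_ev=0.0, shadows=0.0):
        # linear: HxW luminance in linear light, values >= 0.
        # brightness_ev: exposure compensation in stops, applied to everything.
        # shadows: -1..+1, lifts (> 0) or crushes (< 0) dark tones only.

        # "Brightness" slider: classic exposure compensation, 1 EV = factor of 2.
        exposed = linear * (2.0 ** brightness_ev)

        # Simple global tone map to display range (stand-in for HDR+ tone mapping).
        tone_mapped = exposed / (1.0 + exposed)

        # "Shadows" slider: reshape the tone curve only where the image is dark.
        shadow_weight = (1.0 - tone_mapped) ** 2      # ~1 in shadows, ~0 in highlights
        gamma = 2.0 ** (-shadows)                     # < 1 lifts shadows, > 1 crushes them
        reshaped = tone_mapped ** gamma
        return tone_mapped * (1 - shadow_weight) + reshaped * shadow_weight

    scene = np.random.default_rng(0).random((120, 160)).astype(np.float32) * 4.0
    silhouette = apply_dual_exposure(scene, brightness_ev=-1.0, shadows=-1.0)
    lifted = apply_dual_exposure(scene, brightness_ev=0.0, shadows=0.8)
    print(float(silhouette.mean()), float(lifted.mean()))

    The key difference mirrors the article's description: the brightness adjustment happens before tone mapping and moves every pixel, while the shadows adjustment only bends the dark end of the curve once the exposure is fixed.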

    If you want a more detailed explanation of how all of this works, Google's full blog post goes into much greater depth.
