6/20/2023

With the launch of the Google Pixel 3, smartphone cameras have taken yet another leap in capability. I had the opportunity to sit down with Isaac Reynolds, Product Manager for Camera on Pixel, and Marc Levoy, Distinguished Engineer and Computational Photography Lead at Google, to learn more about the technology behind the new camera in the Pixel 3.

One of the first things you might notice about the Pixel 3 is the single rear camera. At a time when we're seeing companies add dual, triple, even quad-camera setups, one main camera seems at first an odd choice. But after speaking to Marc and Isaac, I think the Pixel camera team is taking the correct approach – at least for now. Any technology that makes a single camera better will make multiple cameras in future models that much better, and we've seen in the past that a single-camera approach can outperform a dual-camera approach in Portrait Mode, particularly when the telephoto camera module has a smaller sensor and slower lens, or lacks reliable autofocus.

Let's take a closer look at some of the Pixel 3's core technologies.

Last year, the Pixel 2 showed us what was possible with burst photography. HDR+ was its secret sauce, and it worked by constantly buffering nine frames in memory. When you press the shutter, the camera essentially goes back in time to those last nine frames [1], breaks each of them up into thousands of 'tiles', aligns them all, and then averages them. Breaking each image into small tiles allows for advanced alignment even when the photographer or subject introduces movement: blurred elements in some shots can be discarded, and subjects that have moved from frame to frame can be realigned. Averaging simulates the effect of shooting with a larger sensor by 'evening out' noise, and going back in time to the last nine frames captured right before you hit the shutter button means there's zero shutter lag. Like the Pixel 2, HDR+ allows the Pixel 3 to render sharp, low-noise images even in high-contrast situations.

This year, the Pixel 3 pushes all this further. It uses HDR+ burst photography to buffer up to 15 images [2], and then employs super-resolution techniques to increase the resolution of the image beyond what the sensor and lens combination would traditionally achieve [3]. Subtle shifts from handheld shake and optical image stabilization (OIS) allow scene detail to be localized with sub-pixel precision, since the shifts are unlikely to be exact multiples of a pixel. In fact, I was told the shifts are carefully controlled by the optical image stabilization system. "We can demonstrate the way the optical image stabilization moves very slightly," remarked Marc Levoy. Precise sub-pixel shifts are not necessary at the sensor level, though; instead, OIS is used to uniformly distribute a bunch of scene samples across a pixel, and then the images are aligned to sub-pixel precision in software.

But Google – and Peyman Milanfar's research team working on this particular feature – didn't stop there. "We get a red, green, and blue filter behind every pixel just because of the way we shake the lens, so there's no more need to demosaic," explains Marc. If you have enough samples, you can expect any scene element to have fallen on a red, green, and blue pixel. After alignment, then, you have R, G, and B information for any given scene element, which removes the need to demosaic. That itself leads to an increase in resolution (since you don't have to interpolate spatial data from neighboring pixels) and a decrease in noise, since the math required for demosaicing is itself a source of noise. The benefits are essentially similar to what you get when shooting pixel-shift modes on dedicated cameras.

There's a small catch to all this – at least for now. Super Res only activates at 1.2x zoom or more, not in the default 'zoomed out' 28mm-equivalent mode. As expected, the lower your level of zoom, the more impressed you'll be with the resulting Super Res images, and naturally the resolving power of the lens will be a limitation. But the claim is that you can get "digital zoom roughly competitive with a 2x optical zoom," according to Isaac Reynolds, and it all happens right on the phone. The results I was shown at Google appeared to be more impressive than the example we were provided above, no doubt at least in part due to the extreme zoom of our example here. We'll reserve judgement until we've had a chance to test the feature for ourselves.

Would the Pixel 3 benefit from a second rear camera? For certain scenarios – still landscapes, for example – probably.
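To make the align-and-average idea concrete, here is a deliberately simplified sketch in Python. It assumes the per-frame sub-pixel shifts are already known (in the real pipeline they come from handheld shake and OIS and are estimated by tile-based alignment), and it works on a single channel. The function name `merge_burst` and all parameters are my own illustration, not Google's API; the actual Super Res pipeline adds tile alignment, motion robustness, and per-color-channel accumulation on the Bayer mosaic.

```python
import numpy as np

def merge_burst(frames, shifts, scale=2):
    """Accumulate sub-pixel-shifted low-res frames onto a finer grid.

    frames: list of HxW arrays (one channel, for simplicity).
    shifts: per-frame (dy, dx) offsets in low-res pixels, assumed known here.
    scale:  upsampling factor of the output grid.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))   # sum of samples per fine cell
    cnt = np.zeros_like(acc)                 # number of samples per fine cell
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        # Each sample lands wherever its shifted pixel center falls on the fine grid.
        fy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        fx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(cnt, (fy, fx), 1)
    # Average where samples exist ('evening out' noise); cells with no samples stay 0.
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```

With shifts that cover the quarter-pixel offsets (e.g. `(0, 0)`, `(0, 0.5)`, `(0.5, 0)`, `(0.5, 0.5)` at `scale=2`), every cell of the finer grid receives a genuine sample, which is the same reason the shifted burst can land a red, green, and blue sample on every scene element and skip demosaicing.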