Friday, October 13, 2017

Why portrait mode only works on some iPhone models


When Apple introduced the iPhone 7 Plus last year, it included a new camera feature that quickly became one of the most talked-about and copied: portrait mode.
This mode combines the phone's dual cameras with Apple's software to mimic the look you would get from a digital SLR: the subject of the photo stays in focus while the background is softly blurred.
The feature originally launched in beta as an exclusive for the iPhone 7 Plus. A year later, portrait mode is also on the new iPhone 8 Plus and will be on the upcoming iPhone X.
Shortly after Apple introduced portrait mode in 2016, similar features appeared on other flagship phones, such as the Samsung Galaxy Note 8 (Live Focus) and the Google Pixel (Lens Blur).
In the case of Pixel phones, which have only one rear lens, Google relies on software alone to achieve that portrait-mode look. Apple's iPhones require two lenses to pull it off, at least for now. So if you buy the regular iPhone 8, for example, you will not be able to take photos in portrait mode.
Apple's portrait mode needs both lenses because each one is different: one is a 12-megapixel wide-angle lens, the other a 12-megapixel telephoto lens. When you take a picture in portrait mode, the two serve different purposes.
The telephoto lens is what actually captures the image. While it does, the wide-angle lens is busy gathering data about how far away everything in the scene is, which the phone then uses to build a nine-layer depth map.
That depth map created from the wide-angle lens is crucial to the end result: it lets Apple's image signal processor work out what should stay sharp and what should be blurred.
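The dual-camera requirement also shows up in Apple's public camera API, where depth delivery is only offered when the hardware supports it. Below is a minimal, hypothetical sketch of requesting depth data alongside a photo with AVFoundation; it is not Apple's internal portrait pipeline, and the session setup is stripped down for illustration.

```swift
import AVFoundation

// Minimal sketch (not Apple's internal pipeline): ask the dual camera for depth
// data alongside the photo. Depth delivery is only supported on dual-camera
// hardware, which is why single-lens iPhones cannot offer the same capture.
let session = AVCaptureSession()

guard let dualCamera = AVCaptureDevice.default(.builtInDualCamera,
                                               for: .video,
                                               position: .back),
      let input = try? AVCaptureDeviceInput(device: dualCamera),
      session.canAddInput(input) else {
    fatalError("No dual camera available on this device")
}
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
}

// Only enable depth delivery if the hardware actually supports it.
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
// photoOutput.capturePhoto(with: settings, delegate: myDelegate)
// The delegate then receives an AVCapturePhoto whose depthData carries the depth map.
```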
The image above shows a standard iPhone photo (left) next to the same shot in portrait mode (right). At a glance, the portrait photo seems to have a uniformly blurred background, but this is where the depth map comes into play.
To make the photo look natural and as close to a real DSLR shot as possible, Apple's image processor works through the layers one by one, applying a different amount of blur to each, an effect known as "bokeh".
The layers closest to the subject stay slightly sharper than the farthest ones, and if you look closely at the photo above you can see it: the things near her, like the tall grass and the wooden plank on the ground, are much easier to make out than the cliff in the distance, which is just a dark, fuzzy shape.
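You can approximate that depth-driven blur with public Core Image filters. The sketch below is only an illustration of the idea, not Apple's actual image signal processor: it assumes a grayscale depth mask that is dark on the subject and brighter the farther away things are, so nearby regions stay sharp while distant ones get progressively more blur.

```swift
import CoreImage

// Toy illustration of depth-driven blur, not Apple's ISP pipeline.
// `depthMask` is assumed to be a grayscale image that is dark (near 0) on the
// subject and brighter with distance; CIMaskedVariableBlur scales its blur
// radius by the mask's brightness, so farther layers end up blurrier.
func fakePortraitEffect(photo: CIImage, depthMask: CIImage) -> CIImage? {
    guard let blur = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    blur.setValue(photo, forKey: kCIInputImageKey)
    blur.setValue(depthMask, forKey: "inputMask")    // depth map drives the blur amount
    blur.setValue(12.0, forKey: kCIInputRadiusKey)   // maximum radius for the farthest layer
    return blur.outputImage
}
```

A real implementation would also normalize the depth map around the subject's distance, so that the in-focus plane, not just the nearest layer, stays sharp.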
Portrait mode works best on people and objects, and it has a couple of limitations: the amount of light available and the distance to the subject. Portrait mode works poorly, or not at all, in low light; if it is too dark for the feature to work, a message appears in the Camera app to tell you.
It also will not work if you are too close to whatever you are trying to capture: to take a picture in portrait mode, the phone cannot be within 48 centimeters of the subject.
It is also worth noting that portrait mode works best when there is plenty of contrast between the subject and the background. If you photograph a white coffee cup on a tablecloth of the same color, for example, the camera may have trouble deciding what should be in focus and what should be blurred.
If you take a picture in portrait mode but change your mind, you can remove the background blur afterward. Just open the photo, tap Edit and then tap "Portrait" at the top of the screen; the photo will go back to looking like a standard iPhone shot.
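That after-the-fact editing is possible because the depth map is saved with the file rather than thrown away once the blur is applied. As a rough, hypothetical sketch (the file path is made up), an app can read that auxiliary depth data back out with Core Image:

```swift
import Foundation
import CoreImage

// Sketch: a portrait-mode photo carries its depth (disparity) map as auxiliary
// data inside the saved HEIC/JPEG, which is what makes the blur re-editable.
let url = URL(fileURLWithPath: "/path/to/portrait.heic")          // hypothetical path

let renderedPhoto = CIImage(contentsOf: url)                      // the image as shot
let disparityMap = CIImage(contentsOf: url,
                           options: [.auxiliaryDisparity: true])  // the stored depth map

// If disparityMap is non-nil, the blur can be re-applied, adjusted,
// or dropped entirely at edit time instead of being baked in.
```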
Portrait mode is available on the iPhone 7 Plus, the iPhone 8 Plus and the iPhone X. The latter two phones also have Portrait Lighting, which artificially adjusts the lighting on the subject to add different effects.
The iPhone 8 and 8 Plus are already on sale, and the iPhone X will arrive on November 3.