Behind the scenes: Apple shows how Portrait Lighting works
iOS 11 brought several new features, including expanded dual-camera capabilities. In addition to the depth-of-field effect, the operating system also makes it possible to adjust the lighting conditions of a portrait. Apple calls this Portrait Lighting and has now published a video on YouTube offering new insights into the development of the feature. In it, the company explains that it worked closely with some of the world's best photographers to combine the principles of studio lighting with machine learning.
The result is what we now know as Portrait Lighting. However, the feature is only available on the iPhone 8 Plus and iPhone X. On the latter, thanks to the TrueDepth camera above the OLED display, it also works with the front-facing camera, while on the iPhone 8 Plus only the dual camera on the back can be used for the effect.
As noted several times over the past few months, Apple uses a sophisticated algorithm for Portrait Lighting that calculates how the face interacts with different kinds of lighting in order to achieve the desired effect. Accordingly, Apple offers several lighting presets: Natural Light, Studio Light, Contour Light, Stage Light, and Stage Light Mono.