Sunday, May 26, 2019

Magic 3D Cameras. How do they work?

A dual-lens camera, like the one on the back of an iPhone X, uses the separation between its lenses to detect a parallax shift and measure 3D depth. Objects closer to the camera shift by a greater distance between the two lens views. This distance-dependent parallax shift is called disparity.
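The relationship between disparity and depth follows from similar triangles: depth is the focal length times the lens baseline divided by the disparity. Here is a minimal sketch with made-up but plausible numbers (the focal length and 1 cm baseline are assumptions for illustration, not iPhone specs):

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Convert a parallax shift (in pixels) to metric depth.

    Similar triangles give: depth = focal_length * baseline / disparity.
    """
    return focal_length_px * baseline_m / disparity_px

# A nearby object shifts a lot between the two lenses; a far one barely moves.
near = depth_from_disparity(disparity_px=80, focal_length_px=2800, baseline_m=0.01)
far = depth_from_disparity(disparity_px=8, focal_length_px=2800, baseline_m=0.01)
print(near, far)  # 0.35 m vs 3.5 m
```

Notice the inverse relationship: ten times less disparity means ten times more depth, which is why depth precision degrades quickly for distant objects.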
Then how does the single-lens front-facing camera measure depth? This technology is called TrueDepth. You might be surprised to learn that the TrueDepth camera projects an infrared light pattern in front of the camera. By observing how the pattern is distorted by objects in front of the camera lens, the software calculates the distance from the camera to each point in the image. From this, it creates its own disparity.
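A toy sketch of the structured-light idea (illustrative only, not Apple's actual pipeline): the projector emits dots at known reference positions, and a dot landing on a nearby surface appears shifted in the infrared image. That shift plays the role of disparity, so the same triangulation formula applies, with the projector-to-camera distance as the baseline (all numbers below are assumptions):

```python
def depth_from_pattern_shift(ref_x, observed_x, focal_length_px, baseline_m):
    """Depth of a projected dot from how far it shifted off its reference position."""
    disparity = abs(observed_x - ref_x)  # the shift acts like stereo disparity
    return focal_length_px * baseline_m / disparity

# A barely shifted dot fell on a far surface; a strongly shifted dot on a near one.
far_dot = depth_from_pattern_shift(ref_x=100.0, observed_x=110.0,
                                   focal_length_px=2000, baseline_m=0.02)
near_dot = depth_from_pattern_shift(ref_x=300.0, observed_x=340.0,
                                    focal_length_px=2000, baseline_m=0.02)
print(far_dot, near_dot)  # 4.0 m vs 1.0 m
```

In effect, the projector replaces the second camera of a stereo pair, which is how a single lens can still triangulate.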
Both technologies then generate a secondary image made of depth pixels. Instead of representing a color, each pixel represents the depth at that point. By combining this depth map with the color image, software can blur, or even make transparent, parts of a photo such as the background of a portrait.