Then how does the single-lens front-facing camera measure depth? The answer is a technology called TrueDepth. You might be surprised to learn that the TrueDepth camera projects a pattern of infrared dots in front of the camera. By observing how objects distort this pattern, the software calculates the distance from the camera to each point in the image, effectively generating its own disparity map.
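The geometry behind this is triangulation: the farther a point is from the camera, the less the projected pattern shifts. A minimal sketch of that relationship, with made-up baseline and focal-length values (not real TrueDepth parameters):

```python
# Hedged sketch: recovering depth from pattern shift (disparity).
# The baseline and focal length below are assumed illustrative values,
# not actual TrueDepth hardware specifications.

BASELINE_M = 0.02   # assumed distance between projector and camera, in meters
FOCAL_PX = 600.0    # assumed focal length, in pixels

def depth_from_disparity(disparity_px: float) -> float:
    """Triangulate distance: closer objects shift the pattern more,
    so depth is inversely proportional to the observed disparity."""
    return BASELINE_M * FOCAL_PX / disparity_px

# A dot shifted by 24 pixels corresponds to a point 0.5 m away.
print(depth_from_disparity(24.0))  # → 0.5
```

Repeating this calculation for every dot in the pattern yields a distance for each point in the scene.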
Both technologies then generate a secondary image made of depth pixels. Instead of representing a color, each pixel represents the depth at that point. By combining a depth map with the color image, it becomes possible to blur, or even make transparent, parts of a photo, such as the background of a portrait.
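To make the idea concrete, here is a small sketch of how a depth map can separate subject from background. The depth values and threshold are invented for illustration; a real app would read the depth map from the camera pipeline:

```python
# Hedged sketch: using a depth map to mask a portrait's background.
# All values below are assumed for illustration, not real camera output.

# Depth map: one value per pixel, in meters from the camera.
depth_map = [
    [0.5, 0.5, 2.0],
    [0.5, 0.6, 2.1],
    [2.2, 2.0, 2.3],
]

# Pixels closer than the threshold are treated as the subject;
# everything farther away is background to blur or hide.
THRESHOLD_M = 1.0

foreground_mask = [
    [depth < THRESHOLD_M for depth in row]
    for row in depth_map
]

# Visualize the mask: F = foreground (subject), B = background.
for row in foreground_mask:
    print("".join("F" if is_fg else "B" for is_fg in row))
```

Once the mask exists, each background pixel can be blurred or made transparent while the subject's pixels are kept sharp, which is exactly the portrait effect described above.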