Uncorrelated | 9 months ago

Other commenters here are correct that the LIDAR is too low-resolution to be used as the primary source for the depth maps. In fact, iPhones use roughly four methods that I know of to capture depth data, depending on the model and camera used. Traditionally these depth maps were only captured for Portrait photos, but apparently recent iPhones capture them for standard photos as well.

1. The original method uses two cameras on the back, taking a picture from both simultaneously and using parallax to construct a depth map, similar to human vision. This was introduced on the iPhone 7 Plus, the first iPhone with two rear cameras (a 1x main camera and a 2x telephoto camera). Since the depth map depends on comparing the two images, it will naturally be limited to the field of view of the narrower lens.

2. A second method was later used on iPhone XR, which has only a single rear camera, using focus pixels on the sensor to roughly gauge depth. The raw result is low-res and imprecise, so it's refined using machine learning. See: https://www.lux.camera/iphone-xr-a-deep-dive-into-depth/

3. An extension of this method was used on an iPhone SE that didn't even have focus pixels, producing depth maps purely based on machine learning. As you would expect, such depth maps have the least correlation to reality, and the system could be fooled by taking a picture of a picture. See: https://www.lux.camera/iphone-se-the-one-eyed-king/

4. The fourth method is used for selfies on iPhones with FaceID; it uses the TrueDepth camera's 3D scanning to produce a depth map. You can see this with the selfie in the article; it has a noticeably fuzzier, lower-res look.
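The parallax geometry behind the first method reduces to a single formula: a point imaged by two horizontally separated cameras shifts by a disparity that shrinks with distance. A minimal sketch of the principle (the numbers are purely illustrative, not Apple's actual calibration):

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Stereo parallax: depth = f * B / d, where f is the focal length
    in pixels, B is the baseline between the two cameras in meters, and
    d is the disparity in pixels. Nearby objects shift more (large d),
    which is why close subjects yield the most reliable depth."""
    if disparity_px <= 0:
        return None  # no measurable shift: point is effectively at infinity
    return focal_length_px * baseline_m / disparity_px

# Illustrative values: ~3000 px focal length, ~1 cm camera baseline.
print(depth_from_disparity(3000, 0.01, 30))  # -> 1.0 (meter)
```

The tiny baseline between two phone cameras is why the raw depth gets noisy past a few meters: disparity falls below a pixel and everything far away collapses toward "infinity."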

You can also see some other auxiliary images in the article, which use white to indicate the human subject, glasses, hair, and skin. Apple calls these portrait effects mattes and they are produced using machine learning.
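Conceptually, these mattes are soft grayscale masks (1.0 = subject, 0.0 = background), so using one to swap a background is plain per-pixel alpha blending. A toy sketch, with flat lists standing in for image buffers:

```python
def composite(foreground, background, matte):
    # Alpha-blend: a matte value of 1.0 keeps the foreground pixel,
    # 0.0 keeps the background, and in-between values feather the edge
    # (which is what makes hair and glasses mattes useful).
    return [m * f + (1.0 - m) * b
            for f, b, m in zip(foreground, background, matte)]

print(composite([1.0, 1.0, 1.0], [0.0, 0.0, 0.0], [1.0, 0.5, 0.0]))
# -> [1.0, 0.5, 0.0]
```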

I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available. There are a lot of novel artistic possibilities for depth maps.
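As one example of the kind of depth-driven filter being described (this `depth_tint` function is hypothetical, not from the app): blend each pixel toward a tint color with strength proportional to its normalized depth, so the background takes the tint while the subject stays largely untouched:

```python
def depth_tint(pixels, depths, tint):
    # w runs from 0 (nearest pixel) to 1 (farthest), so the effect
    # ramps up with distance from the camera.
    max_depth = max(depths) or 1.0  # avoid dividing by zero on a flat map
    out = []
    for p, z in zip(pixels, depths):
        w = z / max_depth
        out.append((1 - w) * p + w * tint)
    return out

# The nearest pixel is left alone; the farthest is fully tinted.
print(depth_tint([0.8, 0.8, 0.8], [0.0, 1.0, 2.0], tint=0.2))
```

The same weighting scheme works for any per-pixel effect (blur radius, desaturation, outline strength), which is where most of the creative possibilities come from.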

heliographe|9 months ago

> but apparently recent iPhones capture them for standard photos as well.

Yes, they will capture them from the main photo mode if there’s a subject (human or pet) in the scene.

> I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available

What was your app called? Is there any video of it available anywhere? Would be curious to see it!

I also made a little tool, Matte Viewer, as part of my photo tool series - but it’s just for viewing/exporting them, no effects bundled:

https://apps.apple.com/us/app/matte-viewer/id6476831058

Uncorrelated|8 months ago

I'm sorry for neglecting to respond until now. The app was called Portrait Effects Studio and later Portrait Effects Playground; I took it down because it didn't meet my quality standards. I don't have any public videos anymore, but it supported background replacement and filters like duotone, outline, difference-of-Gaussians, etc., all applied based on depth or the portrait effects matte. I can send you a TestFlight link if you're curious.

I looked at your apps, and it turns out I'm already familiar with some, like 65x24. I had to laugh -- internally, anyway -- at the unfortunate one-star review you received on Matte Viewer from a user who didn't appear to understand the purpose of the app.

One that really surprised me was Trichromy, because I independently came up with and prototyped the same concept! And, even more surprisingly, there's at least one other such app on the App Store. And I thought I was so creative coming up with the idea. I tried Trichromy; it's quite elegant, and fast.

Actually, I feel we have a similar spirit in terms of our approach to creative photography, though your development skills apparently surpass mine. I'm impressed by the polish on your websites, too. Cheers.

lxgr|9 months ago

> Yes, they will capture them from the main photo mode if there’s a subject (human or pet) in the scene.

One of the example pictures on TFA is a plant. Given that, are you sure iOS is still only taking depth maps for photos that get the "portrait" icon in the gallery? (Or have they maybe expanded the types of possible portrait subjects?)

oxym0ron|9 months ago

One thing worth noting: LIDAR is primarily optimized for fast AF and low-light focusing rather than generating full-res depth maps.