The iPhone's "Portrait Mode", shallow depth of field in portraits, and the future of computational photography

It can do nine levels of blur, according to the article. Given the compressed depth in my very shitty sample shot, it's probably using fewer.
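If you want a feel for what "nine levels" means in practice, here's a rough Python sketch of the idea: bin every pixel by its distance from the focal plane into a handful of discrete blur strengths and composite the results. The function name, the Gaussian-blur shortcut, and all the parameters are mine, not Apple's pipeline; it's just the quantize-then-blur concept.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def quantized_portrait_blur(image, depth, focus_depth, n_levels=9, max_sigma=8.0):
    """Sketch of depth-quantized blur (grayscale image for simplicity).

    Pixels are binned into n_levels by how far they are from the focal
    plane, and each bin gets a progressively stronger Gaussian blur.
    Illustration of the concept only, not Apple's actual method.
    """
    image = image.astype(float)

    # Distance from the focal plane, normalized to [0, 1]
    dist = np.abs(depth - focus_depth)
    dist = dist / (dist.max() + 1e-8)

    # Assign each pixel a discrete blur level: 0 = in focus, n_levels-1 = max blur
    level = np.minimum((dist * n_levels).astype(int), n_levels - 1)

    out = np.zeros_like(image)
    for k in range(n_levels):
        sigma = max_sigma * k / (n_levels - 1)
        # Blur the whole frame at this level's strength, then keep only
        # the pixels that were assigned to this level
        blurred = gaussian_filter(image, sigma=sigma) if sigma > 0 else image
        mask = level == k
        out[mask] = blurred[mask]
    return out
```

With only a handful of levels, a scene with compressed depth (three batteries sitting close together) only ever spans two or three of them, which is why the text jumps from sharp to fuzzy to blurry instead of ramping smoothly.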

The sample picture obviously isn't perfect—there are no faces to detect, and all the light is coming from the top/back, causing the top of the battery (and the surface behind it) to be blown out and difficult to mask. The relevant bit is the blur applied to the text on all three batteries: sharp, fuzzy, blurry. My shitty picture does bring up a good quote from the article, though:

The funny thing about test photos is that there's often nothing worth photographing in them, so you just stare at the problems. In my own testing, whenever I've pointed Portrait Mode at something I actually care about, the results have been solid.

Here's a photo of an actual person taken at night with just streetlights.

https://imgur.com/a/69KJf

It's grainy (again, streetlights), but I like it. The iPhone did a good job of approximating the amount of blur based on distance, and the blurred version is better than the one without.

The point of the article is that the shooting mode offers people DoF effects when they don't have big-boy gear on them. All I'm trying to prove with my pictures and this thread is that the effect is in fact "distance-based".
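For the curious, this is the real optics the phone is imitating: the blur disc for an out-of-focus point grows with its distance from the focal plane (thin-lens circle of confusion). The numbers below are purely illustrative—a 50mm f/1.8 versus a tiny phone-scale lens—not any particular iPhone's specs.

```python
def coc_mm(f_mm, n_stop, focus_mm, subject_mm):
    """Thin-lens circle-of-confusion diameter (mm on the sensor) for a
    point at subject_mm when the lens is focused at focus_mm."""
    aperture = f_mm / n_stop                   # entrance-pupil diameter
    magnification = f_mm / (focus_mm - f_mm)   # magnification at the focal plane
    return aperture * magnification * abs(subject_mm - focus_mm) / subject_mm

# Subject at 1.5 m, background at 3 m:
print(coc_mm(50, 1.8, 1500, 3000))  # ~0.48 mm on the sensor: big, obvious blur
print(coc_mm(4,  1.8, 1500, 3000))  # ~0.003 mm: basically sharp (phone-scale lens)
```

Real blur falls straight out of distance; Portrait Mode just estimates that distance and fakes the falloff, which is what the batteries and the streetlight shot are showing.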
