
See the new iPhone’s ‘focus pixels’ up close

The new iPhones have excellent cameras, to be sure. But it’s always good to verify Apple’s breathless onstage claims with first-hand reports. We have our own review of the phones and their photography systems, but teardowns provide the invaluable service of letting you see the biggest changes with your own eyes — augmented, of course, by a high-powered microscope.

We’ve already seen iFixit’s solid-as-always disassembly of the phone, but TechInsights gets a lot closer to the device’s components — including the improved camera of the iPhone XS and XS Max.

Although the optics of the new camera are, as far as we can tell, unchanged since the X, the sensor is a new one and is worth looking at closely.

Microphotography of the sensor die shows that Apple’s claims are borne out and then some. The sensor size has increased from 32.8 mm² to 40.6 mm² — a huge difference despite the small units. Every tiny bit counts at this scale. (For comparison, the Galaxy S9 is 45 mm², and the soon-to-be-replaced Pixel 2 is 25 mm².)

The pixels themselves also, as advertised, grew from 1.22 microns (micrometers) across to 1.4 microns, which should help with image quality across the board. But there’s a subtler development, one that has quietly evolved ever since its introduction: the “focus pixels.”
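A quick sanity check on those figures (a rough sketch: 12 MP is the stated still resolution of both phones, and the die areas above include readout circuitry, so the light-gathering array is necessarily smaller than the die):

```python
# Active pixel-array area implied by a 12 MP sensor at each pixel pitch.
MEGAPIXELS = 12e6  # both the X and the XS shoot 12 MP stills

def active_area_mm2(pitch_um: float, pixels: float = MEGAPIXELS) -> float:
    """Area of the light-gathering array alone, in mm^2."""
    pitch_mm = pitch_um / 1000.0
    return pixels * pitch_mm ** 2

print(f"X  (1.22 um): ~{active_area_mm2(1.22):.1f} mm^2")  # ~17.9 mm^2
print(f"XS (1.40 um): ~{active_area_mm2(1.40):.1f} mm^2")  # ~23.5 mm^2
print(f"Per-pixel light gain: +{(1.40 / 1.22) ** 2 - 1:.0%}")  # +32%
```

The remaining die area in both cases goes to circuitry around the array, which is why the arrays come out smaller than the 32.8 mm² and 40.6 mm² die measurements.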

That’s Apple’s brand name for phase detection autofocus (PDAF) points, found in plenty of other devices. The basic idea is that you mask off half a sub-pixel every once in a while (which I guess makes it a sub-sub-pixel), and by observing how light enters these half-covered detectors you can tell whether something is in focus or not.
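The principle can be sketched in a toy 1D simulation (purely illustrative, not Apple’s implementation): the left- and right-masked detectors see the scene through opposite halves of the lens, so defocus slides their two views apart in opposite directions, and the shift that best re-aligns them tells the autofocus how far, and which way, to move the lens.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.standard_normal(256)  # a textured 1D "scene"

def pdaf_disparity(defocus_px: int) -> int:
    """Estimate the shift between the two half-aperture views."""
    # Out of focus, the left- and right-masked views slide apart in
    # opposite directions, in proportion to the defocus.
    left = np.roll(scene, defocus_px)
    right = np.roll(scene, -defocus_px)
    # The shift with peak cross-correlation best re-aligns the views.
    shifts = list(range(-20, 21))
    scores = [float(np.dot(np.roll(left, s), right)) for s in shifts]
    return shifts[int(np.argmax(scores))]

print(pdaf_disparity(0))  # in focus: the views align at shift 0
print(pdaf_disparity(5))  # defocused: the views are 10 px apart
```

Contrast-detection autofocus has to hunt back and forth to find peak sharpness; the appeal of phase detection is that a single readout like this gives both the direction and the magnitude of the correction.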

Of course, you need a bunch of them to sense the image patterns with high fidelity, but you have to strike a balance: losing half a pixel may not sound like much, but do it a million times and that’s half a megapixel of light gathering effectively down the drain.

Wondering why all the PDAF points are green? Many camera sensors use an “RGBG” (Bayer) sub-pixel pattern, with two green sub-pixels for each red and blue one, in part because our eyes are most sensitive to green. With twice as many green sub-pixels, the green channel is more robust to losing a bit of information.

Apple introduced PDAF in the iPhone 6, but as you can see in TechInsights’ great diagram, the points are pretty scarce: one for maybe every 64 sub-pixels. Not only that, they’re all masked off in the same orientation, either the left or right half gone.
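Both of those points reduce to simple arithmetic (the counts here are the illustrative ones from the text, not exact figures for any particular sensor):

```python
import numpy as np

# Why green? A Bayer ("RGBG") mosaic tile has two green photosites
# for every red and blue one:
bayer_tile = np.array([["R", "G"],
                       ["G", "B"]])
green_share = (bayer_tile == "G").sum() / bayer_tile.size
print(f"Green share of photosites: {green_share:.0%}")  # 50%

# And the cost side: each PDAF site gives up half of one photosite,
# so a million sites forfeit about half a megapixel of light gathering.
pdaf_sites = 1_000_000
print(pdaf_sites * 0.5 / 1e6, "MP effectively lost")  # 0.5 MP
```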

The 6S and 7 Plus saw the number double, to one PDAF point per 32 sub-pixels, and the 8 Plus improved it again, to one per 20. But there was another addition: the phase detection masks now sit on the tops and bottoms of some sub-pixels as well. As you can imagine, doing phase detection in multiple directions is a more sophisticated proposition, but it can also significantly improve the accuracy of the process. Autofocus systems all have their weaknesses, and this may have addressed one Apple regretted in earlier iterations.

Which brings us to the XS (and Max, of course), where the PDAF points now number one per 16 sub-pixels: the vertical phase detection points have grown in frequency until they’re equal in number to the horizontal ones. Clearly the experiment paid off, and any consequent light loss has been mitigated or accounted for.
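Tallying that progression up (a sketch: the megapixel counts are the phones’ stated still resolutions, and the “one per N sub-pixels” densities are the figures quoted above from TechInsights’ imagery):

```python
# Rough PDAF site counts and light cost implied by each generation.
models = [
    # (model, photosites, sub-pixels per PDAF point)
    ("iPhone 6",       8_000_000, 64),
    ("iPhone 6S / 7", 12_000_000, 32),
    ("iPhone 8 Plus", 12_000_000, 20),
    ("iPhone XS",     12_000_000, 16),
]

for model, photosites, per_point in models:
    sites = photosites // per_point
    masked = 0.5 / per_point  # each PDAF site masks half a photosite
    print(f"{model:>13}: {sites:>7,} PDAF points, "
          f"{masked:.2%} of light-gathering area masked off")
```

The masked fraction climbing toward a few percent is presumably where the “light loss has been mitigated or accounted for” trade-off comes in: the larger 1.4-micron pixels give back more light than the extra masks take away.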

I’m curious how the sub-pixel patterns of Samsung, Huawei and Google phones compare, and I’m looking into it. But I wanted to highlight this interesting little evolution. It’s a good example of the kind of change that’s hard to grasp when explained in simple number form (we’ve doubled this, there are a million more of that) but makes sense when you see it in physical form.
