Apple patent hints at massive leap in camera sensor tech, nearing human eye dynamic range

Close-up of iPhone 14 Pro camera module

Apple is investigating new image sensor technology that promises up to 20 stops of dynamic range. That’s a level that surpasses the ARRI ALEXA 35, and gets really close to matching the dynamic range of the average human eye. Here’s what that actually means.

1,048,576:1

A newly published patent, “Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise,” first spotted by Y.M.Cinema Magazine, reveals Apple’s plans for a next-generation sensor that rivals the dynamic range of current professional cinema cameras.

The patent details a stacked sensor design promising up to 20 stops of dynamic range, which is the ratio between the largest and smallest amounts of light that can be captured simultaneously without loss of detail. It is measured in “stops”, where each stop represents a doubling or halving of light.

So, a 20-stop dynamic range would essentially mean a 1,048,576:1 contrast ratio with no lost light or shadow in the same picture.
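The arithmetic behind that figure is simple: each stop doubles the ratio, so n stops correspond to a 2ⁿ:1 contrast ratio. A quick sketch:

```python
def stops_to_contrast_ratio(stops: int) -> int:
    """Each stop doubles the light ratio, so n stops -> 2**n : 1."""
    return 2 ** stops

print(stops_to_contrast_ratio(20))  # 1048576, i.e. 1,048,576:1 -- the patent's claimed range
print(stops_to_contrast_ratio(12))  # 4096 -- roughly what CineD measured on the iPhone 15 Pro Max
```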

A complicated thing to measure

For reference, while there is no official dynamic range specification published for the iPhone 16 Pro Max sensor, here’s CineD’s comprehensive estimation of the iPhone 15 Pro Max 24mm camera, which they measured using three different techniques: waveform test (“how many stops can be identified above the noise floor”), IMATEST (“signal to noise ratio for each stop”) and latitude test (“the capability of a camera to retain colors and detail when over- or underexposed”).

The results:

“The waveform shows around 11 stops above the noise floor. Speaking of which, there is almost no such thing as a noise floor – everything is super clean, hinting at massive noise reduction happening internally (there is no way to turn this ‘off’)”

And

At ISO55: “We are getting 12 stops of dynamic range in the iPhone 15 Pro (Max) for a signal to noise ratio (SNR) of 1, and the same 12 stops for a signal to noise ratio of 2. Also for the ‘slope based DR’. This is a clear sign of ‘too much’ noise processing for IMATEST to calculate a meaningful result. It also becomes obvious in the lowest diagram where ‘Noise (% of max pixel)’ is shown. Noise values for the shadow stops are super low.”

(…)

At ISO1200: “IMATEST calculates (higher) 13.4 stops at SNR = 2 and 13.4 stops at SNR = 1.”

And

At ISO55: “We get 5 stops of exposure latitude (3 above to 2 under). This is actually 2 if not 3 stops below the current crop of consumer APS-C or full frame cameras. Comparing it to the aforementioned ARRI Alexa Mini LF, it is 5 stops less exposure latitude. And compared to the Alexa 35, the difference is even seven stops.”

Meanwhile, most estimates put the human eye’s instantaneous dynamic range at around 10-14 stops, reaching 20-30 stops after pupil and retinal adjustments.

A two-layered approach that could spawn new product categories

While Apple has long relied on Sony for its camera sensors, this patent suggests the company might be cooking up something far more ambitious in-house, from the silicon up.

According to the patent, Apple’s architecture combines two layers:

  • A sensor die, where light is captured via photodiodes and custom analog circuitry
  • A logic die, where processing happens, including built-in noise suppression

As Y.M.Cinema Magazine notes, this stacked layout is not entirely new in the industry. Sony reportedly uses something similar. But Apple’s approach brings a couple of twists:

  • First, it includes a mechanism called LOFIC (Lateral Overflow Integration Capacitor), which allows each pixel to store light across three distinct charge levels depending on the brightness of the scene.
  • Second, each pixel includes its own current memory circuit, which measures and cancels out thermal noise in real time, eliminating the need for post-processing cleanup tricks. And interestingly, Apple is achieving this with a three-transistor (3T) pixel structure, rather than the more complex and less noise-prone 4T.
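To make the LOFIC idea concrete, here’s a deliberately simplified toy model (all numbers and names are invented for illustration, not taken from the patent): when a pixel’s photodiode saturates in a bright scene, the excess charge spills into a lateral overflow capacitor instead of being clipped, which is what extends the usable range.

```python
# Toy model of LOFIC-style overflow. Capacities are made-up illustrative figures.
def capture(photons: float, pd_full_well: float = 10_000.0,
            lofic_capacity: float = 1_000_000.0) -> tuple[float, float]:
    """Split incoming charge between the photodiode and an overflow capacitor."""
    pd_charge = min(photons, pd_full_well)                          # photodiode fills first
    overflow = min(max(photons - pd_full_well, 0.0), lofic_capacity)  # excess spills laterally
    return pd_charge, overflow

# Dim scene: everything fits in the photodiode, which can be read at high gain for low noise.
print(capture(500.0))        # (500.0, 0.0)
# Bright scene: the photodiode clips, but the capacitor preserves the highlight information.
print(capture(250_000.0))    # (10000.0, 240000.0)
```

In the real design the readout would pick between (or combine) the charge stores per pixel; this sketch only shows why overflow storage raises the ceiling on capturable light.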

As this Reddit discussion helps explain, by stacking the sensor on top of a logic chip, Apple effectively gives each pixel its own discrete shutter, and processes the image for noise reduction before it even leaves the die.
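The per-pixel “current memory” idea is conceptually similar to correlated double sampling: capture the pixel’s noise offset as a reference, then subtract it from the actual signal read. A minimal sketch of that principle (an idealized model, not Apple’s circuit):

```python
import random

def read_with_reference(signal: float, noise_sigma: float = 50.0) -> float:
    """Idealized noise cancellation: the same random offset corrupts both the
    reference sample and the signal sample, so subtraction removes it."""
    offset = random.gauss(0.0, noise_sigma)  # thermal/reset offset, stored as the reference
    raw = signal + offset                    # signal read carries the same offset
    return raw - offset                      # subtracting the stored reference cancels it

print(read_with_reference(1000.0))  # 1000.0 -- clean signal, noise removed at the pixel
```

Real circuits only cancel the noise components that are correlated between the two samples, but the sketch shows why doing this on-die can replace post-processing cleanup.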

What does all of this mean, product-wise?

If this sensor ever makes it into a shipping product, it could enable Apple to leapfrog not just its smartphone competitors, but pro camera makers like Sony, Canon, or RED in certain key metrics.

Throw in the Neural Engine and the other tricks enabled by Apple’s tight hardware-software integration, and it wouldn’t be a huge stretch to imagine Apple taking this beyond the iPhone and building a full-blown camera of its own, beyond the hacked-together rig it used for “F1 The Movie“.

Here’s what Y.M.Cinema Magazine thinks of what this may bring:

If this tech is implemented — perhaps in a future iPhone 17 Pro or Apple Vision Pro 2 — it could lead to:

  • Cinematic HDR on mobile devices
  • Real-time noise-free video capture
  • Professional-quality imaging in ultra-thin form factors with a very high DR (20-stops of Dynamic Range)

and

Dynamic range and noise are the two main limiting factors in digital imaging. A mobile or compact sensor offering 20 stops of dynamic range and advanced on-chip noise suppression is not just an improvement — it’s disruptive.

This could impact:

  • Mobile cinematography
  • HDR streaming content
  • AR/VR visual fidelity
  • Even professional filmmaking kits where compactness and quality must coexist

As always, take patents with a grain of salt

While this is exciting stuff, a patent is just a patent. We’ve seen time and time again Apple register patents for tech and products that never panned out, so don’t expect anything like this to come out any time soon.

And judging from the comments on the original Y.M.Cinema Magazine story and on Reddit, many valid technical questions remain unanswered, some of which seem to border on what is scientifically possible. Camera aficionados are among the most engaged and detail-oriented people out there, so the skepticism is not surprising; if anything, it’s welcome for the extra perspective it brings.

Still, it is interesting to see Apple looking at its in-house chip-making chops and scoping out where else they can be put to work.

Do you think Apple might be able to pull this off? Do you care? Let us know in the comments.
