“The day smartphones surpass the human eye may be near.”
A recently filed patent by Apple has sparked widespread interest.
Its title? “Image sensor with stacked pixels having high dynamic range and low noise.” A mouthful, even for those in tech—but in short, it’s a sensor that’s incredibly good at capturing light and shadow.
According to a report by YMCinema Magazine, this new sensor could achieve a staggering dynamic range of 20 to 30 stops.
To put that into perspective:
- Standard smartphone cameras: 10–13 stops
- High-end video camera (Sony α7S III): about 15 stops
- Human eye: approximately 20 stops
Even 20 stops is impressive—pushing into 30 would mean we’re entering satellite-grade imaging territory.
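For context, stops are logarithmic: each additional stop doubles the ratio between the brightest and darkest light a sensor can distinguish. A rough back-of-the-envelope sketch using the approximate figures above (real measured dynamic range also depends on the noise floor and test method):

```python
# Each stop of dynamic range doubles the brightest-to-darkest contrast ratio.
# Rough illustration only, using the approximate figures quoted above.
examples = [
    ("Standard smartphone camera", 13),
    ("Sony α7S III (high-end video)", 15),
    ("Human eye (approx.)", 20),
    ("Patent's upper claim", 30),
]

for label, stops in examples:
    ratio = 2 ** stops
    print(f"{label:32s} {stops:2d} stops ≈ {ratio:,}:1")
```

In other words, 30 stops would mean telling apart a contrast ratio of roughly a billion to one within a single capture.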
The Core Technology: Stacked Pixels
At the heart of this patent is a “stacked pixel structure.”
In simple terms, it stacks multiple light-detecting layers within a single pixel, so bright areas can be captured without blowing out while dark areas are still recorded accurately, combining the best of both worlds.
Even more astonishing is the claim that image noise (graininess) can be significantly reduced.
This means low-light shots could be bright and clear, and harsh backlighting could be rendered naturally—bringing us closer to the ideal of “what you see is what you get.”
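The report doesn't spell out the readout math, but the general idea behind merging a high-sensitivity and a low-sensitivity reading from the same stacked pixel can be sketched roughly as follows. The values, gain ratio, and function here are purely hypothetical and are not Apple's patented algorithm:

```python
# Hypothetical illustration of per-pixel HDR merging; not Apple's patented method.
# Assume each pixel yields two readings: a high-gain one (clean in shadows but
# clips in bright light) and a low-gain one (keeps highlights but is noisier).
FULL_WELL = 4095   # assumed saturation level of the high-gain reading (12-bit)
GAIN_RATIO = 16    # assumed sensitivity ratio between the two readings

def merge_pixel(high_gain: int, low_gain: int) -> float:
    """Return a single linear-light value from two stacked readings."""
    if high_gain < FULL_WELL:
        # High-gain reading is not clipped: use it directly for clean shadow detail.
        return float(high_gain)
    # High-gain reading clipped: fall back to the low-gain reading, rescaled.
    return float(low_gain) * GAIN_RATIO

# Example: a shadow pixel keeps its detail; a blown-out highlight is recovered.
print(merge_pixel(high_gain=120, low_gain=8))     # -> 120.0
print(merge_pixel(high_gain=4095, low_gain=900))  # -> 14400.0
```

Combining two readings this way extends the usable range well beyond what either reading covers alone, which is the basic promise behind the "best of both worlds" claim.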
For iPhones? Or Apple Vision Pro?
Where might this technology end up?
- Next-gen iPhone cameras: Very likely. But with a 30-stop range, it might be considered overkill.
- Sensors for spatial computing devices like Apple Vision Pro: This might be the primary use case. To replicate the realism of human vision through a headset, ultra-high-performance sensors are essential.
- Professional-grade video equipment in partnership with Sony or RED: Some speculate this could mark Apple's entry into the professional video space.
Patents Don’t Always Mean Products
Of course, a patent doesn’t guarantee a finished product. Every year, countless promising ideas vanish after filing.
However, this particular patent is attributed to Vladimir Koifman, a well-known expert in the sensor field. His involvement alone makes the chances of this becoming reality much higher.
From “Capturing” to “Experiencing”
If this technology does make it into a future iPhone, it could fundamentally change how we take photos.
We may finally be able to capture the world exactly as we see it—or even better.
At that point, photography wouldn’t just be about documentation—it would be about recreating memory.
Will Apple once again redefine what cameras can be?
We’ll be watching closely as the next generations of iPhone and Vision Pro unfold.
If Apple is truly aiming for a sensor that rivals human vision, it signals a shift from taking photos to feeling them.
At their next event, Tim Cook might just stand on stage and say with pride:
“This changes everything.”