Thursday, August 01, 2013

Are we at the limit of resolution improvements that people can notice?




The math behind the Retina display being kinda like our eyes.

Image via KyberVision.

I saw two things over the past two weeks that made me question whether we humans had reached some kind of landmark. They were not, thankfully, YouTube comments.

One was the Ubuntu Edge crowdfunding campaign. I wrote about how the cutting-edge smartphone was so powerful, and so custom-built for certain purposes, that it resembled the kind of bespoke suit one buys on Savile Row. The only thing restrained or modest about the Edge is this take on the screen:

We also believe the race for ever higher resolution has become a distraction. Beyond 300ppi you're adding overhead rather than improving display clarity. We think colour, brightness and dynamic range are now the edge of invention so we'll choose a display for its balance of resolution, dynamic range and colour accuracy.

The other thing was a note from Kevin Tofel, writer at GigaOM, pondering the leaked specifications for the upcoming Moto X phone. Some might say they're not quite cutting-edge, but Tofel begs to differ, at least on the screen and screen-driving power bits:

I'm basing that thought on the little bit of time I spent with the Droid Ultra lineup. These phones too had 720p displays, but you could have fooled me: They looked super crisp to my eyes which thought they were seeing a 1080p screen.

Are Motorola and Ubuntu onto something? Are we floating around the limit of screen improvements that people can actually notice?

In introducing Apple's trademarked "Retina" displays, Steve Jobs made the notable claim that the human eye cannot resolve (that is, differentiate) individual pixels on the display at a typical viewing distance (hence the name). Because of the different viewing distances and screen sizes, that works out to:

326 ppi for the smallest devices (iPhone and iPod Touch), 264 ppi for mid-sized devices (iPad) and 220 ppi for larger devices (MacBook Pro).

Phil Plait, who writes the Bad Astronomy blog for Discover Magazine (and who helped calibrate the Hubble space telescope camera), argues that those numbers, and Jobs' claim, are basically right. Resolution, you see, is a matter of how close together two things can be before you can no longer tell them apart. The better your eyesight, the finer the detail you can distinguish: pixels, trees in the distance, surfaces on the moon. At the 12-inch mark Jobs staked out, then, Plait believes that Jobs was right, at least for people with normal or average vision:

Let me make this clear: if you have perfect eyesight, then at one foot away the iPhone 4's pixels are resolved. The picture will look pixellated. If you have average eyesight, the picture will look just fine.
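To put rough numbers on that, here is a back-of-the-envelope sketch in Python. It assumes the textbook figure of one arcminute of resolving power for 20/20 vision, which is the convention behind these calculations rather than anything taken verbatim from Plait's post:

```python
import math

def ppi_limit(viewing_distance_in, acuity_arcmin=1.0):
    """Pixel density (ppi) at which adjacent pixels sit right at the eye's
    resolution limit for a given viewing distance.
    acuity_arcmin: smallest separable angle; ~1 arcminute is the usual
    figure for 20/20 vision (an assumption, not a measured value)."""
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_pitch_in

# Jobs' 12-inch phone distance, 20/20 vision:
print(round(ppi_limit(12)))  # ~286 ppi, so the iPhone's 326 ppi clears it

# Distances at which Apple's three Retina densities hit that same limit:
for ppi in (326, 264, 220):
    distance_in = 1.0 / (ppi * math.tan(math.radians(1.0 / 60.0)))
    print(ppi, round(distance_in, 1))  # ~10.5", ~13.0", ~15.6"
```

On those numbers, Apple's three Retina thresholds roughly correspond to a phone held at 10 or 11 inches, a tablet at 13 inches, and a laptop at 15 or 16 inches, which lines up reasonably well with how people actually hold them.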

Pablo Artal is a professor of Optics at the University of Murcia in southeastern Spain. People have, of course, asked him about the "Retina" and our retinas. To summarize his response: nobody is ever viewing a Retina display with truly perfect vision under perfectly lit laboratory conditions, so its resolution is quite good enough:

Under normal viewing conditions only a few subjects would be able to see the pixel details. On the contrary, under normal viewing conditions a majority of subjects can see the pixels in the old iPad2. So, the difference is really evident. For this type of tablet device the resolution in the new Ipad is an excellent compromise and it is well matched to most eyes. The Apple's vision researcher consultant did a good job in this case!

Where is the other side in this debate? Mostly, it is people who assume Apple lies about everything, even provable things. But it also includes Raymond Soneira, president of DisplayMate Technologies, quoted in Wired by Brian X. Chen. Soneira said, as Chen paraphrases:

... The eye actually has an angular resolution of 50 cycles per degree. Therefore, if we were to compare the resolution limit of the eye with pixels on a screen, we must convert angular resolution to linear resolution. After conversions are made, a more accurate "retina display" would have a pixel resolution of 477 pixels per inch at 12 inches, Soneira calculated.
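That conversion is easy to reproduce. A sketch, assuming the standard Nyquist reading that one cycle requires two pixels (which may not be Soneira's exact arithmetic):

```python
import math

def ppi_from_cycles_per_degree(cpd, viewing_distance_in):
    # Two pixels per cycle (Nyquist), then convert the one-degree
    # visual angle into inches at the given viewing distance.
    pixels_per_degree = 2 * cpd
    inches_per_degree = viewing_distance_in * math.tan(math.radians(1.0))
    return pixels_per_degree / inches_per_degree

print(round(ppi_from_cycles_per_degree(50, 12)))  # ~477 ppi at 12 inches
```

The gap between roughly 286 and 477 ppi is, in effect, the gap between assuming ordinary 20/20 vision and assuming the sharpest acuity a human eye can manage under ideal conditions.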

Soneira is mostly taking on the idea that "Retina" means a screen that is just like your actual retina. And his case that there is room for resolution improvement assumes, as noted previously, that your eyeballs are working absolutely perfectly, that there is no signal decay between the thing you're looking at and your brain's visual processing gear, and, again, that you are viewing something in perfect lighting conditions.

And just to pile on, you can count William H.A. Beaudot, PhD, vision scientist at KyberVision, among those who hold that Retina-class displays are basically at the limit of what your eye can see:

Under this normal range of viewing conditions, Apple "Retina Display" would have the capacity to span the full range of normal visual acuity, from 20/20 at 10" to 20/12 at 18", further justifying Apple's claims.
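Beaudot's range can be checked with the same kind of arithmetic. A sketch, assuming the 326 ppi pixel pitch of the iPhone-class display he is discussing:

```python
import math

PIXEL_PITCH_IN = 1.0 / 326  # iPhone-class Retina display

for distance_in in (10, 18):
    # Visual angle of one pixel at this distance, in arcminutes.
    arcmin = math.degrees(math.atan(PIXEL_PITCH_IN / distance_in)) * 60
    # A 1-arcminute pixel sits at the 20/20 limit, so the matching
    # Snellen denominator is roughly 20 * arcmin.
    print('%d in: %.2f arcmin, about 20/%d' % (distance_in, arcmin, round(20 * arcmin)))
```

At 10 inches a pixel subtends just over an arcminute, right at the 20/20 limit; at 18 inches it subtends about 0.6 arcminutes, the 20/12 limit, which is roughly the range Beaudot describes.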

I'm finishing up this post on a Chromebook Pixel, which has, according to Google, the highest pixel density of any laptop: 239 pixels per inch across 12.85 inches. I can tell, just from looking at text, that it is far more crisp and less obviously computer-generated than text on a MacBook Air, or my Samsung Galaxy Nexus phone. But I cannot say it is noticeably better than the text on a nearby 13-inch MacBook Pro with Retina Display, with 227 pixels per inch.

There are always reasons to make things brighter and more colorful, and to test the limits of human perception (like, whoa, man). But in this case, I think Ubuntu and Motorola might be right: above a certain resolution, maybe around 300 pixels per inch, it's something of a mechanic's game. There are definitely more interesting and important visual aspects with which one might monkey around.
