PPI stands for Pixels Per Inch, a metric commonly used to describe the pixel density (sharpness) of all sorts of displays: cameras, computers, mobile devices, and more. It is important to understand what it really means in a world where visual computing quality has increased dramatically over the past decade, and where PPI has become a prime marketing tool.

1920 pixels vs 3840 pixels

PPI is an interesting metric, but it cannot be used by itself as a sharpness benchmark, because the distance between your eyes and the display matters as much as the pixel density itself. Bring a screen closer to your eyes and you will start to see individual pixels; move it farther away and the additional pixel density may not be perceptible at all. Smartphones are used much closer to the eyes than tablets; computer monitors sit a little farther away, and TVs and cinema screens are farther still. Because of this, each class of display requires a different PPI to achieve the same perceived sharpness from your point of view.
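The distance/density tradeoff above can be sketched with a few lines of Python: what actually matters is the visual angle a single pixel subtends. The device names, PPI values, and viewing distances below are illustrative assumptions, not figures from the article.

```python
import math

def pixel_angle_arcmin(ppi: float, distance_in: float) -> float:
    """Angular size of one pixel, in arc minutes, at a given viewing distance."""
    pixel_pitch = 1.0 / ppi                       # inches per pixel
    angle_rad = math.atan(pixel_pitch / distance_in)
    return math.degrees(angle_rad) * 60           # degrees -> arc minutes

# Hypothetical device/distance pairs (assumed values for illustration)
devices = [
    ("Smartphone, 300 PPI", 300, 12),   # held ~12 inches away
    ("Monitor, 110 PPI",    110, 24),   # ~24 inches away
    ("Large TV, 80 PPI",     80, 96),   # ~8 feet away
]
for name, ppi, dist in devices:
    print(f"{name:20s} at {dist:3d} in -> "
          f"{pixel_angle_arcmin(ppi, dist):.2f} arcmin/pixel")
```

The same PPI figure yields very different angular pixel sizes depending on distance, which is why a TV can get away with far fewer pixels per inch than a phone.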

What does 20/20 vision really mean?

Let’s look at how human visual acuity is measured. We have all heard of “20/20 vision”, and it would make sense to think that it means “perfect” or “maximum” vision, but that’s not true at all. The 20/20 vision test comes from the Snellen chart (on the right), which was invented in 1860 as a means of measuring visual acuity for medical purposes. This is important because Snellen was trying to spot low vision, which is a medical problem. No medical patient has ever complained of having above-average visual acuity.

20/20 vision actually means that you have “normal” vision: it assumes that most humans can read all the letters on the chart at a distance of 20 feet (about 6 meters). In short, 20/20 really means “average” vision. People with poor vision can only read the top row of letters at 20 feet, while most of the population can read it from much farther away.

The 300 PPI “myth”


The 300 PPI limit is just marketing.

You may have heard many times that the human eye cannot distinguish details beyond 300 PPI. That claim circulated for years in discussions of print work, and the launch of the iPhone 4 brought the same myth into the mobile world.

The previous paragraph is key to understanding the 300 PPI claim made when the iPhone 4 came to market. Apple’s CEO Steve Jobs implied on stage that the human eye could not perceive sharpness beyond 300 PPI in the context of smartphone usage.

Steve Jobs assumed that you hold your phone or tablet at 10-12 inches from your eyes. There was a lot of controversy, but astronomer Phil Plait wrote a good article arguing that it depends on how you look at it, taking a less polarizing position than many of the articles that came out at the time.

Mr. Jobs’ 300 PPI claim may be remotely true only if you use 20/20 vision as the reference. But the (big) caveat is that 20/20 vision does not represent “perfect” vision, not by a long shot. The real limits of human vision are much higher: possibly closer to 900 PPI or more, depending on whom you ask. Research from Sun Microsystems estimated the limit to be at least 2X what 20/20 vision implies (pdf link), and Sharp thinks that humans can see up to 1000 PPI (pdf link).
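As a rough sanity check on these numbers, here is a small sketch that converts a visual acuity (the smallest angle one pixel may subtend, in arc minutes) into the maximum resolvable PPI at a given viewing distance. The 12-inch distance and the acuity values are assumptions chosen to mirror the estimates quoted above.

```python
import math

def ppi_limit(acuity_arcmin: float, distance_in: float) -> float:
    """Max PPI a viewer can resolve if one pixel subtends `acuity_arcmin`."""
    angle_rad = math.radians(acuity_arcmin / 60)   # arc minutes -> radians
    return 1.0 / (distance_in * math.tan(angle_rad))

# Assumed acuities, evaluated at a 12-inch viewing distance
for label, arcmin in [("20/20 (1.0')", 1.0),
                      ("2x sharper (0.5')", 0.5),
                      ("Very sharp (0.3')", 0.3)]:
    print(f"{label:18s} -> ~{ppi_limit(arcmin, 12):.0f} PPI")
```

With 1 arc minute (roughly 20/20), the limit at 12 inches comes out near 286 PPI, close to the iPhone 4’s figure; halving the angle doubles the PPI limit to about 573, and 0.3 arc minutes lands near the ~1000 PPI range Sharp mentions.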

No scientific consensus, but research points to higher sharpness limits

The limits of human vision are still under active research, but as with other human physical capabilities, there is an upper limit that applies to the large majority of the population. First, though, it is necessary to understand how visual acuity is measured from the eye’s point of view. The most common metric is the “arc minute” (or “minute of arc”).

Arc minutes measure the size of things we see in terms of visual angle. This is convenient because it lets us express the size of objects as perceived by our eyes, regardless of where they are in space. Some have proposed a metric that may be easier to grasp: pixels per (visual) degree. In that metric, 20/20 vision is roughly equivalent to 58 pixels per degree of vision. Sony cites NHK research that measured human visual acuity at 312 pixels per degree, while research from NASA mentions 0.5-1.0 arc minutes.
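The relationship between these metrics can be sketched as follows: an acuity in arc minutes converts to pixels per degree (60 arc minutes per degree), and pixels per degree converts to PPI once you fix a viewing distance. The small-angle approximation and the 12-inch distance in the usage line are assumptions, not figures from the cited research.

```python
import math

def arcmin_to_ppd(acuity_arcmin: float) -> float:
    """Pixels per visual degree for a given acuity (small-angle approximation)."""
    return 60.0 / acuity_arcmin

def ppd_to_ppi(ppd: float, distance_in: float) -> float:
    """PPI needed at `distance_in` so the display delivers `ppd` pixels/degree."""
    # Width, in inches, of a 1-degree slice of the visual field at that distance
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppd / inches_per_degree
```

For example, `ppd_to_ppi(58, 12)` gives roughly 277 PPI, which is why 58 pixels per degree and the ~300 PPI smartphone claim describe essentially the same 20/20 assumption; NHK’s 312 pixels per degree would imply several times that.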

While there is no definitive answer to the question, most research points to the fact that 300 PPI does not represent the human visual acuity limit in the context of smartphone displays.

How higher PPI may benefit you (or not)

Since the initial emergence of high-DPI displays with the iPhone 4, we know from experience that the human eye can see beyond 300 DPI. How far displays will go remains to be seen, and we would agree that there is a point of diminishing returns.

In the end, it depends on your own vision: in our experience, most people who own smartphones with a PPI higher than 300 can perceive a difference in sharpness. That is especially true when looking at photos of nature scenes, or simply at text and icons.

What’s important is that you understand that perceiving details beyond 300 PPI is not some kind of super-human feat, a gift of nature to a few of us. Chances are that you are able to see much more detail than what the 20/20 chart was intended to measure in 1860.

Finally, the level of detail that a display can output is not only about what we can consciously pay attention to. Japan’s NHK researchers point out that smaller pixels and finer detail make the overall image look much more real. That is probably why many people say that 4K TV seems more “real” than 3D TV.
