The Resolution of the Human Eye

June 29, 2006

What is the resolution of the human eye, does anyone know?

All these new technologies are increasing video resolution. But what’s the max resolution from which humans can benefit? Put another way, at what resolution does it not matter anymore? At what resolution are any further increases indistinguishable to humans?

Just wondering.



  1. Chris says:

    C'mon Deane. A simple Google search yielded this as the first result:

    Goes into some nice detail about not only the resolution, but things like frame rate and light sensitivity as compared to cameras.

    A brief excerpt:

    Based on the above data for the resolution of the human eye, let's try a "small" example first. Consider a view in front of you that is 90 degrees by 90 degrees, like looking through an open window at a scene. The number of pixels would be (90 degrees * 60 arc-minutes/degree / 0.3) * (90 * 60 / 0.3) = 18,000 * 18,000 = 324,000,000 pixels (324 megapixels).

    At any one moment, you actually do not perceive that many pixels, but your eye moves around the scene to see all the detail you want. But the human eye really sees a larger field of view, close to 180 degrees. Let's be conservative and use 120 degrees for the field of view.

    Then we would see 120 * 120 * 60 * 60 / (0.3 * 0.3) = 576 megapixels.

    The full angle of human vision would require even more megapixels. This kind of image detail requires a large-format camera to record.
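
The excerpt's arithmetic is easy to reproduce. A minimal sketch, assuming (as the excerpt does) that one "pixel" of the eye covers about 0.3 arc-minute:

```python
# Clarkvision-style megapixel estimate for a square field of view.
# Assumes ~0.3 arc-minute of resolving power per "pixel" of the eye.

def eye_megapixels(fov_degrees, acuity_arcmin=0.3):
    """Total pixels (in megapixels) across a square field of view."""
    pixels_per_side = fov_degrees * 60 / acuity_arcmin  # degrees -> arc-minutes -> pixels
    return pixels_per_side ** 2 / 1e6

print(eye_megapixels(90))   # ~324 MP for the 90x90 degree window
print(eye_megapixels(120))  # ~576 MP for the conservative 120-degree field
```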

  2. Matt R. says:

    I think this is like comparing apples to oranges. However it is interesting.

    The resolution of an image is highly dependent on the dpi of the output device for which it is intended. For example, some typical numbers for output devices:

    monitors ~72 dpi
    magazines ~300 dpi
    billboards ~45 dpi

    A low resolution image can look outstanding on a monitor but very poor when printed in a magazine. You've all seen the cheesy ads in the back of magazines with the blocky photos right?

    You can print an 8x10 photo from a 3.2 MP camera with good results. We could easily have stopped development there, but people were conditioned to look for the most megapixels they could find on a camera. OK, higher-megapixel sensors are more forgiving because they allow for poor composition on the front end and more cropping on the back end. But let's be realistic: how many times does the average consumer print 8x10s? It's typically 4x6 these days, isn't it?

    To draw a parallel, I think 324 megapixels for the human eye is a bit of a stretch. By that logic, I could say I have a 3.2 megapixel camera, but since I keep moving it around, it's actually much more. You could put a wide-angle lens on a 2.0 megapixel camera and capture the same 90 degrees by 90 degrees, and if the scene were bright enough you could push the depth of field to near infinity and have everything in focus at once - the eye can't do that. Apples vs. oranges...
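
As a rough check of the 8x10 claim above - assuming the 3.2 MP camera produces a 2048x1536 image, a typical sensor size not stated in the comment:

```python
# Effective print dpi of a fixed-size image at two common print sizes.
# The 2048x1536 dimensions are an assumed typical 3.2 MP sensor output.

def print_dpi(px_long, px_short, inches_long, inches_short):
    """Effective dpi of a print, limited by the sparser axis."""
    return min(px_long / inches_long, px_short / inches_short)

print(round(print_dpi(2048, 1536, 10, 8)))  # 8x10 print: ~192 dpi
print(round(print_dpi(2048, 1536, 6, 4)))   # 4x6 print: ~341 dpi, above magazine quality
```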

  3. Brenda Helverson says:

    The Gigapixel Project explored this topic in a lot of detail. This project was an attempt to create a camera with each component approaching or exceeding the maximum possible performance.

    Here's the page discussing eye response.

    I haven't plowed through this information in a while, but the eye response is characterized by a MTF (Modulation Transfer Function) and a resolution in terms of cycles/mm.

    IIRC, the limiting performance factor was the response of the red film layer.

  4. Mike says:

    From The Limits of Human Vision Michael F. Deering Sun Microsystems

    This is the conclusion "A model of the perception limits of the human visual system was given, resulting in a maximum estimate of approximately 15 million variable-resolution pixels per eye. Assuming 60 Hz stereo display with a depth complexity of 6, it was estimated a rendering rate of approximately ten billion triangles per second is sufficient to saturate the human visual system."
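
The quoted conclusion can be roughly re-derived from the numbers given. This is my reading of the arithmetic; Deering's paper defines the model in full:

```python
# Back-of-the-envelope version of Deering's saturation estimate,
# treating each pixel fill per frame as roughly one rendered triangle.

pixels_per_eye = 15e6      # maximum variable-resolution pixels per eye
eyes = 2                   # stereo display
refresh_hz = 60            # 60 Hz
depth_complexity = 6       # average overdraw per pixel

fills_per_second = pixels_per_eye * eyes * refresh_hz * depth_complexity
print(f"{fills_per_second:.2e}")  # ~1.08e10, i.e. roughly ten billion per second
```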

  5. Chris says:

    That Gigapxl camera is a fascinating piece of equipment. The resolution and detail available from that camera are, literally, unmatched by any other camera ever made. It's definitely worth taking some time to browse the image gallery on their site, and to read a little bit about their design and problem solving approach towards creating the camera.

  6. Laura says:

    Look at a ruler at arm's length: you can tell millimeters apart easily, but below that it's difficult, so the lateral spatial resolution is roughly 0.1 mm.

  7. Applying the Rayleigh criterion, an eye with a 3 mm pupil should resolve 2.2 mm at 10 m away. The Dawes limit puts this at 1.9 mm. These limits probably apply to point sources, but definitely not to normal viewing. Experiments show that the eye can comfortably achieve such resolutions only from about 3 m away from the object. That's 70% less than "advertised". Check it out for yourself!
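
Those diffraction-limit figures can be reproduced with the standard formulas. A sketch assuming green light at 550 nm, a common textbook choice not stated in the comment:

```python
# Rayleigh criterion and Dawes limit for a 3 mm pupil, converted to
# linear separation at 10 m. Wavelength (550 nm) is an assumed value.
import math

wavelength_m = 550e-9   # green light
pupil_m = 3e-3          # 3 mm pupil
distance_m = 10.0

rayleigh_rad = 1.22 * wavelength_m / pupil_m                # Rayleigh criterion
dawes_rad = math.radians(115.8 / (pupil_m * 1000) / 3600)   # Dawes: 115.8/D(mm) arcsec

print(f"Rayleigh at 10 m: {rayleigh_rad * distance_m * 1000:.1f} mm")  # ~2.2 mm
print(f"Dawes at 10 m:    {dawes_rad * distance_m * 1000:.1f} mm")     # ~1.9 mm
```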

  8. From what I am reading, these people are making educated guesses, because there is some unquantifiable element at play here. I think the comparison of apples to oranges is perfectly accurate.

    From an unrelated topic, I think I have pulled what MIGHT be an answer:

    1,000,000,000,000,000,000,000,000 (10^24) FLOPS is, theoretically, the amount of computing power required to simulate a human brain. In theory, I would imagine that any image delivering more digital information than this would be beyond the perceptive range of humans - again, theoretically.

    But I am not an expert. I find it hard to believe that the real world can be quantified digitally, though.

  9. Chris W says:

    Apples to oranges is right. A less subjective question, which can actually be answered scientifically, is: at what resolution can the human eye perceive a difference, given a constant viewing distance and constant viewing dimensions?

    An experiment could be set up where a random sample of people view the same image at the same size and the same distance. The image is then replaced by the same image, at the same dimensions, but at a higher resolution. The resolution at which a viewer can no longer tell that the image was replaced would be the usable "resolution of the human eye" for that viewing distance and those dimensions.

    Also, mind you, the human eye does not take an entire image in at once like a camera; it is constantly scanning, and your brain compiles a coherent image. Thus, if a viewer is specifically trying to tell the difference in resolution, their eyes will behave differently than, say, if they were watching a movie - unless the movie made the viewer focus intently on one patch of the screen, in the same way that a subject in this experiment who knows they are being tested on resolution perception might focus intently on one part of the image solely to tell the difference.
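
The swap experiment described above can be sketched as a simple search. The observer model below is entirely invented for illustration (the 150 dpi threshold is an arbitrary value, not a measured one):

```python
# Toy sketch of the resolution-swap experiment. Assumes a viewer can
# only detect a swap while the lower of the two resolutions is still
# below their acuity threshold; once both exceed it, they look identical.

def usable_resolution(resolutions, threshold_dpi=150):
    """Return the dpi at which a further increase goes unnoticed."""
    for lower, higher in zip(resolutions, resolutions[1:]):
        if lower >= threshold_dpi:  # both images already exceed the eye's acuity
            return lower            # swap undetectable; this is the usable limit
    return resolutions[-1]          # never saturated within the tested range

print(usable_resolution([72, 100, 150, 200, 300]))  # prints 150
```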

  10. Shai.P says:

    The human eye would have to be over 20,000p resolution, and we will not get that kind of technology for some decades.

  11. flash says:

    20,000p at what distance?

  12. As far as I have heard from a Danish 'eye-professor', the answer is between ½-1 arc second.

    An arch minute is one 100th of a degree. A arc second is 100th of a arc second - and that is the resolution of the eye.

    It all comes up to how far away the object you are looking at.

  13. fred bloggs says:

    The real issue with these types of estimates is that they fail to reflect both the differences in human vision between individuals and the known difference in clarity between vision that is in focus and vision that is peripheral.

    The average healthy person does not have perfect vision after the age of 25, and even if their vision was ever perfect, the actual focused and clear portion of their vision is very small.

    The actual area your eye can focus on is about the size of a thumbnail held at arm's length. Try this experiment to see (no pun intended):

    Hold a piece of printed text that you can read (a typeface large enough to read comfortably with whatever eyewear you may need, but no bigger than about 15 pt). Look at a word on the page that you can clearly read, then cover that word with your thumb. Try to read the text surrounding your thumb (WITHOUT MOVING YOUR EYES, YOUR THUMB, OR THE PAPER) - and if you haven't cheated somehow, you won't be able to.

    The point of that test is to show that your eyes really only focus clearly on a very small area; all the rest is your peripheral vision. Of course, when it comes to a TV/PC screen, the entire image needs to be in detail because you could be looking at any area, but the reality is that the distance at which the detail is presented has a lot to do with the level of clarity you can see.

    Because the focal length of your eyes and the size of the area in true focus are of such importance, they have to be taken into account when estimating what the resolution of your eyes actually is.

    When it comes to technology, it means that putting HD displays in hand-held devices is of benefit, while creating ever-increasing resolutions for ever-larger displays that inevitably get positioned further away is a waste of time. Think of it a bit like a billboard made with 1" pixels: it looks good at a distance but just looks like a wall of multicoloured squares when viewed close up.

    There is a point at which a very high resolution becomes overkill, though. For example, there are rumours that Apple is planning to put an HD 'Retina' display ('Retina' is an Apple pseudo-technology with no scientific basis) in a newer edition of the iPad 2, upgrading the existing 1024x768. If the pixel count is made 'Retina' - doubled, which is essentially what happens - the screen resolution becomes 2048x1536, which is of course greater than the existing 1080 HD (1920x1080), in a screen that is only 9.7" (equivalent to more than 211 dpi). Now consider that you don't hold an iPad at arm's length but much closer, so the relative dpi of the image is even higher.

    If you combine that arm's-length clarity estimate with how you view your TV, you can perform another test:

    Sit in front of your TV and, at arm's length, hold up a ruler and measure your screen against it. E.g., from 6 feet away, a 32" 1080p screen should appear roughly 8" wide, so the relative dpi of your screen's resolution is about 240 dpi.

    If I hold my iPad 2 at the distance I usually do and measure its relative dimension (it appears about 8" wide), the relative dpi of the image (1024x768) is just 128 dpi, and it still looks very good for photo and video viewing. If you were to do the same with the rumoured Retina iPad 2, the relative dpi would be 256; at an apparent 6", 341 dpi; and so on.
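
The "relative dpi" arithmetic in the comment above reduces to a single division. As a sketch, with the pixel widths taken from the comment and the apparent widths from the ruler test:

```python
# Relative dpi: pixels across the screen divided by its apparent
# (ruler-measured, arm's-length) width in inches.

def relative_dpi(pixels_wide, apparent_inches):
    """Apparent pixel density at the viewer's actual distance."""
    return pixels_wide / apparent_inches

print(relative_dpi(1024, 8))         # iPad 2 at a typical distance: 128 dpi
print(relative_dpi(2048, 8))         # doubled "Retina" panel, same distance: 256 dpi
print(round(relative_dpi(2048, 6)))  # held closer, apparent 6" width: ~341 dpi
```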

    The minimum focal distance of the human eye is roughly 7", so some of these displays' detail is simply wasted if they are used very close to the face.

    At the end of the day, your eye's resolution is both subjective and relative in nature and differs between individuals due to many factors - and although your eyes are very clever, they really don't work as well as the tech and media industries would like us all to believe.

  14. Malcolm LeCompte says:

    "the answer is between ½-1 arc second. An arch minute is one 100th of a degree. A arc second is 100th of a arc second - and that is the resolution of the eye. It all comes up to how far away the object you are looking at."

    The above statements are incorrect:

    An arc minute is 1/60th of a degree, an arc second is 1/60th of an arc minute.
    Angular resolution (acuity) is independent of the distance to the object being observed.

    The highest resolution (visual acuity) is about 0.5 - 1 arc minute, decreasing rapidly with distance away from the Fovea.
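
As a sketch of what that angular acuity means in linear terms - assuming ~1 arc-minute acuity and an arm's length of roughly 0.7 m (illustrative values, not from the comment):

```python
# Convert an angular acuity into the smallest resolvable separation
# at a given distance. Acuity itself is distance-independent; the
# linear detail it corresponds to is not.
import math

def resolvable_mm(acuity_arcmin, distance_m):
    """Smallest separation (mm) resolvable at the given acuity and distance."""
    theta = math.radians(acuity_arcmin / 60)  # arc-minutes -> radians
    return math.tan(theta) * distance_m * 1000

print(f"{resolvable_mm(1, 0.7):.2f} mm at arm's length (~0.7 m)")  # ~0.20 mm
print(f"{resolvable_mm(1, 3):.1f} mm at 3 m")                      # ~0.9 mm
```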

  15. Shai.P says:

    @flash 20,000p or more for viewing scenes - for instance, a scene like the Eiffel Tower or the Empire State Building. It would most likely be less than 20,000 indoors. The eye is analog but sends information to the brain digitally. As for refresh rate, to match the frames per second our eyes can see - and it does vary between people - a display would need over 60 fps. Because of motion it would have to be smoother than 24 fps; 24-50 fps strobes and irritates my eyes. I believe I could see 120 fps or more. Until that time, which is probably some decades away, enjoy your 1080p 120+ Hz TV.

  16. Derek Dawes says:

    So, for us less educated: basically, what you're saying is that "depth of field" is like looking at something in your home and noticing that your peripheral view is blurred, but as you change your view, you change your periphery. Could a TV or any object operate in a way that changes its periphery to trick us into thinking it's real? Or would this be accomplished by surrounding the person's entire peripheral view?

  17. Fred says:

    Good grief, you people are nerds. I haven't seen anything on here that even begins to answer what is a simple question. What he is asking comes down to: "At what number of dpi would it be useless to add more dpi, since we can't tell anymore?" Now, if you don't know, don't bother posting a long-winded excerpt of pseudo-intellectual b.s.

  18. Particle says:

    dear Fred,

    actually there's all the info here you need - or better: all the info a halfway smart person needs to get the picture. don't make the classic self-deceptive mistake of assuming there is a simple answer and, when not getting it, accusing smart people of being pseudo-intellectual. the question was and still is "what resolution does the human eye have?" and not "at what DPI can't you distinguish single pixels anymore?" the eye just does not work that way. the concept of DPI only makes sense at a fixed distance, as it only refers to the amount of pixels in a given area. move this area away from you and you get smaller pixels, but you will still have the same amount of pixels per inch. that is why it is measured in arc-minutes/seconds. dumbo.

  19. Hitesh says:

    Great reply Particle!!
