Visual perception

There's a lot of talk about ever more pixels in cameras (4K, 8K, etc.), in movies too. I found a site that has two videos. The first one is pretty short (10 minutes), showing identical scenes shot on six video cameras (apparently some or all used professionally). But the second video, which is over an hour, has been more interesting to me because they flip quickly back and forth between two systems.

The point the person is making is that once you get past a certain point, pixel count typically makes less difference than other aspects of moviemaking, such as differences between cameras and choices made in post editing. In some cases, you’d be hard-pressed to tell any difference between two videos even though there is a big difference in pixel count. And in some cases, it is the one with lower resolution that looks better. A big surprise was one example in the second video where they zoomed in a lot, exactly where you’d think more pixels would matter the most. Apparently that is not always so!

So, after experiencing the advantages of switching to a Mac Mini combined with a 43-inch screen, I think what I learned from these videos suggests that even when we can get 8K movies, they may well not actually matter, other than as an advertising gimmick to get you to spend far more money for improvements you likely won’t recognize.

http://yedlin.net/ResDemo/

2 Likes

There’s that old saying about camera stuff. A bad photographer cares about their camera. A good photographer cares about their lens. A great photographer cares about the light.

5 Likes

Pixels may not matter for viewing, but they do matter for the photographer. More pixels to work with in production, even if 4K is the final output, will make that output better. Some examples:

  • Cropping photos (crop away half the photo area and you still end up with 4K; see the quick sketch at the end of this post)
  • Dealing with image stabilization (electronic stabilization shifts the image around to offset camera movement, which eats up pixels)
  • Pan and scan (like cropping for photos, but for movies)

There are more technical issues associated with picture quality, but this should give you an idea of why higher resolution at data acquisition matters.
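
Here’s a rough sketch of the cropping math from the first bullet, assuming the standard UHD frame sizes (8K ≈ 7680×4320, 4K ≈ 3840×2160); the numbers are purely illustrative:

```python
# Rough sketch: how much cropping room an 8K capture leaves for a 4K delivery.
# Frame sizes assume the common UHD standards (illustrative only).
EIGHT_K = (7680, 4320)   # ~33.2 million pixels
FOUR_K = (3840, 2160)    # ~8.3 million pixels

def max_crop_fraction(src, dst):
    """Fraction of the source frame (by area) you can discard and still
    have at least one captured pixel per delivered pixel."""
    src_px = src[0] * src[1]
    dst_px = dst[0] * dst[1]
    return 1 - dst_px / src_px

print(f"8K capture, 4K delivery: crop away up to "
      f"{max_crop_fraction(EIGHT_K, FOUR_K):.0%} of the frame")   # -> 75%
```

Cropping to half the width and half the height of an 8K frame (a quarter of the area) lands exactly on 4K, so the “crop half the area and still have 4K” rule of thumb is actually conservative.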

I’m all in favor of more pixels, but what I saw in the examples was that other factors can be more important, making the video with fewer pixels look better than the one with more, sometimes even sharper. And at least one example involved zooming in, exactly what you’d think would favor more pixels, but at least in some cases it didn’t.

Of course, in some cases more is better. One of the points the video made is that a photographer or movie maker will change cameras based on the precise result he needs in a scene.

But most of us aren’t going to need some of these extreme benefits.

It’s like my 43” TV. Compared to an iMac 4K or 5K screen, mine probably isn’t as accurate and may not be as responsive. But I’m also probably not affected by those shortcomings. I absolutely am affected by the size, though. It has completely changed how I organize things. And saved me a lot of money.

You have hit on my area of expertise!

The first thing I need to correct you on is comparing the idea of more pixels on a camera (input) and on a display (output). While there are similarities, the reasons for or against using more pixels are a bit different.

With cameras, the reason a camera with lower resolution sometimes looks better is related to the Modulation Transfer Function (MTF). This is the camera’s ability to capture detail. With more pixels, each pixel is smaller and captures a smaller portion of the scene. Making it smaller also makes it less sensitive to light, which can introduce noise that degrades the image. But also, after having tested the MTF of many still cameras (not video), I can tell you that increasing the number of pixels has basically reached a physical limitation. While sensors can be made with even smaller pixels, they will not capture more detail, because there is no lens on the planet that can render that detail. In other words, we are already capturing detail at the limit of the lenses available to us.
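
To put rough numbers on the point that smaller pixels collect less light, here’s a back-of-the-envelope sketch for a hypothetical full-frame (36×24 mm) sensor; the megapixel counts are just example values, not specific cameras:

```python
import math

# Back-of-the-envelope: pixel pitch and light-gathering area per pixel
# for a hypothetical full-frame (36 x 24 mm) sensor at different pixel counts.
SENSOR_W_MM, SENSOR_H_MM = 36.0, 24.0
ASPECT = SENSOR_W_MM / SENSOR_H_MM   # 3:2

for megapixels in (12, 24, 50, 100):
    pixels = megapixels * 1e6
    width_px = math.sqrt(pixels * ASPECT)        # horizontal pixel count for 3:2
    pitch_um = SENSOR_W_MM * 1000 / width_px     # pixel pitch in microns
    area_um2 = pitch_um ** 2                     # area collecting light per pixel
    print(f"{megapixels:>3} MP: pitch ~{pitch_um:.1f} um, ~{area_um2:.0f} um^2 per pixel")
```

Quadrupling the pixel count halves the pitch and quarters the area each pixel has to collect light, which is where the noise trade-off comes from.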

That said, it would be important to know which lenses were used in the video test you saw, as this can have a profound effect on the result.

With regard to more pixels on a monitor, there is not any real need to go beyond 4K, and even 4K is questionable. The problem here is our eyes. While a camera lens is the limiting factor on how much resolution can be captured in a camera, our eyes are the limiting factor on how much can be “captured”, or seen, by our visual system. At the viewing distances expected for large format displays, 4K is already beyond the resolving power of our eyes. For a computer display, where your viewing distance is considerably closer, there is zero advantage to a higher resolution display. Anyone who tells you they can see the difference is either reacting to something else (perhaps more aggressive image sharpening on the higher resolution display) or they are convincing themselves they see it because they want to see it.

Our eyes cannot resolve that much detail.
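
As a rough sanity check on the viewing-distance argument, here’s a sketch using the common rule of thumb that normal acuity tops out around 60 pixels per degree (about 1 arcminute per pixel); the screen sizes and distances are just examples:

```python
import math

def pixels_per_degree(diag_in, res_w, res_h, distance_in):
    """Pixels per degree of visual angle for a flat screen at a given distance."""
    width_in = diag_in * res_w / math.hypot(res_w, res_h)    # physical screen width
    px_size_in = width_in / res_w                            # size of one pixel
    deg_per_px = math.degrees(2 * math.atan(px_size_in / (2 * distance_in)))
    return 1 / deg_per_px

EYE_LIMIT = 60   # ~1 arcminute per pixel, a common 20/20 rule of thumb

examples = [("65in 4K TV at 8 ft", 65, 3840, 2160, 96),
            ("43in 4K monitor at 3 ft", 43, 3840, 2160, 36)]
for label, diag, w, h, dist in examples:
    print(f"{label}: ~{pixels_per_degree(diag, w, h, dist):.0f} px/deg "
          f"(eye limit ~{EYE_LIMIT})")
```

The TV case comes out well over 100 px/deg, past what the eye can use, while the 43-inch monitor at arm’s length lands right around the limit, which squares with the 43-inch 4K “sweet spot” mentioned later in the thread.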

8 Likes

Take phone screens as an example: current resolutions already feel like overkill for normal use. However, for VR applications I have read that roughly 1200 DPI will be the limit, and we are still a bit away from that, so I assume phones will keep gaining pixel density over the years.

Did you watch the videos (especially the second one where the comparisons are much easier to see)? I’d be interested in your opinions of those.

I’ll have to review them later myself to see when they are talking about the camera vs the display.

I do remember that he talked about the noise issue as you get a larger number of (smaller) pixels and made comparisons of those.

I know a little about the limits of our eyes. I guess that first became widely discussed with the first Apple “retina” screens. Supposedly the pixels were made so small that your eye wouldn’t see anything better if there were more ppi. Of course, there will always be debate over exactly where that line is drawn, and naturally some people’s ability will be better or worse than others’.

And how something is used matters. A 1080p TV seen from a fairly traditional distance is still going to be pretty good. But my 4K is used mainly as my monitor, so I’m not much more than an arm’s length away, which makes the 4K vs. 1080p difference more obvious. As I recall, when watching some videos in iTunes on this TV, Apple doesn’t allow 4K (the files you actually download are below that). But if you run them through an Apple TV instead, pulling the video from the cloud, I get 4K, and I could tell the difference. I don’t think I’d see that going to 8K. Of course, 8K TVs may come with other advantages, which people may chalk up to 8K but which may be totally separate from it.

I’m sure a lot of “self-convincing” is involved too. Sort of like wine tasting, where I’ve seen tests showing even expert wine tasters aren’t that good at differentiating wines!

For me, I can get by quite well with lower quality (I mainly wanted the 4K to make text as clear as possible, actually; on my iMac (not 4K), I could see pixels). But I grew up with black-and-white TVs. Even when color became pretty common as prices dropped, I was still saving money by buying small black-and-white portables (usually 12 inches). My first color TV was not until 1988, when I was almost 40 years old! So, other than text, I can still watch an old black-and-white and enjoy it just fine. But I must say, when I got an OLED iPhone, even with the small screen, it was pretty cool to see movies with scenes in space and get those deep blacks instead of a gray!

PT_Ken - you’re right that many people wrongly assume more pixels always = better image quality.

In 3D laser scanning science, we had to use extensive MTF analysis to deliver objectively better 3D point accuracy and density.

Competitors would puff up their accuracy specs by ignoring, for example, the sample density. A 20-micron accuracy figure is a BS number if your sample density is 2000 microns on the target surface. Yet many 3D scan industry players use this specsmanship trick to mislead buyers.
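
A minimal sketch of why accuracy without density is misleading, assuming an evenly spaced scan grid (the numbers are the ones from the example above, used purely for illustration):

```python
# Illustrative only: per-point accuracy vs. sample spacing on the target surface.
point_accuracy_um = 20      # claimed accuracy of each individual point
sample_spacing_um = 2000    # spacing between samples on the target

# Nyquist-style bound: surface detail with a spatial period smaller than
# twice the sample spacing cannot be reconstructed at all, no matter how
# accurate each individual point is.
smallest_period_um = 2 * sample_spacing_um
print(f"Each point is good to ~{point_accuracy_um} um, but no surface detail "
      f"finer than ~{smallest_period_um} um (period) survives the sampling.")
```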

In our case, to improve MTF, we even did chromatic deconstruction into time-sequential filters to get the photonic yield per pixel way up. Basically a much larger monochrome pixel area with backside imaging fab technology let us capture far more photon steradians returned from the laser-illuminated target surface than any competing products. It works really well.

On screens, it’s true that past a certain point, we simply don’t have enough rods and cones in our eyes to perceive more pixels.

However, there’s another use case, with different qualitative metrics, that matters for a human viewer of a screen: the size of and distance to the screen relative to the human eye’s field of view.

It turns out there’s no upper limit to pixel count that may potentially confer some benefit, but not in the way you might first think.

When you consider the eye’s field of view and distance to the image, it’s not gross pixel count but rather pixel density at the viewing distance that rules.

So, a case where more pixels can actually help can be seen in this thought experiment -

You have an 8K wall screen with a 40-foot diagonal that’s 25 feet away.

With 33 million pixels, your eye definitely won’t see the difference between 33 million and 8.3 million (4K) from 25 feet away.

However, because it’s actually a wall in a room, you can walk right up to it. There’s definitely a benefit to more pixels.

If you ‘zoom in’ by walking up to 2 feet away, the quality of the 33-million-pixel image is way better.

The caveat here is that at that distance, your human field of view cannot take in the whole of the wall, and therefore you can’t see all those pixels in that 2 foot close-up glance. So you have to move your head around to get the benefit.
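
Putting rough numbers on the thought experiment (40-foot-diagonal 16:9 wall, 8K vs. 4K, using the usual ~1 arcminute figure for the eye), just as a sketch:

```python
import math

# Angular size of one pixel on a 40-foot-diagonal 16:9 wall, 8K vs. 4K,
# compared to the ~1 arcminute rule of thumb for visual acuity.
DIAG_FT = 40
WIDTH_FT = DIAG_FT * 16 / math.hypot(16, 9)   # ~34.9 ft wide

def pixel_arcmin(horizontal_res, distance_ft):
    px_ft = WIDTH_FT / horizontal_res
    return math.degrees(2 * math.atan(px_ft / (2 * distance_ft))) * 60

for distance_ft in (25, 2):
    for name, res in (("8K", 7680), ("4K", 3840)):
        print(f"{name} wall at {distance_ft} ft: one pixel ~"
              f"{pixel_arcmin(res, distance_ft):.2f} arcmin (eye ~1 arcmin)")
```

From 25 feet the 8K pixels are already below the eye’s ~1 arcminute limit, while at 2 feet even the 8K pixels are several arcminutes across, which is why walking up to a wall-sized display keeps rewarding more pixels.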

From our tests, 4K at 43 inch diagonal at 3 foot fixed viewing distance is quite a sweet spot.

You notice right away if you revert to an HD screen.

Interesting thread by many smart people.

7 Likes

And don’t forget the aperture. At current (full-frame photo) camera resolutions, everything at f/5.6 (?) and beyond already has less visual information than the number of pixels on the sensor. It doesn’t matter how good the lens is.
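
For anyone curious where that crossover comes from, here’s a hedged back-of-the-envelope comparing the diffraction (Airy disk) diameter at a few f-numbers with the pixel pitch of a roughly 60 MP full-frame sensor (~3.8 µm, used only as a representative figure):

```python
# Back-of-the-envelope: Airy disk diameter vs. pixel pitch.
# First-minimum Airy disk diameter ~= 2.44 * wavelength * f-number.
wavelength_um = 0.55      # green light, ~550 nm
pixel_pitch_um = 3.8      # representative ~60 MP full-frame pitch (illustrative)

for f_number in (2.8, 5.6, 11, 22):
    airy_um = 2.44 * wavelength_um * f_number
    print(f"f/{f_number}: Airy disk ~{airy_um:.1f} um vs. pixel pitch ~{pixel_pitch_um} um")
```

By f/5.6 the diffraction blur already spans roughly two pixels, which is the sense in which stopping down further throws away detail no matter how good the glass is.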

1 Like

True. To try and keep the discussion somewhat simple and on target, I assumed optimum aperture for my comment. We always determine the optimum aperture for a particular lens by testing and analysis before we do any MTF testing.

Seriously, I could go on about this stuff all day if you let me. :slight_smile:

I had not watched them yet. I’ve now watched the first, but the second is too long to watch right now. I’ll check it out later. Looks interesting.

I don’t disagree with a single item he claimed.

Although I have some involvement with video, I am mostly familiar with still imaging which has slightly different criteria. I am on the still image working group for the Federal Agencies Digital Guidelines Initiative (FADGI) http://www.digitizationguidelines.gov/still-image/. We regularly use cameras with 80-100 megapixels and even up to 150 megapixels.

One thing we have found is that as the pixel count increases, manufacturers use tricks to make the MTF look better. For example, one manufacturer clips all image data below a certain threshold to solid black so that the noise measurements will look better. They also add aggressive sharpening filters so the apparent sharpness of the image is very good. This happens on the chip before delivering the raw image, so you cannot turn it off (these are the highest end professional cameras). Although the MTF measurement produces an excellent number and it seems like it is capturing more detail than other cameras, it is in fact capturing less detail because the pixels are too small. Analyzing the shape of the MTF curve reveals the aggressive sharpening. Looking at an image comparison with a different camera also reveals this shortcoming. We are lucky enough to have access to many cameras to compare…
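
A toy illustration of the shadow-clipping trick (simulated numbers only, not modeled on any particular camera): clipping everything below a threshold to solid black makes the measured noise in a dark patch look far smaller than it really is.

```python
import random
import statistics

# Simulated dark patch: the signal is black, but read noise is still present.
random.seed(0)
black_patch = [random.gauss(0, 4) for _ in range(100_000)]   # read noise around black

# Hypothetical on-chip "clip to black" threshold applied before raw delivery.
clip_threshold = 12
clipped = [v if v >= clip_threshold else 0 for v in black_patch]

print(f"actual read noise (std dev):     {statistics.pstdev(black_patch):.2f}")
print(f"measured after clip-to-black:    {statistics.pstdev(clipped):.2f}")
```

The clipped version measures only a fraction of the true noise even though no real information was gained, only destroyed.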

The goal of FADGI is to define standardized objective ways of quantifying the types of things he talked about in this video so we can better communicate the imaging quality capabilities of a system and also to monitor the imaging quality results of a project (as a means of knowing if a vendor has provided the quality they claimed, for example). Most of our work relates to the imaging of cultural heritage objects (maps, books, documents, paintings, prints, drawings, etc.), but much of it could be applied to basic photography as well.

I forgot to mention that there is also an Audio-Visual working group where they are quantifying the items discussed in the link you provided. Evaluating using these guidelines takes the subjective analysis out of the picture. You may find it interesting to review the guidelines:

http://www.digitizationguidelines.gov/audio-visual/

Very interesting. The second video is long, but sort of divided into many parts for each point they are making. I didn’t watch it all yet, but a number of things I did watch were very interesting.

The human eye’s angular resolution depends on both the color (wavelength) and the pupil diameter (which depends on how bright it is). I just calculated the angular resolution using https://www.omnicalculator.com/physics/angular-resolution, the wavelengths for different colors, and the range of pupil sizes.

[table of calculated angular resolutions for each color and pupil size]

Therefore the smallest angular resolution (and the smallest pixel at a specific distance) that you can see would be for a dim blue pixel, which would subtend 0.00303 degrees.
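
For reference, the formula behind that calculator is the Rayleigh criterion, theta ≈ 1.22 λ / D. A quick sketch, with the wavelength and pupil diameter chosen to roughly match the dim-blue case above:

```python
import math

# Rayleigh criterion: theta ~= 1.22 * wavelength / aperture diameter.
wavelength_m = 400e-9   # deep blue light
pupil_m = 9e-3          # roughly a fully dilated pupil in dim light

theta_rad = 1.22 * wavelength_m / pupil_m
print(f"angular resolution ~{math.degrees(theta_rad):.5f} degrees")   # ~0.0031
```

This lands close to the 0.00303-degree figure above; the small difference just reflects the exact wavelength and pupil size plugged in.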

This stuff sure gets complicated!

In an area more in my field, as a music teacher, I’m used to seeing audio equipment emphasizing a range of 20 Hz-20 kHz, which is typically presented as all you need (assuming it is within plus or minus 3 dB, if I recall correctly).

Of course, at my age, I doubt very much that I hear 20 kHz! But even when younger, I always wondered about claims that the average young person can’t hear anything past that. It may be true, or not. My suspicion was that if something could reproduce a range up to, say, 25 kHz, the difference might not be something you could easily measure, but it might have some other impact. IOW, a decent listener may be able to tell some kind of difference. It may not be so; it’s just something I wonder about.

Or, for us older folks, if we can only technically hear up to 15 kHz, is there some effect of playing something at 20 kHz that still benefits us, just not as much as it would a younger person?

It seems much like all these other factors in pictures and videos that can make something seem sharper, even though it may have fewer pixels.

I do remember, back in the days of cameras with, say, 5 MP, that people were talking about how a 2 or 3 MP camera could give better pictures because its larger pixels gathered more of the light.

Typical change in audio frequency sensitivity with age. Gradual, and no big deal.

Neocortex helps us adapt to it pretty well.

Try this:

In previous years it was a bonus, if you were older, not to hear the 15,625 Hz from the horizontal sync on a TV.

2 Likes

I wonder how much the speakers influence that linked test. I was only running it through my LG TV speakers, but I couldn’t hear 12,000 Hz. But then, it says you are likely under 50 if you can, and I’m way past that! There are also a fair amount of other sounds around me, so unless the tone has enough volume, I may miss it. Maybe later tonight I’ll try it again, just in case, when there isn’t all the traffic noise. Though I doubt I’ll hear it.

This isn’t the case any longer. The reason it used to be the case is that the micro-lenses didn’t cover the entire area of the sensor (or before that, there were simply no micro-lenses :smiley:). Now, with better manufacturing, very little of the sensor area is “lost” to the gaps between the pixels. In fact, the last time I checked, the quantum efficiency of sensors was around 60% (which means that no matter how much they improve, they can never gain another full stop of light).

So with today’s latest sensors, going with a lower pixel count only matters for (i) global shutter implementation and (ii) minimizing rolling shutter effect. (That assumes that I know what I’m talking about, which is not something that I’m willing to guarantee.)
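
A quick bit of arithmetic on that “never another full stop” point, assuming the ~60% quantum efficiency figure:

```python
import math

# If quantum efficiency is already ~60%, a perfect (100% QE) sensor could
# only ever collect 1/0.6 times the light -- less than one full stop (2x).
current_qe = 0.60
max_gain = 1.0 / current_qe                  # ~1.67x
stops_of_headroom = math.log2(max_gain)      # ~0.74 stops
print(f"maximum possible gain: {max_gain:.2f}x (~{stops_of_headroom:.2f} stops)")
```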

3 Likes