Why are the fastest high-speed cameras so much faster than the fastest monitors?

There are commercially available video cameras that can record at hundreds of thousands of FPS, and scientific cameras in the millions of FPS. Yet the fastest monitors are around 500 fps. Of course, there is much more utility to a fast camera than a fast monitor, but I get the sense that there are also technical restrictions. Why has no one made a 1 million FPS monitor to show off as a tech demo?

8 Answers

Anonymous 0 Comments

There might be technical limitations, but if going above 100 frames per second or so doesn’t make a difference, why bother innovating in this direction? What would the application be? I can’t think of a single screen meant for something other than human eyes where a higher frame rate would somehow be advantageous.

Anonymous 0 Comments

HDMI Category 3 (“Ultra High Speed”) for 4K at 120 Hz uses a 48 Gb/s data connection. The fastest common fiber connections are 100 Gb/s, which might get you 250 Hz at 4K. Look at the price of a 100 Gb/s Ethernet card and figure out why no one wants to make such a monitor. DisplayPort is slower.

That 500 Hz monitor is 1080p, and its HDMI link is still 48 Gb/s.
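
To put rough numbers on that, here’s a back-of-the-envelope sketch of the raw, uncompressed bandwidth a display link would need. The 24-bit color and the exact resolutions are my own illustrative assumptions, and real links (HDMI/DisplayPort) add blanking and encoding overhead on top:

```python
# Raw uncompressed video bandwidth: pixels x bits-per-pixel x refresh rate.
# Illustrative figures only; real links add blanking/encoding overhead.

def link_gbps(width, height, hz, bits_per_pixel=24):
    """Raw video bandwidth in gigabits per second."""
    return width * height * bits_per_pixel * hz / 1e9

print(f"4K @ 120 Hz:    {link_gbps(3840, 2160, 120):.1f} Gb/s")   # ~23.9
print(f"4K @ 250 Hz:    {link_gbps(3840, 2160, 250):.1f} Gb/s")   # ~49.8
print(f"1080p @ 500 Hz: {link_gbps(1920, 1080, 500):.1f} Gb/s")   # ~24.9
print(f"1080p @ 1 MHz:  {link_gbps(1920, 1080, 1_000_000):,.0f} Gb/s")  # ~49,766
```

Even a bare 1080p stream at a million hertz would need on the order of 50 Tb/s, roughly a thousand HDMI 2.1 cables’ worth of bandwidth.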

Anonymous 0 Comments

Cameras are not monitors; they do not use those little red/green/blue pixels that monitors use to create an image on the display. It is expensive and difficult to build monitors with that many pixels that can also switch fast enough for frame rates that high. Plus, nobody actually needs frame rates that high on a display, and that in and of itself makes it not worth the cost.

Cameras have what are known as sensors: millions of light-sensitive spots that pick up light and convert that signal into an image. The sensor is a very expensive part of the camera, and it is small, less than a third the size of a credit card.

Realistically, many cameras have small monitors on the back for you to review photos or use as a viewfinder, and these are similar to regular monitors in that they are made of red/green/blue pixels. They also tend to be fairly slow and not as good as, say, desktop computer monitors.

Anonymous 0 Comments

Frames Per Second (FPS) is the number of still images/frames captured by a camera or rendered by a GPU each second.

Refresh rate (Hz) is the number of times per second that your monitor refreshes the image on your screen.

They are not the same thing: cameras are measured in FPS, monitors in Hz. A monitor cannot be “500 FPS”. The more FPS you capture, the less motion blur each frame has and the further you can slow down the footage. The more Hz you have, the more of those frames you can actually see, up to a point.

Generally you want a refresh rate faster than your FPS so you get all the frames with no latency or tearing.

Many studies have been done on the human eye, and standard healthy vision can only see around 75 frames per second. So anything faster than 75 FPS or 144 Hz is beyond the scope of human visual perception and exists solely for specific technical purposes (slow motion, VR, compositing, etc.).

No one could tell the difference between 1 million FPS and 100,000 FPS with the naked eye unless the footage was manipulated, like extreme slow motion of an explosion. No one can see more than 200 Hz, and screens only go faster than that to accommodate today’s graphics cards and high-speed cameras. We cannot perceive the difference, but our software can.
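
To make the frame-duration math concrete, here’s a quick sketch (the rates are just illustrative examples):

```python
# Frame/refresh durations, and how long slow-motion playback takes.
# Illustrative rates only.

def frame_ms(rate_hz):
    """Duration of one frame or refresh cycle, in milliseconds."""
    return 1000.0 / rate_hz

print(f"60 Hz refresh:       {frame_ms(60):.3f} ms")         # ~16.667
print(f"500 Hz refresh:      {frame_ms(500):.3f} ms")        # 2.000
print(f"1,000,000 FPS frame: {frame_ms(1_000_000):.6f} ms")  # 0.001 (one microsecond)

# Watching 1 second of 1,000,000 FPS footage back at 60 FPS:
playback_s = 1_000_000 / 60
print(f"1 s of capture -> {playback_s / 3600:.1f} hours of playback at 60 FPS")  # ~4.6
```

Which is exactly how high-speed footage is used: the million frames get spread across ordinary refresh cycles as slow motion, so the display never needs to run anywhere near capture speed.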

Anonymous 0 Comments

> there is much more utility to a fast camera than a fast monitor

Because the cameras are used for filming and/or studying things that move very quickly, while the monitor output is meant to be viewed by human eyes. Edit: Though it is debatable what the limit for human eyes is, it is pretty clear that there is a reason to push from 10,000 fps to 50,000 fps and so forth on a camera. Also, the [Phantom TMX 7510](https://www.phantomhighspeed.com/products/cameras/tmx/7510) lists 76,000 fps at 1280 x 800, which is on the small side. /edit

Pretty much.

I started searching “cmos speed image capture limit” and found a few papers, like https://www.mdpi.com/1424-8220/9/1/430 and https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7070596/. From what I can tell it’s “just” physics and deals with photons and electrons.
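
To put a rough number on the photon side (my own illustrative figures, not taken from those papers): the shorter the exposure, the fewer photons each pixel can collect, and the noisier the frame gets.

```python
# Rough photon budget per pixel at short exposures; illustrative figures only.
import math

H, C = 6.626e-34, 3.0e8          # Planck's constant (J*s), speed of light (m/s)
PHOTON_ENERGY = H * C / 555e-9   # energy of one green photon, ~3.6e-19 J
LUX_TO_W_PER_M2 = 1.0 / 683.0    # exact only at 555 nm, close enough here

def photons_per_pixel(lux, exposure_s, pixel_side_m=10e-6):
    """Photons landing on one square pixel during one exposure."""
    flux = lux * LUX_TO_W_PER_M2 / PHOTON_ENERGY   # photons / m^2 / s
    return flux * pixel_side_m ** 2 * exposure_s

for fps in (1_000, 100_000, 1_000_000):
    n = photons_per_pixel(lux=10_000, exposure_s=1.0 / fps)  # bright, daylight-ish scene
    print(f"{fps:>9,} fps: ~{n:,.0f} photons/pixel, shot-noise SNR ~{math.sqrt(n):.0f}")
```

That’s one reason high-speed shoots are drowned in light: every factor of 10 in frame rate costs a factor of 10 in photons per frame.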

“fastest display technology” didn’t yield such good results. It’s possible that LCD technologies (VA, IPS, etc.) and OLED hit their limits at slower rates than an image sensor does. There are also digital micromirror devices (like TI’s DLP), but I haven’t found anything about their upper limit on gray-to-gray (GtG) switching speed. LCDs add some chemistry: your pixel has to react to an applied electric field. https://forums.blurbusters.com/viewtopic.php?t=795 is the best I got.

That’s the limit of the research I’m doing here out of curiosity for free, but it should either satisfy your curiosity or point you down the right path on the fundamentals of both technologies.

An underlying constraint on companies spending money on R&D is that they have a limited amount of money to spend, and they don’t spend it just for fun.

Anonymous 0 Comments

It’s because you can’t see fast enough for it to matter whether the monitor is 500 Hz or 2,000 Hz anyway.

Anonymous 0 Comments

1. Monitors have a response time: how fast the pixels can physically change color. While a current gaming monitor may advertise a response time as low as 1 millisecond, they often employ tricks to reach that rating, such as a strobing backlight. So the true response time of a monitor’s panel is the first limitation (see the sketch below).
2. Monitors have to have a processor in them that can decode/display the actual video feed. Higher framerates require more data, meaning bigger/more/power-hungry processors.
3. I don’t think a human could differentiate between 500 FPS and 1,000 FPS to begin with. What even would be a use for a million FPS displayed on a monitor?
4. The monitor would be absurdly expensive.
5. You also need content to display at that million FPS, meaning to get any use out of it, you would have to custom-make content for that one, single monitor.

I think it’s mostly economics, and the lack of any practical use for it.
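
On point 1, here’s a tiny sketch of why panel response time alone is a wall (the 1 ms response figure is an assumed, illustrative spec, not a measured one):

```python
# Frames that elapse while a pixel is still mid-transition; illustrative figures.

def frames_per_transition(response_s, refresh_hz):
    """How many refresh cycles pass during one pixel color change."""
    return response_s * refresh_hz

for refresh_hz in (500, 2_000, 1_000_000):
    n = frames_per_transition(response_s=0.001, refresh_hz=refresh_hz)  # assume 1 ms GtG
    print(f"{refresh_hz:>9,} Hz with a 1 ms pixel: {n:,.1f} frame(s) per transition")
```

At a million hertz, a thousand frames would come and go before a 1 ms pixel finished a single color change, so everything would smear into an average.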

Anonymous 0 Comments

Monitors are mostly made for human eyes; we can only perceive a limited framerate, so there is limited use in having a higher one.

If you extend the definition of *monitor* a bit, then high-framerate single-pixel monitors do exist. Multiples of these pixels can be combined to achieve ridiculous speeds for fibre transmission, and each one has a “frame rate” in the terahertz range.
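
As a rough sketch of those numbers, treating one fibre wavelength channel as a one-pixel display (all figures here are illustrative assumptions; the terahertz is the frequency of the light wave itself, while the data modulated onto it is in the gigahertz range):

```python
# A fibre-optic channel as a one-pixel "display"; illustrative figures only.

C = 3.0e8                    # speed of light, m/s
WAVELENGTH = 1550e-9         # a common telecom wavelength, m

carrier_hz = C / WAVELENGTH  # the light wave itself oscillates at ~194 THz
symbol_rate_hz = 50e9        # a plausible modulation ("frame") rate: 50 Gbaud
channels = 80                # dense WDM: many one-pixel "displays" per fibre

print(f"Optical carrier:        {carrier_hz / 1e12:.0f} THz")
print(f"Per-pixel 'frame rate': {symbol_rate_hz / 1e9:.0f} GHz")
print(f"{channels} pixels x {symbol_rate_hz / 1e9:.0f} Gbaud = "
      f"{channels * symbol_rate_hz / 1e12:.0f} trillion pixel-updates per second")
```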