There are two big problems.
The first is that VR requires you to render two images at once, one for each eye, which is much harder on graphics cards. That means VR graphics tend to be lower quality.
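As a rough sketch of why this is so much harder, you can compare the pixel throughput the graphics card has to sustain. The numbers below are illustrative assumptions, not the specs of any particular headset or monitor:

```python
# Rough comparison of per-second pixel throughput.
# All numbers are illustrative assumptions, not real device specs.

def pixels_per_second(width, height, refresh_hz, views=1):
    """Pixels the GPU must shade each second for `views` simultaneous views."""
    return width * height * refresh_hz * views

# A 1080p monitor at 60 Hz: one view.
monitor = pixels_per_second(1920, 1080, 60)

# A hypothetical headset: 1440x1600 per eye at 90 Hz, two views.
# VR runtimes also typically render somewhat larger than the panel to
# compensate for lens distortion; the 1.4x factor here is an assumption.
headset = pixels_per_second(int(1440 * 1.4), int(1600 * 1.4), 90, views=2)

print(headset / monitor)  # roughly 6-7x the monitor's pixel throughput
```

With these assumed numbers the headset demands several times the monitor's throughput, which is why the same GPU produces visibly simpler graphics in VR.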
The second is the ultimate problem with VR and 3D movies. Your eyes adjust for distance in two different ways: each eye can change the shape of its lens to bring things at a certain distance into focus, and your two eyes can aim together at a point at a certain distance (think of crossing your eyes).
VR can’t get that second part working properly. The screen sits at one fixed focal distance, so when you look at something right in front of you in VR, your eyes strain because they want to make both adjustments at once, as they’ve been trained to do. That strain tires you out, and it ultimately limits how good VR experiences can be designed.
The answers on here mostly talk about the data and energy being fed to the VR headset – GPU power, the electrical power delivered to the headset, etc.
But there’s a much more important limiting factor: the resolution of the VR screens and the lenses that VR headsets use.
If you’re on a computer right now, put your eyes right up next to the screen. Get as close as you can, with your nose almost or just touching the screen. Can you focus on the words? Can you read what it says? Probably not, right?
Because your eyes are this close to a screen when using a VR headset, the headset needs lenses to adjust the focus of the screen. The lenses also work to fill as much of your vision as possible. Your computer screen only takes up a portion of your view, maybe 50% or less depending on the size of the screen and how close to it you sit. A laptop, for example, may take up less than 25% of your field of vision.
But a VR headset is trying to take up as much of your field of view as possible. So lenses need to magnify the screen across your field of vision. This magnification winds up making pixels appear larger than they do on a regular computer screen. When pixels become magnified, you start to see the physical space between the pixels, which is known as the “screen door effect.”
The screen door effect results in an overall loss of graphical fidelity: images no longer look as clear and crisp as they do on a computer screen. You also lose fidelity by magnifying the edges of objects. In reality, our vision is analog; objects can start and stop wherever they want, and we can see that clearly. But on a computer screen, each pixel can only contain one color, so an object can only “end” in the space occupied by a full pixel.
The difficulty in rendering the edges of objects is handled in software through a process called anti-aliasing. Anti-aliasing blends the colors of pixels on the edge of an object between the color of the object and the color of whatever is next to or behind it. This helps smooth the edge between the object and its background.
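That blending can be sketched with a toy example. The sketch below supersamples a hard black/white vertical edge and averages the subsamples inside each pixel (one common anti-aliasing approach; the 4-subsample count and the edge position are arbitrary assumptions):

```python
# Toy anti-aliasing sketch: supersample a hard vertical edge, then average
# the subsamples in each pixel. 0.0 = object (black), 1.0 = background (white).
# Purely illustrative, not any engine's actual implementation.

def coverage(pixel_x, edge_x, subsamples=4):
    """Fraction of a pixel's subsamples that fall on the background side."""
    inside = 0
    for i in range(subsamples):
        sample_x = pixel_x + (i + 0.5) / subsamples  # subsample center
        if sample_x >= edge_x:
            inside += 1
    return inside / subsamples

# An edge at x = 2.5 cuts pixel 2 in half: that pixel becomes mid-gray,
# smoothing the transition between object and background.
row = [coverage(x, edge_x=2.5) for x in range(5)]
print(row)  # [0.0, 0.0, 0.5, 1.0, 1.0]
```

Pixels wholly on one side stay pure black or white; only the pixel the edge passes through gets a blended value, which is exactly the smoothing the text describes.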
Anti-aliasing relies on a high ratio of pixels to screen area, called the pixel density. The higher the pixel density, the more finely anti-aliasing can blend the edges of objects to create a crisper, more realistic effect.
On a VR headset, the magnified pixels necessarily reduce the effective pixel density (each pixel now covers more of your view). This has the opposite of the desired effect for anti-aliasing: the blending artifacts become much more apparent, and the edges of objects lose their crispness. This is why polygons in VR games tend to look like they’re from an older generation than modern games – the anti-aliasing that normally smooths those polygons is less effective.
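One way to put numbers on this magnification is angular pixel density: how many pixels cover each degree of your field of view. The figures below are rough illustrative assumptions, not measurements of any specific device:

```python
# Angular pixel density: pixels per degree of field of view.
# Illustrative assumed numbers, not real device measurements.

def pixels_per_degree(horizontal_pixels, fov_degrees):
    return horizontal_pixels / fov_degrees

# A 1920-pixel-wide monitor filling roughly 40 degrees of view
# at a typical desk distance (assumed).
monitor_ppd = pixels_per_degree(1920, 40)    # 48 pixels per degree

# A headset spreading an assumed 1440 horizontal pixels per eye
# across roughly 100 degrees.
headset_ppd = pixels_per_degree(1440, 100)   # 14.4 pixels per degree

print(monitor_ppd, headset_ppd)
```

Under these assumptions each degree of your view gets roughly a third as many pixels in the headset, which is why the same rendering looks so much coarser through the lenses.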
Finally, VR lenses cause glare issues. This is especially apparent in scenes with high contrast (something very bright against a dark background, or vice versa). The bright parts of the image can create glare on the lenses that overpowers the dark parts, making scenes that would otherwise have crisp contrast between light and dark appear instead to have blurry, lower-quality edges.
VR makers are constantly striving for higher-resolution, higher-pixel-density screens to overcome these shortfalls, but it’s typically no easier (and sometimes harder) to improve resolution and pixel density on VR screens than on regular monitors. So monitors tend to look as good as or better than VR screens.