Why is DLSS or other upscaling technology essentially required to get high framerates nowadays?

In: Technology

5 Answers

Anonymous 0 Comments

Kind of confused by this question. DLSS is used to upscale games past the standard 1080p.

It’s not required, but if you had a tool that could upscale your game, why wouldn’t you use it?

Native 4K is way more taxing on the computer than running the frame through what’s essentially a filter that produces a similar result with better performance.

Anonymous 0 Comments

You only need upscaling when you’re running at higher resolutions. Modern GPUs will easily get high frame rates at 1080p, but as they have to push higher native resolutions, the computational demands increase greatly. 4K has 4x the pixels of 1080p, and thus needs a corresponding increase in computational power to render. You’ll usually have higher-resolution textures that require much more memory to store as well. It’s not an exactly linear 4x increase, since some of the rendering stages aren’t done in pixel space.
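
To make the 4x figure concrete, here’s a quick back-of-the-envelope calculation (nothing assumed beyond the standard resolutions themselves):

```python
# Back-of-the-envelope pixel counts for common resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

base = 1920 * 1080  # 1080p as the baseline

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")
```

That prints about 2.07 million pixels for 1080p versus 8.29 million for 4K, which is exactly the 4x jump in raw pixel work.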

DLSS lets you render at 1080p, then uses a much more computationally efficient algorithm to upscale the image to 4K. It’s not quite as visually perfect as true native-resolution rendering, but it’s good enough that most people don’t notice or care at all.
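
As a rough sketch of why that trade is worth it: shading work scales roughly with pixel count, so a 1080p render does about a quarter of the pixel work of native 4K, plus a comparatively cheap upscaling pass. The 10% overhead below is an illustrative assumption, not a measured number:

```python
# Toy cost model -- not real benchmark data. Assumes per-frame GPU work
# scales with pixels shaded and that the upscaler costs a fixed fraction
# of a native 4K frame. Real ratios vary by game, GPU, and DLSS mode.
NATIVE_4K = 3840 * 2160
INTERNAL_1080P = 1920 * 1080
UPSCALE_OVERHEAD = 0.10  # assumption: upscaling ~10% of a native 4K frame

native_cost = 1.0  # normalize a native 4K frame to 1.0
dlss_cost = INTERNAL_1080P / NATIVE_4K + UPSCALE_OVERHEAD  # 0.25 + 0.10

print(f"Native 4K frame cost:  {native_cost:.2f}")
print(f"1080p + upscale cost:  {dlss_cost:.2f}")
print(f"Approximate speedup:   {native_cost / dlss_cost:.1f}x")
```

Under those assumptions the upscaled path comes out roughly 3x cheaper per frame, which lines up with why upscaling modes can deliver such large frame-rate gains.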

Anonymous 0 Comments

There are multiple factors that play a role in this. Firstly, it’s not required if you have top-of-the-line hardware or you use a reasonable resolution for your hardware. Secondly, the raw performance increase between GPU generations is smaller than it used to be. Thirdly, upscaling wasn’t used much in the past because the technology behind it wasn’t viable or good enough yet. Also, in this day and age more people think they need 100-200 fps in story-driven games like Cyberpunk, Baldur’s Gate, etc.

I personally think it became mainstream when the PS5 and the equivalent Xbox released and 60 fps became the norm. Now that monitors with 144 Hz or more at 1440p or 4K are widely available, many people think they need to hit those numbers in every game, even though they won’t benefit from it as much as they would in esports titles.

Anonymous 0 Comments

There are many facets to this, but one is that game engines are capable of producing more and more visually complex games, and hardware is no longer advancing in lockstep.

Games are advancing as they always have. The new techniques used to increase visual fidelity require more PC resources. An example is real-time lighting instead of artists going across the whole game placing all the lights and lighting effects manually. It requires stronger hardware, which is why it hasn’t been done until recently; the hardware simply wasn’t up to snuff.

The software side always keeps improving, but there are real physical limitations to the improvements that can be made to hardware. Every generation it becomes harder and harder, and more expensive, to make meaningful improvements to hardware performance.

As a result, they look for other solutions, such as DLSS and frame generation.

Anonymous 0 Comments

1. Hardware increases have slowed down dramatically, percentage wise.
2. Visual fidelity has hit diminishing returns.
3. The current complexity of games makes it more difficult to budget enough time to optimize and QA properly.

It’s mostly diminishing returns, though. We’ve gotten to the point where things that are significantly more expensive to calculate don’t necessarily have a significant impact on fidelity.