ELI5- why can movies make CGI human faces and movements that look 100% real and normal but video games always look fake even with large budgets?

26 Answers

Anonymous 0 Comments

Video games are rendered in real time, so the degree of realism is limited to the gaming hardware being used. Movies are rendered once, and can take as much time per frame as necessary to achieve the desired look. The shortcuts often used in games to get good frame rates (at the expense of visual quality) are not necessary when rendering a movie.

Some quick research suggests that movie rendering takes around 24 hours _per frame_, so almost a month of compute time per second of film (24 fps). Many, many cloud instances are used in parallel to render a digital film.
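
To put those two numbers side by side, here is a rough back-of-the-envelope sketch in Python, using the figures above (roughly 24 hours per film frame) plus an assumed 60 fps target for the game; the exact numbers vary enormously by production, so treat it as illustrative only:

```python
# Back-of-the-envelope comparison; all figures are rough and illustrative.
FILM_FPS = 24
FILM_HOURS_PER_FRAME = 24         # the "about a day per frame" figure above

GAME_FPS = 60                     # assumed target frame rate for the game
game_budget_ms = 1000 / GAME_FPS  # ~16.7 ms to draw each game frame

film_render_s = FILM_HOURS_PER_FRAME * 3600                   # seconds spent on one film frame
film_days_per_second = FILM_FPS * FILM_HOURS_PER_FRAME / 24   # days of compute per second of footage

print(f"Game frame budget:             {game_budget_ms:.1f} ms")
print(f"Film frame render time:        {film_render_s:,} s")
print(f"Film compute per second shown: {film_days_per_second:.0f} days (per machine)")
print(f"A film frame gets roughly {film_render_s * 1000 / game_budget_ms:,.0f}x more render time")
```

Parallelizing across a render farm shrinks the wall-clock time, but the per-frame compute gap of millions to one is the point.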

Also, from the perspective of fine-tuning specific motions, a movie is a single set of motions that is painstakingly refined to achieve the desired look. A video game, by definition, varies with user input. It’s not always possible to refine every game scenario and every transition between movements.

Lastly, think about budgets in terms of dollars per minute of content. Most movies run around 100 minutes, while games can run to 1,000 minutes or more. More budget per minute of content allows for higher quality in everything.
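
To make the dollars-per-minute comparison concrete, here is the same arithmetic with made-up round numbers (a $200M film versus a $200M game; real budgets vary wildly, so these figures are purely hypothetical):

```python
# Hypothetical round numbers, only to illustrate "budget per minute of content".
film_budget, film_minutes = 200_000_000, 100    # ~100-minute movie
game_budget, game_minutes = 200_000_000, 1_000  # ~1,000 minutes of gameplay

print(f"Film: ${film_budget / film_minutes:,.0f} per minute of content")
print(f"Game: ${game_budget / game_minutes:,.0f} per minute of content")
```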

Anonymous 0 Comments

Video games are not movies. They have way less budget overall, and way, way less budget for cutscenes than a whole movie gets.

A video game is made in-engine, meaning it isn't a static thing; it can change and react to how you play.

If there is a cutscene in the game, it can be fully in-engine like the rest of the game, or some combination of pre-rendered CGI and the engine.

A cutscene will never have the same budget as a full movie, so it never looks as good. But most of the time developers don't even try to make it look 100% real, because they know gamers don't care.

Anonymous 0 Comments

Video games could have pre-rendered cutscenes that look just as good as movies if the devs wanted them to, but it would be jarring for the player to jump back and forth between in-engine graphics and pre-rendered footage, so they stick with the best graphics that can be maintained while you move interactively through the environment.

Anonymous 0 Comments

Video games are rendered in real time; movies aren't. A good example is PS1 games that look better in their cutscenes than in gameplay. Games like Final Fantasy VII and the Resident Evil games used Full Motion Video, which was basically pre-rendered CGI cutscenes. It was still 90s CGI, but it wasn't limited by the power of the console itself, only by the computers that made and rendered it. The developers basically just inserted the video files into the game and told it when to play them.
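
As a minimal sketch of that idea (the engine object and function names here are invented for illustration, not any real console's API): a pre-rendered cutscene is just video playback, while gameplay has to be drawn live, frame by frame.

```python
# Hypothetical sketch; all names are made up for illustration.

def run_scene(scene, engine):
    if scene.has_prerendered_cutscene:
        # The expensive rendering was done long ago on powerful machines;
        # the console only has to decode and display a video file.
        engine.play_video(scene.cutscene_file)
    else:
        # Normal gameplay: every frame is built from scratch, fast enough
        # to react to player input (30-60 times per second).
        while not scene.finished:
            player_input = engine.poll_input()
            scene.update(player_input)
            engine.render_frame(scene)
```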

Now that games render their own cutscenes, they still won't look as good as movies MOST of the time. The frame rate is definitely part of it: movies and TV are consistently 24 fps, while games run at 30 or 60 fps. Games also often don't aim completely for photorealism; there is an element of stylization.

But there are recent examples of games that not only look amazing, but look real. Forspoken was supposed to be one of these, but that… didn't happen. The Dead Space remake and The Callisto Protocol were pretty impressive too. But the 9th-gen Call of Duty games have been the most impressive. I'm not a COD player, but I've seen some cutscenes from the remake (I think it's a remake) of Modern Warfare 2, which is 2022's annual COD game, and I have to say, it doesn't have that "game-y" look you're describing nearly as much as other games do. You really have to see it for yourself.

Anonymous 0 Comments

Think of it like a flipbook.

A movie is a completed flipbook: you're just flipping through it and seeing the end result.

A video game is like a blank flipbook: every picture has to be drawn as you go to produce the end result.

Drawing those pictures takes time. A video game does it while you sit there and wait for each one; a movie does it all before you ever see it and just shows you the finished pictures.
