Why is having a lower refresh rate on a display better for battery?


If the screen is displaying the same thing (think an “always on display”) then why does lowering the refresh rate help with battery if the pixels are still lit?


4 Answers

Anonymous 0 Comments

1. Lower power consumption from the CPU and GPU, since they don’t need to prepare and render as many frames

2. Lower power consumption from the LCD panel – it takes power to flip the little liquid crystals between transparent and opaque, and generally the faster you want something to switch, the more power you need to pump into it, though I think that last part only applies if the screen also switches to a slower response time. (A rough sketch of the math follows this list.)
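A minimal back-of-the-envelope sketch of both effects in Python. The joule figures and the `display_power_watts` helper are made-up placeholders for illustration, not measured values; the point is only that a cost paid per refresh scales linearly with the refresh rate.

```python
# Hypothetical per-refresh energy costs, invented for illustration only.
ENERGY_PER_FRAME_J = 0.005   # CPU+GPU work to prepare and render one frame
ENERGY_PER_REDRAW_J = 0.003  # panel work to re-drive the liquid crystals once

def display_power_watts(refresh_hz: float) -> float:
    """Power = (energy per refresh) x (refreshes per second)."""
    return (ENERGY_PER_FRAME_J + ENERGY_PER_REDRAW_J) * refresh_hz

for hz in (120, 60, 30, 1):
    print(f"{hz:>3} Hz -> {display_power_watts(hz):.3f} W")
# 120 Hz -> 0.960 W, 60 Hz -> 0.480 W, 30 Hz -> 0.240 W, 1 Hz -> 0.008 W
```

Halving the rate halves this refresh-driven cost, which is why an always-on display that drops to 1 Hz can show the same pixels far more cheaply.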

Anonymous 0 Comments

Because refreshing the screen means the device has to calculate what should be rendered where on the screen, and then actually render it.

So imagine repeating that job 100+ times every second – it takes a lot more energy than refreshing only, say, 30 times a second.
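A small Python sketch of that idea, assuming a hypothetical `render_loop` that stands in for the real compose-and-draw work:

```python
import time

def render_loop(refresh_hz: int, duration_s: float = 1.0) -> int:
    """Simulate roughly one second of refreshes at a given rate.

    Each iteration stands in for "calculate what goes where, then render it";
    fewer iterations per second means that work simply runs less often.
    """
    frame_time = 1.0 / refresh_hz
    frames = 0
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        # ... compose and render the frame here ...
        frames += 1
        time.sleep(frame_time)  # idle until the next refresh tick
    return frames

print(render_loop(30))   # ~30 passes of render work in one second
print(render_loop(120))  # ~120 passes: about 4x the work for the same second
```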

Anonymous 0 Comments

Let’s explain a few things.

First, what’s a Hz? A hertz (Hz) is one beat of a clock: the rhythm to which a process advances, the tick that fixes the rate at which it works.

Hz is measured as X per second, X being the number, so 60 Hz is something that happens 60 times a second, and your 5 GHz processor has a work rate of 5,000,000,000 clock “ticks” per second.
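A one-liner makes the relationship concrete (`period_seconds` is just an illustrative name): the time between ticks is the inverse of the frequency.

```python
def period_seconds(frequency_hz: float) -> float:
    """The time between ticks is the inverse of the frequency: T = 1/f."""
    return 1.0 / frequency_hz

print(period_seconds(60))   # 0.0166...: a 60 Hz screen refreshes every ~16.7 ms
print(period_seconds(5e9))  # 2e-10: one tick of a 5 GHz CPU lasts 0.2 ns
```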

To display video, a display usually works by showing still frames at a very fast rate – 25 fps in Europe and 30 fps in the US, mainly because the AC mains electricity provides a 50 or 60 Hz clock signal. This is a legacy component (a left-over design) from when TV sets used the mains frequency as their timing reference.

The thing is that with digital video and every device having its own clock source (usually a quartz oscillator on the board), there’s no need for this anymore, and gaming-class video cards in particular can exceed the normal output of 25, 30, or even 60 fps.

So you can use a higher refresh rate for a more immersive experience.

The problem is that the more work the GPU does, the more power it consumes, and the display has to redraw the whole screen – resetting all the pixels and recoloring them – for each frame. If each redraw costs the display a fixed amount of energy, then at 30 fps it will consume half the redraw power it does at 60 fps (lighting not included, that’s a constant power cost).
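To put rough numbers on that split between the per-redraw cost and the constant lighting cost (both figures below are invented for illustration):

```python
BACKLIGHT_W = 1.0           # hypothetical constant lighting cost
ENERGY_PER_REDRAW_J = 0.01  # hypothetical energy to reset and recolor all pixels once

def panel_power(refresh_hz: float) -> float:
    """Backlight is constant; the redraw cost scales with the refresh rate."""
    return BACKLIGHT_W + ENERGY_PER_REDRAW_J * refresh_hz

print(panel_power(60))  # 1.6 W
print(panel_power(30))  # 1.3 W: the redraw share of the cost drops in half
```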

There’s also interlacing (that’s why you see a p or an i next to resolutions): p is progressive scan, which draws the whole image each pass, while interlaced draws only half the screen – every other line – per pass, but since it’s so fast you don’t notice.
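A tiny Python sketch of the difference, using a made-up `scan_lines` helper: a progressive pass touches every row, while an interlaced pass touches only every other row, alternating fields.

```python
def scan_lines(height: int, mode: str, field: int = 0):
    """Yield the row indices one refresh pass would redraw."""
    if mode == "progressive":        # the "p" in 1080p: every line, every pass
        yield from range(height)
    elif mode == "interlaced":       # the "i" in 1080i: alternating halves
        yield from range(field % 2, height, 2)

print(len(list(scan_lines(1080, "progressive"))))    # 1080 lines per pass
print(len(list(scan_lines(1080, "interlaced", 0))))  # 540 even lines
print(len(list(scan_lines(1080, "interlaced", 1))))  # 540 odd lines
```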

And as a final note, there’s V-sync.
V-sync, a.k.a. vertical sync, is a technique that locks the video card’s frame output to the monitor’s fixed refresh rate: if the monitor refreshes at 60 Hz, the video card isn’t allowed to present more than 60 frames per second. More advanced technologies like G-Sync and FreeSync instead adapt the monitor’s refresh rate to the video card’s output, but that is mostly to avoid screen tearing – what you get when the video card’s frame delivery doesn’t line up with the monitor’s refresh and you start seeing partially updated screens.
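A minimal sketch of the V-sync idea in Python (`vsync_present` is an illustrative name, not a real graphics API): the frame is only shown on the monitor’s schedule, so the effective frame rate can never exceed the refresh rate.

```python
import time

def vsync_present(monitor_hz: float, last_present: float) -> float:
    """Wait for the monitor's next refresh tick, then "present" the frame."""
    refresh_interval = 1.0 / monitor_hz
    next_tick = last_present + refresh_interval
    delay = next_tick - time.monotonic()
    if delay > 0:
        time.sleep(delay)  # the GPU finished early; hold the frame back
    return time.monotonic()

# A GPU that could render far faster than 60 Hz still presents at most 60 fps.
last = time.monotonic()
for frame in range(5):
    # ... render the frame (fast) ...
    last = vsync_present(60.0, last)
```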

Anonymous 0 Comments

Because it doesn’t have to calculate what’s on the screen as often, which lowers the power consumption of the CPU/APU.