How do laptops manage to deliver very decent performance despite drawing just a fraction of how many watts a desktop draws?


Take a laptop with a mobile graphics card. Mobile graphics cards obviously perform worse than their desktop counterparts; I'd say they perform like the next tier down on desktop (e.g. an RTX 3080 Mobile performs about as well as a desktop RTX 3060 Ti according to UserBenchmark; yeah, I know, not the best site, but close enough in this case). But the mobile card still draws far fewer watts.

Example: [https://www.youtube.com/watch?v=Hz9GfxCAXgs](https://www.youtube.com/watch?v=Hz9GfxCAXgs)

This laptop uses a 200 watt charger. How can a 200 watt charger power a 3070 and an i9 processor? Even if they are just the mobile variants, isn't that almost like expecting a 200 watt power supply to run a 3060 Ti and an i7 from the same generation? Heck, on desktops we don't even run 75 watt cards like the GTX 1650 and 60 watt CPUs like some i5s and i3s off a 200 watt power supply. So what gives?
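To see how the budget can work out, here's a rough back-of-the-envelope sketch. The individual wattages below are illustrative assumptions (typical configurable TGP/PL1 ranges for mobile parts, not measurements from this laptop):

```python
# Hypothetical power budget for a 200 W gaming laptop. Each number is
# an assumed cap for the mobile part, far below its desktop sibling.
laptop_watts = {
    "RTX 3070 Mobile (capped TGP)": 100,   # desktop 3070 draws ~2x this
    "i9 mobile (sustained power limit)": 45,  # desktop i9 is 125 W+
    "display, RAM, SSD, fans, etc.": 25,
}

total = sum(laptop_watts.values())
print(total)  # 170 -> fits under a 200 W adapter with some headroom
```

The point is that mobile parts are sold with hard power caps, so the vendor can literally add the caps up and pick an adapter just above the sum.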


4 Answers

Anonymous 0 Comments

Well, for this one you have 32 gigabytes of RAM paired with an NVMe drive. Plus the GPU is designed for mobile. And at the beginning of a session it's going to have decent thermals; the longer you use it, the hotter it gets and the more it will downclock itself. Unless the thermal design is really good, it will never compare to a desktop. As for the wattage? That's beyond my expertise, but I know the mobile variants consume far less power and are designed to change their clocks/volts on the fly.
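The downclocking behavior described above can be sketched as a toy feedback loop: the chip heats up under load, and once it crosses a temperature limit it steps the clock down until things stabilize. All the constants here are made up for illustration, not real silicon behavior:

```python
# Toy thermal-throttling model: heating scales with clock speed,
# cooling scales with how far the chip is above ambient, and the
# firmware downclocks whenever the temperature limit is exceeded.
def simulate(steps, clock_ghz=4.5, temp_c=40.0, limit_c=95.0):
    clocks = []
    for _ in range(steps):
        # net temperature change this step (arbitrary toy constants)
        temp_c += clock_ghz * 6.0 - (temp_c - 25.0) * 0.2
        if temp_c > limit_c:
            clock_ghz = max(2.0, clock_ghz - 0.3)  # step the clock down
        clocks.append(round(clock_ghz, 1))
    return clocks

print(simulate(10))  # starts at full clock, then steps down as it heats up
```

This mirrors the answer's point: the first minutes of a session run at full clocks, then sustained load pushes the chip against its thermal limit and performance drops.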

I’m always impressed by the malleability of Intel on mobile. I’d never buy them for a desktop but they are really good in a mobile platform. Complete opposite for AMD, they are crap for mobile, but they are the king of desktop.

Anonymous 0 Comments

It’s very similar technology, so they produce similar results. The main factor is that laptops are more compact, with components closer together and less airflow, so they are much more susceptible to overheating. Heat is a byproduct of power draw, so the components are throttled to limit how much power they take to run.

Anonymous 0 Comments

A 3080 Mobile is not as good as a desktop 3070. It’s not even the same league. A regular desktop 3070 alone is triple the height of the whole laptop. It’s the same chip running at much lower clock rates. Gaming notebooks are a marketing thing: buy a gaming PC plus a normal notebook and you’ll usually come out cheaper than a gaming notebook (ok, not today, but when prices are back to normal). So they get decent performance (for a notebook), but nowhere close to the desktop counterpart; you can’t even compare them directly. If an RTX 3080 could fit in 5 mm of height with terrible cooling, the desktop versions wouldn’t take up giant amounts of space in the tower.
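Why does lowering the clock buy so much power? A common first-order model of dynamic power is P ∝ C·V²·f, and voltage has to scale roughly with frequency, so power falls close to cubically with clock speed. This is a simplification (it ignores leakage and the fact that voltage doesn't track frequency perfectly linearly), but it shows the shape of the tradeoff:

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# Assumption: voltage scales linearly with clock fraction, so
# relative power goes as (clock fraction)^3. Illustrative only.
def relative_power(clock_fraction):
    voltage = clock_fraction            # assumed linear V-f scaling
    return voltage * voltage * clock_fraction

print(round(relative_power(1.0), 3))   # 1.0   -> full desktop clock
print(round(relative_power(0.7), 3))   # 0.343 -> ~34% power at 70% clock
```

So giving up 30% of the clock can plausibly cut power by around two thirds, which is exactly the trade a thin chassis forces.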

Anonymous 0 Comments

Couple of things: laptop parts tend to have a few extra optimizations for power, and they are custom designed for their specific needs. This adds up, and laptops don’t ship with an oversized PSU. Specifically, when you hear people recommending a 750W PSU for a gaming system, it’s because they need specific current ratings on specific power rails, and it’s easier to spec a PSU that is guaranteed to have that than to add them up for each rail. On a laptop, they do add them up.

So the Tom’s Hardware review said their test with a 3080 rig used only 334W, and the best GPU-only test of a desktop 3060 Ti I can find used only 193W peak; that desktop system with everything in it probably uses 250-300W total. Most people way over-spec their home desktop PSUs; if you actually did the math and added up real numbers, most gaming systems would run fine on a 450W PSU.
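Doing that math looks something like this. The 193 W GPU peak is from the answer above; the other figures are rough assumed values for a typical gaming desktop, just to show the sum lands well under a 450 W PSU:

```python
# "Add up the rails" sanity check for a desktop 3060 Ti build.
# GPU peak is from the answer above; other numbers are rough assumptions.
desktop_watts = {
    "RTX 3060 Ti (peak, per review)": 193,
    "CPU under gaming load": 65,
    "motherboard + RAM": 30,
    "SSD + fans": 15,
}

measured_total = sum(desktop_watts.values())
print(measured_total)        # 303 -> comfortably under a 450 W PSU
print(450 - measured_total)  # remaining headroom for transient spikes
```

The gap between this sum and the 750 W people actually buy is the "oversized PSU" the answer is talking about.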

Laptop parts typically do have some manufacturing differences to lower consumption. For example, they typically run 20-30 degrees C hotter, because that makes the heat sink more efficient and means less power spent on the fan. Also, a not-insignificant amount of power is used by the memory, so reducing the amount of memory or slowing it down can yield big power savings, even though “it’s the same GPU”.