ELI5: Appliance power consumption and what you pay


How does power consumption work with appliances? For things like refrigerators and coffee machines, is there a low draw of power since they're in an "idle" mode, and it only ramps up when cooling or actually being used? Same with my computer? My graphics card is rated at 341 W. Is that under the highest of loads, or is that a constant draw?


5 Answers

Anonymous 0 Comments

I am pretty sure the watt rating is for the "active" mode, and appliances will use a lot less energy between activations. The wattage needs to be compared to the max amps on the circuit breaker, so the rating reflects the maximum power the appliance will ever use.
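As a rough sketch of that watts-to-amps comparison (the 1500 W heater and the 120 V / 15 A circuit below are just example numbers, not from the post):

```python
# Rough check of whether an appliance fits on a circuit breaker.
# Example numbers only: a hypothetical 1500 W space heater on a
# typical North American 120 V, 15 A household circuit.
rated_watts = 1500      # nameplate maximum power of the appliance
circuit_volts = 120     # nominal circuit voltage
breaker_amps = 15       # breaker rating

draw_amps = rated_watts / circuit_volts   # P = V * I  ->  I = P / V
print(f"Draw: {draw_amps:.1f} A of {breaker_amps} A available")
# Draw: 12.5 A of 15 A available -> fits, but close to the limit
```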

Fridges and coffee makers do almost nothing most of the time, aside from the fridge monitoring its internal temperature and the coffee maker lighting up its buttons or screen.

The graphics card will use some power as long as the computer is on, and full power only when running a graphics-intensive game.

Anonymous 0 Comments

Most appliances pull very little power when idle. The only draw might be a small low-voltage transformer feeding an LED display or a panel of controls, which is negligible compared to, say, a refrigerator compressor running or a microwave heating. For just about any device, however, the maximum wattage on the nameplate is higher than what you would ever see it demand, because it could be operating on a lower-voltage system (208 V vs. 220 V) or in a hot environment with a dirty coil. So the data plate showing the maximum wattage is usually a worst-case scenario.

Anonymous 0 Comments

The only way to really know what an appliance is using is to measure it. For anything that plugs into the wall, you can buy a plug-in power meter for under $20.

Anonymous 0 Comments

It really depends; "appliance" is a broad term.

A fridge needs the most energy (more power, for longer) when it's first put into operation, because it has to cool itself and all of its contents down from room temperature. Once everything is at the set temperature, it only has to sporadically compensate for however far it has drifted from that temperature. That drift happens because no thermal insulation is perfect, because the door gets opened from time to time, because new items go in warmer than the fridge, and so on.
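To put a very rough number on that initial cool-down (all figures below are assumptions for illustration, not measurements):

```python
# Very rough estimate of the energy to cool newly loaded groceries.
# Assumed numbers: 5 kg of food, mostly water, cooled from 22 C to 4 C,
# and a fridge that moves ~2 J of heat per J of electricity (COP ~ 2).
mass_kg = 5.0
specific_heat_kj_per_kg_k = 4.2   # roughly that of water
delta_t = 22 - 4                  # degrees C to remove
cop = 2.0                         # assumed coefficient of performance

heat_to_remove_kj = mass_kg * specific_heat_kj_per_kg_k * delta_t
electricity_kwh = heat_to_remove_kj / cop / 3600   # 1 kWh = 3600 kJ
print(f"About {electricity_kwh:.2f} kWh to cool the new food down")
# ~0.05 kWh, roughly a cent of electricity -- the ongoing trickle of
# compensating for imperfect insulation adds up to far more over time.
```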

A coffee machine really uses power when it has to heat water; some filter coffee machines also use a bit to keep the warming plate hot. A TV uses power when it has to show video and play sound. A washing machine uses power when it's washing: a bit to pump water, another bit to spin the drum (more to get it spinning, less to keep it spinning at the same speed), a lot to heat water for higher-temperature programs, and a little for the display and such.

Your PC has a baseline power draw when turned on and idling, and from there it needs more power the more performance you demand from it. The same goes for the graphics card itself. When you start a graphics-intensive game and all the fans spin up, that's because the card gets hotter, which in turn is because it has more work to do, needs more power to do it, and that power ultimately ends up as heat. Your 340 W card will likely (and hopefully) use less than 10% of that just to show you a spreadsheet.
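A quick sketch of why the water-heating part mentioned above dominates for a coffee machine (the litres, temperatures, and electricity price are just assumed example values):

```python
# Back-of-the-envelope energy to brew a pot of filter coffee.
# Assumptions: 1 litre of water heated from 20 C to 95 C, heater
# close to 100% efficient, electricity at $0.20 per kWh.
litres = 1.0
specific_heat_kj = 4.186          # kJ per litre per degree C (water)
delta_t = 95 - 20
price_per_kwh = 0.20

energy_kwh = litres * specific_heat_kj * delta_t / 3600
print(f"{energy_kwh:.2f} kWh, about {energy_kwh * price_per_kwh * 100:.1f} cents per brew")
# ~0.09 kWh (~1.7 cents) -- the clock and buttons are tiny by comparison,
# unless the warming plate stays on for hours afterwards.
```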

Anonymous 0 Comments

It varies from appliance to appliance.

Many appliances have a little idle circuitry/lights/etc. when "off," so they might consume a few watts (say 20c per 100 hours of idle, or less).
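As a sanity check on that "20c per 100 hours" figure (assuming roughly 10 W of standby and $0.20 per kWh, both just example values):

```python
# Standby cost of an appliance that idles at a few watts.
standby_watts = 10            # assumed standby draw
price_per_kwh = 0.20          # assumed electricity price in $/kWh
hours = 100

cost = standby_watts / 1000 * hours * price_per_kwh
print(f"${cost:.2f} per {hours} h of standby")   # $0.20 per 100 h at 10 W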

Appliances like a fridge or oven typically cycle on/off to hold the correct temperature, so they'll use their rated power during the on portion of the cycle. Say a fridge is rated 500 W and cycles on 25% of the time to hold temperature: the cost is 0.5 kW × 25% × $0.20 per kWh ≈ 2.5 cents an hour.
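The same duty-cycle arithmetic as a tiny sketch (the 500 W rating, 25% duty cycle, and $0.20/kWh are the example values from above):

```python
# Hourly running cost of a fridge that cycles its compressor.
rated_kw = 0.5          # compressor power when running
duty_cycle = 0.25       # fraction of the time it is actually on
price_per_kwh = 0.20    # electricity price

cost_per_hour = rated_kw * duty_cycle * price_per_kwh
print(f"{cost_per_hour * 100:.1f} cents per hour")   # 2.5 cents per hour
```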

Other appliances like PCs and higher-end fridges/ACs (the "inverter" based ones) can run at a variable power level. Your GPU might use around 20 W when you're browsing the internet but far more when you're playing a game. The rated figure is the maximum draw.
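A rough daily cost comparison for the GPU case (the 20 W idle and 340 W rated figures come from this thread; the hours of use and the $0.20/kWh price are assumptions):

```python
# Daily GPU electricity cost: light desktop use vs. gaming at full tilt.
idle_watts = 20          # browsing / desktop
load_watts = 340         # near the card's rated maximum
price_per_kwh = 0.20     # assumed electricity price

idle_cost = idle_watts / 1000 * 8 * price_per_kwh    # 8 h of desktop use
game_cost = load_watts / 1000 * 2 * price_per_kwh    # 2 h of gaming
print(f"Desktop: ${idle_cost:.2f}/day   Gaming: ${game_cost:.2f}/day")
# Desktop: $0.03/day   Gaming: $0.14/day
```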