A smart device on standby uses a relatively small amount of electricity compared to when the device is active, even for something as efficient as an LED light bulb. You make that trade-off for multiple reasons. One is the convenience of not having to manually switch one or more devices on or off; you can do it remotely via some other interface, like an app. “Convenience” is subjective, of course: one person may not find it worth the few extra dollars a year they might pay in electricity.
But – as the name “home automation” suggests – you could instead automate this behaviour, perhaps tying it to the time of day, a light sensor, a motion sensor etc. Depending on what devices you automate and what rules you apply, you can actually reduce your electricity costs, more than making up for the minimal power draw from perpetual standby. Think devices like fridges, HVAC systems, washing machines etc, turning them on only when required or at the time of day when electricity is cheapest. The more devices you have and the more complex those requirements, the less feasible that becomes without some central smart system to control it all.
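To make that concrete, here's a minimal sketch in Python of the kind of rule a central hub might evaluate. The off-peak window, the function names and the idea of a "requested" appliance are all illustrative assumptions, not taken from any particular smart home platform or tariff.

```python
from datetime import time, datetime

# Hypothetical off-peak window; real tariffs vary by provider.
OFF_PEAK_START = time(22, 0)   # 10 pm
OFF_PEAK_END = time(6, 0)      # 6 am


def is_off_peak(now: datetime) -> bool:
    """True if the given time falls in the cheap-electricity window."""
    t = now.time()
    # The window wraps past midnight, so check both sides of it.
    return t >= OFF_PEAK_START or t < OFF_PEAK_END


def should_run(appliance_requested: bool, now: datetime) -> bool:
    """Only run the appliance when it's actually needed AND power is cheap."""
    return appliance_requested and is_off_peak(now)


# Example: a washing machine queued during the day waits until the off-peak window.
if __name__ == "__main__":
    print(should_run(appliance_requested=True, now=datetime(2024, 1, 1, 23, 30)))  # True
    print(should_run(appliance_requested=True, now=datetime(2024, 1, 1, 14, 0)))   # False
```

The point isn't the specific rule; it's that once you have dozens of devices with rules like this layered on top of each other, you need something always on (and always drawing a little standby power) to evaluate them.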
You’ll also find that a lot of modern electrical appliances already consume standby power anyway. If you can turn your TV on with the remote, it’s using power in standby. The *additional* power required to make them compatible with some smart home system and receive a power-on signal from the local network is basically nothing. Same goes for anything with a timer, a clock or some other always-on display.