With the so-called "cost of living crisis" and energy prices rising, I have been checking the energy use of various computers and gadgets in my vicinity. Even before the current crisis there was talk that the rising energy use of computers might be unsustainable in the long run.
What my own domestic electrical testing tells me is that the computers I'm using consume very little electricity compared to the common appliances, such as lights, the kettle and the oven. Even a rough back-of-the-envelope calculation bears this out: a kettle draws roughly 3 kW while it heats, several times what a high specification gaming PC with a giant graphics card pulls when running flat out (typically 400-600 W), and the 0.1 kWh or so used in a single two-minute boil would run a typical laptop for around three hours.
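For anyone who wants to redo the sums with their own numbers, here is a minimal Python sketch of that back-of-the-envelope calculation. The wattages (3 kW kettle, 500 W gaming PC at full load, 30 W laptop) are assumed typical figures rather than readings from my own meter, so swap in whatever your plug-in power meter reports.

```python
# Assumed typical power draws, in watts -- replace with your own measurements.
KETTLE_W = 3000      # electric kettle while heating
GAMING_PC_W = 500    # high-spec gaming PC, CPU + GPU at full load
LAPTOP_W = 30        # typical laptop under light use

def kwh(watts: float, hours: float) -> float:
    """Energy in kilowatt-hours for a device drawing `watts` for `hours`."""
    return watts * hours / 1000

kettle_boil = kwh(KETTLE_W, 2 / 60)  # one two-minute boil

print(f"One kettle boil:         {kettle_boil:.2f} kWh")
print(f"Gaming PC, 1h full load: {kwh(GAMING_PC_W, 1):.2f} kWh")
print(f"Laptop, 8h working day:  {kwh(LAPTOP_W, 8):.2f} kWh")

# How many hours of laptop use one kettle boil is worth
print(f"One boil ~= {kettle_boil / (LAPTOP_W / 1000):.1f} h of laptop use")
```

With these assumptions the boil comes out at about 0.1 kWh, roughly three hours of laptop use, while an hour of heavy gaming costs around 0.5 kWh.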
So the energy unsustainability of computing seems to be coming not from ordinary domestic use of computers but from cloud computing in data centers and/or cryptocurrency mining. Estimates of the energy cost of tablets and IoT devices appear to include the energy used by the data centers that serve them.
So when it comes to saving energy, the things to focus on are the *boring tech* of lights and kitchen appliances, especially anything that heats. Trying to save a few watts on computers won't make much of a reduction in electricity bills.