The news of a ban on the sale of high-power-consumption computers in some US states made more than a few people roll their eyes. Imagine visiting the Dell website to buy a nice Alienware gaming PC and finding yourself facing a warning that blocks shipping to your state due to consumer energy laws. But how much do these machines actually consume? When it comes to energy there is always a lot of confusion, so let's try to shed some light.

Gaming is not a primary productive human activity. We do it mainly as a pastime, as recreation, because we like it. And that's enough; there is no need for anything else. Video games require electronic machines, and those machines in turn need electricity. Electricity is used to perform work: the mathematical calculations necessary to show images on the screen. The power of the machine is tied to that work. The more I want beautiful, high-resolution images with realistic graphic effects, and the faster I want them calculated (high framerate), the more power I need.

But by what criteria are an i5 11400F and a GTX 1650 Super considered high consumption?

A continuous rush to do more

If it is true that each new generation of hardware increases efficiency, it is unfortunately also true that this improvement is not used to consume less, but to pack more power than before into the same space. Resolutions and framerates keep climbing, and the monitors that support them are more energy-intensive too.

Hence, while Nvidia's GTX 1080 Ti had a 250W TDP in 2017 for 10 TFLOPS of compute, the 3080 Ti gives us 34 TFLOPS at a 350W TDP. So yes, we went from 25W per teraflop to about 10W per teraflop, but at the end of the day we are consuming 100W more than in the past. And in my opinion, most of that power is used badly. No, not just because you insist on playing CoD Warzone. Consider that a Super Nintendo consumed just 17W.
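The comparison above can be reduced to a two-line calculation. A minimal sketch, using only the TDP and TFLOPS figures quoted in this article:

```python
# Efficiency vs. absolute consumption: W per teraflop for two Nvidia GPUs.
# TDP and TFLOPS figures are the ones quoted in the article.
gpus = {
    "GTX 1080 Ti (2017)": {"tdp_w": 250, "tflops": 10},
    "RTX 3080 Ti (2021)": {"tdp_w": 350, "tflops": 34},
}

for name, specs in gpus.items():
    w_per_tflop = specs["tdp_w"] / specs["tflops"]
    print(f"{name}: {w_per_tflop:.1f} W per TFLOP")
```

Efficiency per teraflop improved by roughly 2.5x, yet the absolute draw still rose by 100W: the efficiency gain was spent on more performance, not less consumption.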

Calculating the power consumption of a fully loaded computer is relatively simple. The old rule of thumb went like this: take your CPU's TDP, add your GPU's TDP, add 50W for the rest of the system, and multiply the result by 1.2. Unfortunately, with today's fashion of ignoring every power limit, overclocking everything, using liquid cooling, putting fans and RGB everywhere, and launching every game without optimizing your machine, this estimate can be off, but it is still a useful one.
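The rule of thumb translates directly into a small function. A sketch, with the 50W system overhead and 1.2 margin taken from the rule as stated above:

```python
def estimate_system_power(cpu_tdp_w: float, gpu_tdp_w: float,
                          rest_of_system_w: float = 50.0,
                          margin: float = 1.2) -> float:
    """Rule-of-thumb full-load power estimate:
    (CPU TDP + GPU TDP + 50W for the rest) * 1.2 safety margin."""
    return (cpu_tdp_w + gpu_tdp_w + rest_of_system_w) * margin

# i5 11400F (65W TDP) + GTX 1650 Super (100W TDP)
print(estimate_system_power(65, 100))  # 258.0
```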

To obtain the energy consumed, simply multiply the power by the number of hours of activity.

Let's take a $1,099 Alienware, the undeliverable computer, which has entry-level components. i5 11400F: 65W TDP. GTX 1650 Super: 100W TDP. So (65 + 100 + 50) × 1.2 = 258W. For comparison, the new-generation consoles at full load consume about 200W. To know how much energy you consume in a year, multiply the peak power by the hours of gaming during the year and you get the Wh drawn from the grid. So California blocks everything over a measly 260W?
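Putting power and time together gives the annual figure. A sketch, where the 2 hours/day gaming habit is an illustrative assumption, not a number from the article:

```python
# Annual energy: peak power (W) * gaming hours per year = Wh/year.
peak_power_w = 258     # estimate for the Alienware build above
hours_per_day = 2      # assumed gaming habit (illustrative)

annual_wh = peak_power_w * hours_per_day * 365
print(f"{annual_wh / 1000:.1f} kWh per year")  # 188.3 kWh per year
```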

In reality, measuring the effect at the state or even global level is much more complicated. Just take a look at the Steam hardware surveys to see that existing systems are extremely varied, ranging from 800W monsters to more modest notebooks that can play with less than 100W. Furthermore, consumption depends heavily on usage, not only on the type of machine you have.

Therefore, states such as California are evaluating every element of energy consumption from every angle, and gaming is one of them. This is why the report produced on the energy consumption of video games runs to 92 pages. I invite you to read it in full, as it highlights the complexity of managing a system with so many variables. The conclusion of the study is very simple: the machines on the market are often inefficient.

In "Motherboard" we include CPU, chipset and memory. Notice how my back-of-the-envelope calculation above nailed the mid-range.

We can put this in comparative perspective. There is a lot of talk about how Bitcoin is a waste of energy, because power is consumed just to generate a currency. Well, gaming consumes more than Bitcoin globally: if the virtual currency weighed in at 80 TWh in a year, gaming easily exceeds 100 TWh. Italy consumes 280 TWh in a year. Consider that the consumption of gaming in California alone in 2020 was higher than the energy needs of Ghana or Ethiopia in the same year: 112 million people use less energy to live than 40 million use just to play. We often think of the difference between nations as a difference of wealth, of money. But the real difference is how much energy we have and how we use it. This alone should make us understand the width of the gulf in world wealth, and also the different nature of the problems.

Should we try to turn towards a more ecological way of playing? Should we have an energy rating for computers too? Here the situation becomes complex, as there are not only pre-built computers but also individual components. And as mentioned above, this is territory where enthusiast users want maximum performance, with overclocking procedures that are inherently inefficient.

A study from as far back as 2013 shows that different configurations of pre-assembled PCs have different performance/efficiency ratios.

The equation for the dynamic power of a processor is P = C · f · V², where C is the equivalent capacitance (which for simplicity we assume constant), f is the working frequency, and V is the applied voltage. Frequency is directly proportional to computing power. However, to sustain high frequencies, the working voltage must also be increased. When a chip is designed, manufacturers test its operation at different voltage and frequency points. The products that end up on our desktops are the ones that maximize performance; have you noticed how overclocking margins have shrunk in recent years? Those that end up in laptops use an operating point that maximizes efficiency, all helped by an upstream process of selecting the most efficient silicon.
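The quadratic dependence on voltage is what makes the desktop/laptop split so stark. A numerical sketch of P = C · f · V², where the capacitance and the two operating points are illustrative assumptions, not measured values:

```python
# Dynamic power: P = C * f * V^2.
# C and the two operating points below are illustrative assumptions.
def dynamic_power(c_farads: float, freq_hz: float, volts: float) -> float:
    return c_farads * freq_hz * volts ** 2

C = 1e-9  # assumed equivalent switched capacitance (F)

desktop = dynamic_power(C, 4.8e9, 1.30)  # performance-oriented point
laptop = dynamic_power(C, 3.2e9, 0.95)   # efficiency-oriented point

print(f"desktop point: {desktop:.2f} W, laptop point: {laptop:.2f} W")
```

Note that the desktop point runs only 1.5x the frequency of the laptop point, yet draws about 2.8x the power, because the higher frequency also demands higher voltage, which enters the equation squared.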

So what then? To reduce our footprint on planet Earth, must we now also stop gaming? Do we have to give up 480 fps and 16K? No. Many of the technologies introduced in recent years, such as DLSS or FidelityFX Super Resolution, are seen purely as systems to increase performance by reducing the rendering resolution, which makes each frame lighter and therefore faster to compute. These systems can instead be used together with frame limiters to ensure adequate performance at reduced consumption.

Nvidia itself shows us the power-versus-performance trend in this chart when explaining its Max-Q approach.

A quick numerical example. Say I have a game running at 50 fps; activating DLSS might push it to 80 fps. Well, if it's not a very fast-paced game, capping it at 60 might be fine. That yields energy savings together with a performance improvement over not having DLSS active at all. The truth is that to reduce our energy impact on the planet, we need to make choices. Wise choices. Which none of us make of our own free will.
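The example can be sketched in numbers. This assumes, simplistically, that GPU power scales roughly with the fraction of its frame-rendering capacity actually used; the 200W full-load figure is an illustrative assumption:

```python
# DLSS + frame-cap sketch. Assumes GPU power scales roughly with the
# fraction of renderable frames actually rendered (illustrative model).
gpu_full_load_w = 200   # assumed GPU power at 100% utilization

dlss_fps = 80           # game with DLSS active, uncapped: GPU fully loaded
capped_fps = 60         # frame cap applied on top of DLSS

# With the cap, the GPU renders 60 of the 80 frames it could produce,
# so utilization (and, under our assumption, power) drops proportionally.
capped_power_w = gpu_full_load_w * capped_fps / dlss_fps
print(f"uncapped: {gpu_full_load_w} W at {dlss_fps} fps")
print(f"capped:   {capped_power_w:.0f} W at {capped_fps} fps")
```

Under this simple model you get 60 fps instead of the native 50, while the GPU draws roughly a quarter less power than it would running uncapped.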

The California study also highlights a couple of other very interesting metrics. Cloud gaming is much more energy-intensive than playing on a local platform; sometimes consumption is as much as 200% higher. It therefore does not seem to be the way forward, even though centralized servers can exploit renewable sources more effectively, and so may consume more energy while having a lower CO2 footprint.

Why is California so zealous in this respect? Because it really is aiming to be super green and run only on renewables. Yes, but sometimes the safety margins have been missing. The problem with renewable energy production is that it is not constant: solar and wind depend on how much sun shines and how much wind blows. So either a way is found to store the excess energy, or a crisis is just around the corner. California has had numerous problems with its power grid over the past two years, problems that led to blackouts during critical heat waves and forced citizens to change their consumption habits, such as charging electric cars only during off-peak hours.

At sunset, its energy production drops dramatically, so it needs to import a lot of energy from neighboring states, and this alone has quadrupled the price of energy in a few years. Fires in Oregon cutting off supply lines, old coal-fired power plants unusable as backups, and the disappearing sun are all ingredients for blackouts. An amendment was even passed allowing conventional power plants to produce while ignoring previously set pollution limits. Because it was needed.

So you can understand why a state trying to restructure its entire energy distribution network, while managing the unexpected without imploding, will go after every bit of avoidable energy consumption it can find. Especially one that has become a significant item of household consumption.

It makes sense to set up specific laws in this sector too, to promote more efficient machines rather than just ever more powerful ones at the expense of everything else, and perhaps to push the message that an efficient as well as powerful computer is something to aim for. Products that do not meet the energy-efficiency specifications cannot be sold. We already have energy classes for household appliances, so it is certainly nothing new. Maybe now PC makers will stop shipping flimsy power supplies that fold under load. Buy that Platinum-rated, 90%-efficient PSU, come on.