The technologies that drive our world are changing rapidly. From smart appliances to AI software to cryptocurrency, new technology is reshaping our lives whether we notice it or not. These changes can be overwhelming to keep up with as an individual, and it turns out there are bigger costs to these innovations as well. New tech often demands a lot of energy and physical resources, a cost that tends to be considered a little too late.
Take Bitcoin, for example. When Bitcoin first became available in 2009, there was a frenzy around it: people got excited, some people made money, and a lot of us just tried to figure out what it was. But as cryptocurrency became more prevalent over the following years, environmental concerns started to surface. It has become clear that Bitcoin mining (basically, the process of verifying Bitcoin transactions and creating new Bitcoin by solving complex math problems) consumes a staggering amount of energy and physical space in data centers, because it requires a lot of high-powered computers working constantly. And as of now, much of that energy comes from burning fossil fuels.
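To get a feel for what "solving complex math problems" means in practice, here's a toy Python sketch of Bitcoin-style proof of work. It's heavily simplified (real mining hashes a structured block header against a far harder target), but it shows why the process is pure brute force:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 hash starts with `difficulty` zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # found a valid "solution" to the puzzle
        nonce += 1

# Even at this toy difficulty, tens of thousands of hashes are typically needed;
# the real Bitcoin network demands trillions of times more work per block.
print(mine("example transactions", 4))
```

The only way to win is to guess faster than everyone else, which is exactly why mining operations fill warehouses with specialized computers running around the clock.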
The issue of energy and Bitcoin came to a head this summer during a Texas heat wave. The State of Texas, concerned about the stability of the electric grid and wanting to avoid life-threatening blackouts, paid a Bitcoin mining facility $31.7 million to pause operations and cut electricity usage. And this wasn't a one-off: because the strategy worked and there were no blackouts, Texas plans to use it regularly to manage periods of strain on the grid. But the situation raises some big questions. Why is this new technology taking up so much energy? And are there better ways to manage this for a greener future?
More broadly, this issue with Bitcoin raises questions about the energy cost of all new technologies. Over a decade after the advent of cryptocurrency, we're only now seeing conversations about how to build and power new tech responsibly. With cryptocurrency, for example, there are alternative methods of verification that would do basically the same job on 10% of the energy Bitcoin uses. And any electricity used could come from green sources instead of fossil fuels, if the infrastructure had been built with that in mind. But because energy cost wasn't front of mind while the system was being built, we're now playing catch-up.
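The best-known of those alternative verification methods is proof of stake, which does away with the hash race entirely. Here's a toy illustration (the names and stake amounts are made up): instead of burning electricity on brute-force guessing, the network picks the next validator at random, weighted by how much currency each participant has locked up as collateral.

```python
import random

# Toy proof-of-stake selection: no hash race, so almost no energy is spent.
stakes = {"alice": 50.0, "bob": 30.0, "carol": 20.0}  # staked coins (hypothetical)

# Pick the next validator with probability proportional to their stake.
validator = random.choices(list(stakes), weights=list(stakes.values()))[0]
print(f"{validator} validates the next block")
```

A single weighted random draw replaces trillions of hash computations, which is where the energy savings come from.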
Considering the energy consumption of a new technology earlier would help us develop both the product and the physical facilities behind it, like data centers, in much more sustainable ways. That way, as we develop and adopt new technologies, we don't jeopardize the future of the planet in the process.
So, with that in mind, let’s talk about the current big thing: artificial intelligence (AI).
AI is a broad term that refers to "the science and engineering of making intelligent machines," as computer scientist John McCarthy, who coined the term, put it. There are a lot of different kinds of AI, and it's pretty clear that we're in a time of explosive growth in AI technology. Whether or not it's apparent, AI is being implemented in almost every sector, from Hollywood to internet search engines to the energy sector. And with that, the demand for energy and data center capacity is exploding.
This is because AI models are highly complex computer systems that require a lot of physical space and energy to use. As AI becomes more sophisticated, the energy required to build and run the models is only increasing. The possibilities presented by AI are astounding, and a lot of industries are looking into AI as a part of their future, but large-scale use of AI poses some serious challenges to our energy infrastructure and environment.
First, there's the power required to create and run the models themselves. Take ChatGPT, for example. It's estimated that training the language model behind ChatGPT took 1,287 megawatt-hours of electricity and generated 552 tons of carbon dioxide, and that's just to get the product off the ground. The model also keeps "learning," meaning it's periodically fed more data and retrained, which takes still more energy.
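Those two numbers also imply something about the electricity behind them. A quick back-of-envelope check, taking the cited estimates at face value:

```python
# Implied carbon intensity of the electricity used for training.
training_energy_mwh = 1_287   # estimated training electricity (from above)
training_co2_tons = 552       # estimated training emissions (from above)

kg_co2_per_kwh = (training_co2_tons * 1_000) / (training_energy_mwh * 1_000)
print(f"{kg_co2_per_kwh:.2f} kg CO2 per kWh")
# ~0.43 kg/kWh: in the ballpark of a typical fossil-heavy grid mix, which is
# why where a model is trained can matter as much as how big it is.
```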
Then, each actual use of an AI product like ChatGPT requires a lot of data moving through a server, which translates to a lot of energy. Statistics on how much electricity is actually used this way aren't clear yet, but a recent study estimates that if every Google search were AI-powered, searches would consume as much electricity as the entire country of Ireland, because AI increases the energy cost of each search dramatically.
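The arithmetic behind estimates like that runs roughly as follows. The server count and per-server power draw below are the assumptions one widely cited analysis used, not measured values:

```python
# Rough reconstruction of the "as much electricity as Ireland" estimate.
ai_servers = 512_821   # servers needed if every search used AI (assumption)
kw_per_server = 6.5    # power draw per server, including overhead (assumption)

annual_twh = ai_servers * kw_per_server * 24 * 365 / 1e9  # kWh -> TWh
print(f"{annual_twh:.1f} TWh per year")  # ~29 TWh, close to Ireland's annual usage
```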
And finally, there are the data centers: large facilities filled with computer servers, where the physical end of an AI model is housed along with the servers behind basically everything we do digitally. Data centers require a ton of resources to build, like computer chips and metals, and they consume a lot of electricity. The computers themselves need power 24/7, and the facilities run very powerful cooling systems to keep those computers from overheating. The boom in AI has caused a surge in demand for data centers, which is putting pressure on electric grids and consuming a lot of resources.
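The industry's standard yardstick for that cooling-and-overhead burden is power usage effectiveness (PUE): total facility energy divided by the energy that actually reaches the computing equipment. Here's the idea with made-up numbers:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: 1.0 would mean every watt reaches the servers."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical facility: 15 GWh drawn from the grid, 10 GWh of it for computing.
print(f"PUE = {pue(15e6, 10e6):.2f}")  # 1.50: cooling and overhead add 50%
```

Industry surveys put typical PUE around 1.5, while the most efficient hyperscale facilities get close to 1.1, so cooling alone is a major slice of the electricity story.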
Larger corporations, such as Microsoft, are looking for solutions, including investing in their own nuclear power plants specifically to power their AI systems. Microsoft is even training an AI model to help expedite applications for these nuclear plants, which are highly regulated and extremely hard to get approved. While it's promising that companies are looking to invest in power generation instead of taxing the public grid, the focus on nuclear is less than ideal. Nuclear counts as carbon-neutral, but it's not a renewable source of energy, since it requires mined uranium. A more sustainable solution will hopefully involve truly renewable energy, but because companies need 24/7 access to electricity, intermittent sources like wind and solar pose some logistical concerns.
That said, AI itself may be part of the solution to powering AI models sustainably, and there's a lot of promise in using AI to manage and optimize a green energy future. The energy sector is already investing in AI and machine learning to forecast supply from wind and solar farms, predict peak demand, and prevent grid failures through predictive maintenance and monitoring. This could help manage the variability of renewables and open a path toward powering AI with more renewable energy. If these models can indeed be powered with renewable energy, they could be instrumental in building a stable green energy grid going forward.
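To make "predicting peak demand" concrete, here's a deliberately tiny sketch of the forecasting problem. Real grid operators use far richer models with weather data, calendars, and neural networks; this just shows the shape of the task, with made-up numbers:

```python
from statistics import mean

def forecast_next_hour(hourly_demand_mw: list[float], window: int = 3) -> float:
    """Naive forecast: the average of the last `window` hours of demand."""
    return mean(hourly_demand_mw[-window:])

# Hypothetical afternoon ramp on a hot day (megawatts).
recent = [620.0, 640.0, 700.0, 760.0, 810.0]
print(f"Next hour: ~{forecast_next_hour(recent):.0f} MW")
# A utility would compare forecasts like this against expected wind and solar
# supply to decide when to charge storage or call on backup generation.
```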
With any new technology, it's important to consider what it will actually mean for the planet as it becomes more popular. Whether it's cryptocurrency, AI, or the next new thing, there's a physical and environmental cost to everything digital. Developing new technologies is an essential part of a green future, so we need to weigh the environmental impact of innovations early. The more sustainably we can build our tech infrastructure from the beginning, the better the future will look.