The secret environmental cost hiding inside your smart home device


AI could become a major source of planet-warming emissions. How can the smart home industry build products more sustainably?

A home sits in the hallway of a vast server room, haze rising from the servers to suggest the environmental footprint of a smart home.

Illustration by Nico H. Brausch for The Verge

Vijay Janapa Reddi runs a lab at Harvard University where he and his team attempt to solve some of the computer world’s greatest challenges. As a specialist in artificial intelligence systems, the technology he studies even follows him home, where his two daughters love to talk to their Amazon Alexa. 

“They put a person inside that black box,” Janapa Reddi likes to joke with his four-year-old.

Janapa Reddi may be teasing when he tells his daughter a person is squeezed into their machine, but isn’t that where we’re headed? Smart home devices may never host a miniature human being inside of them — this isn’t that one episode of Black Mirror — but as the AI ecosystem evolves, voice assistants will quickly begin to feel hyperrealistic. Indeed, tech companies like Amazon are now attempting to integrate large language models, the technology behind tools like OpenAI’s ChatGPT, into smart home devices to elevate user interaction. 

“These devices are finally coming a step closer to how we naturally interact with the world around us,” Janapa Reddi said. “That’s a pretty transformative experience.”

But a machine can’t behave like a human without a cost. All that intelligence requires massive amounts of data — and the computers storing and processing that data require loads of energy. At the moment, over 60 percent of the world’s electricity generation comes from fossil fuels, the main contributor to climate change. A study published in the journal Joule in October found that widespread integration of generative AI could send energy demands soaring. In one worst-case scenario from the analysis, the technology could consume as much electricity in a year as the entire country of Ireland.

Climate change is already exacerbating heatwaves; last summer was the hottest on record. To make matters worse, the climate crisis is shrinking supplies of the water that some data centers need to stay cool. To keep a bad situation from getting worse, scientists have been urging world leaders to phase out fossil fuels, and some advocates have demanded that Congress take action on the energy burdens the AI sector presents.

These concerns link two of society’s most seemingly apocalyptic scenarios: world-dominating AI and world-ending climate change. Are smarter (and more energy-intensive) smart homes really worth the trouble?

Janapa Reddi uses his Amazon Alexa to listen to the news or music. His youngest daughter, on the other hand, often asks Alexa to play “The Poo-Poo Song,” her current obsession. Indeed, there’s something satisfying about coming home after a long day to find your lights dimmed and the temperature set just how you like it. Smart homes are kind of magical in this way: they learn a user’s behaviors and needs. 

Though AI has become a buzzword this year with the rise of ChatGPT, it’s been in the background for many years. The AI most people know about and interact with — including in their smart homes — has been around for about 10 years. It’s called machine learning or deep learning. Developers write programs that teach voice assistants what to say when someone asks them for the time or a recipe, for instance.

Smart homes are capable of doing an impressive amount of work, but the technology behind them isn’t as complex as, say, GPT. Alexa gives the same answer to pretty much everyone, and that’s because it’s preprogrammed to do so. The machine’s limited responses, which are processed locally in a person’s home, keep its energy demands quite low.
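To make “preprogrammed” concrete: a rule-based assistant boils down to matching a spoken command against a fixed table of intents and reading back a canned reply, which is why the work can stay on the device. Here is a minimal, hypothetical Python sketch of that pattern; it illustrates the idea and is not code from Alexa or any real assistant.

```python
# Hypothetical sketch of a rule-based voice assistant: every supported command
# maps to a preprogrammed handler, so answering is a cheap local lookup.
from datetime import datetime

HANDLERS = {
    "what time is it": lambda: f"It's {datetime.now().strftime('%I:%M %p')}.",
    "turn on the lights": lambda: "Okay, turning on the lights.",
    "play the news": lambda: "Here's the latest news briefing.",
}

def respond(command: str) -> str:
    # Normalize the utterance and look for a known intent.
    key = command.lower().strip().rstrip("?.!")
    handler = HANDLERS.get(key)
    if handler is None:
        # Anything outside the fixed list gets a stock fallback reply.
        return "Sorry, I can't help with that."
    return handler()

print(respond("What time is it?"))
```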

“The current type of AI that is in these systems are pretty simplistic in that they don’t take in a lot of factors when making decisions,” said William Yeoh, an associate professor of computer science and engineering at Washington University in St. Louis. 

GPT, on the other hand, generates original responses to every query. It considers many factors when it’s deciding how to respond to a user. How was the prompt worded? Was it a command or a question? Is the question open-ended or factual? Generative AI is fed immense amounts of data — trillions of different data points — to learn how to interpret questions with such intelligence and then generate unique responses. 

“You never tell [the system] that these are things people might ask because there’s an infinite number of questions people could ask,” said Alex Capecelatro, CEO of AI company Josh.ai, which has built a generative AI smart home system. “Because the system is trained on all of this knowledge… the information is able to be retrieved in pretty much real-time.”

What if this type of deep learning were applied to smart homes? That’s what Capecelatro set out to do back in 2015 when he and his team began building Josh.ai. The result, JoshGPT, is still being refined, but the company believes it is “the first generative AI to be released in the smart home space.” The technology has processed millions of commands in the six months it has been live, and Capecelatro hopes to expand to an international market by early 2024. 

For him, this sort of integration is the future: “The old AIs are kind of like a vending machine. You get to pick from the options that exist, but those are the only options. The new world is like having the world’s smartest and most capable chef who can make whatever you ask.”

Josh.ai isn’t the only company investing in a new smart home ecosystem. In September, Amazon previewed the new iteration of Alexa: one that’s “smarter and more conversational,” per the company’s announcement. Its technology will assess more than verbal directions; it will even follow a user’s body language to offer the perfect response. Meanwhile, in October, Google announced new generative AI capabilities that will help users write grocery lists or captions for social media posts. So far, Google hasn’t released plans to add this upgrade to smart home speakers, but it feels like a natural progression.

Smart home proponents like Capecelatro believe the technology can cut a household’s carbon footprint by automating tasks that reduce energy use — like lowering the blinds to keep a room cool or raising them to add natural light. Buildings account for over a third of global greenhouse gas emissions. One report from research firm Transforma Insights found that connecting buildings to smart home technologies could reduce global energy consumption by about 5 percent.
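The automations proponents point to are, at bottom, simple rules that react to sensor readings. As a rough, hypothetical sketch (the thresholds and function here are invented for illustration, not taken from any real product), a blind controller might look like this:

```python
# Hypothetical energy-saving rule: manage blinds based on indoor temperature
# and sunlight so the HVAC system works less. Values are illustrative only.

def adjust_blinds(indoor_temp_c: float, target_temp_c: float,
                  outdoor_light_lux: float) -> str:
    SUNNY_LUX = 10_000  # rough threshold for direct sun on the window

    if indoor_temp_c > target_temp_c and outdoor_light_lux > SUNNY_LUX:
        # Too warm on a bright day: block solar gain instead of running the AC harder.
        return "lower"
    if indoor_temp_c < target_temp_c and outdoor_light_lux > SUNNY_LUX:
        # Too cool on a bright day: let free heat and daylight in.
        return "raise"
    return "hold"

print(adjust_blinds(indoor_temp_c=26.0, target_temp_c=22.0, outdoor_light_lux=40_000))  # "lower"
print(adjust_blinds(indoor_temp_c=19.0, target_temp_c=22.0, outdoor_light_lux=40_000))  # "raise"
```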

Suruchi Dhingra, research manager at Transforma Insights, spoke enthusiastically at length about smart blinds, smart lighting, and smart HVAC systems, shedding light on the energy savings they offer. But when asked about generative AI smart home integration, Dhingra looked confused: “Is there actually a need?”

It’s an important question to ask considering how much more energy goes into training and running AI models like GPT compared to the models in current smart home devices. Emissions from today’s devices would be “significantly smaller” than from ones featuring generative AI, Yeoh said, “just because the number of factors or variables are so much smaller.” Every user command or query would require more computational resources if plugged into a generative AI model: the machine wouldn’t be reciting a response a human programmed; it would be generating an original one after sorting through all the data it has learned. Plus, smart homes with such advanced technology would need a strong security system to keep intruders from breaking in. That requires energy, too.

It’s hard to know whether the potential emissions reductions from smart home capabilities would outweigh the emissions that would come from adding generative AI to the mix. Different experts have different opinions, and none interviewed were comfortable speculating. Like Dhingra, all wondered whether generative AI in smart homes is necessary — but haven’t convenience and ease always been the point? Did we ever actually need to ask a machine for the weather when our phones can already tell us? We had manual dimmer switches before we had smart lights.

However, industry folks like Capecelatro want to see these generative AI models run as efficiently as possible so they can cut costs.

“I’m actually pretty confident we’re going to see a really good trend toward lower and lower emissions needed to generate these AI results,” he said. “Ultimately, everyone wants to be able to do this for less money.”

In October, Alex de Vries, founder of the digital trends research company Digiconomist, published a paper examining the potential energy demand of AI. He tried to forecast one scenario in particular: Google integrating generative AI into every search. De Vries wasn’t examining smart homes, but such functionality would work much like a generative AI integration on a Google Home.

The study’s worst-case scenario painted a future where Google AI would need as much energy in a year as the entire country of Ireland — but that’s not what he wants the public to take away from the research. “This is a topic that deserves some attention,” de Vries said. “There’s a very realistic pathway for AI to become a serious electricity consumer in the coming years.”

He’s especially critical of the widespread application of generative AI. “One thing you certainly want to avoid is forcing this type of technology on all kinds of applications where it’s not even making sense to make use of AI,” he said.

His paper sheds light on the potential emissions that come from running these huge models — not just from training them, which has historically drawn most of the attention around AI’s energy consumption. De Vries argues that operating these technologies may now be driving more emissions, thanks to the deployment of ChatGPT, which reached 100 million users within months of launching. And the emissions can climb even higher when you consider that the models need to be retrained every few years to stay up to date, he said. 

That’s why many computer engineers are working on efficiency. What de Vries worries about is that more companies will use generative AI as the technology grows more efficient, keeping energy demands high. “It’s become a guiding principle of environmental economics that increasing efficiency doesn’t necessarily translate to less use of resources — it’s often quite the opposite,” said de Vries, who is also a PhD candidate at the Vrije Universiteit Amsterdam School of Business and Economics. “I don’t think that there is going to be one single thing that is going to solve all our problems.”

Not everyone is as pessimistic. Peter Henderson, an incoming computer science and public affairs professor at Princeton University, is impressed with the efficiency gains AI has seen, especially with the ability of hardware to run programs more locally, which requires less energy. He imagines that if smart homes were to integrate generative AI, they’d default to whatever mechanism is most efficient. Indeed, that’s how JoshGPT is being built: its model splits queries based on whether a command can go through the local processor or requires a full GPT response. 
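That local-versus-cloud split can be pictured as a router: simple, preprogrammed commands are answered on the device, and only open-ended requests are escalated to a large cloud-hosted model, the expensive path. The Python sketch below is a hypothetical illustration of the idea (the intent table and call_cloud_llm placeholder are invented), not JoshGPT’s actual code.

```python
# Hypothetical sketch of local-first query routing: cheap, preprogrammed intents
# stay on-device; only open-ended requests go to a cloud-hosted generative model.

LOCAL_INTENTS = {
    "lights on": "Turning the lights on.",
    "lights off": "Turning the lights off.",
    "lower the blinds": "Lowering the blinds.",
}

def call_cloud_llm(prompt: str) -> str:
    # Placeholder for a network call to a generative model; not a real API.
    return f"[cloud model would answer: {prompt!r}]"

def route(command: str) -> str:
    key = command.lower().strip().rstrip("?.!")
    if key in LOCAL_INTENTS:
        # Local path: no network round trip, minimal compute and energy.
        return LOCAL_INTENTS[key]
    # Cloud path: reserved for requests the fixed intent table can't cover.
    return call_cloud_llm(command)

print(route("Lights on"))                       # handled locally
print(route("Plan a cozy movie night for us"))  # falls back to the cloud model
```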

“All in all, the power required for what we are doing is far less than what would be needed to do routine Google searches or streaming Netflix content on a mobile device,” said Capecelatro of Josh.ai.

So much of this, however, is speculative because there’s little transparency around where companies like OpenAI are sourcing their energy. Is coal powering their data centers, or hydropower? Buying energy from clean sources would alleviate many of the environmental concerns, but there’s only so much energy the sun or wind can generate. And there’s only so much we can allocate to computers when there are still people without access to electricity or the internet. 

Without more data, Henderson isn’t sure what to expect for the future of AI. The situation could be better than it seems — or it could be much worse. He’s hopeful about what AI could mean as a tool to combat climate change by optimizing energy grids or developing nuclear fusion, but there are too many questions about the generative AI we may see in our homes one day.

For Janapa Reddi, the questions run much deeper than environmental costs. “What does this all mean in terms of educating the next generation?” he asked. This thought process is why he teases his four-year-old that there’s a person inside their Alexa; he wants his daughter to treat the technology with empathy so that she develops manners she can practice with actual people. Now, his daughter is nicer to Alexa, using words like “please.” 

“These are very simple things — but important,” Janapa Reddi said. “They’re going to be using these devices day in, day out, left and right, up and down.”

Underlying all of these conversations and questions is an overall desire to build a better world. For some, “better” entails more convenience and comfort. For others, it’s less reliance on these flashy new technologies. What everyone can agree on, however, is the longing for a healthy world to exist at all. 
