Science and technology | Anything that can’t continue, won’t

The bigger-is-better approach to AI is running out of road

If AI is to keep getting better, it will have to do more with less

[Illustration: a spoon holding an atomic structure. Image: Mike Haddad]

When it comes to “large language models” (LLMs) such as GPT—which powers ChatGPT, a popular chatbot made by OpenAI, an American research lab—the clue is in the name. Modern AI systems are powered by vast artificial neural networks, bits of software modelled, very loosely, on biological brains. GPT-3, an LLM released in 2020, was a behemoth. It had 175bn “parameters”, as the simulated connections between those neurons are called. It was trained by having thousands of GPUs (specialised chips that excel at AI work) crunch through hundreds of billions of words of text over the course of several weeks. All that is thought to have cost at least $4.6m.
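A rough sense of where that $4.6m figure comes from can be had with a back-of-envelope calculation. The sketch below is not from the article: it uses the commonly cited rule of thumb of roughly six floating-point operations per parameter per training token, and assumes a run of about 300bn tokens, which is consistent with the "hundreds of billions of words" mentioned above.

```python
# Back-of-envelope sketch of GPT-3's training compute and cost.
# Assumptions (not from the article): ~6 FLOPs per parameter per token,
# and a training run of ~300bn tokens.

PARAMETERS = 175e9   # 175bn parameters (from the article)
TOKENS = 300e9       # ~300bn training tokens (assumption)
COST_USD = 4.6e6     # at least $4.6m (from the article)

flops = 6 * PARAMETERS * TOKENS                 # ~3.2e23 operations in total
pflop_days = flops / (1e15 * 86_400)            # convert to petaFLOP-days
cost_per_pflop_day = COST_USD / pflop_days

print(f"Estimated training compute: {flops:.2e} FLOPs (~{pflop_days:,.0f} petaFLOP-days)")
print(f"Implied cost: ${cost_per_pflop_day:,.0f} per petaFLOP-day")
```

On these assumptions the run works out to a few thousand petaFLOP-days, which is why the bill runs into millions of dollars even before the cost of the chips themselves is counted.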


This article appeared in the Science & technology section of the print edition of June 24th 2023 under the headline "Time for a diet"

