OpenAI’s building GPT-5 – but it hasn’t got a clue what it will do
OpenAI is hard at work developing GPT-5 – the next generation of its AI large language model – but even the company’s CEO admits he can’t say what it will do.
OpenAI’s GPT-4 is one of the most prominent AI language models in the world, powering AI services including Microsoft’s Copilot offerings, the Bing AI chatbot and, of course, the company’s own ChatGPT.
GPT-4 was released in March 2023 and development work is well underway on GPT-5, but the company's CEO, Sam Altman, says it's too early to say how it will differ from what's gone before.
“Until we go train that model, it’s like a fun guessing game for us,” Altman said in an interview with the FT. “We’re trying to get better at it, because I think it’s important from a safety perspective to predict the capabilities. But I can’t tell you here’s exactly what it’s going to do that GPT-4 didn’t.”
Paying for GPT-5 and AGI
Developing large language models is a blisteringly expensive business, made even more costly by recent shortages of the Nvidia H100 chips that most of the big AI companies use to build their AI systems.
Also in short supply is the high-quality data needed to train the next generation of AI, and OpenAI recently put out a call for large-scale data sets, particularly those focusing on long-form writing.
The cost of acquiring the supercomputing power and the data necessary to improve AI models means OpenAI will once again be turning to partners such as Microsoft for additional investment, Altman claims. Microsoft has already ploughed $10 billion into OpenAI as part of a multi-year partnership, but the OpenAI CEO said the company will need more money to reach so-called AGI – artificial general intelligence – where AI achieves human-like levels of sophistication.
When asked by the FT whether Microsoft would continue to invest in his company, Altman replied “I hope so,” before adding: “There’s a long way to go, and a lot of compute to build out between here and AGI… training expenses are just huge.”
The big problem for OpenAI is that it's far from the only player in the game. Google, Amazon, Elon Musk's x.AI and others are all investing heavily, which not only pushes up prices for those in-demand AI processors but also increases demand for training data and intensifies competition for the customers who will pay for the resulting services.