AMD Ryzen AI could push people from the cloud to local machines

Until now, when we've talked about large language models and generative AI, it has been in the context of the cloud: a phalanx of mysterious computers sitting in a distant data centre, blinking at each other. But the AMD Ryzen 7 7840U has AI skills built in, and that could change the way we use AI in the future.

So AMD believes, at least, with Senior Product Marketing Manager Wendy Wong dialling up expectations on a call with technology journalists today. “Just as the industrial revolution harnessed machines and the advent of microprocessors fuelled the information age, now, AI PCs stand at the threshold of a new frontier that holds the promise of reshaping how we interact with technology,” she said.

This tallies with much of the tech industry’s thinking (some would call it hype) around AI: that it could have a dramatic effect on the way our civilisation works. And certainly, if you’ve followed TechFinitive’s guide on how to get the best AI art from Midjourney then you will have experienced its power.

But you access Midjourney via a Discord server. Wouldn’t it be much better to have a large language model, such as Meta’s open-source Llama 2, on tap on your local computer?

AMD Ryzen AI versus cloud-based AI

That’s what AMD argues, at least. “We take for granted features like long-lasting battery, like connecting to Wi-Fi, but soon it will be unimaginable not to have AI built directly into your PC, where it can process data quickly and securely,” said Wong.

Wong went on to compare the benefits of Ryzen AI — that is, local AI — against AI running in the cloud. The three main advantages she cited were:

  • Reduce latency when running AI applications
  • Eliminate security risks and leaks of valuable data
  • Reduce costs by cutting out subscriptions

All three points have merit, but it’s saving on monthly subscription fees that might just sway people towards local AI. “Soon there will be a move to open-source software running on your own dedicated hardware that doesn’t require cloud service,” said Wong.

“Just giving the flexibility of options such as shared cost between cloud and local AI applications adds that control element to AI for the end user.”

As hinted at in that statement, software support is key. Something that AMD can only encourage, not control. Right now, you can use Ryzen AI to blur backgrounds in video calls and help with facial recognition, but those are hardly killer apps.

That’s why AMD has launched a software platform for developers to take advantage of Ryzen AI, and it also promises there will be plenty of hardware to develop on.

For example, AMD has partnered with Acer to release a version of its Swift Edge 16 with a Ryzen 7 7840U inside. Wong said that “over 50 systems available are coming soon to consumers in-market with Ryzen AI built-in”.

We can also confirm that Ryzen AI will run Meta’s Llama 2, an open-source large language model.

Updated 28 September to confirm that Ryzen AI runs Llama 2.

Tim Danton

Tim has worked in IT publishing since the days when all PCs were beige, and is editor-in-chief of the UK's PC Pro magazine. He has been writing about hardware for TechFinitive since 2023.