Three things AI PCs can do already
All the tech CEOs are talking about AI PCs: Pat Gelsinger at Intel’s recent “AI everywhere” event, Dr Lisa Su at AMD’s Advancing AI launch in December, and Satya Nadella at Microsoft Ignite 2023.
But what can AI PCs actually do? How will they change your world?
We’ll answer this through three videos, which we shot at the AMD event mentioned above. This is where AMD announced both its Instinct MI300 AI accelerators, designed to rival Nvidia’s highly lucrative H100 units in data centres, and its Ryzen 8040 chips with Ryzen AI built in.
AI PCs can help with processing video
This short video shows how Ryzen AI laptops can take over deblurring in Topaz. You’ll see that the graphics chip remains free to play a game whilst the NPU (neural processing unit) handles the deblurring.
While this demo runs on an AMD Ryzen laptop, you would see a similar effect with any forthcoming AI PCs running on Intel’s Core Ultra chips. Likewise on Apple Macs with Apple’s M-series chips inside, as these also feature NPUs.
It’s a great demo as it plays to one of the strengths of neural processors: handling large amounts of data in parallel. Expect to see video editors – and any app that handles video or high-res photos – take advantage of neural chips in the future.
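To give a sense of how an application hands this sort of work to the NPU, here is a minimal sketch using ONNX Runtime, which AMD’s Ryzen AI software builds on. The model file and frame dimensions are hypothetical placeholders, and the Vitis AI execution provider is assumed to be installed; this illustrates the pattern, not Topaz’s actual code.

```python
# Minimal sketch: routing inference to a Ryzen AI NPU via ONNX Runtime.
# The model file and frame dimensions are hypothetical placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "deblur.onnx",  # hypothetical deblurring model exported to ONNX
    providers=[
        "VitisAIExecutionProvider",  # run on the NPU...
        "CPUExecutionProvider",      # ...falling back to the CPU if unavailable
    ],
)

frame = np.random.rand(1, 3, 720, 1280).astype(np.float32)  # stand-in video frame
input_name = session.get_inputs()[0].name
deblurred = session.run(None, {input_name: frame})[0]
print(deblurred.shape)
```

The key point is that the GPU never appears in this path: the heavy lifting goes to the NPU, leaving the graphics chip free for the game.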
Stable Diffusion AI running on a gaming PC
If you thought that Midjourney was impressive (and we certainly do), then just wait until you see Stable Diffusion running on a high-end gaming PC. This demo runs on a single consumer graphics card, AMD’s Radeon RX 7900 XTX, and is simply spectacular.
Watch the video and you can see the real-time effect of changing parameters in a console. This means you can iterate images to get exactly what you want, without the thumb-gnawing wait you get on a remote server. And Stable Diffusion is free for both non-commercial and commercial use.
For more details, head to stability.ai.
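If you want to try local image generation yourself, the idea looks something like the sketch below. It uses Hugging Face’s diffusers library rather than the exact software AMD demoed; the model ID, step count and guidance value are our assumptions, and on a Radeon card the ROCm build of PyTorch stands in for CUDA.

```python
# Minimal sketch of running Stable Diffusion locally with the diffusers library.
# Not AMD's demo software: model ID, steps and guidance values are assumptions.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")  # on an AMD card, the ROCm build of PyTorch exposes this device

# Because everything runs locally, you can tweak parameters and regenerate
# immediately instead of waiting on a remote server.
image = pipe(
    "a lighthouse at sunset, oil painting",
    num_inference_steps=30,  # fewer steps for faster previews
    guidance_scale=7.5,      # how closely the image follows the prompt
).images[0]
image.save("lighthouse.png")
```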
Running text-based generative AI on a laptop
Unlike the Stable Diffusion example, it isn’t obvious what’s happening here. But it is very clever. SHARK, the open-source machine-learning platform created by Nod.ai (bought by AMD in October 2023), adjusts the large language model it runs to suit its host.
So it can run with tens of billions of parameters on a remote server with AMD Instinct accelerators inside. This is the first portion of the video above. But with one text change, it can be shrunk down to a few billion parameters so that it runs efficiently on a laptop. That’s the second part of the demo.
You’ll notice that the open-source technology also runs on lesser processors, such as that found in the Asus ROG Ally. You can try it yourself by downloading the SHARK executable from GitHub.
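The “one text change” in the demo essentially amounts to pointing the runtime at a smaller checkpoint. As a rough illustration, the sketch below uses the Hugging Face transformers library as a stand-in for SHARK’s own tooling, with example model names rather than whatever AMD’s demo actually loaded.

```python
# Illustrative only: swapping a data-centre-scale model for a laptop-scale one
# by changing a single identifier string. Uses transformers as a stand-in for
# SHARK's tooling; the model names are examples, not what AMD's demo used.
from transformers import AutoModelForCausalLM, AutoTokenizer

# model_id = "meta-llama/Llama-2-70b-chat-hf"  # tens of billions of parameters, server-class
model_id = "meta-llama/Llama-2-7b-chat-hf"     # a few billion parameters, laptop-friendly

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what an NPU does in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```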