Apple AI iPhones move one step closer with on-device AI experiments

If the future of tech can be judged by what Apple chooses to do, then small language models could well be the next big thing for artificial intelligence. And that could mean Apple AI iPhones later this year.

Apple has largely stayed out of the most recent round of the AI arms race. In public, at least. Look closer and you’ll see that it has acquired a host of AI companies, and it’s reportedly in talks with the major AI players.

Apple is clearly working on its own LLMs as well, but for now, it’s thinking small.

The company recently released a set of small language models on the Hugging Face platform, called “Open-Source Efficient Language Models” (OpenELM), that can run locally on devices. The move follows Microsoft’s unveiling of its similarly compact “Phi-3” models last week.

Small but smart AI models

Microsoft’s Phi-3-mini model features just 3.8 billion parameters, while Apple’s OpenELM models range from 270 million parameters up to 3 billion.

While that’s still plenty of parameters for a well-designed language model to be useful, these models are far smaller than OpenAI’s offerings. GPT-3.5 reportedly had 175 billion parameters, while GPT-4 has been put at around the 1.8 trillion mark.

With LLMs, the rule of thumb is that more parameters mean better results. Essentially, bigger datasets plugged into a more complex model equal more accuracy. However, careful pretraining and more efficient design could produce smaller language models that are accurate enough without demanding as much compute power or energy.

Crucially, this enables them to run on-device rather than in the cloud. On a device such as an Apple iPhone, in fact.
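A rough back-of-the-envelope calculation shows why parameter count matters here. The sketch below is illustrative only: it assumes 16-bit weights (two bytes per parameter) and ignores activations, runtime overhead and the quantisation tricks that shrink models further.

# Ballpark memory needed just to hold model weights at 16-bit precision.
# Illustrative only: assumes 2 bytes per parameter, ignores runtime overhead.
BYTES_PER_PARAM = 2  # fp16/bf16

models = {
    "OpenELM-270M": 270e6,
    "OpenELM-3B": 3e9,
    "Phi-3-mini (3.8B)": 3.8e9,
    "GPT-3.5 (reported 175B)": 175e9,
}

for name, params in models.items():
    gigabytes = params * BYTES_PER_PARAM / 1e9
    print(f"{name}: ~{gigabytes:.1f} GB of weights")

# A 3B-parameter model comes in at roughly 6 GB -- feasible on a modern phone,
# especially once quantised -- whereas a 175B model needs around 350 GB.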

Microsoft says that subsequent Phi-3 models will have 7 billion and 14 billion parameters. “What we’re going to start to see is not a shift from large to small, but a shift from a singular category of models to a portfolio of models where customers get the ability to make a decision on what is the best model for their scenario,” said Sonali Yadav, principal product manager for Generative AI at Microsoft, in a statement at the time.




OpenELM hints at Apple AI iPhones

The Apple developers describe OpenELM as a family of models, beginning with the eight introduced so far.

Each was pre-trained on publicly available datasets before being fine-tuned to differing levels depending on potential use cases. “OpenELM uses a layer-wise scaling strategy to efficiently allocate parameters within each layer of the transformer model, leading to enhanced accuracy,” says the post on Hugging Face. “We release both pre-trained and instruction-tuned models with 270M, 450M, 1.1B and 3B parameters.”

The models were released in an open format for others to access, along with a wider range of supporting information such as training logs and configurations. However, the researchers warned that the models may not be perfectly “safe” and should be tested before use.
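For developers who want to experiment, the checkpoints can be pulled down with the Hugging Face transformers library. The following is a minimal sketch rather than an official recipe: it assumes the apple/OpenELM-270M-Instruct model ID and the repository’s reliance on the Llama 2 tokenizer, as described on the model cards at release, so check those cards for the current loading instructions (access to the Llama 2 tokenizer also requires accepting Meta’s licence).

# Minimal sketch of running an OpenELM checkpoint locally via Hugging Face.
# Assumptions: the "apple/OpenELM-270M-Instruct" model ID and the Llama 2
# tokenizer, per the model cards at release; verify before relying on this.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"

# OpenELM ships custom modelling code, so trust_remote_code must be enabled.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# The OpenELM repos reuse the Llama 2 tokenizer rather than bundling their own.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

prompt = "Small language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation entirely on the local machine -- no cloud API.
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))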

“The release of OpenELM models aims to empower and enrich the open research community by providing access to state-of-the-art language models,” the developers added.

The key word there is “research”. The models released on Hugging Face shouldn’t be expected to show up on iPhones anytime soon.

However, the OpenELM work suggests a clear direction of travel for AI at Apple. We certainly wouldn’t bet against the next Tim Cook announcement talking of Apple AI iPhones.

OpenAI discussions

In the meantime, reports suggest Apple may be in talks with ChatGPT maker OpenAI. Much as Microsoft did with Windows Copilot, Apple could embed OpenAI’s technology into a search-focused chatbot.

Other reports suggest Apple is holding discussions with Google to use Gemini for the same purpose.

It would be more surprising if Apple wasn’t in talks with either company, of course, but the reports suggest that Apple will buy in LLMs for AI-powered tools in the next versions of iOS and macOS, while also building its own models for on-device AI.

In short, Apple could be picking and choosing a bit of everything AI has to offer for iOS 18, widely expected to arrive in September.




Nicole Kobie

Nicole is a journalist and author who specialises in the future of technology and transport. Her first book is called Green Energy, and she's working on her second, a history of technology. At TechFinitive she frequently writes about innovation and how technology can foster better collaboration.
