Meta and Arm reportedly teaming up to let AI perform more tasks on phones

Meta Connect 2024, the company’s developer conference, took place on Wednesday. During the event, the social media giant unveiled several new features for artificial intelligence (AI) and wearables. Meta also reportedly announced a partnership with chip designer Arm to build special-purpose small language models (SLMs). These AI models would power smartphones and other devices and introduce new ways to use them, with on-device and edge computing options keeping AI inference fast.

According to a CNET report, Meta and Arm plan to build AI models that can perform more advanced tasks on devices. For example, the AI could act as the device’s virtual assistant, making a phone call or taking a photo. This isn’t far-fetched, as AI tools can already perform a wide range of tasks, such as editing images and composing emails.

The main difference, however, is that users currently have to interact with an interface or type specific commands for the AI to perform these tasks. During the event, the two companies reportedly emphasized that they want to do away with this friction and make AI models more intuitive and responsive.

One way to do this is to run the AI models on the device itself or to keep the servers very close to the devices. The latter approach, known as edge computing, is already used by research institutions and large enterprises. Ragavan Srinivasan, Meta’s vice president of product management for generative AI, told the publication that developing these new AI models is a good way to take advantage of this opportunity.

This requires the AI models to be smaller. Meta has developed large language models (LLMs) with as many as 90 billion parameters, but models of that size are unsuitable for smaller devices or fast processing. The smaller Llama 3.2 1B and 3B models are considered ideal for this purpose.
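A rough back-of-the-envelope calculation shows why parameter count matters here: a model’s raw weight footprint is roughly parameters × bytes per parameter, so a 90-billion-parameter model at 16-bit precision needs around 180 GB of memory, while a 1-billion-parameter model fits in about 2 GB. A minimal sketch (the helper name and the precision choices are illustrative, not from the report):

```python
def weight_footprint_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate raw weight memory in GB: parameters times bytes per parameter."""
    return num_params * bits_per_param / 8 / 1e9

# Meta's largest Llama model mentioned in the article: 90 billion parameters.
print(weight_footprint_gb(90e9, 16))  # fp16: 180.0 GB -- far beyond phone RAM

# The Llama 3.2 models the article calls ideal for devices:
print(weight_footprint_gb(1e9, 16))   # 1B at fp16: 2.0 GB
print(weight_footprint_gb(3e9, 4))    # 3B quantized to 4-bit: 1.5 GB
```

Note that real runtime memory is higher than this, since activations and the inference cache also consume RAM; the figures above cover weight storage only.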

Another issue is that the AI models need capabilities beyond text generation and computer vision. This is where Arm comes in. According to the report, Meta is working closely with the chip designer to develop processor-optimized AI models that can adapt to the workflows of devices such as smartphones, tablets and even laptops. No further details about the SLMs have been shared at this time.
