Qualcomm, Meta join forces to bring generative AI to edge devices

Qualcomm Technologies and Meta are working to run Llama 2 large language models (LLMs) directly on devices, reducing dependence on cloud services. The partnership aims to bring on-device LLM implementations to AI applications running on edge devices, such as intelligent virtual assistants, productivity tools and entertainment apps.

Qualcomm’s Snapdragon platform supports the implementation, and its on-device processing power enables efficient AI operations even in areas without internet connectivity. Running LLMs on the edge device also lets customers save on cloud expenses, since AI tasks no longer have to be sent to external cloud services.

“To effectively scale generative AI into the mainstream, AI will need to run on both the cloud and devices at the edge, such as smartphones, laptops, vehicles and IoT devices,” says Durga Malladi, senior vice president and general manager of technology, planning and edge solutions businesses at Qualcomm Technologies, Inc.

In addition to cost savings, on-device processing of generative AI models offers enhanced privacy and service reliability: because processing occurs locally, user data does not need to be sent to remote servers for analysis. Running LLMs on edge devices also improves the developer experience, enabling more customised services that leverage AI models with local data and resources.
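
For illustration, the sketch below shows what fully local inference can look like in practice. It uses the open-source llama-cpp-python bindings and a hypothetical quantized Llama 2 chat checkpoint in GGUF format; this is not Qualcomm’s Snapdragon tooling, which was not detailed in the announcement. The point is simply that the prompt and the generated response never leave the device.

# Illustrative sketch of fully local LLM inference (no cloud round trip).
# Assumes the open-source llama-cpp-python package and a quantized Llama 2
# chat model in GGUF format; the file name below is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-7b-chat.Q4_K_M.gguf",  # hypothetical local model file
    n_ctx=2048,    # context window size
    n_threads=4,   # run inference on the device's own CPU cores
)

# Prompt and response both stay on the device; no user data is uploaded.
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Draft a short reply to my last email."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])

On Snapdragon hardware, a shipping app would typically target the platform’s dedicated AI accelerators rather than the CPU, but the data-flow principle is the same: nothing is sent to a remote server.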

Qualcomm also announced plans to make Llama 2-based AI implementations available on Snapdragon-powered devices starting in 2024. The partnership plays to the two companies’ complementary strengths: Qualcomm’s proficiency in high-performance embedded hardware and Meta’s development of the Llama models.

Speaking to Wall Street analysts on the company’s fiscal third-quarter earnings call, executives said Qualcomm Technologies has an advantage in facilitating the widespread adoption of AI applications on edge devices. The company attributes this position to significant developments in AI that enable core on-device functionalities like perception, reasoning, action and, most recently, content creation, allowing it to ship millions of AI-enabled platforms annually.

However, revenue is still largely tied to sales of chips for smartphones and cell tower equipment. On that count, fiscal 2023 third-quarter results fell short: revenue declined 23% year-over-year to $8.44 billion, against analyst expectations of $8.5 billion. Handset semiconductor sales dropped 25% to $5.26 billion, and revenue from the IoT business segment fell 24% to $1.49 billion.
