
Cerence AI and Arm push LLM boundaries with on-device AI for smarter cars


Cerence AI has partnered with semiconductor design company Arm to enhance its embedded small language model (SLM), CaLLM Edge, using Arm’s Kleidi software library.

The collaboration aims to optimize CPU and GPU performance for real-time language processing at the edge, improving speed, efficiency, and privacy, and highlighting the growing importance of edge computing and generative AI in the automotive industry.

Arm’s Kleidi technology accelerates machine learning and neural network operations on Arm-based devices, addressing the challenges of limited compute power in vehicles. CaLLM Edge operates fully on Arm-based chipsets, enabling advanced in-car AI capabilities without relying on cloud connectivity. 
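To make the on-device approach concrete, the sketch below shows a small language model answering an in-car style request entirely from local storage, using the open-source llama-cpp-python runtime. This is an illustrative assumption, not Cerence’s implementation: CaLLM Edge is proprietary, the model file name is a placeholder, and KleidiAI acceleration depends on the runtime being built with Arm’s kernels enabled.

```python
# Minimal sketch: fully on-device SLM inference with llama-cpp-python.
# Assumptions: a quantized GGUF model is available locally; "slm-edge-q4.gguf"
# is a placeholder, not CaLLM Edge. On Arm-based hardware, llama.cpp builds
# can use Arm's KleidiAI kernels to accelerate the underlying matrix math.
from llama_cpp import Llama

# Load the model from local storage -- no cloud connectivity required.
llm = Llama(model_path="slm-edge-q4.gguf", n_ctx=2048)

# Handle a simple in-car request locally.
response = llm(
    "Find the nearest charging station and summarize the route.",
    max_tokens=128,
)
print(response["choices"][0]["text"])
```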

“We are excited to partner with Arm to take CaLLM Edge to the next level, setting new standards for performance and efficiency in edge computing in the car,” says Nils Schanz, EVP, Product & Technology, Cerence AI. “By combining our expertise in AI-powered language models with Arm’s innovative library, we are continuing our journey to create a new era of voice-first experiences and next-generation AI applications in the automotive space, empowering consumers with smarter, faster, and more responsive in-car assistants.”

This partnership supports automakers in delivering smarter, faster, and more responsive AI-powered user experiences for drivers, setting new standards for in-car AI applications while enhancing safety and connectivity.
