Qualcomm, Prophesee team up to increase camera performance in mobile devices

Qualcomm Technologies has announced a multi-year partnership with Prophesee to integrate its neuromorphic vision system into the next generation of the Snapdragon mobile platform. The resulting products promise to enable new capabilities at the device edge.

Through the collaboration, Qualcomm will use Prophesee’s Metavision sensors and software to enhance image quality in low-light settings and fast-moving scenes, with the aim of giving mobile users a more faithful image-capture experience.

Qualcomm Technologies believes that Prophesee’s pixel sensing technology, which captures events continuously and asynchronously, will not only boost speed and efficiency but also meet the demands of contemporary camera performance. Prophesee’s sensing technology is designed to continuously and rapidly detect changes in a scene, pixel by pixel.

“Prophesee’s pioneering achievements with event cameras’ shutter-free capability offer a significant enhancement to the quality of photography available in the next generation of mobile devices powered by Snapdragon, even in the most demanding environments, unlocking a range of new possibilities for Snapdragon customers,” says Judd Heape, vice president of Product Management at Qualcomm Technologies.

Prophesee’s Metavision sensor has a logic core within each pixel that activates based on the number of photons it detects. These activations are referred to as “events,” and they are triggered by changes in the scene rather than by a clock, as is the case with traditional cameras.
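The change-driven behavior described above can be illustrated with a toy model: comparing two grayscale frames and emitting an event wherever a pixel’s log intensity shifts past a contrast threshold. This is a simplified sketch of how an event sensor behaves, not the Metavision API; the function name, threshold value, and event tuple layout are all assumptions for illustration.

```python
import numpy as np

def events_from_frames(prev, curr, t, threshold=0.2):
    """Emit (x, y, t, polarity) events at pixels whose log intensity
    changed by more than `threshold` between two frames.
    Illustrative model of an event sensor, not the Metavision API."""
    eps = 1e-6  # avoid log(0)
    delta = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    polarity = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return [(int(x), int(y), t, int(p)) for x, y, p in zip(xs, ys, polarity)]

# Two toy 4x4 grayscale frames: one pixel brightens, one dims.
prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 0.9   # brightening -> +1 event at (x=2, y=1)
curr[3, 0] = 0.1   # dimming     -> -1 event at (x=0, y=3)
events = events_from_frames(prev, curr, t=1_000)  # timestamp in microseconds
print(events)  # [(2, 1, 1000, 1), (0, 3, 1000, -1)]
```

In a real sensor each pixel makes this comparison independently and asynchronously, so static regions produce no data at all, which is where the speed and efficiency gains come from.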

The companies say event-based deblurring can be achieved by synchronizing a conventional frame-based sensor with Prophesee’s event-based sensor. By merging the two streams, the system can fill in the gaps between and within frames with microsecond-resolution events, recovering motion information and eliminating motion blur.
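One simple way to picture this fusion is to treat events as polarity-weighted updates to the last captured frame’s log intensity, letting the system estimate what the scene looked like at any microsecond between frames. The sketch below assumes a fixed per-event contrast step `c`; the function name and event format are hypothetical, not Prophesee’s actual pipeline.

```python
import numpy as np

def reconstruct_at(frame, events, t_frame, t_query, c=0.2):
    """Approximate the image at time t_query by adding polarity-weighted
    events observed in (t_frame, t_query] to the log intensity of the
    frame captured at t_frame. Illustrative sketch only; `c` is an
    assumed per-event contrast step."""
    log_img = np.log(frame + 1e-6)
    for x, y, t, p in events:
        if t_frame < t <= t_query:
            log_img[y, x] += c * p
    return np.exp(log_img)

# A uniform 3x3 frame at t=0, one brightening event at t=500 us.
frame = np.full((3, 3), 0.5)
events = [(1, 1, 500, +1)]
out = reconstruct_at(frame, events, t_frame=0, t_query=1_000)
```

Because the events carry microsecond timestamps, the same mechanism can be run at many intermediate times to recover the motion that a single long exposure would smear into blur.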

Event-based vision systems, also called neuromorphic systems, mimic how the human eye and brain process visual information. Instead of capturing a sequence of static frames like traditional cameras, these systems capture and process individual pixel-level changes in real-time. Integrating event-based vision systems has the potential to transform the smartphone ecosystem by enabling fast focusing and tracking, improving low-light performance and reducing motion blur in video recordings.

“Through this collaboration, product developers will be able to dramatically enhance the user experience with cameras that deliver the image quality and operational excellence not available using just traditional frame-based methods,” says Luca Verre, CEO and co-founder of Prophesee.

In 2022, Prophesee partnered with BrainChip to deliver a neuromorphic event-based vision system with integrated low-power tech and high AI performance. The collaboration sought to help businesses explore edge artificial intelligence use cases with neuromorphic vision systems.
