Nokia and Blaize sign edge AI inference MOU targeting APAC networks

Nokia and edge AI chip company Blaize have signed a strategic Memorandum of Understanding (MOU) to bring edge AI solutions to the APAC region.
The partnership brings together Nokia’s network, automation and cloud orchestration capabilities with Blaize’s energy-efficient AI inference platform.
The partners will deliver hybrid AI solutions tailored to real-world use cases that prioritize low latency, power efficiency and sustainable operation.
“Our collaboration with Nokia marks an important step forward in delivering Practical AI and Physical AI at scale,” says Dinakar Munagala, co-founder and chief executive officer of Blaize. “By combining Nokia’s leadership in connectivity and automation with the Blaize AI inference platform, we are enabling Real World AI that runs efficiently at the edge while integrating seamlessly with cloud and GPU infrastructure. This Hybrid AI approach allows organizations to deploy inference where it matters most and turn intelligence into real operational outcomes.”
Key focus areas of the partnership include hybrid AI use cases, reference architectures, co-validation of optimization tools and joint go-to-market efforts.
The objective is to support the development of large-scale, production-quality AI systems in communications networks and clouds.
Blaize says its AI platform is the only solution spanning edge, enterprise data centers and public cloud that scales flexibly across any combination of edge or private node configurations, while reducing TCO by 10x compared to GPU/FPGA equivalents.
The MOU is non-binding and establishes a framework for future collaboration to meet the growing demand for hybrid AI architectures.
Nokia recently partnered with Tampnet for 5G offshore connectivity in the Gulf of Mexico.