Zededa and Submer target off-grid AI with modular, liquid-cooled edge GPU systems

Zededa and Submer have partnered to deliver modular, liquid-cooled edge AI infrastructure for high-density GPU inference in locations without traditional data centers. 

Linking Submer’s liquid-cooled AI infrastructure with Zededa’s edge intelligence platform, the partners will deliver scalable, secure and resilient edge AI deployments.

“AI is rapidly moving from centralized cloud environments into real-world operations, from industrial sites to telecom networks and remote energy infrastructure,” says Patrick Smets, CEO of Submer. “Delivering that intelligence requires purpose-built AI infrastructure that operates efficiently in environments where traditional data centers simply cannot exist. By combining Submer’s liquid-cooled high-density AI infrastructure with Zededa’s edge intelligence platform, we’re enabling organizations to deploy scalable, resilient AI infrastructure anywhere it is needed.”

Three modular solutions are offered: edge pods (2 to 8 GPUs), ruggedized micro-data centers (up to 168 GPUs), and megawatt-scale containerized systems (up to 800 GPUs).

Submer’s liquid cooling technology improves energy efficiency, reduces water consumption and enables high-density GPUs in harsh environments.

Zededa’s software-defined resilience reduces hardware redundancy costs by redistributing workloads when a node fails, maintaining uptime and lowering total cost of ownership.

The deployments address a wide range of AI workloads, such as real-time computer vision, predictive maintenance and industrial automation, enabling AI to run at the edge in remote or extreme environments.

The first pilot deployments are expected to begin with industrial and telecommunications customers later this year.

The partners will focus on delivering the emerging edge AI infrastructure necessary to bring workloads from centralized cloud environments to operational and industrial settings.

Late last year Zededa rolled out full-stack edge Kubernetes to tackle large-scale AI deployments.
