
StackPath upgrades virtual machine product with NVIDIA edge computing instances


Edge computing platform StackPath has added NVIDIA GPU-accelerated instances to its Virtual Machine (VM) and container product options.

The new instances use NVIDIA A2 Tensor Core and NVIDIA A16 GPUs to deliver the compute power required for workloads such as deep learning, graphics processing, and other highly parallel computations.

Currently, the instances are available in StackPath's Dallas, San Jose, and Frankfurt locations, and will be rolled out across the StackPath platform throughout 2024, the company says.

Speaking about the announcement, Tom Reyes, chief product officer at StackPath, says: “Our GPU-Accelerated Instances are exactly what new and next-generation workloads—like AI inference, computer vision, and natural language processing—really need to succeed.

“These are real-time applications. So, as much as they need high computational power, they also need exceptionally low latency. The physical location of our platform minimizes the number of hops in and out of our instances, so the advantages provided by a GPU aren’t undermined by geographic distance.”

StackPath edge compute instances are provisioned on demand through the StackPath Customer Portal or API, and are billed by the hour and by the volume of data transferred. Additional options include virtual private clouds, built-in L3-L4 DDoS protection, storage, image capture and deployment, private IP addresses, and more.
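For readers who want a concrete picture of on-demand provisioning over an API, the sketch below shows how a GPU-accelerated workload might be requested over HTTP. The gateway URL, endpoint path, instance-type name, and payload fields are illustrative assumptions made for this example, not StackPath's documented API schema; the Customer Portal or official API reference remains the authoritative source.

```python
# Hypothetical sketch of provisioning a GPU-accelerated VM workload via HTTP.
# The endpoint path, authentication scheme, and payload fields are assumptions
# for illustration only and do not reflect StackPath's documented API.
import requests

API_BASE = "https://gateway.stackpath.com"   # assumed gateway URL
STACK_ID = "my-stack-id"                     # placeholder identifier
ACCESS_TOKEN = "my-oauth2-token"             # placeholder credential

payload = {
    "workload": {
        "name": "gpu-inference-vm",
        "spec": {
            # Smallest VM configuration described in the article:
            # 1 NVIDIA A2/A16 GPU, 12 vCPUs, 48 GiB RAM, 25 GiB root disk.
            "instanceType": "gpu-a2-12-48",  # assumed instance-type name
            "location": "DFW",               # Dallas, one of the launch sites
        },
    }
}

response = requests.post(
    f"{API_BASE}/workload/v1/stacks/{STACK_ID}/workloads",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print("Provisioned workload:", response.json())
```

Because instances are billed hourly, a workload created this way would typically be deleted through the same API once the job completes.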

StackPath edge compute VM GPU-Accelerated Instances are available in three configurations: 1 NVIDIA A2/A16 GPU with 12 vCPUs, 48 GiB RAM, and a 25 GiB root disk; 2 GPUs with 24 vCPUs, 96 GiB RAM, and a 25 GiB root disk; and 4 GPUs with 48 vCPUs, 192 GiB RAM, and a 25 GiB root disk.

Additionally, the company’s edge compute container GPU-Accelerated Instances are available in several configurations: 1 NVIDIA A2/A16 GPU with 12 vCPUs, 48 GiB RAM, and a 40 GiB root disk; 2 GPUs with 24 vCPUs, 96 GiB RAM, and a 40 GiB root disk; and 4 GPUs with 48 vCPUs, 192 GiB RAM, and a 40 GiB root disk.
