Cloudflare is set to jump into the market for edge AI services, announcing a partnership with Nvidia. The goal: offer a global platform that lets developers deploy their artificial intelligence algorithms closer to data sources.
Cloudflare is a provider of security, content, and application delivery services. The company operates a network of over 200 PoPs. It has offered a serverless edge compute service, called Workers, since 2017. Using Workers, developers can run code on Cloudflare’s infrastructure.
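To make the Workers model concrete, here is a minimal sketch of the kind of code a developer deploys: a fetch handler that runs at the edge PoP nearest the requester. The route and response text are illustrative, not from Cloudflare documentation.

```javascript
// Minimal sketch of a Cloudflare Worker-style fetch handler.
// The `/hello` route and messages are hypothetical examples.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    // Custom logic runs at the edge, close to the requester.
    if (url.pathname === "/hello") {
      return new Response("Hello from the edge!", { status: 200 });
    }
    return new Response("Not found", { status: 404 });
  },
};
// In an actual Worker module, this object would be the default export:
// export default worker;
```

The same programming model is what a GPU-backed tier would extend: instead of lightweight request handling, the handler could invoke an AI model running on local accelerator hardware.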
Adding GPUs to the hardware mix in Cloudflare PoPs is the next step: it will enable developers to run AI algorithms—using tools and models they are already familiar with—at edge locations. CEO Matthew Prince told Bloomberg that GPU-powered Workers instances would be within a “few milliseconds” of users, be they people or other machines.
That latency measure means Prince is talking about deploying GPUs in lots of cities and, ideally, in multiple locations throughout a city. Cloudflare has been extending the reach of its content delivery services by extending its network to edge data center providers like EdgeConnex, EdgeMicro, and Vapor IO. It’s not a stretch to imagine that servers in these facilities will soon have GPUs in them as well.
Beating the competition—so far
Among its core competitors, Cloudflare has placed a stake in the ground ahead of the pack. Akamai, for example, has placed servers in EdgeConnex facilities for content delivery, and it recently announced EdgeWorkers, a service that likewise lets developers run custom code on edge servers—but it has not detailed plans for GPU instances at its edge PoPs.
Lumen is advancing an edge computing offering with the goal of delivering edge cloud services within 5ms of end-users. The company offers bare metal as a service, and its edge platform is enabled through a partnership with Section.io.
Fastly’s Compute@Edge service is similar to Cloudflare’s Workers, but the company doesn’t appear to have staked out edge AI territory yet.
Outside of the CDN market, cloud providers offer a range of edge AI options, including on-premises and transportable hardware. AWS and Microsoft Azure are the two primary competitors for edge AI services. Equinix, a provider of data center services, could be considered a competitor in some cases: Equinix Metal offers bare metal as a service (BMaaS) in multiple edge locations and lists a GPU option as being under development.
One challenge for the availability of an edge GPU cloud service: will there be enough chips? The global semiconductor shortage may last anywhere from six months to a year or more, depending on which CEO one listens to. At the very least, supply will be constrained for the foreseeable future. That could hamper a rollout, though Cloudflare hasn’t actually specified when such a service would be available.
Will that be in 2021? Maybe on a limited basis. Judging from comments made by Cloudflare executives in February, wide-scale service availability doesn’t sound like a 2021 project. During a call covering Q4 2020 earnings, executives said they expect network capital expenditures to be lower than the 12% of overall revenue the company spent in 2020. With 2021 revenues expected to be in the range of $589M to $593M, that comes to an estimated $59M to $65M for capital expenditures this year—roughly 10% to 11% of revenue. (Just for reference, Akamai expects to spend between $150M and $155M on capex in Q1 2021 alone.)
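As a quick sanity check on that estimate, the figures above work out as follows. The 10–11% ratio is inferred from the stated dollar range, not a number Cloudflare disclosed.

```javascript
// Back-of-the-envelope check of the capex estimate.
// Revenue guidance is from Cloudflare's Q4 2020 earnings call;
// the ~10-11% capex ratio is an inference, not a company figure.
const revenueLow = 589e6;  // low end of 2021 revenue guidance
const revenueHigh = 593e6; // high end of 2021 revenue guidance

const capexLow = 0.10 * revenueLow;   // assuming ~10% of revenue
const capexHigh = 0.11 * revenueHigh; // assuming ~11% of revenue

console.log(
  `$${Math.round(capexLow / 1e6)}M to $${Math.round(capexHigh / 1e6)}M`
); // roughly the $59M-$65M range cited above
```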
Regardless of when the service actually materializes, the main takeaway is that Cloudflare is continuing to create interesting new services for developers and has stolen some thunder from bigger competitors like Akamai. Cloudflare should take advantage of that developer interest by creating a limited beta service as soon as possible to start building a showcase of interesting edge AI apps.
Jim Davis is Principal Analyst and CEO of Edge Research Group.
Akamai | AWS | Azure | Cloudflare | edge AI | edge cloud | EDGE Data Centers | Equinix | Fastly | interconnection | Lumen Technologies | Section.io