
How to build a framework for edge computing: balance the data, networking, security formula

By Ernest Sampera, a co-founder at vXchnge.

Cloud computing entered the zeitgeist in the late 1990s and has gained popularity ever since. Although not yet as mainstream, edge computing is now having a significant impact on companies as the number of Internet of Things (IoT) devices grows and consumer expectations of network speed and reliability continue to rise.

Edge computing, as the name implies, processes data at the edge of a network, close to where consumers are located, rather than in a centralized cloud server. Moving data processing closer to the end-user reduces the distance the data must travel, which means faster processing and greater functionality for IoT devices and other on-demand services. While building an edge computing framework may seem overwhelming, it is becoming a necessity for serving today’s digital consumers and will be imperative for supporting future IoT applications. When considering how to build an effective edge computing framework, start by addressing these considerations:

Data Requirements

To build an effective framework, it is important to first understand what requirements your company has for edge computing. These needs vary by type of business. Take streaming providers as an example: for them and their customers, latency is a top priority, as consumers expect to stream large amounts of data without interruption. As more content enters the marketplace and streaming services proliferate, the challenge of processing large amounts of data quickly only grows.
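To make the latency point concrete, here is a rough back-of-the-envelope sketch, not taken from the article, comparing best-case round-trip propagation delay to a distant cloud region versus a nearby edge facility. The distances are hypothetical, and real latency also includes routing hops, queuing, and processing time.

```python
# Rough, illustrative estimate of best-case round-trip propagation delay.
# Distances are hypothetical; real latency also includes routing, queuing,
# and processing time at each end.

FIBER_KM_PER_MS = 200  # light travels roughly 200 km per millisecond in fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

for label, km in [("distant cloud region (2,000 km)", 2000),
                  ("nearby edge facility (50 km)", 50)]:
    print(f"{label}: ~{round_trip_ms(km):.1f} ms round trip")
```

Even this simplified math, roughly 20 ms versus 0.5 ms before any processing happens, shows why shaving distance off the path matters for latency-sensitive services such as streaming.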

A streaming provider’s requirements contrast with those of an autonomous vehicle company, which must manage massive quantities of data. To function safely, autonomous vehicles must quickly collect and analyze data on multiple external factors, such as their surroundings, the weather, and other vehicles on the road, and feed that information to manufacturers for monitoring purposes. Despite the quantity of data involved, this process must happen seamlessly and quickly to maintain safety.

Ultimately, an edge computing framework should match the needs of the organization. Identifying these specific data needs in advance is essential to understanding how to build an effective framework.

Connectivity Considerations

In addition to data requirements, connectivity options will play an essential role in your considerations. With data collection and processing moved to the edge of the network, the speed and reliability of the “last mile” connections may impact the placement of an edge facility.

In many edge markets, the advanced speeds provided by the emerging 5G network are simply not available and existing options may raise latency concerns. In building an effective framework, it is important to build in the flexibility to accommodate multiple forms of connectivity and ensure that the available connectivity options can serve the needs of existing and emerging technologies.
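One way to picture that flexibility is a simple selection routine that chooses the best connection currently available at an edge site, so LTE or fixed fiber can stand in where 5G is not yet offered. This is a hypothetical sketch; the link names, latency figures, and latency budget are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Link:
    name: str
    available: bool
    measured_latency_ms: float

def pick_link(links: list[Link], latency_budget_ms: float) -> Optional[Link]:
    """Return the lowest-latency available link that fits the latency budget."""
    candidates = [l for l in links
                  if l.available and l.measured_latency_ms <= latency_budget_ms]
    return min(candidates, key=lambda l: l.measured_latency_ms) if candidates else None

# Hypothetical options at an edge site where 5G is not yet offered.
links = [
    Link("5G", available=False, measured_latency_ms=10),
    Link("fixed fiber", available=True, measured_latency_ms=8),
    Link("LTE", available=True, measured_latency_ms=45),
]
best = pick_link(links, latency_budget_ms=30)
print(best.name if best else "no link meets the latency budget")
```

The point is not the code itself but the design choice it encodes: the framework should not assume any single connectivity option will be present in every edge market.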

Security Requirements

Edge computing provides several security advantages over traditional cloud computing architectures. Because traditional cloud computing is centralized, it is more prone to distributed denial-of-service (DDoS) attacks and other security disruptions. In contrast, the distributed nature of edge computing makes it harder for a single disruption to bring down a network, since processing, storage, and applications are spread across multiple locations. Edge computing presents its own security challenges, however. With high connectivity and more locations on the network, there are more points of attack that must be protected. As a result, it is essential to consider the protocols that will protect your company’s data and systems from unauthorized access.

In addition to traditional encryption protocols and monitoring, a low- or zero-trust architecture should be used to address edge security concerns, given the vast number of devices accessing the network at any given time. By assuming that every device may be compromised, this type of architecture limits access to the network even when an individual falls within a predetermined micro-segment of approved users. Companies can set up segmentation based on users, applications, and data types, then scale it and roll it out over time. A zero-trust architecture lets companies guard edge networks against a variety of threats and can grow with the network as it expands. Since security breaches are costly, damage brand identity, and pose other significant risks to consumers, edge security policies must play a central role when implementing an edge computing framework.
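A minimal sketch of that idea follows, assuming hypothetical segment names and a device_verified flag: a deny-by-default access check keyed to micro-segments of user role, application, and data type, evaluated on every request. It is an illustration of the principle, not a prescribed implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Request:
    user_role: str
    application: str
    data_type: str
    device_verified: bool  # e.g., device posture or certificate check passed

# Explicit allow-list of (user role, application, data type) micro-segments.
ALLOWED_SEGMENTS = {
    ("operator", "telemetry-dashboard", "sensor-data"),
    ("analyst", "reporting", "aggregated-metrics"),
}

def authorize(req: Request) -> bool:
    """Deny by default; allow only verified devices inside an approved segment."""
    if not req.device_verified:  # assume any device may be compromised
        return False
    return (req.user_role, req.application, req.data_type) in ALLOWED_SEGMENTS

print(authorize(Request("operator", "telemetry-dashboard", "sensor-data", True)))  # True
print(authorize(Request("operator", "billing", "customer-records", True)))         # False
```

In practice a policy like this would typically be enforced by an identity-aware proxy or a dedicated policy engine rather than application code, but the deny-by-default shape is the same and can be extended segment by segment as the edge network grows.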

Edge Data Center Options

Identifying data centers that can meet your edge computing needs is another essential step in building a powerful edge computing framework. Edge data centers take several forms, from local data centers to portable micro data centers, but what they have in common is that they sit closer to the end-user. These facilities are an important piece of the network architecture and can provide many services, from additional processing capacity near the edge of the network to reduced latency in content delivery.

Typically, these centers connect to larger data centers that provide centralized data processing, or to multiple other edge data centers that store and cache data for quick content delivery. Since edge data centers are often smaller and located closer to end-users, they focus on meeting the needs of consumers in a local market. To serve that purpose, they must deliver fast performance with high-capacity routers, quick response times, and near-perfect uptime.
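As a simple illustration of that caching role, here is a minimal sketch of an edge node serving content from a local cache and falling back to the central data center on a miss. The fetch_from_origin function is a placeholder, and the time-to-live value is an assumption.

```python
import time

CACHE: dict[str, tuple[float, bytes]] = {}  # key -> (time stored, content)
TTL_SECONDS = 60.0

def fetch_from_origin(key: str) -> bytes:
    """Placeholder for a request back to the centralized data center."""
    return f"content for {key}".encode()

def get_content(key: str) -> bytes:
    """Serve from the local edge cache when fresh; otherwise go to the origin."""
    entry = CACHE.get(key)
    if entry and time.monotonic() - entry[0] < TTL_SECONDS:
        return entry[1]                        # cache hit: short, local round trip
    data = fetch_from_origin(key)              # cache miss: longer trip to the core
    CACHE[key] = (time.monotonic(), data)
    return data

print(get_content("/video/segment-001").decode())  # miss, fetched from origin
print(get_content("/video/segment-001").decode())  # hit, served at the edge
```

Real deployments would use a CDN or caching proxy rather than hand-rolled code, but the pattern is the point: answer locally whenever possible and reach back to the core only when necessary.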

Colocation Benefits

Scalability plays an important role in edge computing. Growing companies are often unable to build private data centers when network expansion is needed, due to cost, maintenance requirements, and timing; colocated data centers in edge markets can instead provide the speed, reliability, and support needed to build an effective edge network. Colocation is also typically a cost-effective solution that gives companies flexibility as they build out and maintain an edge network. When identifying centers for an edge computing framework, considering the use and availability of high-quality colocation services can ease the burden of building the network.

Edge computing is helping businesses move past the limitations of conventional cloud-based networks by providing customers with the speed and reliability they demand. Understanding your company’s needs first is essential to planning and deploying an effective edge computing framework that meets consumers’ growing expectations.

About the Author

Ernest Sampera is a co-founder at vXchnge, an award-winning, carrier-neutral colocation services provider with edge data centers across the United States.

DISCLAIMER: Guest posts are submitted content. The views expressed in this blog are that of the author, and don’t necessarily reflect the views of Edge Industry Review (EdgeIR.com).
