Edge compute is a model that allows data to be captured as close as possible to the point of creation. What happens with that data after it’s captured depends on the organization’s use case. In general, this topology allows for a network delivery model in which each network site has its own flexible level of sophistication, so data can be processed where it makes the most sense.
In layman’s terms, edge compute is like dropping a letter off at the mailbox down the street versus taking it to the post office for direct processing. The letter is collected as close as possible to its point of creation and then sent on a journey through the mail system according to the address and the type of postage paid for. To the letter recipient, it is all the same seamless experience, and that’s key.
This is how telecommunications carriers build their own networks. Small nodes in the field, such as a wireless tower, collect local data and then backhaul it to the long-haul network for accurate delivery. Organizations can now use this same model to create more resiliency and better performance for all end users accessing the network’s tools.
In the past, colocation facilities, or colos, were perceived as a critical component to a business continuity strategy. Colos were places you could “back up your stuff” by driving your storage tapes to the facility, which was usually hyperlocal to the hub site where the information was being collected, secured and stored. This hyperlocal model doesn’t account for a local natural disaster, but it’s better than no business continuity strategy at all.
Increasingly, we are seeing organizations use colocation facilities to process data as close to the point of capture as possible, creating lower latency and a stronger end-user experience while still following the security protocols laid over the network topology. With services deployed in a data center that is connected via a ring to a colocation facility, this topology gives IT departments the greatest flexibility in how they process data and utilize connectivity in real time.
As 4G LTE, 5G, satellite and LoRaWAN technologies continue to strengthen over time and underpin internet of things (IoT) and artificial intelligence (AI) initiatives, edge compute designs will continue to proliferate as a de facto model for any organization that intends to collect and act on large amounts of data.
Edge compute allows IT departments to solve for low latency with flexible connectivity in a secure facility. As purchasing decisions and online experiences increasingly hinge on delivery models where milliseconds matter, low latency will be the king of commerce and digital consumption moving forward.
As we barrel toward more real-time decision making in the field that pins back to a network (think drones, IoT sensors, logistics/fleet management, connected vehicles, etc.), edge compute will be the only known way to realize the fruits of these technologies because of latency challenges. Dumb IoT sensors need only collect data and send it back to a smarter host, most likely in a colocation facility or data center.
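The division of labor described above can be sketched in a few lines: the sensor node only samples and forwards, and all intelligence lives at the host. This is a minimal illustration, not a real device stack; the reading source and the transport (a plain callable here) are stand-ins assumed for the example.

```python
import json
import time

def collect_readings(read_sensor, n: int) -> list:
    """A 'dumb' node gathers n raw samples; no analysis happens locally."""
    return [{"ts": time.time(), "value": read_sensor()} for _ in range(n)]

def forward(readings: list, send) -> None:
    """Ship the raw batch to the smarter colo/data-center host for processing."""
    send(json.dumps(readings))
```

In a real deployment, `send` would be an HTTP or MQTT client pointed at the edge facility's ingest endpoint; the node itself stays cheap and simple.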
All this points back to the importance of data capture taking place out in the wild. Once all that data is collected and brought to the edge, decisions need to be made about what to do with it.
As data becomes a business unto itself, edge compute solves for a critical question in that speed to market:
What needs to be processed in real time as a localized event versus what can be processed at the predetermined schedule of the organization?
Another way to think of this question is: What needs to be executed on demand versus what can be scheduled and set?
For instance, a chat between users needs to be instantaneous and in real time, but a captured transaction can be saved and then sent in a batch from the merchant to the payment processor.
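That on-demand versus scheduled split can be expressed as a simple routing rule at the edge. The event names and batch threshold below are hypothetical, chosen only to mirror the chat-versus-transaction example:

```python
# Events that must be handled immediately vs. those that can accumulate.
REAL_TIME_EVENTS = {"chat_message", "fraud_alert"}  # assumed names
BATCH_SIZE = 100                                    # assumed flush threshold

batch: list = []

def handle_event(event_type: str, payload: dict, send_now, enqueue_batch):
    """Route an event to immediate processing or to a scheduled batch."""
    if event_type in REAL_TIME_EVENTS:
        send_now(payload)              # localized, real-time action at the edge
    else:
        batch.append(payload)          # e.g., merchant card transactions
        if len(batch) >= BATCH_SIZE:
            enqueue_batch(list(batch)) # ship to the processor on schedule
            batch.clear()
```

The point is that the decision lives at the edge node itself, not at a central hub, so each site can tune what "real time" means for its own workloads.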
In a flat edge compute model, organizations can decide what types of computation and actions happen across the edge and when. This is in stark contrast to every piece of data hair-pinning back to a hub location where the hub acts and then disperses or stores the byproduct of the data.
Edge compute can also tie disparate legacy systems together in a future-proof way by using this design and delivery model. If a legacy application lacks the robust compute power needed to handle modern-day IT data decision trees, that computation can be offloaded elsewhere and returned to the legacy application in a neat package once the data processing has taken place.
Edge compute contains the power to tie disparate protocols together with tethers at the edge. The colo, or data center, can do the heavy lifting and the old school and new school protocols only need to learn how to talk to the edge versus one another.
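One way to picture that tether is as a small translation adapter running at the edge: the legacy system keeps emitting its native format, the modern service keeps speaking JSON, and neither has to learn the other's protocol. The fixed-width field layout below (a 6-character ID followed by an 8-character amount) is entirely hypothetical, for illustration only:

```python
import json

def legacy_to_json(record: str) -> str:
    """Parse an assumed fixed-width legacy record (ID:6, AMOUNT:8) to JSON."""
    return json.dumps({"id": record[0:6].strip(),
                       "amount": float(record[6:14])})

def json_to_legacy(doc: str) -> str:
    """Render a JSON document back into the legacy fixed-width layout."""
    obj = json.loads(doc)
    return f"{obj['id']:<6}{obj['amount']:>8.2f}"
```

Because both translations happen at the edge adapter, a new protocol on either side means updating one adapter, not rewiring every system-to-system integration.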
Edge is to applications what the content delivery network (CDN) was to web and media content. It allows them to perform at their highest level with a scalable number of inputs and requests. It can handle massive geographical challenges with ease, and there’s a built-in business continuity tool in the model. That checks a lot of boxes for IT and business leaders.
The main benefit of the hub-and-spoke model that has historically proliferated across large organizations is that you only have to sit your security stack in one seat: the hub. The problem is, that solution was built for a different world that no longer exists. Networks are now open and endpoints are everywhere.
Edge compute increases the attack surface while simplifying it at the same time.
The reason why is that each node on the edge, whether it’s an IoT sensor, a laptop, a cell tower, a tablet, a jetpack, a smart phone or a container, is simply an endpoint. That means your approach to securing it is similar, if not identical in a shared context, with the main variable being how many outward IP addresses or APIs are connected into that endpoint.
A critical tool for managing an edge network and topology is an endpoint detection and response (EDR) solution. EDR offers the tools to diagnose issues across all different types of endpoints, and that diagnosis becomes critical when the attack surface is stretched out.
When an organization makes the pivot to edge compute, it can build more security measures into its design. For instance, a remote sales office doesn’t need heavy security if the organization has decided that robust compute and data don’t need to rest at that facility. That makes the entire office an endpoint that can be overlaid with an SD-WAN (software-defined wide area networking) controller to manage the user experience and multiple connections in and out. Pricey point-to-point or VPN tunneling is no longer necessary, unless it serves a user with high access privileges.
That leads us to the latest buzz in cybersecurity network design, SASE. Secure access service edge (SASE) is an approach/methodology instead of a protocol or language of delivery.
SASE extends the zero-trust methodology to the next degree while implementing SD-WAN, firewall as a service (FWaaS), cloud access security brokers (CASB) and secure web gateways (SWG). At the bottom of all that alphabet soup is the value that SASE gives organizations: a contextual, intelligent, real-time decision tree of security protocols and actions to take.
That means access can be granted based on historical behavior, network context, system integration and user credentials. Since SASE is an approach, and not a box off the shelf, it is harder for an attacker to peel open once an attack begins. It certainly won’t stop an attack from occurring, but it’s naïve to think you can prevent a threat actor from bringing brute force against your network in the first place.
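A toy version of that contextual decision tree might score the signals named above and return one of three outcomes rather than a flat yes/no. The signal names, weights and thresholds here are invented for illustration; a real SASE deployment derives them from policy and telemetry:

```python
def access_decision(ctx: dict) -> str:
    """Score contextual signals; return 'allow', 'step_up' or 'deny'.

    All weights and thresholds are hypothetical, for illustration only.
    """
    score = 0
    score += 2 if ctx.get("valid_credentials") else -5  # user credentials
    score += 1 if ctx.get("known_device") else -1       # system integration
    score += 1 if ctx.get("usual_location") else -1     # historical behavior
    score += 1 if ctx.get("trusted_network") else -2    # network context
    if score >= 4:
        return "allow"
    if score >= 1:
        return "step_up"  # grant access only after additional authentication
    return "deny"
```

The middle outcome is the interesting one: instead of a binary gate, the edge can demand step-up authentication when the context is plausible but imperfect.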
One thing we like to say is “follow the wire.” If you follow that wire, you’ll find something that needs to be secured as if it is an endpoint. By reframing the edge architecture as a network of data capturing and processing endpoints, an IT organization can pivot into an intelligent line of defense, even as threats increase and complexity rises among systems and integrations.
A network designed around edge compute topology is set to be the most competitive and stable network design as we move into the next decade.
Feel free to continue delivering your letters directly to the post office if you’d like, but the mailbox at the edge of the street is sitting there waiting for you. As the pace of digital business and service continues to ramp upward, you may find the need to collect at the edge to stay competitive in the market.