While a lot of the articles featured here have to do with massive centralized cloud and colocation data centers, there's also a need for compute power closer to the user. This "edge" covers a lot of ground – from the idea of a mini data center on every street corner providing real-time compute power for driverless cars, to full-sized data centers built in underserved countries to address the problem of transcontinental lag.
What does "edge data center" mean to you? Looking at the first two articles today, I would say it's any concentration of compute and networking capacity placed wherever it's needed, no matter how challenging the location. That means an edge data center can be as small as a rolling suitcase, or as big as it needs to be. What matters is bringing that computing power closer to where it's needed, so that availability is no longer undermined by latency or unreliable infrastructure.
Cloud computing vs. edge computing – is it a competition or a collaboration? Personally, I think they're complementary systems, each with its own strengths and weaknesses. If you're backing up data and retrieving it at human speeds, then the cloud can deliver – but if microseconds of latency are critical, then you need computing power on the edge, as close as possible to the task at hand.
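The distance argument is easy to sanity-check with back-of-envelope physics. The sketch below (my own illustration, not from any of the linked articles) estimates the best-case round trip imposed by the speed of light in fiber alone, assuming the commonly cited figure of roughly 200,000 km/s – every switch, queue, and protocol handshake only adds to these floors:

```python
# Propagation delay alone puts a hard floor under latency:
# light in fiber travels at roughly 200,000 km/s (about 2/3 of c),
# i.e. about 5 microseconds per kilometer, one way.
FIBER_KM_PER_S = 200_000  # assumed typical single-mode fiber speed

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time in milliseconds, ignoring
    switching, queuing, and protocol overhead."""
    return 2 * distance_km / FIBER_KM_PER_S * 1000

# Hypothetical distances for three deployment scenarios:
for label, km in [("edge site, 50 km", 50),
                  ("regional cloud, 1,500 km", 1_500),
                  ("transcontinental, 6,000 km", 6_000)]:
    print(f"{label}: {round_trip_ms(km):.1f} ms minimum RTT")
```

Even under these ideal assumptions, a transcontinental hop costs tens of milliseconds before any processing happens, while a nearby edge site stays well under a millisecond – which is the whole case for putting compute close to the task.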
Read on for this week’s “Edgy” network news, and let me know what you think in the comments!