Companies Struggle to Push Workloads to the Edge

Jul 31 2019

NGD Systems

The spirit is willing, but the technology is weak. Moving to the edge is impeded by lack of processing power, space limitations, and legacy technology, according to a new survey.

Enterprises are increasingly deploying edge workloads to support data-intensive use cases like artificial intelligence (AI), machine learning (ML), real-time analytics, the Internet of Things (IoT), and 5G. But in the majority of cases, organizations aren't getting as much out of these workloads as they had expected. Edge computing is difficult: a lack of processing power, space limitations, and legacy technology all contribute to the problem.

A recent survey, conducted by Dimensional Research, sheds light on the state of edge computing in the enterprise. The survey gathered feedback from over 300 storage professionals and software developers, employed at enterprise-level companies with 1,000 or more employees, who are responsible for data-intensive workloads across a diverse range of industries. Above all, the findings show that organizations simply aren’t happy with the performance of their workloads at the edge. Only 1 in 10 respondents gave themselves an “A” grade for their compute and storage performance at the edge, while half of all respondents graded themselves a “C” or worse.

Despite the frustrating results, most organizations (55%) said they are currently deploying workloads at the edge. According to the survey, the top use cases supported by edge computing are real-time analytics (71%), machine learning (57%), and IoT (56%). And while 5G is still a few years out, 25% of respondents said they were deploying edge workloads to support 5G technology. Because the edge is integral to these bread-and-butter use cases, we can only expect deployments to keep growing.

So, with an increasing emphasis on the edge for important technologies like real-time analytics, ML, and IoT, what's causing all the headaches? When asked about the biggest challenges of performing these use cases at the edge, respondents cited the cost of legacy storage solutions (61%), difficulty ingesting and managing data (55%), and compute-storage bottlenecks (53%) as the top problems. In short, traditional storage solutions are too expensive and cumbersome to support edge workloads. And given the limited space in edge environments, they simply don't offer enough power and efficiency, which leads to bottlenecks and trouble managing huge streams of data.

How can organizations effectively crack edge computing? The answer is in the survey. When asked what their company is missing in order to effectively support edge workloads, 46% of respondents said that, above all, there is a lack of infrastructure for performing compute directly where data is stored.

Bringing compute directly to storage is a fundamental shift that makes it possible to process data right where it's created. For data-intensive edge use cases, which rely on large volumes of streaming data from many different endpoints, this is a huge shortcut: edge infrastructure no longer needs to move data between platforms for it to be processed or analyzed.
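To get a feel for the volumes at stake, here is a back-of-envelope sketch in Python. The endpoint count and per-endpoint data rate are invented for the example, not figures from the survey; the point is only how quickly raw streaming data outgrows what can reasonably be shipped somewhere else for analysis.

```python
# Back-of-envelope arithmetic with invented numbers (not survey data):
# how much raw data one edge site produces if every byte must travel
# to wherever the compute lives.

endpoints = 200                # hypothetical sensors/cameras at one site
rate_mb_per_s = 2.5            # hypothetical raw data rate per endpoint
seconds_per_day = 86_400

raw_tb_per_day = endpoints * rate_mb_per_s * seconds_per_day / 1e6
print(f"raw data produced per day: {raw_tb_per_day:.1f} TB")
# -> 43.2 TB/day that has to cross the network or storage bus if the
#    data is processed anywhere other than where it is stored.
```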

According to the survey results, 70% of enterprises are using GPU accelerators to improve compute and storage capabilities at the edge, but this is ultimately a stopgap measure. GPU accelerators, like composable architectures, NVMe over Fabrics (NVMe-oF), and FPGA accelerators, do moderately boost performance at the edge. However, they don't move the needle enough to overcome bottlenecks: they fundamentally do not change the approach to edge computing, because the data still needs to be moved between edge platforms for it to be processed or analyzed. With the massive volumes of streaming data involved in edge use cases, this approach will never work reliably.

Computational storage SSDs are the only technology with the potential to perform compute directly where data is stored. This is achieved via a concept called in-situ processing that has been deployed in some NVMe SSDs. In-situ processing embeds acceleration and parallel processing resources directly within the SSD, enabling data to be analyzed right where it resides.
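To make the contrast concrete, here is a minimal simulation in Python of the two data paths. The ComputationalSSD class and its count_in_situ method are hypothetical stand-ins invented for this illustration, not a real NGD Systems or NVMe API (real deployments would go through a vendor-provided SDK); what matters is the asymmetry in bytes that must cross the host interface.

```python
# Minimal sketch, with a hypothetical drive object, of what in-situ
# processing changes: where the scan runs and how much data moves.

class ComputationalSSD:
    """Simulated SSD that can either stream raw blocks to the host
    or run a simple scan next to the data and return only the result."""

    def __init__(self, blocks):
        self.blocks = blocks        # data resident on the drive
        self.bytes_moved = 0        # bytes crossing the host interface

    def read_all_blocks(self):
        """Traditional path: every block crosses the bus to the host CPU."""
        for block in self.blocks:
            self.bytes_moved += len(block)
            yield block

    def count_in_situ(self, pattern):
        """In-situ path: the scan runs on the drive's embedded cores;
        only an 8-byte count is returned to the host."""
        hits = sum(block.count(pattern) for block in self.blocks)
        self.bytes_moved += 8
        return hits


drive = ComputationalSSD([b"temp=21,status=ok\n" * 1_000] * 500)

# Host-side analytics: pull everything off the drive, then filter.
host_hits = sum(b.count(b"ok") for b in drive.read_all_blocks())
host_traffic = drive.bytes_moved

# In-situ analytics: push the filter to the drive instead.
drive.bytes_moved = 0
situ_hits = drive.count_in_situ(b"ok")
situ_traffic = drive.bytes_moved

assert host_hits == situ_hits
print(f"host-side moved {host_traffic:,} bytes; in-situ moved {situ_traffic} bytes")
```

Running this toy model moves roughly 9 MB for the host-side path and 8 bytes for the in-situ path for the same answer. That asymmetry, scaled up to real workloads, is what lets space- and power-constrained edge sites keep up with streaming data.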

Organizations appear to understand the value of NVMe SSDs. Currently, 60% of survey respondents are using NVMe SSDs at the edge, and 86% plan to increase NVMe SSD usage in the next two years. However, only 44% of respondents realize that some NVMe SSDs offer the ability to perform compute directly on the physical storage drive. As these enterprises become more aware of the true potential of NVMe SSDs to bring compute directly to storage, adoption will rise even faster and edge performance will increase dramatically.

Edge computing has rapidly emerged as an innovative approach to support AI, ML, and IoT. But these workloads are suffering due to a lack of processing power at the edge, while legacy storage platforms and stopgap measures like GPUs aren’t really changing the game. Through its ability to deliver processing right to the data, computational storage SSD technology has the potential to transform edge deployments.

Source: https://www.eetimes.com/companies-struggle-to-push-workloads-to-the-edge/
