By Beth Stackpole
3m read time
It’s well established that data is the crown jewel of modern business. The latest wrinkle is that data collected and processed at the edge, meaning on the factory floor, in the field, or on customers’ mobile phones, is the key to unlocking insights and near-real-time decisions that can drive unrivaled business value. We asked Harish Grama, Kyndryl’s Global Cloud Practice Leader, for his perspective.
Q: Why are edge capabilities so important to data-driven business agility?
A: There is a concept called “data capital,” which associates monetary value with data, and experts predict that most of it will be created at the edge by 2025. Companies that can capture, analyze, and act on edge data will see a significant advantage over their peers, thanks to an ability to turn insights into actions that open revenue opportunities, reduce operating costs, increase profitability, and disrupt the market.
There are two data flow paradigms: data that requires an immediate response to capture its innate value, and data that has a longer lifecycle for response. Capturing real-time data at the edge for low-latency activities is most common in industrial markets, where machine or sensor data needs an immediate response to mitigate an operational failure, address a security issue, or resolve a safety concern.
The data capital in this case is the ability to prevent a costly issue such as an assembly line failure, fire, or water damage. Longer-cycle data is also captured immediately but requires deeper analysis to deliver value. One example is gathering Internet of Things (IoT) data to analyze the health of instrumentation and proactively dispatch a service rep for repairs before there is a major issue.
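To make the two paradigms concrete, here is a minimal sketch of an edge agent that reacts immediately when a reading crosses a safety threshold, while batching everything else for longer-cycle health analysis. The sensor names, thresholds, and helper functions are hypothetical illustrations, not any specific Kyndryl or vendor tooling.

```python
from collections import deque
from statistics import mean

# Hypothetical thresholds, for illustration only.
VIBRATION_ALARM_MM_S = 7.1   # immediate-response path: act on this reading now
HEALTH_WINDOW = 288          # longer-cycle path: roughly a day of 5-minute samples
WEAR_TREND_LIMIT = 0.15      # drift that suggests scheduling a service visit

recent = deque(maxlen=HEALTH_WINDOW)

def on_sensor_reading(vibration_mm_s: float) -> None:
    """Handle one reading from an edge vibration sensor."""
    # Paradigm 1: low latency; the value is lost if we wait for a round trip.
    if vibration_mm_s >= VIBRATION_ALARM_MM_S:
        trigger_local_shutdown()
        return

    # Paradigm 2: longer lifecycle; value comes from deeper trend analysis.
    recent.append(vibration_mm_s)
    if len(recent) == HEALTH_WINDOW:
        baseline = mean(list(recent)[: HEALTH_WINDOW // 2])
        latest = mean(list(recent)[HEALTH_WINDOW // 2:])
        if latest - baseline > WEAR_TREND_LIMIT:
            request_service_visit(trend=latest - baseline)

def trigger_local_shutdown() -> None:
    print("ALARM: stopping the line to prevent a failure")

def request_service_visit(trend: float) -> None:
    print(f"Wear trend of {trend:.2f} mm/s detected; dispatching a service rep")
```

In this sketch the alarm path runs entirely at the edge, while the trend check stands in for the deeper analysis that would normally happen in a data fabric or central analytics platform.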
Q: How has edge been limited in its ability to support near-real-time decision making?
A: Traditionally there has been no centralized management of remote devices, so companies had to deploy team resources at regional, local, or remote sites for management and monitoring. It has been difficult to build the concept of a data fabric at the edge: customers had to cobble together a pipeline with tools that were not fully designed for that purpose. Even if customers could duct-tape and bubble-gum a data fabric together, it was not performant, highly available, or scalable, and it lacked continuity and resiliency. These cobbled-together solutions also can’t handle real-time data volumes, so customers spend a significant amount of time maintaining these environments.
Q: How does distributed cloud change traditional edge implementation?
A: It allows companies to gain centralized orchestration, monitoring, management, and remediation without needing local resources. It supports the design of a purpose-built environment with best-of-breed data management, analytics, and business intelligence (BI) tools, with the ability to tap other public clouds for best-in-class costs for disaster recovery, backup, and business resiliency. Customers can build a data fabric or mesh with new cloud technologies that were purpose-built for such use cases. Centralized management and consolidating skills around a single public cloud provider help teams prioritize skilled resources, while cloud automation accelerates deployment, integration, and time to value.
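As a rough illustration of what centralized orchestration and remediation can look like, the sketch below shows a control-plane loop that polls each remote site and redeploys a failed workload without anyone on location. The site names, health check, and remediation step are hypothetical placeholders, not a specific cloud provider’s API.

```python
import time
from dataclasses import dataclass

@dataclass
class EdgeSite:
    name: str
    healthy: bool = True

# Hypothetical inventory of remote sites managed from one control plane.
SITES = [EdgeSite("factory-east"), EdgeSite("factory-west"), EdgeSite("retail-042")]

def check_health(site: EdgeSite) -> bool:
    """Placeholder for a real probe (heartbeat, metrics endpoint, etc.)."""
    return site.healthy

def remediate(site: EdgeSite) -> None:
    """Placeholder for automated remediation, e.g. redeploying the workload."""
    print(f"Redeploying analytics workload to {site.name}")
    site.healthy = True

def control_loop(poll_seconds: int = 60, cycles: int = 1) -> None:
    """Centralized monitoring and remediation: no staff needed at each site."""
    for _ in range(cycles):
        for site in SITES:
            if not check_health(site):
                remediate(site)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    SITES[1].healthy = False   # simulate a failure at one remote site
    control_loop(poll_seconds=0)
```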
For more on how Kyndryl can help with the distributed cloud journey, submit an online request.