For most businesses that deal with “hard” assets (machines, drones, infrastructure, and so on), the last few years have seen an accelerating transformation as devices and machines have become more interconnected. “Industrial IoT” (IIoT) is no longer a buzzword; it is a way of doing business.
In these organizations, devices now collect, transfer and record data in ever-increasing quantities. They monitor important processes, provide new insights, boost efficiencies, and enable proactive maintenance and smooth scaling. In short, this mode of operation has become essential to making informed decisions, triggering a revolutionary change in the fundamental economics and operations of these organizations.
The data from these devices can now give organizations a detailed, clear picture of what is actually happening, with full context. Organizations no longer have to rely on second-hand knowledge or on human experience alone to comprehend what is happening. Data is now regularly fed into analytics and AI/ML systems, uncovering patterns of usage and behavior never detected before, and ultimately enabling businesses to refine their systems, their processes, and the products and services they deliver to their customers.
Organizational use of IIoT technologies has leapt from 13% in 2014 to 25% today (in 2020), according to McKinsey. The number of IIoT-connected devices worldwide is projected to explode to 43 billion by 2023, roughly a threefold increase from 2018.
More organizations are harnessing the power of such IoT insights, and are now looking to process and act upon the ever-increasing stream of data from a growing number of connected devices. As they continue to extract insights and efficiencies from these data streams, they gain a competitive edge over their competition and deliver increasing value to their markets.
IIoT is giving rise to edge computing, which takes advantage of computing resources available at the “edge” of the network, i.e. outside of traditional data centers or cloud infrastructure. In edge computing, the compute workload is placed closer to where data is created, enabling rapid (real-time) action in response to an analysis of that data. By harnessing and managing the compute power available on site, in premises such as factories, retail stores, warehouses, hotels, hospitals, distribution centers, or vehicles, developers can create applications that act on data where it is generated, in real time.
One example would be a drone involved in containment operations in a disaster zone. In such a case, an edge computing capability can analyze the data from the drone in real time, extracting and classifying information and reacting rapidly to implement safety procedures in response to a developing situation. Such a capability would circumvent the delays caused by sending data to a cloud-based system. Worse, in such volatile circumstances, the disaster itself could disrupt network connectivity to the “cloud”; over-reliance on the cloud could compromise the entire operation.
A specific real-life example of such a drone would be a forest fire-fighting drone, operated and guided from a fire truck. The fire truck would host an edge computing cluster, connected through a 5G network to the “cloud”. Firefighters and high-value assets could be kept at a safe distance from the actual fire while they manage and guide the drones working to contain it. This would change the way fires are contained, dramatically reducing cost and risk while avoiding human tragedy; significantly, it would keep firefighters out of harm's way as much as possible.
In spite of the hype surrounding 5G, it is best thought of as one means (an economical and critical one, to be sure) of achieving what is possible with edge computing. Edge computing will get an upgrade from 5G’s speed and its rapid, low-cost deployment modes. This will enable new real-time applications in video processing and analytics, drones, smart transport, self-driving vehicles, robotics, and factory automation, among many others. In essence, closed-loop operations involving many loosely coupled systems become possible.
A key facet of 5G for enabling a rich set of applications at the edge is its promise to support multiple differentiated services through network slicing. Continuing with the above example of fire-fighting drones, imagine the fire truck connected through 5G to the “cloud”. Valuable as this “tether” is, the edge computing capability of such a solution makes it a fault-tolerant and resilient application: the fire truck can operate independently even if the 5G tether is disrupted or degraded.
In all real-time applications, time plays a key role. Real-time applications need assurances that actions and reactions happen “in time” to implement practical, useful control in a feedback loop. In other words, collected data must be processed, and actions (whether analytics or actuator commands) completed, within a predefined time. This period varies from application to application, but it defines the bounds on the reaction time the system can tolerate. In general, then, it makes sense to implement an edge computing architecture that limits this reaction time (which in turn depends on the inherent latency of the system) as much as possible, so as to appeal to as broad a set of real-time applications as possible.
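To make the deadline constraint concrete, here is a minimal sketch in Python of a single sense-analyze-act cycle that checks whether it completed within a predefined reaction budget. The sensor and actuator stubs, the 50 ms budget, and the trivial threshold analysis are all hypothetical placeholders, not taken from any real system.

```python
import time

DEADLINE_S = 0.050  # hypothetical 50 ms reaction budget for this application


def read_sensor():
    # Stand-in for a real sensor read (e.g. a temperature probe on a drone).
    return 21.5


def actuate(command):
    # Stand-in for a real actuator command.
    pass


def control_step(deadline_s=DEADLINE_S):
    """Run one sense-analyze-act cycle; report whether it met its deadline."""
    start = time.monotonic()
    reading = read_sensor()
    command = "cool" if reading > 25.0 else "hold"  # trivial analysis step
    actuate(command)
    elapsed = time.monotonic() - start
    # True if the reaction completed within the tolerable time window.
    return elapsed <= deadline_s


print(control_step())
```

A real edge application would replace the stubs with device I/O and would also decide what to do when the deadline is missed (drop the stale reading, raise an alarm, fall back to a safe state), since in a feedback loop a late action can be as bad as a wrong one.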
A different dimension of real-time systems is the need to understand the association between data and time. The time at which a piece of data was generated can be critical to comprehending its relationship to other data generated elsewhere in the system (for example, the sensor data streams gathered from the sensors around a vehicle or drone). In other cases, the time associated with data is critical to determining the sequence of events, and to establishing, for example, cause and effect in an event log where events occur in different parts of a distributed system. Thus, having all data accurately associated with a common, synchronized clock is essential to making sense of the actual phenomena in the system. It is also a fundamental building block for autonomous systems.
Thus, latency and time synchronization are distinct factors, each important in its own set of applications; in some cases, the two overlap.
Expanding on the firefighting drone example above, we can clarify these two concepts and appreciate the role each plays in such applications. Latency between the sensor data (video streams, temperature readings, wind speeds, etc.) and the edge computing platform in the fire truck should be minimized so that the applications can guide the drones to react quickly to any developing or changing situation; latency, however, has minimal effect on the ability to make deductions from the stream of data received. On the other hand, if the edge computing system is trying to discern patterns in the data and make deductions about the cause (or the direction) of the fire’s spread, it is critical that the disparate data streams (including video streams) carry accurately synchronized timestamps, so that it is clear in which direction the fire is spreading, or what the cause-and-effect relationship between two events is. In this particular application, precise time synchronization may matter more than reducing latency (a 20-millisecond difference may not have much impact); in other applications, the opposite may apply.
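The value of a common synchronized clock can be sketched in a few lines of Python. The two event streams below are entirely hypothetical stand-ins for timestamped observations from two drones; because both streams are stamped against the same clock, merging them yields a single, globally ordered event log from which the sequence of events (and hence plausible cause and effect) can be read off. With unsynchronized clocks, this ordering would be unreliable.

```python
from heapq import merge

# Hypothetical event streams, each already sorted by its own timestamps,
# all stamped against a common synchronized clock:
# (timestamp in seconds, source, observation)
drone_a = [(10.002, "drone-A", "smoke detected"),
           (10.041, "drone-A", "temperature spike")]
drone_b = [(10.015, "drone-B", "wind shift to NE"),
           (10.030, "drone-B", "flame front advancing NE")]

# Merging on the shared clock produces one globally ordered timeline.
timeline = list(merge(drone_a, drone_b))

for ts, source, event in timeline:
    print(f"{ts:7.3f}s  {source}: {event}")
```

Here the merged timeline shows the wind shift preceding the advancing flame front and the temperature spike, the kind of ordering an edge analytics system would rely on to infer the fire's direction of spread.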
Real-time systems, and systems of systems, need a collection of technologies operating in a coordinated fashion to achieve the promised improvements, automation, efficiencies and agility. Taking meaningful steps towards a world of high-fidelity integration and enhanced self-monitoring requires a well-designed data management architecture, and in real-time systems such an architecture is meaningless without the full context of time synchronization and location. Leveraging technologies such as 5G, edge computing, and time synchronization at the network edge, to provide a buffer between IIoT platforms and the hyperscalers, would be a key step towards realizing Industry 4.0.