The Internet of Things opens the door to new ways of helping companies; to innovative management models in business processes and goods production; and to unprecedented procedures for monitoring and executing the maintenance of both machinery and technological equipment.
As the gigantic flow of data generated by Internet of Things (IoT) processes grows, so does recognition of the advantages of capturing and analyzing certain types of data at the same time and in the same place they are generated: the sensor or device located at the network edge.
Edge analytics offers an innovative alternative for the many companies accustomed to sending the data generated by all kinds of sensors and devices to a ‘data lake’ in the cloud. In edge analytics processes, information can be processed next to the source itself, and only the data that falls outside stipulated parameters needs to be sent to the cloud. In large-scale IoT deployments this capability is critical, given the vast amounts of data generated daily at the network edges.
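The filtering logic described above can be sketched in a few lines. The range and readings below are illustrative assumptions, not values from any real deployment:

```python
# Minimal sketch of edge-side filtering: only readings outside the
# stipulated parameters are forwarded to the cloud; everything in range
# is handled locally. The band below is an assumed example.

NORMAL_RANGE = (20.0, 80.0)  # hypothetical acceptable band for a sensor

def filter_readings(readings, normal_range=NORMAL_RANGE):
    """Return only the readings that fall outside the normal range."""
    low, high = normal_range
    return [r for r in readings if r < low or r > high]

readings = [25.1, 79.9, 81.4, 19.2, 50.0]
anomalies = filter_readings(readings)  # only these would travel to the cloud
```

In this sketch, three of the five readings never leave the edge, which is exactly the bandwidth saving the approach is meant to deliver.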
This evolution in analytic processes moves away from a centralised cloud model and means that companies do not have to perform mass data transmissions of all the information generated by their sensors and devices to remote data centres, thereby gaining greater control over the ‘things’ that remain connected.
In line with this decentralization of cloud analytics, Teresa Tung of Accenture addressed the need for a real-time sensor analytics system that places the sensor’s operational function ahead of the data centre, in her session ‘Real-Time Streaming Analytics Platform’ at IoTSWC2015.
“In an IoT infrastructure, gaining visibility into real-time data involves synchronizing the incoming flow of information where it occurs, since the data centre never remains fully synchronized. In the event of a data centre or network crash, the real-time analytics platform has activity projection capabilities so that when the connection is re-established the analytics system enables synchronization and reconsiders decisions taken previously in light of the new data provided.”
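The store-and-forward behaviour Tung describes – buffering at the edge during an outage and re-synchronising once the connection returns – can be sketched as follows. The `EdgeBuffer` class and its readings are hypothetical, not part of any vendor platform:

```python
# Hypothetical sketch: an edge node buffers readings while the uplink is
# down and replays the backlog, in order, when the link is re-established,
# so the data centre can re-synchronise and revisit earlier decisions.
from collections import deque

class EdgeBuffer:
    def __init__(self):
        self.backlog = deque()
        self.connected = True

    def send(self, reading, uplink):
        if self.connected:
            uplink(reading)            # link up: deliver immediately
        else:
            self.backlog.append(reading)  # hold locally during the outage

    def reconnect(self, uplink):
        self.connected = True
        while self.backlog:            # replay buffered readings in order
            uplink(self.backlog.popleft())

sent = []
buf = EdgeBuffer()
buf.send(0.5, sent.append)   # delivered straight away
buf.connected = False        # simulate a data centre or network crash
buf.send(1.0, sent.append)
buf.send(2.0, sent.append)   # both buffered at the edge
buf.reconnect(sent.append)   # backlog drained on reconnection
```

A real platform would add persistence and deduplication, but the ordering guarantee shown here is the core of the synchronisation Tung refers to.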
Meanwhile, Jayraj Nair of Infosys noted in his presentation at IoTSWC2015 that the analytics system acts as the engine when it comes to collecting and instrumentalising information from the sensors:
“An engine can be affected in the form of high costs and a slow-down in decision-making when the volumes of information that are transmitted by streaming come and go, and we are only interested in a minimal part of the information collected: everything else is just noise. Consequently, organisations have to select the type of data and the volume of data they collect; in other words, placing rules at the edge of the network rather than making it travel twice through expensive channels. One approach is Edge Analytics”, said Nair.
THE PRESSURE IS AT THE EDGE OF IOT INFRASTRUCTURES
This is why the ecosystem of the Industrial Internet of Things is favorable to collecting and analyzing data at the edges of business networks; a location where conventional systems are neither efficient nor sufficiently resilient. IT industry leaders such as IBM, Dell, HP, Cisco and Intel, who are regular participants in the annual IoTSWC event in Barcelona, claim they now have the solutions to meet those requirements.
IoT platforms equipped with edge analytics capabilities are often installed on oil rigs, in mines and in manufacturing plants: industrial settings that operate with limited bandwidth over high-latency connections. In addition to sensors, edge analytics systems can incorporate other kinds of connected devices, such as cameras for video analytics, whose data enriches what is captured at the edge.
With an IoT network of more than 30,000 sensors in a typical smart city like Barcelona, more than 300 in a Formula 1 car, or almost 6,000 in any commercial aircraft, the main task of IoT edge analytics systems is to prevent the massive volume of repetitive edge-generated information from flooding the cloud. Here, edge analytics is being approached from two directions: the evolution of traditional gateways and the advance of server manufacturers.
Depending on the industrial sector, the sensor or edge might be in a manufacturing plant, a crop field, one end of an oil rig or in a warehouse. Industries related to hydrocarbon logistics, the production of goods, telecommunications, transportation, retail, healthcare and consumer technology companies are just a few examples of sectors that are already taking advantage of the opportunities offered by IoT. Having a properly integrated IoT edge analytics platform means that these companies have the ability to analyse data from multiple remote environments in real time – from wind turbines in the desert to smart energy distribution networks, or from the pipeline to a manufacturing plant – enabling the possibilities of predictive maintenance and optimisation processes.
The growth that analysts and industry experts predict for the use of Internet of Things devices and applications is huge, as is the growing interest in taking analytics to the very edge of the IoT infrastructure. In this respect, Gartner estimates that IoT endpoints – in other words, the sensor-related market – will grow by over 30% year-on-year between 2013 and 2020, by which time it will have reached an installed base of 20.8 billion units, most of which will end up in the industrial sector.
MAJOR PLAYERS IN THE IT ARENA ARE PUTTING MONEY ON THE COMPUTATIONAL POTENTIAL THAT EDGE ANALYTICS BRINGS TO IOT SYSTEMS
Companies such as Intel and Cisco pioneered the implementation of edge computing by positioning their gateways as edge devices, given that gateways historically performed traffic aggregation and routing. In the edge computing model, the core functionality of the gateway has evolved: in addition to aggregation and routing, these gateways can also perform data computing functions.
In combination with edge gateways, edge analytics systems make it possible to pre-process or filter data at the same location where it is created. From there, data that follows normal parameters can be discarded or kept in a low-cost storage system, while anomalous readings can be directed to the ‘lake’ or held in the internal memory of the database.
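One way to picture this pre-processing step: in-range readings go to a cheap local store while anomalies are forwarded to the ‘lake’. The sinks below are plain lists standing in for real storage systems; the thresholds are illustrative:

```python
# Illustrative routing at the point of data creation: readings inside the
# expected band stay in low-cost local storage, anomalies go to the cloud
# 'lake'. Both sinks here are ordinary lists, not real stores.

def route_reading(reading, low, high, local_store, cloud_lake):
    """Send in-range readings to local storage and anomalies to the lake."""
    if low <= reading <= high:
        local_store.append(reading)
    else:
        cloud_lake.append(reading)

local, lake = [], []
for r in [21.0, 95.5, 48.2, -3.0]:
    route_reading(r, 0.0, 90.0, local, lake)
```

Only the two anomalous readings would ever cross the expensive uplink, while the normal ones remain available locally for later batch upload or ageing out.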
Today, a new market segment such as Edge Analytics is emerging in the IT landscape and having a full impact on the industrial IoT sector. Large organisations such as Dell, Cisco, HPE and others are trying to position their servers as edge devices through the integration of increased storage volume, computing power and analytics capabilities, which has direct implications on the processes of Edge Analytics for the IoT.
Intel and Cisco have been working on edge analytics for IoT for some time. ParStream, acquired by Cisco, created a lightweight database management system (less than 50 MB) developed to integrate into IoT platforms that manage wind turbines. More recently, Cisco and IBM have formed an alliance to bring Watson’s capabilities to the edges of IoT infrastructures.
Meanwhile, Intel’s presence in the edge analytics segment is based on a suite of APIs and security systems from McAfee (now owned by the chipmaker). Downloadable through the IoT Developers Kits website, this Intel end-to-end platform includes the Wind River Edge Management System, an IoT gateway, cloud analytics, McAfee security, identity privacy clusters and the possibility of establishing synergies with Cloudera, a company in which Intel has invested.
For companies that have decided to bring certain data analytics functions to the sensor side via SQL, some firms on the IT scene offer convergent IoT systems that help generate more value by providing analytics and machine learning at the very end of the system – precisely where things are – at the moment the data is captured. One such company is Hewlett Packard Enterprise, with its HPE Vertica Advanced Analytics Platform.
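As a hedged illustration of running SQL analytics next to the sensor, the sketch below uses SQLite as a stand-in embedded edge database. HPE Vertica, the product named above, has its own SQL engine; nothing here reflects its actual API, and the table and threshold are invented for the example:

```python
# Sketch of SQL-based analytics at the edge, with SQLite standing in for
# an embedded edge database. Readings are queried locally, and only the
# rows exceeding a stipulated threshold would be forwarded upstream.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory store living at the edge
conn.execute("CREATE TABLE readings (sensor_id TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [("t1", 42.0), ("t1", 97.5), ("t2", 12.3), ("t2", 88.8)],
)

# Flag readings above an assumed threshold before anything is uploaded.
rows = conn.execute(
    "SELECT sensor_id, value FROM readings WHERE value > 85 ORDER BY value"
).fetchall()
```

The point of the design is that the full table never leaves the device; the query result – two rows here – is all that needs to travel.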