Everyone is talking about the Internet of Things (IoT) and the impact it has already started to make on day-to-day life. More than anything, it is becoming increasingly clear that analysis of IoT data will be the differentiator between those who simply collect data and those who actually “use” it to drive their businesses.
Enterprises that use IoT analytics will find themselves serving customers faster than their rivals, as well as opening up additional sources of revenue.
That said, it is clear that IoT analytics will require a well-thought-out strategy on the part of the enterprise. Compared with other modern streams of data analytics, this branch is more complicated. The primary reason – the enormous volumes of streaming data that are, and will be, generated and analysed in real time.
How Does One Collect This Data?
How does one collect this data? In fact, is it necessary to collect all of it? These and many other questions need to be answered by an enterprise’s decision-makers to tackle the complexities of IoT data.
Traditional B2B or B2C analytics follows a familiar pattern: collect the raw data, land it in a central data hub, scrub it, and then hand it over to analysts to build predictive or other analytic models.
But that same structure cannot be applied to IoT data, because the huge volumes pouring out in real time make centralising it almost impossible. Imagine you run a national cold storage company with your own in-house fleet of trucks, mini-vans and warehouses. You can expect copious amounts of data to be generated on various fronts: from your fleet of “smart” vehicles, from the thermostats in each section of the cold storage unit in every town, from your supply chain, from customer records, from RFID chips, and so on. If all of this data pours into one central location, your central data servers will soon start suffering from “data lag”, and such an analytics operation would be almost impossible to scale. The decision-makers in your outfit would no longer be able to make efficient, timely business decisions.

Take the example of a freezer unit malfunctioning on a truck carrying seafood across the country. If the temperature rises even 5 degrees, the whole payload may need to be discarded for food safety reasons.
For now, one way forward is to collect and analyse the data on the smart device itself – in other words, Edge Analytics. Using the device’s own intelligence and low-cost computational power, analytics can run on the device, or close to the source, rather than at the central hub. Processing as close to the edge of the system as possible seems to be the answer, for now. The heavier Big Data analysis that builds on this “mini analysis” can then be carried out in the Cloud in real time.
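To make the idea concrete, here is a minimal sketch of what such on-device “mini analysis” might look like, using the cold-chain scenario above. The temperature threshold, window size and the publish_to_cloud function are illustrative assumptions for this sketch, not any specific vendor’s API.

```python
from statistics import mean

# Illustrative values - real thresholds come from your food-safety rules.
SAFE_MAX_TEMP_C = -18.0   # freezer payload must stay at or below this
WINDOW_SIZE = 60          # readings aggregated per summary (e.g. one per second)

def publish_to_cloud(message: dict) -> None:
    """Placeholder for whatever uplink the device actually has (MQTT, HTTPS, ...)."""
    print("-> cloud:", message)

def run_edge_loop(sensor_readings):
    """Analyse readings on the device; forward only summaries and alerts."""
    window = []
    for reading in sensor_readings:
        # Raise an alert immediately on a breach instead of waiting for a batch upload.
        if reading > SAFE_MAX_TEMP_C:
            publish_to_cloud({"type": "alert", "temp_c": reading})
        window.append(reading)
        if len(window) == WINDOW_SIZE:
            # Ship one compact summary instead of 60 raw readings.
            publish_to_cloud({
                "type": "summary",
                "min_c": min(window),
                "max_c": max(window),
                "avg_c": round(mean(window), 2),
            })
            window.clear()
```

The point of the sketch is that only compact summaries and safety alerts leave the device; the raw readings never have to travel to the central servers.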
In such a model, enterprises will have to tackle many questions, including: how much information should the sensors capture initially? How much should be analysed locally, and how much forwarded to a core location for further analysis? The analytics team will have to develop rule-based models that determine all of this, including how the gateway handles the data.
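One way to picture those rule-based models is as a small decision table that the gateway applies to every incoming message. The message fields and rules below are hypothetical; they simply illustrate how “drop, aggregate locally, or forward” decisions could be encoded.

```python
# A hypothetical rule table: each rule pairs a condition on the incoming
# message with the action the gateway should take.
RULES = [
    (lambda m: m["type"] == "alert",   "forward"),    # safety alerts go straight to the core
    (lambda m: m["type"] == "summary", "forward"),    # pre-aggregated data is cheap to send
    (lambda m: m["type"] == "raw" and m.get("delta", 1.0) < 0.5, "drop"),  # ignore negligible changes
    (lambda m: m["type"] == "raw",     "aggregate"),  # batch everything else locally
]

def route(message: dict) -> str:
    """Return the first matching action for a message; default to local aggregation."""
    for condition, action in RULES:
        if condition(message):
            return action
    return "aggregate"

# Example usage:
#   route({"type": "alert", "temp_c": -12.0})   -> "forward"
#   route({"type": "raw", "delta": 0.2})        -> "drop"
```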
A note of caution – Edge Analytics may not be the right fit for every business, and an initial feasibility study may be needed to determine whether yours needs it.