Data volumes are exploding, and the Internet of Things (IoT) is one of the driving forces.
Users want to use incoming IoT data to make better decisions, serve customers better, and innovate with new business models. This means they need data across the full time scale, from real-time streams for immediate analysis and action to long-term history for strategic planning. For the latter, the data must be stored efficiently and made available for large-scale analytics.
That is where cloud data lakes come in: they are cost-efficient and scale almost infinitely. However, they provide only low-level storage, so additional technology is needed to realize their full potential.
So, at Dremio’s Subsurface cloud data lake conference on July 30, 2020, Software AG showed how to use data lakes as essential building blocks for long-term storage of IoT data.
You could see how data is moved from the IoT platform to the data lake, how it is laid out, and how efficient querying by various consumers – such as IoT platforms, business intelligence tools, and machine-learning applications – is achieved.
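As an illustration of how such an efficient layout can work, consider time-based partitioning (a minimal sketch; the directory scheme, names, and function below are assumptions for illustration, not Cumulocity IoT DataHub's actual format). When data files are organized into one directory per day, a query engine can answer a time-range query by reading only the matching directories and skipping everything else:

```python
from datetime import datetime, timedelta

def partition_dir(ts: datetime) -> str:
    # Hypothetical Hive-style time partitioning: one directory per day.
    return f"measurements/year={ts.year}/month={ts.month:02d}/day={ts.day:02d}"

def partitions_for_range(start: datetime, end: datetime) -> list[str]:
    # A query engine can prune every directory outside [start, end]
    # and scan only the ones returned here.
    dirs, day = [], start.date()
    while day <= end.date():
        dirs.append(partition_dir(datetime(day.year, day.month, day.day)))
        day += timedelta(days=1)
    return dirs

# Example: a three-day query touches only three directories,
# no matter how many years of history are stored.
print(partitions_for_range(datetime(2020, 7, 30), datetime(2020, 8, 1)))
```

Pruning like this is what makes large-scale historical queries affordable: cost grows with the size of the requested time window, not with the total volume of data in the lake.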
The presentation was given by Dr. Tim Doernemann, Senior Lead Software Engineer, and Dr. Michael Cammert, Senior Manager, both members of the Cumulocity IoT DataHub R&D team.
They also discussed typical use cases for Cumulocity IoT, including fleet location tracking, office space management, and social-distance tracking. From a technical perspective, these span the range from near real-time processing for immediate action to historical analytics over the last five years of data.
Learn more about Cumulocity IoT by clicking below.