With the exponential growth of technology, the future of all activities involves an omnipresence of widely connected devices, better known as the 'Internet of Things' (IoT). In its report, McKinsey estimates a base of 1 trillion interconnected IoT devices by 2025, while recent publications by Cisco in June 2017 indicate that we have already reached the Zettabyte Era and that the number of devices connected to the Internet is growing exponentially. The increasing range of real-world IoT deployments multiplies the sources of data generation, thereby compounding the challenges already faced in the Big Data space, particularly in moving data from one end of the network infrastructure (data sources such as sensor/IoT devices at the edge) to the other (centralized data centres in the cloud). Sending the entire data set across these extremes becomes an unrealistic solution, especially in scenarios with constrained network bandwidth or low/no Internet connectivity. Instead, approaches that collect data and perform computational processing near the data source present a more practical alternative, and are beneficial for several reasons; video is a case in point, since transporting it across the infrastructure can claim considerable network resources, including storage at each node from source to destination. While IoT deployments vary across use cases, the most common underlying aim is to analyse the data generated by the devices to achieve a specific objective.
Journal: IEEE IoT Newsletter
Publication status: Published - 2018