The components that make up IoT have been around for many years, with the possible exception of wearables (and even wearables date back to the digital watch). What is different is the sheer number of sensor-equipped devices now being connected, and the ability of these devices to communicate in a far less proprietary manner, using existing networks to move data. These sensors produce massive amounts of data, but with large-scale storage now possible (big data), and with the ease with which that data can be turned into useful information and displayed on ubiquitous devices such as smartphones, the data becomes useful in many, many ways.
The underlying thing that makes IoT different … is commoditization: of sensors, network bandwidth, data storage, development tools and platforms, and compute power. To be useful and ubiquitous, IoT has to use very low-cost technology at every level. Automated control systems, Machine to Machine (M2M) and similar technologies have been around for decades; what has changed in the last few years is the commoditization of everything needed to create IoT.
To understand IoT and this commoditization, it is worth looking at the core layers of technology needed in an IoT solution. Additional vertical layers of management and security are not shown, as they are not strictly necessary to understand how an IoT solution works; rather, they are the ancillary but important layers needed as IoT matures.
The foundation layer of IoT is the Connected Devices. These devices contain processors, memory and sensors that let you discover something about the device: its history, its current operating state, or something about the environment in which it is located. Sensors include gyroscopes, fingerprint readers, barometers, Hall-effect sensors (which recognise whether a cover is open or closed), RGB ambient light sensors, gesture sensors, heart rate monitors, accelerometers, proximity sensors, compasses, GPS, temperature sensors and more.
When coupled together, the sensors enable the device to become smart: not only to report on its health, but also to accept commands to take action, e.g. lower the temperature, dim the light, or turn off the water supply because the carpet is now wet.
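As a minimal sketch of the idea, consider a connected thermostat that can both report its state and accept commands. The class, device name and command vocabulary here are illustrative assumptions, not any particular product's API:

```python
# A sketch of a "smart" connected device: it reports its own state and
# accepts simple commands. Names and readings are illustrative.

class SmartThermostat:
    """Reports its state and accepts simple commands."""

    def __init__(self, target_temp=21.0):
        self.target_temp = target_temp
        self.current_temp = 23.5  # would come from a real temperature sensor

    def report_state(self):
        # The device describes itself: identity and current readings.
        return {
            "device": "thermostat-01",
            "current_temp": self.current_temp,
            "target_temp": self.target_temp,
        }

    def handle_command(self, command, value=None):
        # Accept actions from the outside world, e.g. "lower the temperature".
        if command == "set_target":
            self.target_temp = value
        elif command == "lower":
            self.target_temp -= 1.0
        return self.report_state()

t = SmartThermostat()
t.handle_command("lower")
print(t.report_state()["target_temp"])  # 20.0
```

The point is simply that a device with sensors plus a small command interface is all "smart" really means at this layer; everything above it builds on that.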
The rise of smartphones has caused the cost of sensors to fall dramatically: the sheer number of smartphones shipped annually has driven the price of sensors down to commodity levels.
These sensors require networks to communicate and become useful. Historically, devices reported status via an external display (a flashing light, LED or screen), then more sophisticatedly via a terminal port, and now, with low-cost reliable networks, via communication protocols over Wi-Fi, Bluetooth, 3G and other networks. This is the Network and Communications layer of the architecture. Bandwidth costs declined 27% annually from 1999 to 2013, from $1,245 to $16 per 1,000 Mbps. The declining cost and increasing performance of networks enables faster collection and transfer of data, which facilitates IoT.
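What "much less proprietary" communication looks like in practice is a self-describing payload that any backend can parse, such as a JSON message of the kind commonly carried over MQTT or HTTP. This is a sketch only; the field names and device identifier are assumptions for illustration:

```python
# A sketch of a device publishing a reading in a non-proprietary format:
# a JSON payload such as might be sent over MQTT or HTTP. Field names
# ("device_id", "ts", etc.) are illustrative assumptions.
import json
import time

def make_reading_message(device_id, sensor, value, unit):
    # A self-describing payload: any backend can parse it without
    # vendor-specific tooling.
    return json.dumps({
        "device_id": device_id,
        "sensor": sensor,
        "value": value,
        "unit": unit,
        "ts": int(time.time()),
    })

msg = make_reading_message("thermostat-01", "temperature", 21.4, "C")
print(json.loads(msg)["sensor"])  # temperature
```

Contrast this with the historical alternatives in the paragraph above: a flashing LED conveys one bit, a terminal port conveys text to whoever is standing next to the device, but a standard payload over a commodity network conveys structured data to any system anywhere.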
The Big Data Stores are the next important layer of the architecture. The plural is intended, as a solution may have several different types of storage requirement, e.g. images, value pairs (time and reading), or audio.
Firstly, the mega-trend affecting big data is the decreasing cost of digital storage, which fell 30% annually from 1992 to 2013, from $569 to $0.02 per gigabyte. It is now affordable to store the data that the proliferation of sensors produces. Secondly, the advent of big data solutions such as Hadoop, Cassandra, MongoDB and MapR means that the large amounts of data collected can be captured, stored, analysed, curated, searched and shared, well beyond the capability of traditional relational database systems such as Oracle.
The next layer relates to Decision Support Tools. Without automation the sheer quantity of data is unmanageable and largely unusable. Aside from showing a trend line on a graph, unless the data is processed in some manner it does not become information. For example, temperature readings from a thermostat can easily be stored and displayed, but when processed they start to provide useful information, e.g. the average temperature at a time of day, or the variance of the current measurement from that average, from the last reading, or from the last five readings.
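Those derived measures are simple to compute once the raw readings are stored. A small sketch, using made-up readings, of exactly the statistics mentioned above:

```python
# A sketch of turning raw thermostat readings into information: the
# average, and the variance of the current reading from that average,
# from the last reading, and from the last five. Data is illustrative.
from statistics import mean

readings = [20.1, 20.4, 20.3, 21.0, 21.6, 22.2]  # (time-ordered, in C)

average = mean(readings)
current = readings[-1]

vs_average = current - average            # current vs overall average
vs_last = current - readings[-2]          # current vs previous reading
vs_last_five = current - mean(readings[-6:-1])  # current vs last 5 readings

print(round(vs_average, 2))  # 1.27
print(round(vs_last, 2))     # 0.6
```

None of this is sophisticated analytics; the point is that even trivial processing converts a pile of numbers into something a person or an application can act on.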
The filtering and post-processing (e.g. averages, changes) of the data is the first step. The second step is to apply business rules to that data, generally known as Event Processing. These processes create the triggers that advise the applications to act: to make some decision and take some action based on the current state of the data being received.
The Application layer is where the business functionality lives, whether it is an iPhone app that tells you your home alarm has activated, or a complex piece of scheduling software that uses multiple data points from sensors to schedule predictive maintenance of large assets.
With the cost of compute declining 33% annually from 1990 to 2013, from $527 to $0.05 per one million transactions, and with the advent of pay-as-you-go cloud offerings from companies such as AWS, the cost of building and running applications has fallen dramatically. That has enabled the creation of many more digital businesses that formerly would have needed to raise significant capital to get established. Now they can fund the infrastructure as demand rises.
This post has not mentioned the significant contribution of Open-Source to the IoT revolution. Most of the trends above have also been driven by the availability of open source software, which enables the building of protocols, infrastructure software and applications without paying costly licence fees. Open-Source and IoT will be the topic of a forthcoming blog.
Statistics sourced from Internet Trends 2014, Mary Meeker, May 28, 2014, kpcb.com/InternetTrends