Is the Internet of Things the world’s most confusing tech trend? On the one hand, we’re told it’s going to be epic, and soon – all predictions are either in the tens of billions (of connected devices) or in the trillions (of dollars of economic value to be created). On the other hand, the dominant feeling expressed by end users (including at this year’s CES show, arguably the bellwether of the industry) is essentially “meh” – right now the IoT feels like an avalanche of new connected products, many of which seem to solve trivial, “first world” problems: expensive gadgets that resolutely fall into the “nice to have” category rather than “must have”. And, for all the talk of a mega tech trend, things seem to be moving at the speed of molasses, with little discernible progress year on year.
Part of the problem is perhaps one of semantics. While gadgets are indeed part of the category (and quite often very large markets unto themselves), the Internet of Things (which we define as any “connected hardware” other than desktops, laptops and smartphones) is a much broader, and deeper, trend that cuts across the consumer, enterprise and industrial spaces. Fundamentally, the Internet of Things is about the transformation of any physical object into a digital data product. Once you attach a sensor to it, a physical object (whether a tiny one like a pill that travels through your body, or a very large one like a plane or a building) starts functioning a lot like any other digital product – it emits data about its usage, location and state; it can be tracked, controlled, personalized and upgraded remotely; and, when coupled with all the progress in Big Data and artificial intelligence, it can become intelligent, predictive, collaborative and in some cases autonomous. An entirely new way of interacting with our world is emerging. The importance of the IoT perhaps emerges more clearly when you think of it as the final chapter of “software eats the world”, where everything gets connected.