In theory, the Internet of Things, or IoT, is a fairly simple concept: connect as many devices as possible to the internet to make "things" smarter. However, the definition of IoT can vary depending on the industry a device is being developed for. In fact, a US government publication stated that there was "no consensus among commenters on a formal definition of IoT, or even on whether a common definition would be useful".
For example, some developers focus on the hardware, such as IBM, which described IoT as "the growing range of Internet-connected devices that capture or generate an enormous amount of data every day". Others, such as Vodafone, don't focus on devices at all, instead describing IoT as a "dynamic global network infrastructure with self-configuring capabilities based on standard and interoperable communication protocols". In layman's terms, this means the IoT is a vast network that can run autonomously; smart things merely connect to it.
Personally, I believe the Vodafone statement can be a bit misleading. The IoT can be, and is being, built upon established industry standards and communication protocols, but I hope to show you that the industry is far more fractured over this subject than one might think.