IoT Possibilities

It seems to me….

“The Internet is the most important single development in the history of human communication since the invention of call waiting.” ~ Dave Barry[1]

Humans, as a species, are learning to adapt to another round of disruptive technologies. As we develop ways of interacting with smartphones, tablets, social media, avatars, robots…, we are, unlike with previous disruptive technologies such as the printing press (books, newspapers…) or earlier communication devices (telephone, radio, television…), now always accessible wherever we are, to an extent never before possible. It also is obvious that we are only at the very beginning of this journey: once what is known as the Internet of Things (IoT) is implemented, everything will be measured, recorded, and made available for whatever purpose we can imagine. The path is new, and we have no idea where it ultimately will lead. Some will find it intimidating, others exciting, but we all realize the world as we know it will be different.

The IoT – where sensors built into domestic appliances, the buildings we live in, the clothes we wear, and the gadgets we carry communicate and share data online – is seen by many tech companies as the next great evolution of the Internet. By placing sensors on everyday objects, we can better measure the world around us, fix machines before they break, create vastly more efficient services, and even better understand our own health.

With the IoT poised to grow into a nearly $9 billion market by 2020, organizations are only beginning to tap into its potential. And, like any relatively new technology, they are running into obstacles when attempting to translate technology investment into business results.

The IoT promises an unprecedented level of granular, real-time data that can improve personal, business, and planning processes for any corporation or individual with logistical dependencies. The amount of data being generated is increasing exponentially and will continue at this rate of growth, or faster, for the foreseeable future as the IoT becomes more widely available. By 2020, the number of connected devices is expected to double to 50 billion, according to industry research. As data acquisition rates increase, fully using that data becomes correspondingly more difficult.

Much IoT project data is difficult to capture, and many companies are struggling to collect and analyze their data in a timely and effective manner. A notable share of company representatives say there is simply too much of it to deal with, while others admit they analyze it too slowly to do anything meaningful with it.

Many researchers believe the only way to avoid being buried under this avalanche of data is to incorporate machine learning so devices can teach themselves how to parse it. This long-sought goal of artificial intelligence (AI) can theoretically be implemented in many ways, one of which is through neural networks. Many approaches are possible, and it will be some time before the problem is resolved.
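To make the idea concrete, here is a minimal sketch of what a device “teaching itself to parse” its own data stream can mean in the simplest case: a single-neuron (logistic-regression) classifier, trained by gradient descent, learns to flag anomalous sensor readings. The temperature values, labels, and parameters below are all invented for illustration; real IoT deployments would use far richer models and data.

```python
# A minimal sketch (hypothetical data, standard library only): a single
# neuron learns to separate normal from anomalous temperature readings.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Synthetic training data: (temperature reading, 1 = anomaly).
readings = [(20.0, 0), (21.5, 0), (22.0, 0), (23.0, 0),
            (35.0, 1), (37.5, 1), (40.0, 1), (42.0, 1)]

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):                      # gradient-descent epochs
    for x, y in readings:
        p = sigmoid(w * (x - 28.0) + b)    # center the input for stability
        w += lr * (y - p) * (x - 28.0)     # logistic-regression update rule
        b += lr * (y - p)

def is_anomaly(temp):
    """Return True if the learned model flags this reading."""
    return sigmoid(w * (temp - 28.0) + b) > 0.5

print(is_anomaly(21.0))   # → False (a normal reading)
print(is_anomaly(39.0))   # → True  (an anomalous reading)
```

The point of the sketch is not the model, which is deliberately trivial, but the pattern: the device learns its own notion of “normal” from data rather than relying on a hand-coded threshold.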

The potential benefits of the IoT are clear, but security is also a key concern. It is not yet apparent how to ensure the protection and privacy of all the data moving from machine to machine, or being accessed by so many different systems and devices.

On the economic side, the need is to cost-effectively manufacture up to a trillion sensors to gather data; on the technical side, the challenge is building out the infrastructure. This includes enabling the transmission, storage, and analysis of volumes of data far exceeding anything we see today. Implementing these capabilities on the scale imagined for the IoT will require far more powerful memory and logic devices than are currently available. This need will drive the continued extension of Moore’s Law and demand for advanced semiconductor manufacturing capability such as atomic-scale wafer processing.

More than two out of five enterprises are either currently leveraging IoT technologies or plan to do so this year. Overall, the economic impact will amount to at least $14 trillion by 2025.

The IoT will have a widespread and beneficial impact for the indefinite future. It is coming regardless of security and performance issues because the upside is too great; its potential is seemingly limitless. The challenge will be for the current network infrastructure to support all the new devices: capacity and throughput issues will result from the huge amount of data being created, and latency will present a tremendous challenge to networks and carriers.

For most people, the implications of the IoT will be largely imperceptible, as it is increasingly incorporated into the products we use and into the corporate world that provides those products. Still, the effects will be pervasive, affecting almost everything we do or use.

That’s what I think, what about you?

[1] Dave Barry is a Pulitzer Prize-winning American author and columnist.


About lewbornmann

Lewis J. Bornmann has his doctorate in Computer Science. He became a volunteer for the American Red Cross following his retirement from teaching Computer Science, Mathematics, and Information Systems, at Mesa State College in Grand Junction, CO. He previously was on the staff at the University of Wisconsin-Madison campus, Stanford University, and several other universities. Dr. Bornmann has provided emergency assistance in areas devastated by hurricanes, floods, and wildfires. He has responded to emergencies on local Disaster Action Teams (DAT), assisted with Services to Armed Forces (SAF), and taught Disaster Services classes and Health & Safety classes. He and his wife, Barb, are certified operators of the American Red Cross Emergency Communications Response Vehicle (ECRV), a self-contained unit capable of providing satellite-based communications and technology-related assistance at disaster sites. He served on the governing board of a large international professional organization (ACM), was chair of a committee overseeing several hundred worldwide volunteer chapters, helped organize large international conferences, served on numerous technical committees, and presented technical papers at numerous symposiums and conferences. He has numerous Who’s Who citations for his technical and professional contributions and many years of management experience with major corporations including General Electric, Boeing, and as an independent contractor. He was a principal contributor on numerous large technology-related development projects, including having written the Systems Concepts for NASA’s largest supercomputing system at the Ames Research Center in Silicon Valley. With over 40 years of experience in scientific and commercial computer systems management and development, he worked on a wide variety of computer-related systems from small single embedded microprocessor based applications to some of the largest distributed heterogeneous supercomputing systems ever planned.