Growth in IoT devices is fueling demand for machine learning technology

One of the first things Matt Oberdorfer tells you about Embassy of Things, his IIoT startup, is that it’s not focused on analytics.

Instead, Embassy has developed a platform for quickly authenticating IoT devices and for parceling out and managing their access permissions. Machine data is valuable. Oberdorfer explains that more devices will get added to corporate networks, and we're going to need automated procedures to manage them (just as we created authentication processes for IT networks).

Expect to hear more stories like Oberdorfer’s as time goes on. To date, a good amount of the attention around IIoT has focused on machine learning and artificial intelligence to address problems like reducing energy consumption and predicting maintenance failures. But along the way people have discovered that if you don’t take care of the administrative, behind-the-scenes tasks (such as data prep) before plunging into analytics, you’re often going to end up with vague, inconsequential recommendations.

As a result, a market for IIoT operations technology has begun to emerge. For example, San Francisco-based Element started as an analytics company but shifted to producing software that makes it easier to build digital or asset twins, which can then serve as a platform for conducting analytics. Aperio, based in Israel, has developed a platform for smoking out falsified, corrupted or incomplete operations data. Aperio’s technology is used both for security and for data governance, explains Gilles Barnes, Strategic Product Manager at Aperio.

What’s driving this trend?

It’s a variant of the old garbage in/garbage out problem. Companies know their system data is valuable, and they want to use it. Unfortunately, it gets generated in massive volumes and conflicting formats; in its raw form, the data can be indecipherable to humans. It needs to be organized and synthesized first.

People like to say data is the new oil. Which is accurate, but in an ironic way. Like oil, there’s far more data out there than one might think, and it is often in difficult-to-reach locations; it won’t do any good until it’s passed through a complex refining process. Not quite the analogy they meant.

The middleware movement is also a reaction to the overreaching control being attempted by some of the analytics platforms, argues John Tough, a partner at Energize Ventures, a VC firm focused on industrial technologies that has invested in companies specializing in virtualization (Zededa) and IoT Security (Nozomi Networks).

“The OEM asset management/conditioning systems are all built with intentional silos and so all of these middleware companies and data originators are being built to circumvent the silos and bridge multiple assets into one visibility platform,” said Tough. “Newer engineers are looking for ways to hack around the expensive and generally unimpressive OEM software systems.”

Some analysts have suggested that the best way to deal with this flood of data is to only keep the important data. That, of course, is nonsense. People invest in IIoT because they cannot perfectly predict which values have an impact. Plus, those seemingly small and inconsequential data points are essential for a high-fidelity picture.

For example, take a look at energy data. A smart meter reporting your household’s consumption will generate around 400MB a year. Across the roughly 134 million U.S. households, that translates to 53.6PB, or about half the amount of data that gets uploaded to YouTube every year. And if you wanted to read the data every second — a level that would let a software program identify, and perhaps fine-tune, consumption in an unobtrusive manner — the total would exceed 42 exabytes. By some estimates, that is eight times the data needed to record every spoken word.
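The arithmetic behind those figures can be checked in a few lines. This is a back-of-envelope sketch, not anything from the companies mentioned; the 15-minute baseline reporting interval and decimal (SI) byte units are my assumptions, used here only to show how a per-second sampling rate blows the aggregate up into exabyte territory.

```python
# Back-of-envelope check of the smart-meter data volumes quoted above.
# Assumptions (mine): 134 million U.S. households, ~400 MB/year per meter,
# a 15-minute baseline reporting interval, and decimal units (1 MB = 10^6 B).

MB = 10**6
PB = 10**15
EB = 10**18

HOUSEHOLDS = 134_000_000
annual_per_meter = 400 * MB                 # ~400 MB/year per smart meter

total = annual_per_meter * HOUSEHOLDS       # aggregate across all households
print(f"Aggregate: {total / PB:.1f} PB/year")   # → 53.6 PB/year

# Extrapolate to one reading per second: a 15-minute interval is 900 s,
# so per-second sampling multiplies the volume by 900.
per_second_total = total * 900
print(f"Per-second sampling: {per_second_total / EB:.1f} EB/year")  # → 48.2 EB/year
```

Under these assumptions the per-second figure lands around 48 EB, comfortably past the "exceed 42 exabytes" threshold the article cites; a different baseline interval shifts the multiplier but not the order of magnitude.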

The rise of these companies like Element and Aperio will help to finally bury the rank fantasy of an IIoT platform. A few years back, companies played with the idea of buying one piece of magical software that would serve as an ambassador to the world of things. It would connect to devices, manage them, organize the data, and then serve up deep insights to save millions of dollars. You’d have one throat to choke, and it would all be managed in the cloud.

You can see that idea already fading away. Gartner recently came out with an update to its Magic Quadrant for IIoT companies. For the second straight time, no one lands in the coveted Leaders quadrant. They should call it the Empty Quarter. In fact, no one even makes the top half on “ability to execute.”

But, just as the IT world discovered back in the ’80s and ’90s, technology splinters and new problems continuously emerge. Back in 1999, some were predicting that the energy consumed by data centers would stress utilities and lead to a collapse of the Internet. Luckily, a somewhat obscure startup called VMware, founded in 1998, was already working on the problem. Engineers were also beginning to experiment with low-tech solutions like plastic sheeting to create hot-aisle/cold-aisle data centers. Containers, load balancing and other technologies have followed in their wake.

Data centers are more complex than ever. They are also larger, more efficient and expanding their services, thanks in no small part to technologies lurking in the background.

The bottom line: complexity is the only way to make things simple. Get used to it.

Originally published in Information Management on July 30, 2019