Holger Reinhardt

Holger Reinhardt is a Product Architect & Business Developer at Layer 7. In this role, he explores opportunities related to IoT, M2M and Big Data for Layer 7. Holger has over 16 years of software development experience in Telecom and SOA appliances. Prior to joining Layer 7, he was part of the in-house incubator team at the office of the CTO of IBM WebSphere. He joined IBM through the DataPower acquisition.

July 24th, 2013

IoT: The Weighting Game

Category IoT, M2M, Security, Twitter

This must have been a scary few moments. On April 23, the main Associated Press Twitter account tweeted about explosions at the White House and President Obama being hurt. Guess what happened next? The Dow went down by over 100 points within minutes of the tweet.

So why did this happen? Regardless of whether the trades were executed by an algorithm or a human, both were treating all tweets from that AP feed as equal. They traded based on the content of a single tweet – and the resulting feedback loop caused the drop in the stock market.

Fast forward to IoT and imagine that each Twitter account is a sensor (for instance, a smart meter) and the tweets are the sensor readings. Further imagine that the stock market is the grid manager balancing electricity supply and demand. If we were to attach the same weight to each data point from each smart meter, a potential attack on the smart meters could easily be used to manipulate the electrical grid and – for instance – cause the local transformer to blow up or trigger a regional blackout via a feedback loop.

Yet strangely enough – when talking about the IoT – the trustworthiness of sensor data does not appear to be of concern. All data are created equal, or so the assumption seems to be. But data have an inherent quality or weight, inferred from the characteristics of the endpoint and how much it is trusted. Any algorithm using sensor data would need to take into account not only the data points themselves but also weight each data point based on the sensor’s actual capabilities, its identity and the trust relationship established with it.

I tried to capture this relationship in the picture below.

Endpoint Security in IoT
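
To make the weighting idea concrete, here is a minimal Python sketch. The sensor fields and the accuracy-times-trust weighting scheme are illustrative assumptions on my part, not a prescribed algorithm:

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    sensor_id: str
    accuracy: float  # 0.0-1.0: how precise the hardware is
    trust: float     # 0.0-1.0: how strongly we trust its identity

def weighted_average(readings):
    """Combine readings, discounting low-accuracy or weakly trusted sensors."""
    total_weight = sum(s.accuracy * s.trust for s, _ in readings)
    if total_weight == 0:
        raise ValueError("no trustworthy readings")
    return sum(s.accuracy * s.trust * value for s, value in readings) / total_weight

meters = [
    (Sensor("meter-1", accuracy=0.95, trust=0.90), 4.2),
    (Sensor("meter-2", accuracy=0.95, trust=0.05), 99.0),  # suspicious endpoint
]
print(weighted_average(meters))  # ~9.2, vs. 51.6 for a naive unweighted mean
```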

How can we account for the risk that not all data are created equal?

Credit card companies provide a good object lesson in the way they have embraced inherent insecurity. They decided to forgo stronger security at the endpoint (the credit card) in order to lower the bar for use and increase market adoption. But in order to limit the risk of fraudulent use, every credit card transaction is evaluated in the context of the most recent transactions.

A similar approach will be required for IoT. Instead of chasing impossible endpoint security, we should embrace the management of (data) risk in the decision-making process. An advanced, high-performing API Gateway like Layer 7’s can be used to perform data classification at the edge of the enterprise and attach labels to the data flowing through the Gateway and into the control processes.
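
As a thought experiment, such edge classification might look like the sketch below. The label names, checks and message shape are invented for illustration; a real gateway such as Layer 7’s would express this as configurable policy rather than application code:

```python
def classify(message, known_sensors):
    """Attach a risk label at the edge; downstream logic weights accordingly."""
    if message.get("sensor_id") not in known_sensors:
        label = "unverified"      # endpoint identity not established
    elif not message.get("signed", False):
        label = "unsigned"        # payload integrity not guaranteed
    else:
        label = "trusted"
    message["risk_label"] = label
    return message

msg = classify({"sensor_id": "meter-7", "value": 4.2}, known_sensors={"meter-1"})
print(msg["risk_label"])  # -> unverified
```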

I’d be curious to learn if and how you would deal with the data risk. Do you assume that all data are created equal? Or does the above picture resonate with your experiences?

July 19th, 2013

What I Learned in Helsinki: The Core Motivation of IoT & Other Reflections

Category Conferences, IoT, M2M

Last month, I attended a two-day IoT-A workshop during IoT Week in Helsinki. The goal of the workshop was to showcase the various IoT research projects that are jointly funded by industry and the EU’s FP7 research program. The quality of the projects on display was amazing and I could not possibly do them justice in the space of a blog post. Still, here’s a partial list of what I saw:

BUTLER

  • Horizontal open platform for IoT
  • To learn the intent of a user requires a horizontal approach
  • This horizontal approach leads to context awareness

FI-WARE

iCore

  • Composed of Virtual Objects, Composite Virtual Objects and a Service Layer
  • User characteristics + situation awareness = intent recognition

OpenIoT

  • Linked sensor middleware
  • Data management instead of infrastructure management
  • Uses information interoperability and linked data to enable automated composition

ComVantage

  • Manufacturing automation
  • Uses XACML and extends it for linked data

CHOReOS

  • Probabilistic registration of things
  • Registration decisions are based on existing density and coverage requirements

To get a more complete picture, you can find all the presentations from the workshop here.

There were two key insights I took away from this workshop, both of which had to do with subtle similarities shared by all the projects.

First, sitting in and listening to the various presentations, I was struck by one particular similarity: at the core of each use case was the desire to make better-informed decisions. I’ve tried to capture what I call the core motivation of IoT in the picture below.

The identity of the user or thing, combined with temporal and/or spatial context, real-world knowledge and data from the past, can allow us to make better-informed decisions for the future. I think this holds for both the smart coffeemaker and the smart city.
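
As a toy rendering of that formula, consider a sketch like the following, where the names and thresholds are all invented:

```python
def decide(identity, context, history):
    """Combine identity, context and past data into a decision."""
    baseline = sum(history) / len(history) if history else 0.0
    # The smart coffeemaker: brew only for a known user, at a plausible
    # time of day, when past usage suggests there will be demand.
    if identity == "known-user" and 6 <= context["hour"] <= 9 and baseline > 0.5:
        return "brew"
    return "wait"

print(decide("known-user", {"hour": 7}, history=[1.0, 0.8, 0.9]))  # -> brew
```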

My other insight had to do with the surprisingly similar characteristics of the various presented IoT applications. I tried to capture these characteristics in the picture below.

At the heart of the applications lies data – lots of data. But Big Data has two siblings: Fast Data and Open Data. The applications are graph-structured based on the relationship of things to each other and to me. They are event-driven rather than transactional and they are compositional.
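
A minimal sketch of those last two characteristics, with invented things and event names, might look like this:

```python
from collections import defaultdict

# Graph-structured: things are defined by their relationships to each
# other and to me.
relationships = {
    "me": ["phone", "home"],
    "home": ["thermostat", "front-door"],
}

# Event-driven: components subscribe to events instead of invoking each
# other transactionally.
handlers = defaultdict(list)

def on(event_type, handler):
    handlers[event_type].append(handler)

def emit(event_type, **payload):
    for handler in handlers[event_type]:
        handler(payload)

on("front-door.opened", lambda event: print("thermostat: switching to home mode"))
emit("front-door.opened", source="front-door")
```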

What do you think? What kind of similarities do you see between the various applications?

June 7th, 2013

IoT Tech Talk Follow-Up


Last week, I had the opportunity to answer questions about the Internet of Things (IoT) when I took part in Layer 7’s monthly API Tech Talk. We had a tremendous response, with lots of questions and a very active online discussion. You can find a replay of the Tech Talk here. I’d like to take this opportunity to answer a few of the questions we received during the webcast but didn’t have time to answer on the day.

How does Layer 7 help me manage a range of devices across IoT?
IoT is an opportunity for CA and Layer 7 to bring together identity, access and API Management.  To paraphrase a comment on a recent Gigaom article: Everything with an identity will have an API and everything with an API will have an identity.

With so many “things” potentially accessing APIs, what are some strategies for securing these APIs across such a breadth of consumers?
Identify, authenticate and authorize using standards. API for IoT means managing identity for many devices at Internet scale.

How will API discoverability work with the vast number of things, especially if we see REST as the primary communication style?
I reached out to my colleague Ronnie Mitra for this answer. Ronnie pointed out that, in the past, standards like UDDI and WSRR promised to provide service registries but that didn’t really work out. Nowadays, we see lots of independent human-oriented API registries and marketplaces that might have more chance of surviving. There are even some runtime discovery solutions like Google’s discovery interface for APIs and the use of the HTTP OPTIONS method to learn about APIs. At the moment, lots of people are trying lots of things, unsure of where it will all end up. It would be interesting to dive deeper into why we need discoverability to power IoT and when that discoverability has to take place.
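
For instance, a client can probe a resource at runtime with OPTIONS rather than hardcoding its capabilities. A minimal sketch using Python’s requests library, with a placeholder URL:

```python
import requests

# Ask the resource which methods it supports instead of assuming them.
response = requests.options("https://api.example.com/things/42")
allowed = response.headers.get("Allow", "")
print("Supported methods:", allowed)  # e.g. "GET, PUT, DELETE"

if "PUT" in allowed:
    pass  # the client adapts its behaviour to what it discovered
```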

How can API security get easier when API demand grows exponentially? There’s a big disconnect.
It doesn’t get easier. Transport-level security is reasonably well understood but endpoint identity and trust will be challenging.

Where will the intelligence be in IoT? Will there be some form of on-site intelligence, so that core functionality continues even if the connection is lost? Or will all intelligence be cloud-based?
It depends on whether you design for centralized “hub and spoke” or decentralized “domains of concern”. In the former, a central hub correlates data and events within its domain; in the latter, domains communicate with each other (I owe this concept to Michael Holdmann’s blog). A “domains of concern” design communicates with different domains for different purposes – in an apartment for home automation, in an apartment building for HVAC, in a city block for energy generation/consumption, in a city for the utility grid and so on. Emergencies or out-of-bound signals are handled like exceptions and bubble up through the domains until intercepted. But most things will serve an inherent purpose and that purpose will not be affected by the absence of connectivity. There will be intelligence within the core of each domain as well as at the edges/intersections with other domains.
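
To illustrate the escalation behaviour, here is a toy model; the domain names and thresholds are invented:

```python
class Domain:
    def __init__(self, name, parent=None, limit=100.0):
        self.name, self.parent, self.limit = name, parent, limit

    def handle(self, signal):
        if signal <= self.limit:
            print(f"{self.name}: handled locally")
        elif self.parent:
            print(f"{self.name}: out of bounds, escalating")
            self.parent.handle(signal)
        else:
            print(f"{self.name}: top-level emergency response")

city = Domain("city", limit=10_000)
building = Domain("building", parent=city, limit=500)
apartment = Domain("apartment", parent=building, limit=100)

apartment.handle(42.0)     # handled within the apartment
apartment.handle(2_000.0)  # bubbles up through the building to the city
```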

What is the best way to overcome fear of exposing data via APIs in an enterprise?
You need to identify a business opportunity. Unless you know what business impact you are trying to achieve and how you will measure it, you should not do it.

Does IoT require a strong network or big data or both?
Not a strong network but ubiquitous connectivity. Not big data but sharing/correlating data horizontally between distinct vertical silos.

What significance (benefits/drawbacks) do the various REST levels have with respect to the Internet of Things (connecting, monetizing etc.)?
I had never heard of levels of REST and had to look it up. It turns out the levels (of the Richardson Maturity Model) are: resources, verbs and hypermedia. Hypermedia would allow you to build long-lived clients, which could adapt to changes in API design. But it is actually the data or service behind the API which is monetizable, not the API itself. The API is just the means to an end.
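
As a sketch of that hypermedia level: the client follows link relations from the response rather than hardcoding URLs, so the API can be restructured without breaking the client. The response shape and URL below are generic illustrations, not a specific standard:

```python
import requests

doc = requests.get("https://api.example.com/meters/7").json()
# e.g. {"value": 4.2,
#       "links": [{"rel": "history", "href": "/meters/7/history"}]}

links = {link["rel"]: link["href"] for link in doc.get("links", [])}
if "history" in links:
    # Follow the advertised link; the server is free to move it later.
    history = requests.get("https://api.example.com" + links["history"]).json()
```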

How will IoT evolve? And more importantly how can enterprises solve the security and privacy issues that will arise as IoT evolves?
Culturally, the European regulators will try to put privacy regulations in place sooner rather than later, whereas the North American market will initially remain largely unregulated until some abuse prompts the regulator to step in. In Germany, the federal regulator tries to stay ahead of the market and recently published a security profile for smart meters. Personally, I would design M2M and IoT applications on the assumption that endpoint data is inherently unreliable and that I cannot necessarily trust the source. But that is very broad guidance and may or may not be applicable to a specific use case.

As we create API frameworks that interact with sensors and control objects in the IoT, what/who are the best organizations to follow to learn about new protocols that we should be preparing to handle, such as CoAP etc.?
Here are some suggestions:

How close are we to having a unified platform for IoT application developers and who is likely to be the winner among the competing platforms?
Chances are there won’t be a winner at all. You have companies like Axeda, Exosite, Gemalto, Digi, Paraimpu, BugLabs, ThingWorx, SensiNode, deviceWISE and more. You have industry working groups like Eclipse M2M and various research efforts like the SPITFIRE project, Fraunhofer FOKUS, DFuse and many others. The Eclipse M2M framework is probably a good choice to start with.

Even assuming ubiquitous and common networking (e.g. IPv6 on the public Internet) – how will the IoT identify peers, hierarchy and relationships?  
I think there is a huge opportunity for identity companies like CA to figure this out. Take a look at EVRYTHNG as one of the few startups in that space. Meanwhile, the folks over at Paraimpu are trying to tackle this challenge by combining aspects of a social network with IoT.

May 27th, 2013

The Nuts & Bolts of the Internet of Things

Category IoT, M2M, Tech Talks

A few days ago, I talked with Brian Proffitt of ReadWrite about the Internet of Things (IoT) and I’d like to take this opportunity to share some of his questions.

One of Brian’s first questions was about the difference between M2M and IoT. The best answer I could give him was actually one I had found through an M2M group on LinkedIn: “I see M2M platforms as mainly enabling vertical integration, as they have historically, of a single capability; where I see IoT as more about horizontal integration of multiple capabilities and resources into a larger system. M2M is about communication, IoT is about integration and interoperability.”

So, whereas M2M feeds data into existing vertical silos, IoT is layered on top horizontally, correlating and integrating data from different silos. A good illustration of this vertical-versus-horizontal distinction was provided in a recent More with Mobile article. The realization that unlocking the commercial potential of IoT first and foremost requires a new model of data sharing inspired us to create the Layer 7 Data Lens Solution.

Another question that Brian posed was about the protocols and standards underpinning the M2M/IoT ecosystem. Here is my short list of key protocols (in no particular order):

I’d certainly be interested to hear if you have any additions to the list. You’ll find background information about IoT protocols on Telit’s M2M blog and Michael Holdmann’s blog. Also, Michael Koster published a very interesting blog post about adding event-driven processing to REST APIs, trying to reconcile IoT’s need for event-driven patterns with a RESTful API approach.

I’ll be discussing IoT in more detail myself when I take part in Layer 7’s latest API Tech Talk, on Wednesday May 29 at 12pm EDT/9am PDT. If I answer your IoT-related question live during the Tech Talk, Layer 7 will send you a free T-shirt. See you on Wednesday!

April 4th, 2013

Focusing on the Byte-Sized Tree: The IoT Conundrum


Yesterday, we introduced the concept of a Data Lens for aggregating and sharing data. Today, I want to talk about why this concept matters to organizations concerned with the Internet of Things (IoT).

Simply put: “things” generate petabytes of data. Putting sensors on everything, as both Cisco and GE propose, creates a data nightmare. Hadoop has made analyzing big volumes of data much easier but what happens when you want to share a small sliver of that information with a customer or partner? After all, the purpose of “Big Data” collection is not altruism – it’s about monetization. In many situations, this will only be possible if data can be shared easily.

A Data Lens gives IoT data owners – such as manufacturers or telco carriers – an easy and secure way to share a focused and billable data set with their customers and partners. Anything outside the scope of a Data Lens cannot be accessed, whereas anything inside the lens is “in focus”. The data in focus can be raw or aggregated. There can be any number of Data Lenses on a data set. They can be used internally or shared securely with external partners and customers. Data access through individual Data Lenses can be governed by service level agreements and – through metering – monetized.
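
Conceptually, a Data Lens behaves like a scoped, metered view over a larger data set. The following sketch is only my illustration of the idea, not the Layer 7 implementation; all field names are invented:

```python
class DataLens:
    def __init__(self, name, predicate, fields):
        self.name = name
        self.predicate = predicate  # which records are "in focus"
        self.fields = fields        # which attributes are exposed
        self.requests_served = 0    # metering hook for SLAs and billing

    def query(self, records):
        self.requests_served += 1
        return [{field: record[field] for field in self.fields}
                for record in records if self.predicate(record)]

# A partner sees only readings from their own region, minus private fields:
lens = DataLens("partner-north",
                predicate=lambda r: r["region"] == "north",
                fields=("meter_id", "kwh"))
data = [{"meter_id": 1, "kwh": 4.2, "region": "north", "owner": "private"},
        {"meter_id": 2, "kwh": 3.1, "region": "south", "owner": "private"}]
print(lens.query(data))  # -> [{'meter_id': 1, 'kwh': 4.2}]
```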

For manufacturers and network operators looking at ways to share focused data slices from their Big Data, a Data Lens solves a big problem. By leveraging the Layer 7 API Gateway’s unique ability to focus on small data sets inside larger ones and to present these data sets as secure APIs, customized to specific customers or partners, it’s possible for IoT operators to drive new revenue from their Big Data.