Last week, I had the opportunity to answer questions about the Internet of Things (IoT) when I took part in Layer 7’s monthly API Tech Talk. We had a tremendous response, with lots of questions and a very active online discussion. You can find a replay of the Tech Talk here. I’d like to take this opportunity to answer a few of the questions we received during the webcast but didn’t have time to answer on the day.
How does Layer 7 help me manage a range of devices across IoT?
IoT is an opportunity for CA and Layer 7 to bring together identity, access and API Management. To paraphrase a comment on a recent Gigaom article: Everything with an identity will have an API and everything with an API will have an identity.
With so many “things” potentially accessing APIs, what are some strategies for securing these APIs across such a breadth of consumers?
Identify, authenticate and authorize using standards. APIs for IoT mean managing identity for enormous numbers of devices at Internet scale.
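As a minimal sketch of device-level authentication, here is one standards-based approach: HMAC-SHA256 request signing with a per-device shared secret. The device IDs, secrets and payloads below are invented for illustration; a real deployment would provision keys through a device registry and would more likely use OAuth 2.0 tokens or X.509 certificates.

```python
import hashlib
import hmac

# Hypothetical per-device shared secrets, provisioned at manufacture.
DEVICE_KEYS = {"thermostat-42": b"s3cret-provisioned-at-manufacture"}

def sign(device_id: str, payload: bytes, key: bytes) -> str:
    """Compute an HMAC-SHA256 signature over the device ID and payload."""
    return hmac.new(key, device_id.encode() + b"\n" + payload,
                    hashlib.sha256).hexdigest()

def authenticate(device_id: str, payload: bytes, signature: str) -> bool:
    """Verify a request signature; unknown devices are rejected outright."""
    key = DEVICE_KEYS.get(device_id)
    if key is None:
        return False
    expected = sign(device_id, payload, key)
    # Constant-time comparison avoids leaking the key via timing.
    return hmac.compare_digest(expected, signature)
```

The point is less the mechanism than the scale: whatever the standard, every one of those "things" needs an identity that the API layer can verify and revoke.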
How will API discoverability work with the vast number of things, especially if we see REST as the primary communication style?
I reached out to my colleague Ronnie Mitra for this answer. Ronnie pointed out that, in the past, standards and products like UDDI and WSRR promised to provide service registries but that didn't really work out. Nowadays, we see lots of independent human-oriented API registries and marketplaces that might have more chance of surviving. There are even some runtime discovery solutions, like Google's discovery interface for its APIs and the use of the HTTP OPTIONS method to learn about an API at run time. At the moment, lots of people are trying lots of things, unsure of where it will all end up. It would be interesting to dive deeper into why we need discoverability to power IoT and when that discoverability has to take place.
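To make the runtime-discovery idea concrete, here is a small self-contained sketch using only the Python standard library: a toy HTTP server answers OPTIONS requests with an Allow header, and a client queries it to learn what it may do with a resource. The resource path and the advertised methods are illustrative only.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Toy API endpoint that advertises its supported methods via OPTIONS.
class Handler(BaseHTTPRequestHandler):
    def do_OPTIONS(self):
        self.send_response(204)  # no body needed for OPTIONS
        self.send_header("Allow", "GET, PUT, OPTIONS")
        self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client can now discover, at run time, what the resource supports.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("OPTIONS", "/devices/42")
allowed = conn.getresponse().getheader("Allow")
print(allowed)  # GET, PUT, OPTIONS
server.shutdown()
```

Whether OPTIONS, link headers or registries win out is exactly the open question above; this only shows that the plumbing for runtime discovery already exists in plain HTTP.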
How can API security get easier when API demand grows exponentially? There’s a big disconnect.
It doesn’t get easier. Transport-level security is reasonably well understood but endpoint identity and trust will be challenging.
Where will the intelligence be in IoT? Will there be some form of on-site intelligence, so that core functionality continues even if the connection is lost? Or will all intelligence be cloud-based?
It depends on whether you design for centralized “hub and spoke” or decentralized “domains of concern”. The former correlates data and events within a single domain, whereas the latter also communicates with other domains (I owe this concept to Michael Holdmann’s blog). A “domains of concern” design talks to different domains for different purposes – an apartment for home automation, an apartment building for HVAC, a city block for energy generation/consumption, a city for the utility grid and so on. Emergencies and out-of-bound signals are handled like exceptions: they bubble up through the domains until intercepted. But most things will serve an inherent purpose and that purpose will not be affected by a loss of connectivity. There will be intelligence within the core of each domain as well as at the edges/intersections with other domains.
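The exception-like bubbling can be sketched in a few lines. This is a toy model, not an architecture: the domain names, the parent/child nesting and the events each domain handles are all invented for illustration. An event is handled by the innermost domain that knows what to do with it and escalates otherwise.

```python
# Minimal sketch of "domains of concern": each domain handles what it can
# locally and bubbles anything out of scope up to its parent domain.
class Domain:
    def __init__(self, name, parent=None, handles=()):
        self.name = name
        self.parent = parent
        self.handles = set(handles)

    def signal(self, event):
        """Handle an event locally if possible, else escalate to the parent."""
        if event in self.handles:
            return f"{self.name} handled {event}"
        if self.parent is not None:
            return self.parent.signal(event)
        return f"{self.name} could not handle {event}"

# Nested domains mirroring the apartment / building / city example above.
city = Domain("city", handles={"grid-overload"})
building = Domain("building", parent=city, handles={"hvac-fault"})
apartment = Domain("apartment", parent=building, handles={"light-on"})

print(apartment.signal("light-on"))       # apartment handled light-on
print(apartment.signal("hvac-fault"))     # building handled hvac-fault
print(apartment.signal("grid-overload"))  # city handled grid-overload
```

Note that the apartment keeps working on its own events even if the city domain is unreachable, which is the local-intelligence point made above.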
What is the best way to overcome fear of exposing data via APIs in an enterprise?
You need to identify a business opportunity. Unless you know what business impact you are trying to achieve and how you will measure it, you should not do it.
Does IoT require a strong network or big data or both?
Not a strong network but ubiquitous connectivity. Not big data but sharing/correlating data horizontally between distinct vertical silos.
What significance (benefits/drawbacks) do the various REST levels have with respect to the Internet of Things (connecting, monetizing etc.)?
I had never heard of levels of REST and had to look it up. It turns out this refers to the Richardson Maturity Model, whose levels are: resources, HTTP verbs and hypermedia. Hypermedia would allow you to build long-lived clients that can adapt to changes in API design. But it is actually the data or service behind the API which is monetizable, not the API itself. The API is just the means to an end.
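To show what the hypermedia level buys a long-lived client, here is a hedged sketch: a resource representation that carries its own links, so the client follows relation names instead of hard-coding URLs that may change. The resource, field names and URLs are invented for illustration.

```python
# Hypothetical hypermedia (level 3) response for a sensor reading: the
# representation embeds links, so a client navigates by relation name.
reading = {
    "sensor": "thermostat-42",
    "temperature": 21.5,
    "_links": {
        "self":    {"href": "/sensors/thermostat-42/readings/latest"},
        "history": {"href": "/sensors/thermostat-42/readings"},
        "owner":   {"href": "/devices/thermostat-42"},
    },
}

def follow(representation, rel):
    """Resolve a link by relation name rather than by a fixed URL."""
    return representation["_links"][rel]["href"]

print(follow(reading, "history"))  # /sensors/thermostat-42/readings
```

If the server later reorganizes its URL space, a client that only ever follows "history" or "owner" keeps working – which matters when the clients are devices deployed for a decade.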
How will IoT evolve? And more importantly how can enterprises solve the security and privacy issues that will arise as IoT evolves?
Culturally, the European regulators will try to put privacy regulations in place sooner rather than later, whereas the North American market will initially remain largely unregulated until some abuse prompts the regulator to step in. In Germany, the federal regulator tries to stay ahead of the market and recently published a security profile for smart meters. Personally, I would design M2M and IoT applications on the assumption that endpoint data is inherently unreliable and that I cannot necessarily trust the source. But that is very broad guidance and may or may not be applicable to a specific use case.
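What “assume endpoint data is unreliable” can mean in practice is sketched below: readings are range-checked and rate-of-change-checked before they are trusted downstream. The thresholds and sample values are invented for illustration and would need tuning per sensor type.

```python
# Sketch of treating endpoint data as inherently unreliable: drop readings
# that are physically implausible or that jump suspiciously fast.
PLAUSIBLE_RANGE = (-40.0, 85.0)   # assumed sensor operating range, in °C
MAX_JUMP = 10.0                   # assumed max believable change per reading

def validate(readings):
    """Return only the readings that pass basic plausibility checks."""
    accepted = []
    last = None
    for value in readings:
        if not PLAUSIBLE_RANGE[0] <= value <= PLAUSIBLE_RANGE[1]:
            continue  # out of physical range: drop
        if last is not None and abs(value - last) > MAX_JUMP:
            continue  # suspicious spike relative to the last good value: drop
        accepted.append(value)
        last = value
    return accepted

print(validate([21.0, 21.5, 999.0, 22.0, 80.0, 22.5]))
# [21.0, 21.5, 22.0, 22.5]
```

This obviously does not make the source trustworthy – it only limits the damage a compromised or faulty endpoint can do, which is the spirit of the guidance above.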
As we create API frameworks that interact with sensors and control objects in the IoT, who are the best organizations to follow to learn about new protocols we should be preparing to handle, such as CoAP?
Here are some suggestions:
How close are we to having a unified platform for IoT application developers and who is likely to be the winner among the competing platforms?
Chances are there won’t be a winner at all. You have companies like Axeda, Exosite, Gemalto, Digi, Paraimpu, BugLabs, ThingWorx, SensiNode, deviceWISE and more. You have industry working groups like Eclipse M2M and various research efforts like the SPITFIRE project, Fraunhofer FOKUS, DFuse and many others. The Eclipse M2M framework is probably a good choice to start with.
Even assuming ubiquitous and common networking (e.g. IPv6 on the public Internet) – how will the IoT identify peers, hierarchy and relationships?
I think there is a huge opportunity for identity companies like CA to figure this out. Take a look at EVRYTHNG as one of the few startups in that space. Meanwhile, the folks over at Paraimpu are trying to tackle this challenge by combining aspects of a social network with IoT.