October 1st, 2013

Cyber Security Awareness Month & the Internet of Vulnerable Things

Did you know that October 2013 is the 10th National Cyber Security Awareness Month in the US? While I usually emphasize the enormous potential of the Internet of Things (IoT), let’s use the occasion to look at the security risks of the Internet of really vulnerable things.

Over the last couple of months, a casual observer could have noticed a variety of security scares related to “connected things” – from hacked baby monitors to hacked cars. In August, my colleague Matthew McLarty wrote about the security vulnerabilities of the Tesla Model S. Regulators also started to take notice and felt compelled to act.

Given that the problems appear to be systemic, what can companies do to mitigate the risks for connected devices? Rather than looking for yet another technological solution, my advice would be to apply common sense. This is an industry-wide problem – not because of a lack of technology but because security and privacy are afterthoughts in the product design process. To get a feeling for the sheer scale of the problem, I suggest taking a look at the search engine Shodan. Both SiliconANGLE and Forbes have recently run articles covering some of its findings.

Yet these problems did not start with IoT. For instance, Siemens was shipping industrial controllers with hardcoded passwords before the dawn of IoT – enabling the now infamous Stuxnet attack. Despite all the publicity, there are still vulnerabilities in industrial control systems, as noted in a Dark Reading article from the beginning of the year.

All the best practices and technologies needed to address these problems exist and can be applied today. But it is a people (designer, developer, consumer) problem and a (product design) process problem, not a technology problem. Designing fail-closed (rather than fail-open) systems, using meaningful authentication, authorization and encryption settings and so on – all of this can be done today with little or no additional effort.
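
To make the fail-closed point concrete, here is a minimal sketch in Python (auth_service is a hypothetical stand-in for whatever verification backend is in place): any error during verification results in denial rather than a silent pass.

    # A minimal fail-closed access check: verification errors mean "deny".
    # auth_service is a hypothetical stand-in for a real verification backend.
    def is_request_authorized(token, auth_service):
        try:
            return bool(auth_service.verify(token))  # allow only on explicit success
        except Exception:
            # Fail-closed: a broken or unreachable auth service
            # never lets traffic through.
            return False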

Essentially, our legal process has not caught up with technology. And it won’t for as long as the lack of security merely inconveniences us rather than threatening us with loss of property – or even life! Conversely, we are pretty good at applying security best practices in aviation because most serious problems with an aircraft in flight are inherently catastrophic. So, let’s hope that the recent news of hackers accessing airplane flight control systems acts as a wake-up call for the industry.

As API Management providers, we at Layer 7 are, more often than not, actively involved in shaping the API security policies and best practices of our customers. Since we believe APIs will form the glue that will hold IoT together, we are using our API Academy to disseminate API best practices in a vendor-neutral way. Most of what we have learned regarding scalability, resilience and security from the SOA days is still applicable in the API space and will be applicable in the IoT space. As the magnitude of interconnectedness grows, security remains paramount.

August 29th, 2013

Steering Safely into the Open Enterprise

I recently wrote an article for Wired, which discussed the importance of thinking about security at every stage of your application lifecycle. This is especially important as we enter the new era of open enterprise IT. The explosive growth of mobile computing has shifted the enterprise perimeter and traditional access control mechanisms are no longer sufficient. This is even more relevant when thinking about the Internet of Things (IoT) and its rapidly evolving ecosystem.

George Reese of Dell recently published an article that discusses the Tesla Model S REST API. This API enables some remote control features on the car and is primarily used by Tesla’s smartphone apps. Great stuff, showing how mobile meets IoT meets API. The problem is that the article focuses entirely on the API’s potential security vulnerabilities. While the Tesla developers should be lauded for driving this type of innovation, they are instead being scolded for addressing security poorly.

I think this is a great example of where thinking about security all through the lifecycle would have saved the developers some embarrassment. Here are some things for them to think about with the next app or API:

  • Are there other clients besides smartphone apps that I want to access my API?
  • Are there other clients besides smartphone apps that I don’t want to access my API?
  • Are there proven standards or protocols I can use to provide access control?
  • Are there proven tools out there that can help me deliver the solution more quickly?
  • Is there a way for me to revoke a client’s access after it has been granted?

The Tesla team chose to take an unproven path with their authentication solution.  “Security by obscurity” used to be a popular approach but it doesn’t cut it in the open enterprise. In open computing, open and popular protocols like OAuth are the most secure mechanisms to use.  That may seem counter-intuitive but these protocols provide the richest set of implementation tools and breadth of use cases. This allows app developers to focus on their areas of expertise – like automotive innovation – and rely on the security experts for protection.
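
To illustrate – this is a hedged sketch in Python, not Tesla’s implementation, and the in-memory store is a stand-in for a real OAuth 2.0 authorization server or token introspection endpoint – a bearer-token check with revocation support might look like this:

    import time

    # Hypothetical in-memory token store standing in for an OAuth 2.0
    # authorization server or token introspection endpoint.
    class TokenStore:
        def __init__(self):
            self._tokens = {}  # token -> {"expires_at": float, "revoked": bool}

        def issue(self, token, ttl_seconds):
            self._tokens[token] = {"expires_at": time.time() + ttl_seconds,
                                   "revoked": False}

        def revoke(self, token):
            if token in self._tokens:
                self._tokens[token]["revoked"] = True

        def lookup(self, token):
            return self._tokens.get(token)

    def check_access(authorization_header, store):
        if not authorization_header.startswith("Bearer "):
            return False                            # no recognizable credential
        record = store.lookup(authorization_header[len("Bearer "):])
        if record is None or record["revoked"]:     # revocation: question 5 above
            return False
        return record["expires_at"] > time.time()   # expired tokens are rejected

Nothing here is exotic – which is exactly the point: once tokens are managed centrally, revoking a client’s access (the last question on the checklist above) becomes a one-line check.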

At Layer 7, our products and services help companies build the foundation for the open enterprise.  Our new Mobile Access Gateway release provides a variety of security capabilities, including smartphone access control and token revocation. Our API Academy helps clients design sustainable APIs that address all aspects of the API lifecycle, including the most practical and comprehensive security protections.

August 13th, 2013

What is DaaS?

We live in the age of Big Data but Big Data is not showing up to the party alone. Fast Data and Open Data are also coming along for the ride. This is why we need an “as-a-service” approach to data sharing. In a recent article for Big Data Republic, I explored the concept of data-as-a-service (DaaS) and some of the operational challenges associated with providing access to Big Data.

The fact that these challenges are not just theoretical considerations was driven home to me by one of our customers, who told me that he simply didn’t have enough IT cycles to keep writing and rewriting all those queries and APIs his customers were asking for.

Similarly, another recent article on Big Data Republic referred to three powerful drivers for machine learning identified by Tibco CTO Matt Quinn – drivers that I believe are equally relevant to data APIs:

  • “A surge of data being liberated from places where it was previously hidden (aka big data’s volume challenge)
  • A need for automation that manages the complexity of Big Data in an environment where humans have no time to intervene (aka Big Data’s velocity challenge)
  • An absolute requirement to create adaptable, less fragile systems that can manage the combination of structured and unstructured data without having a human write complex code and rules with each change (aka Big Data’s variety challenge)”

The efficiency gains, agility and potential for innovation created by data-centric APIs are enormous – not just with respect to Open Data but also the ability to turn data into an active asset and monetize it. For an inspiring story, head over to Andorra via FastCompany.

Meanwhile, an interesting take on the way IoT is increasingly driving data democratization – and creating new governance challenges in the process – comes from Christopher J. Rezendes and W. David Stephenson in an article on the HBR blog network. Naturally, the best place to implement and enforce data governance is in the API that provides access to the data.
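
As a minimal sketch of what enforcement at the API layer can look like – the field names and client scopes below are purely illustrative – data that a caller is not entitled to simply never leaves the enterprise:

    # Illustrative entitlements: which fields each class of API client may see.
    ALLOWED_FIELDS = {
        "public":   {"meter_id", "region"},
        "partner":  {"meter_id", "region", "hourly_kwh"},
        "internal": {"meter_id", "region", "hourly_kwh", "customer_id"},
    }

    def apply_governance(record, client_scope):
        # Unknown scopes get nothing: fail closed rather than fail open.
        allowed = ALLOWED_FIELDS.get(client_scope, set())
        return {field: value for field, value in record.items() if field in allowed}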

Secure API design and management is not rocket science. Our API Academy is offering best practices and practical advice on everything from API design to API security to API lifecycle management (and yes, that includes versioning). And if you are curious about how Layer 7’s API Management Suite can help your Big Data access challenge, download our Data Lens solution brief or contact me at hreinhardt@layer7.com.

July 24th, 2013

IoT: The Weighting Game

This must have been a scary few moments. On April 23, the main Associated Press Twitter account tweeted about explosions at the White House and President Obama being hurt. Guess what happened next? The Dow went down by over 100 points within minutes of the tweet.

So why did this happen? Regardless of whether the trades were executed by an algorithm or a human, both were treating all tweets from that AP feed as equal. They traded based on the content of a single tweet – and the resulting feedback loop caused the drop in the stock market.

Fast forward to IoT and imagine that each Twitter account is a sensor (for instance, a smart meter) and the tweets are the sensor readings. Further imagine that the stock market is the grid manager balancing electricity supply and demand. If we were to attach the same weight to each data point from each smart meter, a potential attack on the smart meters could easily be used to manipulate the electrical grid and – for instance – cause the local transformer to blow up or trigger a regional blackout via a feedback loop.

Yet strangely enough – when talking about the IoT – the trustworthiness of sensor data does not appear to be of concern. All data are created equal, or so the assumption seems to be. But data have an inherent quality or weight, conferred by the characteristics of the endpoint and how much it is trusted. Any algorithm using sensor data would need to not only take the data points into account as such but also weight each one based on the sensor’s actual capabilities, its identity and the trust relationship established with it.

I tried to capture this relationship in the picture below.

Endpoint Security in IoT
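
In code, the weighting idea can be sketched along the following lines (the trust scores are illustrative; in practice they would be derived from the endpoint characteristics in the picture above):

    # Trust-weighted aggregation of sensor readings: each reading is a
    # (value, trust) pair with trust between 0 and 1.
    def weighted_reading(readings):
        total_trust = sum(trust for _, trust in readings)
        if total_trust == 0:
            raise ValueError("no trustworthy data to act on")  # fail closed
        return sum(value * trust for value, trust in readings) / total_trust

    # Two trusted meters around 50 and one suspect meter claiming 500:
    # a naive average would report ~200, the weighted one stays near 62.
    print(weighted_reading([(50.1, 0.9), (49.8, 0.9), (500.0, 0.05)]))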

How can we account for the risk that not all data are created equal?

Credit card companies provide a good object lesson in the way they have embraced inherent insecurity. They decided to forgo stronger security at the endpoint (the credit card) in order to lower the bar for use and increase market adoption. But in order to limit the risk of fraudulent use, every credit card transaction is evaluated in the context of the most recent transactions.

A similar approach will be required for IoT. Instead of chasing impossible endpoint security, we should embrace the management of (data) risk in the decision-making process. An advanced, high-performing API Gateway like Layer 7’s can be used to perform data classification at the edge of the enterprise and attach labels to the data flowing through the Gateway and into the control processes.
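
To make the credit-card analogy concrete, here is a hedged sketch – the thresholds and label names are assumptions for illustration, not actual Gateway features – in which each reading is scored against its own recent history and labeled before entering the control process:

    from collections import deque

    # Score each new reading against the sensor's recent history and attach
    # a label, in the spirit of per-transaction credit card fraud checks.
    class ContextualScorer:
        def __init__(self, window=20, tolerance=3.0):
            self.history = deque(maxlen=window)
            self.tolerance = tolerance

        def label(self, value):
            verdict = "trusted"
            if len(self.history) >= 3:
                mean = sum(self.history) / len(self.history)
                spread = (sum((v - mean) ** 2 for v in self.history)
                          / len(self.history)) ** 0.5
                # Readings far outside the recent pattern are flagged, not dropped.
                if abs(value - mean) > self.tolerance * max(spread, 0.1):
                    verdict = "suspect"
            self.history.append(value)
            return verdict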

I’d be curious to learn if and how you would deal with the data risk. Do you assume that all data are created equal? Or does the above picture resonate with your experiences?

July 19th, 2013

What I Learned in Helsinki: The Core Motivation of IoT & Other Reflections

Last month, I attended a two-day IoT-A workshop during IoT Week in Helsinki. The goal of the workshop was to showcase the various IoT research projects that are jointly funded by industry and the EU’s FP7 research program. The quality of the projects on display was amazing and I could not possibly do them justice in the space of a blog post. Still, here’s a partial list of what I saw:

BUTLER

  • Horizontal open platform for IoT
  • Learning the intent of a user requires a horizontal approach
  • This horizontal approach leads to context awareness

FI-WARE

iCore

  • Composed of Virtual Objects, Composite Virtual Objects and a Service Layer
  • User characteristics + situation awareness = intent recognition

OpenIoT

  • Linked sensor middleware
  • Data management instead of infrastructure management
  • Uses information interoperability and linked data to enable automated composition

ComVantage

  • Manufacturing automation
  • Uses XACML and extends it for linked data

CHOReOS

  • Probabilistic registration of things
  • Registration decisions are based on existing density and coverage requirements

To get a more complete picture, you can find all the presentations from the workshop here.

There were two key insights I took away from this workshop, both of which had to do with subtle similarities shared by all the projects.

First, sitting in and listening to the various presentations, I was struck by one particular similarity: at the core of each use case was the desire to make better-informed decisions. I’ve tried to capture what I call the core motivation of IoT in the picture below.

The identity of the user or thing combined with the temporal and/or spatial context based on real-world knowledge and data from the past can allow us to make better-informed decisions for the future. I think this holds for both the smart coffeemaker and the smart city.

My other insight had to do with the surprisingly similar characteristics of the various presented IoT applications. I tried to capture these characteristics in the picture below.

At the heart of the applications lies data – lots of data. But Big Data has two siblings: Fast Data and Open Data. The applications are graph-structured based on the relationship of things to each other and to me. They are event-driven rather than transactional and they are compositional.

What do you think? What kind of similarities do you see between the various applications?