August 22nd, 2012

From the Vault: Understanding Mobile IAM with Forrester Research

In the new hybrid enterprise, organizations need to manage business functions that flow across their domain boundaries in all directions. Increasingly, this means using APIs as conduits for opening up information to services running in the cloud and apps running on mobile devices like the iPad. For enterprises, securing and governing these APIs is not straightforward.

Meanwhile, BYOD is making mobile access an urgent issue for enterprises, forcing them to make application functionality available to app developers in a consistent, easily-consumable, mobile-optimized manner, via APIs. As a result, enterprise technologies are evolving to support API-based mobile interactions.

Identity and access management (IAM) represents a key concern for enterprise IT, and it is particularly crucial in BYOD/enterprise mobile scenarios. Mobile IAM requires fundamentally new approaches and the adoption of new standards such as OAuth.

These are some of the most critical issues facing IT departments today, but the associated techniques and technologies are not necessarily well understood in the enterprise world. Therefore, I’d like to take this opportunity to highlight some relevant webinars from the Layer 7 archive, all of which feature Forrester Research.

If you’re facing the challenge of ensuring secure access in an enterprise mobile scenario, these resources should help you make sense of the issues:

  • How to Make Your Enterprise Applications Mobile Ready, Fast
    Leverage backend mobile middleware to deliver mobile ready enterprise APIs
    Find out more >>
  • Identity, Access & Privacy in the New Hybrid Enterprise
    Make sense of OAuth, OpenID Connect and UMA
    Find out more >>
  • A Practical Guide to API Security & OAuth for the Enterprise
    Implement OAuth as part of an enterprise-level API security solution
    Find out more >>
August 6th, 2012

To OAuth or Not to OAuth? That is the Question – The Long Road to Standardization for OAuth 2.0

To OAuth or not to OAuth? That seems to be the question many in the API business must ask themselves, now that OAuth has moved closer to becoming a standard for delegated authorization. OAuth 2.0 reached a major milestone this week on the road to becoming a standard, when the Internet Engineering Task Force (IETF) approved a draft of OAuth version 2.0. Layer 7's Chief Architect Francois Lascelles says: “This milestone solidifies the OAuth 2.0 claim of being a standard.”

But OAuth’s journey towards becoming a standard hasn’t been completely smooth. Last week, the original editor of the OAuth 2.0 specification and author of OAuth 1.0, Eran Hammer, resigned and removed his name from the specifications. Layer 7's own CTO, Scott Morrison, offered his support for the specification in a blog post titled Why I Still Like OAuth, in which he stated: “In the end, OAuth is something we all need and this is why this specification remains important. The genius of OAuth is that it empowers people to perform delegated authorization on their own, without the involvement of a cabal of security admins. And this is something that is really quite profound.”

Still, obvious questions remain: Is OAuth 2.0 a solid protocol for authentication? Should I stop building security architecture around such a tainted specification? What other means are there for authentication if OAuth has become too focused on the enterprise? Francois Lascelles will address these questions, as well as discuss and comment on the recent OAuth 2.0 draft approval, during our next live Tech Talk on August 7. Make sure you add this Tech Talk to your calendar if you want to get the event details and a reminder on the day.

On the day of the event, you can join via Livestream or Facebook.

And if you’d like to submit some questions, you can do so in advance.

July 30th, 2012

Why I Still Like OAuth

That sound of a door slamming last week was Eran Hammer storming out of the OAuth standardization process, declaring once and for all that the technology was dead and that he would no longer be a part of it. Tantrums and controversy make great social media copy, so it didn’t take long before everyone seemed to be talking about this one. In some quarters, you’d hardly know the London Olympics had begun.

So what are we to really make of all this? Is OAuth dead or at least on “the road to Hell”, as Eran now-famously put it? Certainly, my inbox is full of emails from people asking if they should stop building their security architecture around such a tainted specification.

I think Tim Bray, who has vast experience with the relative ups and downs of technology standardization, offered the best answer in his own blog:

“It’s done. Stick a fork in it. Ship the RFCs.”

Which is to say, sometimes you just have to declare a reasonable victory and deal with the consequences later. OAuth isn’t perfect, nor is it easy. But it’s needed and it’s needed now, so let’s all forget the personality politics and just get it done. And hopefully, right across the street from me here in Vancouver, where the IETF is holding its meetings all this week, this is what will happen.

In the end, OAuth is something we all need and this is why this specification remains important. The genius of OAuth is that it empowers people to perform delegated authorization on their own, without the involvement of a cabal of security admins. And this is something that is really quite profound.

In the past, we’ve been shackled by the centralization of control around identity and entitlements (a fancy term that really just describes the set of actions your identity is allowed to perform, such as writing to a particular file system). This has led to a status quo in nearly every organization that is maintained first because it is hard to do otherwise, but also because this control equals power, and power is rarely surrendered without a fight.

The problem is that centralized identity management can never effectively scale, at least from an administrative perspective. With OAuth, we can finally scale authentication and authorization by leveraging the user population itself, and this is the one thing that stands a chance of shattering the monopoly on centralized identity and access management (IAM). OAuth undermined the castle, and the real noise we are hearing isn’t infighting on the spec but the enterprise walls falling down.

Here is the important insight of OAuth 2.0: delegated authorization also solves the basic security sessioning problem of all apps running over stateless protocols like HTTP. Think about this for a minute: The basic Web architecture provides for complete authentication on every transaction. This is dumb, so we have come up with all sorts of security context tracking mechanisms, using cookies, proprietary tokens, etc. The problem with many of these is that they don’t constrain entitlements at all; a cookie is as good as a password because it really just linearly maps back to an original act of authentication.

OAuth formalizes this process but adds in the idea of constraint with informed user consent. And this, ladies and gentlemen, is why OAuth matters. In OAuth, you exchange a password (or other primary security token) for a time-bound access token with a limited set of capabilities to which you have explicitly agreed. In other words, the token expires fast and is good for one thing only. So you can pass it off to something else (like Twitter) and reduce your risk profile or — and this is the key insight of OAuth 2.0 — you can just use it yourself as a better security session tracker.
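To make this concrete, here is a rough sketch of that exchange, based on the OAuth 2.0 authorization code grant; the host, client credentials, authorization code and token values are placeholders, not taken from any real deployment. The client presents a short-lived authorization code to the provider's token endpoint and receives a time-bound access token in return:

Request:
POST /token HTTP/1.1
Host: server.example.com
Authorization: Basic czZCaGRSa3F0MzpnWDFmQmF0M2JW
Content-Type: application/x-www-form-urlencoded

grant_type=authorization_code&code=SplxlOBeZQQYbYS6WxSbIA&redirect_uri=https%3A%2F%2Fclient.example.com%2Fcb

Response:
HTTP/1.1 200 OK
Content-Type: application/json

{
  "access_token": "SlAV32hkKG",
  "token_type": "Bearer",
  "expires_in": 3600
}

The response is exactly what the paragraph above describes: a bearer token that expires quickly, is good only for what the user consented to, and can be presented on subsequent calls in an Authorization header instead of the original password.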

The problem with OAuth 2.0 is that it’s surprisingly hard to get to this simple idea from the explosion of protocol in OAuth 1.0a. Both specs too quickly reduce to an exercise in swim lane diagram detail, which ironically runs counter to the movement towards simplicity and accessibility that drives today’s Web. And therein lies the rub. OAuth is more a victim of poor marketing than bad specsmanship. I have yet to see a good, simple explanation of why, followed by how. (I don’t think OAuth 1.0 was well served by the valet key analogy, which distracts from too many important insights.) As it stands today, OAuth 2.0 makes the Kerberos specs seem like grade school primer material.

It doesn’t have to be this way. OAuth is actually deceptively simple; it is the mechanics that remain potentially complex (particularly those of the classic 1.0a, three-legged scenario). But the same can be said of SSL/TLS, which we all use daily with few problems. What OAuth needs is a set of dead simple (but nonetheless solid) libraries on the client side and equally simple, scalable support on the server. This is a tractable problem and it is coming. It also needs much better interpretation, so that people can understand it fast.

Personally, I agree in part with Eran Hammer’s wish buried in the conclusion of his blog entry:

“I’m hoping someone will take 2.0 and produce a 10-page profile that’s useful for the vast majority of Web providers, ignoring the enterprise.”

OAuth absolutely does need simple profiling for interop. But don’t ignore the enterprise. The enterprise really needs the profile too because the enterprise badly needs OAuth.

July 20th, 2012

Returning from #CIS2012

Cloud Identity Summit was definitely worth the trip. The talks were great, the audience was great and the venue was outstanding. Sign me up for next year in Napa!

It’s beautiful and quiet at Vail Cascade this morning. As I stepped outside, I’m pretty sure I saw SAML scurrying away into the trees, which is odd given this week’s proclamations that SAML is dead. Although we won’t be rid of SAML anytime soon, I do look forward to enterprise adoption of the new kid on the block: OpenID Connect. Easier, OpenID Connect-style federation is already common among consumer identity providers; enterprise identity providers should take note and follow suit. As a vendor of API Management infrastructure, we see it as our job to enable the enterprise to better reach out to its target audience, and I see support for OpenID Connect as a key component in achieving this today.

My favorite proclamation of the week came from Patrick Harding, who declared in his talk, “The Platformication of the Enterprise is Upon us Again and They Forgot Security (Again)”, that API tokens are going to be “the currency of the API economy”. The management of tokens and their lifecycle is indeed a crucial component of API Management. Consider the case of a mobile application consuming an enterprise API using an OAuth token. Such tokens are associated with the API provider, the user (subscriber), the mobile application and the mobile device. Each live token is potentially associated with multiple parties, and one of the challenges of API token management is to enable control of the right tokens by the right parties.
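To make that concrete, a hypothetical token record (the field names and values below are purely illustrative, not any particular product's schema) might tie all of those parties together:

{
  "access_token": "b0a1c2d3e4f5",
  "api_provider": "enterprise-orders-api",
  "subscriber": "jdoe",
  "client_application": "acme-field-sales-app",
  "device": "jdoe-ipad-3",
  "scope": "orders.read",
  "expires": "2012-08-20T18:00:00Z",
  "status": "live"
}

Revoking the device entry, for example, should invalidate that one token without touching the same subscriber's tokens on other devices, which is exactly the kind of per-party control described above.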

June 21st, 2012

Mobile-Friendly Federated Identity: Part 2 – OpenID Connect

 

The idea of delegating the authentication of a user to a third party is ancient. At some point, however, a clever (or maybe lazy) developer thought to leverage an OAuth handshake to achieve this. In the first part of this blog post, I pointed out winning patterns associated with the popular social login trend. In this second part, I suggest the use of specific standards to achieve the same for your own identities.

OAuth was originally conceived as a protocol allowing an application to consume an API on behalf of a user. As part of an OAuth handshake, the API provider authenticates the user. The outcome of the handshake is the application getting an access token. This access token does not directly provide useful information for the application to identify the user. However, when the provider exposes an API that returns information about the user, the application can use this as a means to close the loop on the delegated authentication.

Step 1 – The user is taken through an OAuth handshake with the provider, which knows the user’s identity

Step 2 – The application uses the access token to discover information about the user by calling an API
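For illustration, Step 1 typically begins with the application redirecting the user’s browser to the provider’s authorization endpoint; the host, client_id and state values below are placeholders:

GET /authorize?response_type=code&client_id=s6BhdRkqt3&redirect_uri=https%3A%2F%2Fclient.example.com%2Fcb&scope=openid&state=af0ifjsldkj HTTP/1.1
Host: server.example.com

The provider authenticates the user, obtains consent and redirects back with an authorization code, which the application exchanges for the access token used in Step 2.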

As a provider enabling an application to discover the identity of a user through such a sequence, you could define your own simple API. Luckily, an emerging standard covers such semantics: OpenID Connect. Currently a draft spec, OpenID Connect defines (among other things) a “user info” endpoint that takes an OAuth access token as its input and returns a simple JSON structure containing attributes about the user authenticated as part of the OAuth handshake.

Request:
GET /userinfo?schema=openid HTTP/1.1
Host: server.example.com
Authorization: Bearer SlAV32hkKG

Response:
HTTP/1.1 200 OK
Content-Type: application/json

{
  "user_id": "248289761001",
  "name": "Jane Doe",
  "given_name": "Jane",
  "family_name": "Doe",
  "email": "janedoe@example.com",
  "picture": "http://example.com/janedoe.jpg"
}

In Layer 7's SecureSpan Mobile Access Gateway OpenID Connect implementation, a generic user info endpoint is provided, which validates an incoming OAuth access token and returns attributes for the user associated with that token. You can plug in your own identity attributes as part of this user info endpoint implementation. For example, if you are managing identities with an LDAP provider, you can inject an LDAP query into the policy, as illustrated below.

To get the right LDAP record, the query is configured to take the variable ${session.subscriber_id} as its input. This variable is automatically set by the Layer 7 OAuth Toolkit as part of the OAuth access token validation. You could easily look up the appropriate identity attributes from a different source using, for example, a SQL query or even an API call – all the input necessary to discover these attributes is available to the manager.
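As a rough sketch only (the base DN, filter attribute and returned attributes are assumptions for illustration, not the gateway's actual configuration), the lookup behind the user info endpoint amounts to an LDAP search along these lines:

Base DN:    ou=people,dc=example,dc=com
Filter:     (uid=${session.subscriber_id})
Attributes: cn, givenName, sn, mail

The attributes returned by the search are then mapped onto a JSON structure like the user info response shown earlier, with ${session.subscriber_id} already populated by the access token validation.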

Another aspect of OpenID Connect is the issuing of ID tokens during the OAuth handshake. This ID token is structured according to the JSON Web Token (JWT) specification and signed using JWS. Layer 7’s OpenID Connect implementation introduces the following assertions to issue and handle JWT-based ID tokens (a decoded example follows the list):

  • Generate ID Token
  • Decode ID Token
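For reference, an ID token is a compact, signed JWT of the form header.payload.signature; decoded, its payload is a small set of JSON claims identifying the issuer, the user, the intended audience and the token’s validity window. The claim names and values below are illustrative only, since the draft was still in flux at the time:

{
  "iss": "https://server.example.com",
  "user_id": "248289761001",
  "aud": "s6BhdRkqt3",
  "exp": 1344255600,
  "iat": 1344252000
}

The two assertions map directly onto this structure: one issues and signs such a token during the OAuth handshake, the other parses an incoming ID token back into its claims.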

Note that, at the time of writing, OpenID Connect is a moving target and the specification is subject to change before finalization.