July 30th, 2012

Why I Still Like OAuth

That sound of a door slamming last week was Eran Hammer storming out of the OAuth standardization process, declaring once and for all that the technology was dead and that he would no longer be a part of it. Tantrums and controversy make great social media copy, so it didn’t take long before everyone seemed to be talking about this one. In some quarters, you’d hardly know the London Olympics had begun.

So what are we to really make of all this? Is OAuth dead or at least on “the road to Hell”, as Eran now-famously put it? Certainly, my inbox is full of emails from people asking if they should stop building their security architecture around such a tainted specification.

I think Tim Bray, who has vast experience with the relative ups and downs of technology standardization, offered the best answer in his own blog:

“It’s done. Stick a fork in it. Ship the RFCs.”

Which is to say, sometimes you just have to declare a reasonable victory and deal with the consequences later. OAuth isn’t perfect, nor is it easy. But it’s needed and it’s needed now, so let’s all forget the personality politics and just get it done. And hopefully, right across the street from me here in Vancouver, where the IETF is holding its meetings all this week, this is what will happen.

In the end, OAuth is something we all need and this is why this specification remains important. The genius of OAuth is that it empowers people to perform delegated authorization on their own, without the involvement of a cabal of security admins. And this is something that is really quite profound.

In the past, we’ve been shackled by the centralization of control around identity and entitlements (a fancy term that really just describes the set of actions your identity is allowed to perform, such as writing to a particular file system). This has led to a status quo in nearly every organization that is maintained first because it is hard to do otherwise, but also because such control equals power, which is something rarely surrendered without a fight.

The problem is that centralized identity administration can never scale effectively. With OAuth, we can finally scale authentication and authorization by leveraging the user population itself — and this is the one thing that stands a chance of shattering the monopoly on centralized identity and access management (IAM). OAuth undermined the castle, and the real noise we are hearing isn’t infighting over the spec but the enterprise walls falling down.

Here is the important insight of OAuth 2.0: delegated authorization also solves that basic security sessioning problem of all apps running over stateless protocols like HTTP. Think about this for a minute: The basic Web architecture provides for complete authentication on every transaction. This is dumb, so we have come up with all sorts of security context tracking mechanisms, using cookies, proprietary tokens etc. The problem with many of these is that they don’t constrain entitlements at all; a cookie is as good as a password because it really just linearly maps back to an original act of authentication.
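
To make that concrete, here is a minimal sketch (hypothetical names, no particular framework) of the classic session-cookie pattern: once the user has logged in, the cookie simply points straight back at that user and inherits everything the user is allowed to do.

```python
import secrets

USERS = {"alice": "correct horse battery staple"}   # hypothetical user store
SESSIONS = {}                                       # session_id -> username

def login(username, password):
    if USERS.get(username) != password:
        raise PermissionError("bad credentials")
    session_id = secrets.token_urlsafe(32)
    SESSIONS[session_id] = username   # the cookie maps linearly back to the login
    return session_id                 # this value is what gets set as the cookie

def handle_request(session_id, action):
    username = SESSIONS.get(session_id)
    if username is None:
        raise PermissionError("not logged in")
    # Whoever presents this cookie can do anything the user can do: there is no
    # scope and no expiry, so the cookie really is as good as the password.
    return f"{username} performed {action}"
```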

OAuth formalizes this process but adds in the idea of constraint with informed user consent. And this, ladies and gentlemen, is why OAuth matters. In OAuth, you exchange a password (or other primary security token) for a time-bound access token with a limited set of capabilities to which you have explicitly agreed. In other words, the token expires fast and is good for one thing only. So you can pass it off to something else (like Twitter) and reduce your risk profile or — and this is the key insight of OAuth 2.0 — you can just use it yourself as a better security session tracker.
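
As a rough sketch of that exchange (the endpoints, credentials and scope below are made up, and the grant type will vary by deployment), the flow boils down to trading the primary credential once for a short-lived, narrowly scoped token and then presenting only that token afterwards:

```python
import requests  # third-party HTTP client (pip install requests)

TOKEN_URL = "https://auth.example.com/oauth2/token"  # hypothetical authorization server
API_URL = "https://api.example.com/v1/statuses"      # hypothetical protected resource

# Step 1: exchange the primary credential for a constrained access token.
# (The resource-owner password grant is shown only for brevity; a third-party
# app would normally use the authorization-code grant with explicit user consent.)
resp = requests.post(TOKEN_URL, data={
    "grant_type": "password",
    "username": "alice",
    "password": "correct horse battery staple",
    "scope": "post_status",        # good for one thing only
    "client_id": "example-client",
}, timeout=10)
resp.raise_for_status()
token = resp.json()                # e.g. {"access_token": "...", "expires_in": 3600, ...}

# Step 2: from here on, present only the access token. It expires quickly and is
# limited to the scope the user agreed to, so handing it to another service (or
# using it yourself as a session tracker) carries far less risk than a password.
api_resp = requests.post(
    API_URL,
    headers={"Authorization": "Bearer " + token["access_token"]},
    json={"text": "Hello from a scoped, short-lived token"},
    timeout=10,
)
print(api_resp.status_code)
```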

The problem with OAuth 2.0 is that it’s surprisingly hard to get to this simple idea from the explosion of protocol in OAuth 1.0a. Both specs too quickly reduce to an exercise in swim-lane diagram detail, which ironically runs counter to the movement towards simplicity and accessibility that drives today’s Web. And therein lies the rub. OAuth is more a victim of poor marketing than bad specsmanship. I have yet to see a good, simple explanation of why, followed by how. (I don’t think OAuth 1.0 was well served by the valet key analogy, which distracts from too many important insights.) As it stands today, OAuth 2.0 makes the Kerberos specs seem like grade school primer material.

It doesn’t have to be this way. OAuth is actually deceptively simple; it is the mechanics that remain potentially complex (particularly those of the classic 1.0a, three-legged scenario). But the same can be said of SSL/TLS, which we all use daily with few problems. What OAuth needs is a set of dead simple (but nonetheless solid) libraries on the client side and equally simple, scalable support on the server. This is a tractable problem and it is coming. It also needs much better interpretation, so that people can understand it fast.
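
To give a sense of what “dead simple” client-side support can look like, here is a sketch using the Python requests-oauthlib package against a purely hypothetical provider; the client ID, secret, endpoints and scope are all placeholders:

```python
from requests_oauthlib import OAuth2Session  # pip install requests-oauthlib

CLIENT_ID = "example-client"                                    # hypothetical
CLIENT_SECRET = "example-secret"                                # hypothetical
AUTHORIZE_URL = "https://provider.example.com/oauth/authorize"  # hypothetical
TOKEN_URL = "https://provider.example.com/oauth/token"          # hypothetical

# The classic three-legged, authorization-code dance in a handful of lines.
oauth = OAuth2Session(CLIENT_ID,
                      redirect_uri="https://app.example.com/callback",
                      scope=["read"])
authorization_url, state = oauth.authorization_url(AUTHORIZE_URL)
print("Please authorize here:", authorization_url)

# After the user consents, the provider redirects back with a code in the URL.
redirect_response = input("Paste the full redirect URL: ")
oauth.fetch_token(TOKEN_URL,
                  client_secret=CLIENT_SECRET,
                  authorization_response=redirect_response)

# From here, calls are ordinary HTTP; the library attaches the token for you.
print(oauth.get("https://api.provider.example.com/me").json())
```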

Personally, I agree in part with Eran Hammer’s wish buried in the conclusion of his blog entry:

“I’m hoping someone will take 2.0 and produce a 10-page profile that’s useful for the vast majority of Web providers, ignoring the enterprise.”

OAuth absolutely does need simple profiling for interop. But don’t ignore the enterprise. The enterprise really needs the profile too because the enterprise badly needs OAuth.

July 26th, 2012

Programming in the Cloud

Quite a bit has been written about how the Cloud is altering the landscape for platform, software and infrastructure providers but not as much has been said about what all this means for developers. I recently decided to find out for myself by going on an “all-cloud diet”. In practical terms, this meant I used a sealed netbook or smartphone to do all my work.

Therefore, I had to do all the things an active developer regularly has to do (coding, debugging, testing, etc.) from a device that has no appreciable hard-drive space and does not allow the installation of any custom software. In essence, I was on a strict diet of browser-based and plug-in-based tools and services reachable via an Internet connection.

In relatively short order, I was able to find browser-based editors (even ones that support line-by-line server-side debugging!), as well as tools for managing data stores and code repositories. Furthermore, I was able to post test scripts for execution/review and even deploy my projects to a wide range of server providers – all from my browser.

Along the way, I discovered that I had an easier time collaborating online with colleagues in other locations and was better able to take advantage of the most recent releases of new services and tools (since there was no “install” or “update” I had to manage). And – of course – I was more mobile in the process.

Not all programming languages, runtime environments and server profiles are represented in the cloud. And there are still many details to work out in order to make assembling a full-featured “cloud tool chain” easy, reliable and cost effective. Nevertheless, I can see that a fully cloud-based workflow is possible and I have met people who are working to make that possibility a reality.

My advice to developers would be: Conduct your own experiments; try out your own “cloud-only diet” and see what you learn. Even if you decide that not all the pieces you need are available, you may still discover there are ways to leverage cloud-based tooling to reduce barriers, add flexibility and increase productivity in various aspects of your development efforts.

I’ll be exploring these issues in greater depth when I present a talk titled Programming with the OSS “Cloud Stack” at the CloudDevelop show in Columbus, OH on August 3.

July 20th, 2012

Returning from #CIS2012

Cloud Identity Summit was definitely worth the trip. The talks were great, the audience was great and the venue was outstanding. Sign me up for next year in Napa!

It’s beautiful and quiet at Vail Cascade this morning. When I stepped outside, I’m pretty sure I saw SAML scurrying away into the trees. This is weird, given this week’s proclamations that SAML was dead. Although we won’t be rid of SAML anytime soon, I do look forward to enterprise adoption of the new kid on the block: OpenID Connect. Easier federation, OpenID Connect-style, is already common among consumer identity providers; enterprise identity providers should take note and follow suit. As a vendor of API Management infrastructure, it’s up to us to enable the enterprise to better reach out to its target audience. I see support for OpenID Connect as a key component in achieving this today.

My favorite proclamation of the week goes to Patrick Harding, who declared in his talk titled “The Platformication of the Enterprise is Upon us Again and They Forgot Security (Again)” that API tokens are going to be “the currency of the API economy”. The management of tokens and their lifecycle is indeed a crucial component of API Management. Consider the case of a mobile application consuming an enterprise API using an OAuth token. Such tokens are associated with the API provider, the user (subscriber), the mobile application and the mobile device. Each live token is potentially associated with multiple parties, and one of the challenges of API token management is enabling control of the right tokens by the right parties.
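
To illustrate the kind of bookkeeping this implies, here is a minimal, entirely hypothetical sketch of a token record that ties each live token to its provider, subscriber, application and device, along with the sort of revocation hooks those different parties might need:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ApiToken:
    value: str
    api_provider: str   # which API the token is good for
    subscriber: str     # the end user who granted access
    application: str    # the mobile app acting on the user's behalf
    device_id: str      # the handset the app is installed on
    scopes: tuple
    expires_at: datetime
    revoked: bool = False

TOKENS = []  # stand-in for a real token store (e.g. a database table)

def revoke_for_device(device_id):
    """Subscriber reports a lost phone: kill only that device's tokens."""
    hits = [t for t in TOKENS if t.device_id == device_id and not t.revoked]
    for t in hits:
        t.revoked = True
    return len(hits)

def revoke_for_application(application):
    """Provider pulls a misbehaving app: kill every token that app holds."""
    hits = [t for t in TOKENS if t.application == application and not t.revoked]
    for t in hits:
        t.revoked = True
    return len(hits)
```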

July 19th, 2012

Hypermedia APIs – Tech Talk Tuesday July 24 Featuring Mike Amundsen

Our recent Tech Talk discussing OpenID Connect was great. We had some questions sent in ahead of time via email, lots of live questions through the stream and some great questions through our Twitter hashtag #Layer7Live. We’re going to pick up on the momentum of that last Tech Talk and continue with our next interactive, API-focused discussion on July 24 at 9am PDT.

We’re very excited to be welcoming Mike Amundsen, Layer 7’s Principal API Architect, back to the Tech Talk studio. He’s ready to take on questions and discuss hypermedia APIs, a subject he literally wrote the book on.

  • What makes a hypermedia API different from other API types?
  • How is designing one different from designing any other form of API?
  • What are the benefits or complications for the publisher and the developer?

These are just a few of the questions that arise when thinking about designing hypermedia APIs. Now’s the time to get your thinking caps on and start formulating the questions you want to ask Mike on his specialist subject.
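
If you are new to the subject, the core idea is that a hypermedia API response carries the links and actions a client can follow next, rather than leaving clients to hard-code URLs. A toy, entirely hypothetical representation (real designs typically use a registered media type such as HAL, Collection+JSON or Siren):

```python
import json

# A toy order resource whose representation tells the client what it can do next.
order = {
    "id": "1234",
    "status": "open",
    "total": 42.50,
    "_links": {
        "self":    {"href": "/orders/1234"},
        "payment": {"href": "/orders/1234/payment", "method": "POST"},
        "cancel":  {"href": "/orders/1234", "method": "DELETE"},
    },
}

def available_actions(representation):
    """A generic client discovers its options from the response, not from docs."""
    return list(representation["_links"].keys())

print(json.dumps(order, indent=2))
print(available_actions(order))   # ['self', 'payment', 'cancel']
```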

Make sure you click Add to Calendar to get the full event details and a reminder on the day.

On the day of the event, join on Livestream or Facebook:
»  livestream.com/layer7live
»  facebook.com/layer7

Submit your questions:

July 17th, 2012

Developer Management & the Layer 7 API Portal 2.1

As Layer 7’s CTO, Scott Morrison, recently stated: API Management = Developer Management. Okay, there are actually many elements to API Management – securing APIs, enforcing rate limits and SLAs, translating protocols and so forth. But if developers can’t make use of your APIs, then your APIs aren’t going to do you much good. So, providing a place where developers can discover, register for, learn about and leverage your APIs is – in many ways – the key to a truly effective API Management strategy.

That’s why the Layer 7 API Portal – which is designed to help organizations onboard, educate and manage developers – is one of the cornerstones of our API Management Suite.

The world of Web, mobile and cloud API publishing is growing and changing at an incredible rate right now, so we’re constantly working hard to expand and refine our line of API-focused products. With all that in mind – and hot on the heels of our SecureSpan Mobile Access Gateway – we’re very excited to announce version 2.1 of the Portal.

With the developer management needs of API publishers constantly evolving, we’ve added a range of new functionality to the Portal, including:

  • Advanced analytic reports
  • More granular privacy controls
  • Enhanced lifecycle management features
  • New customization options for the content management system

We’re exhibiting at Mobile+Web DevCon in San Francisco this week. If you’re at the show and you’d like to learn more about this new API Portal release, please stop by our booth.