Scott Morrison

Scott Morrison is the Chief Technology Officer at Layer 7 Technologies, providing the visionary innovation and technical direction for the company. He has extensive technical and scientific experience in a number of industries and universities, including senior architect positions at IBM. He is one of the four co-editors for the WS-I Basic Security Profile. Scott is a much sought-after author and speaker. He has published over 50 book chapters, magazine articles, and papers in medical, physics and engineering journals.

October 25th, 2012

The iPad Mini is for Cars


On Tuesday, Apple launched the iPad mini. Apple events in the fall of 2012 may no longer command the social anticipation they did just a few years ago but they remain flash points for technology reporting. This release brought on more than its share of speculation that the mini is simply an overdue acknowledgement that Amazon got something right with the Kindle and that Apple has quietly slipped into following mode. Some pundits have seized on the angle that Apple’s new tablet appeared to contradict Steve Jobs’ famous trashing of the 7″ form factor. But in all of the hullabaloo, one observation seems to be missing: a tablet of this size is tailor-made for inclusion in the dashboard of your car.

Nothing dates a car like its electronics. And nothing is more tragic than the user experience of pretty much every single in-car navigation and music system. The luxury car segment can do Corinthian leather and wood grain appointments like no industry on earth. They can build a magnificent driving machine that powers through rain and snow like it was a sunny day in LA. But ask them to do a screen-based app and you get something that looks like it was designed on a TRS-80.

I didn’t renew the trial SiriusXM in my 4Runner because I couldn’t stand its programming compared with what I could stream from my iPhone using Bluetooth. Every time I rent a car, I use my phone-based Navigon app over any provided GPS because my app is just better. I’m hooked on Waze, despite how few people use it up here in Vancouver (you should sign up — the more people who use it, the better the traffic data is). The apps on my phone are always up-to-date and I replace the hardware every couple of years for the latest model (which is good enough for me; after all, it’s only a phone).

All cars need is a standard, lockable frame where you can plug in the device of your choice, plus a standardized connector. Then let free market competition and innovation prevail over apps. Tomorrow’s gear heads aren’t going to be like the hot rodders of my Dad’s generation or the tuner kids of a decade ago. They are going to be geeks with apps using APIs.

That’s what the iPad mini is for.

(It’s interesting to note that the Wi-Fi-only mini does not have GPS but the cellular version does…)

July 30th, 2012

Why I Still Like OAuth


That sound of a door slamming last week was Eran Hammer storming out of the OAuth standardization process, declaring once and for all that the technology was dead and that he would no longer be a part of it. Tantrums and controversy make great social media copy, so it didn’t take long before everyone seemed to be talking about this one. In some quarters, you’d hardly know the London Olympics had begun.

So what are we to really make of all this? Is OAuth dead or at least on “the road to Hell”, as Eran now-famously put it? Certainly, my inbox is full of emails from people asking if they should stop building their security architecture around such a tainted specification.

I think Tim Bray, who has vast experience with the relative ups and downs of technology standardization, offered the best answer in his own blog:

“It’s done. Stick a fork in it. Ship the RFCs.”

Which is to say, sometimes you just have to declare a reasonable victory and deal with the consequences later. OAuth isn’t perfect, nor is it easy. But it’s needed and it’s needed now, so let’s all forget the personality politics and just get it done. And hopefully, right across the street from me here in Vancouver, where the IETF is holding its meetings all this week, this is what will happen.

In the end, OAuth is something we all need and this is why this specification remains important. The genius of OAuth is that it empowers people to perform delegated authorization on their own, without the involvement of a cabal of security admins. And this is something that is really quite profound.

In the past, we’ve been shackled by the centralization of control around identity and entitlements (a fancy term which really just describes the set of actions your identity is allowed to perform, such as writing to a particular file system). This has led to a status quo in nearly every organization that is maintained first because it is hard to do otherwise but also because centralized control equals power, which is something that is rarely surrendered without a fight.

The problem is that centralized identity admin can never effectively scale, at least from an administrative perspective. With OAuth, we can finally scale authentication and authorization by leveraging the user population itself — and this is the one thing that stands a chance of shattering the monopoly on centralized identity and access management (IAM). OAuth undermined the castle and the real noise we are hearing isn’t infighting on the spec but the enterprise walls falling down.

Here is the important insight of OAuth 2.0: delegated authorization also solves the basic security sessioning problem that besets all apps running over stateless protocols like HTTP. Think about this for a minute: the basic Web architecture provides for complete authentication on every transaction. This is dumb, so we have come up with all sorts of security context tracking mechanisms, using cookies, proprietary tokens, etc. The problem with many of these is that they don’t constrain entitlements at all; a cookie is as good as a password because it really just linearly maps back to an original act of authentication.
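
To make that concrete, here is a minimal sketch of the classic session-cookie pattern (all names and logic hypothetical), in which holding the cookie is effectively as good as holding the password:

```python
import secrets

# Hypothetical in-memory session store: cookie value -> user name.
sessions = {}

def login(username, password):
    # (Password check elided.) One act of authentication...
    cookie = secrets.token_hex(16)
    sessions[cookie] = username
    return cookie

def handle_request(cookie):
    user = sessions.get(cookie)
    if user is None:
        raise PermissionError("not authenticated")
    # ...and from then on, whoever presents the cookie gets the user's
    # FULL entitlements, with no expiry and no scoping.
    return "acting with all privileges of " + user
```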

OAuth formalizes this process but adds in the idea of constraint with informed user consent. And this, ladies and gentlemen, is why OAuth matters. In OAuth, you exchange a password (or other primary security token) for a time-bound access token with a limited set of capabilities to which you have explicitly agreed. In other words, the token expires fast and is good for one thing only. So you can pass it off to something else (like Twitter) and reduce your risk profile or — and this is the key insight of OAuth 2.0 — you can just use it yourself as a better security session tracker.
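
In practice, the exchange looks something like the following sketch. The endpoint, credentials and scope are placeholders, and the resource owner password grant shown here is just one of several OAuth 2.0 flows; treat it as an illustration, not a reference implementation:

```python
import requests

# Hypothetical token endpoint: trade a password for a short-lived,
# narrowly scoped access token.
resp = requests.post(
    "https://api.example.com/oauth/token",
    data={
        "grant_type": "password",
        "username": "alice",
        "password": "s3cret",
        "scope": "read:status",   # good for one thing only
        "client_id": "my-app",
    },
)
token = resp.json()  # e.g. {"access_token": "...", "expires_in": 3600, ...}

# The password never travels again; the token expires fast and is limited
# to the scope the user explicitly agreed to.
requests.get(
    "https://api.example.com/status",
    headers={"Authorization": "Bearer " + token["access_token"]},
)
```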

The problem with OAuth 2.0 is that it’s surprisingly hard to get to this simple idea from the explosion of protocol in OAuth 1.0a. Both specs too quickly reduce to an exercise in swim lane diagram detail, which ironically runs counter to the movement toward simplicity and accessibility that drives today’s Web. And therein lies the rub. OAuth is more a victim of poor marketing than bad specsmanship. I have yet to see a good, simple explanation of why, followed by how. (I don’t think OAuth 1.0 was well served by the valet key analogy, which distracts from too many important insights.) As it stands today, OAuth 2.0 makes the Kerberos specs seem like grade school primer material.

It doesn’t have to be this way. OAuth is actually deceptively simple; it is the mechanics that remain potentially complex (particularly those of the classic 1.0a, three-legged scenario). But the same can be said of SSL/TLS, which we all use daily with few problems. What OAuth needs is a set of dead simple (but nonetheless solid) libraries on the client side and equally simple, scalable support on the server. This is a tractable problem and it is coming. It also needs much better interpretation, so that people can understand it fast.

Personally, I agree in part with Eran Hammer’s wish buried in the conclusion of his blog entry:

“I’m hoping someone will take 2.0 and produce a 10-page profile that’s useful for the vast majority of Web providers, ignoring the enterprise.”

OAuth absolutely does need simple profiling for interop. But don’t ignore the enterprise. The enterprise really needs the profile too because the enterprise badly needs OAuth.

July 10th, 2012

Hey Twitter: API Management = Developer Management

Quick question for you: What matters most, the client or the server?

Answer: Neither — they are really only useful as a whole. A client without a server is usually little more than a non-functional wireframe and a server without a client is simply unrealized potential. Bring them together, though, and you have something of lasting value. So neither matters more and each actually matters a lot less than half.

In the API world, this is an easy point to miss. The server side always wields disproportionate power by virtue of controlling the API to its services and this can easily foster an arrogance about the server’s place in the world. This effect is nicely illustrated by Twitter’s recent missteps around developer management.

The problems for Twitter all began with a blog entry. Blogs are the mouthpiece of the platform. Tucked away within an interesting entry about Twitter Cards and the potential to run applications within tweets (something that is genuinely exciting) is a restatement of an earlier warning to developers:

“(D)evelopers should not ‘build client apps that mimic or reproduce the mainstream Twitter consumer client experience.’”

Ominous stuff indeed. This was quickly picked up on by Nick Bilton writing in the New York Times Bits blog, who pointed out that the real problem is that Twitter just isn’t very good at writing client-side apps that leverage its own API. Stifling competition by leveraging the API power card can only alienate developers — and by extension the public, who are left with a single vendor solution. Suddenly, it feels like the 1980s all over again.

This ignited a firestorm of concern that was well summarized by Adam Green on ProgrammableWeb. Green acknowledged that API change is inevitable but pointed out that this is something that can be managed effectively — which is not what Twitter is doing right now.

The irony of the whole thing is that, in the past, Twitter has actually made great contributions to the API community by exercising its power position. In mid-2010, Twitter cut off basic authentication to its APIs in favor of OAuth, a drop-dead event that became known as the OAuthcalypse. Hyperbole aside, in terms of actual impact on the populace, this cutover made even Y2K look like the end of days. Given a tractable challenge, developers cope, which is really Green’s point.
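
For developers, the cutover looked roughly like the sketch below. The endpoint and all credentials are placeholders, and the requests-oauthlib library simply stands in for whichever OAuth client you prefer:

```python
import requests
from requests_oauthlib import OAuth1

# Placeholder endpoint and credentials, purely for illustration.
url = "https://api.twitter.com/1.1/statuses/home_timeline.json"

# Before mid-2010: HTTP Basic authentication, with the user's actual
# password riding along on every single API call.
requests.get(url, auth=("alice", "s3cret"))  # refused after the OAuthcalypse

# After: an OAuth 1.0a signed request; the password never leaves home.
auth = OAuth1("CONSUMER_KEY", "CONSUMER_SECRET", "ACCESS_TOKEN", "TOKEN_SECRET")
requests.get(url, auth=auth)
```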

What is important to realize is that API Management isn’t technical but social. Win the community over and they will move mountains. Piss them off and they will leave in droves for the next paying gig.

The thing I always remind people is that, as a trend, APIs are not about technology; they are a strategy. Truth is, the technology is pretty easy — and that’s the real secret to APIs’ success. You see, the communications are never the thing; the app is the thing (and that is what WS-* missed). Maintaining simplicity and a low barrier to entry counts for everything because it means you can get on with building real apps.

Now, I can give you the very best infrastructure and tools to facilitate API community. But how you manage this community… Well, that is where the real work begins and — in the end — it’s all a lot less deterministic than we technologists like to admit. People are hard to manage but communities are even harder.

If there is a lesson here, it is that APIs are really about potential and that potential can only be realized when you have two sides — client and server — fully engaged. Mess this one up and you’re left with just a bunch of unused interfaces.

June 7th, 2012

Platform Comes to Washington

Everyone wants his or her government to be better. We want more services, better services and we want them delivered cheaper. Politicians come and go, policies change, new budgets are tabled but in the end we are left with a haunting and largely unanswerable question: Are things better or worse than they were before?

One thing that is encouraging and has the potential to trigger disruptive change in the delivery of government services in the US is the recent publication Digital Government: Building a 21st-Century Platform to Better Serve the American People. The word to note here is platform – it seems that government has taken a page from Facebook, Twitter and the others and embraced the idea that efficient information delivery is not about a carefully rendered Web page but instead is really a logical consequence of developing an open platform.

I confess to some dread on my first encounter with this report. These publications are usually disheartening products of weaselly management consultant speak refined through the cloudy lens of a professional bureaucrat (“we will be more agile”). But in this instance, the reverse was true: this report is accessible and surprisingly insightful. The authors understand that Mobile+Cloud+Web API+decentralized identity is an equation of highly interrelated parts that, in summation, is the catalyst for the new Internet renaissance. The work is not without its platitudes but even these it bolsters with a pragmatic road map identifying actions, parties responsible and (gasp) even deadlines. It’s actually better than most business plans I’ve read.

Consider this paragraph clarifying just what the report means when it calls for an information-centric approach to architecture:

An information-centric approach decouples information from its presentation. It means beginning with the data or content, describing that information clearly, and then exposing it to other computers in a machine-readable format—commonly known as providing web APIs. In describing the information, we need to ensure it has sound taxonomy (making it searchable) and adequate metadata (making it authoritative). Once the structure of the information is sound, various mechanisms can be built to present it to customers (e.g., websites, mobile applications, and internal tools) or raw data can be released directly to developers and entrepreneurs outside the organization. This approach to opening data and content means organizations can consume the same web APIs to conduct their day-to-day business and operations as they do to provide services to their customers.

See what I mean? It’s well done.
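
In code, the idea in that paragraph reduces to something like this hypothetical sketch (Flask, with made-up data): the information is described once, exposed through a web API, and the organization’s own website becomes just another client of it.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# The information, described once, with simple metadata (the "searchable,
# authoritative" part of the report's prescription). All data is made up.
DATASET = {
    "title": "Average commute times",
    "source": "Example Transit Agency",
    "updated": "2012-06-01",
    "records": [
        {"city": "Seattle", "minutes": 27},
        {"city": "Vancouver", "minutes": 25},
    ],
}

# Machine-readable web API: raw data for developers and entrepreneurs.
@app.route("/api/commutes")
def api_commutes():
    return jsonify(DATASET)

# The organization's own website is just another consumer of the same data;
# presentation is layered on top, not baked in.
@app.route("/commutes")
def web_commutes():
    rows = "".join(
        "<li>%s: %d min</li>" % (r["city"], r["minutes"])
        for r in DATASET["records"]
    )
    return "<h1>%s</h1><ul>%s</ul>" % (DATASET["title"], rows)

if __name__ == "__main__":
    app.run()
```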

The overall goal is to outline an information delivery strategy that is fundamentally device agnostic. Its authors fully recognize the growing importance of mobility and concede that mobility means much more than the mobile platforms — iOS and Android, among others — that have commandeered the word today. Tomorrow’s mobility will describe a significant shift in the interaction pattern between producers and consumers of information; it is not a snapshot of any single technological moment, least of all today’s.

But what really distinguishes this report from being just a well-researched paper echoing the zeitgeist of computing’s cool kids is how prescriptive it is in declaring how government will achieve these goals. The demand that agencies adopt Web APIs is a move that echoes Jeff Bezos’ directive of a decade ago within Amazon (as relayed in Steve Yegge’s now infamous rant):

  1. All teams will henceforth expose their data and functionality through service interfaces.

It was visionary advice then and it is even more valid now. It recognizes that the commercial successes attributed to the Web API approach suggest that just maybe we have finally hit upon a truth in how system integration should occur.

Of course, memos are easy to ignore — unless they demand concrete actions within limited time frames. Here, the time frames are aggressive (and that’s a good thing). Within six months, the Office of Management & Budget must “Issue government-wide open data, content, and web API policy and identify standards and best practices for improved interoperability.” Within 12 months, each government agency must “Ensure all new IT systems follow the open data, content, and web API policy and operationalize agency gov/developer pages” and also “optimize at least two existing priority customer-facing services for mobile use and publish a plan for improving additional existing services.”

If the recent allegations regarding the origins of the Stuxnet worm are accurate, then the President clearly understands the strategic potential of the modern Internet. I would say this report is a sign his administration also clearly understands the transformational potential of APIs and mobility, when applied to government.

May 15th, 2012

APIs, Cloud & Identity Tour 2012: Three Cities, Two Talks, Two Panels & a Catalyst

On May 15-16, 2012, I will be at the Privacy Identity Innovation (pii2012) conference, held at the Bell Harbor International Conference Center in Seattle. I will be participating in a panel moderated by Eve Maler from Forrester Research, Inc., titled Privacy, Zero Trust & the API Economy. It will take place at 2:55pm on Tuesday, May 15:

“The Facebook Connect model is real, it’s powerful and now it’s everywhere. Large volumes of accurate information about individuals can now flow easily through user-authorized API calls. Zero Trust requires initial perfect distrust between disparate networked systems but are we encouraging users to add back too much trust, too readily? What are the ways this new model can be used for ‘good’ and ‘evil’ and how can we mitigate the risks?”

On Thursday May 17 at 9am PDT, I will be delivering a webinar on API identity technologies, once again with Eve Maler from Forrester. We are going to talk about the idea of zero trust with APIs, an important stance to adopt as we approach what Eve often calls “the coming identity singularity” – that is, the time when identity technologies and standards will finally line up with real and immediate need in the industry. Here is the abstract for this webinar:

Identity, Access & Privacy in the New Hybrid Enterprise: Making Sense of OAuth, OpenID Connect & UMA
“In the new hybrid enterprise, organizations need to manage business functions that flow across their domain boundaries in all directions: partners accessing internal applications; employees using mobile devices; internal developers mashing up Cloud services; internal business owners working with third-party app developers.

Integration increasingly happens via APIs and native apps, not browsers. Zero trust is the new starting point for security and access control and it demands Internet scale and technical simplicity – requirements the go-to Web services solutions of the past decade, like SAML and WS-Trust, struggle to meet.

This webinar from Layer 7 Technologies, featuring special guest Eve Maler of Forrester Research, Inc., will:

  • Discuss emerging trends for access control inside the enterprise
  • Provide a blueprint for understanding adoption considerations

You will learn:

  • Why access control is evolving to support mobile, Cloud and API-based interactions
  • How the new standards (OAuth, OpenID Connect and UMA) compare to technologies like SAML
  • How to implement OAuth and OpenID Connect, based on case study examples”

You can sign up for this webinar at the Layer 7 Technologies Web site.

Next week, I’m off to Dublin to participate in TMForum Management World 2012. I wrote earlier about the defense catalyst Layer 7 is participating in, which explores the problem of how to manage Clouds in the face of developing physical threats. If you are at the show, you must drop by the Forumville section on the show floor and have a look. The project results are very encouraging.

I’m also doing a presentation and participating in a panel. The presentation title is API Management: What Defense & Service Providers Need to Know. Here is the abstract:

“APIs promise to revolutionize the integration of mobile devices, on-premise computing and the Cloud. They are the secret sauce that allows developers to bring any systems together quickly and efficiently. Within a few years, every service provider will need a dedicated API group responsible for management, promotion and even monetization of this important new channel to market. And in the defense arena, where agile integration is an absolute necessity, APIs cannot be overlooked.

In this talk, you will learn:

  • Why APIs are revolutionizing Internet communications
  • Why this is an important opportunity for you
  • How you can successfully manage an API program
  • Why developer outreach matters
  • What tools and technologies you must put in place”

This talk will take place at the Dublin Conference Centre on Wednesday May 23 at 11:30am.

The panel, organized by my friend Nava Levy from Cvidya, is titled Cloud Adoption – Resolving the Trust vs. Uptake paradox: Understanding & Addressing Customers’ Security & Data Portability Concerns to Drive Uptake.

Here is the panel abstract:

“As Cloud services continue to grow five times faster than traditional IT, it seems that concerns regarding security and data portability are also on the rise. In this session, we will explain the roots of this paradox and the opportunities that arise from resolving these trust issues. By examining the different approaches Cloud providers utilize to address these issues, we will see how service providers, by properly understanding and addressing these concerns, can turn trust into a competitive advantage against the many Cloud providers who don’t have carrier-grade trust as one of their core competencies. We will see that, by addressing fraud, security, data portability and governance risks head-on, not only will the uptake of Cloud services rise to include mainstream customers and conservative verticals but also the type of data and processes that will migrate to the Cloud will become more critical to the customers.”

The panel is on Thursday May 24 at 9:50am.