July 26th, 2012

Programming in the Cloud

Quite a bit has been written about how the Cloud is altering the landscape for platform, software and infrastructure providers, but not as much has been said about what all this means for developers. I recently decided to find out for myself by going on an “all-cloud diet”. In practical terms, this meant I used a sealed netbook or smartphone to do all my work.

Therefore, I had to do all the things an active developer regularly has to do (coding, debugging, testing, etc.) from a device that has no appreciable hard-drive space and does not allow the installation of any custom software. In essence, I was on a strict diet of browser-based and plug-in-based tools and services reachable via an Internet connection.

In relatively short order I was able to find browser-based editors (even ones that support line-by-line server-side debugging!), as well as tools for managing data stores and code repositories. Furthermore, I was able to post test scripts for execution and review, and even deploy my projects to a wide range of hosting providers – all from my browser.

Along the way, I discovered that I had an easier time collaborating online with colleagues in other locations and was better able to take advantage of the most recent releases of new services and tools (since there was no “install” or “update” I had to manage). And – of course – I was more mobile in the process.

Not all programming languages, runtime environments and server profiles are represented in the cloud. And there are still many details to work out in order to make assembling a full-featured “cloud tool chain” easy, reliable and cost effective. Nevertheless, I can see that a complete cloud-based workflow is possible, and I have met people who are working to make that possibility a reality.

My advice to developers would be: Conduct your own experiments; try out your own “cloud-only diet” and see what you learn. Even if you decide that not all the pieces you need are available, you may still discover there are ways to leverage cloud-based tooling to reduce barriers, add flexibility and increase productivity in various aspects of your development efforts.

I’ll be exploring these issues in greater depth when I present a talk titled Programming with the OSS “Cloud Stack” at the CloudDevelop show in Columbus, OH on August 3.

July 20th, 2012

Returning from #CIS2012

Cloud Identity Summit was definitely worth the trip. The talks were great, the audience was great and the venue was outstanding. Sign me up for next year in Napa!

It’s beautiful and quiet at Vail Cascade this morning. As I stepped outside, I’m pretty sure I saw SAML scurrying away into the trees. This is weird given this week’s proclamations that SAML was dead. Although we won’t be rid of SAML anytime soon, I do look forward to enterprise adoption of the new kid on the block: OpenID Connect. Easier federation, OpenID Connect-style, is already common for consumer identity providers; enterprise identity providers should take note and follow suit. As a vendor of API Management infrastructure, we see it as our job to enable the enterprise to better reach out to its target audience. I see support for OpenID Connect as a key component in achieving this today.

My favorite proclamation of the week came from Patrick Harding, who declared in his talk titled “The Platformication of the Enterprise is Upon us Again and They Forgot Security (Again)” that API tokens are going to be “the currency of the API economy”. The management of tokens and their lifecycle is indeed a crucial component of API Management. Consider the case of a mobile application consuming an enterprise API using an OAuth token. Such tokens are associated with the API provider, the user (subscriber), the mobile application and the mobile device. Each live token is potentially associated with multiple parties, and one of the challenges of API token management is to enable control of the right tokens by the right parties.
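
To make that concrete, here is a rough sketch – my own illustrative Python, with hypothetical names and fields, not any particular product’s API – of a token record that remembers every party it touches, so that any one of them (the provider, the subscriber, the application or the device) can drive revocation:

    # Illustrative sketch: an access token that tracks the parties associated with it,
    # so lifecycle operations like revocation can be addressed by party, not just by token.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ApiToken:
        token_id: str
        api_provider: str   # the enterprise API the token grants access to
        subscriber: str     # the end user who authorized the grant
        application: str    # the mobile app acting on the user's behalf
        device_id: str      # the handset the app is installed on
        revoked: bool = False

    class TokenStore:
        def __init__(self) -> None:
            self._tokens: Dict[str, ApiToken] = {}

        def issue(self, token: ApiToken) -> None:
            self._tokens[token.token_id] = token

        def revoke_by_party(self, party_field: str, party_value: str) -> List[str]:
            """Revoke every live token tied to a given party, e.g. every token
            on a lost device or every token issued to a retired application."""
            revoked = []
            for token in self._tokens.values():
                if getattr(token, party_field) == party_value and not token.revoked:
                    token.revoked = True
                    revoked.append(token.token_id)
            return revoked

    # Example: a subscriber reports a stolen phone; the provider revokes everything on it.
    store = TokenStore()
    store.issue(ApiToken("t-123", "payments-api", "alice", "acme-mobile", "device-42"))
    print(store.revoke_by_party("device_id", "device-42"))  # -> ['t-123']

The point of the sketch is simply that each of those parties needs a handle on the right subset of live tokens.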

June 7th, 2012

Platform Comes to Washington

Everyone wants his or her government to be better. We want more services, better services and we want them delivered more cheaply. Politicians come and go, policies change, new budgets are tabled, but in the end we are left with a haunting and largely unanswerable question: Are things better or worse than they were before?

One thing that is encouraging and has the potential to trigger disruptive change in the delivery of government services in the US is the recent publication Digital Government: Building a 21st-Century Platform to Better Serve the American People. The word to note here is platform – it seems that government has taken a page from Facebook, Twitter and the others and embraced the idea that efficient information delivery is not about a carefully rendered Web page but is instead a logical consequence of developing an open platform.

I confess to some dread on my first encounter with this report. These publications are usually disheartening products of weaselly management consultant speak refined through the cloudy lens of a professional bureaucrat (“we will be more agile”). But in this instance, the reverse was true: this report is accessible and surprisingly insightful. The authors understand that Mobile+Cloud+Web API+decentralized identity is an equation of highly interrelated parts that, in summation, is the catalyst for the new Internet renaissance. The work is not without its platitudes but even these it bolsters with a pragmatic road map identifying actions, parties responsible and (gasp) even deadlines. It’s actually better than most business plans I’ve read.

Consider this paragraph clarifying just what the report means when it calls for an information-centric approach to architecture:

An information-centric approach decouples information from its presentation. It means beginning with the data or content, describing that information clearly, and then exposing it to other computers in a machine-readable format—commonly known as providing web APIs. In describing the information, we need to ensure it has sound taxonomy (making it searchable) and adequate metadata (making it authoritative). Once the structure of the information is sound, various mechanisms can be built to present it to customers (e.g., websites, mobile applications, and internal tools) or raw data can be released directly to developers and entrepreneurs outside the organization. This approach to opening data and content means organizations can consume the same web APIs to conduct their day-to-day business and operations as they do to provide services to their customers.

See what I mean? It’s well done.
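
To make the pattern tangible, here is a minimal sketch – hypothetical data and endpoint names of my own invention, not anything taken from the report – of what “data first, presentation later” looks like as a web API: the content is described once, carries its taxonomy and metadata, and is served in a machine-readable form that an agency’s own website, its mobile apps and outside developers can all consume.

    # Illustrative sketch of an information-centric web API: describe the data once,
    # then let any presentation layer (or outside developer) consume it.
    from flask import Flask, jsonify

    app = Flask(__name__)

    # The content itself, with taxonomy terms (searchable) and metadata (authoritative),
    # defined independently of any web page that might later display it.
    FACILITIES = [
        {
            "id": "fac-001",
            "name": "Downtown Passport Office",
            "category": ["services", "passports"],                            # taxonomy
            "metadata": {"source": "Dept. of State", "updated": "2012-05-01"}, # provenance
            "hours": "Mon-Fri 09:00-17:00",
        },
    ]

    @app.route("/api/facilities")
    def list_facilities():
        # The same machine-readable endpoint serves the agency's website,
        # its mobile applications and external developers alike.
        return jsonify({"facilities": FACILITIES})

    if __name__ == "__main__":
        app.run()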

The overall goal is to outline an information delivery strategy that is fundamentally device agnostic. Its authors fully recognize the growing importance of mobility and concede that mobility means much more than the mobile platforms — iOS and Android, among others — that have commandeered the word today. Mobility is not a snapshot of any particular technological moment, least of all today’s; tomorrow it will describe a significant shift in the interaction pattern between producers and consumers of information.

But what really distinguishes this report from being just a well-researched paper echoing the zeitgeist of computing’s cool kids is how prescriptive it is in declaring how government will achieve these goals. The demand that agencies adopt Web APIs is a move that echoes the directive Jeff Bezos issued within Amazon a decade ago (as relayed in Steve Yegge’s now infamous rant):

  1. All teams will henceforth expose their data and functionality through service interfaces.

It was visionary advice then and it is even more valid now. The commercial successes attributed to the Web API approach suggest that just maybe we have finally hit upon a truth about how system integration should occur.

Of course, memos are easy to ignore — unless they demand concrete actions within limited time frames. Here, the time frames are aggressive (and that’s a good thing). Within six months, the Office of Management & Budget must “Issue government-wide open data, content, and web API policy and identify standards and best practices for improved interoperability.” Within 12 months, each government agency must “Ensure all new IT systems follow the open data, content, and web API policy and operationalize agency.gov/developer pages” and also “optimize at least two existing priority customer-facing services for mobile use and publish a plan for improving additional existing services.”

If the recent allegations regarding the origins of the Stuxnet worm are accurate, then the President clearly understands the strategic potential of the modern Internet. I would say this report is a sign his administration also clearly understands the transformational potential of APIs and mobility, when applied to government.

June 6th, 2012

Start Spreading the News… Cloud Expo, New York

Cloud Expo 2012 is almost here. This promises to be an incredible event, with thousands of attendees and over 100 speakers. As previously mentioned, I’m privileged to be presenting on Making Hybrid Cloud Safe & Reliable. I’m particularly excited that I’ll be introducing attendees to the new concept of API-Aware Traffic Management. It will also be great to be back in New York City!

I recently read Daniel Kahneman’s book Thinking, Fast and Slow, a fascinating study of how the human mind works. With the new capabilities offered by big data and Cloud computing — the dual themes for next week’s event — and the increasing personalization of technology through Mobile devices, I think we have an opportunity to make our digital systems more human in their processing. What does that mean? Well, more intuitive in user experience, more lateral through caching of unstructured data and more adaptive to changing conditions. API-Aware Traffic Management certainly reflects this potential.
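
To give a feel for what “adaptive to changing conditions” could mean at the traffic layer, here is a deliberately simplified sketch – a toy model of my own, not a description of Layer 7’s implementation – in which each API’s rate limit shrinks as observed back-end latency climbs past a budget:

    # Toy illustration of API-aware, adaptive traffic management:
    # per-API rate limits that tighten when the back end slows down.
    import time
    from collections import defaultdict, deque

    class AdaptiveLimiter:
        def __init__(self, base_limit_per_sec: int = 100, latency_budget_ms: float = 200.0):
            self.base_limit = base_limit_per_sec
            self.latency_budget = latency_budget_ms
            self.latencies = defaultdict(lambda: deque(maxlen=50))  # recent samples per API
            self.requests = defaultdict(deque)                      # request timestamps per API

        def record_latency(self, api: str, millis: float) -> None:
            self.latencies[api].append(millis)

        def current_limit(self, api: str) -> int:
            samples = self.latencies[api]
            if not samples:
                return self.base_limit
            avg = sum(samples) / len(samples)
            if avg <= 0:
                return self.base_limit
            # Shrink the allowance proportionally when the back end exceeds its budget.
            scale = min(1.0, self.latency_budget / avg)
            return max(1, int(self.base_limit * scale))

        def allow(self, api: str) -> bool:
            now = time.time()
            window = self.requests[api]
            while window and now - window[0] > 1.0:   # one-second sliding window
                window.popleft()
            if len(window) < self.current_limit(api):
                window.append(now)
                return True
            return False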

If you are going to be (or hope to be) at the event, add a response in the comments box or tweet to @MattMcLartyBC. Hope to see you there!

May 15th, 2012

APIs, Cloud & Identity Tour 2012: Three Cities, Two Talks, Two Panels & a Catalyst

On May 15-16, 2012, I will be at the Privacy Identity Innovation (pii2012) conference held at the Bell Harbor International Conference Center in Seattle. I will be participating in a panel moderated by Eve Maler from Forrester Research, Inc., titled Privacy, Zero Trust & the API Economy. It will take place at 2:55pm on Tuesday, May 15:

“The Facebook Connect model is real, it’s powerful and now it’s everywhere. Large volumes of accurate information about individuals can now flow easily through user-authorized API calls. Zero Trust requires initial perfect distrust between disparate networked systems, but are we encouraging users to add back too much trust, too readily? What are the ways this new model can be used for ‘good’ and ‘evil’ and how can we mitigate the risks?”

On Thursday May 17 at 9am PDT, I will be delivering a webinar on API identity technologies, once again with Eve Maler from Forrester. We are going to talk about the idea of zero trust with APIs, an important stance to adopt as we approach what Eve often calls “the coming identity singularity” – that is, the time when identity technologies and standards will finally line up with real and immediate need in the industry. Here is the abstract for this webinar:

Identity, Access & Privacy in the New Hybrid Enterprise: Making Sense of OAuth, OpenID Connect & UMA
In the new hybrid enterprise, organizations need to manage business functions that flow across their domain boundaries in all directions: partners accessing internal applications; employees using mobile devices; internal developers mashing up Cloud services; internal business owners working with third-party app developers.

Integration increasingly happens via APIs and native apps, not browsers. Zero trust is the new starting point for security and access control and it demands Internet scale and technical simplicity – requirements that the go-to Web services solutions of the past decade, like SAML and WS-Trust, struggle to meet.

This webinar from Layer 7 Technologies, featuring special guest Eve Maler of Forrester Research, Inc., will:

  • Discuss emerging trends for access control inside the enterprise
  • Provide a blueprint for understanding adoption considerations

You will learn:

  • Why access control is evolving to support mobile, Cloud and API-based interactions
  • How the new standards (OAuth, OpenID Connect and UMA) compare to technologies like SAML
  • How to implement OAuth and OpenID Connect, based on case study examples

You can sign up for this webinar at the Layer 7 Technologies Web site.

Next week, I’m off to Dublin to participate in TMForum Management World 2012. I wrote earlier about the defense catalyst in which Layer 7 is participating, which explores the problem of how to manage Clouds in the face of developing physical threats. If you are at the show, you must drop by the Forumville section on the show floor and have a look. The project results are very encouraging.

I’m also doing a presentation and participating in a panel. The presentation title is API Management: What Defense & Service Providers Need to Know. Here is the abstract:

“APIs promise to revolutionize the integration of mobile devices, on-premise computing and the Cloud. They are the secret sauce that allows developers to bring any systems together quickly and efficiently. Within a few years, every service provider will need a dedicated API group responsible for management, promotion and even monetization of this important new channel to market. And in the defense arena, where agile integration is an absolute necessity, APIs cannot be overlooked.

In this talk, you will learn:

  • Why APIs are revolutionizing Internet communications
  • Why this is an important opportunity for you
  • How you can successfully manage an API program
  • Why developer outreach matters
  • What tools and technologies you must put in place”

This talk will take place at the Dublin Conference Centre on Wednesday May 23 at 11:30am.

The panel, organized by my friend Nava Levy from Cvidya, is titled Cloud Adoption – Resolving the Trust vs. Uptake paradox: Understanding & Addressing Customers’ Security & Data Portability Concerns to Drive Uptake.

Here is the panel abstract:

“As Cloud services continue to grow five times faster than traditional IT, it seems that concerns regarding security and data portability are also on the rise. In this session, we will explain the roots of this paradox and the opportunities that arise from resolving these trust issues. By examining the different approaches other Cloud providers utilize to address these issues, we will see how service providers, by properly understanding and addressing these concerns, can use trust as a competitive advantage over the many Cloud providers that don’t have carrier-grade trust as one of their core competencies. We will see that, by addressing fraud, security, data portability and governance risks head on, not only will the uptake of Cloud services rise to include mainstream customers and conservative verticals, but the type of data and processes that migrate to the Cloud will also become more critical to customers.”

The panel is on Thursday May 24 at 9:50am.