May 31st, 2013

Banking on APIs

Something has changed in the world of banking technology. Over the last few months, the architects who shape IT strategy for banks have started talking about API programs with an enthusiasm and energy that was barely detectable in 2012. I can’t put my finger on exactly what has changed, but IT leaders in the financial services world are moving towards implementing API strategies as a legitimate method for supporting a wide array of business drivers.

We know that connectivity and message-based integration are not new concepts for the financial industry. Enterprise architects in the banking world are masters in the art of connecting old systems with new facades and exposing backend resources through multiple channels. But up until now, a conversation with these professionals would be dominated by concerns rooted in the Service-Oriented Architecture world: “How do we prevent service proliferation?”, “How do we secure SOAP conversations?” In fact, it wasn’t too long ago that the mere mention of an API as a strategic initiative would leave these enterprise architects scratching their heads.

Fast forward to today and many banking technologists are explicitly asking for API Management solutions. They know the terminology of the space, they know what they want to achieve and they know the pitfalls they wish to avoid. In addition, I’ve been amazed at the depth of knowledge that has emerged among these teams, as enterprise developers have invested their own time to learn from relevant tutorials, articles and – of course – blog posts. The caricature of the enterprise developer as a SOA dinosaur struggling to understand the “new stuff” is fast becoming a myth.

To be fair, there is still great hesitation within the industry when it comes to opening up data and losing control of the user experience but that isn’t stopping banks from applying good API design practice internally. As we’ve said before, APIs are not simply about reaching anonymous third-party developers. Indeed, organizations can gain great benefits by applying API Management to the interactions that take place in private for their own apps and partners. I’ve also been astounded that a handful of the architects I’ve met in the banking world are actively experimenting with public API releases. It seems the fear of losing control over data, services and products is beginning to lose out to the perceived value of growing the business through a new channel.

Banks provide a great indicator for the direction of enterprise technology and it certainly seems that the “API thing” has legs within this space. If you are in the enterprise world, make sure you have considered how launching an API program might help your business – because it’s increasingly likely your competitors are already doing so.

May 27th, 2013

The Nuts & Bolts of the Internet of Things

Category IoT, M2M, Tech Talks

A few days ago, I talked with Brian Proffitt of ReadWrite about the Internet of Things (IoT) and I’d like to take this opportunity to share some of his questions.

One of Brian’s first questions was about the difference between M2M and IoT. The best answer I could give him was actually one I had found through an M2M group on LinkedIn: “I see M2M platforms as mainly enabling vertical integration, as they have historically, of a single capability; where I see IoT as more about horizontal integration of multiple capabilities and resources into a larger system. M2M is about communication, IoT is about integration and interoperability.”

So, whereas M2M feeds data into existing vertical silos, IoT is layered on top horizontally, correlating and integrating data from different silos. A good illustration of this vertical-versus-horizontal distinction was provided in a recent More with Mobile article. The realization that the commercial potential of IoT first and foremost requires a new model of data sharing inspired us to create the Layer 7 Data Lens Solution.
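To make the vertical-versus-horizontal distinction concrete, here is a minimal sketch of the horizontal-integration idea: two hypothetical vertical silos (fleet telemetry and facility energy meters – the data and field names are invented for illustration) are correlated by timestamp into a combined view that neither silo could produce on its own.

```python
from datetime import datetime, timedelta

# Two hypothetical vertical "silos": fleet telemetry and facility energy meters.
fleet_events = [
    {"ts": datetime(2013, 5, 27, 9, 0), "truck": "T-12", "temp_c": 4.1},
    {"ts": datetime(2013, 5, 27, 9, 5), "truck": "T-12", "temp_c": 9.8},
]
energy_events = [
    {"ts": datetime(2013, 5, 27, 9, 4), "site": "depot-3", "kwh": 42.0},
]

def correlate(a, b, window=timedelta(minutes=5)):
    """Pair events from different silos that occur within a time window."""
    return [
        (x, y)
        for x in a
        for y in b
        if abs(x["ts"] - y["ts"]) <= window
    ]

# The horizontal "IoT layer": a combined view across both silos.
for telemetry, meter in correlate(fleet_events, energy_events):
    print(telemetry["truck"], "was near", meter["site"], "at", meter["ts"])
```

The point is not the toy join itself but that the correlation logic lives above the silos, which keep feeding data vertically exactly as they do today.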

Another question that Brian posed was about the protocols and standards underpinning the M2M/IoT ecosystem. Here is my short list of key protocols (in no particular order):

I’d certainly be interested to hear if you have any additions to the list. You’ll find background information about IoT protocols on Telit’s M2M blog and Michael Holdman’s blog. Also, Michael Koster published a very interesting blog post about adding event-driven processing to REST APIs, attempting to reconcile the event-driven patterns IoT requires with a RESTful API approach.
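One common way to bridge event-driven patterns with a RESTful approach is the webhook pattern: clients POST a subscription (a callback URL) to the API, and the server calls back when something happens. The sketch below is a simplified, in-process illustration of that idea – the topics and URLs are invented, and real delivery would be an HTTP POST rather than a local function call.

```python
# Minimal sketch of the webhook pattern for adding event-driven behaviour
# to a REST API. Subscriptions would normally be created via POST /subscriptions.

subscriptions = []  # each entry: {"topic": ..., "callback": ...}

def subscribe(topic, callback_url):
    """Register a subscriber's interest in a topic (POST /subscriptions)."""
    sub = {"topic": topic, "callback": callback_url}
    subscriptions.append(sub)
    return sub

def publish(topic, event):
    """When a 'thing' reports new data, notify every matching subscriber."""
    delivered = []
    for sub in subscriptions:
        if sub["topic"] == topic:
            # In a real system this would be an HTTP POST to sub["callback"].
            delivered.append((sub["callback"], event))
    return delivered

subscribe("sensors/temperature", "https://example.com/hooks/temp")
print(publish("sensors/temperature", {"value_c": 21.5}))
```

The REST surface stays resource-oriented (subscriptions are just resources you create and delete), while the event flow travels in the opposite direction, from server to client.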

I’ll be discussing IoT in more detail myself when I take part in Layer 7’s latest API Tech Talk, on Wednesday May 29 at 12pm EDT/9am PDT. If I answer your IoT-related question live during the Tech Talk, Layer 7 will send you a free T-shirt. See you on Wednesday!

May 23rd, 2013

Join Our Live Internet of Things (IoT) Discussion – Win a T-Shirt

Category Events, IoT, M2M, Tech Talks

We’ll be discussing the Internet of Things (IoT) during our latest API Tech Talk next Wednesday, May 29 at 9am PDT. Our special guest – Layer 7 Product Architect and IoT expert Holger Reinhardt – will be taking your questions live throughout the stream. And we’ll be sending every single person who gets an IoT-related question answered by Holger one of our nifty new IoT-shirts, for free! You can ask questions through the Livestream chat, using the Twitter hashtag #layer7live or by emailing.

The Internet of Things is a simple concept: objects being connected to the Internet. What’s not so simple is managing the enormous, almost sublime amount of data these connected “things” (vehicles, appliances…) generate. There’s also the question of how you give people within your organization secure-but-seamless access to specific subsets of data they can actually make use of. Well, our man Holger knows how it’s done, so start getting your questions together and join our live Q&A on May 29.

Click here to get the full event details and a reminder in your calendar. On the day of the event, join us at:

And don’t forget, you can ask questions throughout the stream by chatting or tweeting. Alternatively, you can email your questions in advance and Holger will give you an in-depth answer on the day. IoT is a pretty hot topic right now, so this is bound to be a lively discussion. See you next Wednesday!

May 16th, 2013

Are APIs Making the Biz Dev Role Obsolete?

The role of the business developer has traditionally been to initiate partnerships and follow through by ensuring some sort of integration is implemented. As enterprises become more software-driven, integration itself increasingly comes through APIs. This may mean that the implementation of API-driven “partner portals” is replacing traditional business development practices. A recent article from Wired claimed that 70% of all jobs will be replaced by robots by the end of this century. Are APIs and partner portals the robots that will replace manual business development processes?

Here’s an example of how a business partnership might come about these days. Interaction with an online API partner portal acts as the initial “conversation” that leads to the partnership. If you want to integrate with Salesforce, you go to the Salesforce partner portal, figure out the relevant SDK/API, build an app and then submit it to the Salesforce AppExchange. You don’t ever need to actually talk with anyone at Salesforce to become a business partner with the company.

Another example is the way many companies now enable access to their Web sites via Facebook Connect, Google+ Login or Twitter Login. This represents the first step towards establishing a business partnership with Facebook, Google or Twitter. The pattern is not new in the Web world and has been discussed for years; what makes it relevant here is the way it’s being applied to outdated business processes and practices.

Great platform companies have realized this, “robotized” their business development processes and rationalized their business development teams. As robots are to manufacturing, APIs are to business development. Better technology means that we can focus our human resources on more valuable activities, since handshakes are now being made over OAuth instead of costly dinners and drinks.
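To illustrate what a “handshake over OAuth” looks like in practice, here is a hedged sketch of the OAuth 2.0 client-credentials grant – the kind of machine-to-machine credential exchange a partner portal issues in place of a face-to-face deal. The token endpoint and credentials below are hypothetical; the sketch builds the request a partner app would make but does not send it.

```python
import base64
import urllib.parse
import urllib.request

# Hypothetical partner-platform endpoint and credentials (for illustration only).
TOKEN_URL = "https://api.example-platform.com/oauth2/token"
CLIENT_ID = "partner-app-123"
CLIENT_SECRET = "s3cret"

def build_token_request():
    """Build (but do not send) an OAuth 2.0 client-credentials token request."""
    creds = base64.b64encode(f"{CLIENT_ID}:{CLIENT_SECRET}".encode()).decode()
    body = urllib.parse.urlencode({"grant_type": "client_credentials"}).encode()
    return urllib.request.Request(
        TOKEN_URL,
        data=body,  # presence of a body makes this a POST
        headers={
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
    )

req = build_token_request()
print(req.get_method(), req.full_url)
```

Once the platform returns an access token, every subsequent API call carries it – no contract negotiation, no dinner, just credentials exchanged through the portal.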

May 10th, 2013

Making Government Data “Easy to Find, Accessible & Usable”

On May 9, 2013 the White House released an executive order with the title Making Open & Machine Readable the New Default for Government Information. My favorite line in the entire document is:

“Government information shall be managed as an asset throughout its life cycle to promote interoperability and openness, and, wherever possible and legally permissible, to ensure that data are released to the public in ways that make the data easy to find, accessible, and usable” (emphasis mine).

No Dumping
The usual approach to this type of work is to simply publish raw data in a directory or repository and then create some fencing around the data to track usage and distribution. Essentially, making government data “open” becomes a data dumping operation. This practice fails on all of President Obama’s three key points. First, data dumps make finding valuable information anything but easy. Second, even though the content might appear in a standard format like XML, CSV or JSON, it is hardly accessible (except to geeks, who love this kind of stuff). And finally, raw data is hardly ever usable. Instead, it’s a mind-numbing pile of characters and quote marks that must be massaged and re-interpreted before it comes close to usability.

So, while this new directive offers an opportunity to make available a vast amount of the data the government collects on our behalf, the devil is in the details. And the details are in the interface – the API. As with poorly-designed kitchen appliances and cryptic entertainment center remote controls, when it takes extensive documentation to explain how to use something, the design has failed. There’s a simple principle here. Poor API design results in unusable data.

Affordable Data
It doesn’t have to be this way, of course. Government departments have the opportunity to implement designs that meet the goals set forth in the executive order. They can make it easy for people to find, access and use the data. They can publish not just data but APIs that afford searching, filtering and exploring the data in a meaningful and helpful manner; APIs that empower both users and developers to successfully interact with the data, without resorting to a dashboard featuring dozens of options or mind-numbing explanations.

In the (likely) event that the initial open data release consists of mere data, companies and individuals would be well advised to resist the temptation to build a multitude of “one-off” applications, each of which solves a single problem or answers a narrow set of questions for some subset of the data. Instead, work should be put into converting the raw data into usable API formats such as Atom, OData, HAL, Collection+JSON and HTML (to name just a few). APIs should be designed with the same care that would be given to any interactive experience. Investment in tools and technologies that can properly represent the data in multiple formats while supporting various use cases and access requirements will yield great results.
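As a small illustration of that conversion step, here is a sketch of lifting a raw CSV dump into a HAL-style representation with links for discovery. The dataset, field names and URLs are invented for illustration – the point is that the output carries navigable links (`self`, a templated `find`), not just rows of values.

```python
import csv
import io
import json

# A tiny invented "data dump": air-quality readings as raw CSV.
RAW_CSV = """id,city,aqi
1,Springfield,42
2,Shelbyville,58
"""

def to_hal(csv_text, base_url="https://data.example.gov/air-quality"):
    """Convert raw CSV rows into a HAL-style document with discovery links."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return {
        "_links": {
            "self": {"href": base_url},
            # A templated link affords searching without reading any docs.
            "find": {"href": base_url + "{?city}", "templated": True},
        },
        "_embedded": {
            "readings": [
                {
                    "_links": {"self": {"href": f"{base_url}/{r['id']}"}},
                    "city": r["city"],
                    "aqi": int(r["aqi"]),
                }
                for r in rows
            ]
        },
    }

print(json.dumps(to_hal(RAW_CSV), indent=2))
```

The same source rows could just as easily be rendered as Atom or Collection+JSON; what matters is that each representation tells the client where it can go next.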

Open Data APIs
In the end, organizations that know the importance of a good interface, the power of choice and the freedom of flexible representations will be able to convert raw data into valuable information, which can be consumed by a wide range of users, platforms and devices. These considerations are essential to building and supporting open data APIs.

Because – ultimately – data isn’t open unless it’s “easy to find, accessible, and usable”.