July 23rd, 2013

Interoperability, Not Integration

It’s a small semantic difference, really, but a difference I think is worth calling out. When working in large distributed systems, it’s better to aim for interoperability than integration. And here’s why…

Integration for a Single System
Part of the Merriam-Webster Online Dictionary’s definition of integration is particularly relevant here:

A common approach to working with large distributed systems – e.g. internal networked implementations that run at various locations within a single organization or implementations that rely on some Web-based service(s) – is to attempt to treat the entire operation as a single unit, a “whole system”.

Bad idea!

These “whole systems” can also be called “closed systems”. In other words, people work to create a fully-controlled single unit for which, even when elements are separated by space (location) and time (“We built that part three years ago!”), there is an expectation that things will work as if they are all physically local (on a single machine) and temporally local (there is no significant delay in the completion of requests). As you might expect, attempting this almost always goes badly – at least at any significant scale.

There are several reasons for attempting this approach. The most common is that treating everything as “your system” is mentally easy. Another reason this single-system view prevails is that most tooling acts this way. The legacy of edit and build tools is that all components and data are local and easily accessible. How else would we be able to do things like code completion and data model validation?

Anyway, the point here is that “integration” is an anti-pattern on the Web. It’s not a good idea to use it as your mental model when designing, implementing and deploying large-scale systems.

Interoperability for Working with Other Systems
As you might have guessed, I find Merriam-Webster’s definition for interoperability much more valuable:

The interoperability mindset takes a different approach. In this view, you want – whenever possible – to treat things as interchangeable; as things that can be swapped out or re-purposed along the way. Interestingly, Merriam-Webster notes that the first known use of this term was in 1977. So, the idea of interoperability is relatively new compared with “integration”, which was first used in 1620, according to Merriam-Webster.

An interoperability-focused approach leads to systems that do not need to “understand” each other, just ones that use interchangeable parts. Especially in widely-distributed systems, this interchangeability has a very high value. It’s easier to replace existing items in a system (e.g. changing data-storage vendors), re-use existing parts for other needs (e.g. applying the same editing component built for a blogging service to a new print publishing service) and even re-purpose parts when needed (e.g. using the file-based document caching system to provide caching for logged-in user sessions).
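
To make the idea of “interchangeable parts” concrete, here is a rough sketch in Python (the contract, class and function names are illustrative, not drawn from any real system): callers depend on a small shared contract, so the part behind it can be swapped or re-purposed without touching them.

    from typing import Protocol

    class DocumentStore(Protocol):
        """The small, shared contract every storage part agrees to honor."""
        def put(self, key: str, doc: bytes) -> None: ...
        def get(self, key: str) -> bytes: ...

    class InMemoryFileStore:
        """File-style implementation; a vendor database could replace it."""
        def __init__(self) -> None:
            self._data = {}  # stand-in for real file I/O

        def put(self, key: str, doc: bytes) -> None:
            self._data[key] = doc

        def get(self, key: str) -> bytes:
            return self._data[key]

    def publish(store: DocumentStore, key: str, doc: bytes) -> None:
        # Callers know only the contract, never the vendor behind it.
        store.put(key, doc)

    publish(InMemoryFileStore(), "post-1", b"Interoperability, Not Integration")

Swapping vendors then means supplying another object that honors the same contract; callers like publish() are left untouched.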

The primary challenge to thinking like an inter-operator instead of an integrator is that there are no easy tools for this kind of work. Pretty much all interoperability work is done by creative thinkers in the field (“We could just use our existing storage system for that.”). You usually need rather specific knowledge of what’s available on site and what the existing parts can do in order to execute on interoperability.

Despite the extra cost of interoperability, there are important benefits for distributed systems that must operate over a long period of time. That’s why so much of the Web relies on interoperability. The standards we use for DNS, HTTP, HTML etc. all assume that varying products and services are free to decide what they do and how they do it as long as they inter-operate with other products and services on the Web.

Treat the Network Space as a Bunch of Other Systems
If you take the approach of treating everything in your network space (e.g. your local intranet or any system that relies on at least one Web-based service) as a bunch of “other systems” you’ll be better off in the long term. You’ll stop trying to get everyone to work the same way (e.g. using the same storage model or object model or resource model) and will be free to start working with other teams on how you can share information successfully across systems, via interoperability.

Even better, large organizations can get big value out of using the interoperability model for their implementations. In practice, this means fostering an internal ethos where it’s fine to be creative and solve problems in novel ways using whatever means are at your disposal, as long as you also support interoperability with the rest of the system. In other words, you have the freedom to build whatever is most effective locally as long as it does not threaten your interoperability with the other parts.

There are lots of other benefits to adopting interoperability as the long-term implementation goal, but I’ll stop here for now and just say, to sum up:

  • As a general rule, treat your implementations as exercises in interoperability, not integration.

(Originally published on my personal blog)

April 9th, 2012

Big Data & API Management

The hottest IT trends of 2012 are shaping up to be Cloud, mobile and “big data”. The links between API management, Cloud and mobile are clear. The links between API management and big data – a concept that creates capabilities for capturing and analyzing previously unimaginable amounts of unstructured data – are less obvious but no less significant. I see two key areas of synergy…

First of all, in the three-tier architecture of the Web, the line between the presentation and logic tiers has typically been blurry, while the line between logic and data has been concrete. Big data now blurs the line between logic and data as well. Combine this with the fact that the mobile app development paradigm fragments the presentation platform, and it becomes evident that the API will be the concrete and consistent border in application processing flows. In this context, API management will prove vital in enforcing security, collecting business metrics and normalizing protocols.
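
As a rough illustration of what happens at that border, here is a small Python sketch (the handler, keys and field names are all hypothetical): a plain API handler is wrapped so that every call is checked for a valid key and timed, standing in for the policy enforcement and business metrics an API management layer provides.

    import time

    def managed(handler, valid_keys):
        """Wrap an API handler with key checking and timing - a toy stand-in
        for the enforcement and metrics an API management layer provides."""
        def wrapper(request):
            if request.get("api_key") not in valid_keys:  # security policy at the border
                return {"status": 401, "body": "unauthorized"}
            started = time.time()
            response = handler(request)
            response["elapsed_ms"] = (time.time() - started) * 1000.0  # business metric
            return response
        return wrapper

    def orders_handler(request):
        return {"status": 200, "body": {"orders": []}}

    orders_api = managed(orders_handler, valid_keys={"example-key"})
    print(orders_api({"api_key": "example-key"}))
    print(orders_api({}))  # rejected at the border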

Second, big data allows analytics to be performed in the scope of real-time data retrieval. This will create another wave of real-time integration needs in enterprises of every size. More real-time integration means more APIs with higher volumes. The common protocol for exposing big data on the network is REST using either JSON or XML formats. Again, this will mean a greater necessity for API management tools and techniques and a compound benefit in their usage.
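
For a sense of what “REST using JSON” looks like at its simplest, here is a minimal Python sketch using only the standard library (the endpoint and metric are invented for illustration): it serves a pre-computed analytics result as JSON over HTTP.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical, pre-computed analytics result; a real system would pull
    # this from the big-data layer in (near) real time.
    ANALYTICS = {"metric": "page_views_per_minute", "value": 1824}

    class AnalyticsHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(ANALYTICS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # GET http://localhost:8080/ returns the current result as JSON.
        HTTPServer(("localhost", 8080), AnalyticsHandler).serve_forever()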

Simply put, mobile, Cloud and big data are driving a new era of enterprise IT and API management will provide amplified value for companies embracing these trends.

March 23rd, 2012

Layer 7 at the 2012 DoDIIS Worldwide Conference

Layer 7 is proud to be exhibiting at the 2012 Department of Defense Intelligence Information Systems (DoDIIS) Worldwide Conference, which will be taking place in Denver this April 1-4. The show will be focusing on the Defense Intelligence Agency’s goal of unifying defense intelligence infrastructure and information sharing initiatives.

Never before has so much intelligence data been collected and never has the challenge of securely sharing these valuable assets been greater. As new intelligence systems come online, issues inevitably arise around the need to make data and security credentials interoperable between these new systems and existing capabilities.

As the leading provider of secure messaging and security Gateway solutions to the US Federal Intelligence Community, Layer 7 will be at the show, demonstrating its solutions for data and security interoperability within the enterprise and the Cloud. If you’re attending the DoDIIS conference, stop by Booth 917 to see first-hand how you can resolve interoperability and fine-grained access challenges with a Common Criteria EAL 4+ certified solution from Layer 7.

March 8th, 2012

QCon London 2012 is the Place to be this Week

I’m off to London for QCon 2012, the Sixth International Software Development Conference (March 7-9). I am one of the track chairs for this meeting. I’ve just learned that the show is now sold out but there is a waiting list if you haven’t already registered. All indications are that this is going to be an outstanding conference, so if there is any way you can attend, you should make the effort.

I’m hosting a track this Friday, called Industrial-Strength Architecture for Integration & Web Computing. Here’s how I described the track to potential speakers:

The enterprise is demanding more from the Web than ever before. No longer content with simple Web application delivery, the new enterprise Web has become an integration point between mobile devices, browsers, legacy systems and third-party Web apps. It is a difficult balancing act. The new enterprise Web is highly scalable but can also reconcile the different service level expectations across each participant. At its core, it enables agile product delivery while maintaining extreme reliability. In this track, we will study the architectural challenges faced by the enterprise that needs to harness the Web as a rich delivery channel — and highlight the real-world solutions that address these challenges. We will explore the intersection where trends such as virtualization, noSQL, JSON, OAuth, APIs and mobile apps meet. Join us to understand the fine tuning between milliseconds and dollars that can make the difference between wild success and disappointing mediocrity.

I’m fortunate to have a great roster of speakers, including Theo Schlossnagle from Omniti, Paul Fremantle from WSO2, John Davies from Incept5 and finally both Marcus Kern and David Dawson from Mobile Interactive Group.

I’m also going to chair a panel titled Integration at Scale: Lessons Learned from the New Enterprise Web. This one promises to be a very interesting discussion:

The mobile device revolution has upended our traditional view of the World Wide Web. The enterprise Web is now about integration: connecting any device to any data, reliably and under wildly-fluctuating load. How has this affected Web architecture, and what has changed in the day-to-day operation of Web resources? Join us for this panel of senior enterprise architects, each of whom has met the challenge of the new enterprise Web.

The panel line-up consists of David Laing from CityIndex, Neels Burger from MoneySuperMarket.com, Neil Pellinacci from Tanzarine Technology and Parand Tony Darugar from Xpenser. Each brings tremendous experience to the panel and bringing them all together is going to make for a lively and informative debate. I’m looking forward to it.

Hope to see you in London!

February 27th, 2012

New Solution Brief: API Management for Mobile

Increasingly, mobile is one of the major factors driving enterprises to expose their information assets via APIs. With the BYOD movement bringing mobile into the workplace and some forward-thinking enterprises equipping their employees with tablets, there is a growing need for enterprise-level apps that leverage systems and data exposed via APIs.

Of course, allowing enterprise data to be accessed from smart-phones and tablets (via public networks) creates a range of concerns around security and performance. The security risks are clear – perhaps less well understood is the fact that, for apps to perform efficiently, data will need to be filtered and transformed into formats and protocols suitable for mobile.
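
As a simple illustration of that filtering and transformation step, here is a hypothetical Python sketch (the field names are invented): a verbose back-end record is reduced to the few fields a mobile app actually needs before it crosses the public network.

    def to_mobile_view(record):
        """Keep only the fields a mobile app needs (hypothetical field names)."""
        return {
            "id": record["id"],
            "title": record["title"],
            "thumbnail": record.get("image_small"),  # drop the full-size asset
        }

    full_record = {
        "id": 42,
        "title": "Q1 report",
        "image_small": "https://example.com/q1-thumb.png",
        "image_full": "https://example.com/q1-full.png",
        "audit_trail": ["created", "reviewed", "approved"],
    }
    print(to_mobile_view(full_record))  # a much smaller payload for the device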

Layer 7’s new API Management for Mobile solution brief explains how our API Management Suite of products delivers everything enterprises need to address the data security and performance management concerns raised by integrating enterprise assets with mobile devices. To find out more, download the solution brief now.