July 23rd, 2013

Interoperability, Not Integration

It’s a small semantic difference, really, but a difference I think is worth calling out. When working in large distributed systems, it’s better to aim for interoperability than integration. And here’s why…

Integration for a Single System
Part of the Merriam-Webster Online Dictionary’s definition of integration is particularly relevant here:

    integrate: to form, coordinate, or blend into a functioning or unified whole

A common approach to working with large distributed systems – e.g. internal networked implementations that run at various locations within a single organization or implementations that rely on some Web-based service(s) – is to attempt to treat the entire operation as a single unit, a “whole system”.

Bad idea!

These “whole systems” can also be called “closed systems”. In other words, people work to create a fully-controlled single unit for which, even when elements are separated by space (location) and time (“We built that part three years ago!”), there is an expectation that things will work as if they are all physically local (on a single machine) and temporally local (there is no significant delay in the completion of requests). As you might expect, attempting this almost always goes badly – at least at any significant scale.
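
To make that concrete, here’s a minimal Python sketch of the “everything is local” assumption at work. The service name, URL, and function are my own hypothetical placeholders, not anything from a real system; the point is that a remote call gets dressed up as a local one, with no timeout, no retry, and no plan for failure:

    # Hypothetical sketch of the "whole system" assumption: a remote
    # call written as if it were a local lookup.
    import urllib.request

    def get_customer(customer_id):
        # Every failure mode of the network (latency, partitions,
        # version skew) hides behind this one line; there is no
        # timeout, no retry, and no plan for anything but success.
        url = "https://internal.example/customers/" + customer_id
        with urllib.request.urlopen(url) as resp:
            return resp.read()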

There are several reasons for attempting this approach. The most common is that treating everything as “your system” is mentally easy. Another reason the single-system view prevails is that most tooling acts this way: our editing and build tools carry the legacy assumption that all components and data are local and easily accessible. How else would we be able to do things like code completion and data-model validation?

Anyway, the point here is that “integration” is an anti-pattern on the Web. It’s not a good idea to use it as your mental model when designing, implementing and deploying large-scale systems.

Interoperability for Working with Other Systems
As you might have guessed, I find Merriam-Webster’s definition for interoperability much more valuable:

    interoperability: the ability of a system (such as a weapons system) to work with or use the parts or equipment of another system

The interoperability mindset takes a different approach. In this view, you want – whenever possible – to treat things as interchangeable: as things that can be swapped out or re-purposed along the way. Interestingly, Merriam-Webster notes that the first known use of this term was in 1977. So the idea of interoperability is relatively new compared with “integration”, which, according to Merriam-Webster, was first used in 1620.

An interoperability-focused approach leads to systems that do not need to “understand” each other, just ones that use interchangeable parts. Especially in widely-distributed systems, this interchangeability has a very high value. It’s easier to replace existing items in a system (e.g. changing data-storage vendors), re-use existing parts for other needs (e.g. applying the same editing component built for a blogging service to a new print publishing service) and even re-purpose parts when needed (e.g. using the file-based document caching system to provide caching for logged-in user sessions).
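
Here’s a rough Python sketch of what interchangeable parts look like in code. The Cache contract and both uses are hypothetical stand-ins for the examples above (a document cache re-purposed for user sessions); callers depend only on a small shared contract, so implementations can be swapped without touching them:

    # Hypothetical sketch: callers depend on a small contract,
    # not on any one vendor or implementation.
    from typing import Optional, Protocol

    class Cache(Protocol):
        def get(self, key: str) -> Optional[bytes]: ...
        def put(self, key: str, value: bytes) -> None: ...

    class InMemoryCache:
        # In-memory stand-in for the file-based document cache
        # mentioned above.
        def __init__(self) -> None:
            self._store = {}

        def get(self, key: str) -> Optional[bytes]:
            return self._store.get(key)

        def put(self, key: str, value: bytes) -> None:
            self._store[key] = value

    def cache_session(cache: Cache, session_id: str, data: bytes) -> None:
        # Re-purposing: user sessions ride the same contract the
        # document cache already satisfies; callers never change.
        cache.put("session:" + session_id, data)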

The primary challenge to thinking like an inter-operator instead of an integrator is that there are no easy tools for this kind of work. Pretty much all interoperability work is done by creative thinkers in the field (“We could just use our existing storage system for that.”). You usually need rather specific knowledge of what’s available on site and what the existing parts can do in order to execute on interoperability.

Despite the extra cost of interoperability, there are important benefits for distributed systems that must operate over a long period of time. That’s why so much of the Web relies on interoperability. The standards we use for DNS, HTTP, HTML, etc. all assume that varying products and services are free to decide what they do and how they do it as long as they inter-operate with other products and services on the Web.
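
As a small illustration of standards-as-contract, the Python sketch below talks to a hypothetical service (the URL is just a placeholder) using nothing but plain HTTP conventions: an Accept header and a timeout. Any server that honors the standard interoperates; nothing here depends on who built the server or how:

    # Hypothetical sketch: the contract is the HTTP standard itself,
    # not any particular vendor's API.
    import urllib.request

    req = urllib.request.Request(
        "https://api.example/orders",
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        body = resp.read()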

Treat the Network Space as a Bunch of Other Systems
If you take the approach of treating everything in your network space (e.g. your local intranet or any system that relies on at least one Web-based service) as a bunch of “other systems”, you’ll be better off in the long term. You’ll stop trying to get everyone to work the same way (e.g. using the same storage model or object model or resource model) and will be free to start working with other teams on how you can share information successfully across systems, via interoperability.
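
One way that sharing plays out in code: two teams keep whatever internal models suit them and agree only on a minimal wire representation at the boundary. Everything in the sketch below (the Order class, the field names, the choice of JSON) is a hypothetical illustration, not a prescription:

    # Hypothetical sketch: internal models stay local; only the
    # wire format is shared between systems.
    import json

    class Order:
        # Team A's internal model; its shape is Team A's business alone.
        def __init__(self, order_id: str, total_cents: int) -> None:
            self.order_id = order_id
            self.total_cents = total_cents

    def to_wire(order: Order) -> str:
        # The agreed boundary: two named fields, JSON-encoded.
        return json.dumps({"id": order.order_id,
                           "total_cents": order.total_cents})

    def from_wire(payload: str) -> dict:
        # Team B maps the agreed fields into whatever model it likes.
        return json.loads(payload)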

Even better, large organizations can get significant value out of using the interoperability model for their implementations. In practice, this means fostering an internal ethos where it’s fine to be creative and solve problems in novel ways using whatever means are at your disposal, as long as you also support interoperability with the rest of the system. In other words, you have the freedom to build whatever is most effective locally as long as it does not threaten your interoperability with the other parts.

There are lots of other benefits to adopting interoperability as the long-term implementation goal, but I’ll stop here for now and just say, to sum up:

  • As a general rule, treat your implementations as exercises in interoperability, not integration.

(Originally published on my personal blog)
