APIs are a big deal. There are thousands of them. They are the de facto way that you link things together. But, because there are so many, it is difficult to support them all. Worse, with cloud-based development, new versions of an API appear at regular intervals, which means that you not only need to support the latest version but must also continue to support older versions, because not all of your users will have moved on. So APIs proliferate. And, yes, there are API management tools on the market, but they are rather like putting lipstick on a pig. The underlying problem remains: there are too many APIs, they are not standardised, and they create silos.
Let’s talk about standards.
OData is an OASIS standard protocol based on REST principles, but that’s like saying SQL is a standard – well, yes, but all of the implementations differ – you get some sort of coherence, but not enough. Worse, despite what you might think, REST isn’t a standard at all. It is an architectural style, and there is no standard implementation of it. Each RESTful API is a unique proposition, and it’s hard to reason over or manage collections of unique things; it’s like a junk drawer in your house – no method to it, just a collection of oddments.
All of that said, there are moves afoot to define standards for particular types of APIs: for such things as middleware services, applications, sensors and other Internet of Things devices, web services, cloud services, and so on. How successful these moves will be remains to be seen but, in any case, the most this will achieve is siloed archipelagos rather than siloed islands.
What is needed is a fundamentally different approach. Consider what an API call does. Firstly, there is some sort of business content: you want to conduct some sort of exchange between the environments involved. Secondly, there is some technical handshaking that needs to go on in order to facilitate this exchange.
Just thinking about APIs in these terms should suggest a solution, because it has been done repeatedly within IT over the decades. Indeed, it is not even specific to IT. If you have a complex problem, how do you solve it? In simplistic terms, you model the problem, breaking it down into its constituent parts, and address it through an understanding of each of those parts and the relationships between them. In IT-speak, this is referred to as a “separation of concerns” and, in practice, it means that you end up with a modular system rather than a monolithic one.
With respect to APIs, this means decoupling the business exchange from the technical handshaking. It is not quite a logical layer as opposed to a physical layer, but it is analogous. And, when you start to think of it that way, you can see that what you would really like is to tell your enabling software what you want to do in terms of a business exchange and then have the software figure out the technical handshake for itself. In other words, you want a declarative approach (I didn’t mention SQL by accident).
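To make the idea concrete, here is a minimal sketch of that separation in Python. Everything here is hypothetical and illustrative – the class names, the registry, and the bindings are mine, not any vendor’s API: the caller declares the business exchange as data, and the enabling software resolves the technical handshake from a registry rather than having it hard-coded at each call site.

```python
from dataclasses import dataclass, field

@dataclass
class Exchange:
    """The business content: what we want to do, not how to transport it."""
    resource: str              # e.g. "customer"
    action: str                # e.g. "read"
    params: dict = field(default_factory=dict)

# A registry of technical bindings. In practice this would be populated
# from API metadata rather than written by hand; these entries are
# illustrative assumptions.
BINDINGS = {
    "customer": {"protocol": "https", "format": "json", "auth": "oauth2"},
}

def resolve(exchange: Exchange) -> dict:
    """Work out the technical handshake for a declaratively stated exchange."""
    binding = BINDINGS[exchange.resource]
    return {
        "request": f"{exchange.action} {exchange.resource}",
        "handshake": binding,
    }

plan = resolve(Exchange(resource="customer", action="read", params={"id": 42}))
print(plan["handshake"]["protocol"])
```

The point of the sketch is that the `Exchange` never mentions protocols, formats or authentication; if the binding changes, the calling code does not.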
Fortunately, there is a movement afoot to do just that: an emerging effort known as Dynamic APIs.
Dynamic APIs is a concept, proposed by EnterpriseWeb, which is getting some traction in the TM Forum and the Industrial Internet Consortium (IIC). You may recall I’ve covered this start-up once before.
The idea behind Dynamic APIs is that you introduce an abstraction layer – actually an information model – between the applications and the APIs. APIs are described in terms of links, metadata and policies and this allows independent change on either side of the equation. The information model knows “how” to communicate via APIs so you just have to worry about the details of the business exchange. This is what EnterpriseWeb offers. Anyone struggling with a highly dynamic and distributed environment should give this platform serious consideration.
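As a rough sketch of what such an information model might look like – the class and field names below are my own illustrative assumptions, not EnterpriseWeb’s actual design – each API is described in terms of its link, metadata and policies, and applications bind to the model rather than to a concrete endpoint:

```python
class ApiDescription:
    """Describes one API in terms of links, metadata and policies."""
    def __init__(self, link: str, metadata: dict, policies: dict):
        self.link = link            # where the API lives
        self.metadata = metadata    # how to talk to it (version, format, ...)
        self.policies = policies    # constraints (auth, rate limits, ...)

class InformationModel:
    """Abstraction layer between applications and APIs."""
    def __init__(self):
        self._apis = {}

    def register(self, name: str, description: ApiDescription) -> None:
        # Either side can change independently: re-registering a new
        # description does not touch the application code that uses it.
        self._apis[name] = description

    def describe(self, name: str) -> ApiDescription:
        return self._apis[name]

model = InformationModel()
model.register("billing", ApiDescription(
    link="https://example.com/billing/v2",
    metadata={"version": "2", "format": "json"},
    policies={"auth": "oauth2"},
))

# The API moves to v3: update the model, not every application.
model.register("billing", ApiDescription(
    link="https://example.com/billing/v3",
    metadata={"version": "3", "format": "json"},
    policies={"auth": "oauth2"},
))
```

Applications keep asking the model for "billing"; the version change is absorbed in one place instead of rippling through every consumer.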
Read more on IT-Director.com.