Position paper for “The Future of Interoperability Standards – Technical Approaches”
PUBLISHED: September 22nd, 2010
This is my position paper for the "The Future of Interoperability Standards – Technical Approaches" meeting as part of the "ICOPER Best Practice Network".
Interoperability
What is the future of interoperability standards? I think the future is bright – whether we need to wear shades, I'm not sure.
As the world of ICT becomes increasingly complex, with almost every device connected in some way, there is a growing focus on the need for interoperability. Connected systems no longer come from one vendor, nor are they developed on one platform. The ICT infrastructure has grown so big that no single vendor, supplier, or software company can control it. Interoperability is therefore the foundation of connectedness between ICT systems, enabling a free flow of information between systems and the users of those systems.
Since the world is becoming increasingly complex, I think it is premature to assume that everything we would like to express or exchange can fit into any given model. We need a more pragmatic approach to interoperability: we should focus on a flexible model, and the tools used for exchanging and processing information need a high tolerance for errors.
Based on implementations, and on how the different tools manage errors, a best practice will hopefully evolve.
Categories of standards
To simplify, and as an attempt to explain, I would like to distinguish between two types of standards that I think require different levels and accuracy of interoperability (there are many others, and many nuances to these categories):
- Descriptive standards
- Protocols – Connecting standards
In my view, the main difference between these categories is that "descriptive" standards involve a higher level of human interaction, while "protocol" standards are consumed mostly by machines.
For standards that are closer to the human side of the interaction, the level of flexibility should be higher. For standards that are consumed by computers only, and never seen by a human, the level of flexibility should be nil – e.g. the HTTP protocol, TCP/IP, etc. A sketch of how a system might treat the two categories differently follows below.
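As a minimal sketch of this distinction (the field names are hypothetical, not taken from any particular standard), protocol-style data could be validated strictly and rejected on any deviation, while descriptive metadata is accepted leniently:

```python
# Hypothetical sketch: strict validation for machine-to-machine data,
# lenient acceptance for human-oriented descriptive metadata.

REQUIRED_PROTOCOL_FIELDS = {"version", "method", "payload"}

def validate_protocol_message(message: dict) -> dict:
    """Machine-to-machine data: any deviation is rejected outright."""
    missing = REQUIRED_PROTOCOL_FIELDS - message.keys()
    unknown = message.keys() - REQUIRED_PROTOCOL_FIELDS
    if missing or unknown:
        raise ValueError(f"invalid message: missing={missing}, unknown={unknown}")
    return message

def accept_descriptive_record(record: dict) -> dict:
    """Human-oriented metadata: keep what we understand, ignore the rest."""
    known = {"title", "description", "author"}
    # Unknown fields are simply ignored rather than causing the exchange to fail.
    return {k: v for k, v in record.items() if k in known}

if __name__ == "__main__":
    print(accept_descriptive_record({"title": "ICOPER", "colour": "blue"}))
    # validate_protocol_message({"version": 1}) would raise ValueError
```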
Semantics
We have several levels of interoperability, and this connects to my previous position paper on the process of developing standards. In that paper I argue that we should harmonise the discussion when developing standards. Here I would argue that we also need to look differently at how we specify standards with regard to interoperability.
By semantics I mean the meaning of things, and this is a requirement for understanding the value of a message submitted or shared among different systems. The labelling of the data, however, should not be important.
To achieve a more flexible interoperability scheme, we should look at the architectures of information domains. If information segments belong to the same architecture, or are built from the same architectural constructs, it should be possible to have some level of interoperability between systems. The sketch below illustrates the idea.
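As a minimal sketch of label-agnostic exchange (the labels and concepts here are hypothetical), two systems that use different field names for the same semantic concept can still interoperate through a small mapping layer:

```python
# Hypothetical sketch: recover shared meaning regardless of how each
# system labels its data.

SEMANTIC_MAP = {
    # semantic concept -> labels used by different systems
    "creator": {"author", "creator", "contributor_name"},
    "title":   {"title", "name", "heading"},
}

def normalise(record: dict) -> dict:
    """Map whatever labels a system uses onto the shared semantic concepts."""
    result = {}
    for concept, labels in SEMANTIC_MAP.items():
        for label in labels:
            if label in record:
                result[concept] = record[label]
                break
    return result

if __name__ == "__main__":
    system_a = {"author": "Jane Doe", "heading": "Open Education"}
    system_b = {"creator": "Jane Doe", "title": "Open Education"}
    # Both records normalise to the same semantic content.
    print(normalise(system_a) == normalise(system_b))  # True
```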
Graceful degradation
The future of interoperability standards lies in the ability to degrade gracefully: to accept errors, to accept user flaws.
We should learn from how HTML and web browsers emerged. I would argue that the main reason the "internet" is a commodity today is that early web browsers handled errors gracefully, were forgiving, and tried their best to meet the intentions of the authors of the data. Such a pragmatic approach is the future of interoperability standards – that is, if they are descriptive. Machine-to-machine communication, however, should not be so pragmatic. The sketch below shows what such forgiving processing of descriptive data might look like.
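As a minimal sketch (the fields are hypothetical), a reader of descriptive metadata can repair or default flawed input rather than reject it, much as early browsers rendered imperfect HTML:

```python
# Hypothetical sketch of graceful degradation: never fail on bad input,
# do our best to honour the author's intent, and lose nothing silently.

from datetime import date

def read_record(raw: dict) -> dict:
    record = {}

    # Accept a missing title, but default it so a human can fix it later.
    record["title"] = str(raw.get("title", "")).strip() or "Untitled"

    # Try to parse the date; degrade to "unknown" instead of failing.
    try:
        record["date"] = date.fromisoformat(str(raw.get("date", "")))
    except ValueError:
        record["date"] = "unknown"

    # Keep unrecognised fields so no information is silently lost.
    record["extra"] = {k: v for k, v in raw.items() if k not in {"title", "date"}}
    return record

if __name__ == "__main__":
    print(read_record({"date": "2010/09/22", "keywords": "interoperability"}))
    # -> title defaults to "Untitled", the malformed date degrades to "unknown",
    #    and the unknown "keywords" field is preserved under "extra".
```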