A standard for openness?

By Rick Jelliffe
July 7, 2008 | Comments: 8

In public discourse and public policy, "open standards" are now a Good Thing (in the sense of 1066 and All That). However, the more that "open standard" is deemed good and important without having a common meaning, the more that interested parties will attempt to stretch its meaning one way or another. Stretched one way, the term admits actual royalty-bearing (RAND) standards as "open standards"; stretched the other way, it requires open source (or even free) implementations.

The Wikipedia entry for Open Standards shows the variability in the definitions of the term. Most pressingly, it has examples of legislation that uses the buzzword "open" but in very different ways.

It is an issue with a lot of angles. For example, the Digital Standards Organization (another fakey body from the people who brought you NOOXML.COM) is very concerned with ideas of "vendor capture." At the other end, I have long been calling for "verifiable vendor-neutrality" which takes into consideration both crypto-monopolization and cartel-ish exclusion, in particular in Is our idea of open standards good enough? And the Open Source Initiative has an angle too, which is that an open standard is something that allows an open source implementation.

In a related discussion on this at ConsortiumInfo.org, I suggested that an ISO standard giving definitions for different kinds of openness would be useful. The need for this became clearer only today with the reported debate in Europe about the draft European framework for interoperability. (I trust my readers are sophisticated enough to see past trivializing words like "spat"!)

Openness is a motherhood term now, so of course there will be surprises and debate about what kind of motherhood we actually mean. My opinion, for what it is worth, is that RAND-z/RF licensing is necessary but not sufficient for openness, and that governments embarking on an open standards policy need to put in place some patent-limitation plan which would bring existing, market-dominating, royalty-bearing standards into the RAND-z fold by, say, 2010. This will necessarily mean that some successful consortia, and even some SCs within ISO, will have the ground cut out from under them: tough titties.

So what would such a standard contain?

I agree with a comment Andy Updegrove made about the idea, to the effect that pinning down a meaning for "openness" might be futile and that concentrating on enumerating the various categories would be the way forward.

So, concretely, I think it would be useful to have:


  1. An enumeration of the various qualities that are roped into different ideas of openness. (See here for Updegrove's example.) RAND-z and RF fit here.
  2. A grouping of these into some distinct blocks that reflect typical community positions. I would make a distinction between open source, open standards and open technologies (where an open technology is one where there is an open standard, truly open licensing (see below) and open source implementability).
    In particular: the OSI Open Source Definition, the GPL, the W3C license, RAND terms and NDAs.
  3. A minimal set of pre-fab licenses for the major different classes of openness, so that consortia and so on can easily cut and paste. Again, the Open Source Initiative is the model here: I'd see this as a way of trying to encourage their project (without stepping on their toes).


I have been critical of some aspects of the use of "open" in the past: see Is our definition of open standards good enough? where I put forward the idea of verifiable vendor neutrality which would allow checks against both crypto-monopolization and cartel-ish technology-exclusion behaviours.

But I have two other concerns about the current kinds of licenses and promises at consortia and by the global corporations.

The first is that the licenses are typically made only in respect of particular standards: so if company A has granted a license for standard B, and you implement standard B, you are OK; but if you use the same technology in implementing standard C, you are not covered.

The second is that even if company A has granted a license for both standards B and C, and you implement either, you still may not actually be covered. This is because the grant is typically made only over "necessary" or "essential" claims. That does not mean that optional parts of the standard are uncovered, I should stress; what it means is that you are licensed to use an IP-encumbered technique only where there is no other way to implement the standard.

I don't think either of these is nearly good enough to sustain the kinds of openness that people think they are buying into. Think about how the anti-aliasing patents have held up open source software, for example. The public policy goals that the drive to openness is aiming at will not be met if, at every turn, implementers of standards are forced to pick the lower-quality implementation technique.

Of course, in general you only expect suits over IP when someone has some money. I am not predicting any kind of future where IBM sues a small developer for implementing something that IBM has openly licensed for ODF but not OOXML, or where MS sues some small developer for using a method MS has a patent for but which was not "essential" to use. But it is certainly an issue for public policy makers to consider, IMHO.

But whenever I have mentioned in public forums that the essential-claims provisions seem to mean that if there are multiple implementation techniques you are not covered, I am often rewarded with blank stares. I think this is because the drafting lawyers treat the problem of making sure that the open standard can have open implementations (which is what the public policy agenda is) as being the same as making sure that the open standard can have at least one open implementation (which is not the same thing, because different implementation strategies vary in their performance or quality properties).
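
To put that distinction a little more formally (the framing below is mine, not wording taken from any actual license grant), the difference is one of quantifiers:

    What implementers assume the grant means:    \forall t \in Techniques(S) : licensed(t)
    What an essential-claims grant guarantees:   \exists t \in Techniques(S) : licensed(t)

The second statement is satisfied even when every attractive implementation technique for standard S remains encumbered, which is exactly the gap I am complaining about.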

What is needed for openness is open licenses under which all of the technology being granted is usable in any open standard, whether or not there are alternative implementation strategies.


8 Comments

I think that a comprehensive definition of "open standards" is long overdue myself, and would definitely second the call here. My fear, however, is that such a definition is likely to be as hotly contested as the word "organic" is in the agricultural environment, because there are any number of companies (and even standards bodies) that want to get all of the marketing goodness of "open standards" without necessarily conforming to a litmus test of that term.

Would either ODF or OOXML become an ISO specification if they had to conform to a formal definition of open standard? I suspect that OOXML probably wouldn't, whereas it would be a crap shoot with ODF (my suspicion is that it would probably pass in some definitions and fail in others).

One of the benefits of licenses is that they do provide (usually) fairly stringent litmus tests as to whether something does or does not qualify as an open source application of type X. Perhaps the approach here should be a metalicense that provides a legalistic definition of a given standard. This way, if I saw that a given specification was considered to be an OSML 3 (open standards meta-license) for ECMA but only an OSML 1 for ISO (with the numbering increasing based upon the levels of conformance that a given standard satisfied), then it would be easier for customers to quantify whether a given technology fell within their legal comfort zone.

Of course, this is all probably wishful thinking, but it can't hurt to at least propose it and see what other people think.

Rick, I've added a citation for this blog in the 'Articles' section of my summary document on "Open Standards." With a bit of coaxing, I might be willing to update that document some...

However, as you can see from the "Preliminary Stuff", written in 2004, I've never been very optimistic about finding enough common ground to warrant further work on agreed definitions.

I wrote there: Nobody admits to the goal of creating a "closed" standard or to having a "closed" process. Since corporate entities of all kinds -- convicted monopolists, industry cartels acting as self-appointed patent licensing authorities, pure-IP patent farming collectives, and accountable standards bodies -- all characterize their operational frameworks as "open," the value of "open" as a marketing claim is seriously diluted.

Kurt, Robin: Yes, I think no-one expects a satisfactory monolithic definition for "openness". So the point of the standard would be to make this really explicit.

You know that Woody Allen movie where he said his parents could argue about anything, for example: "The Atlantic is mightier than the Pacific." "No, the Pacific is mightier than the Atlantic!"

So the idea would be to corral the various meanings that exist in the real world from all parties, perhaps giving them a number if words are too troublesome. But certainly it would not be an aim to usurp established meanings (such as free as in liberty) or even to say one amount of openness is better than another: that is out of scope.

For example, I don't agree that having multiple FOSS implementations of a standard is a pre-requisite for calling it an open standard. And I don't think it would be a useful concept for new standards where there has not been time to catch up. As China says, there is always change.

But I have been told of one country which tried to put in openness as a procurement requirement, then withdrew it for something excessively soft, because the original drafters had actually sneaked in (so it was described to me) a much stronger definition of openness. Rather than politicians agreeing on "openness" and then allowing it to be parlayed one way or another because of the amount of give in the term, I suspect that clearer levels or degrees of openness would remove some discretion.

So a policy maker can say "We want openness, the kind of openness that is ISO Level 3 now, but going to ISO Level 4 in 2010," or whatever.

Cheers
Rick

I have a redefinition of "open standard" at the first public draft stage, the Universally Accessible and Interoperable Specification v. 0.01. It is far from complete but is to the point where concepts can be discussed and refined.

It is most specifically intended as criteria for government ICT and procurement officials to use in selecting standards. It has deliberately not been bent to accommodate any existing standard, with the notion that government IT and procurement officials would set a timetable for vendors to bring their standards into compliance.

It is also intended as a candidate successor to the various definitions of an "open standard" currently in use by different governments to push change; all of the definitions I have reviewed are inconsistent with governing competition law.

The draft specification is heavily footnoted with explanatory commentary, including citations and links to source materials. To the best of my knowledge, it is the first published attempt to encapsulate the international law governing standards-based interoperability. The accessibility and communications protocol provisions are the most underdeveloped, with the latter not being addressed at all at this point.

Feedback is more than welcome. I intend to maintain control of the document to the point where it is a useful draft and then hand it off to a suitable organization for maintenance and refinement.

Marbux: I think the term "universally accessible and interoperable" is useful, and so are the categories.

I think there is a slight logical problem with speaking in terms of outcomes rather than qualities. In concrete terms, no format can be considered universally accessible and interoperable, because that depends on the configuration of the recipient system, not just the format.

That is why I would make a distinction between formats (where the openness is a result of process and deliverables) and technologies (where the openness relates to outcomes). For example, if MS does not support HTML 5, it might be considered an open format that is not an open technology (in the sense of being universally accessible and interoperable), IYSWIM.

It seems to me that conflating a format standard (i.e. a document) with a deployed technology (e.g. RAND-z cross-platform, FOSS available) will lead to people talking at cross purposes (standards people will have to find a new name for "standard" to mean "standard independent of implementation or deployment".) And governments might prefer to say "Adopt open technologies where they are available, prefer open standards where they are available otherwise, prefer open source otherwise" or whatever hierarchy fits their preferences.

And, as I have blogged many times, the *kind* of interoperability you get may be different from the kind you expect. For example, differences in typesetting engines and font metrics on different systems cause different word-, line- and page-breaks.

What do you think about the fact that OpenOffice already claims to support the new ISO ODF 1.2 version?

[quote]OpenOffice.org 3.0 already supports the features of the upcoming version 1.2 of the ISO standard OpenDocument Format (ODF)[/quote]

I assume that OASIS has not submitted this version to ISO yet. I guess ISO is expected to rubberstamp the new ODF version, as otherwise OpenOffice could hardly be supporting it already.

hAl: Patrick Durusau told me that ODF 1.2 is now in feature-freeze stage. So it may well be that Open Office supports all the features already. I gather the names and effective schema are fairly settled too.

However, the review work on the fixes for ODF 1.0 and ODF 1.1 is still under way, and I expect that ISO SC34 will also have some corrections and additions when it goes there: OASIS ODF TC should not expect a lower bar than the one set for ECMA 376, which will mean a lot more scrutiny and detailed comments.

That being said, it would be a mistake to say that ODF 1.2 represents what the new version of Open Office has: I hope the direction goes from stakeholders to committee to implementers.

Rick and hAl:


It is accurate that OpenOffice.org implements at least some features of the draft OASIS ODF 1.2. If you check the new bibliographic feature, you are looking at an implementation of the RDF metadata in the current draft.


There are trade-offs on implementing features of draft standards. On the one hand, concurrent implementations can go a long way toward troubleshooting a draft spec. Also, SDOs such as W3C and OASIS commonly require, as exit criteria, that there be one or more implementations prior to transition from draft to final standard. There is some basis for such requirements. E.g., should drafts attain the legal status of a standard if there are no implementations? The existence of implementations provides assurances that a standard is in fact implementable and that it will actually be used. For example, more than a year ago the OASIS eContracts TC produced what looks to be a very nice XML specification for automated document assembly. Should it have been adopted as a standard notwithstanding that there are no implementations?


On the other hand, those who implement draft standards have incentives to resist changes in the drafts, so there is built-in tension with the principle that the standards dog should wag the implementation tail rather than vice versa, and that the quality of the standard should be first and foremost. E.g., Microsoft has joined the ODF TC to work on ODF 1.2; will Sun and IBM be eager to rewrite the draft to accommodate Microsoft's needs given that they have large programming investments in the draft as it stands? Somehow I suspect that this is an area of potential disagreement that will need to be resolved at JTC 1 rather than at OASIS.


Rick:


On the distinction between formats and technology, I agree that UAIS v. 0.01 needs tweaking in that area. I suspect that a more appropriate definition of "universal" would be a good place to start. I've struggled with that. My notion is that "universal" should not have the sense of being universally implemented, but rather that there be no more barriers to a standard being universally implemented than are necessary.


The guiding light I am trying to use in the drafting is the goal of leveling the ICT standards competitive playing field --- full vendor neutrality if you will --- along with a focus on responsiveness to market interoperability requirements.


I agree with you that the kind of interoperability one gets may vary from expectations for reasons such as those you suggest. But the governing law tells us that responsiveness to market requirements, for example through profiling and conformance requirements, is the right response in this area.


The notion that there can feasibly be a one-size-fits-all standard in my lifetime (wouldn't that be nice? :-) is simply pie in the sky. If the market wishes pixel-perfect replication, PDF is designed for the purpose. If the market wishes to recycle and repurpose editable content, then a different sort of standard/profile is needed. But the law instructs that it is the matching of standards/profiles with market requirements that is key.


In my opinion far too many interop messes have been created by focusing on implementor requirements (e.g., conformant status for application-specific extensions to facilitate the waging of feature wars) rather than on beginning the standard development process with an assessment of market requirements and which of them will be fulfilled by the given standard.


One can see signs that we may be at the dawn of a new era in IT standards, with government procurement officials rather than the vendors setting the market requirements to be fulfilled by standards. E.g., the E.U.'s Open Document Exchange Formats ("ODEF") initiative, coupled with the indications that Microsoft's support for ODF was coerced by DG Competition.


I also agree wholeheartedly that ODF should not be given any less scrutiny than OOXML; if anything it should be given even more since there seems to be an emerging consensus among government IT departments that the standards should be converged using ODF features as the core. In my book that makes it all the more important that ODF 1.2 go through a broad-ranging review at JTC 1 (and not on the PAS track). The core has to be right or we simply polish the skin of a rotten apple, so to speak.


On your patent barriers points in your parent article, I will try to address those concerns in the next UAIS draft. You have accurately pegged at least two valid concerns I did not deal with appropriately, the "essential claims" issue and the non-recyclability of portions of standards in other standards.


But I note that RAND-Z does not necessarily go far enough for free and open source licensing schemes such as the GPL and LGPL. RAND-Z is enormously elastic; e.g., it may bar sublicensing, which makes it incompatible with the GPL and LGPL. See generally Microsoft attorney David Rudin's Patent Licensing Assurances in Standards Organizations for a quick overview of RAND-Z elasticity. E.g., the OASIS RF on RAND IPR guidelines (see section 10.2.1) governing ODF allow a prohibition on sublicensing.
