Openness: the survivalist's ideal becomes the authoritarian's buzzword

By Rick Jelliffe
July 25, 2008 | Comments: 1

Looking back on Richard Stallman's History of the GNU Project, I was struck by the difference between the kind of openness sought by his free-software ideals and the kind of openness that underlies many supporters of open standards who come in from the FOSS community: survivalist versus authoritarian.

The essence I see in Stallman's freedom is a kind of personal survivalism: to be in control of the software he uses so that he is not prevented from understanding, extending and fixing the code. The right to redistribute the code and changes is an essential part of this: being free to give is, in this case, the corollary of being free to get. But the core is that no-one is limiting your freedom: no-one is prescribing what you must do. In acting out his ideals, Stallman has rightly earned the warm respect of millions of users.

Some open standards supporters, however, hold multiple standards to be anathema: standards are made to promote efficiency through network effects. This efficiency is highly beneficial socially or commercially, but the network effect is very fragile, so (the argument goes) single standards should be mandated at every turn. If we are mandating standards, then requiring implementers to support two big alternative standard technologies is no improvement over having thousands of technologies and no standards or market leader. This insistence on one way (their way) or the highway is why I term them authoritarians.

So one is a freedom from gaol, the other is the freedom of the gaol. Perhaps my long-time support of plurality might be similarly parodied as freedom to choose your own gaoler!

Obviously we are used to thinking of both survivalists and authoritarians as some mix of nuttiness and danger, and I don't want my metaphor to be taken that far. But there is a fundamental difference there: from an economic angle the survivalist wants to remove barriers to exchange and increase the velocity of exchange; the authoritarian seeks to fence off a theory-perfect market by making the goods perfectly substitutable. But they are different kinds of market: the survivalist market is one of diversity, the authoritarian market is one of uniformity.

Of course, as with all markets, competition, user habit, word of mouth, value-adds, gambles, and even minor market imperfections make it possible for market leaders to establish themselves. One of the strategies of the losers is to rally around and emulate the leading technology of the day in that area, whether Emacs or Word. Everyone is familiar with this effect.

The authoritarian tendency is to see the establishment of single standards as merely a more efficient way of getting to where the market (whether free or imperfect, monetary or non-monetary) would have arrived anyway. And the survivalists may see technological concentration on an open or free technology as a sign that their superficially individualistic approach can actually yield superior technology: this is, of course, another invisible-hand argument.

The astute reader might by now be wondering 'Rick began by opposing aspects of the FOSS and open standards approaches, but isn't he now saying that both are trying to do the same thing by different tactics: to promote free or perfect markets?' (By market, I don't necessarily mean just monetary markets: I mean the notional location or forum of exchange. And I don't mean to imply that the various parties think in terms of markets, nor that, even if they did, anyone considers markets as ends rather than means.)

In my defense, m'lud, may I plead two points?

Markets all the way up

The first is that in fact, there is an enormous problem with talking about the market: the problem is not 'market' but 'the'. I am not talking about horizontal self-contained markets at the same level, such as a market for search engines and a market for ODF-capable software and a market for web browsers; instead I am talking about vertical stacks of markets.

A technology exists in an ecosystem of other technologies, and a lack of freedom in any of the technologies in that ecosystem compromises the actual freedom of its fellows. (Which is why, for example, it is so problematic and undesirable when a standard normatively references a non-standard technology, such as the current undesirable situation with ODF's and OOXML's use of ZIP.)

For example, suppose we have a standard for brown bread rolls, and that the wise and bearded white and Japanese men who largely populate standards organizations have decided these should be 3" cubes. Governments have adopted the Isoblock (in Europe calling it the Europan, and in the US the Freedom Loaf) because it will clearly help consumers to compare price and quality, allow the bakers cheaper mass-produced baking equipment, and allow greater precision from lunchbox-makers and celebrity chefs.

So Isoloaf standardization has, in a sense, created a market; but there are other markets involved as well: the market between the Isoblock and black market non-standard brown loaves; the market between white and brown bread; the market between wheat bread and spelt bread; the market between bread and rice; the market between bread and cheese; the market between bread and party balloons; the market between spending and saving; the market between keeping money and giving to charity, and so on. Not to mention niche markets as well as mass markets.

But suppose the Isoloaf, once adopted, is then mandated by authoritarians so that no other brown bread rolls are allowed, or white bread is not allowed, or spelt bread is not allowed, or rice is not allowed, and so on. In those cases, the adoption of the standard has actually distorted the market, not perfected it!

This is the standardization dilemma for regulatory adopters of standards. In just the same way as the inappropriate embrace of the patent mechanism (i.e. state-sanctioned temporary monopolies) is now widely recognized as having a deadening effect, especially when related to software, so the inappropriate rejection of plurality by governments can also deaden. Of course, people will still struggle on, and the social engineering aspects also play a part.

It can also be argued that the current IPR regimes unfairly trap poorer nations into a technology tax with the West, and that foreign market-dominating technologies (established by any means) are only crudely localized and therefore can fail to be optimal against local requirements (the debate over global optimizations versus local optimizations is permanent, rightly so) and local culture. I don't see why the same does not apply to international standards. (Regular readers should be well and truly bored with my continual push for more participation in all standards bodies, as the role of standards gets more critical in our tasks and institutions.)

This is why I think the kind of market promotion we can read into the 'survivalist' ideals of the FOSS movement in its pre-corporate days is very distinct from the kind of market promotion that is the rhetoric of our present-day 'authoritarians'. One reduces market imperfections; the other promotes them. Different kinds of "open" indeed!

In the case of standards, we can distinguish a market between substitutable products that use the same standard, a market between different standards, a market between different standards organizations, and even a market between standard and non-standard (witness the various definitions of openness). These are the vertical kinds of markets I was talking of: all of them need to be robust and level.

Openness for dummies

The other aspect I'd like to raise is that the early survivalist ideals work well for geeks, but break down for the rest of us (err...of them...a blogger on an O'Reilly site must admit to being, to some small extent, a geek!)

The crusty boss of detectives on TV shows often requires "motive, method and opportunity." For FOSS to meet survivalist requirements, this translates to expectations, skills and time/tooling. As software has moved from the lab to the home, the relative number of users who have the skills or expertise to program, the time or tooling to do so, and the expectation that it is their job or that they can do it, steadily decreases.

As software reaches lay people, the survivalist benefits of FOSS apply less (which is not to say that there are no other benefits!): unlike the geeks, lay people have less opportunity, fewer skills and lower expectations that they can master or hack it.


However, even lay people benefit from a high velocity of exchange of FOSS by geeks. As far as survivalist benefits go, the relative disenfranchisement of non-geeks is less important than the absolute enfranchisement of hackers. However, the non-geek is probably less capable of privately adopting or adapting non-standard technology (a black market!) and more limited by authoritarian mandates.

Architected for Hackability

One of the essential issues for anyone involved in making the standards process more open (whether you are a survivalist or authoritarian, a pluralist or a puritan) is that it is participation that gives an open standard credibility, over and above the potential for openness from a workable process. In just the same way, I think an essential issue for making sure the good survivalist benefits of FOSS are not swamped by the corporate agenda abetted by authoritarians is that FOSS (and open standards intended to allow FOSS) has to be architected for hackability.

I think XML, HTML and CSS (and Schematron!) are good examples of what I mean here: technologies which come in small, easily explicable chunks, perhaps declarative, enfranchise the non-expert, hurried or skeptical user.
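To make the "small explicable chunks" point concrete, here is a sketch of a complete Schematron schema containing a single declarative rule; the vocabulary (invoice, total, line) is invented for illustration, not taken from any real schema:

```xml
<!-- A minimal sketch of an ISO Schematron schema. The element names
     (invoice, total, line) are hypothetical, invented for this example.
     Each rule is a self-contained chunk: a reader can understand the
     assertion and its plain-language message without grasping any
     larger grammar. -->
<schema xmlns="http://purl.oclc.org/dsdl/schematron">
  <pattern>
    <rule context="invoice">
      <!-- assert: flag a failure when the test is false -->
      <assert test="number(total) >= 0">
        An invoice total must not be negative.
      </assert>
      <!-- report: flag a warning when the test is true -->
      <report test="count(line) = 0">
        An invoice with no line items is suspicious.
      </report>
    </rule>
  </pattern>
</schema>
```

The declarative shape is the point: someone who knows nothing about schema languages can still read the rule, guess what it checks, and add a similar one by imitation.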

Now, promoting hackable FOSS for its survivalist benefits is not the only reason for having a technology on the standards books: for example, all market-dominating interface technologies should be QA-ed, RAND-z standards. Different considerations and trade-offs apply, and the standards world needs to be varied enough to support both (and certainly procurers and regulators need to be educated in the differences between the two).

I have participated in various standards efforts which are now used to enable or cope with or handle hundreds of billions of dollars of business, services, goods and real property. And most of the standards I have been involved in have been to some extent competitive, at the time they were made, with other standards or technologies: XML and SGML, XML Schemas and DTDs, Schematron and XML Schemas, OOXML and ODF (and, tangentially, XHTML and HTML).

To get market uptake of a standard, you need several things apart from a market requirement: you need an off-ramp strategy to allow users of legacy technologies to extricate themselves from those technologies (standardizing them is one option!); and you need to provide on-ramps to your technology so that users don't face big risks and can adopt the technology incrementally. One of the primary ways of providing workable off- and on-ramps is modularity and focus: targeted scopes.

(Now, I did want to say "simplicity" but the simplicity of specifying something has little relation to the simplicity of implementing it. It is easy to say 0.34 + 0.66 = 1.0, but consider the complexity of actually computing this with binary arithmetic (non-symbolically)!)
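The gap between decimal simplicity and binary implementation can be seen in any language with IEEE 754 binary floats; a minimal Python sketch:

```python
from decimal import Decimal
from fractions import Fraction

# In binary floating point, most decimal fractions are not exactly
# representable, so sums that look trivial on paper pick up rounding error:
print(0.1 + 0.2)           # slightly more than 0.3
print(0.1 + 0.2 == 0.3)    # False

# Computing in decimal (symbolically, in effect) keeps the arithmetic
# as simple as it looks when written down:
print(Decimal("0.34") + Decimal("0.66"))   # exactly 1.00

# Exact rational arithmetic likewise gives the answer the notation promises:
print(Fraction(34, 100) + Fraction(66, 100) == 1)   # True
```

None of this is hard once you know it is there, but it illustrates why a specification that is one line long can still demand real care from implementers.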

At my work, we have a common phrase, the Bxxxx Mxxxxx effect, named after an old customer of ours. (I am blanking out the name, not because there is anything wrong with this effect, quite the reverse, but because I haven't asked his permission to use it.) We had a project for a large institution to move over from a proprietary system to using markup with a custom schema; when delivery came, it turned out that the people who had ordered the system had left, and the remaining people in the organization, from data entry to management, were entirely skeptical of (some even antagonistic to) the move.

As it turned out, the project was a success. But one of the factors in its success, which we hadn't anticipated, was that moving over to a markup-based system (e.g. to XML) enabled Bxxxx Mxxxxx, who was a new manager, to poke around and hack things; it enabled him to intervene and therefore to manage effectively. He could look at the markup in scripts and get an idea of the functionality without needing to understand the whole. The Bxxxx Mxxxxx Effect is nothing but the survivalist benefit that comes from a technology being approachable enough to be within the competence of the non-geek (who becomes a quasi-geek thereby, I guess!)

I think people are more used to going in the other direction: a technical standard is bad when it disenfranchises normal geeks (in favour of an elite or minority), rather than good because it enfranchises non-geeks. You can see this in the Chinese argument that UOF is preferable to ODF because UOF uses Chinese element names, or in the argument that ODF is preferable to OOXML because ODF doesn't use structured attributes. These are certainly legitimate points in the mix (though, to repeat, the rationale for standardizing a market-dominating technology may be quite different from the rationale for standardizing a new technology).

Back to Richard Stallman: I think it is no coincidence that GNU Emacs is highly architected for hackability, with its major and minor modes system and its Emacs Lisp. Similarly, the GNU utilities are small, targeted programs that can be readily understood, fixed, enhanced and even replaced (in the non-monetary marketplace of utilities!)

So what is a technology that is standard and not architected for hackability (or hackable implementations)? I would say W3C XML Schemas is one. OOXML is another. In these cases, rather than thinking in terms of on-ramps, we need to think in terms of crossroads: how can their problems be neutralized to increase layering, modularity and integration with other standards? (In the case of XSD, firstly by moving complex type derivation out into a subordinate optional standard; in the case of OOXML, by taking advantage of OPC and allowing simultaneous formats, which I notice is an approach some Chinese participants have suggested for ODF/UOF convergence, to allow pluralism and modularity.)

In the case of office applications, I suggest that being architected for hackability by non-geeks requires building on the ability of XML to provide and format user-comprehensible structures and labels. I have raised the example of Smart Art before, which provides the structures but not the labels, and it is this kind of non-geek enabling which open standards need to be concerned with.

Niche standards for special industries by industrial groups don't need to be concerned with this hackability; standards that document market-dominating interface technologies to reduce market barriers need it a little; but open standards intended for mass adoption need it a lot!


My blog last week, 'What are they so scared of', raised the importance of participation by user representatives at ODF and the usefulness of ISO review of ODF; ODF is not one of those niche standards where only the industrial group (i.e. of vendor/developers) should expect to (or be allowed to) reign.

I was surprised to see that several sites quoted my summary of my point but cut out half of the phrase. In effect this suggested I was making a point I had specifically disavowed: that OASIS or Ecma standards are illegitimate, or that consortia are not appropriate forums for standards development; in fact I wrote:

I am not presenting them as either/or choices but complements neither of which automatically give openness and both of which require participation..."Without participation, openness is empty."

The quote given at the ConsortiumInfo site was

The core idea is that JTC1's process, based on National Body voting is...more genuinely open, because it is impossible to stack either directly or indirectly

and I was pleased to see that Rob Weir at least didn't leave half of it out (I expect Andy Updegrove cropped it for fit and sensationalism rather than from any sinister motive!)

JTC1's process, based on National Body voting is both effective ... and more genuinely open, because it is impossible to stack either directly or indirectly.

but a casual reader would have missed that I was talking about standards review, not standards development or standards balloting (not that I think the ballot process was compromised in any material way).

The openness of the national body system, where vendors/developers do not have a direct vote, is more appropriate for review than the consortium system, where vendors/developers have a first-class seat at the table (or have almost all the seats at the table and set the agenda!). And the effectiveness of the National Body system can be seen, for example, in the extent of the issues raised during the OOXML ballot, even discounting the parroted issues. My sentence before (well, two before) in the same paragraph makes this context clear:

It is JTC1 as an audit or QA process...

On the issue of participation, I was also pleased that Groklaw has included in its news the (month-old) call from Weir for participation in an ODF interoperability and conformance effort, which needs good participation just like the ODF TC itself: I don't know how many people subscribe to it, but it is worth publicizing again.

Just as with the ODF TC, and the various JTC1 SC34 working groups getting involved in maintenance, the opportunity to participate is there. These are golden opportunities for real openness.

Open standards, made with open processes and real formalized independent review, themselves represent a good strategy for the non-geek survivalist (individual, community or institutional) to gain a measure of influence over the software they use. Not the complete control the alpha geek gets from having his own code base, certainly, but not the complete lack of control created by the market distortion of mandating single standards that act as barriers to freedom rather than conduits of it, either: modest survivalism for the rest of us!

Randy Scouse Git.

My generation found out about it during a culture war. This one during the browser wars. The hunted become the hunters, the urge to merge becomes the scourge. The Open Hand and all that.

I guess many readers won't get it. The trick of getting out of Chinese handcuffs is not caring if the other finger manages it. It's understanding no one gets out if only one is pushing. Classic tit for tat.
