An Update On Standards

Nicholas Stoughton, USENIX Standards Liaison
nick@usenix.org

This issue of ;login: is about operating systems, which got me thinking about the role of standards in that context. Many parts of an operating system are invisible, or at least only indirectly visible, to the applications that run on it. An application doesn't care about the virtual memory implementation, or much about the details of how the file systems work. It cares about the interfaces through which it asks the system to do something on its behalf.

Standards are often all about interfaces; they are a contract between a user and a provider of that interface. A standard draws a line in the sand and promises certain behavior about everything that crosses that line. From an operating system standard's point of view, that line is often at a higher level than the strict "kernel-mode" line used by an OS developer. Most, if not all, of libc, for example, is regarded as part of the system, or "implementation" in standards-speak.

And it is standards that make an OS useful ... if every application has to be ported to a new OS, it will have a much slower uptake, and indeed may never escape the laboratory. But an OS that promises the same set of interfaces that everyone else has will get immediate traction. It may well offer new and innovative things, but these are unlikely to ever be enough to bring the world with it without the standard interfaces.

There are really only two OS standards that matter at the moment, and one of those is only a de facto standard, by virtue of being in the right place at the right time when it was first released. POSIX started life as a standard in 1986, when it was first released as a trial-use standard by the IEEE (and the very first editions called it "IEEE-IX"!). It has grown to the point where it is the core of almost every system not sold by Microsoft, and even they support it.

But do standards choke innovation? Do they stop OS designers from bringing us new, better, interfaces? No; POSIX (and other related standards) allow an implementation to add new interfaces. And while it promises the behavior of the interfaces specified, it says little or nothing about the performance of those interfaces. That is left as a "quality of implementation" issue. The standardized APIs may provide a bridge to allow applications to migrate to a new OS platform, where they can, if they want to, improve in any way that the new OS allows.

Standards do need to be continually revisited, updated, and revised. They should not be static, permanent, and unshifting; a standard like that is an indication that it is not being used. So-called "stabilized" standards are as often as not historic documents rather than handy works of reference. POSIX is undergoing a revision, as regular readers of this column will be aware. Chipmakers are no longer focusing on how they can make a single processor go faster and faster, but on how much parallelism they can pack into a single package: how many cores can you get onto one chip? Modern operating systems need to be looking at how best to exploit that parallelism, and in many cases that means giving the application more opportunity to use multiple threads.

POSIX is not the only standard currently being revised. C++ is also going through its first full-scale revision. This revision will probably end up including explicit language-level support for multiple threads, possibly with some library support. There is still considerable argument as to how best to achieve this, and it is certainly not yet cast in stone. However, it would be very surprising if this area were not a major part of the new C++ standard when it is issued.

At the same time, the C language, which underpins both POSIX and C++, has remained unrevised since 1999. It is still very much a single-threaded language: there is nothing in it (apart from the volatile keyword) to acknowledge that there may be more than one thread of execution. The memory model that results from this single-threaded mindset is often the cause of subtle, extremely hard-to-find bugs in applications.

Revising C has not been on the table since C99 was published, over seven years ago. However, pressure is starting to build in the community for better support. The two most popular C compilers, gcc and Microsoft's msvc, both have many, many extensions beyond the underlying C99 standard's requirements, as do most of the others. Almost any serious application ends up using some of these extensions (sometimes without the programmer even realizing that an extension is involved). So it is possible that C may also end up on the revision table within the next year or two.

As always, I welcome your feedback on these matters.