Monday, July 21, 2008

Standardization is a Bad Goal...

Recently I've hit upon how to express something that I've learned and worked with over many years: standardization is a bad goal. I know that standardization is something that management, both in software development and out, loves to focus on and push. However, I've often seen it cause more harm than good. There is a much better way to phrase the goal, with "standardization" taking its proper, subservient role.

The key here is that one word: "goal". Standards themselves are not inherently negative. The damage is done when perspective is lost and they become the goal itself rather than simply a means to one. A perfect example of this is military underwear.

In the Eighties, the US Army realized that much of its purchasing standardization was getting in the way of achieving the mission, and made efforts to reform it. Standard-issue underwear was just one example. In pursuit of consistent quality, the requirements for underwear had been written into pages upon pages of military specifications. Instead of delivering quality and efficiency, however, over the years those specs had locked the product into the state of the art of decades past, pushing up prices and limiting supply. When the process was redone with a simple focus on the actual goals, not only did the quality of the 'equipment' supplied to the troops go up, but the price came down.

When it comes to software development, most often there is an unstated goal hiding behind the calls for standardization. The true goal is not really standardization at all; I believe it is most often interoperability. When a piece of software implements a "standard" interface (e.g. RFC 821, the original SMTP specification), the goal is that different pieces of software running on different types of computers altogether can talk to each other, aka "interoperate".
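To make that concrete, here is a minimal sketch (nowhere near a full RFC 821 implementation) of the command sequence a client sends for one mail transaction. Any conforming server accepts these same verbs, so any client can talk to any server; the shared interface, not shared code, is what makes them interoperate. The function name and addresses are hypothetical.

```python
def smtp_session(helo_domain, sender, recipient, body_lines):
    """Return the lines an RFC 821 client would send for one message."""
    lines = [
        "HELO %s" % helo_domain,
        "MAIL FROM:<%s>" % sender,
        "RCPT TO:<%s>" % recipient,
        "DATA",
    ]
    lines.extend(body_lines)
    lines.append(".")   # a line containing only a dot ends the message body
    lines.append("QUIT")
    return lines

# The same sequence works against any RFC 821 server, whatever
# software or operating system it happens to run on.
for line in smtp_session("client.example.org",
                         "alice@example.org",
                         "bob@example.com",
                         ["Subject: hello", "", "Hi Bob."]):
    print(line)
```

The point is that nothing above cares what program is on the other end of the socket, only that it speaks the same protocol.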

Another reason to standardize things in software development is to allow for people new to a project or team to get up to speed quicker and be able to contribute sooner. I would argue that this, too, falls under the more general goal of achieving interoperability, but on a personnel level.

Of course one of the more recognized problems "standardizing" things causes is in limiting the effectiveness of developers. But there is another often overlooked issue: monoculture (and specifically software monoculture). If all developers are on a single version of a single operating system, then the chances of code problems going undetected increase. Even moving to a new version of the same operating system can expose latent bugs. Time and again I've seen the quality of a project go up with an increase in the variety of developer platforms and tools employed.

Monoculture often infects software development in the name of standardization. When a build system needs to pull code from several sub-projects and put them together, all of the sub-projects need to be able to play nice with the build system. Often management might try to guarantee this by declaring a requirement that all developers "standardize" on a single programming language and a single IDE tool on a single OS platform. Yes, this will reduce problems with setting up the builds, but at what cost? This approach can definitely be called the tail wagging the dog.

The focus needs to shift off of standardization and onto interoperability instead. One should leave the choice of language up to what is most appropriate for the project, based on common factors including target platform, delivery, maintenance, etc. The choice of tool should be left to whatever allows individual developers to be most productive on whichever projects they get assigned to; it may differ from developer to developer (and of course may change even during the course of a single project). Finally, a build system should be chosen to get projects built and delivered as needed. In all of this the requirement for interoperability needs to be explicitly stated and stressed.

A good example is with Java development. There are several common choices of development tool, with IntelliJ IDEA, Eclipse, Emacs and NetBeans among the more popular. There are also many OS's that are used as developer platforms, with Linux, Windows and OS X among the more common of these. Despite being very different tools, all of these allow a developer to program in Java. Also, most support the common build systems of Ant and of GNU Make in addition to their native project format (and where the tools don't, Ant or Make can be made to support them).

So instead of battling over the various trade-offs of *.ipr files versus .project files, the "build people" and the "engineering people" of a group can standardize on either GNU Make or Ant. Then the only requirement for developers is that their workflow be compatible with the project build system, leaving the choice of specific tool or tools up to the needs of individual developers. Even source control can be mix-n-match. Inkscape's official source repository uses Subversion, but some developers use other interoperable (there's that word again) tools such as SVK, git and others.
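As a sketch of what that shared contract might look like, here is a hypothetical minimal Ant build.xml (the project name and directory layout are invented for illustration). Any IDE that can invoke Ant, or a developer typing `ant` at a plain shell, builds the project the same way, regardless of which editor produced the source.

```xml
<!-- Hypothetical minimal build.xml: the build system is the shared
     interface; the editor or IDE behind it is each developer's choice. -->
<project name="example" default="jar" basedir=".">
  <property name="src" location="src"/>
  <property name="build" location="build"/>

  <target name="compile">
    <mkdir dir="${build}/classes"/>
    <javac srcdir="${src}" destdir="${build}/classes"/>
  </target>

  <target name="jar" depends="compile">
    <jar destfile="${build}/example.jar" basedir="${build}/classes"/>
  </target>

  <target name="clean">
    <delete dir="${build}"/>
  </target>
</project>
```

Whether a developer's tool of choice reads this file natively or simply shells out to Ant, the result delivered to the build system is the same.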

Similar comparisons can be made for C and C++, and with Inkscape we see quite a bit of it. Emacs, vi, VisualStudio, Anjuta, Eclipse, KDevelop, vim, gvim and more have all been seen in use by different contributors. Developers don't have to use a standard IDE, but they do use interoperable IDEs and workflows.

To sum up, don't be foolish; clarify your actual goals. As Emerson said, "A foolish consistency is the hobgoblin of little minds."

1 comment:

You are not required said...

standardization is bad for creativity