Rebel Code - Linux and The Open Source Revolution 1st Edition-2002 by Basic Books - Glyn Moody
ISBN-10: 0-7382-0670-9
0738206709-FM.qxd 9/4/03 10:19 AM Page i
REBEL CODE
[ The Inside Story of Linux and
the Open Source Revolution ]
GLYN MOODY
To My Family
Many of the designations used by manufacturers and sellers to distinguish their products are
claimed as trademarks. Where those designations appear in this book and Basic Books was aware of
a trademark claim, the designations have been printed in initial capital letters.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or
transmitted, in any form or by any means, electronic, mechanical, photocopying, recording, or other-
wise, without the prior written permission of the publisher. Printed in the United States of America.
A CIP catalog record for this book is available from the Library of Congress
ISBN 0-7382-0670-9
Books published by Basic Books are available at special discounts for bulk purchases in the U.S. by
corporations, institutions, and other organizations. For more information, please contact the Special
Markets Department at The Perseus Books Group, 11 Cambridge Center, Cambridge, MA 02142, or
call (617) 252-5298.
2 3 4 5 6 7 8 9 10
CONTENTS
Afterword 324
Index 332
NOTE TO THE
PAPERBACK EDITION
Any book that deals with a fast-moving subject like free software is
doomed to be out of date the moment it appears. But as the first edition
of Rebel Code went to press at the end of 2000, the world of open source
was changing even more rapidly and dramatically than usual. The dot-
com boom years were coming to an end, and the market conditions that
had allowed open source companies to prosper so spectacularly were dis-
appearing.
The late 1990s were an exciting time for the free software world, and,
despite the subsequent financial turmoil, their deeper achievements have
proved enduring. Rather than rewrite the history of that period to take
account of the later economic downturn, I have chosen to leave the main
text of Rebel Code largely unchanged to reflect an important moment in
computing history. I have, however, added a short Afterword that at-
tempts to place those events in a broader context.
Glyn Moody
June 2002
ACKNOWLEDGMENTS
Unless otherwise noted, the vast majority of quotations in this book are
drawn from interviews conducted between September 1999 and Septem-
ber 2000 in person, by telephone, or by e-mail. These have been supple-
mented by extensive material taken from an interview with Linus
Torvalds at a critical juncture in his life, in December 1996, as well as
other interviews with key players from the last three years.
I am particularly grateful to all these people, who somehow managed
to take time from their important work and hectic schedules to talk to
me, often at great length. My only regret is that because of space con-
straints I was not able to introduce more of their memories, thoughts,
and comments.
Despite this generous help, I am conscious that there will be mistakes
and omissions in this book, for which I alone am responsible. I would be
happy to receive corrections as well as general comments on the text at
[email protected].
The 1996 interview with Linus Torvalds was carried out for a feature
commissioned by Sean Geer, editor of the UK edition of Wired magazine.
The feature eventually appeared as “The Greatest OS That (N)ever Was”
in the August 1997 issue of the U.S. edition of Wired, where it was shep-
herded through by Jim Daly, a features editor at the time. Jim kindly sug-
gested my name to Jacqueline Murphy of Perseus Books when, in 1999,
she was looking for someone to write a book about GNU/Linux and
open source. My thanks go to him, and even more to Jacqueline, my edi-
tor, who not only followed up the suggestion but also persuaded me that
I should undertake this project. I am also grateful to Jacqueline’s assistant,
Arlinda Shtuni, for her constant help; to Marco Pavia for steering the
book through production; and to Jennifer Blakebrough-Raeburn for her
sensitive copyediting.
Others who have provided key assistance along the way are David
Croom, who offered valuable advice early on, and two people who
kindly read draft versions of the text and made helpful comments, Anna
O’Donovan and Sean Geer.
Finally, I must record my deep gratitude to my wife and family, with-
out whose constant support this book would not have been possible.
—Glyn Moody
PROLOGUE
Outside, a louring Seattle sky broods over the clumps of squat white
buildings scattered around an extensive campus in constant expansion.
Neat lawns, assiduously tended flowerbeds, and tidy ornamental
ponds create a mood of cloistered reflection and tranquillity.
Inside, a similar calm reigns in myriad small offices where young men
and women toil diligently. The silence is broken only by bursts of clatter-
ing keys; hardly a word is exchanged, as if a stern vow were in force. And
yet despite a conducive environment and comforting faith, there is unease
among the rooms’ inhabitants, a rising tide of something close to fear.
They know that a terrible ghost is abroad in the cloisters of Microsoft.
The ghost has a name: open source. Its characteristics have been
meticulously detailed by two of the company’s expert ghost-watchers in
a pair of lengthy memos. Though marked “Microsoft confidential,” they
surfaced outside the company and were published on the Web, suitably
enough during Halloween 1998. Forced to concede that the memos did
indeed originate from within the company, Microsoft dismissed them as
the private speculations of a couple of engineers.
As he read the memos describing this crazy phenomenon, Bill Gates
must have shuddered in recognition; it was as if a spirit from the past
had tapped him on the shoulder. Gates had sought to exorcise the ghost
of free software over twenty years before.
In 1976, Gates had published what he called—with what would prove
deep irony—an Open Letter to Hobbyists, addressed to users of the first
personal computer, the MITS Altair. Gates and Paul Allen, the other
founder of Microsoft, had written a version of the Basic (Beginner’s All-
Purpose Symbolic Instruction Code) language that would run on this
and wore neat, clean clothes, but had a respectable job, was married, and
was a father.
There could be no better symbol for the new generation of hackers
who are turning open source into a powerful force in today’s computing
world. They are the heirs of an earlier hacking culture that thrived in the
1960s and 1970s when computers were still new, a community that be-
lieved software should be shared, and that all would benefit as a result.
This was the ethic that Bill Gates had lambasted in his Open Letter to
Hobbyists. Microsoft and a few other key companies at the time, notably
AT&T, were largely responsible for nearly extinguishing the old hacker
spirit and replacing it with a view that redefined sharing software as
piracy.
Thanks to the advent of relatively low-cost but powerful PCs and the
global wiring of the Net, the new hackers are immeasurably more nu-
merous, more productive, and more united than their forebears. They are
linked by a common goal—writing great software—and a common code:
that such software should be freely available to all. Hackers rebel against
the idea that the underlying source code should be withheld. For them,
these special texts are a new kind of literature that forms part of the
common heritage of humanity: to be published, read, studied, and even added
to, not chained to desks in inaccessible monastic libraries for a few au-
thorized adepts to handle reverently.
As a result, the open source movement poses a challenge not just to
Microsoft but to the entire software industry. And perhaps beyond. For
as the Internet moves ever closer to the heart of the modern world, it
inevitably carries with it the free programs that drive it, and seeds the
values that led to their creation. Its basic code of openness, sharing, and
cooperation is starting to spread outside the confines of one or two high-
tech industries. Many now believe that this potent combination of the
Net with open source has more than a ghost of a chance of thriving in
the current post-Microsoft era.
If 1998 and 1999 were the worst years in Microsoft’s history,
1991, by contrast, must have been a period when Bill Gates was feeling
good. Windows 3.0, launched in May 1990, was a growing success; in
the first year alone, 4 million copies had been shipped, a huge number
for the time. In May 1991, Microsoft launched Visual Basic, a radically
new way of programming that employs visual design methods rather
than traditional ones based on editing text files.
Even better, Windows 3.1 was close. Despite being only a point upgrade,
version 3.1 represented a major advance over 3.0 in almost every way.
Microsoft claimed that it contained over 1,000 enhancements. Its cool new
user interface seduced almost everyone who saw it.
When Windows 3.1 shipped in April 1992, it cemented Microsoft’s
dominance on the desktop. It also created a discontinuity in the software
world as companies switched from DOS-based programs to those run-
ning under Windows. Microsoft was able to exploit this crossover to
wrest leadership in the spreadsheet and word-processor sectors through
early graphical programs such as Excel and Word.
Windows 3.1 was not the only major operating system nearing com-
pletion in 1991. Microsoft Windows New Technology, better known as
Windows NT, had been started back in 1988 in a drive to create an enter-
prise-level operating system that would be as widely deployed in compa-
nies’ back offices as MS-DOS and then Windows were in the front offices.
The Windows NT project was headed by Dave Cutler, who had built the
operating system VMS for the computer giant Digital, formerly known as
DEC. VMS was a rival to Unix, another robust operating system, but was
an official corporate product, unlike Unix, which had always been re-
garded as software for hackers. Windows NT, too, was unashamedly
meant as a Unix-killer when, against expectations, Unix—not VMS—
emerged as the leading enterprise operating system.
NT’s chances looked good. By the late 1980s, Unix was highly frag-
mented, each vendor offering a slightly different version; this meant that
application software had to be rewritten many times, and that users were
locked into the software of one supplier. Because Unix had also failed to
embrace fully the new wave of graphical interfaces, its solutions in this
area were crude when compared with the Apple Macintosh or Microsoft
Windows. Windows NT, by contrast, was designed to marry the power of
a VMS-like operating system with the elegance and usability of Windows
3.1.
But plenty was happening outside Microsoft’s immediate sphere of in-
terest. In 1991, Tim Berners-Lee, a British physicist at CERN, the Euro-
pean Centre for Nuclear Research, released for public use a hypertext
system that he had been developing over the past two years. The system,
which he called the World Wide Web, ran across the Internet, then still
small-scale and used mostly by academics.
Also in 1991, a new programming language called Java was being
drawn up by a team at Sun Microsystems. Born originally as part of an at-
tempt to develop an interactive set-top box for the cable TV industry,
Java was later adapted for use on the Internet. An important feature of
Java was portability: The same program could run without modification
on a wide range of hardware, a novelty at the time.
Although the threats these developments represented at the time were
negligible compared to their later impact, it is possible that Microsoft
was tracking them in 1991. The Internet was well known enough, the
Web was in the public domain, and Sun was a competitor whose moves
were certainly watched with interest. But surely it is impossible that, in
the same year, Microsoft could have had even the merest whisper of a
suspicion that a key element of an equally potent challenge was about to
take shape in the bedroom of a second-year computer science student in
Helsinki, the capital of Finland.
As one of the most northerly capitals in the world, Helsinki is a city of
seasonal extremes: dark, cold winters with only a few hours of daylight,
and bright summers where the days never seem to end. Geographically,
Helsinki is close to Russia’s St. Petersburg, and parts of it are visually
similar; culturally, it has its closest ties with Sweden, of which Finland
formed a province for centuries. When Russia invaded and annexed Finland
into its empire in 1809, the country was separated from Sweden; Finland
won full independence only in 1917.
Two cathedrals form important landmarks on the eastern side of the
city center. The Lutheran church possesses an elegant neo-classical exte-
rior, but takes the form within of a huge stone shell almost devoid of or-
nament; the Russian Orthodox Uspensky Cathedral is a typical
concoction of onion-domed towers outside, and icons inside. Helsinki is
compact; its low buildings and broad streets are mostly laid out on two
grid systems that abut at a slight angle. Green spaces abound, and the
sight and smell of the sea, which surrounds Helsinki on three sides, is
never far away.
Into this ordered but individual world, Linus Benedict Torvalds was
born on 28 December 1969. Linus explains the significance of this day in
his culture: “It’s special in the same way most days are special—many
days have some traditional meaning. December 28th is ‘menlösa barns
dag,’ which means roughly ‘day of children without defects’ although
‘menlös’ is fairly old Swedish and a lot more poetic than ‘without de-
fect.’”
The name Linus was an unusual choice: “not unheard of,” Linus ex-
plains, “but not exactly common” in Finland either. The name itself has
a history that stretches back to the roots of Western civilization. It is
mentioned in Homer’s Iliad in the original Greek form Linos, where it is
associated with a song of mourning. There is also a St. Linus, who is tra-
ditionally listed as the second pope after St. Peter. Another famous Li-
nus—though better-known for his last name—is the American inventor
Linus Yale.
But rather than any of these, Linus was named in part after the U.S.
scientist Linus Pauling, who received two Nobel Prizes: one for chem-
istry, and one for peace. “I think I was named equally for Linus the
Peanuts cartoon character,” Linus adds, and notes that this makes him “a
mix of 50 percent Nobel-prize-winning chemist and blanket-carrying
cartoon character.”
Torvalds is a rare surname: “There are probably something like twenty-
five people in the world with that last name,” Linus says. “Torvald” is “a
rather old-fashioned Nordic name,” Linus explains. “You’ll find it in
Sweden and Norway. The genitive ‘s’ is unusual, an old way of turning a
regular name into the name of a farmstead—and from there into a sur-
name.” Linus explains that it was his paternal grandfather “who took the
name for personal reasons, namely, that he didn’t like his family.” As a re-
sult, he says, “to my knowledge, all Torvalds in the whole world are [his]
descendants.” In other words, one of the most famous names in comput-
ing is “completely made up, and not more than two generations ago.”
The Torvalds clan is notable not just for its unusually small size: A sur-
prising number of them are journalists. Linus runs down the list: “My
dad, Nils Torvalds: reporter for Finnish radio. His brother, my uncle, Jan
Torvalds: Finnish TV. My paternal grandfather—the founder of the Tor-
valds clan, deceased—used to be a newspaper reporter, writer, and poet.
My mom: Mikke (Anna) Torvalds: working at the Finnish News Agency.
Used to do translations, does news graphics these days. My sister, Sara
Torvalds: used to do translations for the Finnish News Agency too, but is
moving more towards books and films.”
The Torvalds family formed part of the Swedish-speaking community
in Finland, about 300,000 strong in a total population of 5 million. That
their mother tongue has nothing in common linguistically with the
Finnish language that surrounds them has doubtless helped them be-
come a very close-knit group. Reflecting this, Swedish speakers them-
selves call this society within a society “Ankdammen”—the Duck Pond.
One of Linus’s friends from Helsinki, future fellow hacker Lars Wirze-
nius, says, “Almost all Swedish-speaking people know lots of other
Swedish-speaking people, who know lots of other Swedish-speaking
people, and the end result is that everyone either knows everyone, or
knows someone who knows someone.”
Growing up as a member of the Duck Pond, Linus spoke Swedish at
home and with family friends, and only started learning Finnish when he
was five. Contact with English, which would become a crucially impor-
tant factor in his work, came a few years later.
It was around the same time that he first encountered a computer. Li-
nus explains that his maternal grandfather was a statistician at Helsinki
University; he bought a Commodore Vic-20 microcomputer, “one of the
first ones to get on the market, at least in Finland.” He adds, “It wasn’t
exactly what you’d call a number-cruncher today, but it was certainly
faster than any calculator.” The speed of the processor was just 1 Mega-
hertz (MHz), a thousandth of that of modern PCs.
Linus recalls that his grandfather “bought the Vic-20 for his own needs
to do math,” but soon asked his young grandson to help. “I think my
grandfather wanted me to learn, so he made me help him,” he explains.
“So I started off helping him with the computer, and then I did my own
stuff on the side.”
Linus recalls, “I had the Vic for five years because I couldn’t afford to
upgrade. I programmed in Basic, maybe the first two years.” But Linus
very powerful feature, thanks to its choice of chip: it could run several
programs simultaneously, just like commercial minicomputers.
This area of computing—multitasking—led him to start coding the
simple program that would eventually turn into Linux. But this was still
some years off from the time when Linus was hacking on his Sinclair QL.
First, in the fall of 1988, he entered Helsinki University to study com-
puter science, which by now already looked likely to turn from a passion
into a profession.
At university, Linus found the same tendency for the two language
communities in Finland to keep to themselves. His fellow student in
computer science, Lars Wirzenius, comments that “at that time there
weren’t very many Swedish-speaking computer science students, and
those that were, were at least a couple of years older than we were.” It
was therefore only natural that a pair of Swedish speakers isolated
among the Finnish majority of the new intake for computer studies
should gravitate towards each other.
Wirzenius recalls that the first time he met Linus, “it was one of the
introductory lectures for new students.” Wirzenius didn’t notice much about
his friend that day except that “the end of his big nose tends to twitch up
and down when he speaks, and it is fun to look at.” Aside from the nose,
which, in truth, is not that big, little in Linus’s appearance is out of the
ordinary: He is of medium height, with brown hair; his blue eyes gaze
steadily from behind his glasses. Only the eyebrows, which are remark-
ably dark and bushy, jar slightly with the otherwise boyish face.
In an effort to meet more student members of the Duck Pond, Wirze-
nius and Linus joined one of the many societies that form an important
element of Helsinki University’s student scene. “The club that Linus and
I joined was called the Spektrum,” Wirzenius recalls, “which is the
Swedish-speaking club for people who study math, computer science,
physics, or chemistry.” As far as the teaching at the university was con-
cerned, Wirzenius explains that “there isn’t actually a curriculum which
says that during the first year you have to take this course and that
course. The so-called academic freedom in Finland is interpreted so that
students can take any course they want, mostly in whichever order they
want.
“The nominal study length is five years, but most people study for six
or seven years,” he continues. Alongside the main courses there were
various ancillary subjects. “To get a computer science degree, you have
to study a certain amount of math, and also some other subjects. Each
week we got some exercises to do, and then we had small groups of peo-
ple and a teacher who asked each one of us to demonstrate some of the
problems we were supposed to have solved. And if you hadn’t solved all
the problems, that was OK, but you had to solve a certain amount to pass
the course.
“One of these weeks, Linus—for some reason, I don’t remember—hadn’t
done all his homework. So he just claimed that he had done one of
the exercises that he hadn’t done—and the teacher asked him to demon-
strate his solution. [Linus] walked up to the blackboard . . . and faced
the problem that he claimed he had solved. Linus decides this is a simple
problem, draws a couple of diagrams, and waves his hand a lot. It takes a
long time for the teacher to understand that yes, this is actually a correct
solution.”
According to Wirzenius, this incident was atypical in one respect: “[Li-
nus] didn’t usually try to cheat, because he didn’t have to. He knew math
well from high school, and had a sort of mathematical brain, so he didn’t
have to spend much time on homework to get it done.” Wirzenius be-
lieves there is something characteristic of his friend in the way Linus
handled the situation—“the attitude, the arrogance that he displayed—
most people would just have acknowledged that they didn’t actually have
a solution,” but Linus hates having to admit that he doesn’t know the an-
swer.
Wirzenius says that “the arrogance he showed then is still visible” in
how Linus handles challenges within the Linux community. “These
days, he might claim that a certain kind of approach to a problem is the
correct one, even though he hasn’t necessarily thought about all the
other approaches or solutions, and then the rest of the Internet either
agrees with him or tries to convince him otherwise.”
This approach works because Linus is “careful enough to not do it all
the time. Most of the time when he says something, he has been thinking
about it,” Wirzenius notes, and concludes: “I wouldn’t like to play poker
against him. He can bluff.” Just as important, if after all the bluffing Li-
nus is caught out and shown to be wrong, Wirzenius says, he will accept
corrections with good grace, a trait that would prove crucial in managing
the community of equally opinionated and able hackers that would later
coalesce around him.
After the first year’s introductory course, there followed a major hiatus
in their university studies. “All Finnish men are required to do either
military or civil service,” Wirzenius says. “Civil service is sort of you get
an ordinary job but you don’t get paid for it.” In 1989, when he and Li-
nus were required to choose, “the shortest time to do military service
was eight months [and] all civil service was sixteen months. So I figured
that’s eight months too much, so let’s do the shortest possible. Actually,
the usual ways of doing things that would make it ugly. And another idea
was that they wouldn’t just hire away the hackers from the [AI] Lab, but
would hire them part-time, so the Lab would still have its hacker culture,
and the community that we belonged to wouldn’t be wiped out.” But this
vision was never realized.
“The other people [in the AI Lab] said that they didn’t trust Greenblatt
to run a business,” Stallman says. As a result, a company called Symbol-
ics was formed without Greenblatt, who, undaunted, went on to set up
his own, called Lisp Machine Incorporated (LMI). But as Stallman re-
calls, “Symbolics had more money and hired several of the best hackers
away from the Lab, and a year later they hired the rest of the hackers ex-
cept for me and Greenblatt. And the result was that my community was
wiped out.” Stallman is still moved when he recalls these events. His
beloved Lab “felt like a ghost town,” he says. “It was desolate, and I was
grief-stricken.”
Not only were his friends and colleagues leaving but he was losing ITS
as a result. “At that time, the AI Lab was buying a new computer,” he
says, “and there were no hackers to port ITS to it” because all the hackers
had been hired away. “Everything was dying.”
This turn of events might have left Stallman in despair, but he chan-
neled his pain into an anger that would spur him on to undertake a cru-
sade that continues to this day. He decided to fight back in the only way
he knew how: by coding. As the Symbolics development team—largely
the former hackers of the AI Lab—made additions to their version of the
software for the Lisp Machine, Stallman set about reproducing those
same features in the version of the software that the AI Lab used.
“I had myself taken off of all mailing lists, because I wanted nothing to
distract me. I knew I had to compete with a much larger army and that I
would have to do the work of several people; I’d have to work in the
most focused way I possibly could,” he says.
Because LMI and Symbolics both had the right to use features from the
AI Lab’s software for the Lisp machine—and Symbolics already had the
new features—Stallman enabled LMI to match every move of Symbolics,
and so denied Symbolics any advantage from its larger team of develop-
ers.
Looking back, Stallman says that this period, beginning in March 1982,
saw “absolutely” the most intense coding he had ever done; it probably
represents one of the most sustained bouts of one-person programming
in history.
“In some ways it was very comfortable because I was doing almost
nothing else,” he says, “and I would go to sleep whenever I felt sleepy;
when I woke up I would go back to coding; and when I felt sleepy again
I’d go to sleep again. I had nothing like a daily schedule. I’d sleep
probably for a few hours one and a half times a day, and it was wonderful; I felt
more awake than I’ve ever felt. And I got a tremendous amount of work
done [and] I did it tremendously efficiently.” Although “it was exhilarat-
ing sometimes, sometimes it was terribly wearying. It was in some ways
terribly lonesome, but I kept doing it [and] I wouldn’t let anything stop
me,” he says.
Stallman kept up this unprecedented feat for nearly two years. But by
1983, the situation was beginning to change. On the plus side, “LMI was
getting bigger and was able to hire some programmers,” he remembers,
“so I could see that LMI was going to be able to do this work for itself.”
But more negatively, Symbolics had designed a new kind of Lisp ma-
chine, so “there was no hope and not much point, either, in making the
MIT version of the system [software] run on those machines,” he ac-
knowledges.
In hindsight, this situation provided an important demonstration of
one of the key factors of the new way of developing programs that free
software in general and Linux in particular would build on in the coming
years. As Stallman explains, “I could write code, but I couldn’t test every-
thing myself. Users had to find the bugs.” Consequently, “as the AI Lab
switched over to [Symbolics’ new] machines, I would lose my ability to
do the job” of producing good code for the LMI machine because no-
body could test it at the Lab. “I would have been unable to continue,” he
says.
This development proved a blessing in disguise. “I decided I didn’t
want to just continue punishing Symbolics forever. They destroyed my
community; now I [wanted] to build something to replace it,” he says. “I
decided I would develop a free operating system, and in this way lay the
foundation for a new community like the one that had been wiped out.”
Another important consideration in all this, he recalls, was that “the
MIT Lisp Machine system was proprietary software. It was licensed by
MIT to Symbolics and LMI, and I wasn’t happy with that” because the
essence of proprietary software is that it cannot be shared; this stifled the
formation of the kind of software community that Stallman now wished
to create. Working on the MIT Lisp Machine software had in some ways
revealed itself to have been a wrong turn.
Once Stallman had decided on this new course of action—creating a
free operating system—he soon made “the major design decision—that
we would follow Unix,” he says. In retrospect, it might have seemed the
obvious choice, but at the time this was by no means the case because
Stallman knew little about the system. “I’d never used it,” he says, “I’d
only read a little about it, but it seemed like it was a good, clean design
[that had some] nice clean and powerful ideas.”
But the main impetus for his choice grew once again out of his
experiences during the diaspora of the hacker community at the AI Lab.
“Digital had discontinued the PDP-10”—the last machine on which ITS had
run—“which meant that ITS was now completely unusable on any modern
computer,” Stallman says. “And I didn’t want that kind of thing to
happen again. The only way to prevent it from happening again was to
write a portable system.”
Stallman wanted an operating system that could be easily transferred
from one type of hardware to another. Because most operating systems
had been conceived as the housekeeping software for one type of com-
puter, portability was the exception rather than the rule. “Unix was, at
least as far as I knew, the only portable system that really had users on
different kinds of computers. So it was portable not just in theory but in
actual practice,” Stallman says.
Another important reason why Stallman chose Unix as his model, he
explains, was that he “decided to make the system compatible with Unix
so people who had already written programs for Unix would be able to
run them on this system. People who knew how to use Unix would
know how to use this system without being required to learn something
new.” Once again, as he had with Emacs, he was building on something
that already existed, but making it better—in this case, by creating a
Unix-like system that could be shared freely.
Although the project was born at this most unhappy time for Stallman,
the name he chose for his new project showed a typical hacker humor.
His Unix work-alike would be called GNU, an acronym that stood for
“GNU’s Not Unix,” and which thus explained itself self-referentially. This
kind of recursion is often used as a programming technique, and apply-
ing it to words is highly popular amongst hackers.
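The recursion the text describes can be sketched in a few lines of Python (an illustrative aside, not from the book; the function name and the depth parameter are my own invention): each expansion of the acronym contains the acronym itself, so the expansion can recurse.

```python
# Illustrative sketch (not from the book): "GNU" stands for
# "GNU's Not Unix", and that expansion itself begins with "GNU" --
# so expanding the acronym is a naturally recursive operation.
def expand(acronym: str, depth: int) -> str:
    """Expand a recursive acronym `depth` times."""
    if depth == 0:
        return acronym
    # One level of expansion: the leading acronym recurses.
    return expand(acronym, depth - 1) + "'s Not Unix"

print(expand("GNU", 1))  # GNU's Not Unix
print(expand("GNU", 2))  # GNU's Not Unix's Not Unix
```

As with the acronym itself, the recursion never terminates on its own; the `depth` argument plays the role of the reader deciding when to stop unpacking the name.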
The name GNU also proved a fruitful source of inspiration for simi-
larly self-referential names for many of the programs that went to make
up the overall GNU project. Another important virtue of Unix was that
its design “consists of a lot of small pieces, often fairly small pieces,”
Stallman explains; to create a Unix-like system, “what you do is you just
have to write each piece. So I sat down and started writing small pieces.”
Although this makes the entire process sound disarmingly simple,
writing these “small pieces” involved reproducing work that had taken
hundreds of people fifteen years–an even harder task than when Stall-
man matched the team of programmers at Symbolics. In his fight against
back and modifications from users, although not new–it had formed an
implicit part of the entire Unix culture for years–became central to the
rapid improvement of free software over the next fifteen years. Its adop-
tion in an extreme form for the development of Linux was one of the key
factors in its rapid growth and robustness.
The second consequence of making Emacs available was that Stallman
could sell it on tape, the standard distribution medium before floppies;
tapes complemented free distribution over the Internet because at that time
few people had access to that fledgling medium. As well as serving users
who might otherwise find it difficult to obtain copies, selling tapes had
the even more beneficial effect of giving Stallman an income. “See,” he
says, “I had quit my job at MIT to start the GNU project. And the reason
is I wanted to make sure that MIT would not be in a position to interfere
with making GNU software free in the way I wanted to do it. I didn’t
want to have to beg the MIT administration for permission to do this and
then maybe they would insist on spoiling it in some way.”
To its credit, the AI Lab allowed Stallman to continue to use its com-
puters, and to sleep in his former office at 545 Tech Square for twelve
years. He was even registered to vote at that address. Stallman did retain
one significant memento of those happy times: his log-in ID. To this day
he signs himself “rms,” his original username for the ITS machine, and
most of the hacker world knows him by these initials.
Although Stallman says that he “lived cheaply–I would have done so
for the sake of this project, but I’d always lived cheaply,” money was an
obvious concern in those early days of GNU. “I didn’t know at the begin-
ning whether I would have any sources of income,” he recalls, though
“for the first year I think I was paid by LMI to update some manuals
from MIT.” In any case, “the point is,” he says, “I was utterly determined
to do this; nothing could have intimidated me at that point.”
Fortunately, selling GNU Emacs for $150 a tape–“I just thought it
would be a good price,” he explains–soon provided him with a steady in-
come. “By the middle of the year, I was getting some eight to ten orders a
month and that was enough for me to live on [although] I couldn’t have
saved any money from that alone.”
With GNU Emacs out of the way, and perhaps encouraged by its recep-
tion and the money he was starting to receive from it, Stallman turned his
thoughts to the other projects, including that problematic C compiler.
This time, he decided he would not try to adapt a pre-existing program,
but would write it all himself. It proved a hugely complex project. “It was
the most challenging program I’ve ever written,” he says simply. “It’s one
that required me to write plans down on paper rather than just code.”
The C library is a chunk of auxiliary code that other programs can call.
By separating out an entire set of commonly used functions in this way,
user programs can be much smaller. Creating a C library for GNU was
therefore a key prerequisite before the GNU operating system could run
such user programs and do something useful.
Both the C library and Bash were finished around 1990, along with
many other elements of the Unix operating system. Still missing, though,
was in many ways the single most important program of all: the kernel.
As its name suggests, the kernel sits at the very heart of the operating
system. It acts as housekeeper (handling tasks such as the screen and
keyboard) and judge (ensuring that all the competing calls for attention
from user programs are handled correctly). Although many other pro-
grams are indispensable for a fully functional operating system–for ex-
ample a C library and a shell–it is the kernel that defines its essence.
That Stallman had left the kernel until last might seem strange given
its central nature. In part it was because he had been distracted by his
feud with Tanenbaum and his vow that a C compiler would be one of the
first programs that he wrote for his GNU system. In addition, it made
sense to develop all the programming tools first so that they could then
be used in the creation of the kernel.
There was another reason, however. “I was trying to bypass the hardest
part of the job,” Stallman says. Debugging kernels is difficult because the
standard programming tools depend on a functioning kernel. To get
round this, Stallman had been on the lookout once more for suitable
software that could be adapted—and he had something particular in
mind: “I was looking for a microkernel,” he says.
A microkernel is a kernel that takes the idea of a small, central adjudi-
cating piece of software to its logical conclusion; it makes all the other
housekeeping tasks normally found in a kernel into programs that are
controlled just like end-user applications such as word processing and
spreadsheets. The pros and cons of this approach with respect to the
classic all-in-one kernel–known as a monolithic kernel–form the subject
of what hackers call a “religious war” between their respective adherents
that rages to this day.
Stallman was interested in microkernels not for reasons of doctrine but
because, he says, “that way we could avoid the job of debugging the part
of the software that runs on the bare machine”–the underlying hardware.
As usual, he was sensibly trying to save time and effort by taking existing
code that had tackled one of the harder parts and building on it. He also
thought the design provided some “nice features.”
its power grew, because the more copylefted programs there are the
greater the pool for future such programs to draw on; this eases their cre-
ation and the pool of GPL’d software grows yet further.
This enormous efficiency acted as one of the main engines in driving
the free software projects on to their extraordinary successes during the
1990s. For this alone, and for his work in putting together the GNU sys-
tem, the computer community’s debt to Stallman is immense.
Yet for Stallman, this emphasis on inherent efficiency misses the point
about the GNU project and about the GNU GPL. His essential concern is
freedom, and the GNU project a means to an end rather than an end in
itself.
“The overall purpose,” Stallman explains, “is to give the users freedom
by giving them free software they can use and to extend the boundaries
of what you can do with entirely free software as far as possible. Because
the idea of GNU is to make it possible for people to do things with their
computers without accepting [the] domination of somebody else. With-
out letting some owner of software say, ‘I won’t let you understand how
this works; I’m going to keep you helplessly dependent on me and if you
share with your friends, I’ll call you a pirate and put you in jail.’
“I consider that immoral,” Stallman continues, “and I’m working to
put an end to that way of life, because it’s a way of life nobody should
have to be part of, and the only way you can do that is by writing a lot of
software and saying to people, ‘Come, use it, share it with your friends,
change it to do whatever you want it to do. Join us, have fun.’ And that’s
what GNU is for, it’s to give people the alternative of living in freedom,
and the individual software packages like GNU Emacs and GCC or the
GNU Hurd, they’re important because they’re parts of getting that larger
job done.”
Stallman has not only devoted much of his time and energy to this
cause, he has given his money, too. In 1990, he won a MacArthur Founda-
tion fellowship, worth somewhere in the neighborhood of $230,000, di-
vided into quarterly installments over five years. “That was a lot more
money than I need to live on, and more income than I had ever had be-
fore. Rather than spending it, I decided to invest most of it in mutual
funds so that I can live on it for the rest of my life while working on free
software or another worthy cause.”
He has no car: “I live in a city where you don’t need to have a car.” He
rents a room: “I don’t want to own a house, I don’t want to spend a lot of
money. If you spend a lot of money then you’re the slave of having to
make money. The money then jerks you around, controls your life.”
Stallman has never married or had children. “That takes a lot of money.
There’s only one way I could have made that money, and that is by doing
what I’d be ashamed of”–writing nonfree software. “If I had been devel-
oping proprietary software, I would have been spending my life building
walls to imprison people,” he believes.
Deadly serious intensity aside, Stallman also has a lively sense of hu-
mor that manifests itself in a constant stream of ideas, puns, and hacker-
type plays on words that courses through his conversations. His personal
Web site carries a picture of him dressed as the outrageous be-toga’d
Saint IGNUcius–a character he is happy to re-create in his travels around
the world with his music CDs and a battered Toshiba portable as he gives
talks whenever and wherever he can to spread his message.
And yet many have a different image of Stallman: It is not so much
playful mock saint as implacable Old Testament prophet–a kind of geek
Moses bearing the GNU GPL commandments and trying to drag his
hacker tribe to the promised land of freedom whether they want to go or
not. His long hair, which falls thickly to his shoulders, full beard, and in-
tense gaze doubtless contribute to the effect.
This impression arises largely from Stallman’s ability to offend even
supporters through his constant refusal to compromise. “The only rea-
son we have a wholly free operating system is because of the movement
that said we want an operating system that’s wholly free, not 90 percent
free,” he says. “If you don’t have freedom as a principle, you can never
see a reason not to make an exception. There are constantly going to be
times when for one reason or another there’s some practical convenience
in making an exception.”
He admits that sometimes he gets tired of hammering home this mes-
sage over and over again, but believes he must keep on doing it, even be-
yond the tolerance of most of his audiences if need be. “I’m never
optimistic,” he says, “I’m never sanguine. I’ve achieved a lot, but it’s not
the whole job. The job is not finished.”
When will it be finished? “Well, maybe never,” he acknowledges. “In
what sense is freedom ever assured for the future? People are always go-
ing to have to work to keep a free society.”
This is the real point of the self-referential GNU’s Not Unix, and of
ideas like “copyleft.” For Stallman, free software matters not because
software is special and deserves a special freedom; indeed, he acknowl-
edges that “there are more important issues of freedom–the issues of
freedom that everybody’s heard of are much more important than this:
freedom of speech, freedom of the press, free assembly.”
Free software matters because all freedoms matter, and software hap-
pens to be the domain in which Stallman can contribute most. “I don’t
A Minor Rebellion
expensive,” Wirzenius says, but adds, “that was a joke, especially from
my side; I’m not quite as good a programmer as Linus is. It’s not some-
thing you decide–‘this week I will write an operating system.’” The jest
proved truer than either could have imagined.
Fortunately, in addition to whetting his appetite for Unix, Linus’s stud-
ies that fall provided him with a way to acquire it. He recalls that “one of
the course books was Operating Systems: Design and Implementation.”
The book came with its own illustrative software, which was unusual.
Called Minix, it had been written to teach students how an operating
system worked through examining its source code. Linus found that
Minix was essentially a Unix-like system that ran on a PC. He could not
know then that not only would it provide him with the scaffolding for
building his own PC-based Unix kernel, Linux, but it would also fore-
shadow many of the techniques and even much of the development his-
tory of that system.
Minix was written by the same Andrew Tanenbaum who had rebuffed
Richard Stallman’s overtures in 1984 when Stallman was searching for a
compiler and who provoked Stallman into writing GCC. Despite this in-
auspicious start, Tanenbaum says Stallman had later suggested to him
that Minix could provide the missing kernel of the otherwise complete
GNU system. “He approached me a couple of times,” Tanenbaum says,
“but he had all these conditions; it’s got to be this, and it’s got to be that.
I said ‘whoa, whoa, whoa.’ He’s kind of an abrasive person [and] it didn’t
really turn me on.”
Stallman, for his part, says he has no recollection of the idea because it
was so long ago, but acknowledges that the discussions could have
taken place.
Had Tanenbaum said yes to Stallman’s proposal, the history of com-
puting would have been very different. Linus emphasized later that “if
the GNU kernel had been ready”–for example, through a form of
Minix–when he was casting around for his own copy of Unix, “I’d not
have bothered to even start my [Linux] project.” But Tanenbaum said
no, and instead of nipping Linux in the bud, Minix played a fundamental
role in helping it germinate and flower.
After his studies at MIT as an undergraduate, and at the University of
California at Berkeley as a graduate student, Tanenbaum joined the com-
puter science faculty at the Free University of Amsterdam in 1971; he be-
came a professor there in 1980.
In 1979, ten years after Unix was first created, Version 7 of the system
was released by AT&T. Version 7 was of note for several reasons; for ex-
ample, it was the first Unix to be ported to (translated for use on) an Intel
processor, the 8086, a slightly more powerful version of the 8088 chip
later used in the first IBM PC. Xenix, as it was called, was the joint pro-
ject of two companies. One was The Santa Cruz Operation, better known
today as SCO (and later the owner of Unix). The other was Microsoft. It
is little known today that a year before it launched MS-DOS in 1981,
Microsoft had already brought out a PC version of Unix. But the
existence of Xenix, never a popular product, makes the current challenge
from systems based around Linux piquant, to say the least.
Tanenbaum was more affected by another significant change from Ver-
sion 6. “The license said fairly clearly [that] the source code shall not be
made available to students,” Tanenbaum recalls. “Period. Nothing.
None. Zero.” Version 7 represented the symbolic closing of Unix inside
the black box of proprietary software—a sad end to what had long been
the ultimate student hacker’s system.
From an educational viewpoint, the ban on discussing Unix’s source
code was unsatisfactory. “Between 1979 and 1984 I stopped teaching
practical Unix at all and went back to theory,” Tanenbaum says. He real-
ized that the only way to make something comparable available to his
students “was to write an operating system on my own that would be
system-call compatible with Unix”—that is, working in exactly the same
way—“but which was my own code, not using any AT&T code at all.”
When Tanenbaum began Minix in 1984—just as Stallman started
work on GNU—he decided that it would be compatible with Unix, but
also that, as he puts it, “while I was at it, let’s design it right.” This offers
an interesting contrast to Stallman’s approach, where the emphasis was
on the free availability of what he wrote, not on how he wrote it. Tanen-
baum’s “designing it right” in this context meant in part using a micro-
kernel–although as Tanenbaum is quick to point out, “I decided to make
a microkernel long before anybody had heard of microkernels.” In fact,
he says, “I think Minix may have been the first.”
As a result, “the basic kernel is very, very simple,” although this did
not mean that completing Minix was easy: “It was thousands and thou-
sands of hours [to get it working]; it’s not a trivial matter,” Tanenbaum
explains. He wrote Minix at home in the evenings.
Writing a new operating system is difficult, one reason Stallman
searched for an existing microkernel system that he could use. Tanen-
baum recalls that “debugging any kernel on flaky hardware which you
don’t really understand well is horrible. I almost gave up the project a
couple of times. I had bugs that I just couldn’t find.” The difficulty was
caused in part because the hardware he was working with had undocu-
mented features and strange bugs that depended on the temperature of
months; there was certain new stuff and at some point I said, ‘Gee, there’s
enough new stuff that warrants a new version. It’s stable and it’s been
tested by a lot of people on the Net; it’s a working version.’ Then at that
point it was worth putting up a new version on the floppy disk.” Minix
was also ported to chips other than the 8088 found in the original PC,
including the Intel 80386 processor.
The arrival of the 80386 chip, and the gradual fall in price of systems
using it, proved a key development for Linus. “The 386 was a lot better
than any of the previous chips,” he recalled later. Moreover, Linus says,
“from the Tanenbaum book I knew that you could get Unix for a PC. So
that’s when I actually broke down and got a PC,”–something he had been
loath to do before.
Because of the way the Finnish university education system is funded,
Linus was in luck. Says Wirzenius, “The Finnish government gives
money to students and also backs student loans that the students take
from banks. The money is supposed to be used for food and housing, but
also for stuff needed for studies. Linus managed to get a student loan,
which he could use for buying the computer since he was living with his
mother–so he didn’t have to spend as much money on living.” The debt
was finally paid off in November 1992.
Linus added other recently acquired funds. “I had Christmas money,”
he says. “I remember the first nonholiday day of the New Year I went to
buy a PC.” That was 5 January 1991. In 1996, he still recalled the ma-
chine’s specification: “386, DX33, 4 Megs of RAM, no co-processor; 40
Megs hard disk.” This is unbelievably puny by today’s standards, but aside
from the hard disk, which Linus acknowledged was “not very large,”
these specs were respectable for the time—especially for running Minix.
In his first published interview, with Linux News, “a summary of the
goings-on of the Linux community” put together initially by Wirzenius,
and which ran for six months from the fall of 1992, Linus explained
what happened next. “While I got my machine on January 5th 1991, I
was forced to run DOS on it for a couple of months while waiting for the
Minix disks.” He passed the time in an unusual way: “Jan-Feb was spent
about 70–30 playing [the computer game] ‘Prince of Persia’ and getting
acquainted with the machine.”
In retrospect, it seems extraordinary that Linus’s main activity before
writing Linux was playing “Prince of Persia” for two months, even if it
had been named Best Computer Game, Action Game of the Year and
Game of the Decade.
Whatever the reason, Linus was not completely lost in the blocky if ef-
fective graphics of the sultan’s dungeons as he fought off the minions of
the evil Grand Vizier Jaffar. By his own calculation, during this period Li-
nus was spending around a third of his time “getting acquainted with the
machine,” presumably in a typically thorough manner.
Wirzenius says that at this time Linus “started playing around with
programming tools for MS-DOS on a PC. At some point he had an as-
sembler for writing assembly language programs” just as he had done on
his grandfather’s Vic-20. The benefits of assembly language program-
ming include faster programs and direct control of the hardware. Doubt-
less the ability to explore the 386 chip in this way was the main reason
Linus employed it, not as a demonstration of hacker prowess; he had
been programming in assembly language since his early teens and had
nothing to prove there.
Wirzenius recalls one of those first DOS assembler programs. “I re-
member him being very proud over a short piece of code that imple-
ments a subroutine that did nothing except calculate the length of a
sequence of characters. It wasn’t the task that was complicated, it was the
fact that he had written everything for himself” that made Linus so
pleased.
But as Wirzenius said in a speech he made at the 1998 Linux Expo,
“when Linus decides to learn something, he really learns it, and very
quickly.” Just because these first steps were small certainly did not mean
that they were doomed to stay that way. Already his next assembly lan-
guage program was more sophisticated than the earlier subroutine; it in-
vestigated one of the key features of the 386: task-switching.
The ability of the Intel 80386 chip to switch between tasks meant that
it could handle more than one task, or user, at once (by jumping
quickly between them). Task-switching lay at the heart of multitasking,
which in turn was one of the key capabilities of Unix. Linus may not
have been aware then that almost imperceptibly his experiments were
moving in the direction of creating the kernel of a Unix-like system.
Linus describes his first experiments in this area on the 386 this way:
“I was testing the task-switching capabilities, so what I did was I just
made two processes and made them write to the screen and had a timer
that switched tasks. One process wrote ‘A,’ the other wrote ‘B,’ so I saw
‘AAAABBBB’ and so on.” Later, Linus would write of this task-switcher,
“Gods I was proud over that”; but this was still hardly an operating sys-
tem kernel, let alone one that was Unix-compatible. Linus was, however,
getting closer–at least to the realization of what might be possible. The
final leap forward came after Minix turned up.
That historic moment is memorialized in his first posting to the
newsgroup comp.os.minix, which had been set up at the time of
Minix’s launch. Linus had doubtless been reading this for some while
before—“I read a lot of newsgroups then”—but not until Friday, 29
March 1991, did he summon up the courage to make his first posting
there. It begins:
Hello everybody,
I’ve had minix for a week now, and have upgraded to 386-minix (nice), and
duly downloaded gcc for minix. Yes, it works–but . . . optimizing isn’t work-
ing, giving an error message of “floating point stack exceeded” or some-
thing. Is this normal?
Several interesting points stand out in this posting. First, Linus had in-
stalled 386-Minix. This was a series of patches (changes and additions)
to the original Minix code “to make it usable on a 386 so that you could
actually take advantage of 32 bits, because the original Minix was 16
bits,” as Linus explains. These patches constituted the port written by
the Australian, Bruce Evans; he remembers that he had to fight to get
them accepted, so conservative was Tanenbaum when it came to enlarg-
ing his teaching system. Evans was soon to become the first person to
provide Linus with help and advice as he wrote Linux.
Linus was already using the C compiler GCC, which shows that he
was writing or about to write in C—the computer language most widely
used by professional programmers and, like assembler, requiring consid-
erable skill. These C programs, combined with the simple assembly lan-
guage experiments he had been conducting, would eventually become
the basis of Linux.
Linus’s first posting to comp.os.minix is interesting, but his second is
extraordinary. By his own admission he had had Minix for just a week;
and yet two days later—on 1 April 1991—in response to a polite ques-
tion from someone having problems with 386-Minix, Linus replies as if
he were some kind of Minix wizard for whom such questions are so triv-
ial as to be almost insulting. He wrote:
RTFSC (Read the F**ing Source Code :-)–It is heavily commented and the
solution should be obvious
(take that with a grain of salt, it certainly stumped me for a while :-).
Even though the first part is meant presumably as a joke, its tone of
exaggerated self-confidence is characteristic; in public Linus frequently
deflates some expression of apparent arrogance with a self-deprecating
and honest humility.
The explanation for this confusing behavior is not hard to find. Like
many gifted intellectuals, Linus as a young man lacked self-confidence;
for example, at university, he says, “I was really so shy I didn’t want to
speak in classes. Even just as a student I didn’t want to raise my hand
and say anything.” Perhaps as a Minix newbie, posting for just the sec-
ond time to comp.os.minix, Linus was similarly overcompensating with
his coarse reply. Whatever the reason, it is interesting to see displayed
here a behavior that in various and often more subtle forms would be re-
peated in other public forums, though less as time went on; as Linux
grew more obviously successful, Linus grew more self-confident.
His third posting to the Minix newsgroup, another reply to a fellow
Minix user’s query, written a week after the 1 April outburst, was more
moderate: “I’m rather new to 386-Minix myself,” he begins modestly,
“but have tried porting some stuff and have a few comments.”
The demands of his university studies may have been responsible for
the nearly three months that pass until Linus’s next major posting to the
newsgroup. As Wirzenius recalls, despite the growing fascination his
programming project held for him, “Linus did actually spend enough
time at the university to pass his courses.” This was not because of what
his teachers might otherwise have done: “The university itself doesn’t pe-
nalize you if you don’t pass your courses,” Wirzenius explains. “It just
means if you want to graduate then you have to take those courses later.”
Linus needed to pass his exams because of the student loan he had
taken out to buy his PC. As Wirzenius notes, “If you don’t get enough
[academic] credits for a year then you won’t get support from the gov-
ernment for the next year. If you don’t get enough credits during two
years or so then you have to start paying back the loan, which is quite a
bit of money if you aren’t working.”
Fortunately, Linus had enough natural ability to pass the required ex-
ams and still have time for serious hacking. In April, at the same time he
began exploring Minix, he decided to turn his simple task-switching pro-
gram, where one process wrote “A” and another wrote “B,” into a termi-
nal emulator that would allow him to read Usenet newsgroups on the
it got: “Hairy coding still, but I had some devices, and debugging was
easier. I started using C at this stage, and it certainly speeds up develop-
ment.” Moreover, “this is also when I start to get serious about my mega-
lomaniac ideas to make ‘a better Minix than Minix.’”
Linus was using Minix as a kind of scaffolding for his work on what
became Linux. But there was a drawback to this approach. As Linus re-
calls, “I essentially had to reboot the machine just to read news with my
fancy newsreader, and that was very inconvenient” because it took time
and meant that Linus lost Minix’s other capabilities. He continues, “I de-
cided hey, I could try to make all the other features of Minix available”
from within the “fancy newsreader” he had written. Linus says this evo-
lution “was really very gradual. At some point I just noticed that hey, this
is very close to being potentially useful instead of [needing] Minix.”
It was much more than just “potentially useful”: “Essentially when
you have task-switching, you have a file system, you have device dri-
vers–that’s Unix,” he explains. Linux–or rather some distant ancestor of
Linux–had been born.
When university holidays began, Linus was able to devote himself
fully to his project, which doubtless aided the subsequent rapid develop-
ment of Linux. As Wirzenius explains, the holidays ran “in theory June,
July, August, and in practice from the middle of May to the middle of
September.” Linus recalls that “the first summer I was doing [coding]
full-time. I did nothing else, seven days a week, ten hours a day.”
As Linus wrote later in a short history of Linux he put together in
1992, the consequence of these sustained bouts of programming was
that “by July 3rd I had started to think about actual user-level things:
Some of the device drivers were ready, and the hard disk actually
worked. Not too much else.” The shift of interest is evident from his next
posting to the comp.os.minix newsgroup, on 3 July, where he asked
about something called Posix for “a project”–Linux, in other words. He
wrote:
Hello netlanders,
Due to a project I’m working on (in minix), I’m interested in the posix stan-
dard definition. Could somebody please point me to a (preferably) machine-
readable format of the latest posix rules?
Posix is a set of standards drawn up to address the problem of the
highly fragmented Unix market that ex-
isted at that time where programs running on one flavor of Unix could
not run on another. When an operating system follows the Posix specifi-
cations it can run any Posix-compliant application.
Linus explains, “I wanted to know what the standards said about all
the interfaces”–the ways programs interact with the kernel–“because I
didn’t want to port all the programs. Every time that I had a problem
porting a program to Linux, I changed Linux so that it would port. I
never ported programs, but I ported the kernel to work with the pro-
grams.” This was the approach that Richard Stallman had adopted for his
GNU project, and with the same benefit of ready access to a large base of
existing Unix applications.
Unfortunately, as Linus recalls, “One of the people who responded to
my original query about the Posix standards said ‘Sorry, they aren’t avail-
able electronically, you have to buy them.’ That essentially meant that
OK, I couldn’t afford them.” But Linus had another idea: “We had SunOS
at the University.” SunOS was an early version of Sun’s variety of Unix
that later became Solaris. “So I used SunOS manual pages to find out the
interfaces for stuff.”
What began as a second best “actually turned out to be a lucky move,”
Linus says, “because the SunOS interface was kind of what Unix became
a year or two later.” This would not be the first time that a move dictated
more by chance or circumstances proved to be a blessing in disguise for
the future growth of Linux.
But Linus’s request for information about Posix turned out to have
even more important consequences. “The same person who told me that
the standards weren’t available also told me his area of interest was ker-
nels and operating systems,” Linus says of a member of the Helsinki Uni-
versity staff called Ari Lemmke. “[Lemmke] had this small area on
ftp.funet.fi”–an Internet server at the university where files were stored
for visitors to download using the standard File Transfer Protocol (FTP).
“It was called nic.funet.fi at that point, and he said that ‘hey, I’m putting
a directory aside for you.’ So he created the /pub/os/linux directory,” Li-
nus recalls.
“Linux was my working name,” Linus continues, “so in that sense he
didn’t really name it, but I never wanted to release it as Linux.” He says
he was afraid that “if I actually used it as the official one people would
think that I am egomaniac, and wouldn’t take it seriously.”
Linus had originally planned to call his slowly evolving software some-
thing else. In moments of depression, he sometimes felt like calling it
“Buggix” to reflect what seemed its hopelessly buggy nature, he revealed
in a 1995 FAQ entry. But most of the time he had another name for it. “I
chose this very bad name: Freax–free + freak + x. Sick, I know,” Linus ac-
knowledges. “Luckily, this Ari Lemmke didn’t like it at all, so he used
this working name instead. And after that he never changed it.” Linux
had now been christened as well as born.
At first, there was nothing in this Linux subdirectory on the university
server; Linus wasn’t willing to release his young and fragile kernel to the
public for a while. In the 1992 interview with Wirzenius, he said, “I wasn’t really ready for a release yet, so the directory contained just a README for about a month (‘this directory is for the freely distributable Minix clone’ or something like that).” At this stage, Linus was still thinking of Linux as a Minix clone, nothing grand like Unix.
Though unwilling to release Linux, he was ready to mention its exis-
tence. On Sunday, 25 August 1991, under the subject line “What would
you most like to see in minix?” he wrote in the comp.os.minix newsgroup:
I’m doing a (free) operating system (just a hobby, won’t be big and profes-
sional like gnu) for 386(486) AT clones. This has been brewing since April,
and is starting to get ready. I’d like any feedback on things people like/dis-
like in minix, as my OS resembles it somewhat (same physical layout of the
file-system (due to practical reasons) among other things).
. . . I’ll get something practical within a few months, and I’d like to know
what features most people would want. Any suggestions are welcome, but I
won’t promise I’ll implement them :-)
Linus ([email protected])
Response to this posting was immediate. A fellow Finn wrote less than
four hours later, “Tell us more!” and asked: “What difficulties will there
be in porting?” A Minix user from Austria said: “I am very interested in
this OS. I have already thought of writing my own OS, but decided I
wouldn’t have the time to write everything from scratch. But I guess I
could find the time to help raising a baby OS :-)”–a portent of the huge
wave of hacker talent that Linux would soon ride.
In reply to the question about porting, Linus was pessimistic: “Simply,
I’d say that porting is impossible. It’s mostly in C, but most people
wouldn’t call what I write C. It uses every conceivable feature of the 386
I could find, as it was also a project to teach me about the 386.”
In conclusion, Linus detailed the current state of his project: “To make
things really clear–yes, I can run gcc on it, and bash, and most of the gnu
utilities, but it’s not very debugged. It doesn’t even support floppy disks
yet. It won’t be ready for distribution for a couple of months. Even then
it probably won’t be able to do much more than minix, and much less in
some respects. It will be free though.”
In his 1992 history, Linus recalls that from this first mention of his
“hobby” he got “a few mails asking to be beta-testers for Linux.” A cou-
ple of weeks later, in September 1991, he had pulled together the first of-
ficial Linux, which he dubbed version 0.01. Nonetheless, Linus was still
not happy with the product of this sustained bout of coding. “It wasn’t
pretty; I wasn’t too proud of it,” he said, and decided not to announce it
in comp.os.minix.
Instead, he recalls, “I put together a list of all the people who had re-
acted to my [25 August] posting by e-mail.” And then as soon as he had
uploaded Linux 0.01 to the /pub/os/linux directory that had been created
by Ari Lemmke, “I sent them an e-mail and said that ‘hey, now you can
go and look at this.’ I don’t think that this list was more than maybe ten
to fifteen” people.
The only reason he posted this first, unsatisfactory version, he says, was that “I felt that because I had this site, I had to upload something to it.” He uploaded just the source code.
Token gesture or not, the sources came with surprisingly full release
notes–some 1,800 words. These emphasized that “this version is meant
mostly for reading”–hacker’s fare, that is, building on a tradition of code
legibility that had largely begun with the creation of C, as Dennis Ritchie
had noted in his 1979 history of Unix.
And highly readable the code is, too. As well as being well laid out
with ample use of space and indentation to delineate the underlying
structure, it is also fully commented, something that many otherwise
fine hackers omit. Some of the annotations are remarkably chirpy, and
they convey well the growing excitement that Linus obviously felt as the
kernel began to take shape beneath his fingers:
Well, that certainly wasn’t fun :-(. Hopefully it works . . . This is how
REAL programmers do it.
For those with more memory than 8 Mb–tough luck. I’ve not got it, why
should you :-) The source is here. Change it. (Seriously–it shouldn’t be too
difficult. Mostly change some constants etc. I left it at 8Mb, as my machine
even cannot be extended past that (ok, but it was cheap :-)
This kernel is © 1991 Linus Torvalds, but all or part of it may be redistrib-
uted provided you do the following:
— Full source must be available (and free), if not with the distribution
then at least on asking for it.
— Copyright notices must be intact. (In fact, if you distribute only
parts of it you may have to add copyrights, as there aren’t ©’s in all
files.) Small partial excerpts may be copied without bothering with
copyrights.
— You may not distribute this for a fee, not even “handling” costs.
The last clause meant that people could not charge for the work in-
volved in making floppies with the kernel (just 72K when compressed)
on them, a clear brake on Linux’s wider distribution.
In the 1992 interview with Linux News, Linus explained why he had
chosen this license. Shareware, where the software is distributed free but
you must pay if you decide to use the program, was never an option for
him: “I generally dislike shareware. I feel guilty about not paying, so I
don’t use it, but on the other hand it is irritating to know that it’s there.
Illogical, but that’s how I feel.” The first form of the license, he said, “was
probably an overreaction to the dislike I felt against the way Minix had
been set up; I thought (and still do) that Minix would have been better
off had it been freely available by FTP or similar.”
The rest of the release notes for 0.01 are mainly technical, but contain
two interesting statements. “The guiding line when implementing linux
was: get it working fast.” “Get it working fast” would become one of the
fundamental principles of Linux development and would distinguish it
from other, more careful, slower approaches.
Linus also noted that “this isn’t yet the ‘mother of all operating sys-
tems,’ and anyone who hoped for that will have to wait for the first real
release (1.0), and even then you might not want to change from minix.”
Little did Linus suspect that Linux 1.0 would be another two and a half
years away, but that by then few would hesitate about making the switch.
Linus signed off his release notes for 0.01 with a phrase that had be-
come Richard Stallman’s trademark farewell: “Happy hacking.” The re-
sults of Linus’s own happy hacking soon showed themselves in Linux
0.02. This time Linus had no qualms about telling the world. On Satur-
day, 5 October 1991, he sent a posting to comp.os.minix that began
Do you pine for the nice days of minix-1.1, when men were men and wrote
their own device drivers? Are you without a nice project and just dying to
cut your teeth on a OS you can try to modify for your needs? Are you finding
it frustrating when everything works on minix? No more all-nighters to get
a nifty program working? Then this post might be just for you :-)
As I mentioned a month(?) ago, I’m working on a free version of a minix-
look-alike for AT–386 computers. It has finally reached the stage where it’s
even usable (though may not be depending on what you want), and I am
willing to put out the sources for wider distribution.
A later paragraph in the same posting gives us some insight into how
Linus regarded Linux in the context of the other, better-established Unix
kernels, the GNU Hurd and Minix:
I can (well, almost) hear you asking yourselves “Why?”. Hurd will be out in
a year (or two, or next month, who knows), and I’ve already got minix. This
is a program for hackers by a hacker. I’ve enjoyed doing it, and somebody
might enjoy looking at it and even modifying it for their own needs. It is still
small enough to understand, use and modify, and I’m looking forward to any
comments you might have.
Linus was beginning to see that between the finished but essentially
frozen Minix program, and the promising but still incomplete GNU,
there was a niche for his “program for hackers by a hacker.” Linux might
be crude now, but it worked–just–which meant that it could be im-
proved, whereas the Hurd remained a tantalizing promise. Just as impor-
tant, Linus, unlike Tanenbaum, was soliciting ideas for improvement
and welcomed other people’s own efforts in this direction.
Linus says that “the second version was much closer to what I really
wanted. I don’t know how many people got it–probably ten, twenty, this
kind of size. There was some discussion on the newsgroup about design,
goals, and what the kernel should support.” Things were still pretty
small-scale, but growing. In the 1992 Linux News interview, Linus said of
the preceding version (0.01): “I don’t think more than about five [to] ten
people ever looked at it.”
As a result, he told Linux News, “Heady with my unexpected success, I
called the next version 0.10.” Linus recalls that “things actually started to
work pretty well.” Version 0.11 came out some time in early December.
“It’s still not as comprehensive as 386-minix,” Linus wrote in a posting to
comp.os.minix on 19 December 1991, “but better in some respect.” This
posting included some interesting comments from him on the present
state of Linux and its future development: “/I/ think it’s better than minix,
but I’m a bit prejudiced. It will never be the kind of professional OS that
Hurd will be (in the next century or so :), but it’s a nice learning tool
(even more so than minix, IMHO), and it was/is fun working on it.”
The launch date of the GNU Hurd is now receding fast into the “next century or so,” and Linux is an even better learning tool than Minix. But
the main thing is that Linux “was/is fun,” the ultimate hacker justifica-
tion for any kind of project.
The 19 December posting also included a copy of Linus’s “.plan.” This
is a file that is sent when a user is “fingered” over the network (using a
program called “finger”), and was an important way for people to obtain
information about Linux without needing to sort through hundreds of
newsgroup postings. Linus’s plan at that time was headed “Free UNIX for
the 386–coming up 4QR 91 or 1QR 92”. It begins: “The current version
of linux is 0.11–it has most things a unix kernel needs”–probably the
first time Linus had publicly pitched Linux as a project to create a Unix-
like kernel rather than just a development from Minix.
The plan also noted that two other sites were now carrying the Linux
software, one in Germany, and one in Richard Stallman’s home town,
Boston, at MIT. This was run by Ted Ts’o (pronounced “cho”), whose
name appears at the bottom of the same information page (using his e-mail name tytso) as someone who was working on new features for the imminent Linux 0.12.
Although barely visible, this was an extremely significant develop-
ment; it meant that already other hackers were contributing to the Linux
project. This same 19 December posting to comp.os.minix also spoke of
“people” who were working on drivers for SCSI to allow another kind of hard disk system to be accessed by Linux. “People” meant hackers other than Linus, another indication that others were involved.
This is confirmed by a message posted the next day from Drew Eck-
hardt, who was the person writing those SCSI drivers, and who had al-
ready contributed to version 0.11. His posting was in response to a
general enquiry about Linux, which said in part,
Could someone please inform me about [Linux]? It’s maybe best to follow
up to this article, as I think that there are a lot of potential interested peo-
ple reading this group.
It is striking that Eckhardt answers rather than Linus, and in great de-
tail, an indication that at least one other person already knew enough to
act in this capacity, and that he uses the pronoun “we” when talking
about Linux. What had once been one hacker’s “hobby” was turning into
a community. Moreover, even at this early stage, Linus was prepared to
listen to that community’s needs.
Some of the first users of Linux wanted something called Virtual Mem-
ory (VM), which is the ability to use hard disk space as if it were ordi-
nary RAM; this was a standard feature of Unix, and a big help at a time
when memory chips were expensive. Linus wasn’t interested in this fea-
ture, probably because he had just about enough RAM, and his own hard
disk was so small. But over that Christmas he nonetheless sat down and
wrote the code to add Virtual Memory to the Linux kernel and released it
as version 0.11+VM. Wirzenius recalls that Linus “just decided, now it’s
Christmas, it’s boring, it’s family and I need to hack.”
In the early history of Linux, which he wrote in 1992, Linus recalled
that 0.11+VM “was available only to a small number of people that
wanted to test it out; I’m still surprised it worked as well as it did,” and
modestly omitted to mention that he had written this code in just a cou-
ple of days. Virtual Memory then became a standard part of the kernel
from the next version. What had begun as a small private request had
turned into a public benefit.
It seems both appropriate and yet astonishing that version 0.12 of
Linux should appear on 5 January 1992, one year to the day from that
pivotal moment when, as Linus put it, “I actually broke down and got a
PC.” Appropriate, because the PC thus celebrated its first birthday with
what Linus later called “the kernel that ‘made it’: that’s when linux
started to spread a lot faster.” Astonishing, because in that time Linus
had gone from a student who knew little about C programming, and
nothing about writing for the Intel architecture, to the leader of a rapidly
growing group of hackers who were constructing a fully functional Unix
kernel for the PC from the ground up.
The release notes for 0.12 display well Linus’s growing sense of
achievement; they are full of a kind of exhilaration, a sense that maybe,
just maybe, he was on the brink of something big.
They begin with a plea, written entirely in capitals, not to install Linux
unless the user knows what he or she is doing: “If you are sure you know
what you are doing, continue. Otherwise, panic. Or write me for expla-
nations. Or do anything but install Linux.–It’s very simple, but if you
don’t know what you are doing you’ll probably be sorry. I’d rather answer
a few unnecessary mails than get mail saying, ‘you killed my hard disk,
bastard. I’m going to find you, and you’ll be sorry when I do.’”
Linus has said that “earlier releases were very much only for hackers,”
implying that version 0.12 was suitable for a wider audience. This may
well be true, but the installation instructions still include such steps as
the following: “Boot up linux again, fdisk to make sure you now have the
new partition, and use mkfs to make a filesystem on one of the partitions
fdisk reports. Write ‘mkfs -c /dev/hdX nnn’ where X is the device number
reported by linux fdisk, and nnn is the size–also reported by fdisk. nnn is
the size in /blocks/, ie kilobytes. You should be able to use the size info to
determine which partition is represented by which device name.”
This indicates the gulf that separated Linux from Windows 3.1, say, re-
leased just a couple of months later.
As well as important improvements to its code, version 0.12 brought
with it a revised license. Linus explains why:
Fairly early, there were people who happened to live in the same area and
wanted to make Linux available to others who were interested. But the
original license didn’t even allow copying fees. So you couldn’t even just
sell the diskettes for more than the price of the diskette. So it didn’t make
sense, somebody obviously had to do a lot of work because copying
diskettes is boring, it takes time, and very few people have access to these
automated copiers. So there were people asking me to allow some kind of
copying fee–even just a small one. Not because they wanted to make
money, but because they didn’t want to lose money on making Linux avail-
able to others, and helping others.
I really don’t wish to flame, but it’s starting to annoy me that 50% of the
articles I read in this newsgroup are about Linux.
On the same day that this not unjustified plaint was posted, Linus was
reporting an amusing accident on his machine. He had been trying to
connect with the university’s computer, but by mistake instructed his
terminal emulator program to dial his hard disk. Contrary to appear-
ances, this was an easy thing to do, thanks to what Peter Salus had called
“perhaps the most innovative thing that [Unix’s creator] ever thought
of,” which was that everything is a file for Unix and hence for Linux.
This means no conceptual difference exists between sending data to a
modem or to a disk drive.
This slip wiped out the Minix system that Linus had kept alongside
the steadily growing Linux. Initially, Minix had formed an indispensable
scaffolding for the development of Linux; but since Linux could now
function without the crutch of Minix, there was no reason to re-install
Minix after this mishap. Perhaps that command to dial his hard disk had
been not so much an accident as a Freudian slip on Linus’s part, a sym-
bolic cutting free from an older-generation program.
By a remarkable coincidence, the final rupture between the worlds of Minix and Linux was brought about just two weeks later, when Andrew Tanenbaum posted a message to comp.os.minix under the subject line “LINUX is obsolete.” Comparing the microkernel design of Minix with the monolithic design of Linux, Tanenbaum wrote:

While I could go into a long story here about the relative merits of the two designs, suffice it to say that among the people who actually design operating systems, the debate is essentially over. Microkernels have won.

LINUX is a monolithic style system. This is a giant step back into the 1970s. That is like taking an existing, working C program and rewriting it in BASIC.
This was a real insult to any hacker, as Tanenbaum must have known.
As far as portability is concerned, Tanenbaum began with a witty compressed history of chip design, concluding that an operating system tied to the 386 was a mistake. Linus felt obliged to respond:

Well, with a subject like this, I’m afraid I’ll have to reply. Apologies to
minix-users who have heard enough about linux anyway. I’d like to be able
to just ‘ignore the bait,’ but . . .Time for some serious flamefesting!
Look at who makes money off minix, and who gives linux out for free. Then
talk about hobbies. Make minix freely available, and one of my biggest
gripes with it will disappear.
That’s one hell of a good excuse for some of the brain damages of minix. I
can only hope (and assume) that Amoeba doesn’t suck like minix does.
Linus soon thought better of his outburst, and posted a rueful follow-up:
I wrote:
Well, with a subject like this, I’m afraid I’ll have to reply.
And reply I did, with complete abandon, and no thought for good taste
and netiquette. Apologies to [Andrew Tanenbaum], and thanks to John Nall
for a friendly “that’s not how it’s done” letter. I overreacted, and am now
composing a (much less acerbic) personal letter to [Andrew Tanenbaum].
Hope nobody was turned away from linux due to it being (a) possibly obso-
lete (I still think that’s not the case, although some of the criticisms are
valid) and (b) written by a hothead :-)
It was by no means his “last flamefest,” but the tone of the discussion
was now more moderate as he and Tanenbaum argued some of the more
technical points at stake. This drew in several other people and called
forth the following wise words that neatly summarize the situation from
an independent standpoint:
I would generally agree that microkernels are probably the wave of the future. However, it is in my opinion easier to implement a monolithic kernel. It is also easier for it to turn into a mess in a hurry as it is modified.

Ken
Between Linus and Tanenbaum, the tone became almost playful. Pick-
ing up again on the two main themes, kernel design and portability,
Tanenbaum wrote, “I still maintain the point that designing a monolithic
kernel in 1991 is a fundamental error. Be thankful you are not my stu-
dent. You would not get a high grade for such a design :-)” As for porta-
bility, he said, “Writing a new OS only for the 386 in 1991 gets you your
second ‘F’ for this term. But if you do real well on the final exam, you can
still pass the course.”
To which Linus replied in good humor:
Well, I probably won’t get too good grades even without you: I had an argu-
ment (completely unrelated–not even pertaining to OS’s) with the person
here at the university that teaches OS design. I wonder when I’ll learn :)
Much of this work has to be available still. I made my 1.40 port [of GCC]
available, but I have to admit that I don’t know where it ended up. It’s been
eight years . . .
It is hard not to detect in these words a hint of nostalgia for those far-off, heady days of youth when, starting alone and working from nothing,
he built, bit by bit, the program that bears his name; when he planted the
seed of what would grow into the kernel of a complete operating system
used by tens of millions of people, and began a movement whose ramifi-
cations continue to spread.
Factor X
MS-DOS. As Linus said in his interview with Linux News in 1992, “When
Minix finally arrived, I had solved PoP [Prince of Persia]. So I installed
Minix, (leaving some room for PoP on a DOS partition)”–a separate area of his hard disk–“and started hacking.”
Linus wrote Linux so that it could work alongside the Microsoft oper-
ating system, just as Minix could, rather than as an all-or-nothing re-
placement for it. The knock-on effect of this personal requirement was
that, later on, those curious about Linux could try it without needing to
throw out DOS completely, a much bigger step that would have throttled
the early take-up of Linux considerably.
Matt Welsh, later author of one of the first books on GNU/Linux, Run-
ning Linux, and then an undergraduate at Cornell University, was typical
of these hesitant PC users. He says, “Early on, I was very skeptical about
Linux. I had a nice setup with MS-DOS with most of the GNU DOS util-
ities installed”–versions of Stallman’s GNU programs that had been
ported to run on MS-DOS–“so my DOS machine acted much like a Unix
system. Why would I want to throw that out and struggle with this
bleeding-edge, barely working Unix clone?”
The reason Sladkey had taken the plunge was simple. “The idea of a
free compiler with the quality of GCC, already well established at the
time, and a free hosting OS [operating system] supporting a full multi-
tasking Unix environment was very attractive. So attractive, it had to be
tried to see if it could be real,” he says. Despite this attraction, Sladkey
chose to leave MS-DOS on his machine, just as Linus had. Sladkey
couldn’t help noticing that “from my very first installation, Linux was
dramatically more stable than my co-resident installation of Windows
3.1 on top of MS-DOS 5.”
Once again Stallman’s GCC had played a key role in driving the uptake
of Linux. Less apparent is that the “free hosting OS” Sladkey refers to
consisted of Linux plus several other indispensable GNU programs. In
effect, Linux was able to drop into the hole left in the GNU project by the
long-delayed Hurd kernel and realize Stallman’s dream of a complete and
free Unix clone. Rather than being called just “Linux,” the resulting op-
erating system is more correctly described as “GNU/Linux” to reflect its
two major components.
Welsh remembers that when he eventually installed GNU/Linux, he,
too, “got the fever. It was all about doing things yourself,” he explains,
“adding new features, and constantly learning more about how comput-
ers and operating systems work.”
Sladkey provides a good example of what the practical benefits of “do-
ing things yourself” were, and why even at this early stage GNU/Linux
had crucial advantages over Windows. “There were in fact bugs,” he recalls. “But the essential difference was in the obviousness of bugs, the re-
peatability of bugs, and potential for fixing bugs oneself. In this
environment, bugs were only temporary delays on a steady road towards
excellence and stability.
“Windows, on the other hand,” he continues, “was miserably unreli-
able with mysterious crashes, inexplicable hangs, and a pervasive
fragility that taught you to avoid doing anything fancy with even adver-
tised features of the OS because it could not take the stress. The only way
to make things more stable on Windows was to avoid doing things or to
fiddle ad nauseam with the configuration and then just wait and see if
things improved.”
Sladkey recalls the first time he found and sent a bug to Linus: “My
first contribution was in porting some program, probably one of my
smaller personal projects. I discovered a bug. Since Linux came with
source, my first inclination as a hacker was to take a look under the hood
and see if I could fix the problem. I found that although I had never done
any kernel work, that I was able to navigate around the code pretty easily
and provide a small patch to correct the problem.
“With my heart beating and my palms sweating, I composed the most
professional message I could muster and sent it off to linus.torvalds@cs.
helsinki.fi describing the bug and including my proposed fix. Minutes
later he replied something like, ‘Yup, that’s a bug. Nice investigation.
Thanks. Fixed,’ and I was hooked.”
These events are emblematic of the entire Linux development. Users
running a program—perhaps something unusual that reveals hitherto
unsuspected problems—find bugs; operating system code can be made
to work well with all the most common programs and yet still contain
more subtle bugs that are thrown up only in the course of time.
Because Linux comes with the source code (unlike MS-DOS or Win-
dows, which are sold only as “black boxes” that cannot be opened), a
hacker is able to poke around inside the program. Whatever reservations
some Minix users might have had about the quality of the Linux code, it
was written in such a way that hackers could find and fix the cause of the
problems they encountered.
Thanks to the Internet, the solutions to these problems could be sent
directly to Linus, who could similarly use the medium to reply, some-
times within minutes. Linus’s own openness to bug-fixes, and his lack of protectiveness regarding his code, were key in making the process work.
Linus explains his viewpoint on these early bug fixes. “They started
out so small, that I never got the feeling that, hey, how dare they impose
on my system. Instead, I just said, OK, that’s right, obviously correct, and
so in it went [to the kernel]. The next time it was much easier, because at
that time there weren’t many people who did this, so I got to know the
people who sent in changes. And again they grew gradually, and so at no
point I felt, hey, I’m losing control.”
This sensible attitude speaks volumes about how much Linus had ma-
tured since his early and somewhat adolescent outbursts to the Minix
newsgroup; his accepting implicit criticism with such good grace shows
that not just Linux had made progress by 1992.
Sladkey’s experience is also a good example of the benefits to Linux
and its users that this openness to suggestions and willingness to act on
them engendered. As he explains, “Initially I just fixed bugs everywhere
I found them. I would spot a problem, research the problem, determine
the expected behavior, fix the problem, test my fix,” and then send off a
description of the bug with the resultant patch to the kernel code.
Although finding and fixing bugs was an extremely useful activity,
Sladkey was soon doing more: “I was becoming more loyal to Linux, and
so any negative publicity about a bug or missing feature was enough mo-
tivation for me to hunt down a problem and fix it, even though it didn’t
affect me personally because I didn’t use that feature of the OS.”
The culmination of this altruism was his work on the Network File
System (NFS). Even though Sladkey describes NFS as “a dated and
brain-damaged protocol”–a protocol is just a set of rules–“and useful
mostly for connecting with legacy Unix systems,” it was an important
networking capability for a Unix-like operating system to offer.
He continues, “I wrote the NFS client”–the software that runs on a
user’s machine so that he or she can access a Unix system running the
NFS server–“partly as a challenge for myself.” What is more remarkable,
he says he wrote it “partly because people who were critical of Linux
were using the absence of NFS as a fault” that could be laid against what
had become his system of choice.
“When I initially announced my NFS client on the kernel mailing
list, there was a great deal of interest,” Sladkey recalls. “Many people
were willing to try early alpha code because they wanted this feature so
badly.”
Being able to draw on many users “was a whole model that was
worked out and debugged in the Minix world,” as Tanenbaum points
out. That Linus himself was aware of this early on is shown by his post-
ing to the Minix newsgroup on 19 December 1991. In it, commenting on
the pros and cons of Minix and Linux, he noted that “minix still has a lot more users,” and that this meant Minix had “better support” than
Linux. Linus noted that his kernel “hasn’t got years of testing by thou-
sands of people, so there are probably quite a few bugs yet.”
Similarly, eighteen months later, when Linus introduced yet another
version of the kernel, he made what he called a by now “standard re-
quest”:
“Please try it all out, give it a real shakedown, and send comments/bug-re-
ports to the appropriate place. The changes are somewhat bigger than I’d
prefer, so the more testers that try it out, the faster we can try to fix any
possible problems.”
“So in my case, Linus improved the kernel in a way that made more
work for himself and for me in the short term, but made the kernel
clearer, cleaner, and more maintainable in the long run. This lesson by
example of taking the high road and doing things right, instead of taking
the path of least resistance, made a very big impression on me at the time
and became an essential part of my programming philosophy.”
What Linus chose to do in this case was significant and typical. Upon
being presented with code that added major new functions to the kernel,
he did not bolt it on in the simplest way possible; instead, he used
lessons learned from the new code to extend and strengthen the existing
kernel for possible future developments. Although this meant more
work, not only did it provide a sounder base to work from but it helped
to ensure that (unlike many commercial software programs that become
increasingly spaghetti-like and convoluted as new features are added)
the structure of Linux would improve the more such functions were
added.
Linus adopted the same approach for Sladkey’s port of GNU Emacs to
GNU/Linux, an important addition to the collection of applications.
Sladkey describes this as “a classic example of how a port showed up
weaknesses or incompatibilities in Linux that were addressed not so
much by porting the program to the host but by adapting the host so that
the program required fewer changes.”
Linus had chosen this approach early on. “I even fetched GNU sources
off the Net and compiled them under Linux, and checked that they
worked just like they should work even though I didn’t use them person-
ally,” he recalls. “Just to make sure that, yeah, I got that part right.” In
doing so, he was building once more on many years’ work by Richard
Stallman.
Stallman had decided to create a Unix-like system that was not based
on AT&T’s source code in any way; instead, it used the features of Unix as
a template and implemented them independently. Because he succeeded
so well, the GNU suite of programs provided a bench test of Unix kernel
compatibility against which Linux could be measured without the need to
run or, just as important, to buy expensive commercial Unix software.
This approach of adapting the kernel to fit the applications paid off
handsomely when it came to porting what would prove to be one of the
strategically most important additions to the portfolio of GNU/Linux
programs: the X Window System.
Drew Eckhardt, who has followed Linux from the very earliest days,
rates the arrival of X as one of the three milestones in the history of
Linux. The other two are the virtual memory (VM) capabilities, which
Linus had added in December 1991, and networking.
At first, Linux had been controlled through a bare shell like GNU’s
Bash, one of the reasons for calling the overall operating system
GNU/Linux. This shell works rather like MS-DOS: Users type in com-
mands that cause the kernel to respond in certain ways. Such command-
line interaction was standard for the Unix world at the time; as a result,
most Unix hackers were completely at home entering these opaque com-
mands on a keyboard.
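To give a flavor of what such command-line interaction involved, here is a constructed bash session; the directory and file names are invented for illustration and do not come from the book:

```shell
mkdir -p /tmp/linuxdemo && cd /tmp/linuxdemo   # create and enter a working directory
echo "hello world" > greeting.txt              # redirect output into a new file
ls                                             # list the directory's contents
grep -c hello greeting.txt                     # count lines matching "hello": prints 1
```

To a newcomer raised on menus and icons, terse commands like these looked opaque; to a Unix hacker, they were second nature.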
The world of the Apple Macintosh or Microsoft Windows, then at ver-
sion 3.1, was a complete contrast. Here, control was effected by selecting
options from pull-down menus available from various overlapping win-
dows present on the screen, or, even more directly, by clicking on-screen
graphical icons.
For many years Unix had possessed the foundation of a similar
graphical approach; this was called X because it was the successor to a
windowing system called W, and it was a standard component on com-
mercial Unixes. Unlike the Macintosh or Microsoft Windows systems,
its principal use was not to make control easier (X on its own lacked
menus and icons) but to enable viewing and control of multiple pro-
grams simultaneously through several X windows open at once.
As soon as GNU/Linux began to offer more than the most basic Unix
functions, X climbed high on the wish list of many users. Because Linus
was busy improving the kernel, and windowing systems are not part of
that in Unix systems (unlike Microsoft Windows, where the graphical in-
terface is tightly integrated with the basic operating system code), some-
body else needed to step forward to start things moving.
The Linux movement did not and still does not have a formal hierar-
chy whereby important tasks like these can be handed out, an apparent
weakness that has proved a strength. A kind of self-selection takes place
instead: Anyone who cares enough about developing a particular pro-
gram is welcome to try. Because those most interested in an area are often
the most skillful, they produce high-quality code. If they succeed, their
work is taken up. Even if they fail, someone else can build on their work
or simply start again.
And so it was that in early 1992, Orest Zborowski started to port X to
GNU/Linux. Dirk Hohndel, who later took over responsibility for the en-
tire area of X for Linux, explains that “getting X ported wasn’t all that
hard, since X is already rather portable. The majority of the work
[Zborowski] did was actually adding features to the Linux kernel.” So
that he could obtain this important feature for systems running the
Linux kernel, Linus was once more prepared to accept changes. But un-
like other ports for Linux, X soon became part of a major independent
free software project that had interesting parallels to Linux in its origins
and subsequent development.
The X Window standard was managed by the X Consortium, a body
put together with the aim that whatever other splits there were in the
Unix world, at least windowing would be unified. To ensure that no
manufacturer could gain an unfair advantage from this work, the X Win-
dow standards were made freely available under the MIT X licence,
which essentially allows you to do anything with the code.
This was good news for those outside the consortium because it
meant that they could take the code for X and port it to other plat-
forms. For example, at the end of 1990, Thomas Roell had created
X386, which was designed for Unix running on the Intel 80386, the
chip Linus had in his PC. “Everybody said the Intel-Unix PCs would be
just toys,” Roell recalls, “but I wanted to have graphics.” Roell gener-
ously contributed the updated version of his code, X386 1.2, to the X
Consortium for inclusion in its next release of X Window, X11R5,
which came out in October 1991.
One user of X386 1.2 was David Wexelblat. He remembers how “the
X11R5 server itself was basically buggy as hell, a huge step backwards
from the X11R4 stuff.” Even more unfortunately, Wexelblat explains,
Roell “had taken the [X386 server] stuff commercial [so] there weren’t
going to be any free enhancements” from him to fix these problems. As a
result, Wexelblat and a group of other people started putting together
their own patches to the X386 server.
Two of them, “Glenn Lai and Jim Tsillas,” Wexelblat remembers,
“each independently were working on the performance problems,” one
of the main areas that needed fixing through patches to the X386 code.
“David Dawes, down in Australia,” Wexelblat continues, “had started
working on some of the various other bugs in the R5 server, and he was
also distributing patches.” Meanwhile, Wexelblat was working on his
own solution.
“And we were all out there, all doing these things and distributing lit-
tle bits and pieces out through Net news,” he says, referring to the
Usenet newsgroups that were commonly used at the time for passing
around code. “And after I had been doing that for a couple of weeks, I
got the brilliant idea, you know, this is really stupid, there are four peo-
ple doing this work and some of it’s at cross-purposes, and we’re dupli-
cating effort–why don’t we all get together and just produce one
package?
“We first started this in April ’92,” Wexelblat recalls. A month or two
later, they started distributing their combined patches. “We called what
we were distributing simply X386 1.2e,” an enhanced version of the cur-
rent X386 1.2. Afterward, Wexelblat and his fellow hackers renamed
their software:
These days most people don’t get the pun [Wexelblat notes], but this thing
came out of X386 which had gone commercial, and since we were doing
freeware, ours was XFree86. We really had no grand plans at that time, but
basically once we had gotten past this first bug-fixing thing, we started get-
ting requests for other stuff. Like, oh, we need a driver for this video card,
and oh, there’s this operating system out there, we need to support it. And
at that time, about June or July of ’92, we first started seeing the Linux peo-
ple coming to talk to us. There seemed to be from the start a philosophy in
the Linux community of well, we’re defining this from scratch, we might as
well define this to what’s as close to a standard as is out there.
Changing the kernel to follow standards in this way made it much eas-
ier for the XFree86 group to support GNU/Linux, and was a continua-
tion of the approach Linus had adopted when modifying his early kernel
to ensure better compatibility with GNU software.
The relationship between the XFree86 and GNU/Linux movements
became positively symbiotic: XFree86 for GNU/Linux made the latter
more attractive and brought it more users; and the spread of
GNU/Linux-based systems gave XFree86 the perfect, free platform. Dirk
Hohndel remembers discussing this with Linus: “We talked about the
synergy between the two. XFree86 wouldn’t be anywhere close to where
it is without Linux. And vice versa.”
That XFree86 came into being and flourished at just the time Linux
too was created and growing is no coincidence. Both arose then thanks
to the fall in cost of PCs based around Intel’s 386 chip. In a reply to a
posting in comp.os.minix in December 1992 in which the writer had
said, “I’ve never used a 386. I’ll never use one, unless you put a gun on
my head,” Linus wrote,
You are missing out on something. I programmed a 68k machine [his Sin-
clair QL, which had Motorola’s 68008 chip] before getting a PC, and yes, I
was a bit worried about the intel architecture. It turned out to be ok, and
more importantly: it’s the cheapest thing around.
Minix had successfully created a Unix clone on the earlier 8086 chip,
but it lacked such crucial features as virtual memory, the feature that Linus
had written in a few days before Christmas 1991, or any X Window
port. And although 386-Minix existed, Tanenbaum was unwilling for it
to be developed too far lest it lose its value as an educational tool, the
original impetus behind writing Minix. This left many Minix users frus-
trated and only too ready to switch to a faster-moving and more ambi-
tious project such as GNU/Linux when it came along.
Some of the success of GNU/Linux can therefore be attributed to hard-
ware that was powerful enough to allow full Unix-like features to be
adopted, and yet cheap enough for many people to own it. The arrival of
the 386 chip by itself was by no means enough to drive the GNU/Linux
revolution; this was shown by the history of 386BSD, another free Unix
operating system created to exploit the power of the Intel chip at the
time Linux came into being.
The project had been begun by Bill and Lynne Jolitz. The idea was to
port the highly popular Berkeley Software Distribution (BSD) variant of
Unix created at the University of California at Berkeley to the PC and
thus satisfy the needs of die-hard Berkeley fans who had bombarded
Tanenbaum with hundreds of e-mails every day in an effort to convince
him to turn Minix into something similar.
The early Linux hacker Richard Sladkey says of the time he was trying
to gauge which one he should install, “I was reading both the Linux and
386BSD [news]groups. Most anyone who was familiar with Unix knew
that [the widely used editor] vi and networking came from Berkeley, not
AT&T, so the acronym BSD automatically commanded some respect. A
lot of people just assumed 386BSD would get there first [ahead of
GNU/Linux].”
Linus says, “I knew about the 386BSD project by Jolitz because he
wrote about it in Dr Dobb’s [Journal],” a major series of articles that ran
from January 1991 in that magazine, the leading programmer’s title of
the time. Linus told Linux News in 1992 that “386BSD was helpful in giv-
ing me some ideas,” but acknowledges, “if 386BSD [had] come out a
year earlier, I probably wouldn’t have started Linux. But it was delayed,
and delayed some more. By the time it actually came out, I already had a
system that worked for me. And by that time I was so proud of [its] be-
ing mine that I didn’t want to switch.” Moreover, “I already had a few
users, maybe a few hundred users,” he says. This was in early 1992.
In addition to the delay, there were a couple of other factors that gave
GNU/Linux important advantages in its competition for users. Sladkey
recalls that “Linux could multiboot with other OSes, while 386BSD re-
quired a whole disk [formatted in such a way] that wasn’t compatible
with [MS-]DOS. So for just that one reason alone I chose Linux because I
couldn’t afford a whole ’nother computer.”
Given the huge installed base of MS-DOS, Sladkey was certainly not
alone in wanting to be able to run DOS programs too, and 386BSD’s ap-
proach doubtless lost it users, especially as Microsoft Windows became
more popular after the release of version 3.1. It is ironic to see Microsoft’s
success on the desktop emerging as a factor that helped GNU/Linux gain
a crucial early foothold.
Others were put off because 386BSD required a more powerful PC
than did GNU/Linux. Lars Wirzenius recalls that 386BSD needed “a 387
coprocessor, which I didn’t have, and so there was no chance that I
would actually look at that one.” The Intel 80387 was a math coproces-
sor used alongside the 80386 to speed up numerical routines. Linus, by
contrast, had chosen to emulate the 387 in software (or some of it at
least; full 387 compatibility was added by the Australian, Bill Metzen-
then). This meant that Linux users did not need to add the then-costly
387 chip. Once again, whether by luck or judgement, Linus was making
all the right decisions.
Last, there was a legal cloud hanging over 386BSD. Sladkey explains
that “at the time, the AT&T versus Berkeley lawsuit was going on” over
whether the BSD Unix included proprietary material from AT&T’s Unix.
“And even though everyone on Usenet was in favor of UCB [the Univer-
sity of California at Berkeley, which owned BSD],” there was a concern
that maybe some shady stuff had happened, “and even if it hadn’t,” as
Sladkey says, “AT&T would probably win, which would throw anything
derived from BSD into question,” and that included 386BSD. “So this le-
gal doubt over the outcome of the lawsuit played favorably towards
Linux, which was a clean room implementation,” he adds.
Sladkey reckons that Linux needed all the advantages it could get at
this time “because BSD was a complete OS that just needed to be ported
[to the 386], whereas Linux was a kernel looking for a few good utilities
that could make it look like a real OS”; these were utilities that would
generally be obtained from the GNU project. Linus confirmed this view
in the Linux News interview. Referring to 386BSD, he said, “It’s bit scary
to have big and well-known Unix that fills a similar niche as Linux.”
Potentially, then, 386BSD might still have won out against GNU/Linux
once the port was finished, and the price of 387 chips fell. But as Sladkey
notes, “After getting involved with Linux, I continued to read both of the
newsgroups and it became clear that I had made the right decision. Re-
leases [of 386BSD] were infrequent, patches didn’t make it into releases
when they did happen, etc.” Drew Eckhardt confirms this view: “Bill
Jolitz didn’t accommodate users’ needs. He said, ‘ours is a research sys-
tem,’ meaning we’re not going to accept patches, and it doesn’t have to
tralia, but the Net’s ability to cancel distance meant that Linus could turn to
him for help whenever he needed it, and often receive a response within an
hour. E-mail also played a key role in overcoming Linus’s early awkward-
ness with people. “One of the reasons I liked [e-mail] so much at first,” he
says, “[is that] you don’t see anybody, so you don’t have to be shy.”
The Linux code grew out of the basic terminal emulator program Li-
nus had written so that he could access Usenet newsgroups held on the
university computer. Although he read a lot of news in those early days
of Linux, he notes that he “wasn’t really writing a lot of news.” When he
did post, it often had a major effect on Linux; for example, in asking
about the Posix specifications in the comp.os.minix newsgroup, Linus
had come into contact with Ari Lemmke, who offered him space on his
FTP site, a critically important step.
“I made a lot of good design issues with Linux,” Linus said in 1996,
“but the best design decision I ever made was making it available by
FTP.” This implies that making it available in that way was by no means
a foregone conclusion, as it might be today when everything is routinely
uploaded to a Net server. Linus explains that “what happened was not so
much that people began sending me patches, because that didn’t happen
at first. It takes a while before people get so used to the system that they
start making their own changes and sending them back. But what hap-
pened immediately was all the feedback.”
Lars Wirzenius has no doubts that this was critically important for his
friend at this point. He says, “Without the Internet, Linus would have
created 0.01 and maybe 0.02, then he would have become bored [and
moved on to something else]”; one of Linus’s key character traits is a
constant need for new challenges. Linus confirms this view: “Without
Usenet, I’d probably have stopped doing Linux after a year or something
. . . it was a fun project, but at some point I’d have felt that hey, it’s done,
I’ve proved it, I did this thing, it was fun. What’s the next project in life?
“But because people started using it, motivation went up, there was a
sense of responsibility, and it got a lot more motivating to do it. And so
thanks to Usenet I just continued doing it.”
Just as the Net motivated Linus, so it would motivate the hackers who
started working alongside him. Users who found bugs in software and
perhaps had suggestions for their resolution (because the source code
was available) could contact the relevant author directly using the Inter-
net. The user benefited by responses that often came in hours or even
minutes, and the hacker received instant feedback and kudos.
The Net allowed hackers almost anywhere in the world, starting with
Linus in a country that found itself off the beaten track geographically,
Patching Up
IT MIGHT SEEM STRANGE THAT LINUX, a system that was born and
grew up across the Internet, lacked the ability to connect to it for the first
eighteen months of its life. The explanation is that Linux was born on
the cusp of the current Internet era, just as it moved out of academe into
the mainstream; this is reflected by a comment Andrew Tanenbaum
made during the “Linux is obsolete” episode. On 3 February 1992 he
wrote, “A point which I don’t think everyone appreciates is that making
something available by FTP is not necessarily the way to provide the
widest distribution. The Internet is still a highly elite group. Most com-
puter users are NOT on it.” Linus, by contrast, said, “the best design de-
cision I ever made was making it available by FTP” for downloading
from a Helsinki-based server connected to the Internet.
In effect, Linux came out at the right time to ride the growing wave of
interest in the Internet. The reason early computers running GNU/Linux
had not needed to connect to the network is explained by Olaf Kirch, au-
thor of Linux Network Administrator’s Guide, and one of the key figures in
the Linux networking world. “The point is that everyone was on the In-
ternet” already, he says, “otherwise they wouldn’t have been able to par-
ticipate in Linux development. I think most people were at universities
and therefore had decent connectivity.”
This comment throws an interesting light on where and how
GNU/Linux was used in those early days. In Kirch’s view, most people
Suppose Fred van Kempen . . . wants to take over, creating Fred’s LINUX
and Linus’ LINUX, both useful but different. Is that ok? The test comes
when a sizable group of people want to evolve LINUX in a way Linus does
not want.
I think coordinating 1,000 [software] prima donnas living all over the world
will be as easy as herding cats.
Linus posted his reply to the newsgroup the next day, 6 February
1992:
months before I expect to find people who have the same “feel” for what
happens in the kernel. (Well, maybe people are getting there: [Ted Ts’o] cer-
tainly made some heavy changes even to 0.10, and others have hacked it as
well.)
In fact I have sent out feelers about some “linux-kernel” mailing list
which would make the decisions about releases, as I expect I cannot fully
support all the features that will /have/ to be added: SCSI etc, that I don’t
have the hardware for. The response has been nonexistent: People don’t
seem to be that eager to change yet . . . if Fred van Kempen wanted to make
a super-linux, he’s quite welcome.
Yes, coordination is a big problem, and I don’t think linux will move away
from me as “head surgeon” for some time, partly because most people un-
derstand about these problems. But copyright /is/ an issue: if people feel I
do a bad job, they can do it themselves.
the only thing the copyright forbids (and I feel this is eminently reasonable)
is that other people start making money off it, and don’t make source avail-
able etc. . . . This may not be a question of logic, but I’d feel very bad if
someone could just sell my work for money, when I made it available ex-
pressly so that people could play around with a personal project.
The GNU GPL allows others to take the Linux source code and modify
it provided they make the modifications available; that is, if someone else
makes a “super-Linux,” there is nothing even Linus can do to stop that
person if the source code is provided. The advantage of this approach is
that Linus could then take changes he approved of from such a “super-
Linux” and fold them back into “his” Linux. The downside was that the
GNU GPL made such divergences possible in the first place.
Linus’s answer to the growing split in the networking code between
Fred van Kempen’s work and Alan Cox’s code proved to be the same as
the one he gave in February when he replied to Tanenbaum’s posting. Al-
though he could not forbid anyone to develop in a certain way, he could
use his authority (built, as he had rightly said, on his knowing it “better
than anybody else” and on his proven ability in managing the kernel de-
velopment) to approve one branch of the code, thus making it part of the
dards. “The protocol stuff was mostly right, it was just everything under-
lying was a bit flaky or had holes in it.”
Fixing code that “was a bit flaky or had holes in it” was one of Cox’s
fortes. “Cleaning horrible code up was one thing I appeared to be good
at,” he says, and he soon emerged as Linux’s bug-fixer par excellence. “I
often figure bugs out in my sleep,” he confesses.
As well as debugging and writing new code, Cox also began to assume
the important additional role of one of what are often called Linus’s
“trusted lieutenants.” These are senior hackers who are responsible for
certain areas of the kernel. Their job is to filter patches from other hack-
ers, check them, and then pass them on to Linus. Alan Cox recalls,
“Fairly early on, people started sending me things” for the networking
code. “If there’s anything they’re not sure about, someone would say, ‘I
think this is a fix but I’m not sure, what do you think?’”
Linus had never planned the addition of this crucially important de-
velopment infrastructure; it arose in response to the situation. As he said
in 1996,
I’ve never made any conscious decisions that this is how I want it to work
in the future. All the changes have been automatic responses to different
circumstances. At first, people just sent me patches directly, and what hap-
pened was I got to trust some people because I’d been e-mailing with them;
I knew they were technically sound, and did the right thing [a key hacker
concept]. Other people started sending patches to these people I trusted
because that offloaded some of my work because the people I trusted
checked the patches.
It’s worked pretty well. In some cases I tell people who e-mail me di-
rectly, ‘Can you send it to this other person just to check?’ because I don’t
have the hardware, for example. The other person’s the guy in charge of
that part. And then after that, it kind of automatically goes the right way. So
there’s maybe ten or twenty who are in charge of certain parts of the kernel
like networking, SCSI drivers subsystem, whatever. And then there are a lot
of people who have one specific driver that they are in charge of. Usually
the original authors of that driver, but in some cases they may have gotten a
new machine and somebody else has taken over that driver. Stuff like that I
generally apply directly, because I can’t test it and I just have to trust the
person who wrote it originally.
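The patches that changed hands in this way were plain-text diffs, made with the standard Unix diff tool and applied with patch. A constructed example of the round trip follows; the file name and contents are invented for illustration:

```shell
mkdir -p /tmp/patchdemo && cd /tmp/patchdemo
printf 'int version = 1;\n' > driver.c       # the code as the maintainer has it
cp driver.c driver.c.orig                    # a contributor keeps a pristine copy...
printf 'int version = 2;\n' > driver.c       # ...and makes a fix in a working copy
diff -u driver.c.orig driver.c > fix.patch   # the unified diff is what gets e-mailed
cp driver.c.orig driver.c                    # the maintainer still has the old code
patch driver.c fix.patch                     # applying the diff brings in the fix
grep -c 'version = 2' driver.c               # prints 1: the file now carries the fix
```

Because a diff records only the changed lines plus a little context, it was small enough to pass around by e-mail or Usenet, and a trusted lieutenant could read it, test it, and forward it to Linus.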
ver it’s really so localized that it should never impact anything else.” In
other words, the kernel design is highly modular, with clean interfaces
between the parts. “It has to be so, when there’s people all over the world
doing this,” Linus noted. “There’s no weekly brainstorming session
where everybody gets together and discusses things.”
This model had been evolving even before Cox had taken over respon-
sibility for the networking code. Generally, though, the kind of show-
down between rival strands of development that had taken place there
had been avoided. The case of another trusted lieutenant, Ted Ts’o, is
typical.
Like Cox, Ts’o was born in 1968, but a third of the world away, in Cal-
ifornia. His father was a doctor, and his mother helped run the family
practice. It was through his father’s research work that Ts’o first came
into contact with computers: “I got started at a relatively early age, play-
ing with my father’s [Digital] PDP-8, and PDP-15,” Ts’o recalls. “In fact,
those were the only computers I really had to play with until high
school.” He programmed these Digital machines not in a high-level lan-
guage such as Basic, but in “raw assembler.” When Ts’o went to MIT, in
1986, he studied computer science and graduated in 1990.
Soon afterwards, in the fall of 1991, he came across GNU/Linux. This
would have been version 0.02 or 0.03, because Ts’o had contributed to
the kernel as early as 0.10, which came out in December 1991. He added
new code that gave users extra ways of controlling programs that were
running. The motivation for sending this code to Linus is interesting.
“Part of the reasons why I did it,” Ts’o says, “was just simply that it
needed to be done, and it would make it a whole lot more useful for my-
self.” But also, he says, “the thought did occur to me, gee, if I did this, I
bet a lot more people would find Linux useful.
“I think a number of people instinctively knew that there was such a
thing as critical mass, and if you could get enough people working on
[Linux], then you might get something very interesting out.” Moreover,
Ts’o and others had already put this theory into practice elsewhere:
“There were a lot of folks who really understood that you could con-
tribute changes back to [free programs], and they would get accepted or
not, but that the end result was a much better product.”
As well as continuing to feed through patches, Ts’o made another im-
portant contribution to the GNU/Linux movement in the early days by
setting up the first site in the United States that carried the Linux kernel
and related software. Ts’o explains the background to this move. “I no-
ticed that the only place you could get Linux in those days was on ftp.fu-
Responsibility for one major area, that of the Linux file system ext2, is
shared with another of the most senior Linux hackers, Stephen Tweedie.
Tweedie was born in Edinburgh, Scotland in 1969. Both his parents
worked in the computer industry, and Tweedie says, “I can’t ever remem-
ber not having been in the presence of computers.”
When he was fourteen, he was already creating operating system ex-
tensions for his Sinclair ZX-81 micro, a machine Alan Cox had also used
as a boy. Tweedie studied computer science at Cambridge University; af-
ter graduating, he returned to Edinburgh to do research at the Scottish
capital’s university, where he discovered GNU/Linux, around the begin-
ning of 1992. “I didn’t specifically get into it for the single purpose of
hacking on it,” he says, “but I did do that from very early on.”
Tweedie echoes the explanation of Ts’o as to how people came to be as-
signed areas of responsibility. “It’s very simply a case of who’s actually
working on something,” Tweedie says, “and who has got a track record
of making good choices. And if you are working on something and
you’ve got credibility in the eyes of other people who are doing that, then
other patches tend to come through you.
“There are people who are generally recognized as being not owners
but simply experts in particular areas. The community takes advantage
of that and works through those people to get work done in those areas.”
In other words, “it’s very, very much a meritocracy,” he says, and repre-
sents a kind of natural selection by the community.
Linus’s willingness to defer to his lieutenants and to hand off responsi-
bility for major chunks of the kernel has had another important effect.
As the continuing contributions of Alan Cox, Ted Ts’o, and Stephen
Tweedie over the last decade indicate, they have all remained deeply
committed to the Linux movement for many years, even though any one
of them could easily have started up and led his own rival group—as
Linux’s license permitted.
This fierce loyalty to a single strand of development stands in stark
contrast to other free Unix-like kernel projects. For example, after
386BSD lost momentum, NetBSD was started. As the announcement by
the founder, Chris Demetriou, of NetBSD 0.8 in April 1993 explained,
NetBSD was “based heavily on 386BSD 0.1,” with various patches ap-
plied and new programs added. Similarly, FreeBSD, “had its genesis in
the early part of 1993, partially as an outgrowth of the ‘Unofficial
386BSD Patchkit,’” as Jordan Hubbard, one of its instigators, writes in
his history of the software. Yet another offshoot of 386BSD is the
OpenBSD project, led by Theo de Raadt, which places particularly strong
emphasis on security.
Linus placed copies of the boot and root disks on the Helsinki server,
where he also placed the source code. Because both the source and bina-
ries were freely distributable, several other sites around the world started
“mirroring,” or copying, them. One of these was the Manchester Com-
puting Centre (MCC), part of the University of Manchester, in the
United Kingdom.
After simply mirroring Linus’s early boot and root disks, MCC decided
to make its own distribution, or collection of files that could be used to
install GNU/Linux. The first MCC Interim version, using the 0.12 ker-
nel, appeared in February 1992. The Readme file that accompanies it
gives a fascinating insight into what was involved in getting GNU/Linux
running in those pioneer days.
The Readme explains, “The MCC Interim versions of Linux are de-
signed to allow people who are not Unix experts to install a version of
the Linux operating system on a PC. The installed system should be self-
contained, but easy to extend.” This indicates that there was already de-
mand from people who were not Unix experts for a distribution that
would allow them to install GNU/Linux on their PCs.
The Readme goes on, “Our versions are called ‘interim’ because they
are not intended to be final or official. They are small, harmonious, and
moderately tested.” The last point was important: Not only was it neces-
sary to debug the Linux kernel; as extra components were added to a dis-
tribution, it was vital to check that they would all work together, that
they were “harmonious.” This kind of large-scale debugging became one
of the key tasks of people putting together distributions.
As the MCC Readme notes, “Very shortly after the first MCC Interim
version of Linux appeared, other people released similar versions: Dave
Safford’s TAMU [Texas A&M University] releases, and Martin Junius’s MJ
versions were eventually followed by Peter MacDonald’s massive, com-
prehensive SLS releases and H. J. Lu’s small base systems.”
Peter MacDonald was one of the first Linux hackers to submit patches
to Linus, but he became better known for putting together the Softland-
ing Linux System (SLS) releases. As the MCC Readme says, these were
“massive” and “comprehensive.” That is, they tried to provide not just
the kernel and basic utilities, but many of the GNU/Linux ports that
were starting to appear; this included such major new elements as X
Window and TCP/IP.
The drawback of this approach is that as the Linux kernel grew more
complex, and the choice of applications richer, the task of keeping up-to-
date with everything and making it all work harmoniously grew ever
harder. This meant that the SLS distribution, admirable as it was in its in-
the idea that anyone could help build a kernel had been when Linus
opened up his project.
The parallels between Linux and Debian go deeper. Just as Linux
arose in part from a frustration with Minix, so Debian grew out of dissat-
isfaction with SLS. Both projects used the Internet as a medium for col-
laboration, and both were emphatically open, whereas Minix and SLS
had been more controlled and closed. Moreover, the Debian project also
adopted Linus’s idea of parceling out areas to his “lieutenants.” As Mur-
dock explains, “Debian would be based on the idea of a package, and all
these people who wanted to work on it could then take responsibility for
all these different packages.” He notes, “Other people taking on subtask-
ings is very much how Linux worked.”
To make this approach function, he says, “We would define standards
and rules that would allow a package from any source to be able to fit
into the system well. So that when you take all these packages and you
install them, you get an entire system that looks like it’s been hand-
crafted by a single closely knit team. And in fact that’s not at all how it
was put together.”
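Murdock’s “standards and rules” survive today as Debian’s package metadata. As a purely illustrative sketch, here is roughly what the control file of a modern Debian binary package looks like (the field set has grown considerably since 1993, and the package details below are invented):

```
Package: hello
Version: 2.10-3
Architecture: amd64
Maintainer: Jane Hacker <jane@example.org>
Depends: libc6 (>= 2.34)
Section: utils
Description: friendly greeting program
 A one-line synopsis above, and a longer description here,
 indented by a single space.
```

Because every package, whoever maintains it, declares its dependencies in this uniform way, tools can combine packages from hundreds of contributors into one consistent system, which is the “hand-crafted by a single closely knit team” effect Murdock describes.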
Even the decisionmaking process of Debian was modeled on Linux.
“Essentially what would happen is, we would be presented with a deci-
sion point,” Murdock explains. “And I would ask people who were
working on Debian, ‘What do you think we should do?’ And that would
spark some conversation, some discussion, and possibly some disagree-
ment. And then I’d make a decision,” he says. “A lot of the time my deci-
sion reflected what the group wanted, and some of the time it didn’t.
This is really the way Linux works, too.”
Along with a huge response from those interested in helping with
the Debian project, in the fall of 1993 Murdock received an e-mail from
Richard Stallman on the subject. “He said, ‘We’re learning more about
Linux and we’ve been thinking about distributing it ourselves,’ because
at the time their GNU kernel was taking longer than expected,” Murdock
explains. “They saw Linux appear on the scene and use pretty much the
rest of the system they’d been putting together.” That is, the new distrib-
utions put together the Linux kernel with many elements of the GNU
project to form a complete operating system. “He basically said, ‘I’m in-
terested,’” Murdock recalls.
Stallman’s approach proved important for Debian. “His early interest
was critical in not only giving me the confidence to do it, but also giving
it an air of legitimacy,” Murdock says. “I really think that, more than any-
thing else in those early days, is what caused people to take notice and
get involved.” But Debian was also important for the GNU project and
Stallman: “He basically got to know Linux through his involvement with
Debian,” Murdock believes.
There were two consequences. First, the Linux kernel became a candi-
date to complete the GNU project begun ten years before and held up at
that time by delays in finishing the Hurd kernel. In a sense, the Debian
project led to the formal completion of Stallman’s dream of a Unix-like
operating system that was freely available. The other consequence was
that Stallman’s Free Software Foundation sponsored the development of
Debian in the early days. Murdock explains why: “We took a while to
come out with our first releases. Part of the problem had to do with the
fact that we were entirely reliant on volunteers, [and] we had such a big
job to do. So in a large sense I needed to get paid to keep working on De-
bian because my attention to it when I had the free time simply wasn’t
cutting it.”
Fortunately, Stallman recognized this need and offered to pay Murdock
to work on Debian through the Free Software Foundation (FSF), another
example of Stallman’s generosity in funding others and his growing
recognition of the importance of Linux-based distributions.
But relations with the idealistic and uncompromising Stallman were
not easy. Murdock explains the situation: “By this time, Debian was
much larger than myself,” he says, “and while I had certain ideas about
what Debian should be, I was no longer the single opinion. There were a
lot of people who didn’t agree with the FSF’s goals.” As a result, “the FSF
was sponsoring Debian, so they had certain expectations of me; and at
the same time, I was asking people to do work in their free time and they
had certain expectations of me. Sometimes those expectations did not
match up very well. There turned out to be some conflicting goals,” he
concludes.
Partly because of these conflicts, Murdock decided to stand down as
Debian leader in March 1996. There were other important factors in his
decision. “I wanted to finish my degree,” he says, “and I had recently
been married and I wanted to spend more time with my family. I’d been
doing it for three years, and I think I was ready for something new as
well.” His successor was Bruce Perens. “Bruce was a natural choice, actu-
ally,” Murdock says. “He had been involved with Debian for a long time,
and he had shown a large amount of assertiveness and ability to deal
with people.”
Perens gives some more details on the problems with Stallman.
“Richard wanted to have a part in the technical direction, and the devel-
opers on the project didn’t really want a boss,” he says, “they were all
volunteers. If they wanted to take any direction it would be from some-
one who was working with them on the project.” Moreover, Perens says,
“We took issue with some of his technical direction because we felt it
would lower the performance of the system and make it take up more
disk space. We decided that we just could not live with FSF having a
right to tell us what to do, and so we split off our own organization. We
figured we could make as much money as we needed just from dona-
tions, which was the case.”
But the dispute did not end there. One of the early Debian developers
was Linus’s friend and fellow student Lars Wirzenius. He explains what
happened: “When Richard Stallman started to notice Linux a lot, he
started to make a lot of noise about the relationship between the Free
Software Foundation and the GNU project on the one hand, and the
Linux community on the other hand.
“His point was mostly that the Linux community would not exist
without all the free software that the Free Software Foundation was pro-
ducing,” Wirzenius says. “And he was right about that, so his point was a
good one.” But from this good point, according to Wirzenius, Stallman
made an unfortunate deduction: “He decided that Linux should be re-
named LiGNUx,” pronounced “LIG-NUX.” Stallman doubtless found
this wordplay irresistible in its combination of both cleverness and, as he
saw it, justice.
“And that was a very terrible thing to try to do,” Wirzenius notes, “be-
cause one of the things that people get very attached to is the right for
them to decide on the name of the thing they have created. And the GNU
project didn’t have very direct input into the kernel itself, they just made
the tools for it, and the tools necessary to add to the kernel. The way that
Stallman tried to push his views made it very distasteful and it almost felt
like he was betraying everyone in his community.” It felt like betrayal,
Wirzenius says, “because up until that time it had been all the free soft-
ware people against the evil people who did not give source code away.
But then Stallman started to fight, or at least it looked like he was start-
ing to fight the Linux community [because of] the fact that he wanted
Linux to be renamed.
“My reaction was fuelled somewhat by the fact that initially Debian
had been partly funded by FSF,” he says, “which was a very good thing of
them to do.” Perhaps because of this earlier help, and despite this ugly
incident, the Debian team agreed to call their distribution Debian
GNU/Linux. “And that is precisely because the Debian people agree with
Stallman that GNU is such a big part in the operating system,” Wirzenius
says. Perens explains: “Richard asked me to do that, when I was still De-
bian project leader, and it sounded fair to me. In fact, I don’t believe any-
one [in Debian] objected to it at the time.”
Linus tended to remain aloof from such squabbles. “I don’t get too
involved,” he said in 1996, but added, “personally I don’t think GNU/
Linux flies as a name; it should be catchy.” And he pointed out that “it’s
not just GNU programs we are using”—the X Window software, for ex-
ample, comes from the XFree86 group—“so personally I prefer to have it
called just Linux and then give credit where credit is due, and that’s cer-
tainly with the GNU project, too. Most of the Linux [software] literature
has the GNU copyleft and I try to make it very clear that there is more
than just the kernel, and a lot of it is GNU stuff.” Nonetheless, it is cer-
tainly true that GNU software represents the lion’s share of most distrib-
utions, and there is a strong argument that GNU/Linux, even if it does
not “fly” as a name, is a fairer description.
Like the Debian team, Ian Murdock harbors no ill-will against Stall-
man for his attempts to influence the development process, or even to re-
name Linux. “We wouldn’t have our current idea of free software
without him, and without his uncompromising stand, which has not wa-
vered an inch the past fifteen years,” Murdock says, “and I think that’s
very important.” But he recognizes that “some people disagree with
me”—Stallman tends to polarize opinions.
That Murdock views Stallman so positively is not surprising; one of
the main motives for creating the Debian distribution in the first place
was to create a foil to what he viewed as an irresponsible commercializa-
tion of GNU/Linux: “I saw that Linux was one day going to be a com-
mercial product,” he says, “and I was concerned at the way it might
become a commercial product.
“One of the problems with SLS was it was starting to be commercial-
ized a bit,” he continues. “You were seeing advertisements in magazines
talking about Linux, and some of the features they were advertising were
just simply false.” He emphasizes that “it wasn’t the guy who was doing
SLS, it was people who were just appearing and selling diskettes.” This
was an important new development; it represented the first signs that
people outside the GNU/Linux movement had seen a business opportu-
nity in this fast-growing market. And it was from these modest and not
entirely praiseworthy beginnings that the huge GNU/Linux and open-
source industry of today came into being.
Although Murdock was concerned about new distributions based on
SLS, it was Slackware, SLS’s successor as the most widely used distribu-
tion, that really helped to create the commercial Linux world. Murdock
recalls, “Slackware and Debian had essentially the same origins in that
we were both dissatisfied with SLS. We both got started without knowing
about the other, and once we found out about each other we started de-
ciding well, maybe we ought to combine our efforts here. Nothing ever
really came out of that, partly because we had different goals. I wanted to
take the distributed development path, and [the person behind Slack-
ware] wasn’t as interested in that; he thought that that would be a disas-
ter waiting to happen; trying to get dozens or hundreds of people
pointed in the same direction is a fairly difficult task”—obviously shar-
ing Tanenbaum’s view that it would be like trying to herd cats.
While Debian evolved into the purest distribution, created in the same
way that the Linux kernel was built, Slackware first of all took over from
SLS, which Peter MacDonald had stopped supporting, as the leading dis-
tribution, and then spawned a variety of new distributions itself. In the
June 1994 issue of Linux Journal, in what was only the second interview
this new magazine had conducted (the first was with Linus), Slackware’s
creator, Patrick Volkerding, said, “It would be nice to make money as a re-
sult of [Slackware], but not from selling the actual package.” This indi-
cates that Slackware was very much part of the earlier, noncommercial
tradition of GNU/Linux distributions that existed mainly to serve the com-
munity. The official Slackware was sold not by Volkerding but by Walnut
Creek, which produced many popular CD-ROMs containing free software.
The arrival of cheap CD-ROM technology in the early 1990s probably
played as crucial a role in the commercialization of GNU/Linux distribu-
tions as the arrival of cheaper PCs using the 80386 did in the original
genesis of Linux itself. Had this new medium not been available at the
right price, there would not have been the sudden flowering of companies
selling low-cost CD-ROM-based distributions. Some of the credit for ush-
ering in the CD-ROM as a viable distribution medium must go to Mi-
crosoft. As early as 1987, it had shipped the Microsoft Bookshelf in this
form, which the company described as “the first general purpose applica-
tion to bring the benefits of CD-ROM technology to personal computers.”
The availability of CD-ROMs as a distribution medium for GNU/Linux
was important not just because they were cheap (typically a few tens of
dollars) and convenient. As more and more programs were ported to
GNU/Linux and placed as a matter of course on these CDs, they pro-
vided a huge boost to the nascent GNU/Linux community. They not only
provided much needed tools and applications but allowed the same dis-
tributed development of software to occur beyond the Linux kernel,
where it had been perfected. Already, then, the desirability of GNU/Linux
“Our relationships with free software developers around the world are
absolutely vital,” Ewing said, recognizing that his company depended on
them for the continuing progress of the product they sold. “We help de-
velopers out by contributing hardware and money to organizations like
the Free Software Foundation, the Linux Documentation Project, the
XFree86 Project, and others. We are absolutely committed to the free
software community.”
In addition to demonstrating its commitment to the developer com-
munity, Red Hat had to win the confidence of users, particularly those in
the business world, which was just starting to explore GNU/Linux as an
option; a lack of direct support from the developers was one of the cor-
porates’ main concerns.
A strength of the GNU/Linux development model was that practically
every software author could be contacted directly by e-mail, or indirectly
using Usenet newsgroups. Stories abound of people’s posting queries
about bugs in software and receiving fixes within hours from the original
author. But for companies, this novel and slightly ad hoc approach was
not enough; they needed someone they could turn to who would be reli-
able. As a result of this demand, Red Hat and other GNU/Linux distribu-
tions started to offer free and paid-for support with their products. “We
offer free installation support with Red Hat Linux, and we offer support
contracts for those that need long-term security,” Ewing said in 1996.
Apparently a minor point, the provision of support for GNU/Linux
systems would prove a key development. On the one hand, it provided se-
curity for companies that were contemplating using free software in a
business context; on the other, and just as important, it provided a vital
new source of income for companies that were, after all, selling software
that was also generally freely available.
Despite GNU/Linux’s growing success, in 1996 Ewing was still unsure
how Red Hat would ultimately fare in the corporate world. “As a solu-
tion to drop on a secretary’s desk Linux is pretty lacking at the moment,”
he admitted. “Our Applixware products”—proprietary software from
Applix, and one of the earliest business applications for GNU/Linux—
“does bring a high-quality office suite to Linux, but there are not a lot of
other end-user applications available. How this pans out in the years to
come remains to be seen.”
In summary, he described the issue of how important GNU/Linux
would become in a business context as “the $1,000,000 question.” Ew-
ing was out by a factor of over 1,000: Just three years later, Red Hat’s
value was several billion dollars. And yet in 1996, the company that
Linux community, for the hackers on the Net,” Sparks explained. “We
then do two other products targeted towards resellers and the commer-
cial environment, with many more commercial components, features,
management capabilities and so forth.”
The commercial components came on a separate Solutions CD, and
were ports of applications from major software vendors such as the Ger-
man company Software AG, which produced the heavyweight Adabas D
database. As Sparks recognized, “One of the dilemmas we or anyone has
when introducing a new operating system or system-level product is you
have got to have the application base. An operating system in and of it-
self only solves a certain set of solutions, a pretty tight niche, but if you
can bundle it with third-party products the range of solutions it can offer
is quite large.”
The ports of commercial software often arose in a peculiar manner.
Sparks explained, “The Software AG partnership has kind of stemmed
from the fact that many people inside of Software AG, particularly the
engineers, had started to port without management’s knowledge—that
isn’t uncommon, we find. We just called a company not long ago and
said we would like to talk to you about your Linux port, and they in-
sisted they didn’t have one when we knew that they did. And we said,
well, you may want to go check with your engineers because you really
do have a port done.”
The arrival of these proprietary commercial packages enabled
GNU/Linux systems to be offered as solutions in a wide range of applica-
tions. But Caldera tended to concentrate on one area where GNU/Linux
was now particularly strong. As Sparks said, “Many people perceive
Linux as a kind of a hacker’s Unix, and we come in and say, well, yes, it’s
that, but do you know what it really is? Linux is a connectivity product;
you want to connect to a remote office, you want to connect to the Inter-
net, you want to connect to this or that peripheral device, Linux can do
that.”
This was a shrewd move on Caldera’s part in several ways. The busi-
ness use of the Internet was growing rapidly at this time, and many IT
departments had to meet two apparently contradictory requirements: to
get their company online fast with reliable software, and to do it without
any extra budget. GNU/Linux systems offered the perfect solution. They
were now robust and yet extremely low-cost. As a result, many compa-
nies were using GNU/Linux without even knowing it.
Sparks described a typical situation. “We had a company call up,” he
said, “an MIS [Management Information Systems] guy at a large com-
pany here in the U.S., who said, ‘We’re using Linux for gateways and
other things to attach all our Internet sites. My MIS director just found
out that I’m running our company on Linux.’ And he said, ‘My MIS di-
rector said we’re not going to run our company on no free software.’ So
he said, ‘Can you help me,’ and I said, ‘Well, you bet I can.’ We set up a
meeting, I went out and said, ‘Yes, Linux is free, but we’re a real vendor.
If you have problems we’ll stand up and take the bullet. We’ll fix it for
you and we’ll give you more value-added than you expect.’ And that is
very typical, we’ve done that kind of presentation now dozens of times
here in the U.S.”
This provides interesting evidence that GNU/Linux was already being
widely used in business, but that few companies would own up to using
it. Mark Bolzern confirmed this. Around the same time, in 1996, he said,
“Nearly every one of the Fortune 5000 have bought our product and use
it in various ways. In most cases, management does not yet know and
denies it.” He added, “Even Microsoft uses our Linux extensively and
has bought hundreds of copies.”
Alongside this burgeoning if hidden commercial activity based directly
on GNU/Linux software, ancillary sectors began to emerge. For example,
companies sprang up that offered ready-built systems optimized for run-
ning GNU/Linux. One of the earliest of these was VA Research, a com-
pany that would grow to become one of the main players in the
GNU/Linux world.
Its founder was Larry Augustin, born in Dayton, Ohio, in 1962. “I was
a graduate student at Stanford in electrical engineering,” he says. “Most
of the work we did at the time was on Sun workstations. I thought, gee,
if I had a Unix workstation at home I’d be a lot more efficient and I’d
graduate sooner. Now, I couldn’t afford a Sun SparcStation”—which cost
around $7,000 at the time—but “I’d seen this thing called Linux.” He de-
cided to build his own system, and run GNU/Linux on it. “I was able to
put together a machine for about $2,000 that was one and a half to two
times faster than that $7,000 Sun machine.
“Other people saw what I’d put together, and they said, ‘Gee, can you
do one of those for me?’” Augustin recalls. “So people started paying me
to put these systems together for them.” To begin with, Augustin sold the
systems through a pre-existing company called Fintronic. Although this
business soon prospered—“About a year later I had three people full-
time just putting systems together,” Augustin says—his life had almost
taken a different course.
While he was at university, he got to know two fellow students, Dave
Filo and Jerry Yang. “Dave and Jerry had started doing this directory of
Internet sites,” Augustin says, “that they were running out of their rooms
at Stanford.” Augustin was also into the Internet by this time. “I had put
up this Web page in November ’93, which was something very new. In
fact, at the time Intel had just barely started a Web site. Intel had five
companies listed on their home page that sold systems [using Intel
processors] over the Internet. We were one of them.”
As a result of their common interest, Augustin recalls, “Dave Filo,
Jerry, and I said, ‘Gee, there’s something going on around this Internet
thing, we should figure out a way to create a business.’ I was doing a
pretty good e-commerce business over the Internet, and they had this di-
rectory, and we started writing a business plan about how to build com-
panies around the Internet.”
Despite this, “the end-result was they decided to go off and try and do
something with this Internet directory they were building,” Augustin ex-
plains. “And I looked and said, ‘Gee, that’s just marketing.’ I want actu-
ally to make computers, right? So I went off and kept doing systems,
based around Linux.” What he describes as “just marketing” would soon
be the pre-eminent Internet company Yahoo, and worth tens of billions
of dollars.
But Augustin has no regrets, not least because his own company would
also do well. VA Research was spun out of Fintronic in February 1995.
“VA” stood for Vera and Augustin: “James Vera originally founded the
company with me when we were at Stanford,” Augustin explains. VA Re-
search’s IPO in 1999 would show the largest opening gain ever, though
few would have believed this when looking at the muddy black-and-
white advertisement for what was called “VA research” in the July 1995
issue of Linux Journal. The top system on offer there was a 90 MHz Intel
Pentium with 16 Mbytes of RAM and a 1 Gigabyte hard disk, for $3,755,
including a 17-inch monitor.
Augustin says, “We were one of four advertisers in the first issue of
Linux Journal,” which bore a cover date of May 1994. The man behind
the first hard-copy publication devoted to the GNU/Linux world was
Phil Hughes. “In 1993, I started talking with a few friends about the idea
of starting a Free Software magazine,” he recalls. “We quickly figured out
that we couldn’t afford to start something like this because, to be nonbi-
ased, it would need to have no advertising. I suggested, initially as a joke,
that we start a magazine on just Linux.”
After various problems, Linux Journal was eventually launched with
help from the Canadian-born entrepreneur Bob Young. At that time,
Young was publishing a local newsletter called New York Unix, and run-
ning the ACC Bookstore, which sold many GNU/Linux distributions
and books. Young later merged his company with Red Hat, adding his
0738206709-02.qxd 9/4/03 10:20 AM Page 103
Linus 2.0
invitations.” But Linus turned down the first one. “I said ‘no’ to an invi-
tation to Madrid,” he says; this was “some Unix conference” in spring
1993. “Linux wasn’t really well known,” he explains, “but it was starting
to be something that some people talked about.” And yet, “I really
wanted to go,” he says, “I’d never been in Spain, but I was so nervous
about giving a talk there.
“I started to feel so bad about it later that I decided that the next con-
ference I had to do it. So I gave my first talk in Holland to five hundred
people.” It required considerable effort on his part. “I’d never given a talk
anywhere before,” he explains. “This was in a foreign language”—Eng-
lish. “I felt really badly, I couldn’t sleep the night before, people noticed I
was very nervous. But it was a kind of shock treatment, and it certainly
got a lot easier in that sense.”
One of the first places he spoke was at a meeting of DECUS, the Digital
Equipment Corporation User’s Society, held in New Orleans in the spring
of 1994. He had been invited by Jon “maddog” Hall, senior marketing
manager for the Unix Software Group at Digital—the DEC in DECUS.
Hall had never heard of GNU/Linux until then, but a colleague, Kurt
Reisler, had convinced him to fund Linus’s trip. As Hall wrote in an arti-
cle that appeared in the October 1995 issue of Linux Journal, “I had my
doubts about this funding as Kurt struggled to get Linux installed on
that first PC, but after some able assistance from Linus, he did get it
working. I had my first look at the operating system running, and in less
than ten minutes I had convinced myself that ‘this was Unix enough for
me.’”
The result of this conviction was that Hall persuaded Digital to lend
Linus a powerful personal computer based on its Alpha microprocessor
chip. The idea was that Linus would port Linux to this architecture,
which was completely different from that of the Intel chips in ordinary
PCs.
Unix had become the portable operating system par excellence, one
reason Richard Stallman had chosen it as the model for his GNU project.
Linux, by contrast, had grown out of a series of explorations of the Intel
80386 processor; the first versions of Linux even contained code written
in assembler for that chip. For this reason, Linus had emphasized in his
first official announcement of the Linux project, on 25 August 1991, that
“it is not portable.”
Later, Linus explained: “The reason Linux wasn’t really portable at first
was not because I conceived portability bad, but because I considered
portability to be a nonissue. In the sense that I didn’t think that there was
any interest in it, and because the PC platform is so good price-perfor-
Despite his lack of Unix experience, Miller was soon hacking the ker-
nel code and sending the results to Linus. “At this point, I knew C for
two days,” he notes, “and I had no idea what I was doing. I think the re-
sponse was something like ‘I think a better idea would be to try and’. . .
So picture this, I’m clueless, I have no idea what I’m doing, and yet he
sends suggestions back as if I did have a clue.” As Miller emphasizes,
“It’s probably this part of Linus’s attitude which placed us where we are
today. When you got treated like this, you just wanted to work on such
a project.”
Miller stayed at Penn State for only a semester. After realizing that he
“didn’t belong there,” he went to Rutgers. It was here that his
GNU/Linux hacking took off, but not his academic career. Even though
he was studying computer science, “I also considered chemistry for a
while,” but decided that “I’d rather play with Linux than listen to what
some nitwit wants me to regurgitate onto a test.” As a result, “I failed out
of school after two semesters,” he says. “My final semester was pure per-
fection; I failed every class.”
Luckily, alongside his studies Miller had picked up a job as student
systems administrator at the Center for Advanced Information Process-
ing (CAIP) on the university campus. “My job at the CAIP research cen-
ter was the key, actually,” he says. “It gave me access to huge numbers of
nice computers, of all types. There were lots of Sun machines, old and
new. Several HP workstations, a few PC systems, SGI servers and work-
stations, you name it. I was in geek heaven.
“I had noticed some older Sparc systems were gathering dust,” he con-
tinues. “I had played a lot with Linux on Intel machines, so I wondered
what kind of work would be needed to make it work on this Sun box. I
programmed before; how difficult could this be, right?” This is a classic
young hacker thought, born in part of arrogance and in part of naïveté;
and it has driven much of the finest free coding.
“I started to actually toy with the work [of porting Linux to the Sun
Sparc] about two or three months after beginning to work at CAIP” in
1993, Miller recalls. “I think three weeks later, I got a very simple
minikernel to run; all it did was say ‘Hello.’ The amount of effort it took
to get something so simple made me realize what I had gotten myself
into.”
As for his motivation, Miller suggests it was “probably boredom. I
mean, after I’d run the backups, put paper in the printers, and finished
the other sysadmin work that I need to do, there was some time left.” He
adds in a more serious tone, “I suppose also, now that I had gotten in-
volved with Linux to a certain extent, I must have had some desire to do
0738206709-02.qxd 9/4/03 10:20 AM Page 110
“What happened was that I really had to clean up a lot of code,” he ex-
plains, “especially in the memory management, because it was very spe-
cific to one architecture and I had to really take a completely different
approach. I never wanted to have two separate versions; I wanted to have
one Linux that worked on both Intel and Alpha and anything else. So I
had to abstract away all the hardware dependencies, create a special sub-
directory for hardware specific stuff. So it resulted in quite a lot of orga-
nizational changes in the kernel.”
The changes were drastic. “When I did the portability stuff on the net-
working code, for something like three weeks I didn’t have a kernel that
worked, on either Alpha or Intel,” Linus explains. “Because I was re-
structuring everything, and I was doing it one part at a time, and at first
the problem was that it doesn’t work until everything is perfect. So you
actually have to get everything correct before you can even test it.
“I spent one year doing mainly porting [from 1994 to 1995]. Although
my main work was spent on e-mail, keeping up with Linux, when I
coded I mainly coded for the Alpha.” The main thread of kernel develop-
ment proceeded at what had now become a typically steady rhythm. But
along the way there were some novelties. For example, less than a month
after Linux 1.0 was released, in March 1994, Linus put out what he
called 1.1. But this was not just a simple point upgrade; it represented
the beginning of an important addition to the Linux development
methodology.
One of the key elements of the methodology was the frequent release
cycle of a new kernel version—sometimes every few days. This drove the
rate of development at a furious pace, but proved something of a prob-
lem for the proliferating nonhacker users. New kernels were sometimes
just fixing bugs, and sometimes adding major new features—that were
often just beta code. This left users in something of a quandary: Should
they attempt to keep up-to-date with each new version, or stick with an
older, perhaps more stable one, but miss out on important new features?
The tension was exacerbated by the flowering of the commercial
GNU/Linux sector. Because it was not possible to revise an entire distrib-
ution every time that new kernel was released, GNU/Linux companies
were forced to create snapshots of the development process. Inevitably,
different vendors created different snapshots, which resulted in an in-
creasingly fragmented market.
Linus’s solution was brilliant: to serve both the hacker and the user by
creating two strands of kernel development. Those with even point num-
bers—for example, 1.0, 1.2—were the stable releases. Updates to them
would fix outstanding bugs, not add new features. Odd-numbered point
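The even/odd convention is simple enough to express as a one-line test. The following sketch is purely illustrative (the version parsing is simplified, and the function name is invented for this example):

```python
def is_stable(version: str) -> bool:
    """Even minor numbers (1.0, 1.2, 2.0) mark a stable kernel series;
    odd minor numbers (1.1, 1.3) mark a development series."""
    minor = int(version.split(".")[1])
    return minor % 2 == 0

assert is_stable("1.2.13")       # stable series: bug fixes only
assert not is_stable("1.3.0")    # development series: new features
```

Under this scheme, cautious users could follow the even-numbered releases while hackers tracked the odd-numbered ones.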
Ok, the final release of Linux’95, also known among those in the know as
“v1.2.0” is now out. After the extensive beta-release-period, Linux’95 is
reality.
Before you get Linux’95, I’d like to outline the Licensing stuff, and re-
mind you that copyright infringement is a crime.
Linux now has a logo thanks to the artistic talents of Larry Ewing, and one ver-
sion (the “pretty” version) is included with the kernel. Some people have told
me they don’t think a fat penguin really embodies the grace of Linux, which
just tells me they have never seen an angry penguin charging at them in excess
of 100 mph. They’d be a lot more careful about what they say if they had.
What Linus didn’t mention here was that he chose the penguin for
Linux’s logo partly as the result of an encounter on one of his trips
abroad, this time to Australia, at the end of 1993. His host for the Can-
berra leg of the trip was fellow hacker Andrew Tridgell.
“We went to the national aquarium here in Canberra,” Tridgell ex-
plains. “It’s an aquarium plus a little animal park, and one of the things
there is a little pen of fairy penguins. Fairy penguins are quite small,
they’re sort of like ten inches tall, they’re very cute little things. And
there’s a little sign up saying don’t put your hand in the pen. And of
course Linus ignored that, and put his hand in, and one of the fairy pen-
guins gave him a very friendly little nip—just a little inquisitive ‘I won-
der what that tastes like?’-type nip.”
“And I thought nothing of it,” Tridgell says. Later, when Linus was
looking for a mascot for Linux, “he decided to make it a penguin,”
Tridgell explains, “he’d already liked penguins, apparently.” But Tridgell
says that Linus also explained his choice with a “story about this pen-
guin in Canberra that mauled him—it was a six-foot penguin by this
stage. He wrote his story to the Linux kernel mailing list, and I wrote
back to the list saying that it was slightly exaggerated. So he wrote to me
and said that I have no sense of the dramatic.”
Tux, as the penguin is generally known, has proved a remarkably
shrewd choice. Its uncorporate nature accords well with the spirit that
imbues the entire Linux movement. It seems somehow appropriate that
software created by someone from the Swedish Duck Pond in Finland
should have a bird for its mascot.
In his announcement for version 2.0, Linus continued with the two
main additions since 1.2:
multiprocessor support. Yes, you can buy one of those dual Pentium (or Pen-
tium Pro) motherboards, and Linux will actually take advantage of multiple
CPUs.
“The ‘real’ teaching I did was in ’93 and ’94,” he explained, “when I
was in charge of the ‘Introduction to Computing’ class in Swedish. Not
rocket science, and the classes were pretty small, but hey, I can use it on
my résumé if I want to. The last two years [1995 and 1996] the powers
that be have avoided having me do much teaching so that I could con-
centrate on research—Linux in my case.”
Linus chose to write his master’s thesis about Linux portability—fruit
of his experience with Digital’s Alpha PC. “It’s very much the issues,
what the problems were, what the solutions were. Porting Linux not just
to the Alpha, but the Alpha is certainly one of the examples,” Linus ex-
plained in 1996, when he was still working on the document—a task he
admitted he was not enjoying.
Linus’s thesis was in English. “I wouldn’t say it’s normal, but it’s not to-
tally unusual either. For me, it’s much easier because I know all the
terms in English,” he said. His English is first-rate, with only a very
slight sing-song accent that is characteristic of Swedish speakers of that
language. English was also convenient for the two supervisors of his the-
sis who would have to read and approve it. As Linus pointed out in 1996:
“My mother language is Swedish and they would actually have had more
problems finding Swedish-speaking people to comment on the language
than it is for them to comment on the English.”
Linus also uses English in another situation. “When I program I tend
to think in English,” he said. The way he codes is similar to Richard
Stallman’s method: “I never use paper,” Linus explained. “If I have some-
thing that’s tricky and I have to think out exact details, I occasionally
write down small figures, stuff like that. But in general, what happens is
that I do it straight on the machine; if it’s something major, before I start
coding I just think of all the issues for a few weeks, and try to come up
with the right way to do it—that’s completely no paperwork.
“When I have something I’m working on, I don’t care what happens
around me. I’ll think about it on the bus, I’ll think about it in bed when
I’m trying to sleep. I don’t rush out of bed, I just take paper and pen and
write it down and go back to bed and hope it makes sense in the morn-
ing. But that’s pretty rare,” he acknowledged. Not only does Linus gener-
ally never write things down first, “I never print out parts of the code to
find stuff,” he said, “and I don’t use any other electronic help to index it
for me. I keep a mental index of the kernel at all times.”
The thesis may have been a less than pleasant experience for him, but
at least he knew what was needed. Less clear was the answer to the
perennial student question that posed itself afterwards: What would he
do after he had finished his university studies?
Linus 2.0 was Patricia Miranda Torvalds, born at 6.22 AM, 5 December
1996, the time-stamp of the press release. One of her godfathers was Jon
“maddog” Hall. Because Linus has always been a private person, the an-
nouncement that he not only had a wife but a daughter came as some-
thing of a shock to many in the Linux community. It also tended to
undercut stereotypes about the kind of person a computer hacker was.
But if the surprise over Linus 2.0 was great, that caused by another post-
ing Linus made a couple of months earlier was even greater, because it
had implied an imminent event that could not be greeted with such un-
equivocal joy.
Appropriately enough, perhaps, because the subject had been at the
center of Linus’s hacking for the last two years, it turned up on a mailing
list about Linux for the Alpha processor. Replying to a “2.1.4 exception
question,” on the morning of October 16, 1996, after reams of fairly
heavy technical matters, Linus ended his posting with the following
throwaway line:
I’ll be leaving for a week in the U.S. this Sunday (house hunting), and quite
frankly I’d like to get the five “critical” points above fixed first.
Needless to say, the parenthetical “house hunting” did not escape peo-
ple’s attention. A few hours later, someone asked whether this meant that
Linus was moving to the United States and that he now had a job. Linus
replied the same day with a characteristic posting that began:
[ inline: Linus running around in circles and tearing his hair out ]
Now, it had to be told some day. I have a life. There. I did it. I came out of
the closet. It’s sad, it’s shocking, but it’s true. I even have a girlfriend [ col-
lective GASP ], and it all adds up to the fact that it had to happen sooner or
later. I’m getting a RealJob(tm).
And concluded:
P.S. Company called Transmeta. Don’t bother looking them up, they aren’t
advertising. They are in the business of getting the best minds in the industry
and doing cool things that I won’t tell you about. Not selling Linux, so it’s
not unfair competition in that field.
As usual, hidden among the wit are some serious points. The first is
that Linus was well aware of the effect his decision would have. The de-
velopment process that had evolved over the previous five years was cen-
tered on him so absolutely that even his smallest decisions were felt
throughout the entire movement; this made his choice of company to
work for critical, as his closing remark indicates. He explained at the end
of 1996, “I wanted to do something that wasn’t directly related to Linux.
I was actually offered work from Linux distributors, but I didn’t want to
go work for them because I didn’t want to give my implicit stamp of ap-
proval of any one specific distribution, just because I worked there. So I
wanted to have something neutral.”
His solution was typically resourceful. On the one hand, even though
Transmeta was “not selling Linux,” he said in 1996, “it’s actually in my
contract that I’m doing Linux part-time there. It hasn’t been formalized. I
don’t want to be in the position where I actually count the number of
hours I spend doing Linux. I think sometimes I will do almost only
Linux, and at other times what this Transmeta company is doing.”
But Transmeta also offered something else very important. “I just
wanted it to be something interesting, something new,” Linus said. “Ex-
actly the reason I decided to move was that I’d been at the university
something like seven years. Part of that I’ve been both working and
studying there. It seemed to be a good time to try to see the other side of
the world”—the world of commerce—“not just the academic side.”
Throughout his life, Linus had always looked for the next challenge. As
a teenager, he had felt compelled to replace his Vic-20 microcomputer be-
cause he had grown to know it too well. He had later taken on the Alpha
port because he thought, “Hey, this might be it,” when he was beginning to
wonder whether there was “something fundamentally interesting left.”
The Linux world was left a mystery that would not be revealed for
more than three years. But it was also left with an astonishing achieve-
ment.
The Linux kernel had grown by a factor of nearly 100, from version
0.01, an 80K student exercise in C and 386 assembler, to version 2.0, al-
most 5 Mbytes of compressed code. It had progressed from something
that could barely even run on a PC to an almost complete Unix kernel
clone; it supported X Window, TCP/IP networking, several completely
different architectures, and multiprocessor machines, although in a fairly
rudimentary manner.
A high-speed development methodology had evolved that was un-
precedented in the way it drew on thousands of users around the world,
connected via the Internet, supplying bug reports and often fixes to se-
nior Linux coders—the “lieutenants”—who fed them through to Linus.
He, in turn, chose the most suitable of these fixes to produce the next
experimental release of the kernel—while he tended a more slowly
THE YEAR 1969 WAS IMPORTANT because it saw the birth of Unix —
and of Linus. It was also then that the first steps were taken in the cre-
ation of a network that would eventually become the Internet.
Originally called ARPAnet, it was funded by the Defense Advanced Re-
search Projects Agency (DARPA), part of the U.S. Department of De-
fense. In some ways, ARPAnet was close to the Internet; for example, it
employed what is called packet-switching whereby data is broken into
small packets that are routed over the network separately, and then re-
assembled on their arrival. Packet-switching contrasts with circuit-
switching, employed for ordinary telephone links, for example, where a
single circuit exists between origin and destination for the duration of
the connection.
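The packet-switching idea can be illustrated with a toy sketch in Python (this is only a schematic picture, not how IP actually frames or routes data): the message is split into numbered packets, the packets may arrive in any order, and the receiver reassembles them by sequence number.

```python
import random

def to_packets(message: bytes, size: int = 4) -> list[tuple[int, bytes]]:
    """Split a message into (offset, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Rebuild the original message regardless of arrival order."""
    return b"".join(payload for _, payload in sorted(packets))

msg = b"packets can arrive in any order"
packets = to_packets(msg)
random.shuffle(packets)          # simulate independent routing paths
assert reassemble(packets) == msg
```

A circuit-switched link, by contrast, would hold one dedicated path open for the whole conversation, with nothing to reorder.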
The key step from ARPAnet to Internet was the move to open architec-
ture networking. As a history of the Internet put together by its creators
and held on the Internet Society’s Web site explains, “In this approach,
the choice of any individual network technology was not dictated by a
particular network architecture but rather could be selected freely by a
provider and made to interwork with the other networks.”
From the beginning, then, openness lay at the Internet’s heart, and its
subsequent development has been intimately bound up with core speci-
fications implemented through free software. One of the key players in
that story is Eric Allman, who helped make the Internet what it is today;
mail from one network to the other network, and then they wouldn’t
need an account,” Allman says, and adds with characteristic modesty,
“you know, there’s this saying that all innovation is done by lazy people;
and I’m fundamentally a lazy person. So if I was less lazy, I wouldn’t have
objected to managing all of the user accounts.”
But he did object, and created Delivermail to handle the forwarding
between the Berkeley network and ARPAnet automatically through what
would now be called a gateway. “I wrote that program, and the configu-
ration for it was all compiled in,” he explains; that is, setting up the pro-
gram for each machine on the network required some small fiddling with
the source code. This was acceptable, Allman says, because “there were
like five or six [computers] on the Berkeley network then, so recompil-
ing it on five or six machines wasn’t a problem.
“Delivermail was an interesting thing because I wasn’t actually trying
to produce anything that anyone else would use. I just had this problem
that I wanted to handle as easily as possible.” Nonetheless, other people
did start to use Delivermail once the program was included in Berkeley’s
distribution of Unix, BSD, starting with the 4.1BSD version, which came
out in 1981.
And when Allman had other users, he also received feedback. “I did
get some, although it didn’t really start to explode until the Internet
came along. We actually got a fair amount from Berkeley, though. Berke-
ley was always very good about taking input from the outside world,
their user base, if you will, and folding it back in,” he says.
Allman notes that “we had this ethos of picking up information from
folks in the outside world, and turning it back in, which I think is an im-
portant part of the open source model.” Therefore, “instead of having the
information from the users filtered through five or six layers before it
gets to the implementers,” as happens with conventional software devel-
opment in companies, “the implementers talked to the users and could
find out exactly what it is they need.” Allman emphasizes that, as far as
free software is concerned, “it’s much, much more than just a legal li-
cense, it’s a way of doing developing; it’s a process of involving the users
in the development at a very early stage,” and he sees this as firmly
rooted in the Berkeley tradition.
Soon, however, Delivermail wasn’t enough. “Berkeley got bigger fast
and [the network grew] to twenty or so machines, and so recompiling
for each individual machine got to be painful,” he recalls. The increased
work required was bad enough, but there was another, more difficult
problem. “The other issue was that the network configuration got more
complicated,” Allman says. “We got other network links.”
Sending the university e-mail to the right place was no longer a simple
matter, and the system employed by Delivermail couldn’t cope with the
various kinds of networks that were being hooked into Berkeley. Allman
decided that he needed to rewrite Delivermail so that machines could be
configured without fiddling with the software itself, and to build more
intelligence into the program so that it could decipher from the e-mail
addresses where messages should be sent.
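The shift Allman describes, from configuration compiled into the source to configuration read as data, can be sketched in a few lines. This is only an illustration of the idea (the table and names here are invented; Sendmail’s real rewriting rules, read from sendmail.cf, are far richer):

```python
# Hypothetical routing table; the point is that adding a network
# means editing data, not recompiling the program on every machine.
ROUTES = {
    "berkeley.edu": "local-net",
    "arpa":         "arpanet-gateway",
}

def route(address: str, default: str = "uucp-relay") -> str:
    """Pick a delivery network from the domain part of an address."""
    _, _, domain = address.partition("@")
    for suffix, network in ROUTES.items():
        if domain == suffix or domain.endswith("." + suffix):
            return network
    return default

assert route("allman@berkeley.edu") == "local-net"
assert route("user@mit.arpa") == "arpanet-gateway"
```

With Delivermail, changing the equivalent of ROUTES meant recompiling on each of the network’s machines; Sendmail moved that knowledge out of the binary.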
He called the result Sendmail. Its success was driven in part by the
new flexibility it offered. “If you had a new network, you could integrate
that into Sendmail more quickly than you could into the others,” Allman
explains. “This was long before everyone was doing Internet, and so new
networks would show up, mushrooms in the night.”
Another factor was more fortuitous. “DARPA had decided to standard-
ize on BSD [Unix] as their research operating system of choice,” Allman
says. “And BSD came with Sendmail. So to a certain extent there was at
least a piece of my being in the right place at the right time.” By now the
Internet had come into existence, and operated on a completely different
scale from the earlier ARPAnet.
“There was a lot of excitement going on,” Allman recalls, “it was really
clear that there were going to be a lot more people available on the Inter-
net. The ARPAnet only had capacity for 256 hosts, and the Internet [peo-
ple] realized that this was too small; so they took a 32-bit field [for
addresses], so there were 4 billion possible hosts. And a lot of people
suddenly realized that they were going to be able to have Internet access.
So it exploded pretty quickly. Now, this is still in the research area, but it
meant that it was going to be able to get into more computer science de-
partments, more research labs, etc. And those people were very avid
users” of the Internet and of Sendmail.
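The jump in scale Allman describes is easy to verify: a host field of 8 bits, enough for the ARPAnet’s 256 hosts, against the Internet’s 32-bit address field.

```python
arpanet_hosts = 2 ** 8    # 8-bit host field
internet_hosts = 2 ** 32  # 32-bit IPv4 address field

print(arpanet_hosts)      # 256
print(internet_hosts)     # 4294967296, a little under 4.3 billion
```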
If Sendmail fed off the huge growth of the Internet at this time, it
might also be said to have helped create the global network as we know
it today. For as Allman explains, central to Sendmail was “the concept
that you could send email between different kinds of networks, even
when they weren’t designed to work together.” This was pretty revolu-
tionary at the time: “Other people tell me that Sendmail invented it; I
don’t know if it did. There may have been prior art. That was definitely
one of my goals, and I think I did a very good job of that.”
In effect, Sendmail created a kind of virtual Internet out of the hetero-
geneous networks of the time. In doing so, it may well have been at least
partly instrumental in convincing people of the virtues of a homoge-
neous global network that would allow many different services, not just
e-mail, to circulate freely. This is the situation today; the Internet has be-
fun.” Investigating further, Behlendorf soon found out about the then-
new World Wide Web.
In mid-1993, he happened to talk to a friend who worked at Wired
magazine about this new technology. “They had an e-mail robot,”
Behlendorf recalls. “You could request articles from it, and they would
send those articles out. I came in and upgraded that to a Gopher server
that also happened coincidentally to do this thing called HTTP, which no
one really knew much about at that point.” Gopher, a system of nested
menus as opposed to the Web’s hypertext links, was an alternative way
to navigate through information on the Internet. In his book, Berners-
Lee suggests that one reason that the Gopher system failed to take off as
the Web did was because the University of Minnesota, where Gopher
had been developed, decided to ask for a license fee from some users in-
stead of releasing it freely as he had.
This “thing called HTTP” was a Web server. Behlendorf recalls, “That
started getting a lot of traffic, and eventually grew to the point where it
didn’t make sense to have the Gopher site around any more. We were
getting 3 to 5 thousand hits a day.” This was at the end of 1993.
“So around middle of 1994 it was clear that this could be something
interesting,” Behlendorf continues. “We started creating HTML [Hyper-
Text Markup Language] at that point”—new Web pages—“adding in
extra content that wasn’t in the magazine. It kind of became clear that
this could be a platform for a unique brand or a unique publication.
That was the genesis of HotWired,” one of the first online magazines.
Even though ultimately it failed to establish itself as an independent,
profitable concern, HotWired was soon recognized as a showcase for
Web technologies.
Behlendorf became chief engineer at HotWired, and chose the NCSA
HTTPd server to run the site. “I’d started using it pretty heavily,” he says,
“and then there was a security hole that needed to be closed. Somebody
published a patch to close it, and then there were other various patches
floating around. There were a couple of performance-related ones, and a
couple of ‘this is just dumb, needs to be fixed’ kind of patches.” These
were posted to a mailing list called WWW-Talk, which was hosted by
CERN, the European subnuclear research establishment in Geneva,
Switzerland, that had been the birthplace of the Web. In the past, these
patches had been picked up by the development team at NCSA and used
to improve the code. Then things changed.
Behlendorf recalls that “there was somebody who posted to WWW-
Talk: Is NCSA ever going to respond to the patches that we send them?
No one had heard anything back from NCSA so there wasn’t any explicit
continued, but soon it was decided that a complete redesign was needed.
“We benefited from a rewrite that was done by [Robert Thau], one of the
developers, who separated out a lot of the functionality in the server, be-
tween the core of the server and a set of modules,” Behlendorf explains.
This approach of making the code modular was similar to that adopted
by Linus for the Linux kernel; it lent itself well to the distributed devel-
opment model employed by most free software efforts.
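The core-plus-modules split can be sketched abstractly (the names and hooks here are invented for illustration; Apache’s real module API is in C and considerably more involved): the core drives each request through a fixed sequence of phases, and modules register handlers only for the phases they care about.

```python
from typing import Callable

# Hypothetical hook registry; real Apache modules register C callbacks
# for phases such as authentication, content generation, and logging.
hooks: dict[str, list[Callable[[dict], None]]] = {
    "auth": [], "content": [], "log": [],
}

def register(phase: str, handler: Callable[[dict], None]) -> None:
    hooks[phase].append(handler)

def handle_request(request: dict) -> dict:
    """The core: run every registered module handler, phase by phase."""
    for phase in ("auth", "content", "log"):
        for handler in hooks[phase]:
            handler(request)
    return request

register("content", lambda req: req.update(body="Hello from a module"))
response = handle_request({"path": "/"})
assert response["body"] == "Hello from a module"
```

Because each module touches only its own phase, contributors could work on separate modules without coordinating changes to the core, which is what made the structure such a good fit for distributed development.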
Apache 1.0, using the new architecture, was released in December
1995. A couple of months later, a year after the Apache group had been
formed, Apache became the most widely used Web server on the Inter-
net, according to the survey produced by Netcraft. Since then, its market
share has risen almost every month, and as of the year 2000 it outpaces
Web servers from Netscape and Microsoft by a huge margin.
Behlendorf attributes this extraordinary success in part to timing: both
Netscape and Microsoft had failed to deliver the advanced Web server fea-
tures users wanted when Apache appeared and provided them. Apache
also received a boost from the new Web hosting services being offered by
Internet Service Providers (ISPs). Alongside the Net connectivity that
ISPs had always offered, they now ran Web servers on which companies
could set up corporate sites more cheaply than by creating them in-house.
Apache was popular for Web hosting, Behlendorf believes, because
these ISPs had “needs that couldn’t easily be captured by a single com-
pany,” for example, “the ability to host on one box 10,000 different
[Web] domains,” he says. Because Apache’s code was available, ISPs
could adapt it to meet these needs without having to wait for commercial
companies such as Netscape or Microsoft to get around to adopting vari-
ous features.
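Name-based virtual hosting, the capability Behlendorf cites, boils down to dispatching on the request’s Host header so one machine can answer for thousands of domains. The sketch below is schematic (the table is invented; Apache expresses the same mapping with VirtualHost configuration blocks):

```python
# Hypothetical site table mapping domains to their files on one box.
SITES = {
    "example.com": "/var/www/example",
    "other.org":   "/var/www/other",
}

def document_root(host_header: str, default: str = "/var/www/default") -> str:
    """Choose a site's document root from the request's Host header."""
    host = host_header.split(":")[0].lower()  # strip any port number
    return SITES.get(host, default)

assert document_root("example.com") == "/var/www/example"
assert document_root("Other.org:8080") == "/var/www/other"
```

Scaling the same lookup to 10,000 domains is just a bigger table, which is exactly what made it attractive to ISPs.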
Although the Apache team has no animus against either Netscape or
Microsoft, being able to present Apache as a serious alternative to com-
mercial servers was an important consideration. Behlendorf explains
why: “Had Apache not existed,” he says, “I definitely think we would
start seeing people having to build Microsoft-specific Web sites, hosted
on Microsoft servers, to talk to Microsoft clients, and then Netscape
servers to talk to Netscape clients, and have to maintain two different
sets of Web sites essentially to talk to these two different clients.”
Without Apache, then, it is highly likely that the browser wars that
erupted in 1996 would have spilled over into the world of server soft-
ware. This, in turn, would have led to a serious schism in the World
Wide Web, with companies forced to align with either Microsoft or
Netscape—and so missing out on huge swathes of potential customers—
or needing to create two Web sites to serve the whole market. In this
sense, Apache has played a key role in ensuring that the Web’s unitary
standards have been preserved. And this, in its turn, has allowed e-com-
merce to flourish. Paradoxical as it may seem, the dot-com world with its
volatile IPOs and outlandish valuations owes much to this piece of free
software.
Apache has had another crucial effect. From the very first release of
Apache, Behlendorf explains, “it was being used on sites like HotWired,
and by MIT—Yahoo’s been using it for three or four years, I think. So it
very quickly was something people considered a tier-one product right
on the same level as Microsoft Internet Information Server, or Netscape’s
Web server.” As a result, he says, “the most important thing we’ve
proved, and probably proved more successfully than the others, is that
you can use open source in a mission-critical situation.”
Free software such as Sendmail and BIND had been used in a similar
mission-critical context for far longer, but had the misfortune to be al-
most invisible, not least because they worked so well. With the visible
and measurable success of Apache, shown in the monthly Netcraft re-
ports detailing how many public Web servers were using which program,
people were increasingly aware not only that free software was widely
used by companies but that it was running the single most important
new development in computing for decades: the World Wide Web.
As such, Apache played a crucial role in preparing the ground for the
later and continuing success of GNU/Linux, and for the dramatic uptake
of open source programs and methods in the late 1990s. In one respect,
Apache still leads the field among free software projects. Although the
debate still rages fiercely about whether open source software such as
GNU/Linux can ever hope to best Microsoft, Apache has already done it.
And yet even this achievement has been surpassed by Perl, a freely
available programming language universally used for system adminis-
tration tasks and to power complex e-commerce sites. Apache may beat
Microsoft and Netscape by a wide margin, but Perl has no serious rivals
at all from the commercial sector. Its creator, now routinely referred to
as the “Perl god,” is Larry Wall, one of the best-known and most influential
figures in the free software community, alongside Richard Stallman and
Linus.
Wall was born in California, in 1954. His father was a minister, and
Wall’s religious upbringing has strongly marked his life and work. Unlike
many other top hackers, computers never loomed large early on. “The
first computer I ever actually saw was the programmable calculator that
we got at our high school in our senior year,” Wall says, “so I didn’t have
Wall did not have the luxury of time to spend thinking about what to
do next. “Well, we were expecting,” he explains. “My wife wanted to be
near her mom when we had this baby, so we transferred down to UCLA
[University of California Los Angeles] before we actually made this deci-
sion to drop out entirely. So that summer I needed a job, and my father-
in-law worked for a company called Systems Development Corporation
(SDC). He got me hired on there, with hopefully not too much nepo-
tism.”
Wall’s work at SDC was classified, but alongside this he found time to
do some programming on his own account. “We had a Usenet feed,” he
says. “And there was a newsreader that came with it, but it was really
badly designed. I thought I could do a little better than that.” Wall cre-
ated rn—named in memory of the original readnews program he had
been so unimpressed with. “I think I was developing in ’83 and sent it
out in ’84,” he says. “And what I discovered was people liked it.”
When he released his free software to the general computing commu-
nity, he also discovered something else. “People would find bugs and I’d
patch them and make enhancements,” he says, “and I would send out
these patches, and people would not apply them. Or, worse, they would
apply them helter-skelter, and not in any kind of sequence, and that just
made for a maintenance nightmare, trying to do this distributed stuff on
the Net.” In effect, Wall was grappling with a new set of problems raised
by the novel distribution medium of the Internet, which had only come
into being shortly before, in 1982.
As a result, Wall says, “basically in self-defense I wrote the Patch pro-
gram.” This is a fairly small program whose job is to make sure that the
program corrections, the patches, are applied in exactly the right way.
Wall points out that his little Patch program, trivial as it sounds, “in
some ways actually changed the culture of computing more than Perl
did” later on. Wall’s Patch can be considered the small but significant
leap forward that made the distributed development model possible, and
which led, as a result, to a blossoming of free software that continues to
this day.
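The problem Patch solved can be illustrated with a toy version (nothing like Wall’s real program, which handles context diffs, fuzz, and line offsets): a patch records what the lines should look like before the change, and refuses to apply if the file no longer matches, which is precisely what goes wrong when patches are applied helter-skelter.

```python
def apply_patch(lines: list[str], at: int,
                old: list[str], new: list[str]) -> list[str]:
    """Replace `old` with `new` at index `at`, only if the context matches."""
    if lines[at:at + len(old)] != old:
        raise ValueError("context mismatch: patch applied out of order?")
    return lines[:at] + new + lines[at + len(old):]

source = ["int main() {", "  return 1;", "}"]
patched = apply_patch(source, 1, ["  return 1;"], ["  return 0;"])
assert patched[1] == "  return 0;"

# Applying the same patch a second time fails: the old context is gone.
try:
    apply_patch(patched, 1, ["  return 1;"], ["  return 0;"])
    assert False, "should have raised"
except ValueError:
    pass
```

That context check is what let thousands of scattered users apply a maintainer’s fixes and reliably end up with identical source trees.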
For a couple of years, Wall concentrated on developing his two main
projects, rn and Patch; but then practical needs in his daily life as system
administrator inspired him to tackle the new challenge that led to the
creation of Perl. His employer, SDC, had a problem: “Half of our classi-
fied project was in Santa Monica [California], and half of it was in Paoli,
Pennsylvania,” Wall explains. “We had to exchange configuration con-
trol information, but it was all classified. So we had an encrypted link be-
tween Santa Monica and Paoli that ran at like 1200 baud, something like
that.
“In terms of networking, we could not predict when the thing would
be up or down,” Wall continues, “so we needed a way of pushing stuff
across when the thing went up, automatically, without actually having to
intervene and say, ‘Start the process.’ Well, this is what the Usenet news
system does all the time,” he notes. “I said, let’s just take the Usenet news
and make a few tweaks to it. And then it will automatically send [the
control information] across when the link goes up.
“That’s what I did,” Wall says, “and it worked great. But then all our
information was out in lots of little article files.” These could be read us-
ing the rn newsreader, but Wall’s management didn’t want its informa-
tion in a heap of “little article files”: “They wanted reports,” he says. Wall
needed an easy way to take all the small files that came across the link
and automatically combine them into a report suitable for management.
“So I looked around at the tools that I had available,” Wall recalls, “and
started trying to cobble something together, and realized it just wasn’t
going to work real good. That’s when I began to realize where the prob-
lem with the Unix toolbox philosophy was.”
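The report job Wall describes is a classic glue task: sweep up many small files and reshape their text into one document. A rough sketch of that shape in Python follows; the directory layout, file naming, and report format are invented for illustration, not taken from Wall's system.

```python
# Hypothetical reconstruction of the task, not Wall's code: combine
# many small "article files" into one report for management.
import glob
import os

def build_report(article_dir):
    lines = ["CONFIGURATION CONTROL REPORT", "=" * 28]
    # One line per article file, in a stable (sorted) order.
    for path in sorted(glob.glob(os.path.join(article_dir, "*.txt"))):
        with open(path) as f:
            lines.append(os.path.basename(path) + ": " + f.read().strip())
    return "\n".join(lines)
```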
This philosophy said that every tool should do one thing well, and had
led to the many small, powerful tools that made up Unix; it represented a
big break with previous operating systems, and had made the entire
GNU project possible. Wall discovered, however, that its strength could
also be a weakness. “Since each tool is a separate process,” he says, “they
tend to be sort of complicated, and have a lot of options. But they’re lim-
ited, and if you try to push them beyond their capabilities, you can’t do
it.”
For example, one popular tool is the Unix shell. Wall comments,
“Shells were very good at whipping things up—so there was the whip-
ping-things-up dimension—and rapid prototyping.” However, the “C
[language] was good for manipulating things at a very low level, but
that’s a different dimension. And they were both good at that thing and not
at the other. So there was this big gap where there was nothing that was
good at both of those things. There’s a big ecological niche out there that
ought to be filled,” he says, and adds, “so right from the start, I was plan-
ning in a sense to take over the world.”
His planned take-over was purely benevolent, and was rooted in a
long-standing desire to create tools that would be as useful as possible.
With his new tool, Wall wanted to create something that was so useful it
would be taken up by many people. Drawing on his previous experi-
ences with rn and Patch, already widely used, he had a pretty good idea
about how to go about this.
“From the very beginning I understood that I would probably be dis-
tributing it,” he says, “and that people would probably like to use it. Peo-
ple are surprised when I say that, because most people will tell you they
didn’t expect what happened to their tool, but you have to understand
that I’d already written languages that other people had used. And I’d al-
ready distributed programs like rn and Patch, so I had a sneaking suspi-
cion that anything that I’d like, other people might like too.”
As he set about designing his new language, he took his favorite fea-
tures from pre-existing languages. “I know what I like about various lan-
guages, and what I don’t like—mostly what I don’t like,” he says. “The
things that bother me about some languages is when they take some
point and drive it into the ground.” Another source of inspiration, how-
ever, was just as important. “I also wanted to borrow ideas from natural
languages,” he explains. This was a fairly radical thing to do, and was
born in part from his own personal interests and training. “It had been
talked about before,” he says, “but it had never actually been done as far
as I’ve researched. [The business programming language] Cobol was
claimed to be self-documenting because it was English, but all that
meant was you’d take pseudo-English phrases and plug things in instead
of words. So it’s actually English-like at a very shallow level. I wanted
something that worked at a much deeper level,” he says.
Above all, he wanted to add a key feature to his new computing lan-
guage. “I wanted expressiveness,” he says. “I didn’t want a language that
dictated how you were supposed to solve a problem. Because so many
languages have this idea that they should be minimalistic—it’s partly be-
cause of the math background of most of the people who did computer
science.” Wall points out that with natural languages, people “want to
use expressive words; they don’t want to talk in basic English.” He
thought that they should be able to do the same with programming lan-
guages.
“My overarching theme of my life is actually trying to help people,” he
explains. “So it’s sort of natural that I would open up a language in a way
that I would think would be most useful to people.” Opening up the lan-
guage means that he should “try to make it easy for the other people to
do what they want to do, not what you think they ought to do,” he says.
An interesting parallel arises between Wall’s approach with his new
language and with open source software in general. Where the latter
gives users the freedom to do things their way—by changing the source
code if need be—so Wall’s creation lets them employ the language in
ways that represent a closer fit to their way of thinking and working.
What came to be known as Perl offers the same freedom as open source,
but this time explicitly built in to the structure of the language, not im-
plicitly as a consequence of the availability of source code.
Wall’s choice of a name, and the way he went looking for it, was char-
acteristic. “This is another indication that I was already thinking that it
would have wide use [because] I wanted a name with a positive connota-
tion,” he says. This goes back to his linguistic training: “A word in a nat-
ural language fuzzes off in many different directions; it can be used in
many different ways. It has many different shades of meaning and you
can’t help but think negative thoughts about something that has negative
connotations, even if you’re using it in a positive context,” he notes.
He explains how this led to the name “Perl.” “What I did was,” he
says, “I looked through the online dictionary, and looked at every three-
and four-letter word, to see if any of them were suitable names for a lan-
guage, and I actually rejected all three- and four-letter words—‘Perl’ not
being in the dictionary. I thought of several other things, and eventually
settled on ‘Pearl’ because I thought it has positive connotations.” He jus-
tified his choice retrospectively by coming up with the explanation Prac-
tical Extraction And Report Language. “You can tell it still had the ‘a’ in
there because of the ‘and,’” he notes.
The explanatory name is apt. Throughout his programming career,
Wall has been concerned with making things that were practical so that
they might be used by as many people as possible. Extraction had also
formed a continuing theme: His newsreader rn was partly about extract-
ing threads of information from the undifferentiated stream of Usenet
postings, and Patch was built around the idea of extracting only the
changes that needed to be applied to source code. Above all, as a text-
processing tool par excellence, Perl is another manifestation of Wall’s fas-
cination with manipulating and analyzing the streams of words and
characters that make up natural and artificial languages.
Apt as it was, the name “Pearl” had to be modified, rather as “Monix,”
Tanenbaum’s original name for Minix, had to be tweaked. “I heard ru-
mors of another language called ‘Pearl,’ an obscure graphics language,”
Wall says. “So I decided I need to change the spelling of it, and I might as
well shorten it, because I’m a lazy typist. I wanted a name that had posi-
tive connotations, [but] I also wanted one that was short, because I
thought people would be typing it all the time.” “Pearl” became “Perl.”
In 1986, Wall started working on what became Perl, and finished the
first version after around six months’ work. The official Perl 1.0 was re-
leased in October 1987. He says that he often programmed for long
stretches, and rarely wrote anything down: “The only thing I use paper
for is to make notes of to do’s—not extensively: a short ‘fix this,’ ‘imple-
ment that,’” he explains.
Wall recalls that after he released Perl, “it took off pretty quickly. Of
course there’s always early adopters,” he says, but points out that “there
were things that I did explicitly to help it take off faster,” another indica-
tion of the extraordinarily conscious way in which Wall was hoping to
make his free program widely used. For example, he provided users of
other programming languages with translation programs that would take
their code and convert it automatically to Perl. Wall says he did this “be-
cause I thought people needed a migration path from what they had, and
that would make it easier for them to start using it and get an idea of
what the corresponding Perl would look like.”
This was a fairly remarkable thing to do, especially for a noncom-
mercial language that might well have turned out to be of no interest to
anyone; it required even more work from Wall over and above creating
Perl in the first place. Once more, it flowed naturally from his desire to
help users. “It’s complexifying my life in order to simplify other peo-
ple’s lives,” he says, and then adds with a kind of mock arrogance that
is simultaneously self-deflating: “I talk about helping people, but the
flip side of that is yeah, I want to take over the world. Maybe I’m a
megalomaniac, or something, and I’d like to make my mark on the
world.”
Aside from the light that this statement throws on Wall’s complex
character, it is also fascinating to see the same behavior exhibited by
Linus years later when he was creating his kernel. He, too, frequently
talks of “world domination” as his aim; like Wall, he means to puncture
his own self-importance. Wall, of course, is well aware of the parallel,
and likes to assert a kind of precedence for his world domination plans:
“Well, I predate him,” he says, referring to the young upstart Linus.
Wall’s promotion of Perl has been equally premeditated. For example,
contrary to what others might have done, he did not set up a Usenet
newsgroup to muster support for Perl. Once more, his reasoning shows a
shrewd understanding of the process through which information about
free software is spread.
“Pretty soon after [Perl] was released there were requests for a Perl
newsgroup,” he recalls, “and I put that off. I didn’t want a Perl news-
group because I didn’t want to be ghettoized. The dynamics of how
Usenet worked, at least back then, was if you posted something off-topic
for the newsgroup then people would say, ‘Well, take it to this other
newsgroup,’” he explains.
“I say that I thought that Perl would take off, but I thought it would take off
primarily as a system administration language, and as a text-processing lan-
guage. I did not anticipate the Web, but [Perl] was a natural for it. What I
did anticipate was that people wanted a language with two qualities: that it
could edit text well, and that it was good at gluing things together. If you
think about what a Web server has to do, it has to do text processing [to
manipulate Web pages] and it has to have data from somewhere, so a lan-
guage that can do text processing and can hook up to anything else is ex-
actly what the doctor ordered. I wanted Perl to be a humble language in the
sense that it can compete by co-operating.”
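The two qualities Wall names, editing text well and gluing things together, are exactly what a dynamic page requires: fetch a record from somewhere, then rewrite text around it. A minimal sketch of that pattern follows; the template syntax and data are invented, and early sites did this with Perl and CGI rather than Python.

```python
# Toy glue script: take data "from somewhere" and do the text
# processing that turns it into a Web page.

def render_page(template, record):
    """Substitute {field} markers in a page template from a record."""
    page = template
    for field, value in record.items():
        page = page.replace("{" + field + "}", str(value))
    return page

template = "<h1>{title}</h1><p>{count} items in stock</p>"
html = render_page(template, {"title": "Catalog", "count": 42})
```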
Thanks to its humility and usefulness, Perl is now probably the most
widely used “glue” language in e-commerce, from giants such as Yahoo
and Amazon.com (both of which use Perl extensively) downwards. As
far as estimating how many Perl users there are, Wall thinks “a million is
the right order of magnitude,” and adds, “I usually say ‘a million, plus or
minus a million.’”
Wall’s achievement goes much further than simply writing a great lan-
guage. Perl was one of the key enabling technologies that made possible
the first wave of interactive Web sites, and hence e-commerce itself. It es-
tablished yet another field—that of programming languages—where the
Net-based open-development process had produced better results than
those traditionally employed within software companies to create their
black box products.
It is significant that the success of Perl was not an accident, as had
been the case with Sendmail, say, or even with Linux. Even though Perl
grew out of Wall’s own needs, from the beginning he designed it to be
not only generally applicable but widely popular. This unusual strategy
worked because Wall was self-aware about what he was trying to do and
what the processes for achieving this were—an awareness doubtless born
of his natural bent and intensified through his dual linguistic training.
This awareness has been displayed most entertainingly in the verbal
pyrotechnics Wall routinely employs in his State of the Onion talks at
the annual Perl conferences; these have taken place since 1997 and are
run by O’Reilly & Associates, which also publishes numerous books
about Perl (including one by Wall). In these talks, Wall has talked about,
and around, in the most tangential but illuminating ways, the ideas at
the heart of Perl programming. The analytical and theoretical side sets
him somewhat apart from the other free software leaders, who tend to be
more intuitive and practical in their approach to coding.
Partly as a result of this highly personal intellectual approach that he
has adopted in his speeches, Wall has made another important contribu-
tion to the free software movement. He offers an example of a top pro-
grammer who is highly articulate, witty, and deeply concerned with the
broader implications of his work. Soft-spoken and avuncular, Wall has brought a calm presence to conferences and to the press that has helped to combat
clichéd images of hackers as unstable, antisocial individuals with little
sense of morality and no sense of humor. If Richard Stallman is the father
of today’s free software movement, Larry Wall can be rightly considered
its favorite uncle.
LONG BEFORE LARRY WALL WAS THINKING about how to make Perl
successful, Richard Stallman had analyzed why the hacker community at
the MIT AI Lab had worked so well. He then set about re-creating that
community, consciously taking steps that would maximize the chances
for its survival. As GNU/Linux and the other free software projects such
as Perl, Apache, and the rest grew not only in maturity but in visibility,
others, too, began to notice a larger trend, and a few tried to understand
it.
One of the first formal attempts to analyze the GNU/Linux phenomenon was written by Larry McVoy. He was born in Concord, Massachusetts, in 1962, and so was a good measure older than the other top Linux hackers such as Linus, Alan Cox, and Ted Ts’o. McVoy
was steeped in the mainstream Unix tradition, having worked first for
The Santa Cruz Operation (SCO)—which later bought Unix from Nov-
ell—and then for Sun, in both its software and hardware divisions. This
work experience gave him a depth of practical knowledge in how the
software industry functioned that few others in the GNU/Linux move-
ment could match.
While he was at Sun, McVoy put together what he called his Source-
ware Operating System Proposal, written during September and October
1993. In it, he suggested that Sun give away the source code to its SunOS
4 version of Unix so that a united Unix platform could be created.
McVoy noted in his paper that “the Unix problems are not being ad-
dressed. The vendors think ‘standards’ are the answers. The program-
mers think ‘free’ software is the answer. The customers think NT is the
answer.” Unfortunately, as McVoy explains, when vendors pushed for
“standards,” they really meant something else. “It was just so incredibly
pathetic,” he says. “Each of these Unix vendors said they wanted to do a
‘let’s unify’—and then the next statement was, ‘OK, you dump yours, and
we’ll use mine.’ And it was sort of like, jeez, we’re really not getting any-
where.” Because McVoy was trying to avoid recourse to the “customer
solution” of NT, that left the programmers’ solution: free software, or
“sourceware,” as he termed it.
McVoy wrote: “More and more traditional Unix developers have given
up on the proprietary Unix implementations and are focusing their en-
ergy on sourceware.” Among the “results of their labors,” he singled out
GNU software (GCC and GNU Emacs), the X Window system, and
GNU/Linux, which he described as having “many features found only in
mature operating systems,” and noted that it was “feature rich but lacks
the quality and stability of a commercial product,” a fair assessment for
the state of Linux in 1993.
McVoy pointed out that “almost every good feature in computer oper-
ating systems today, including most features in DOS, Windows, and
Windows/NT, came from the mind of one hacker or another. Typically,
the work was not commissioned by a company.” He commented that “for
the business community to dismiss the hackers as ‘nuts’ is a disaster,”
and suggested, presciently, that “it is far better to figure out a way to al-
low the business world and the hacker world to coexist and benefit one
another. If businesses want the best technology, then businesses should
move towards sourceware.”
McVoy’s Sourceware proposal to solve the problems of Unix fragmen-
tation, cost, and stagnation, called for the vendor community to rally
around a single Unix version whose source code would be freely avail-
able. Ideally, this would be his beloved SunOS, but if that were not possi-
ble, GNU/Linux would still be “a win on the political front,” even if
something of “a lose on the maturity front,” as he put it.
“I never really published it, it just sort of made the rounds,” McVoy
says of his paper. “And I asked people to distribute it amongst people
that could have some influence. I was hoping to try and build up a
groundswell of pressure on McNealy and the other executives. I got in to
see most of them, and spent a lot of time talking to them about this
stuff.” The response was encouraging: “Everybody who read it was very
how well you adapt to it over time.” Raymond would later describe him-
self as an “anxious, angry, frustrated child.” He explains, “Other people
gave me a hard time. There are certain things that are pretty tough to do
if you have impaired physical co-ordination. Kind of threw me back on
my mental resources.”
He describes how the shift to computers occurred. “I got interested in
computers in college mainly as a sideline, and then I burned out on
mathematics,” his original field of study. “Well, computers were there.”
Raymond was entirely self-taught: “The books didn’t exist then,” he
points out. “That was only a few years after the first degree in computer
science, actually. Most of the learning apparatus we have today didn’t yet
exist. People who were programmers then basically did what I did; they
drifted in from other fields and taught themselves.”
Someone else who had taught himself, and was also originally a math-
ematician, was Richard Stallman. Raymond says he knew Stallman not
just seven years before he started the GNU project but before he had ac-
quired his trademark mane and beard. As Raymond explains, “I have oc-
casionally stood beside him, and said to people, yes, I knew Richard
before he had long hair and a Jesus complex. To his credit, Richard
laughs when I say this.”
Through Stallman, Raymond got to know the MIT culture, which he
characterizes as “organized around PDP-10, assembler language, and
LISP.” He already knew about the Unix and C culture, which meant that
“I got to see both sides of that [hacker] scene pretty early on,” he says.
Even in the early 1980s, then, Raymond was a privileged observer of a
wide range of coding activity. He likes to emphasize that much of this
was disconnected, or at least connected only from time to time; there
was no unified hacker culture as we know it today. “That period was very
different from the open-source culture now,” he says, “because the tech-
nological infrastructure to do software sharing the way we do it today
didn’t exist. The only way you could pass around software during that
stage was on magnetic tapes, and so it was only done sporadically. The
cultures that did that were very small and very isolated from each other.
“There’s a certain school of historical interpretation that views today’s
free software as a return to a pre-proprietary golden age. That golden age
never existed,” Raymond insists. “I was there and I know. From ’77 to
the early ’90s, the free software culture, or the open-source culture as we
think of it today, only existed in the sense that there were a bunch of
people doing things not grossly dissimilar from what we do today.
“But it had no self-consciousness,” he points out. “It had no cultural
identity; there was nobody who was thinking ‘let’s do these things for
One person at least partly “acculturated” in this way was Linus. “When
we first met in 1996,” Raymond recalls, “he wanted my autograph.”
Finally, The New Hacker’s Dictionary was, as Raymond puts it, “a way
of paying tribute to my roots.” Already by the late 1980s he was begin-
ning to think of the global hacker community as not just a distinct, well-
defined tribe, united by a culture and a set of beliefs (both of which The
New Hacker’s Dictionary was hoping to record and stimulate), but more
specifically as his own tribe: a tribe, moreover, that tended to judge peo-
ple by their real achievements in hacking and that was indifferent to
mundane issues of who you were—or whether you had cerebral palsy.
Aside from these novel motivations, The New Hacker’s Dictionary was
also unusual in the way it was put together. “I consciously approached it
as a large-scale collaborative process,” Raymond explains, “in which I
would be a central nexus, but not assert as much control of the process
as a conventional editor does.”
The New Hacker’s Dictionary was published—appropriately enough—
by MIT Press, and became something of a best-seller; as such, it achieved
one of Raymond’s major aims in evangelizing about his tribe and its lore.
The resulting interest also allowed Raymond to develop his skills in han-
dling the media, an extremely rare ability among hackers, and gave him
“some street cred among journalists” as he puts it, which he would put
to good use later on.
The New Hacker’s Dictionary nearly had one other dramatic conse-
quence, as Raymond describes. “In 1991, I was working on the first edi-
tion of The New Hacker’s Dictionary, and I was sent an interesting hunk
of code by a guy named Ray Gardner. It was what we would now think of
as a hypertext viewer for the Jargon File.” That is, you could use links
within the text to jump to other entries as with a Web browser.
“Ray had written it for MS-DOS,” Raymond continues. “He sent me
the first build, and I thought this was a cool idea. I completely rewrote it,
and generalized and ported it to Unix, and I turned it into a baby hyper-
text reader which I called VH, or VolksHypertext.” He explains the rather
unwieldy name: “At the time I was aware of the Xanadu project, which
was this huge complex Cadillac of a hypertext system.” Raymond had
been invited to join the Xanadu project in 1980. “VH was a very stripped
down, simple, lightweight hypertext system, so I thought of it sort of as a
Volkswagen to Xanadu’s Cadillac. No chrome, no tail fins, but it got the
job done. So I called it VolksHypertext.”
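The core of such a viewer is small: each entry's text carries markers naming other entries, and following a marker jumps you there. A toy illustration follows; the marker syntax and entry texts are invented, and the real Jargon File viewer did far more.

```python
# Toy hypertext: entries reference one another with {curly} markers.
import re

entries = {
    "hacker": "One who enjoys exploring the details of systems. See {cracker}.",
    "cracker": "One who breaks security; coined so {hacker} is not misused.",
}

def links_in(name):
    """Return the entry names reachable from `name` in one jump."""
    return re.findall(r"\{(\w+)\}", entries[name])
```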
Xanadu wasn’t the only hypertext system that was being developed in
this period. “It was late ’91, I think, I got mail from Tim Berners-Lee—
who nobody had heard of at the time—saying, ‘I hear you’ve been doing
some interesting things with hypertext; shall we collaborate?’ And so I
sent him back some mail, basically saying sure, standards are good. And
I didn’t hear back from him. I still don’t know why.” Happily for Raymond, he has many claims to fame other than almost co-inventing the World Wide Web.
Probably his most important achievement, the essay The Cathedral and
the Bazaar, which analyzes the success of Linux and open-source soft-
ware, arose in part because of a CD-ROM that turned up on his doorstep
in late 1993. The CD-ROM was from the GNU/Linux distributor Yg-
gdrasil, and as Raymond recalls was “the first available CD-ROM Linux
distribution. That landed on my doorstep because I had already written
enough free software that some of my stuff was on that disk.” As men-
tioned previously, this was the “token of appreciation” that Yggdrasil had
sent to “the many free software developers” who had contributed to its
distribution, often unwittingly as in Raymond’s case.
He admits that he was skeptical of Linux until he tried it; and he might
never have tried it had it meant wiping his current system, as would have
been the case with any GNU/Linux distribution other than Yggdrasil. But
“Yggdrasil had an interesting feature,” Raymond notes. “It was possible
to run Linux as a demo without changing any partitions on your hard
disc”—in other words, without needing to delete or install anything.
“And that’s what I did for a few hours, and I was astonished at what I
saw. I abandoned the rest of my plans for that day, scrubbed the commer-
cial [AT&T] System V that I had been using off my disk, and installed
Linux.”
Raymond was more than astonished, he says: “Flabbergasted wouldn’t
be too strong a way to put it. Because it completely contradicted all my
expectations about the quality of a project like that. I still had the classi-
cally Brooksian idea that you couldn’t do high-quality software with a
mob,” another reference to Brooks’s The Mythical Man-Month paper. Ray-
mond set out to study and understand this interesting phenomenon.
“The turning point for me in terms of getting to know the people of
the culture deeply was in 1996 when I was on the program committee
for the first Freely Redistributable Software Conference.” This confer-
ence was held in Boston, from 3 February to 6 February 1996. It had
been organized by Peter Salus, and was being run under the auspices of
the Free Software Foundation.
it does gloss over the complex reasons that top hackers such as Linus
start coding, as the genesis of Linux demonstrates.
His second aphorism, “Good programmers know what to write. Great
ones know what to rewrite (and reuse)” is close to a similar phrase that
is usually attributed to Pablo Picasso: “Good artists borrow, great artists
steal.” Raymond says the “parallelism was intentional.” The key point, of
course, is that traditional, black box closed software does not permit
such rewriting and reusing, and so cannot enjoy the benefits they bring.
“Plan to throw one away; you will, anyhow” is not just similar to the
words of someone else, but a direct quotation, in this case of Fred
Brooks. Brooks hovers as a constant presence over not only Raymond’s paper but also the entire free software movement. Although Brooks was
writing about the creation of mainframe software decades ago, the essays
in which he described and warned against the pitfalls of large-scale de-
velopment projects also indirectly suggest ways of avoiding them—do
things in a radically different manner. Much of The Cathedral and the
Bazaar is about understanding how to realize that difference in practical
terms.
The next three aphorisms form the heart of Raymond’s paper. “Treat-
ing your users as co-developers is your least-hassle route to rapid code
improvement and effective debugging,” he writes. “Release early. Release
often. And listen to your customers.” And last: “given a large enough
beta-tester and co-developer base, almost every problem will be charac-
terized quickly and the fix obvious to someone.”
Raymond recasts the last of these in the more punchy phrase “Given
enough eyeballs, all bugs are shallow,” and dubs it Linus’s Law; in many
ways this stands as the resolution of the conundrum posed by what is
known as Brooks’s Law: “Adding manpower to a late software project
makes it later.” Linus’s Law, taken with the previous two aphorisms,
states that the strength of open source lies in its ability to harness a huge
user base, notably via the Internet.
Raymond rightly calls Linus’s Law the “core difference” between the
cathedral-builder and the bazaar modes. As he writes, “In the cathedral-
builder view of programming, bugs and development problems are
tricky, insidious, deep phenomena. It takes months of scrutiny by a dedi-
cated few to develop confidence that you’ve winkled them all out. Thus
the long release intervals, and the inevitable disappointment when long-
awaited releases are not perfect.
“In the bazaar view, on the other hand,” he continues, “you assume
that bugs are generally shallow phenomena—or, at least, that they turn
shallow pretty quick when exposed to a thousand eager co-developers
effect of this has been that many more traditional members of the free
software community see Raymond as promoting himself more than the
tribe. In part, this is a natural suspicion in the face of that rarest of
beasts, the extrovert hacker. But these critics miss the point: Those who
are good at this role have to push themselves in the face of the media. To
ask them to be successful and retiring is simply a contradiction in terms.
Raymond has been effective in this role not only because of his ability
to coin quotable sound bites but also because his wide and unusual range of interests makes him into what journalists like best: a “colorful”
figure.
Raymond possesses a black belt in the martial art Tae Kwon Do and
describes himself as a “happy gun nut”; he has even persuaded people as
diverse as Linus and Stallman to join some of his shooting parties. A
hacker into firearms is striking enough perhaps, but Raymond manages
to cap that by calling himself a neo-pagan.
“I’m a third-degree witch” in the Wiccan religion—the highest level,
he says. “Wicca is a relaxed, nature-centered polytheism long on direct
experience and short on dogma,” he explains. “My beliefs don’t involve
much resembling anything that you would call traditional religious
faith,” he says, and in any case, he adds, “I don’t practice a lot anymore.”
But Raymond believes that “the techniques and attitudes that I’ve
learned from Zen and neo-paganism are very much part of what makes
me publicly effective.” They are also completely consistent with the
other beliefs that are central to his life: free software, no gun control—or
“an armed and self-reliant citizenry” as Raymond prefers to put it—and
libertarianism, which he explains as “the original individualist, small-government, free-trade, rely-on-the-market-not-on-coercion ideology.”
More specifically, he describes himself as belonging to a group called
“market anarchists” who “would like to abolish government altogether.”
It is not hard to divine where this thoroughgoing dislike of centralized
powers—be it in the world of software, religion, or politics—has its
roots. “I hated feeling powerless as a child,” Raymond says. “And it was-
n’t just the palsy: Even if I had not had the palsy, I would not have liked
being a child, because I was in a condition of dependence all the time.
Some people can readily live with that; I could not. I guess in some sense
I’ve generalized that; I want everybody to feel empowered.”
His latest project in empowering people is called The Art of Unix Pro-
gramming. “This is a book about how to think like a Unix guru,” he says.
“You can view it as a continuation of a theme that’s been present in my
work all along, which is the conscious elucidation of unconscious
knowledge.
“The specific motivation in this case,” he says, “is that there are
thousands of eager young Linux programmers who are running around with
bits and pieces of this Unix tradition, but they don’t have the whole
thing; they don’t have this synthetic overview, the philosophy, and it will
make them more effective if they have that.” As with The New Hacker’s
Dictionary, Raymond views this as a collaborative venture. “But the
population that I expect to be involved with this is somewhat more
restricted,” he notes, given the more rarefied nature of its contents.
Unix programming is an art, Raymond believes, because “when you do
it at a high enough level, there’s a very strong aesthetic satisfaction that
you get from writing an elegant program. If you don’t get that kind of
gratification, you never join the culture. Just as you don’t get composers
without an ear for music, you don’t get hackers without an ability to be
aesthetically gratified by writing programs.” This element of aesthetic
gratification perhaps provides the key to explaining one of the missing
pieces of Raymond’s otherwise comprehensive explanation of the
open-source process.
In a follow-up essay to The Cathedral and the Bazaar, called
Homesteading the Noosphere, Raymond explored an apparent paradox at the
heart of open-source software: If everyone is free to take the code and
modify it, why do major projects like Linux or Apache rarely split, or
‘fork,’ as hackers say, just as the old-style commercial Unixes did? He
suggests that peer esteem, the key driving force for people working in
the world of free software, explains the effect. He demonstrates well that
the dynamics of such a “gift economy”—where prestige is measured not
by what you have, but by what you give away—tend to reduce the threat
of forking.
Certainly this analysis goes a long way to explaining why so many
coders around the world willingly devote themselves to these projects
with no immediate thought of material reward. In one respect, it is
unsatisfactory: It does not explain why top hackers—Linus or Larry Wall—
began and then devoted so much effort to their fledgling projects.
Raymond’s “scratching an itch” does not suffice, either: The itches would
have been amply scratched by projects far less ambitious—and far less
impressive—than what became Linux or Perl, say.
There are precedents for gifted individuals who conceive and work on
huge projects, often with little hope not only of recompense but of
recognition. In other spheres, these people are called artists, and much of the
history of art is the story of those who create masterworks out of an
inner necessity to do so, over and above what they may have been paid to
produce.
Striking parallels exist between the stories of top hackers and famous
artists. For example, generous funding from aristocratic patrons enabled
Ludwig van Beethoven to devote himself to writing works of such
originality that they were often incomprehensible to his contemporaries, and
also to promote the then-radical ideals of liberty he believed in. In the
same way, Richard Stallman has been able to devote himself to his coding
and campaigning thanks in part to patronage such as the MacArthur
Foundation fellowship.
Johann Wolfgang von Goethe was a minister of state at the German
court of Weimar, and took his duties there as seriously as his work on
such projects as his masterpiece Faust, a vast patchwork of poetry that
occupied him for fifty years. Alongside these major responsibilities,
Goethe managed to juggle a family life (unlike Beethoven), rather as
Linus somehow does (unlike Stallman) while holding down a full-time job
at Transmeta and pushing forward his own life work that grows through
constant accretion.
Perceptive analyst that he is, Raymond is well aware of this aspect. In
Homesteading the Noosphere, he writes, “In making this ‘reputation game’
analysis, by the way, I do not mean to devalue or ignore the pure artistic
satisfaction of designing beautiful software and making it work. We all
experience this kind of satisfaction and thrive on it.” But he tends to
downplay it and subsume it in the ‘reputation model.’
Support for the view that artistic satisfaction is a better explanation, at
least for the key initiating figures of the free software movement, is
provided by someone to whom Raymond made a nod of respect in his
choice of the title The Art of Unix Programming for his next book. “It was
a conscious reference to Donald Knuth’s The Art of Computer
Programming,” he says.
Although not well known outside computing circles, Donald Knuth
towers over the discipline of computer science; he is a kind of
grandfather of the free software movement and foreshadowed many of its ideas
and techniques with his own important and widely used programs, all
freely available.
Knuth was born in 1938, in Milwaukee, Wisconsin. During a brilliant
early career as a physicist and mathematician at Case Institute of
Technology—where he was awarded a master’s degree contemporaneously
with his B.S.—he became interested in the young world of computer
science.
In 1962, he started work on a book that was originally intended to be
about the design of compilers—the programs that convert source code
into binaries. The project blossomed and became the multivolume work
The Art of Computer Programming. Knuth’s plan to publish later parts of it
serially was inspired in part by the example of Charles Dickens, whose novels
were serialized in a similar way. The main advantage in serialization is that
Knuth will receive feedback from his readers as he goes along; thus the
debugging process will move faster.
Knuth’s masterwork would not have been possible had he not had ac-
cess to the work of his fellow computer scientists and the academic tra-
dition they formed part of. The ready availability of their results meant
that he could draw ideas from many sources. “You put together two ideas
that are in two different journals,” he says; “maybe the authors never
knew each other. I’m the first person that really knew that these two peo-
ple were working on the same subject, which happens an awful lot.”
Putting them together in this way often enabled Knuth to come up with
important and original results, but far more quickly and efficiently than
if he had performed all the preliminary work, too. In this sense, The Art
of Computer Programming is one of the greatest monuments to the
openness and sharing that lies at the heart not only of open-source software
but of the entire scientific tradition.
Conscious of the magnitude of the task still before him, Knuth has not
used e-mail since 1990. This apparently extreme decision to cut himself
off from the outside world to minimize distractions during the creative
process is not without precedents. In the early years of the twentieth
century, Marcel Proust was similarly engaged in a work that would
dominate his life: his vast, autobiographical novel A la recherche du temps
perdu (In search of lost time). To aid his concentration, Proust had the
room where he worked in his Paris home lined with cork to muffle the
sounds from the outside world. It is hard to think of a better physical
correlate for Knuth’s act of online disconnection.
The link between Proust and Knuth goes deeper. The opening page of
The Art of Computer Programming reads, “This series of books is
affectionately dedicated to the Type 650 computer once installed at Case
Institute of Technology, in remembrance of many pleasant evenings”—a
conscious reference, Knuth acknowledges, to the title of Charles Scott
Moncrieff’s standard English translation of Proust’s masterwork:
Remembrance of Things Past.
It seems appropriate that somebody who began his career working
with the then-new IBM 650 computer system now finds himself an
honorary and highly honored member of today’s computer vanguard: open
source. Given his personal track record of openness and free distribution
of software, it is not surprising that he should be such a staunch
supporter of source code availability. “I’m more than sympathetic to that; I
consider that absolutely indispensable for progress,” he says.
Knuth uses GNU/Linux for his own computing, and one of the main
programs he runs is Emacs: “That’s my favorite piece of open-source
software,” he says. Knuth has even sent Stallman suggestions for Emacs. “He
hasn’t put them in yet,” he notes, but acknowledges that “I didn’t send
them by e-mail,” because he never uses it, and that this omission may
have had an influence. Nonetheless, Stallman remains something of a
personal hero for him. “He’s on the right side of the crusade,” Knuth says.
As well as being a distinguished academic, a prolific author of key
books and papers in his field, and what can only be described as one of
the most senior proto-hackers, Knuth is also a keen musician: He even
contemplated studying music at university instead of physics. More
recently, he had a fine two-manual baroque organ built in a specially
designed room of his house.
Knuth shares a love of music with many of his fellow programmers of
free software: Larry Wall trained as a musician, Richard Stallman takes
music and instruments with him wherever he goes, Ted Ts’o and Stephen
Tweedie sing in choirs (and Tweedie conducts them, too), Dave Miller
has variously played guitar, drums, and piano, and Eric Raymond has
been a session flutist on two albums. The Victorian art historian and
theorist Walter Pater wrote in his 1877 essay “The School of Giorgione” that
“all art constantly aspires towards the condition of music”; perhaps this
applies to the art of code, too.
But whatever the close links between coding and artistic creation, it
would be a mistake to regard hacking as an ivory tower activity, of
interest only to some inner circle of adepts. Knuth himself wrote in his essay
Computer Programming as an Art: “The ideal situation occurs when the
things we regard as beautiful are also regarded by other people as use-
ful.” He adds, “I especially enjoy writing programs which do the greatest
good, in some sense,” a clear echo of Stallman’s comment about the pride
he felt early on in his programs that were useful to people.
Just as the best source code always implies the existence of a usable
binary, so there seems to be a sense among hackers that the best software—
the most beautiful as Knuth and others would say—implies that it serves
its users as much as possible. Time and again, the top hackers refer to the
sense of community their work creates, a community defined by the
users of the binaries generated from their code.
This sense of creating and serving a community, perhaps as much as
any aesthetic sense, is the commonest motivation for the greatest
hackers. The community was the goal, not just a result, of Stallman’s creation
of the GNU project. Sendmail’s creator Eric Allman says that for him free
software is “about society,” and adds, “I suppose that sounds fairly out
there, but that really does it for me.” Similarly, speaking for the core
members of the Apache Group, Brian Behlendorf puts it this way: “To us
the primary benefit of using Apache was using it as part of the
community at large, rather than just using it for the 1s and 0s it represented.”
Perhaps the most articulate exponent of the centrality of the community
to open source is Larry Wall. He has spoken of wanting to serve people
through Perl, and realized that this “needed the culture around the
language,” as he says—and cultures are generated by a community of users.
“That starts off small with the feedback loop,” he says, “but I also real-
ized that there’s not only the first-order feedback loop going on there,
there’s the interactions among people, and that it’s probably more
valuable what the other people exchange among themselves in terms of Perl
scripts than what I’m exchanging with them. Because the second-order
effects are many-to-many, whereas the relationship with the center is
one-to-many.”
In a sense, Wall wrote Perl and distributed it freely to give members of
the community that grew around it the possibility of a similar generosity
in their interactions with each other—but one on an even larger scale,
thanks to the network effect. “By and large I think I’ve got my wish on
that,” Wall says. “People really do help people for the sake of helping
people.”
For Wall, a committed Christian, this hoped-for effect is nothing less
than “theological.” Wall draws explicit parallels between his creation of
Perl, artistic creation in general, and the primordial Creation. “The
metaphor of messianic overtones is certainly there,” he admits, and adds
with a characteristic ambiguity, “I simultaneously try to take them
seriously and not seriously.”
Surely it is no accident that Knuth, too, is a deeply religious man, as
are other leading hackers such as Ted Ts’o, and that even those who are
not religious, such as Richard Stallman and Linus, have a strong
underlying ethical component to their characters. This pervasive aspect means
that, unlike commercial software vendors, which at most can promise
material rewards, the free software projects can offer something much
more valuable on a human level, if more intangible. Tapping into the
best that is in people, it calls forth the best in all sorts of ways, not least
in code quality.
After 1996, even the most cynical and jaded observers of the
technology scene were beginning to notice this. Free software, the Internet’s
best-kept secret, was about to enter the mainstream.
10
nel released, including no less than sixteen in April alone. In 1997, at the
beginning of which Linus moved with his wife, Tove, and daughter to
Silicon Valley, there were under sixty—still an astonishing number, of
course, were it not Linux.
As a result, 1997 might have seemed a relatively quiet year for
GNU/Linux. But beneath the surface, much was happening, particularly
as far as increasing people’s awareness was concerned. An important
catalyst for these changes was the appearance of Eric Raymond’s essay The
Cathedral and the Bazaar, which he had finished in January 1997. “One
of the first people I bounced the paper off was Erik Troan,” Raymond
explains. “He was a key developer at Red Hat, and a good friend of mine.
“And he said, ‘Wow, you’ve really got something interesting here, and
you should give it at the next conference. And I happen to know that the
people at Linux Kongress are looking for good material.’ So I sent it to
them,” Raymond explains, “and they said, ‘Wow, yeah, we want you to
come to Bavaria and give this paper.’”
The Linux Kongress was one of the most venerable annual meetings in
the Linux calendar, and was already in its fourth year when Raymond
went in May 1997. It was held in Würzburg, Germany, and other
speakers included top hackers such as Dave Miller, Ted Ts’o, and Stephen
Tweedie, as well as the publisher Tim O’Reilly.
“When I heard Eric’s talk,” O’Reilly explains, “I said, this is fantastic,
and that was when I invited him to be the keynote speaker for our
upcoming Perl conference that summer. I had decided that one of the
things that I needed to do was to raise the visibility of some of these
programs [like Perl],” he says. “There was just some part of me that was just
really irritated by the fact that the computer industry was ignoring these
incredibly important people and programs.” O’Reilly had been alerted to
the rise of free software by a surge in sales of the Perl titles he published.
“Eric’s thinking certainly shaped mine at that point,” O’Reilly says. “We
bounced off each other a lot, and so I used some of his ideas in setting up
the conference. Really in a lot of ways that first Perl conference was
themed around some of Eric’s ideas.”
The conference took place in San Jose, California, on 20 August 1997.
Larry Wall made his first idiosyncratic keynote, which bore the punning
and yet descriptive title of ‘Perl Culture,’ and Raymond read The
Cathedral and the Bazaar. This time, Raymond notes, there was a subtle
difference to the response he received from that of the audience in Germany.
“In the intervening couple of months,” he explains, “the paper had
spread through the culture so rapidly that at the Perl conference there
was already a sense of celebration that hadn’t been present in Bavaria.”
Raymond saw that the free software community had a unique
opportunity to exploit the media interest it had generated.
After visiting Netscape at its headquarters in Mountain View,
California, Raymond convened a meeting at the offices of the GNU/Linux
hardware company VA Research, at that time also located in Mountain View,
on 3 February 1998. “I put forward the proposition that we needed a
new label that was less threatening to the mainstream,” he remembers,
“and we brainstormed it and we came up with ‘open source.’”
Those taking part included Raymond; Larry Augustin, the CEO of VA
Research; John “maddog” Hall, who was there by telephone for part of the
meeting; Sam Ockman, from the Silicon Valley Linux User Group; and
Christine Peterson, president of Foresight Institute. Raymond explains,
“The Foresight Institute [is] a bunch of thinkers who are concerned about
nanotechnology,” building machines on a molecular scale, “trying to bring
it into existence, and trying to control it so that we use it properly.” It was
Peterson, he says, “who actually came up with the term ‘open source.’”
One of the main items of business for the Freeware Summit organized
by O’Reilly, which took place on 7 April 1998, in Palo Alto, was to find
an alternative to the name “free software” that all the leaders present
were happy to rally behind. Suggestions included not only the newly
coined “open source,” but also “freeware,” “sourceware,” and “freed
software.” After some discussion, a vote was held, and “open source” won.
In addition to the name, Raymond had also come up with the idea of
an Open Source Definition. He wanted something that allowed licenses
other than that of the GNU GPL, but that still promoted the key ideas
behind free software. The same issue had been confronted by the Debian
group a year earlier. As Perens, who was Debian leader at the time,
explains, “We wanted Debian to be 100 percent free software, and at the
time we sort of knew what free software was from the philosophy of
Richard Stallman and the Free Software Foundation. But we wanted to
be able to accept more licenses than just the GNU one, and we had
already put together a system that we thought was all free software, but we
hadn’t really formalized what free software was.
“So I sat down and wrote the first draft of the entire Debian Social
Contract,” Perens says. “I submitted the first [draft] in early June [1997].
We discussed on the private mailing list for an entire month, and then
we voted it into project policy.” An important part of the Social Contract,
formally announced on 6 July 1997, was the Debian Free Software
Guidelines, which, Perens adds, essentially outlined “what we would
give back to that [free software] community in return for all of the great
programs we were getting from them.”
Perens explains how the Open Source Definition came about. “Eric
Raymond called me up,” he says, “the day after the meeting at which the
term ‘open source’ had been coined.” Raymond explained the thinking
behind the new name, and said he was looking for a definition to go with it.
“When Eric called me, I said, ‘OK, that sounds like a good idea, let’s
trademark “open source” and let’s bind it to [the] Debian Free Software
Guidelines; we’ll call that the Open Source Definition.’” Little needed to
be changed: “I did not make any substantive changes to the document,”
Perens says. “I only changed it from Debian to make it general.”
The Open Source Definition lays down nine criteria that the
distribution license of software must meet to be called “open source.” The first
three—the ability to distribute the software freely, the availability of the
source code, and the right to create derived works through
modifications—enshrine the basic characteristics that lie at the heart of the new
software methodology. The other criteria spell out ancillary
requirements; for example, they ensure that the license does not discriminate
against persons or groups or fields of endeavor (such as business), and
they close loopholes that might otherwise be exploited.
In a press release issued before the Freeware Summit, the software
projects it embraced were called “freeware”; after the summit, another press
release called them “open source” (“sourceware” was also mentioned).
The reason given for the shift is significant: “While this type of software
has often been called ‘freeware’ or ‘free software’ in the past, the
developers agreed that commercial development of the software is part of the
picture, and that the terms ‘open source’ or ‘sourceware’ best describe the
development method they support.”
That is, the meeting represented a conscious attempt to make
comprehensible and acceptable to software companies what had hitherto been
something of a fragmented and fringe activity. This was an important
repositioning. The leaders present at the Freeware Summit had agreed
that to drive the uptake of their software further they needed to adopt a
more business-friendly approach—including an easily remembered and
understood name. In other words, a brand.
Richard Stallman has always viewed this shift with alarm. “The open
source movement is Eric Raymond’s attempt to redirect the free software
movement away from a focus on freedom,” he says. “He does not agree
that freedom to share software is an ethical/social issue. So he decided to
try to replace the term ‘free software’ with another term, one that would
in no way call to mind that way of framing the issue.
“In the GNU Project,” Stallman emphasizes, “we want to talk about
freedom, so we continue to describe our software as ‘free software’; we do
not use the term ‘open source.’ Raymond hopes that using the term ‘open
source’ will convince existing software companies to release useful pro-
grams as free software. This is useful when you can do it, but what our
community needs most is to be full of users who value their freedom and
will not easily let it go.”
There was little that Stallman could do to prevent the others from
carrying out what amounted to a rebranding exercise. The marketing aspect
of the summit had been one of the key considerations even while it was
being planned. “As we thought about it,” O’Reilly recalls, “we said, gosh,
this is also a great PR opportunity—we’re a company that has learned to
work the PR angles on things. So part of the agenda for the summit was
hey, just to meet and find out what we had in common. And the second
agenda was really to make a statement of some kind about this was a
movement, that all these different programs had something in common.”
At the end of the summit, a press conference was held. Nothing could
symbolize better the new approach than this phalanx of top hackers
facing the press for all the world like the board of some conventional
corporation. O’Reilly recalls that “the basic message was, you guys [in the
press] are talking about ‘are you going to beat Microsoft?’—and I said,
look at these [open source] guys. Every one of them has dominant mar-
ket share, with no money, nothing but the power of their ideas, and this
new model. And I went down the list and said look, here’s Apache, here’s
Sendmail, all these programs that are the market leader. I said, so tell me
this isn’t a winning model.”
The Freeware Summit was a key moment in the history of free
software. And as Eric Raymond points out, “One of the things that was
interesting about it was [that it was] the first intentional act of community.”
All the key players—except for Stallman—had not only agreed to oper-
ate collectively but had also sketched out the strategy for growing this
new community.
The summit and these decisions would have important long-term
effects, but even in the short term they produced a heightened awareness
among the media. That awareness was furthered at another meeting,
which would also have direct and significant consequences, held in
Silicon Valley three months later.
Called the “Future of Linux,” it was organized on 14 July 1998 by the
Silicon Valley Linux User Group (SVLUG) at the Santa Clara Convention
Center, with support from Taos, a company offering interim staffing in
the Unix and Windows NT fields, and VA Research. As the report on the
user group’s Web site explained afterwards, it took the form of a panel
discussion that offered “some frank talk about things Linux needs to be
any input from us; he said he expected us to use all their changes
precisely as supplied and refused to adapt them in any way to make them do
what we wanted. If their aim was to help, they ought to have started by
talking with us about how to proceed.”
The events that brought Linux perilously close to a fork show
interesting parallels to the Emacs situation. The events of autumn 1998 were
triggered when Linus was unable to keep up with the flow of patches
that were being sent to him; as with the XEmacs split, “cultural and
technical” differences arose between the two main factions.
Things began in the most innocent way imaginable. On 28 September
1998, Michael Harnois, a kernel hacker, posted a simple technical
question to the Linux-kernel mailing list, long the key debating chamber of
the inner core of top hackers. Harnois wanted to know why a small piece
of code didn’t work in version 123 of the 2.1 development kernel. As is
usual, somebody soon
came along with an explanation for the problem, and attached a patch
that would sort things out.
But then another hacker, Geert Uytterhoeven, stepped in and said:
Please don’t waste your time on creating these patches. These things are
functional in the “vger” tree.
“Vger” has entered into Linux mythology, and refers to a computer
located at Rutgers University that ran the Linux-kernel mailing list, among
others. It was set up by Dave Miller when he was studying and working
there. Miller explains how this came about. “The [mailing] lists which
existed when I showed up were run in Finland,” he recalls, “and were
starting to reach a state of being unmaintained. People began to
complain, the lists worked only every other day, you couldn’t get responses to
help unsubscribe when one had problems, etc.
“So I asked my co-workers [at Rutgers], ‘Can I run a few small mailing
lists here on this old machine that doesn’t get used much?’ The response
was something like, ‘Test one, and if no problems show up you can add
the others.’ I think they all were moved over to Rutgers within the next
two weeks or so,” Miller continues. “The machine’s name at Rutgers has
always been vger. Vger sounds like a funny name, doesn’t it? Remember
Star Trek, where the spaceship named ‘Voyager’ flies past the screen, and
some of its letters had been scuffed off? The remaining letters spelled out
VGER. Hey, I’m not a Trekkie and I hadn’t named this machine, but it’s
an interesting Linux history tidbit, I suppose.”
Vger was also used for running a program called Concurrent Versions
System (CVS). This is special software that is used to manage software
development; essentially, it allows people to keep track of the current
state of a project, for example, which patches have been applied. It is
widely used, and is a boon for keeping on top of a complex and
fast-moving project such as Linux. Uytterhoeven’s message that things were
“functional in the vger tree” meant simply that the patches had been
applied to the code in the CVS system on the vger machine. There was only
one problem: Linus did not use CVS on vger.
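CVS has long since been superseded for kernel work, but the daily routine it supported can be sketched in a few commands. The repository path and module name below are hypothetical, given only for illustration; this is generic CVS usage, not a record of the actual vger configuration.

```
# Fetch a working copy of the tree from a CVS server
# (hypothetical repository path and module name):
cvs -d :pserver:anonymous@vger.rutgers.edu:/home/cvs login
cvs -d :pserver:anonymous@vger.rutgers.edu:/home/cvs checkout linux

# Later, merge in whatever patches have since been applied upstream:
cd linux
cvs update -d

# Show how the local tree now differs from the checked-out revisions:
cvs diff -u
```

A hacker who ran `cvs update` against such a tree would thus see every patch its maintainers had applied, whether or not Linus had accepted it.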
As a result, when replying to the previous message about not wasting
time creating new patches, Linus wrote:
We’re painfully aware of this. We do sync up with you though, so the vger
tree does have what he needs along with your changes.
Note that saying “it’s in vger, so you’re wasting your time” is still completely
and utterly stupid. The fact that it is in vger has absolutely no bearing,
especially as there’s a _lot_ of stuff in vger that will probably never make it into
2.2.
That is, patches had been entered into the vger CVS system that Linus
was not going to take; therefore, the snapshot of the kernel was still out
of sync with his, despite Dougan’s efforts and those of others. This meant
the kernel held there had elements that would not be used in the official
release of the upcoming 2.2 version of Linux.
0738206709-02.qxd 9/4/03 10:20 AM Page 174
After Linus’s comments about not syncing with vger, the specter of a
fork was raised by Kurt Garloff:
And you think it’s a good idea to have the linux community divide into two
parts: The Linus party and the VGER party? One of the problems of free
software is to have one version accepted as the standard one and to have
people fix this one and not their highly customized one. Don’t do that!
A few hours later, Martin Mares brought up the issue of having patches
ignored:
Some time ago, you promised you’ll accept a single large video patch before
2.2 and I did send it to you, but I got ignored.
Dave Miller then pointed out to Linus that it was this large video patch
that was at the root of vger’s difference from Linus’s code:
_everyone_ who maintains the video driver layer is telling you “look at
Martins patch,” nothing more. If you continually ignore him, you are the
roadblock, plain and simple. So instead of being a roadblock, please express your
grievances about Martin’s attempt to merge things with you.
Stop pointing fingers towards “vger,” people are sending you patches,
continually, and are being ignored and not being told why _even_ after you had
told them you would accept such a patch for the video subsystem.
Garloff voiced a growing fear to Linus about where all this might lead:
We all know that there are and will be splits. But the problem is that
vger is sort of important, because some really strong people use it
(DaveM . . . )
Seems it’s hard for some of the vger people to accept that you are still
the Maintainer of the official release.
But please spend some time to have this cleared. And please tell us, what
the outcome of this is and whether one should care about vger, if we want to
fix things . . .
But his comment, “I see, it’s not your fault” provoked Miller’s wrath:
Bullshit, nobody has told him “Linus, go look at vger and take that code,”
this is utter crap and I want this claim to cease immediately.
He reiterated his view that the basic problem was Linus’s dropping
patches:
People have been trying over and over to send him a patch which fixes bugs
and gets the drivers/video directory sync’d up driver wise for other
architectures, which he said he would put in, and over and over this person is being
flat out ignored by Linus.
The reason I’m disappointed is that vger in particular has been acting as a
“buffer” between me and bug-fixes, so that now we’re in the situation that
there are obviously bugs, and there are obviously bug-fixes, but I don’t see it
as such, I only see this humongous patch.
and concluded:
I’m going to ask David once again to just shut vger down, because these
problems keep on happening.
At this point, Ted Ts’o, perhaps the most senior Linux lieutenant in
length of service, and therefore a figure of considerable authority within
the movement, entered the debate with comments and wise words ad-
dressed to Linus on the subject of what Ts’o termed “bandwidth con-
straints,” which referred to Linus’s inability to keep up with the flood of
patches.
To be fair to the vger people, one of the problems which the vger CVS tree is
trying to fix is that sometimes you don’t take patches very quickly. There’s
been at least one set of . . . patches which I had to send you two or three
times before you finally accepted it—and they were short patches (1-3 line
changes in 4 files), and with a full explanation of what it did. Heck, in the
most recent case you and Alan [Cox] and I discussed different approaches
for solving this problem before I even started coding the patch.
So we have a problem, and perhaps vger isn’t the best solution. . . . But be-
fore you go slamming the vger folks because this patch batching effect
which you don’t like, it might be nice for you to acknowledge that some of
your bandwidth constraints may have contributed to the problem, and they
were simply trying to find a way around it.
What had begun as a simple question about an obscure bug some forty
hours earlier had turned into an increasingly heated argument raging
across two continents and ten time zones. Suddenly, Linus has had
enough.
First, he fires off a shot directed specifically at Dave Miller:
Quite frankly, I just got very fed up with a lot of people. David, when I come
back, I expect a public apology from you.
Others, look yourself in the mirror, and ask yourself whether you feel confi-
dent that you could do a better job maintaining this. If you can, get back to
me, and maybe we can work something out.
Quite frankly, this particular discussion (and others before it) has just made
me irritable, and is ADDING pressure. Instead, I’d suggest that if you have a
complaint about how I handle patches, you think about what I end up having
to deal with for five minutes.
Go away, people. Or at least don’t Cc me any more. I’m not interested,
I’m taking a vacation, and I don’t want to hear about it any more. In short,
get the hell out of my mailbox.
From these exchanges, nobody could mistake the dire state of relation-
ships between the key hackers on the kernel mailing list.
A few hours after Linus’s final posting, Eric Raymond added his com-
ments on the situation.
People, these are the early-warning signs of potential burnout. Heed them
and take warning. Linus’s stamina has been astonishing, but it’s not limit-
less. All of us (and yes, that means you too, Linus) need to cooperate to *re-
duce* the pressure on the critical man in the middle, rather than increasing
it.
He points out one central fact for the Linux development process:
Linus is god until *he* says otherwise. Period. Flaming him doesn’t help,
and isn’t fair—and you need to have been the key man in development of a
must-never-fail piece of software before you even have standing to *think*
about doing it.
Patches get lost. Patches get dropped. Patches get missed. This is bad and
wasteful in itself, but it has a secondary effect that is worse—it degrades
the feedback loop that makes the whole process work. . . . The effect of ris-
ing uncertainty as to whether good work will make it in at all is certainly
worse than that. Anybody who starts to believe they’re throwing good work
down a rat hole will be *gone*. If that happens too many times, we’re his-
tory.
In other words, Linus’s dropping patches too often was not just incon-
venient but undermined the very mechanism that powered the open
source development model.
These risks are bound to get worse over time because both system complex-
ity and the developer pool are increasing. And the critical man in the mid-
dle—the “Jesus nut” in our helicopter—has a stress limit. We’re going to hit
that limit someday. Maybe we’re pushing it now.
He concludes:
I’ve been worrying about this problem for months. (I’m our anthropologist,
remember? It’s part of my *job* to notice how the social machinery works
and where the failure modes are.) I was reluctant to say anything while it
was still theoretical, but I take the above as a neon-lit warning that it’s
damn well not any more.
Raymond was not the only person observing these developments with
disquiet. “There are some of us, myself among them, that have been wor-
ried about this for a while and are working on a solution,” commented
Larry McVoy—author of the original Sourceware document and now one
of the most committed Linux hackers—the day after Linus announced
his “vacation.”
Like Raymond, McVoy had a memorable way of describing the situa-
tion: “The problem is that Linus doesn’t scale,” he wrote in his posting to
the Linux-kernel list. This is a sly reference to one of the common accu-
sations against Linux: that it does not scale up to handle enterprise-level
tasks. McVoy then went on: “We can’t expect to see the rate of change to
the kernel, which gets more complex and larger daily, continue to in-
crease and expect Linus to keep up. But we also don’t want to have Linus
lose control and final say over the kernel, he’s demonstrated over and
over that he is good at that.”
As well as an analysis, McVoy thought he had a solution, or at least the
blueprint for one. His idea was to create a new piece of software that ad-
dressed the growing problems of kernel development.
He explains why he was willing to go to such efforts to resolve the
problem back then in the autumn of 1998. “If you look back at my view
of how the Unix vendors splintered,” he says, as explained in his Source-
ware proposal, “and how that was universally a bad thing for the source
base itself, and the product itself, my point of view was the worst thing
that you could do to a project is split the source stream. There’s nothing
worse. Two leaders, doesn’t work.”
Adds Miller, “I got very frustrated and emotional about the whole or-
deal; I’d lost my mind and my common sense. Things were working in
slow motion; we’d been in a development series kernel for nearly two
years with no end in sight. Linus was overextended and he was probably
just as frustrated as the rest of us about how backed up and busy he was
with other things. Linux is his baby, so I’m rather confident about that
statement. I’m pretty sure that he just didn’t have the time to allocate
what was needed during this period, he had so many other things hap-
pening in his life.”
Miller then goes on to make an interesting point: “Really, when [Li-
nus] has the time to put towards it, nobody can keep up with him. So in
the end, the issue really wasn’t so much a ‘Linus doesn’t scale’ thing as it
was a ‘Linus has too much other crap happening in his life right now’
thing”—involving a complex mix of his work at Transmeta, his growing
family, and those increasingly demanding journalists.
“It was an evening session,” McVoy says of this critical meeting. “They
showed up by probably somewhere between 5 and 7 in the evening. Li-
nus was pretty frustrated and burned out. It was a very tough time for
him. And legitimately so, right? Everybody viewed him as a superhuman
guy, and everybody wanted him to drop what he was doing and pay at-
tention to their patch, and nobody had really realized yet that wait, this
guy is human, there are limits to what he can do. Because so far there
had been no limits.
“I think we talked technical stuff before we hit dinner for a number of
hours,” McVoy adds, “and then talked a number of hours after dinner.”
Miller recalls: “Initially, the discussions were procedural, about what
things we were doing as developers that made more work for Linus, and
how we could alleviate some of that.
“Then I started describing how I thought this [BitKeeper] stuff could
work,” McVoy continues. “And it took several go-arounds; I think it took
a couple of hours. I described the architecture, and I described how in-
formation would flow through it, and how this would solve problems.”
Miller was supportive: “Dave was pushing anything that would help the
process,” McVoy says. “[BitKeeper] has a lot of the kinds of things we
need,” Miller explains. “And I’ll say one thing, Larry really listened to Li-
nus about what he’d like if he ever used BitKeeper for managing the
Linux kernel sources.”
As to whether Linus will use the system for day-to-day work on the
kernel, McVoy says, “I get the impression that Linus will use it—he’s
stated what he will do.” But on one important condition: “He will use it
if it’s the best,” McVoy notes. “The best in his mind is not better than
anybody else’s; the best is the best, as good as it could possibly be. The
question is, where’s his bar? His bar’s pretty high. How close am I to
reaching that bar? Well, I don’t think I’m there yet, but I’m damned
close.”
In September 1998, BitKeeper 1.0 lay far in the future. Meanwhile,
other measures were put in place to relieve at least some of the pressure
on Linus. Alan Cox says, “Once we figured out the right way to get patch
submissions flowing smoothly, it all worked out fine.” During 1999, Li-
nus gradually began to reduce his time with the press, until by the fall of
that year he had more or less ceased to give interviews. Eric Raymond
says, “One reason I took on the ‘minister of propaganda’ role was so Li-
nus himself would be able to do less of it.”
Against this background, what is remarkable about the “Linus does
not scale” episode is that nobody outside the Linux world noticed. Even
though the equivalent of a major boardroom power struggle took
place—one that might have been seen as exposing a potentially fatal
weakness in the Linux phenomenon—the mainstream press completely
missed out on this hot story.
The press probably overlooked the story because it played out through
the Linux-kernel mailing list. Although publicly accessible, this list is
hardly the sort of forum that journalists just discovering open source
would be monitoring closely, even though in some senses it represented
the interior dialogue within the “brain” of the Linux organism. As a re-
sult, the Linux-kernel mailing list functioned rather like the Swedish
community’s Ankdammen, or Duck Pond, where everybody knew every-
body, but where the group as a whole tended to keep to themselves.
The other significant reason the national and general technical press
missed the gripping drama being played out in this obscure if important
arena was that 1998 also saw an accelerating uptake of GNU/Linux and
open source by the giants of the commercial computer world. Reporters
were too busy trying to keep up with and understand the enormous cul-
tural shift this represented to be worrying about a few dropped patches.
11
WHEN NETSCAPE ANNOUNCED ON 22 JANUARY 1998 that it was
making the source code for the next generation of its Web browser soft-
ware freely available, this not only marked a watershed in the history of
commercial software but also represented the final and symbolic coming
together of two great currents in the field of computing: the Internet and
open source.
As described previously, the main Internet services are run almost en-
tirely by free software on the server side: BIND for the Domain Name
System, Sendmail for e-mail, and Apache and Perl for the Web. The rise
of the World Wide Web occasioned a battle between open and closed
software in the increasingly important area of the client—the browser
software that ran on what soon amounted to millions of desktops as the
Web was quickly taken up by business and end-users.
The silent struggle begins in October 1991, around the time Linus
posted version 0.02 of Linux. At the end of that month, Tim Berners-Lee
had inaugurated a mailing list called WWW-Talk for all those interested
in the new World Wide Web, which he had only recently released to the
public. The following day, Berners-Lee told the members of the list that
somebody called Dan Connolly was working on an X11 browser; that is,
software to access the Web that made use of the X Window system, also
known as X11.
Dan Conolly (Convex Inc) has put together a W3 browser for X but could
not release the code. A group of students in Finland were also going to do
this for a project—I don’t know the status of that work. Anyone who makes
a good X11 W3 browser will be very popular.
The “group of students” was not just in Finland, but working under
Ari Lemmke, the person who cajoled Linus into giving his kernel the
name “Linux” rather than “Freax.” There were at least two other X11
browser projects—ViolaWWW and MidasWWW—alongside the
Finnish software, which was called Erwise. But it was another that was
to prove, as Berners-Lee had put it, “very popular”: Mosaic.
Mosaic had been written at the National Center for Supercomputing
Applications (NCSA), part of the University of Illinois. The Mosaic pro-
ject was led by a young man named Marc Andreessen, who not only became
one of the founders of Netscape but played a key role in defining
the Web as we know it.
Andreessen’s first appearance on the WWW-Talk list is unassuming
enough. On 16 November 1992 he asked:
Anyone written code to construct HTML files in Emacs? I’m hacking some-
thing up; let me know if you’re interested.
He soon followed up:
OK, here’s a first pass at an html-mode for Emacs. Comments, bug reports,
and enhancements are welcome.
The html-mode for Emacs was a way of turning Richard Stallman’s in-
finitely adaptable tool into what would be called an HTML editor today.
In fact, Andreessen was one of the top Emacs hackers at NCSA. As an ex-
tension to Emacs, his html-mode was naturally released under the GPL,
just as Berners-Lee had hoped Connolly’s X11 browser might have been.
After this fairly low-key start, Andreessen was soon posting regularly.
He quickly became the third most frequent name on the WWW-Talk list,
after Connolly and Berners-Lee. On 1 December there appeared what
would turn out to be a prescient posting from Berners-Lee, who was
sounding a note of mild alarm:
Although this is precisely how images are handled today, at the time it
was controversial because it added a new HTML tag, and Berners-Lee
was then in the process of formally defining HTML. The last thing he
wanted were new tags popping up when he was trying to nail the old
ones down. It also raised the spectre of others coming up with their own
solutions, and a fragmentation of Web standards.
But as Andreessen pointed out:
My purpose in suggesting IMG is that things are reaching the point where
some browsers are going to be implementing this feature somehow, even if
it’s not standard, just because it’s the logical next step, and it would be
great to have consistency from the beginning.
“Consistency from the beginning” was a beautifully judged way of saying
“we want everyone to adopt our standard.”
On 12 March, Andreessen staked his team’s claims more strongly when
he announced the imminent arrival of the new browser:
Back to the inlined image thread again—I’m getting close to releasing Mo-
saic v0.10, which will support inlined GIF and XBM images/bitmaps, as
mentioned previously.
This was the point at which the personal computer platforms had reached
parity with the original Mosaic X11 browser. Similarly, the performance optimization (for
14,400 baud modems) was a signal that Mosaic Netscape was aimed at
users outside universities, since those who were in universities would
typically have faster links than this to the Internet and would not be us-
ing modems.
The most revolutionary aspect of this announcement is probably not
even evident today. That a company was not only making a program
freely available—there was no charge for downloading Netscape version
0.9, and the program was “free to individuals for personal use”—but
throwing open the entire beta-testing process to the general public was
without precedent for a launch product.
The announcement reflected Netscape’s origins in the university and
free software world where this kind of public debugging was standard
and, as much as anything else, heralded Mosaic Communications as the
first of a new breed of what would soon be called Net companies. Along
with a beta-testing program on a scale that was unprecedented, the deci-
sion to allow anyone to download copies of Netscape free had another
key effect: It introduced the idea of capturing market share by giving
away software, and then generating profits in other ways from the result-
ing installed base. In other words, the Mosaic Netscape release signaled
the first instance of the new Internet economics that have since come to
dominate the software world and beyond.
At the time, none of this was apparent. One thing that was obvious to
the University of Illinois, though, was that it was losing its browser’s
name and its dominant position in the browser sector. With most of its
programmers gone, there was little the university could do to stem the
rising tide of Netscape’s success, but it could and did take exception to
“Mosaic Communications.” As a result, the company became Netscape
Communications on 14 November 1994: “To further establish its unique
identity in the industry and to accommodate concerns expressed by the
University of Illinois,” as the press release put it.
A month later, Netscape Communications shipped version 1.0 of its
flagship Navigator browser and set the single license price for commer-
cial use at $39. It also released two Web servers from its Netsite Web
server line. The Netsite Communications Server cost $1,495, and Netsite
Commerce Server, which offered secure transactions through new tech-
nology invented by Netscape, cost $5,000.
Partly because the company had shrewdly placed details of its secure
transaction technology on the Internet so that others could adopt it—
and establish it as the de facto standard—Netscape proudly referred to its
been doing free software since I’ve been doing software,” he explains. Za-
winski had been one of the leaders in making the painful decision to fork
the Emacs code in 1993.
Zawinski joined Netscape in June 1994. He says, “That was like a
month or two after the company was founded; there was basically no
code written yet.” His task was to create the X Window version of what
would become Netscape—and to do it in just a few months. On his per-
sonal Web site, Zawinski has placed excerpts from his diary at the time
in a piece called “netscape dorm.” As he writes there: “This is the time
period that is traditionally referred to as ‘the good old days,’ but time al-
ways softens the pain and makes things look like more fun than they re-
ally were. But who said everything has to be fun? Pain builds character.
(Sometimes it builds products, too.)”
“Netscape dorm” provides a vivid picture of what those heady early
days at Netscape were like: “I slept at work again last night; two and a
half hours curled up in a quilt underneath my desk, from 11 A.M. to 1:30
P.M. or so.” And, “I just got home; the last time I was asleep was, let’s see,
39 hours ago.”
It was around this time that alongside his feverish coding Zawinski
made another important contribution to the company. “We were sitting
around a conference table—this was probably July 1994—trying to come
up with product names,” he recalls, “and someone said, ‘we need to
crush NCSA like a bug,’ since back then, taking away NCSA Mosaic’s
market share was our primary goal. I thought of Godzilla. Netscape
Communications Corp was at the time still called Mosaic Communica-
tions Corp. Thus, ‘Mosaic Godzilla’ [became] ‘Mozilla.’”
Although the more corporate “Netscape” was chosen for the final
product name, “Mozilla” lived on within the company, notably as the
name of Netscape’s dinosaur-like mascot, and would finally achieve a
glorious resurrection in the open source browser project.
Zawinski’s practical experience of open source meant that he played an
important if unsung role in evangelizing this development process
within Netscape’s engineering department. The third key player in the ef-
forts to get Communicator released became interested through Za-
winski’s explanations in the private newsgroups that circulated around
the Netscape intranet of why opening up software development brought
considerable advantages.
Frank Hecker had worked for the computer manufacturers Prime and
Tandem before joining Netscape in February 1995. His Unix background
meant that not only was he familiar with the GNU project but he had even
carried out ports of parts of it himself. He became technical manager for
the government sales group at Netscape and worked out of an office lo-
cated in Bethesda, Maryland. In the autumn of 1997, at around the time
Netscape was seriously considering producing Javagator—the “Pure
Java” version of Communicator—Hecker was reading the internal news-
groups where engineers discussed various issues of the day.
“One of the things that was brought up in the newsgroups,” Hecker
remembers, “was, well, if you’re using Java as a development language,
then we’re going to have to find some way to obscure the byte-codes
because otherwise people could just disassemble the Java and recover
the original source code.” The byte-codes were the special kind of bina-
ries that Java produced. The point was that it was easy to go from byte-
codes to the original Java language source code, something that is
difficult to do with compiled versions of programs written in C, for
example.
As Hecker explains, “The implication being there that if we’re going to
release some stuff as Java-based systems . . . we’d figure out some way to
essentially obscure the source code or otherwise keep it confidential” be-
cause the prevailing wisdom was that commercial software houses never
allowed others to see the source code. “To which some other people, in-
cluding Jamie Zawinski, replied: Why are you bothering with this? Why
don’t we just release the source code for this stuff and that way we’ll get
the benefit of letting other people work with it and help us fix it, im-
prove it and so on,” Hecker continues.
“And that struck me as a sort of interesting argument,” he says. He
knew that people such as Zawinski had been advocating this idea for
some time. “Jamie and many other people inside Netscape had proposed
it at various times over the past few years, from 1994/95 on.” But for all
their practical experience in the area, the suggestion had never been
taken up. Hecker believes he knows why. “One of the problems I saw
there was [that] traditionally releasing source code had been proposed
by developers,” he notes. “It was obvious to them why it was important.
It wasn’t really clear from a senior management level why releasing
source code could be of use because nobody ever made the business
case.”
This was what Hecker set out to do in a comprehensive document,
which he called “Netscape Source Code as Netscape Product,” begun in
August 1997. This “somewhat obscure title,” Hecker says, “[reflected]
my opinion that Netscape should consider treating source code as a
product delivered to Netscape customers in the same sense that binary
versions were considered products. I felt that proprietary software ven-
dors treated source code as this magical mystic thing—“the crown jewels”
ing excitement. He knew only too well what the company was really do-
ing. To explain his position to others, he would tell a story:
“Two guys go camping, and they’re barefoot, and they run into a bear.
And one guy stops and puts on his sneakers. And the other guy looks at
him and goes: What are you doing? You can’t possibly outrun a bear. And
the first person says: I don’t have to outrun the bear, I just have to out-
run you.”
Hahn explains, “I think that’s the view that I had about the open
source strategy and Netscape. I’m not sure everybody got the message
that open source was some sort of wonderful magnificent new thinking,
but it became uncontroversial because it was way better than what we
were doing prior to that [on the browser side].” He notes, though, that
Netscape “never got there on the server,” where products remained res-
olutely closed source.
Hahn also says, “We may not have really gotten it, but, boy, we were
encouraged to keep going by the Internet and that was a very, very pow-
erful force.” The positive response from the Net community had been
one of Hahn’s subsidiary aims in proposing that Netscape take Commu-
nicator open source. “One of the motivations in the Heresy Document
for open sourcing it,” he explains, “was the observation that Netscape
historically was much more successful in the market when it was per-
ceived as the underdog against the Evil Empire.
“We had clearly lost our underdog status; we had been arrogant, ob-
noxious, and downright insensitive to the technical backbone of the In-
ternet that had really thought the browser was cool in the first place. So
this move was in part designed to reposition the company as humble and
needing people’s help, and wanting to be a team player so that we could
all fight the good fight against Microsoft. And that worked very well. The
day we announced, there was a deluge of positive reinforcement from the
market.”
The announcement was made 22 January 1998. It began: “Netscape
Communications Corporation today announced bold plans to make the
source code for the next generation of its highly popular Netscape Com-
municator client software available for free licensing on the Internet. The
company plans to post the source code beginning with the first Netscape
Communicator 5.0 developer release, expected by the end of the first quar-
ter of 1998. This aggressive move will enable Netscape to harness the cre-
ative power of thousands of programmers on the Internet by incorporating
their best enhancements into future versions of Netscape’s software.”
The announcement introduces an idea hitherto viewed with great
skepticism by computer businesses—that of the distributed, global de-
velopment process that lies at the heart of GNU/Linux and other major
open source projects—as if it were the most obvious and natural thing in
the world. For a high-profile company such as Netscape to give its stamp
of approval in this way was extraordinary.
It was entirely appropriate that Netscape, of all companies, took this
step. Netscape had already changed the rules once with its first product,
Netscape 0.9. By effectively offering a free download it already drew on
the online world to speed up the debugging process and spread the word
further in one of the first examples of viral marketing. Making Commu-
nicator 5.0 open was the next logical step: a product that was truly free
and would, or so it was hoped, be developed in partnership with the en-
tire Net community.
In taking the code open source, Netscape was at last giving Tim Bern-
ers-Lee the browser he had been asking for since 1991, and adding the
last piece in the Net’s free software jigsaw puzzle.
Netscape’s announcement did all this, at least in theory. But taking the
decision to make the source code of Communicator 5 freely available
was one thing, and difficult enough; creating a working program—never
mind one capable of taking on Microsoft’s increasingly polished Web
browser—was quite another.
One of the key areas to be addressed involved the license. The press re-
lease announcing the release of the source code had promised that “the
company will handle free source distribution with a license which allows
source code modification and redistribution and provides for free avail-
ability of source code versions, building on the heritage of the GNU Gen-
eral Public License (GPL), familiar to developers on the Net.”
Netscape had rightly understood that open source development only
worked if the licensing terms were couched in such a way as to encour-
age others to contribute. Its invocation of the GNU General Public Li-
cense, by far the best-known of the various licenses in this area, was a
further sign of its good intentions. Coming up with a license that met the
needs both of Netscape and of the hacker community whose aid it was
trying to enlist proved a challenge, however.
The person charged with this was Mitchell Baker, who was associate
general counsel at the time. “I became involved at the meeting of the
Netscape executive staff to make a decision: is this something we want to
do or not,” she recalls. This was in January 1998, just a few days before
the public announcement.
The implication was that Netscape was prepared to take this momen-
tous step without knowing what form one of the key pieces—the li-
cense under which the code would be released—would take. Baker confirms
this: “Netscape was self-selected for people who could live comfortably
in that setting.”
To help draw up the license, the company involved many open source
luminaries, including Linus, Richard Stallman, Eric Raymond, and Bruce
Perens. Raymond describes the meeting with Netscape at this time as a
“tremendously impressive experience” because it was clear that they re-
ally “got it.” He says the people he met were “intelligent” and “hip” to
the issues, and that, most impressive of all, there was no “primate pos-
turing” of the kind he had feared when entering such a corporate envi-
ronment.
The GNU GPL was not a practical option. Had Netscape simply re-
leased Mozilla under this license, any software distributed in combination
with even part of it would have had to be licensed under the GPL as well,
because of the way the GNU GPL worked. Neither Netscape nor the third
parties to whom it had licensed code were able to accept this.
In the end, two new licenses were drawn up, the Netscape Public Li-
cense (NPL) and the Mozilla Public License (MPL). Mozilla had now
been chosen as the official name of the new project, as a further press re-
lease dated 23 February 1998 explained: “Netscape Communications
Corporation today announced the creation of mozilla.org, a dedicated
team within Netscape with an associated Web site that will promote, fos-
ter and guide open dialog and development of Netscape’s client source
code.”
Jamie Zawinski recalls, “My first reaction, a few hours after I heard the
announcement [to make Communicator 5 open source], was to go regis-
ter the domain mozilla.org.” The creation of a separate identity for the
project was an important step in reassuring potential developers that this
was not mere window dressing, and that Mozilla really would be inde-
pendent. It was also highly appropriate that for what amounted to a re-
turn to Netscape’s roots, it should adopt the coders’ original name for its
browser.
Despite the complication they represented, two licenses were neces-
sary to resolve an intractable legal problem. Some of the code in Com-
municator was used elsewhere by Netscape, and some had also been
licensed to third parties. The Netscape Public License was used for
Netscape’s programming and was GPL-like in many respects, but it
gave the company special rights to allow the code to be supplied to
third parties without passing on its GPL-like properties. It also enabled
Netscape to include the initial Communicator source code in other
products that would not be subject to open source requirements for
two years.
Through newly created newsgroups that were thrown open to all for
discussion of licensing and other matters, another unprecedented step
for a software company to take, Netscape tried to convince potential con-
tributors why it needed special rights. “We had a discussion through the
newsgroups,” Baker recalls, “and explained OK, this is why in the NPL
stuff those rights are important and why we think we need them.
“And many people came to the point of saying, well, OK, I can sort of
understand that for the code that you’re releasing,” she recalls. “And if I
fix bugs, or make some minor contribution to that code, OK, maybe it’s
OK you get special rights to some piece of my work; I don’t like it, but I
can sort of understand that. But if I ever do anything significant that’s
new to me that I want to contribute to your project there is no way that I
will do that under the Netscape Public License.
“And so that made sense,” Baker continues, “so we went back and
said, OK, we feel we need the Netscape Public License for the stuff we’re
releasing, but we will create the Mozilla Public License, which is exactly
the same except that it has no special rights. And if you want to con-
tribute code to our project—we hope you will—here’s a license that
doesn’t give Netscape anything special. And that seemed for most people
to resolve the issue.”
Netscape came up with the MPL, which is close to the GPL, because
“ultimately we decided we should have a license that promoted commu-
nity,” Baker explains. As well as throwing light on the extent to which
Netscape understood the process it was attempting to harness, this em-
phasizes once more the centrality of a community to open source, and
how profoundly insightful was Stallman’s early insistence on this element.
Even with these licenses, Netscape was unable to keep all the code that
it had licensed from third parties. Unless the owner was happy for it to
be released as open source, this code had to be removed from Mozilla,
and it left some serious gaps. As the Netscape Frequently Asked Ques-
tions (FAQ) on the Mozilla source code explained: “This release does not
include the source code for the 5.0 version of Communicator’s Messen-
ger, Collabra, or Calendar components. It also does not include source
code from third parties that were not willing to have their source code
distributed under the terms of the Netscape Public License (for example,
Java). In addition, the source code does not include elements that are il-
legal to export under U.S. law (such as cryptography source).”
These were major elements that had been excised. Messenger provided
all the e-mail capabilities, and Collabra was used for reading news-
groups. As a result, the initial release of Mozilla code on 31 March 1998
had gaping holes where core functions had been ripped out.
This did not dampen spirits the following day at what was called the
mozilla dot party, whose motto was “free the lizard.” The party had been
thought up and organized by Zawinski, who acted on his own initiative
and picked up many of the costs; entrance was free. It took place at the
Sound Factory, one of San Francisco’s largest nightclubs. Among the DJs
that night was Brian Behlendorf, who was a major figure in the Rave mu-
sic scene as well as a key player in the Apache group. Another open
source icon, Eric Raymond, was there too, and even joined in with his
flute during the second set of the Kofy Brown Band, who were playing at
the event.
Like the Freeware Summit, which took place less than a week later, on
7 April 1998, the mozilla dot party was another highly symbolic coming
together of the diverse open source tribes. But where the Freeware Sum-
mit was a meeting of the clan chiefs, the mozilla dot party embraced the
rank and file, too. “Open Source, Open Party,” as Zawinski had put it in
his mozilla dot party dot faq. As did the Freeware Summit, the party gen-
erated useful press coverage for what was increasingly perceived as a
movement.
On 16 April 1998, Netscape issued a press release in which Tom
Paquin, Netscape fellow managing mozilla.org, is quoted as saying,
“Since Netscape made its Communicator 5.0 source code available
through mozilla.org two weeks ago, we have received an overwhelm-
ingly positive and welcoming response from developers around the
world.” It gave some statistics: “To date, there have been more than
100,000 downloads of the source code from mozilla.org alone, and an es-
timated 100,000 downloads from more than a hundred mirror sites lo-
cated around the world. In addition, twenty-four hours after the source
code was released on March 31, the first change was checked in to
mozilla.org by a developer.”
The press release also recorded the donation of a piece of software that
was of critical importance in handling eXtensible Markup Language
(XML), a kind of generalized version of the Web's HTML and fast emerging
as the most important Internet technology since the advent of the Web.
James Clark, the technical lead of the main XML working group, had
made his Expat program available to the Mozilla project—a considerable
boost to its advanced capabilities, and something of a publicity coup.
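The event-driven style that made Expat so useful can be sketched through Python's standard `xml.parsers.expat` module, which is a binding to Clark's parser (a minimal illustration; the document and element names here are invented):

```python
# A minimal sketch of event-driven XML parsing with Expat, via
# Python's standard-library binding to James Clark's parser.
import xml.parsers.expat

tags = []

def start_element(name, attrs):
    # Expat calls this handler each time an opening tag is seen.
    tags.append(name)

parser = xml.parsers.expat.ParserCreate()
parser.StartElementHandler = start_element
parser.Parse("<doc><title>Mozilla</title><body/></doc>", True)

print(tags)  # -> ['doc', 'title', 'body']
```

Rather than building the whole document in memory, Expat streams through the text and fires a callback per tag, which is what made it fast enough to sit inside a browser.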
Another was provided just hours after the source code was released. A
group in Australia and the U.K. called Cryptozilla had put back the cryp-
tography elements that Netscape had been forced to rip out because of
U.S. export laws. It seemed a vindication of everything the open source
community had been saying about how the serried ranks of coders
around the world would just step forward and fill in the holes. Hecker’s
assessment is more measured. Cryptozilla’s feat “was not really a sur-
prise,” he says. “It was obvious that even with the code removed it would
not be all that difficult to put the code back.” Moreover, “once they’d
done that, they didn’t really keep up with the code base, and so that code
is no longer useful.”
On 15 May 1998, Brendan Eich, who is responsible for the architec-
tural and technical direction of Mozilla, outlined what he called the
Mozilla Stabilization Schedule—a road map for Mozilla. He identified 1
September 1998 as the date when “features stop going in and aggressive
stabilization begins” so that the code would be in a fit state for release as
a product. On 26 October 1998, however, Eich posted a new develop-
ment road map that announced a radical shift. Instead of taking a key
browser component called the layout engine—the code that processes
the HTML and displays it on the user’s screen—from the current
Netscape development, they would throw it away and start again with
what he called NGLayout (Next Generation Layout), later dubbed
“Gecko.” As Eich explained, he had arrived at this decision because of
his own views as “mozilla.org technical bigshot,” taking into account
“the judgment of the module owners”—Mozilla’s lieutenants—and, most
tellingly, “the fervent wishes of Web content authors,” as Eich put it.
Above all, the Web content authors wanted adherence to the standards
that the World Wide Web Consortium (W3C)—the independent body
led by Tim Berners-Lee—had come up with over the last few years. They
included such things as Cascading Style Sheets, which allowed complex
Web page designs to be constructed in a clean and conceptually simple
way; and the Document Object Model (DOM), which was essentially a
method to allow such items on a Web page as headings, tables, etc. to be
accessed and manipulated as if they were separate elements.
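The idea behind the DOM, that each heading or table becomes an object that can be located and changed individually, can be sketched with Python's built-in `xml.dom.minidom`, which implements the same W3C model for XML documents (the markup here is invented for illustration):

```python
# A small sketch of the W3C Document Object Model using Python's
# built-in minidom: document elements become objects that can be
# found and manipulated one by one, as a browser script would.
from xml.dom.minidom import parseString

doc = parseString("<page><h1>News</h1><h1>Links</h1></page>")

# Every <h1> is reachable as a separate element node...
headings = doc.getElementsByTagName("h1")

# ...and can be changed in place without touching the rest of the page.
headings[0].firstChild.data = "Headlines"

texts = [h.firstChild.data for h in headings]
print(texts)  # -> ['Headlines', 'Links']
```

A browser exposing this model consistently was exactly what the Web content authors were asking Mozilla to deliver.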
All these standards were well defined, and their benefits well accepted.
But Netscape and Microsoft offered only partial support for the W3C
standards, and to different degrees, an unfortunate consequence of the
browser wars. This meant that a Web page following standards would
look different, sometimes wildly so, according to whether it was viewed
with Microsoft’s or Netscape’s browser; this was a nightmare for Web de-
signers, and ran counter to everything Berners-Lee had striven for in his
stewardship of the Web.
Users of the Web had made pleas for compliance with standards be-
fore, but this time something was different. By opening up Mozilla devel-
opment, Netscape had also opened up the decisionmaking process.
“That’s the way it’s supposed to work,” as Hecker explains. “If you’re go-
12
A Foothold
“When IBM announced that they were doing it,” he continues, “it
made me realize that IBM, which is a big, historically slow-moving com-
pany, got it more than Netscape did.” It highlighted his own failure to
help refashion Netscape to the point of the company’s “getting it” as
much as IBM did, the Mozilla experiment notwithstanding.
IBM’s decision was probably the single most important boost the pro-
ponents of free software could have wished for. At a stroke, it elevated an
open source program to the level of commercial software and gave it a
key official foothold in the enterprise. It helped turn Netscape’s earlier
move from the singleton act of a maverick company into a prescient fore-
shadowing of trends to come—and therefore provided an implicit justifi-
cation for any software house that wanted to back GNU/Linux or other
open source software to do so.
One of the key players in this story was James Barry. Barry was not a
long-standing employee of IBM; before joining, he had sold his previous
company, the startup ResNova, to Microsoft. The ResNova team had
written most of Microsoft’s main Web server product for Windows NT,
Internet Information Server (IIS), and the Personal Web Server for Win-
dows 95. Coming from outside IBM in this way may well have given him
freedom to propose bold new ideas.
“I was being brought in to take a look at IBM’s lineup in Web server
space,” Barry explains. The problem there was that “IBM had almost over
forty or fifty products at the time,” he says. “You’ve got a commerce
suite, you’ve got payment suites, you’ve got store catalogues, and you’ve
got things that interpret [Web server] logs and tell you where your visi-
tors came from, things that hook up to mainframes. And these were all
separate software pieces that are on the server side.
“So I came back with a report, and basically said you’ve got some pro-
jects out there that make no sense. You’ve got a lot of them that do al-
most the same thing, but they’ve got a different brand.” Barry’s first
report, which came out in December 1997, contained the proposal that
everything be rebranded as part of what he called the WebSphere prod-
uct line.
During his analysis, he had realized something else. “What’s the first
thing people do when they come into the Web site? You serve up a Web
page,” he notes. For that, you need a Web server, and IBM had one.
“When I first started working, Internet Connection Server, ICS, was
what we called it,” Barry recalls. “And then it was branded to Domino
Go.”
IBM’s go-it-alone approach to Web servers had a big problem: “We had
like .2 of one percent” market share, Barry says. “Basically 90 percent of
ers. “I believe I was the one who figured out how, with a lot of help from
many friends, to move this IBM bureaucracy. IBM is like a big elephant,”
he says. “Very, very difficult to move an inch, but if you point the ele-
phant toward the right direction and get it moving, it’s also very difficult
to stop it. So we kind of joke [in IBM that] figuring what’s the right thing
to do is less than 10 percent of the effort; 90 percent of the effort is to fig-
ure out how to leverage the bureaucracy.”
Shan explains his approach. “First . . . I went around and asked people
what [they did] and how they failed” in previous attempts to get Apache
adopted. Apparently there had been others trying even before Barry’s
own unsuccessful pitches. “And it turns out that it has to do with timing;
it also has to do with the parties that you have to get agreement on. You
need to have marketing, legal, and development all in agreement; with-
out any one of them you’re not going to go anywhere,” because of IBM’s
rigorous consensus-driven approach.
“So in February that year, I started looking into the situation and I de-
termined that, yes, if we do it right there is a chance for us to push it
through,” he says. Confidentiality was paramount. “Just imagine that . . .
you have a team of sixty, seventy developers working in a lab on a prod-
uct,” Shan says, “and you’re investigating ways to ditch the product and
embrace open source. If the news get out, you’re going to have a lot of
resignations. You can imagine how difficult it would be to convince the
developers that it is the right thing to do.” Not only that, “You want to
have developers buy in; you want them to embrace open source and see
that as a positive thing and [you want them] . . . to contribute to the
open source movement,” Shan explains—otherwise the entire exercise
will be futile.
Shan remembers thinking that “if you have to convince developers,
you have to convince their leaders. So I got two of them, and asked them
to join me to analyze Apache.
“I asked them to compare and contrast,” Shan recalls. “And when they
really dug into it they were surprised by the elegance of the architecture.
Of course, it took them a little time to get in there and start having an
open mind and look at things from a bigger picture perspective. But they
did. So with that, I was able to convince myself that technology-wise this
was a go, and at the same time we were pushing marketing and legal.”
Marketing and legal was being handled largely by Barry; he and Shan
had worked together before. “I met him in various meetings,” Shan says,
“and I’ve always thought that he’s the kind of guy who knows a lot, and
wouldn’t mind speaking his mind, and stand firm on what he believes.
And he also had a lot of respect for my developers.”
He and Barry would spend a lot of time together on the Apache project
as they traveled around the various IBM divisions trying to win accep-
tance for the idea. The point was, as Barry explained tirelessly to every-
one they pitched to at IBM, “We can’t go in half-heartedly. We can’t go in
and say, yep, we’re going to adopt Apache but we’re going to keep our
Web server. So what we said is, it’s going to be on all the systems.
“IBM has probably more VPs than many other companies [have] em-
ployees, so it was a sales job,” Barry notes. He was well suited to the task.
“I’ve been at executive levels for a while at different companies, so I
know how to talk to the VP level,” he says. It went much higher than VP
level—to Steve Mills, for example, who was a general manager in the
software group. Mills reported to John M. Thompson, who was a senior
vice president. It may even have gone higher; Barry believes that Thomp-
son took it to Lou Gerstner, the chairman and CEO of IBM. But the key
person was Mills: “Once we had Steve Mills and his people aligned with
the fact that we needed to do the Apache move,” Barry says, “basically
that gave me a heavyweight hammer.” At this point, mid-March 1998,
Barry and Shan were authorized to make contact with the Apache group.
They got in touch with Brian Behlendorf. He had left his job at
HotWired, where his official title had been “Unix sherpa,” and was work-
ing full-time at Organic, a Web-design company he had helped found.
Previously, he had worked there in his free time for no salary, but he had
taken up the role of chief technical officer in January 1995, when Or-
ganic was able to pay him.
Before Behlendorf could meet up with Barry and Shan, a preliminary
had to be dealt with. “We gave Brian a call over at Organic,” Barry recalls,
“and said, ‘We need you to sign an NDA [Non-Disclosure Agreement].
We’d like to talk to you.’ He had no idea what we [were] doing. So he
signed the NDA, and we flew out to meet him, Shan and I.” Shan recalls,
“The meeting was March 20, at 6 o’clock in the evening, in a little Italian
restaurant right across the street” from Organic, which was based in San
Francisco.
Shan and Barry dropped the bombshell at the restaurant. Barry recalls
saying to Behlendorf, “‘We’d like to adopt Apache as our official Web
server.’ And he kind of looked at us, like, huh? And he goes, ‘Well what
does that mean?’ [I said:] ‘Well, we want to participate in the process,
we’re going to drop our proprietary Web server and adopt it for all plat-
forms. And we really want to see how we can work with you.’”
Behlendorf recalls, “My first thoughts were amazement and, to a cer-
tain degree, yeah they finally get it, right. But I wouldn’t say it was a sur-
prise. We’d always thought that companies would start very quietly just
As with Mozilla, after the elation came the mundane issue of imple-
mentation. “In IBM, the executives got it,” Barry explains, “and the pro-
grammers got it. But it’s this huge mid-level of management that didn’t
get it. They’re being judged by profit and loss. They’re being judged by a
set of metrics that did not include we’re going to give away products.”
As a result, he says, “Apache took off and then kind of died within
IBM.” The problem was that “we couldn’t market it separately,” Barry ex-
plains. “We had to bundle it with stuff that we charged for. From a mar-
keting standpoint, we could have taken a huge advantage, but we didn’t.
Everything got vetoed because we had to make money on it. We had to
bundle it with WebSphere. Everything had to mention WebSphere first,
Apache as a ‘by the way, we run on Apache.’ And as a result, it didn’t take
off as fast as I had actually hoped.”
But there were, nonetheless, concrete achievements even in this disap-
pointing first phase. As Shan explains, “We realized that for Apache to be
seen as a serious Web server, there’s got to be serious support behind it.
So we have to get people mobilized to build an IBM 7x24 support team
to make it real.”
The issue was absolutely crucial—as its mention in the IBM press re-
lease in June 1998 attests. Shan’s creation of a “7x24 support team”—one
that was on call every hour of every day—removed concerns about
Apache (at least for IBM customers) and provided an exemplar for other
companies.
IBM’s decision to replace its own Web server with the open source
Apache was a watershed, but by no means the end of the story. Although
the company would release all its contributions to the Apache project as
open source (as it had to if it wanted to be part of the process), it was not
taking the bold step of converting its own proprietary products into open
source as Netscape had done. But that move followed soon after the
Apache move, and partly as a result of it.
The person who made this happen was David Shields. He had joined
IBM’s research division in 1987, after working at New York University for
twenty years. “For PhDs in research” at IBM, he explains, “there are basi-
cally three job titles: research staff member, IBM Fellow, and Nobel Prize
winner.”
In early 1996, Shields started collaborating with Philippe Charles on a
major project called Jikes. This was a Java compiler that took programs
written in the Java programming language and produced special code
that could run on any system that supported Java—the basis of Java’s
much-vaunted platform independence. As a research project, Jikes was
made freely available on IBM’s alphaWorks Web site, but only as binaries
the headline-making moves of Netscape and IBM, 1998 also saw nearly
all the major back-end applications ported to GNU/Linux.
The earliest outing of a first-tier corporate software package running
under GNU/Linux predated IBM’s Apache announcement by some
months. Computer Associates (CA), one of the world’s largest software
companies, demonstrated a beta version of its Ingres II database at CA
World in April 1998.
It was highly appropriate that Ingres II led the charge to GNU/Linux.
Ingres II was based on the original Ingres project undertaken at Berke-
ley—the one that Sendmail creator Eric Allman had been a part of. And
as early as 1992, there was a port of that first Ingres. On 24 November
1992, Linux News reported that “Zeyd M. Ben-Halim announced a new
version of his port to GNU/Linux of Ingres, the relational database man-
ager”; this was barely a year after Linux had first been released. The orig-
inal Ingres project has since turned into the open source PostgreSQL,
which is still being developed.
Computer Associates did not release their public beta until October
1998, and the final supported commercial version did not see the light of
day until February 2000, by which time there had been over 3,500
downloads of the beta version, and another 4,000 distributed on CDs.
In the meantime, the other top database vendors announced their sup-
port for GNU/Linux in an embarrassing about-turn. A news item in In-
foworld, dated 6 July 1998, and headed “Linux Not Yet Critical Mass,
Database Vendors Say,” quoted a representative of Oracle, the leading
database company, as follows: “Right now, we’re not seeing a big demand
from our customers that we support it.” The story went on to add that
IBM, Sybase, and Informix also had “no intentions of releasing versions
of their databases on Linux.”
And yet two weeks later, on 21 July 1998, Oracle announced that it
would be porting its Oracle8 database to GNU/Linux. A day after Ora-
cle’s flip, Informix went even further and released a GNU/Linux version
of its Informix-SE database, along with free development licenses. As Or-
acle had done, Informix proclaimed this was “in response to user de-
mand.” The actions of IBM and Sybase were more consistent with their
July pronouncements. IBM released a beta of its DB2 database in Decem-
ber 1998, and Sybase remained the most skeptical about GNU/Linux,
waiting until February 1999 before announcing the port of its SQL Any-
where database.
Without doubt, Oracle’s announcement represented the key move, and
the one that began to put GNU/Linux on the map as far as serious busi-
ness use was concerned. Oracle’s policy reversal also came as a relief to
Miner then adds, “We even took a serious look for some time at [Ora-
cle’s] providing support for the Linux operating system itself, because
that appeared to be one of the factors slowing adoption in corporate en-
vironments. We felt that was perhaps a thing we could do to help accel-
erate both our business on Linux and that of Linux itself.” Oracle
eventually decided that the growing support from other companies
would meet this need.
Fortunately for Miner and the other advocates of the port, the reaction
to Oracle’s announcement was unprecedented. “I was amazed at the re-
sponse to the prerelease of the production CD we put out in October,” he
says. Although the release was low-key—the company “issued one press
release, threw up a page on the Web site,” he says—“there were several
thousand CDs ordered before the CDs were cut. And no one in market-
ing could recall ever having had that quick a take-up of a new Oracle
platform.” Miner says this “certainly gave us confidence that as a move-
ment and a potential marketplace there was a lot more possibility than
we had imagined prior to that.” It also “was a very good validation that
in fact the Linux movement was quite large,” he says, “and that it wasn’t
all just hype and PR, that there were real people out there wanting to do
real things with it.”
This was good news for Oracle, because its commitment to porting Or-
acle8 to GNU/Linux also meant that its other key enterprise products
would follow in due course. As Miner explains, the corporate approach
was “once you do something, you basically do it completely.” When Ora-
cle delivered the final version of its Oracle8 database for GNU/Linux on
7 October 1998, it also launched Oracle Application Server. The follow-
ing year, the company not only released its WebDB product, a “database-
powered Web publishing tool for end users,” for GNU/Linux but set up
an entire business unit dedicated to “develop, market and support Linux
software.” A press release explained that Oracle’s products had “received
a phenomenal response from the Linux market, establishing Oracle as an
enterprise software leader on Linux with more than 50,000 developers
and 800 customers.”
As with the moves from Netscape and IBM, Oracle’s high-profile com-
mitment to the platform, together with the porting of all of the other top
databases, lent yet more credibility to the GNU/Linux system for server
applications. This was bolstered even further when the German com-
pany SAP announced in March 1999 that it would be porting its enter-
prise resource planning (ERP) software R/3. The SAP press release
quoted Hasso Plattner, cochairman and CEO of SAP as saying, “We have
received a significant number of serious customer requests for R/3 on
Linux over the past year. After extensive testing in-house and discus-
sions with our partners and customers, we are confident that Linux
meets our standards.”
Although little known to the general public, SAP’s R/3 is the nearest
thing that exists to a global standard for enterprise software. On top of
the basic R/3 platform run software modules that handle generic corpo-
rate tasks such as payroll and human resources, as well as the specialized
requirements of industry sectors such as aerospace, automotive, bank-
ing, chemical, oil and gas, etc. According to the company, “SAP has more
than 10 million licensed users, more than 20,000 installations in more
than 100 countries and supports 28 different languages. More than half
of the world’s top 500 companies use SAP software.”
Microsoft may power the world’s front offices with its Office suite on
Windows, but behind this desktop veneer, SAP provides the core func-
tions that keep companies running. The announcement of the port of
SAP R/3 to GNU/Linux was in some ways the culmination of the growing
support for open source that had begun barely a year before with
Netscape’s dramatic move. The first shipment of R/3 to customers was
made on 24 August 1999.
SAP’s belief in open source was made abundantly clear through some
forthright statements in a FAQ document about SAP on Linux that it put
up on its Web site. As part of its answer to “What is Linux or Open
Source?” it states, “Open Source is a development method that has good
chances of revolutionizing the complete software industry.” As to “Why
does SAP do this?” the answer is unequivocal: “We expect Open Source
to be the software model of the future and Linux to be successful in low-
and high-end installations.” Lest people accuse the company of not prac-
ticing what it preaches, it poses and addresses the inevitable question, “If
it’s so good, why doesn’t SAP go Open Source? In fact, we are currently
thinking about publishing selected R/3 kernel components as Open
Source to start with.”
Alongside these remarkably positive statements in the FAQ, one is
more ominous, though apparently innocuous enough. In reply to the
question, “Where can I get Linux for R/3?” the document explains that
although the Linux kernel is standard, the distributions built around it
are not. As a result, “Breaking completely new ground with Linux, we
decided to concentrate on one distribution to start with, which is Red
Hat. We hope that in the future we will be able to support more than one
distribution.”
Although not the first occasion that Red Hat had been singled out for
preferential treatment in this way, it was certainly one of the most signif-
13
ONE STRIKING FEATURE ABOUT THE 1998 GNU/LINUX announcements
was that they all came from software companies; these on their own,
own, however, were not enough to make GNU/Linux fully acceptable to
corporations. The backing of hardware vendors was indispensable, not
so much because it meant that companies could buy machines with the
free operating system already installed, but because such systems would
be fully supported. The lack of formal support from recognized compa-
nies was perhaps the last serious barrier to widespread adoption of the
GNU/Linux platform.
The company to break through that barrier, and to kick off another ex-
hilarating year for the GNU/Linux community, this time with announce-
ments of backing from all the leading hardware vendors, was
Hewlett-Packard (HP). The press release, dated 27 January 1999, stated,
"HP will now provide customers with Internet solutions and services
based on Linux," and announced "an alliance with Red Hat to support
Official Red Hat Linux 5.2 on the Intel-based HP NetServer family."
The man behind HP’s GNU/Linux moves was Wayne Caccamo, who
was a strategic planning manager at the time. He explains how HP’s
moves came about. “It was summer of ’98,” he says, “and I got involved
not because I was assigned to look at Linux as much as I was generally
viewed as someone looking out on the horizon.” He describes
GNU/Linux at that time as “on the list of nuisance issues that kept pop-
ping up on our radar screen, and something that seemed like an oppor-
tunity.”
Such an opportunity was important to his company. “HP at that time
had missed some major technology revolutions like the Internet, and
Web on the first wave,” Caccamo recalls. “If we were going to recapture
the old HP image of an innovative, visionary company, we [would] have
to demonstrate that we are on top of some key trends and are proactively
rather than reactively responding to them.” This meant not just taking
chances on things such as open source, but doing it before anyone else.
Caccamo started by reviewing the situation at HP. “I did some investi-
gation around the company on the various activities that were happening
that were Linux-related or directly involving Linux. It was pretty clear to
me that we had actually a lot of stuff going on. So what I recommended
was that we form the Open Source Solutions Operation,” he says. Its job
would be “to surface a lot of the activity that’s going on internal to the
company, and knit it together into a cohesive story that we could take
outbound to our customer base.”
The Open Source Solutions Operation (OSSO) was formally launched
on 1 March 1999, by which time HP had already started to “surface”
some of its GNU/Linux activity. As well as the alliance with Red Hat, an-
nounced in January, HP had also ported its Web JetAdmin peripheral
management software. On 17 March, it issued a release that it had “opti-
mized its . . . Kayak PC Workstations for the Linux operating system.”
Much more significant than these was the announcement on 20 April
1999 that HP would “provide customers with around-the-clock, world-
wide support of Linux and HP Linux applications. HP’s new support ser-
vices include a maximum two-hour response-time commitment, and
immediate response for critical calls, on multivendor Intel-based plat-
forms.” These platforms were spelt out later as Red Hat, Caldera, Pacific
HiTech, and SuSE, perhaps the first official recognition of their status as
the top four GNU/Linux distributions, running on servers from HP and
“other vendors”—Compaq, Dell, and IBM.
Despite the enormous symbolism of this step—with one of the world’s
leading support organizations offering what amounted to a safety net for
the use of GNU/Linux within companies—Caccamo says, “I don’t re-
member there being a lot of consternation about it” among senior man-
agement. Nor, surprisingly, was Microsoft’s possible reaction to what
amounted to the recognition of GNU/Linux as an enterprise-ready operat-
ing system and implicitly as a worthy rival to Windows NT a real issue.
In part Microsoft’s muted response was attributable to the scrutiny the
company was under as a result of the U.S. Department of Justice’s an-
titrust lawsuit. “I think that maybe the whole situation Microsoft was in
[meant] they just weren’t in a position to make the phone call,” Caccamo
suggests. “They’ve done it in the past, saying, ‘HP, we’d appreciate it if
you would not do this product introduction’—incredibly ballsy—just
amazing that they even have the gall to do that. But they did.”
Caccamo believes HP’s move “was pretty important,” but part of a
broader “snowball effect.” Before the big announcements of 1998 and
early 1999, he says, “Linux was very grassroots within corporations,”
with support coming mainly from the engineers. “What the IBMs and
HPs and SAPs and everybody else did is connect that grassroots move-
ment with more of a top-down management interest” by getting the mes-
sage across at a corporate level. “And when those two ends started to
touch, sparks went off.”
Sparks started to fly in 1999 as nearly all the main hardware vendors
announced their backing. Almost all of them chose to partner with Red
Hat, at least initially, rather than with other distributions, or with
GNU/Linux in general.
For example, shortly after HP announced its first alliance with Red
Hat, Dell dipped its toe in the GNU/Linux waters. On 4 February, Red
Hat recorded that it had “designated selected server and workstation
configurations from Dell . . . as certified and compatible with Red Hat
Linux.” Hardware compatibility had always been something of a thorny
issue for GNU/Linux, principally because many peripheral vendors had
refused to publish details of their hardware to allow software drivers to
be written. The press release also contained the interesting information
from Dell that “we have been offering Red Hat Linux preinstalled to cus-
tomer specification for some time,” although it had not made this widely
known.
A couple of weeks later, IBM joined the hardware club. A press release
announced that “a development lab will be established to maximize per-
formance, reliability, and security for Red Hat Linux on IBM server and
client systems.” IBM followed this up shortly afterwards with the an-
nouncement of much broader support for GNU/Linux. A press release
dated 2 March trumpeted, “IBM launches biggest Linux line-up ever.”
Key points were that IBM would “support major versions of Linux glob-
ally” and that it would “work with four commercial distributors of
Linux”—the same quartet of Caldera, Pacific HiTech, Red Hat, and SuSE
that HP would back even more comprehensively a month later—“to pave
the way for comarketing, development, training, and support initiatives
that will help customers deploy Linux.”
IBM also announced some major ports to Linux. These included Web-
Sphere products (the family whose creation James Barry had recom-
mended at the time he was working on the Apache announcement),
Lotus Domino, the top messaging and collaboration server, and also a
port of GNU/Linux to run on some of IBM’s RS/6000 machines.
On the same day, Compaq got in on the act, too; it announced the
availability of some of its ProLiant servers preloaded with GNU/Linux.
Although it did not specify which distribution it was using, its prefer-
ence was made clear a week later when Compaq, IBM, Novell, and Ora-
cle invested in Red Hat, adding considerably to the boost given by the
first round of investment by Intel and Netscape in September 1998. Sep-
arately, on 30 March 1999, SAP also made an equity investment in Red
Hat.
HP, IBM, Dell, and Compaq formed the principal group of hardware
vendors that made high-profile statements of support for GNU/Linux in
1999, and created strategic partnerships with Red Hat. Two other compa-
nies joined the party later; these were SGI, which announced an agree-
ment to provide Red Hat on its Intel-based products on 2 August, and
Gateway, which entered Red Hat’s authorized reseller program on 7 Sep-
tember.
The reason the top hardware companies—Intel, IBM, HP, Dell, Com-
paq, SGI, and Gateway—had chosen to work with and often invest in
Red Hat rather than in Caldera, say, can be found, perhaps, in Allen
Miner’s experience when he coordinated GNU/Linux at Oracle. “Red
Hat was without a doubt the most active at building the Linux indus-
try,” he recalls, “in terms of marketing, in terms of [their] proactively
approaching potential partners and working to see that they’re involved
in all aspects of rolling Linux and other open source out into the mar-
ketplace.”
Red Hat’s CEO, Bob Young, a Canadian, although a permanent resident
in the United States, was, more than anyone else, responsible for formu-
lating this strategy. His proactive approach to business partners grew out
of an equally direct relationship with the customer that was evident even
before he teamed up with the founder of Red Hat, Marc Ewing, at the
end of 1994.
In 1990, Young was working for a computer rental company. “Our goal
at that time was to get into the Unix workstation rental and leasing busi-
ness,” he explains. But then, as now, Young had some formidable rivals.
“I was up against some billion-dollar competitors,” he says, “so I would
cozy up to the technical users, the Unix user groups in Boston and New
York City and Washington, [D.C.], with the idea that the sys[tem] ad-
mins and the programmers would be aware of when they needed equip-
ment even before their purchasing department was. And this way I’d get
the inside scoop on their equipment requirements.”
Getting close to the users had an unforeseen side effect. “I spent a lot
of time with these user groups,” Young explains. “I was helping them by
publishing a little East Coast Unix newsletter to try and attract more
members, because of course it was in my interest to have these user
groups be as successful as they could be. So I would ask them what arti-
cles should I write in this New York Unix newsletter that they didn’t get
in the big national magazines because I needed a niche. And the story
that these guys kept talking about was what they termed—this was back
in 1992—free software.”
Despite his later enthusiasm, Young was initially skeptical about this
new world of free software. “I was somewhat astounded by it because I
was a capitalist. You would ask these guys, OK, if all this free software is
really better than the proprietary stuff, where does it all come from? And
they’d give you answers like it’s from engineers according to their ability,
to engineers according to their need. And I’m going, yeah, right. I was a
skeptic, saying Linux is going to make Unix fragmentation look like
choirboy stuff.” And yet, he says, “every time I looked around, there
were more people using these Linux-based OSes, who were more enthu-
siastic about it.”
By this time, Young had left the rental business. Building on his experi-
ence in the Unix market, he entered the mail-order world in March 1993.
“I was running a small Unix software distribution business called the
ACC Bookstore to pay my bills,” he explains. “ACC stood for whatever it
did for a living; for example, A Connecticut Computer Company when
we were in Connecticut. But it was really designed to get us to the top of
any alphabetical listing of suppliers.
“My catalogue was called the ACC PC Unix and Linux Catalog,” he
says. “The theory was that I would use some of this enthusiasm for this
Linux and free software stuff to build a mailing list of customers [whom]
I could sell, quote unquote, real technology to. The problem was, my
sales of real technology, SCO Unix and this sort of thing, never really
took off. But meanwhile my sales of Linux-based products—and they
were mostly pretty small sort of low-end primitive technologies, like the
Slackware CDs from Walnut Creek, and Linux Professional from a little
outfit called Morse Communications—the sales from those products just
kept growing exponentially.
“So I started doing some research into it, saying I must have this
wrong. There’s something going on here that even the people I’ve been
asking the advice of haven’t been doing a good job explaining to me.”
Characteristically, Young’s research consisted of going back to his cus-
tomers. “At this point, I was trying to figure out who were the primary
users of Linux,” he explains.
His travels around the Linux user groups left him more perplexed than
ever. The more users he met, the more diverse his target market seemed
to be. “I’m going, OK, let’s see, my target market is rocket scientists at
NASA or it’s blue-haired art students in Toronto,” he says, mentioning
two of the more memorable users he had encountered. “I’m having prob-
lems with this, until later that night I was stewing on this conundrum in
my room, and finally the penny dropped.”
Young had finally realized “that the one unique benefit that was creat-
ing this enthusiasm was not that this stuff was better or faster or cheaper,
although many will argue that it is all three of those things. The one
unique benefit that the customer gets for the first time is control over the
technology he’s being asked to invest in.” Put another way, open source
software provides freedom—just as Richard Stallman had planned from
the start. Young’s starting-point and overall philosophy may have been
poles apart from Stallman’s, but the end result has been an almost equally
fanatical devotion to freedom for the user.
Young also realized that the real business model for GNU/Linux and
open source was selling services, not the product. He knew that he
needed something around which to base these services. “I was looking
for products to brand,” he says. Young found one thanks, once more, to
his habit of keeping close to the customer. “I would grill my repeat cus-
tomers on a regular basis: What other products should I add to my cata-
logue, what other things do you know about out there that I should be
paying attention to?” he says. “And a couple of these guys said, ‘Yeah,
you should take a look at what this Red Hat guy is doing.’”
Young wanted to move beyond just selling Red Hat as one of many dis-
tributions. “I was looking for a product that I could add to my catalogue
on some sort of exclusive basis,” he explains. Fortunately for Young,
Marc Ewing “desperately needed some help selling this thing, because he
was starving to death in his bedroom of his small apartment near Duke
University in Durham,” North Carolina.
“So I call up Marc, and say, ‘If my customers are right, yours really is a
better Linux than Slackware. I’m selling a thousand copies of Linux a
month at this point, mostly Slackware, but a certain amount of Yggdrasil.
Because here for two years in a row—and the second year we won it out-
right up against Microsoft—the industry was recognizing that we were
building better technology, but everyone was choosing Microsoft nonethe-
less.
The problem was that at that point there were thirty-five of us, down in
the tobacco fields of North Carolina. And push come to shove, if you’re go-
ing to make a decision for which infrastructure you are going to use to
build your Oracle application stacks for your corporation, you’re not going
to do it on thirty-five guys in North Carolina when you have the choice of
the world’s most profitable, most successful technology company offering
you what they claim to be a much better alternative.
It didn’t matter how big we got, we could become twice as big, we could
become ten times as big, and we would not overcome that problem. So
what we realized is we had to partner ourselves; we couldn’t become big
enough, but we sure could partner ourselves with the industry who was be-
ginning to take notice [thanks to the Netscape Mozilla project, IBM’s
Apache announcement, and the rest]. Customers no longer thought of this
as Red Hat Linux from your little Red Hat. They would think of it as Red
Hat Linux from Dell, running Oracle databases, supported by IBM’s global
support team. And suddenly the customer goes, ‘Oh, OK, that makes sense
to me.’
And so that’s very directly what led us into the conversations with Intel
and Netscape. On my whiteboard at work for the whole of 1998 was a list
of the top ten technology companies on the planet in order of size and in-
fluence. In that initial round for September ’98, we targeted Intel on the
hardware side, because they were the guys who had the influence; all the
hardware was being built against Intel chips. On the software side, it was
much more difficult because it was sort of earlier stages for the software
companies. We ended up picking Netscape simply because they had, at that
time, the most influence. And they had also opened the source code to the
Netscape [browser] technology and clearly were making a commitment in
our direction.
The Intel connection brought Red Hat more than money, or even corpo-
rate credibility, Young recalls.
Intel, back in March of ’98, were the guys who really stressed a particular
concept. Their line was, the success of technology platforms has relatively
little to do with the guys selling the operating system, and a great deal to do
with the success in building out an ecosystem around that operating sys-
tem.
In other words, Microsoft might be the most profitable supplier in the
Windows market place, but they earn a small share of the total revenue of
that industry when you start adding up all the support organizations that
exist, all the application vendors from Oracle to Corel, Computer Associ-
ates, Symantec, all these guys. And then you add all the hardware guys in
there in that sort of ecosystem, you recognize the reason you can get almost
anything done with a Windows-based computer is because there’s some
vendor out there who can help you do it.
And what Intel reinforced, but we recognized—in part because of win-
ning this Infoworld product of the year award for the second time—was that
we had to build out this ecosystem. And that’s where we spend a lot of time
these days, with the largest technology vendors [as Miner had noted appre-
ciatively] because all the smaller technology vendors tend to follow the
lead of the Oracles, and the Computer Associates, and the IBMs.
Young still spends most of his time with vendors and companies, evange-
lizing the benefits of this ecosystem, of open source. In many ways, it’s
the perfect job for someone who talks endlessly and entertainingly about
a subject dear to his heart. Promotion of his company is surprisingly re-
strained, although Young is always happy to don his trademark red hat
for photo shoots: “I have a business card that reads Red Hat
Spokesmodel,” he says in gentle self-mockery. But his apparently selfless
devotion to the broader cause of open source has some hard-headed
business logic behind it.
“The moment we get across this point that Red Hat’s brand stands for
this commitment of delivering open source technology to the market-
place, then we create massive opportunities for Red Hat’s sales team,” he
says. “If I can’t do that, it doesn’t matter how much our sales team brag
about their products being better, faster, and cheaper, the customer is go-
ing to say, ‘I can buy it from little Red Hat or I can buy it from big, safe
Sun, or Microsoft’ and we’re going to end up losing the bid.”
“Little Red Hat” is not so little now, at least based on market capitalization,
thanks in no small part to the success of Young’s approach. On 11
August 1999, the day of its IPO, Red Hat’s shares shot from $14 to $52 at
the close of business; those people “down in the tobacco fields of North
Carolina” now had a market value of $3.5 billion.
This was a notable success, and not only for Red Hat. Just as the in-
vestment of Intel and Netscape had provided the first, high-profile vali-
dation of Red Hat and therefore of GNU/Linux, so the wild ride of the
Red Hat stock was also implicitly a vote of confidence by investors in
GNU/Linux and open source. The victories and achievements of Red Hat
have been gains for the entire commercial GNU/Linux world.
This halo effect, where the success of one open source company bene-
fits the others too, is nothing new; it is what made Caldera’s pioneering efforts to
move GNU/Linux into the corporate environment in 1994 so important.
Caldera can claim much of the credit for paving the way for the string of
successes during 1998 through its careful nurturing of software retailers
and vendors, and its sponsoring of ports to GNU/Linux. Ironically,
Caldera’s trailblazing meant that it gained least from Netscape’s an-
nouncement in 1998 that it was taking its browser open source, and its
later moves to embrace GNU/Linux as a server platform.
color maps, we could actually point and click and navigate the network.
You could go from department to department, within the corporation, or
an intranet, by clicking on a map. And you could grab the information
you needed without having to understand the complexities of the net-
work itself.”
Sparks and Ransom Love had come up with a tool for browsing
through information whose navigation model—a virtual world—has not
been matched by today’s products, never mind those of 1994. “It was
funny because we had the Mosaic folks come out,” Love says, “and they
couldn’t get their jaws off the table.”
Novell’s management had no inkling of what was running down in
their labs. “They would have owned the Internet space,” Love says sim-
ply. But Novell shut down the project; because of its stubborn refusal to
fully embrace TCP/IP’s open networking standards, the company became
increasingly marginalized by the Net instead of owning it. In this sense,
Caldera (the word means a volcanic crater, particularly one whose
breadth is far greater than the current vents of lava) is a sadly apt name
for a company that represents a kind of noble relict of the even more
powerful ideas its founders had dabbled in at the beginning.
Novell was also the impetus behind the creation of an outfit that
would eventually become the third major distribution in the GNU/Linux
Gang of Four, Pacific HiTech.
Pacific HiTech’s founder, Cliff Miller, had led a pretty adventurous life
up to that point. Born in San Francisco, he had lived in Australia for a
year as a child, and then went to Japan for two years, where he stayed
with a Japanese family and attended a public school. After he moved
back to the United States, Miller attended college, and spent a year in
Macedonia, then a part of Yugoslavia, to further his studies of the Mace-
donian language. “I finished my BA when I was nineteen,” he explains,
“and then a year later got my MA in linguistics as well,” and adds with
what amounts to something of an understatement, “I tend to be pretty
intense, and just get through things as fast as I can.”
Miller’s life then took a different turn. “I went to work in a salt factory,
and then in the Wyoming oil fields for a little over half a year, driving a
big truck.” But Miller succumbed to the lure of Asia, one of the fixed
points of his world, and returned to Japan to teach English there. He
then moved to China, where he taught English at a science university
and met and married his wife, Iris, a native of Beijing who speaks fluent
English and Japanese.
When the couple returned to the United States, Miller says, he
“worked at Xerox for a while, in their multilingual software group. At
the time I didn’t know much at all about computers and programming,
but was really interested. I applied to a number of schools and got ac-
cepted into graduate school in computer science, with scholarships at
the University of Utah.”
In 1992, when he had just started a PhD at Utah, Novell made Miller’s
wife redundant after it acquired the company where she was working.
Rather than look around for another job, Miller and Iris started a new
company, Pacific HiTech, in their basement. “Our initial aim was to sell
U.S. or Western software into the Japanese market,” Miller explains—a
business they were well qualified to take on given their shared linguistic
and computing skills. “And very quickly we became a CD-ROM pub-
lisher, taking free software and shareware from the Internet, publishing it
on CD-ROM.” Miller notes, “We were one of the first companies to be
profitable by basing our business mainly on the Internet.”
During their investigation of free software that was available on the In-
ternet, they came across GNU/Linux. “It was another thing that we could
take and turn into a product,” Miller recalls, and adds: “but it was much
more than that as well. We actually used Linux very early on, as a devel-
opment tool, internally.”
This development work paid off a few years later. After selling free
software and other people’s GNU/Linux distributions, Miller and his wife
decided to come out with their own—but with a difference. “We figured
well, gee, we’re already testing and doing some development on Linux,
and Japan was kind of a wide open market for Linux because nobody
was doing a real Japanese commercial version of Linux, so we decided to
do it there.” This was in late 1997.
As with Caldera’s first product, Pacific HiTech’s distribution included
some proprietary software, and largely for the same reasons. A GNU/
Linux distribution for Japan needed, above all, to support the Japanese lan-
guage fully; GNU/Linux was capable of doing this, but required the addi-
tion of Japanese fonts, for example, as well as other specialized elements.
Pacific HiTech’s product offered these in the form of commercial soft-
ware. “There were things out there in the free software world,” Miller ex-
plains, “but they weren’t of the same quality, so we have a few different
versions” of the distribution, including “a completely free version that
doesn’t have the commercial stuff. The basic distribution is all GPL.”
Pacific HiTech’s distribution has done well in Japan, and also in China
following the launch of a localized version there in the spring of 1999.
Miller says that “nobody really knows” exactly how big their market
share is in these markets because distributions can be downloaded and
copied freely, “but we figure it’s over 50 percent in Asia.” Given this
success, it was natural that Miller should think about entering the U.S. mar-
ket. “We’ve always sold just a trickle of products. We really beefed up
our selling efforts last quarter of ’99.” By this time, the company had
changed its name to TurboLinux, reflecting in part that it was no longer
concentrating on the Asian market. It was an appropriate choice for a
company run by the quietly spoken but intellectually turbocharged
Miller.
“In the fourth quarter of [1999] we started selling to retail stores in the
United States, and that was mainly our workstation product,” Miller
says. “Through some pretty aggressive promotions we were able to get a
pretty good start in the retail market. But the real emphasis of our strat-
egy is to serve the enterprise market with the server product that we are
developing. We concentrate our development on our server product line
and on clustering.”
Clustering involves taking several machines and running them in a co-
ordinated fashion. “We have what’s called a daemon,” Miller explains,
software “that runs in the background and provides for the high avail-
ability. So if one [computer] node in your cluster goes down, then the
daemon that’s running on all of the nodes will notice that and route
around that bad computer.” The ability to cope with the failure of indi-
vidual machines without disturbing the overall running of the system is
the main attraction of such high-availability clusters, as they are called,
especially in hot areas such as e-commerce.
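The book gives no implementation details of TurboLinux’s daemon, but the failure-detection idea Miller describes (every node tracking heartbeats from its peers and routing around any node that falls silent) can be sketched roughly as follows. This is a minimal illustration of the concept, not TurboLinux’s actual code; the class and node names are invented:

```python
import time

class HeartbeatMonitor:
    """Toy failure detector for a high-availability cluster: records the
    last heartbeat seen from each node and reports which nodes are still
    considered alive, so requests can be routed around dead ones."""

    def __init__(self, timeout_s):
        self.timeout_s = timeout_s
        self.last_seen = {}  # node name -> timestamp of its last heartbeat

    def heartbeat(self, node, now=None):
        # Record a heartbeat from `node`; `now` can be injected for testing.
        self.last_seen[node] = time.monotonic() if now is None else now

    def alive_nodes(self, now=None):
        # A node counts as alive if it was heard from within the timeout
        # window; anything else is presumed failed and routed around.
        t = time.monotonic() if now is None else now
        return sorted(n for n, seen in self.last_seen.items()
                      if t - seen <= self.timeout_s)
```

In a real cluster each node would run one of these monitors and broadcast its own heartbeats over the network; here the timestamps are passed in explicitly so the failover behavior is easy to see: a node that stops heartbeating simply drops out of `alive_nodes()`.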
TurboLinux adopts an unusual licensing approach for this daemon.
“It’s closed [source] for a period of six months, and then we open it up
after that,” Miller explains. “So it’s kind of a compromise in order to pro-
tect our investment on the development.” This kind of innovative ap-
proach—one that had been pioneered by a program called
Ghostscript—is possible because technology is moving so fast that even
this six-month window provides purchasers with enough incentive to
buy software that will later become free. Miller explains, “For a large
company to get clustering software at one or two thousand dollars is re-
ally quite a bargain.”
Miller seems to have no difficulty convincing investors that his ap-
proach is viable. At the end of 1999, Intel invested in TurboLinux, along
with a couple of venture capital companies. This round of financing was
followed by another, in January 2000, from what Miller calls “a whole
slew of companies; Dell was the lead investor, and then there were Com-
paq, Seagate, Novell, NEC, Fujitsu, NTT, and Legend in China—there
are about twenty-five or so corporate investors.” “The amount of money
was pretty substantial,” Miller adds, “we got $57 million. But even more
important than that was the relationships that we were able to enhance
by doing that.” Nonetheless, the money will be useful in financing
Miller’s ambitious expansion plans in Europe. Red Hat and Caldera have
been present there for some time, but the dominant distribution in this
region comes from the German-based SuSE (pronounced roughly like
the American composer “Sousa”).
SuSE’s original German name was Software- und System-Entwicklung,
or Software and System Development. Like Pacific HiTech, the company
was created while its founders were still at university. One of them was
Roland Dyroff, who today is SuSE’s CEO. “The plan for setting up the
company was Unix-focused consulting and software development,” he
says. “But before this plan could really be executed, Linux came in and
the focus went to Linux.”
The company was formed in 1992. Its first offering to customers was in
March 1993, and consisted of Peter MacDonald’s pioneering SLS distribu-
tion. Later, SuSE used Slackware as the basis of a localized version. Where
Japan was attractive to Pacific HiTech because of its lack of GNU/Linux
activity, Germany was a good place to sell GNU/Linux distributions for
exactly the opposite reason. Then, as now, Germany had some of the most
advanced users in terms of awareness and deployment of GNU/Linux. In
1996, SuSE decided to come out with its own distribution. “The reason
being the time lag,” Dyroff explains, “Slackware released and then we had
to apply our patches” to make it suitable for German users. Since then,
SuSE has gone on to add many original elements.
When SuSE had established itself as the leading European distribution,
it decided in 1997 to open an office in Oakland, California. Dyroff him-
self describes this as a “ridiculous decision.” As he explains, “the reason
is, the company was very small at this time. We had a ’96 revenue of like
3 million Euros [about $3 million], and we had not much financial liq-
uidity to invest in the United States. And we didn’t have much experi-
ence. We just did it.” When it entered the U.S. market, SuSE used retail
as a strategy to promote its brand, as had TurboLinux. “We focused on
selling the SuSE box product,” Dyroff says. But now it is branching out
into professional services, an extension of something SuSE began in Ger-
many back in early 1999.
Although services are “the fastest growing sector of our business by
far,” as Dyroff says, SuSE’s main traditional strength has been in retail
sales. “One hundred percent of our customers are private customers, be-
cause the individuals we are selling to in the companies are nearly [all
technicians],” Dyroff says. “The area of potential customers now ex-
pands beyond those where a technician makes a purchase decision or
Fairness to the hackers who made me bankable demands that I publicly ac-
knowledge this result—and publicly face the question of how it’s going to
affect my life and what I’ll do with the money. This is a question that a lot
of us will be facing as open source sweeps the technology landscape.
Money follows where value leads, and the mainstream business and finance
world is seeing increasing value in our tribe of scruffy hackers. Red Hat and
VA have created a precedent now, with their directed-shares programs de-
signed to reward as many individual contributors as they can identify; fu-
ture players aiming for community backing and a seat at the high table will
have to follow suit.
So while there aren’t likely to be a lot more multimillion-dollar bonanzas
like mine, lots of hackers are going to have to evolve answers to this ques-
tion for smaller amounts that will nevertheless make a big difference to in-
dividuals; tens or hundreds of thousands of dollars, enough to change your
life—or wreck it.
He goes on to point out the irony: “Gee. Remember when the big ques-
tion was ‘How do we make money at this?’” Then, more seriously, he ad-
dresses the central question this new-found wealth poses:
“Reporters often ask me these days if I think the open-source commu-
nity will be corrupted by the influx of big money. I tell them what I be-
lieve, which is this: commercial demand for programmers has been so
intense for so long that anyone who can be seriously distracted by
money is already gone. Our community has been self-selected for caring
about other things—accomplishment, pride, artistic passion, and each
other.”
Raymond emphasizes that, despite the spectacular irruption of money
into the open source world, and the transformation of the GNU/Linux
distributions into financial powerhouses, the true hacker knows that
there is a world of difference between shares—however stratospheric
their valuation—and the sharing that lies at the heart of free software.
14
seriously were the other two cofounders,” John Gilmore and David
Henkel-Wallace.
“John Gilmore’s e-mail handle is ‘gnu,’” Tiemann explains, “and some
claim that he had nicknamed the GNU project.” Moreover, “John was
employee number 5 at Sun, renowned programmer and civil libertarian,
and also independently wealthy because of his Sun stock.” The other
partner, David Henkel-Wallace, was somebody Tiemann had met while
doing his research on compilers, and who “also seemed reasonably
bright,” he says. “So the three of us pooled our resources. Actually, I was
the short end of the stick, I only had $2,000 of liquid money to invest at
the time. I told John and also Gumby—David Henkel-Wallace’s nick-
name—that I wasn’t tapping them for their money. And so we each put
in $2,000, and started the company.” This was in November 1989.
The new Cygnus Solutions offered services based around Stallman’s
GNU software. “Basically, I would go and sell a contract and then John
and Gumby would figure out how to deliver it,” Tiemann explains. “Ini-
tially this worked quite well. However, it got more complicated as I bur-
dened them down. And we needed to build a scalable model. The
challenge was we knew that we wanted to be support services for the
software, [but] we didn’t have [our own] product at the time.” In fact, “it
took about three years before we really had a credible support offering,”
he says, and adds, “we probably lost money on every support deal we did
before 1992.”
While they were casting around for a product, one of them came up
with what, in retrospect, was an interesting idea. “John Gilmore pro-
posed that we write a free kernel for the 386,” Tiemann recalls. This was
in 1990, fully one year before Linus would sit down and do exactly that.
“I always considered myself to be an unprejudiced individual,” Tiemann
explains, “but the thought of taking something like Unix and running it
on an Intel 386 just boggled my mind. I didn’t feel it was a real processor.
And I didn’t feel that users who bought PCs were real users.” This state-
ment echoes the sentiments of those who had questioned the sense of
porting a real operating system to such a footling processor when Linux
was first released.
This was not the last time that opportunity in the form of GNU/Linux
came knocking. “Adam Richter, who was the purveyor of the first Linux
on CD, Yggdrasil, came to me and asked for some business advice,” Tie-
mann recalls. “And I poured out my heart and I told him what mistakes
we did and didn’t make.” Unfortunately, “he was definitely uninterested
in joining forces with Cygnus.”
Even later, Cygnus once more narrowly missed its chance. “In 1995,
Larry McVoy”—author of the Sourceware document—“came to me,”
Tiemann explains, “and said, ‘There’s this company in North Carolina
that’s doing Linux, and they’re really making a lot of money for the num-
ber of people they’ve got. You ought to check it out.’ And I tried to con-
vince my partners at that time that we attempt to acquire Red Hat, and
they would have none of it. In 1990-91, I said no to John Gilmore, and in
1995 he said no to me. So there was a long window of opportunity that
we simply ignored.”
The problem was hardly lack of vision. After all, in 1992, Tiemann had
made a speech to the Sun Users Group conference in which he not only
assessed correctly the strengths of the free software movement, but pre-
dicted it would become the dominant force in the following years. Nor
was the problem caused by a lack of ambition; partly because Cygnus
had become so successful—“by 1995 we were doing north of 10 million
dollars a year in our core business,” Tiemann says—they decided to stick
with what they knew. “It’s awfully difficult to say we’re going to duck out
of this race right now,” he says, “and we’re going to go back to zero.”
The product that was driving these impressive revenues was GNUPro,
which offered a more comprehensive and integrated development envi-
ronment based around free software such as GCC. “We subscribed to the
theory that we needed to have a focused product that people could iden-
tify,” Tiemann explains.
One consequence of Cygnus’s success was that, in addition to being
built around GCC, the company became the guardian of GCC’s develop-
ment. “Stallman wrote the original version” of GCC, Tiemann explains.
“GCC 2 actually came out shortly after Cygnus was founded, and a large
part of that was already set in motion by work I had done before
Cygnus,” he says. “But from about 1991 onwards, we had more full-time
people working on GCC than any other single entity, and today we’ve got
more by an order of magnitude at least.”
Once again, Cygnus here prefigured the approach later taken by other
companies based around free software. For example, Red Hat has taken
on many of the top kernel hackers. The list is impressive, and includes
the lieutenants Alan Cox, Dave Miller, and Stephen Tweedie, and other
hacker stars such as the Hungarian, Ingo Molnar. The other company
that has built up an impressive portfolio of big names in the open source
world is VA Linux. Top figures associated with the company include Ted
Ts’o, Eric Raymond, and Jon “maddog” Hall.
Given the parallels between Cygnus and the GNU/Linux distribution
companies that have come after it, and the wealth of talent Cygnus pos-
sessed in the field of open source tools, it is perhaps no surprise that Red
Hat acquired Cygnus on 15 November 1999. Even before this, Cygnus
had put out feelers—and had been turned down. “We talked with Red
Hat and were rebuffed early [1999],” Tiemann explains, “because the ad-
vice they got from one of their board members was nothing good comes
from joining two small companies”—at that time Red Hat had forty peo-
ple, he recalls.
However, things changed. “After their successful IPO,” Tiemann says,
“they could go look for strategic partners. It turns out that the fit is actu-
ally quite good. Cygnus has a lot of the technical depth that Red Hat
needed, especially related to development tools.” Cygnus also helped re-
alize Red Hat’s vision of deriving more of its revenue from services. At
the time of the acquisition, 60 to 70 percent of Cygnus’s turnover came
from services, and for Red Hat, “their box product was probably more
than 80 percent,” Tiemann says. Although the philosophies of the two
companies—and of Tiemann and Red Hat’s Bob Young—are close, in one
area Red Hat caused Cygnus to modify its approach slightly.
Unlike Bob Young, who believes that the open source nature of the
products he sells is central to the value they deliver to customers, Tie-
mann has been more pragmatic about including proprietary elements.
“At the end of the day,” he believes, “in a capitalist society, you’re judged
against the competition.”
But Tiemann has come to recognize that one of the key features of the
brand that Red Hat has built is what he calls “the ‘always open’
promise”—everything the company does will always be open source.
“And it turns out that brand trumps almost everything.” As a result, he
says, “being religiously pro open source for the sake of being religiously
pro open source doesn’t carry much water with me. But being religiously
pro open source as a component of a brand strategy that is effective car-
ries a lot of water.”
The question of how or even whether to add proprietary elements to
commercial products based around free software remains one of the
touchstones for companies working in this area. It was an issue that Eric
Allman had to face when he decided to create Sendmail, Inc. Unlike Tie-
mann, the perennially unassuming Allman was not driven by some belief
that his free software was so good that it should be possible to profit
from it. “I did not start Sendmail Inc. to make money, although that’s a
nice thing,” he says. “But I did it to make Sendmail sustainable; I did it
so we could do great innovation. I did it to make the world smaller”—
emphasizing once more his central belief in the community aspect of free
software.
I love the open source process, but Sendmail was becoming a victim of its
own success. We were going through a period when the Internet standards
were starting to go through dramatic expansion in the e-mail area. I found I
was unable to supply those resources, and so Sendmail was going to be-
come unable to step up to the demand.
When I first started working on what became Sendmail version 8, I set
up a mostly open mailing list for people doing testing” [in early 1993]. “It
became clear that there were a few people that were exceptionally good
[on that list]. They were not only asking good questions and pointing out
bugs and so forth, but they were contributing fixes. And so I set up an-
other list which was intended as a support list. It was invitation only, but I
would publish that address so that outside folks when they had questions
could send to that list and there wouldn’t just be me answering. And a lot
of the people that started on that list are still there. They’re a very hardcore
group.
The second list was created around the beginning of 1996, he says.
“As Sendmail evolved,” Allman explains, “a lot of the support moved
on to those other folks, and I was able to continue to do development.
And some of those people, one or two in particular, were helping in-
creasingly with development. But it started to become clear that this was
not sustainable. The Internet was exploding, it was growing 100 percent
a year or whatever it was, and the user base was growing a lot faster than
I could.” As a result, he started looking at other solutions.
“My first attempt was to try and create something where I got a small
amount of funding,” Allman continues. “I had this fantasy if Sun and
Digital and IBM and SGI and a few others were each to kick in $50,000,
just like maybe the half to a third of the cost of one loaded engineer, I’d
be able to do what I wanted. Then they’d get good stuff, and I’d get good
stuff, and we’d all be happy. But I couldn’t float this.” This was in late
1996.
By early 1997, Allman was making a living as a consultant—“primarily
on Sendmail,” he says. “My fantasy was that I’d charge ridiculous
amounts of money, and that would let me work half-time, and the other
half-time I could work on Sendmail.” Note that Allman’s “fantasy” was
not to get rich by charging “ridiculous amounts of money,” but to sup-
port himself so that he could work on Sendmail—and then give away the
results. “In the process,” he says, “I ran into Greg Olson,” an old friend.
fee only,” Allman says. “But in a year, that would be a very bad decision.
So from a point of view of a pure capitalist model, we’re better off leaving
it open source, assuming we believe that we’re a long-term play.”
This is an important point: Companies that attempt to take free soft-
ware proprietary—as is possible under some licenses—are effectively
cutting themselves off from the very power they seek to tap. For their
product to thrive, they need a flourishing free community. As Allman
and Olson recognized, the health of their company is intimately linked
to the vigor of the community that surrounds it.
Allman’s view that Sendmail would have failed to grow had he or
someone else not created an open source company to support and nurture it
is one shared by a fellow free software pioneer, John Ousterhout
(pronounced “oh-ster-howt”). Like Allman and many others, Ousterhout
has strong ties with the University of California at Berkeley, where he
was a professor of computer science. Ousterhout is best-known for his
language Tcl, pronounced “tickle,” which began as a “tool command lan-
guage” for creating design tools for integrated circuits, and Tk, a toolkit
for producing graphical interfaces.
Ousterhout originally wrote this for himself and his students, begin-
ning in 1988. Tcl percolated to the outside world in “late ’89,” he says.
“I’d given away a few copies then, and I started getting comments back
from people. It was in ’90 and ’91 when things really picked up.” As it
had been for Stallman and Wall, this feedback was crucial in spurring
him on to develop his software further. “For me, the biggest thrill in
building software is knowing that there were lots of people out there us-
ing it,” he says. Ousterhout points out that this drives a key strength of
the open source process.
“It’s sort of the opposite of design by committee,” he explains. “You get
all the benefits of this huge pool of ideas, but it’s not a democratic system.
There’s typically one person who is the god or the tsar who has final au-
thority over everything that goes into the package. One of the great bene-
fits is that you can get a degree of architectural simplicity and uniformity
that you can’t get if you have design by committee or a zillion people all
with total authority to make changes to the core of the system.”
Ousterhout acknowledges that he failed to follow up on one idea that
came from his users. “We started hearing stories about this crazy physi-
cist who had this weird idea for something called the World Wide Web,”
he recalls. “Somebody did this package called TkWWW, which was a
graphical way of displaying this Web stuff that had been invented by this
guy in Switzerland.” This was in 1992, barely a year after “this guy in
Switzerland”—Tim Berners-Lee—had released the Web publicly, and
well before the NCSA’s Mosaic browser project at the University of Illi-
nois had begun. To compound his embarrassment, Ousterhout says rue-
fully, “I never actually started [TkWWW] up. I thought this World Wide
Web thing seemed pretty hokey. One of those things you sort of look
back and kick yourself for.
“I’d been at Berkeley a dozen years or so by then,” he continues, “and I
had always wanted to spend a good chunk of my career in industry, as
well as in academia.” The final push was the realization that as Tcl was
becoming more popular, it would need more support than the open
source structure could offer. Echoing Eric Allman, Ousterhout says, “I
think there’s a set of things you can do very well in a purely open source
project, and a set of things that tend not to get done in that project. All of
the additional kinds of services and value-adds that you need if the com-
pany is going to be using this for mission-critical applications—develop-
ment tools, higher-level kinds of facilities—those tend not to get
developed in a project that’s purely open source.”
As a result, he decided to leave Berkeley and work on Tcl at Sun, which
he joined in the early summer of 1994. The move had its critics, notably
Richard Stallman, who issued a typically direct call to arms headed “Why
You Should Not Use Tcl.” Ousterhout explains the reason. “When I went
to Sun, that was acknowledging that I felt that it was OK to build a com-
mercial business model around open source software,” he says. “That re-
ally bugged Stallman, and so he tried to get people to boycott Tcl as a
way of showing protest against that.” What provoked Stallman’s ire was
that the commercial model would involve proprietary software.
“The belief was that Sun was going to somehow take Tcl proprietary,”
Ousterhout says. “And what really was the deal with Sun was we planned
to continue developing the open source Tcl and distributing it freely, and
that was in my contract with Sun. But we would also build some com-
mercial things. We’d prototyped some commercial products that would
be built around Tcl, and if all went well, Sun would sell those products
while still giving away the source of Tcl freely.
“We actually had gone so far as to begin to create a new business unit
at Sun,” Ousterhout recalls. “It was going to be called SunScript. But a
few months into the effort, the person who was our angel among the Sun
senior staff, Eric Schmidt, he went off to become CEO at Novell. Shortly
after he left, it became clear to me that we just didn’t have the kind of
support and enthusiasm in the top-level management at Sun that we
would need in order to make this a success over the long term.”
Recognizing that Sun was not likely to be so committed to Tcl, not
least because it was around this time that the company began promoting
heavily its own Java programming language, Ousterhout asked that the
project be cancelled. “But as part of the cancellation,” he says, “we
agreed that we would take all of the software that we thought we were
going to commercialize, and release it all freely on the Internet”—gener-
ously sacrificing his own interests for the sake of the Tcl community.
Ousterhout was faced with a critical decision: whether to move on
from Tcl entirely, or to set up his own company to support it. “Early sum-
mer of ’97 we cancelled SunScript,” he recalls, “and then over the sum-
mer and fall I sort of tested the waters to see if I’d be able to raise enough
money to start a company. Things looked good, so in January of ’98 I
spun out from Sun to start Scriptics.” Ousterhout was fortunate that as
part of the preparations for creating SunScript, “we’d done a bunch of
marketing studies at Sun trying to figure out what kinds of products to
build and where there was demand,” he says. So “we did have some evi-
dence that at least the community was there, and of the interest in the
development tools.”
Scriptics’s first product, TclPro, offered an extended development envi-
ronment, as had Cygnus’s GNUPro. Ousterhout recalls that, for the most
part, Tcl users accepted these new proprietary elements alongside the
open source Tcl. That community was already large: Ousterhout esti-
mates that there were around half a million Tcl users when Scriptics was
launched, and that there are more like a million now.
“We did our development tools in ’98,” he explains. “But we were
thinking about what might be interesting areas where people need inte-
gration applications that we could apply Tcl to. And it converged pretty
quickly around B2B [Business-to-Business e-commerce].” The resulting
product, Connect, was an ambitious piece of enterprise-level software
that allowed companies to link together their respective computer sys-
tems by using XML (eXtensible Markup Language) as a platform-inde-
pendent format for exchanging information across networks such as the
Internet, and Tcl to provide the integration.
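The appeal of XML for this kind of integration can be sketched in miniature. The snippet below is a hedged illustration only, not Ajuba's actual format or code: it serializes a hypothetical purchase order to XML with Python's standard library, then parses it back, showing why a self-describing text format lets two companies' systems exchange structured data without sharing any code.

```python
# Sketch of XML as a platform-neutral exchange format.
# The <order> document structure here is invented for illustration.
import xml.etree.ElementTree as ET

def build_order(customer, items):
    """Serialize an order to an XML string any platform can parse."""
    order = ET.Element("order", customer=customer)
    for name, qty in items:
        item = ET.SubElement(order, "item", quantity=str(qty))
        item.text = name
    return ET.tostring(order, encoding="unicode")

def read_order(xml_text):
    """Reconstruct the order on the receiving system."""
    order = ET.fromstring(xml_text)
    return order.get("customer"), [
        (item.text, int(item.get("quantity")))
        for item in order.findall("item")
    ]

wire = build_order("Acme", [("widget", 3), ("gadget", 1)])
print(read_order(wire))  # ('Acme', [('widget', 3), ('gadget', 1)])
```

Because the structure travels with the data, either end can be rewritten in a different language on a different platform without breaking the exchange.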
The subsequent proliferation of global B2B exchanges has shown the
shrewdness of Ousterhout’s bet on this area. Reflecting the company’s
new focus, in May 2000, Scriptics was renamed Ajuba Solutions, and
Connect became Ajuba2. Not surprisingly, perhaps, Ousterhout says,
“We do a lot of services right now. One of the general lessons I’ve learned
over the last couple of years is that if you’re doing a business that’s purely
based around the open source stuff, the biggest opportunity is in the ser-
vices area.”
One person who couldn’t agree more is Dave Sifry, co-founder and
CTO of Linuxcare, a company set up to provide nothing but services to
that, hey, this is part of doing the right thing for the world,” he says,
“there are a number of very significant strategic business reasons why we
do this.
“Number one, it encourages us to get the best developers in the world.
When you actually are telling people, hey, I want you to work on open
source software while you’re at work, that is pretty unique. And then
once you get some [of the best coders], you end up getting more. Be-
cause everybody wants to work with the best people. Number two is, the
more good open source software that’s out there, the more people who
are using open source software. And guess what, that means the more
people who are going to need the services of a company like Linuxcare.
Number three, when you encourage people to work on open source soft-
ware while they’re here at work, you end up getting leaders and teams of
open source engineers who now work for your company. And then lastly,
and by no means least,” Sifry points out, “it’s great PR.”
This analysis implies, of course, that the key asset for any open source
company, even more perhaps than for conventional companies, is the
people it employs. Because the “product” is open source, and freely avail-
able, businesses must necessarily be based around a different kind of
scarcity: the skills of the people who write and service that software. This
also explains in part why Red Hat and VA Linux have been prepared to
offer such flexible contracts to the top hackers, many of whom work
from home, often outside the United States.
Just as Linuxcare saw that it could abstract the services side from the
products still offered by companies selling GNU/Linux distributions, so
another innovative venture aims to move one stage further by creating a
marketplace in those increasingly valuable programming skills.
The original idea came from Hewlett-Packard, and it was the open
source coordinator there, Wayne Caccamo, who helped realize it. “When
I was gathering my network of people at HP,” he says, “I came across a
guy who was very much an expert on Linux. He was doing some project
that happened to involve Linux development and he was very frustrated
by the process of bringing in [outside software] contractors. The process
at HP, like any large company, is very cumbersome.
“He came up with this concept that we could create an intermediary,”
Caccamo recalls. The idea was that the intermediary would handle the
details of finding and hiring open source contractors. At the suggestion
of the originator of the idea, Caccamo decided to work with O’Reilly &
Associates. He had noted how “they played a positive role bringing peo-
ple together in the Linux community,” he says, “and they also employed
people within the open source world that were viewed and known as
celebrities and that were all respected from that standpoint.” Among
these celebrities, few were more respected than Brian Behlendorf, who
had recently joined O’Reilly.
The job at O’Reilly came up, Behlendorf says, “when I was fishing
around for something new. I sat down with Tim O’Reilly and started talk-
ing about what could be done in open source. I really felt it was worth
spending the time to see if there was yet another business model that
open source could facilitate that wasn’t just sell support. He said, ‘Well,
why don’t you incubate those ideas here at O’Reilly?’” Behlendorf recalls.
“And so I joined there with the intent of coming up with a set of ideas
and testing the commercial viability of them.” This was at the beginning
of 1999, just when Caccamo was considering O’Reilly for the intermedi-
ary he was looking for.
The result was SourceXchange, a site where companies can post pro-
posals for programming work and solicit bids from open source coders.
It is intended to form the first of a series of projects exploring new busi-
ness models based on open source, and which collectively make up Col-
lab.Net. A list of those involved reads like a roll call of the leading
players in the open source industry. Employees include Frank Hecker,
who played a major role in convincing Netscape to take its browser open
source, and James Barry, who helped convert IBM to Apache. Alongside
Behlendorf, Tim O’Reilly and Marc Andreessen are board members, and
investors in a $35 million round of funding closed in June 2000 included
Dell, HP, Intel, Novell, Oracle, Sun, and TurboLinux.
“The SourceXchange model is designed for projects that are substan-
tial in size,” Behlendorf explains, “that aren’t the ‘I need someone to fix
this bug, here’s $100, or beer money,’ which a couple of other sites are
based on. But it really is focused on the $5,000 to $25,000 projects, the
ones that involve two man-weeks’ to four man-months’ worth of work.”
Not surprisingly, given Behlendorf’s own history, all the code produced is
released as open source, but which particular open source license is
adopted is “part of the negotiation,” he says.
Behlendorf points out that not only companies can benefit from
SourceXchange. “Every open source project out there has a list of unfin-
ished items,” he says, “things that no one really wants to tackle because
they are complex, because they require specialized knowledge, or for one
reason or another they’re just not the kind of things that people attack in
their spare time. SourceXchange is a way to get past that. That’s why it
wasn’t designed for small little bug fixes; it was designed for the bigger
problems, the ones that are keeping bigger projects from getting to the
next level.”
“So we looked at these things, and we said, ‘Gee, let’s take the ideas of
ColdStorage and OpenProjects and pull them together.’” The result was
named SourceForge, launched on 4 January 2000 as a free service to
open source developers. Just a few months later, SourceForge already
had around 5,000 projects, including many of the most important, and
over 1 terabyte—a million megabytes—of free software.
Though born in part of the business model of VA Linux, the appear-
ance of SourceForge represents a large boost for the open source commu-
nity. It provides both a historical repository of code—allowing people to
track how programs develop; for example, to find out when a certain bug
appeared or disappeared—and a high-quality, zero-cost infrastructure for
expanding current projects. Perhaps even more important, it makes
starting new projects easier. As a result, it may well offer the necessary
catalyst for the creation of programs in an area where open source has so
far had little impact: that of end-user applications.
15
for the desktop was that the former did not require fancy graphical front-
ends, and this was where Unix was weak.
Although X Window provided a basic windowing technology, it did
not offer programmers what are generally called widgets—the basic
building blocks of graphical programs. These include such things as on-
screen buttons, selection boxes, and all the familiar components that go
to make up a Mac-like or Windows-like desktop. For these, it is neces-
sary to turn to one of the toolkits, which contain sets of widgets that can
be used to construct a graphical application.
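The division of labor described here, in which the toolkit supplies ready-made widgets and the application merely composes them, can be sketched with a few toy Python classes. These invented classes mimic no real toolkit's API (neither Motif's, Gtk's, nor Tk's); they only show the structural idea the toolkits share.

```python
# Toy sketch of the widget/toolkit idea; not any real toolkit's API.

class Widget:
    """Base class shared by every toolkit widget: it knows how to draw itself."""
    def render(self):
        raise NotImplementedError

class Button(Widget):
    """A ready-made building block supplied by the toolkit."""
    def __init__(self, label):
        self.label = label
    def render(self):
        return f"[ {self.label} ]"

class SelectionBox(Widget):
    """Another stock widget: the application never draws it pixel by pixel."""
    def __init__(self, options):
        self.options = options
    def render(self):
        return "(" + " | ".join(self.options) + ")"

class Window(Widget):
    """The application composes widgets rather than writing raw graphics code."""
    def __init__(self, *children):
        self.children = children
    def render(self):
        return "\n".join(child.render() for child in self.children)

app = Window(Button("OK"), SelectionBox(["Red", "Green", "Blue"]))
print(app.render())
```

Without such a toolkit, every program would have to implement its own buttons and selection boxes directly on top of the raw windowing layer, which is essentially the gap that X Window left open.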
The main graphics toolkit for Unix was Motif. But, as Mattis says, “we
found Motif painful. We didn’t have a firm grasp about how it worked in-
ternally.” He points out one reason for their difficulties: “There were no
good examples of how to do stuff in Motif. If I wanted to do something
new and interesting, it was very difficult to figure out how to do that.”
Another reason this was difficult was that Motif was not open source or
even zero-cost software. As a result, it was usually employed with propri-
etary programs, for which the source code was unavailable, making it
even harder to learn how to program.
Nonetheless, Kimball and Mattis struggled on with Motif and pro-
duced version 0.54 of the GIMP, which came out in February 1996: “It
was kind of stable, it did basic stuff,” Mattis says. “It wasn’t very ad-
vanced at all.” At this point, he recalls, “Spencer was off playing with the
way image manipulation worked.” Meanwhile, Mattis was thinking,
“You know, this Motif stuff is really frustrating. Let me see, how would I
design a better toolkit? I started playing around saying, like, how would
I do buttons?”—one of the most basic widgets. “And it kind of just
evolved into something like, wow, I have almost all the things you need
in a real toolkit.” This echoed the way Linux itself came into being, with
the same realization on Linus’s part that the result of his “playing
around” was almost a Unix-like kernel.
Mattis called his toolkit Gtk: the GIMP Tool Kit. When he showed it to
Kimball, “It’s like, OK, we’ll move over to using Gtk.” Doing so brought
some major advantages. “[Kimball] was frustrated by not being able to
use certain things in Motif,” Mattis says, “not because Motif didn’t allow
it, but because he didn’t know how to do it in Motif. But if he wanted to
do something in Gtk, I knew instantly how to do that. Or I could add it
in for him.” Gtk was much better than Motif for their purpose, but still
not quite right. “We really found that there were incompletenesses,”
Mattis says. “There were things that were made difficult by the way Gtk
was originally programmed.” Mattis rewrote almost all of his toolkit,
which he called Gtk+.
The switch to Gtk and then Gtk+ had another, even more important
benefit. “When we converted over to using Gtk,” Mattis recalls, “there
was suddenly a great influx of people starting to look at the internals
[of the GIMP], and say, OK, now I can modify this because I have ac-
cess to the toolkit as well. So it wasn’t so much that it made doing
things easier, it was more that a lot more people started contributing
internally. And that’s what really kind of freed Spencer and I eventually
to hand over the project to other hands.” In other words, the classic
open source benefits of bug fixes and new code being sent in began to
flow only when users could see all the code, something that was not
possible when the proprietary Motif libraries were used. The arrival of
Gtk ensured not only the GIMP’s further development but also its very
survival.
Mattis and Kimball eventually decided that they had contributed
enough of their time and energy to their project and the free software
world, and wanted to move on. “I guess I was a little bit disgruntled ini-
tially to see there weren’t people immediately picking up,” Mattis ac-
knowledges. “But in hindsight it was hard to expect people to pick it up
immediately” because the GIMP is a complex program, and it took time
even after the open Gtk libraries were adopted before other hackers felt
confident enough to continue development.
The GIMP, for all its deserved fame, was not the only free software
graphical application created in early 1995. At the same time, in the Ger-
man university town of Tübingen, another computer science student,
Matthias Ettrich, born in 1972, was starting work on his own ambitious
personal project: a document processor for GNU/Linux called LyX (pro-
nounced “lukes”).
LyX was a document processor rather than a word processor; at its heart
it employed Donald Knuth’s TeX typesetting language, which handled
many of the design decisions automatically. “[TeX] does almost everything,”
Ettrich explains, “things that you don’t want to worry about,
placement of pictures, how the line or the page breaks were done with-
out having these ugly defects.”
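TeX's automatic line breaking is famously sophisticated; it optimizes whole paragraphs at once. The Python function below is only a crude greedy sketch of the general idea, fitting words onto lines so the author never has to, and bears no resemblance to Knuth's actual algorithm.

```python
# Crude greedy line-breaker: a toy sketch of automatic line breaking.
# Real TeX (the Knuth-Plass algorithm) optimizes the whole paragraph at
# once and also hyphenates and stretches spaces; this does none of that.

def break_lines(text, width):
    lines, current = [], ""
    for word in text.split():
        candidate = (current + " " + word).strip()
        if len(candidate) <= width or not current:
            # Word fits on the current line (or the line is empty and the
            # word must go somewhere even if it is too long on its own).
            current = candidate
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return lines

for line in break_lines("the quick brown fox jumps over the lazy dog", 15):
    print(line)
```

Even this toy version shows the division of labor Ettrich praises: the author supplies the words, and the system decides where the lines fall.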
Ettrich began LyX for the joy of hacking and as a tool for writing his
papers at university. Unlike the GIMP, however, LyX became an official
project for a university course. Ettrich optimistically told his professor
that he’d have something usable in three weeks. “It was totally wrong,”
Ettrich notes. “It took me something like three months before I could ac-
tually do some typing. I think it was four or five months before I released
something, put it on SunSite,” one of the big Internet archives where
software was placed for download by others.
The move had unforeseen consequences for Ettrich, the same ones Li-
nus had experienced when he uploaded his project to a public server.
“The mail started coming,” Ettrich recalls, and adds that he had been un-
prepared for this because “[I] didn’t know anything about free software,
and how it works. I didn’t know the concept of people cooperating or
collaborating over the Internet,” he says.
“When I released LyX, I said right, people, this is an alpha version, not
really stable. The final product I will probably make a shareware thing—
because that’s what I knew from DOS software.” Shareware was a kind of
try before you buy: If you continued using a program that you had
downloaded, you were required to send a small fee to the author.
“The basic idea was not to become rich with that,” Ettrich explains.
“My idea was I invested lots of time in that thing, and I don’t want it to
die, and I need some time to be able to maintain it, because otherwise it’s
just a big blob of dead code.” But now something better than the share-
ware model presented itself. “I learned with all that feedback I got, dur-
ing that alpha and beta period, that lots of people wanted to join the
project and help me with maintenance, so it was for the project much
better to say, OK, we put it under the totally free license, and no money
involved whatsoever, please help. And that worked out.”
The world of free software may have brought with it unforeseen ad-
vantages, but it also revealed equally unsuspected problems. LyX was
written using the Motif libraries (as was the first version of the GIMP),
“which is not free software,” as Ettrich notes. “Then I switched over to
the Xforms library, which isn’t free software either—I mean, it’s perfectly
free, you can download it, you can get the source code if you ask the
maintainer, but it’s not free in the sense of the Free Software Founda-
tion.”
That was the problem. What Ettrich calls “hard-core free software
zealots” were not happy with this use of nonfree software, and the li-
cense under which LyX was released. The criticism came as something of
a shock to Ettrich. “The tone of these mails was very aggressive, as if I
had committed a crime,” he explains. “I said, ‘Excuse me, I was sitting
half a year there, writing some software. You can use it for free and give
me some feedback if you want to, and if you don’t want to, don’t use it.’
So I didn’t understand why these people were so aggressive. That was a
pretty strange experience,” he notes.
Ettrich came across GNU/Linux only at about the time he started work
on LyX, but he soon became an enthusiast and an expert. He certainly
didn’t miss the old DOS system he had used before—“Linux was so
much more powerful than DOS,” he says—or even Windows, then at
version 3.1. “Windows 3.1 was a quite poor graphical interface,” he re-
members, and he didn’t believe his GNU/Linux platform was lacking in
any important way.
Two new commercial programs changed Ettrich’s views, however. The
first was IBM’s OS/2 Warp operating system, which, he explains, was
much bigger in Germany than elsewhere, thanks to some clever local
marketing. The appearance of this smart new GUI (graphical user inter-
face), complete with drag-and-drop capabilities that made working with
files and programs much easier for nontechnical users, started raising a
few doubts in people’s minds about the GNU/Linux desktop. The subse-
quent arrival of Windows 95, and its enormous success with end-users,
added to this growing sense of unease.
It took the experience of a nonhacker using his LyX document proces-
sor, not a technical revelation, to shake Ettrich out of his complacency
about the superiority of the GNU/Linux way: “I had convinced my girl-
friend that she wanted to use [LyX] for writing her masterpieces. I tried
my best to strip Linux down and have a nice user interface. I added
menus, and a little bit of drag and drop here, and tried to set up some-
thing nice, something I always claimed it’s possible to do, you just have
to want it.
“And I did that, and she was working on the computer, starting the
word processor from an icon, but she discovered so many nasty things:
This doesn’t work, this is inconsistent, and how do I do this? I was going
oops, people are actually using these new graphical user interfaces. And I
had a closer look and I saw that it was just terrible.” It was terrible be-
cause the graphical programs that ran under GNU/Linux used half a
dozen toolkits, each with its own idiosyncratic appearances and ap-
proaches, making it a nightmare for nontechnical users.
Ettrich started exploring the possibilities, trying to find a way to recre-
ate the simplicity of the Warp/Windows 95 approach, whose strengths he
now saw. “I browsed the software mirrors”—the major stores of down-
loadable programs—“to find alternatives,” he says. While searching, he
came across a new toolkit called Qt. “I downloaded the thing, and played
around with it, and had the feeling, wow, it’s really easy to program.”
Moreover, although a commercial program, it was available free to those
producing free software on all operating systems that support the X Win-
dow system—as GNU/Linux did.
Qt came from a small Norwegian company called Trolltech, based in
Oslo. It had been founded in 1994 by Eirik Eng and Haavard Nord, and
was originally called Quasar Technologies. “Qt” was the Quasar Toolkit,
and is pronounced “cute” by those inside the company. The name Troll-
from the good old text-based command line to what Ettrich believes
“many old hackers” saw as “this sissy ‘GUI’ thing.”
But alongside such die-hard traditionalists there was another, more ar-
ticulate group who had problems with the KDE project. Ettrich had al-
ready predicted what they would say in his list of likely responses to his
proposal: “Why Qt? I prefer schnurz-purz-widgets with xyz-lisp-shell.
GPL! Check it out!” The key element here was the GNU GPL, or rather
the lack of it. Although Qt could indeed be used without payment by free
software hackers, and its source code was available (later, anyway—ini-
tially Qt was shipped only in binary form—“We were a bit scared people
could like steal our good ideas,” Trolltech’s Nord explains), it wasn’t free
software in the sense that Richard Stallman and the Free Software Foun-
dation understood it.
The new KDE team did not ignore this wave of protests. “Everybody
joining looked at alternatives [to Qt],” Ettrich explains, “and we had a
long discussion: Shall we do it with Qt? And the result was [we decided]
it’s the best technical solution if we want to reach the goal that we have.”
As a result, the KDE project became the focus of one of the most intense
and divisive struggles ever seen within the free software world, and one
that exposes more clearly than anywhere else the two great currents that flow
through it.
In one camp are the pragmatists who, like Ettrich, say, “It’s the best
technical solution if we want to reach the goal that we have.” As far as
the licenses under which software is released are concerned, “for me it
was a matter of personal opinion,” he says. “If somebody tells me I will
not use your free software because of the library that has a license I don’t
like, I didn’t really care because I had experienced the same thing with
LyX. There’s basically two kinds of software,” Ettrich believes, “good
software and bad software.”
This belief that licenses come down to “a matter of personal opinion”
does not mean that the pragmatists are indifferent to what others think.
For Ettrich, one of the key events during the early days of KDE was the
February 1997 Linux Kongress at Würzburg in Germany. This was where
Eric Raymond first read his paper, The Cathedral and the Bazaar, to an in-
ternational audience. Trolltech’s Haavard Nord gave a presentation, and
so did Ettrich. Ettrich says that one factor that gave him the confidence to
continue basing his KDE project on Qt despite the criticism from some
circles was that at Würzburg “nobody was complaining about the license,
nobody.” As a result, “I said, OK, the problems are not that big.”
The problems were big enough for the other camp in the world of free
software, that of the purists. They believed with the pragmatists that
there were just two kinds of software, but for purists like Stallman and
his followers, those two kinds were free and proprietary. Technical issues
were secondary or even irrelevant to the principal concern of liberty.
Qt’s lack of the full freedom they required troubled the purists. Many
of the fundamental rights granted by the GNU GPL—the right to modify
the source code and to redistribute those modifications—were absent
under the free Qt. This meant that, whatever Qt’s other virtues, any desk-
top built using Qt would rest on nonfree foundations; Qt was therefore
to be deprecated, according to the Free Software Foundation adherents.
The great danger, as the purists saw it, was that KDE might well succeed,
creating a situation where the free GNU/Linux operating system was
used mostly to run the nonfree Qt for KDE applications, a terrible
prospect for them.
As a result, the purists tried to persuade Trolltech to change the license
terms of Qt to those of the GNU GPL when used for free software, with a
commercial license for those who wanted to develop products to be sold.
This would have made the KDE project true free software, and provided
Trolltech with revenue from commercial applications. But Trolltech saw
a danger in this approach. “There were many people in the free software
community who did not want us to succeed with Qt,” Nord explains. “So
they were threatening they would put lots of power behind making some
alternative version of Qt.”
Under the terms of the GNU GPL, it would be permissible for a large
team of free software hackers to take the source code for Qt and develop
it faster than Trolltech—and in ways the company might not like. Al-
though Trolltech would have the right to incorporate such changes, this
alternative development stream would nonetheless serve to split the Qt
world—to fork it. “The fork was maybe the thing we were most worried
about,” Nord says. As a result, Trolltech refused to consider the GNU
GPL, and the free software community decided that the only possible ap-
proach was to set in motion an entirely new, and unambiguously free
desktop project that would ensure that KDE did not end up as the stan-
dard by default.
The person who came to lead the project was the young Mexican,
Miguel de Icaza Amozurrita, born, as Ettrich was, in 1972. He had first
heard about the GNU project from a friend when he was studying math-
ematics at the National Autonomous University of Mexico, in his birth-
place, Mexico City. Gaining practical experience had to wait until he
became a system administrator at the university, parallel to his studies.
“When I got my Sun machine at the university,” he says, “the first thing I
did was install this free stuff that was out there. And I read RMS’s mani-
festo, and I was hooked up with the whole idea of creating a full operat-
ing system that was completely free.”
De Icaza’s first contribution to the GNU project was a file manager, a
program for viewing and manipulating the contents of directories. “I
wanted to have a good file manager, because I was coming from the DOS
world in which they had a similar tool which was pretty good.” That
“good file manager” from the DOS world was Norton Commander, and
provided de Icaza’s program with a name: Midnight Commander.
His next project involved the Sun machine he was using for his system
administration duties. Dave Miller had started work on a port of
GNU/Linux to the platform, and de Icaza offered to help. Miller has
some interesting comments on de Icaza at the time. “I think early on his
programming/engineering more closely matched his adrenaline-pumped
personality,” Miller notes. De Icaza’s extraordinary energy, famous within
the free software community, would stand him in good stead as he took
on ever more free software projects.
As well as this phenomenal drive, de Icaza is also notable for his dedi-
cation to the GNU cause. For example, one of the projects he con-
tributed to was “the Linux RAID [Redundant Array of Inexpensive
Discs] code, which is a thing that lets you have faster reliable access to
your hard drives in Linux.” De Icaza decided to work on this not out of
some personal need: “I didn’t have the hardware,” he notes, “so I was
emulating the actual physical devices for implementing that stuff.” That
is, he created a software model of a RAID system purely so that he could
write a driver for others to use.
After the RAID code, de Icaza moved on to another GNU/Linux port,
to an SGI machine. Once again, he was building on the work of Dave
Miller, who had begun the port during a summer vacation job at SGI
while working for Larry McVoy there. It was at this time, at the end of
1996, that Ettrich announced the KDE project. Initially, de Icaza was
pleased that someone was working on a desktop for GNU/Linux. “At the
beginning, I didn’t understand what was happening,” he recalls. “So I
said, ‘Oh, yeah, well, these are nice efforts, it’s free, it’s under the GPL’”—
KDE itself was released under the GPL, even though the Qt libraries
were not.
“So I talked to Erik Troan at Red Hat,” de Icaza continues. “I said,
‘Here, Erik, you should be shipping this, it’s wonderful stuff.’ And then I
talked to Richard [Stallman], and said, ‘Richard, this is great stuff going
on, we have to support this project.’ And he said, ‘Well, no, the problem
is the following,’ and they both said the same thing: The result is not
free. And I said, ‘Oh, wait, wait, wait a second, let me see that again.’ So I
went and read the license, and indeed, the problem was that the [Qt]
software was proprietary.”
After initial attempts to convince Trolltech to use the GNU GPL for the
Qt libraries had failed, people started considering the alternatives. “We
tried a number of approaches,” de Icaza says. But there were only two real
options: One was to come up with a free clone of Qt that would have no li-
censing problems. Although such a project, called Harmony, was later be-
gun, de Icaza had his doubts about this approach because it had been tried
elsewhere without success. The other option was ambitious: to create a
new desktop that would be based entirely on GPL’d software.
But this left proponents of the idea with a problem: Which toolkit
should they use? The main ones, such as Motif (and Qt), were propri-
etary, and so out of the question. The solution was daring: They would
use the Gtk+ toolkit Peter Mattis had created for the GIMP. In retrospect,
it is extraordinary that what became the huge edifice of an entire graphi-
cal user interface to GNU/Linux was built on foundations thrown down
entirely by chance. As Mattis himself says, “At the time, to write our own
toolkit, people would have said that’s stupid and crazy. I guess in hind-
sight that was an incredibly lucky decision. It had a huge impact having
a pretty good quality toolkit for people to do graphical programs with,
and put a big impetus behind the whole open source explosion.”
As well as being able to use a preexisting toolkit, the new project could
also recycle a name. The new GUI system would be called GNOME
(“guh-nome,” the initial “g” pronounced, as is usual for GNU software).
GNOME stood for GNU Network Object Model Environment—which
seemed to have little to do with graphical user interfaces. In fact, this
“old” GNOME project drew its inspiration from the most unlikely
source: Microsoft.
In 1997, “just before the GNOME project as a desktop replacement”
was started, de Icaza says, “I had visited my friends at Microsoft.” They
showed him a new technology called ActiveX, which was part of Mi-
crosoft’s ambitious plans to catch up in the Internet sphere. ActiveX con-
sisted of small programs that could be sent over the Internet to a Web
browser, where they would add advanced functions. Unfortunately for
Microsoft, this approach proved to be a security nightmare, and ActiveX
never caught on for Internet use.
But ActiveX formed part of a broader component strategy whereby
software could be built up out of these smaller elements. De Icaza was
impressed by this aspect, and wanted to create something similar for
GNU/Linux. “GNOME was initially developed to create this component
system,” he says.
Then along came KDE and the decision to create a rival desktop pro-
ject. It was decided to adapt the original component-based approach.
Just as Ettrich had realized that KDE would need to be a meta-project,
made up of many smaller stand-alone programs, so de Icaza and his fel-
low GNOME supporters saw that the component model offered many
advantages for free software coding. “It helps,” de Icaza explains, “be-
cause that means that if a component is buggy, instead of having to un-
derstand a huge application, you just have to understand the [smaller]
application in which the problem is actually happening.” That is, it takes
the modular approach adopted in the Linux kernel and applies it to
higher-level applications.
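The isolation benefit de Icaza describes can be sketched in a few lines of Python. This is an illustration only: Bonobo itself was built on CORBA interfaces, not Python classes, and the component names here are invented. The point is that the host application knows only a narrow interface, so a bug in one component is debugged by reading that component alone.

```python
# Hypothetical sketch of the component idea: small units behind a
# narrow interface, composed by a host that knows nothing else about them.
from abc import ABC, abstractmethod

class Component(ABC):
    """A small, self-contained unit behind a narrow interface."""
    @abstractmethod
    def handle(self, data: str) -> str: ...

class SpellCheck(Component):
    def handle(self, data: str) -> str:
        # A bug here is understood by reading this class alone,
        # not the whole application embedding it.
        return data.replace("teh", "the")

class WordCount(Component):
    def handle(self, data: str) -> str:
        return str(len(data.split()))

def compose(text: str, components: list[Component]) -> list[str]:
    """The host application only depends on the Component interface."""
    return [c.handle(text) for c in components]

print(compose("teh quick fox", [SpellCheck(), WordCount()]))
```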
GNOME’s component model went through several names. “Eventually
the component model was not called GNOME,” de Icaza says. “It was
originally called Baboon, and then we changed it to Bonobo. My girl-
friend was deep into primate behavior,” he explains, “so I got exposed to
a lot of monkeys.” In addition to being given a new name, the original
Baboon component system was substantially revised and improved be-
fore becoming the current Bonobo.
When the GNOME project was announced in August 1997, “there was
a lot of good response, and we started working right away on the pro-
ject,” de Icaza says. One of the earliest GNOME coders was Alan Cox.
“He has contributed a lot of software, a lot of fixes, a lot of patches,” de
Icaza notes. “He was working on GNOME because we had a very clean
license compared to the other one. Many of the people working on
GNOME were actually people who were very concerned with the licens-
ing issues.”
One company interested in the licensing issues and in having a com-
pletely free desktop was Red Hat. As a consequence, “about four months
after the creation of the GNOME project,” de Icaza says, “they actually
put people working on GNOME.” This provided an important boost, he
says, because “many of the things that other people wouldn’t have done
in their spare time [Red Hat] did.” De Icaza adds, “They even lost market
share due to their commitment to GNOME” because competitors
Caldera and SuSE were shipping their distribution with the new and
highly attractive KDE while Red Hat was waiting for GNOME to catch
up.
Despite all this broad-based effort, de Icaza says, even its most fervent
supporters wished that GNOME wouldn’t need to be finished. “Initially
we were hoping that the existence of the project would make [Trolltech]
change their minds, but they didn’t. So we just kept working and work-
ing until we actually had something to use.”
progress towards its first official release, which eventually came out in
March 1999. There was therefore little prospect that GNOME would
cease development or merge with KDE. One interesting difference be-
tween the two projects is that an office suite was not part of the original
KDE project, “mainly because I never used half the packages,” Ettrich
explains. “I knew what a word processor was, but I never used Microsoft
Office.” These office applications were soon added by other program-
mers. For de Icaza, by contrast, office applications were central to the vi-
sion. “The goal of the GNOME project is to run a completely free system
without using any proprietary applications. So the office suite is obvi-
ously part of this effort. If you want to be able to use a completely free
operating system, you need to provide the tools [end-users] need for the
day-to-day work.”
Even though its final vision was more complete, the purist wing was
dilatory in creating a desktop for end-users. Given that there were such
superb tools as Emacs, fancy GUI word processors may have seemed su-
perfluous. The pragmatist Ettrich saw that, on the contrary, it was vital to
move GNU/Linux forward into new areas, and to give nontechnical users
a way to enjoy the benefits of this powerful system. It was the pragmatist
KDE project that undertook what appeared to be an insanely ambitious
plan to create an entire GUI front-end to GNU/Linux that would be the
equal of the new and eye-catching Windows 95.
And it was the pragmatists’ decision to use the technically best toolkit
at the time—Qt—even though it was not completely free, that spurred
the purists into putting together their own desktop project to match
KDE in its functions and its suitability to the end-user.
Thus it is down to the sometimes fraught dynamic between the prag-
matist and purist wings that the free software world has made one of its
most important advances. Without the pragmatists, things might have
taken years longer—and Windows would have become even more firmly
entrenched. And without the purists’ refusal to compromise, Qt might
never have moved to the open source QPL, and the KDE desktop would
have been dangerously compromised.
In an e-mail dated 10 July 1998, Linus offered his thoughts on the
matters of licensing, KDE, and GNOME, and gave a clear indication of
where he stood on the matter.
Personally, I like KDE better than gnome right now, on the strength of
pretty user interface and it working better. I personally use neither, though. I
know there has been a lot of silly license flamage, and I don’t particularly
like it.
My opinion on licenses is that “he who writes the code gets to choose his
license, and nobody else gets to complain.” Anybody complaining about a
copyright license is a whiner.
The anti-KDE people are free to write their own code, but they don’t have
the moral right to complain about other people writing other code. I despise
people who do complain, and I won’t be sucked into the argument.
Although Linus seemed to prefer KDE at that time, this was probably
more a reflection of its being further along than GNOME. He has since
distanced himself from the discussion, and promoted the value of having
more than one solution. For example, in September 1998 he said, “It’s
kind of like politics: You don’t need to have a one-party approach to be
stable, and in fact it tends to be more stable to have multiple parties that
try to vie for power than to have a single rule. I think the KDE versus
GNOME issue will just make the system better in the end, even if it
means some internal struggles for a while.”
Today, KDE and GNOME are growing vigorously and competing in a
healthy fashion. Both have hundreds of developers and many hundreds
of applications. There are also moves to make the two approaches more
compatible. “We’re definitely working closer,” de Icaza says. “We’re
agreeing on a number of standards.” And, as Ettrich points out, one rea-
son such cooperation functions is “because we have no commercial in-
terests in all these things.”
Those who like to point to the continued existence of the KDE and
GNOME projects, with their separate collections of software, as evidence
of a fundamental split in the GNU/Linux world forget one important
point: It is possible to run both KDE and GNOME applications on the
same machine, at the same time, no matter which desktop is running. So
there is no real fork between the two projects, just differences in ap-
proach and on-screen appearance. As free software, KDE and GNOME
are both routinely included with commercial distributions. And choice,
after all, has always been at the heart of free software.
The paths the two founders have since taken present an interesting
contrast. True to his role as the creator of a meta-project, Ettrich has
stepped even further back from a “star” position. Appropriately
enough, he is devoting himself to questions of technology, specifically
Qt technology, as an employee of Trolltech. His move there in August
1998 is in many ways the perfect solution for everybody. “Trolltech
De Icaza believes this joint work will lead to significant advances. “It’s
not any more about just providing a desktop; we’re actually going be-
yond that,” he says. “We’re very focused on usability, we’re very focused
on human-computer interaction. It’s no longer a matter of providing a
desktop for Unix users, it’s about having people who have never touched
a computer before feel comfortable with a free software system.
“We’re pushing some of the most innovative technologies here,” he
emphasizes. “I would say that this time we’re going to have a clearly
technical advantage”—an important point for a system which is often
taxed by its competitors as being something of a laggard in user-interface
technology. “When you see Gnome 2.0 you’re going to see something
that nobody else has—not Windows, not Mac, not anybody.”
The invocation of the most famous Apple computer is particularly pi-
quant because three of the founders of Eazel were key members of the
original Macintosh team. Michael Boich, president and CEO of Eazel,
created Apple’s software evangelism group for the Macintosh; Bud Trib-
ble, vice president of software engineering, managed the original Macin-
tosh software team; and Andy Hertzfeld, whose job title at Eazel is
simply “Software Wizard,” designed and implemented a large portion of
the original Macintosh system software, including the User Interface
Toolbox.
Nothing could symbolize more potently how far GNU/Linux has come
from its command-line, hacker-oriented origins in just a few years, and
the extent to which it is assuming the mantle of challenger to Microsoft
even for ordinary users—once Apple’s cherished role—than the decision
to create Eazel and work on open source software by these ex-Macintosh
gurus. But the fight between GNU/Linux and Microsoft on the desktop
lies some way off yet, assuming it arrives at all, which some still doubt.
On the server side, however, the battle has been raging for a while
now. The official opening of hostilities can be dated to an attack by Mi-
crosoft back in April 1999, when what began as a small-scale covert op-
eration went horribly wrong, and escalated into all-out media war.
16
one, because it was written by Digital and Pathworks was Digital’s imple-
mentation of TCP/IP for Windows.
“So I installed Pathworks [on the PC],” he says, “and I could no longer
access my PC-NFS server” because the stacks were different. At this
point, the classic hacker idea occurred to him: “So I thought, well, it can’t
be all that hard, the protocol, so maybe I can try and write a server for
SunOS that provides the same functionality that is provided by a Path-
works server running on a Digital Unix box.” By writing some software
for the departmental Sun machine, he hoped to be able to swap files with
his PC using just Pathworks instead of always needing to go back to
NFS.
“What I didn’t know was that the [Pathworks] protocol wasn’t a pro-
prietary Digital protocol,” he notes. “In fact, the protocol was the same
protocol used by Microsoft in their networking. And I didn’t realize that
the protocol was called SMB [Server Message Block], and I didn’t realize
that it was a partly documented protocol. So, in blissful ignorance, I
went and implemented the protocol from the wire,” he continues, refer-
ring to another classic hacker technique, “just watching packets.” By
looking at what messages were sent over the network, Tridgell was able
to write some software to do the same, even though he didn’t know how
those messages worked.
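Working backwards from captured traffic typically begins with decoding the fixed header fields of a packet by hand. A minimal sketch in Python, assuming the classic SMB1 layout (a 0xFF "SMB" magic, a one-byte command, then a little-endian 32-bit status); the sample bytes below are fabricated for illustration, not taken from a real capture:

```python
# Hedged sketch of "implementing from the wire": pick apart the fixed
# header of a captured SMB1 packet. Layout assumed from the classic
# SMB header; the sample packet is fabricated.
import struct

def parse_smb_header(packet: bytes) -> dict:
    """Extract the command and status from a raw SMB1 packet."""
    magic, command = packet[:4], packet[4]
    if magic != b"\xffSMB":
        raise ValueError("not an SMB packet")
    # NT status is the little-endian 32-bit word after the command byte.
    (status,) = struct.unpack_from("<I", packet, 5)
    return {"command": command, "status": status}

# 0x72 is SMB_COM_NEGOTIATE, the first request a client sends.
sample = b"\xffSMB" + bytes([0x72]) + b"\x00\x00\x00\x00" + b"\x00" * 23
print(parse_smb_header(sample))
```

Repeating this for each message type observed on the network, and comparing the decoded fields against what the client and server actually do, is essentially the process Tridgell describes.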
“[I] implemented a basic server, and that took about a week over
Christmas December ’91,” he recalls. “It’s fun doing things over a Christmas
break.” Coincidentally, Linus was also having “fun doing things over a
Christmas break” that year; he was adding the Virtual Memory capability
to his very young Linux kernel, which he had begun just a few
months before.
Once he had written his server, Tridgell decided to see whether anyone
else was interested in using it. “So that was sent out to a Pathworks mail-
ing list, and a number of people were interested in it and tried it out. I re-
leased version 0.1, then 0.5, within a couple of days of each other,” he
recalls. But something happened that caused him to drop the project:
The department where he was working changed the hardware he was us-
ing and he no longer needed his simple Pathworks server program. “So I
just stopped working on the project completely,” he says. “I put a note
on the FTP site saying, ‘If anyone else wants to take over, then feel free to
do so.’” Even though he had received “quite a bit of feedback,” including
several patches, nobody stepped forward to carry on development, and it
seemed as if the project would go no further.
GNU/Linux revived the project, which was apt because the future
Samba would prove one of its most powerful allies. In November 1992,
somebody named Dan Shearer sent Tridgell an e-mail about his old Path-
works server. “Dan told me basically that there was some interest in port-
ing this thing to Linux,” Tridgell recalls. “So I wrote back and asked him,
‘What’s Linux?’—I’d never heard of it. So he told me what Linux was,
and I went and downloaded a copy and put it on a PC that I’d got by that
time, a 386, and I fell in love with Linux immediately.”
As well as alerting him to GNU/Linux, this also raised his conscious-
ness about free software in general. Even though Tridgell used free soft-
ware—for example, he employed Emacs to look at the contents of the
packets being sent over the network by Pathworks—“it was just sort of,
oh, there it was, [and] I downloaded it. I didn’t think about the whole
free software movement,” he says. But after his encounter with
GNU/Linux, this changed. “Once I started reading the mailing lists, very
quickly I understood what was happening, and wanted to participate.”
Fortunately, another chance event a year later suggested a way he might
do this. “This was about the same time that I set up a little network at
home.” By now, Tridgell was using GNU/Linux, but his wife had a Mi-
crosoft Windows machine. The problem was how to network the two
very different systems.
“And then I remembered an e-mail that I’d got from somebody saying
that my server worked with Microsoft Windows as a client,” he recalls.
That is, Tridgell’s Pathworks server could also talk to PCs running Mi-
crosoft Windows, not just to those running Digital’s Windex product,
which he was using initially.
“I’d dismissed the e-mail at the time,” he says, “because I assumed the
person didn’t know what they were talking about.” After all, he says, “I’d
written a Pathworks server, not a Windows server. And you get a lot of
crazy e-mails. But I went and downloaded the code [for the Pathworks
server]—I had to actually go and download it because I didn’t have a
copy—and tried it” on his GNU/Linux box and “with my wife’s Windows
box at home. And lo and behold, it did indeed work.”
Tridgell realized that his quick Christmas hack was actually something
much more valuable: a way of allowing Unix and Windows machines to
share files.
Spurred on by this discovery, Tridgell announced on 1 December 1993
a project to improve his old file-sharing code. “Lots of people responded
to that,” he recalls, “lots being of the order of fifty or a hundred people,
which at the time, when you announced free software project, that was a
big deal.” Linux, after all, had initially only elicited responses from a
handful of people. “At that stage, I GPL’d the code,” Tridgell says. “I
think it was largely out of loyalty to Linux; I was doing this on Linux and
I loved what was happening with GCC, and the other GNU projects, and
so I wanted to be part of that.”
In a short history of Samba that he wrote, Tridgell explains how the
name ‘Samba’ came to be chosen. “The code in Samba was first called
just ‘server’; it then got renamed ‘smbserver’ when I discovered that the
protocol is called SMB. Then in April 1994, I got an e-mail from Syntax,
the makers of ‘TotalNet advanced Server,’ a commercial SMB server. They
told me that they had a trademark on the name SMBserver and I would
have to change the name.” So Tridgell looked for words that contained
the letters S, M, and B and chose Samba as the best (another was
“salmonberry”).
Samba developed in classic open source style. “People basically started
sending in patches,” he says, “and I started working frantically to im-
prove the project. It was not long after that that I first read a specification
of the SMB protocol, and discovered that there were lots of holes in the
spec.”
The holes in the specification make SMB an interesting hybrid be-
tween an open and proprietary standard. “Microsoft very much takes the
attitude that if the specification, such as it exists, and the implementa-
tion in Windows NT differ, then NT is correct and the specification’s
wrong.” This has a direct consequence for Tridgell and the Samba team.
“In the SMB world,” he says, “the whole point of the protocol these days
is for interoperability with Microsoft. And so we always do all of our test-
ing directly against the Microsoft implementations. And we try to remain
bug-for-bug compatible where it makes sense. There are some cases
where it doesn’t make sense, and their bugs are just ridiculous, and you
shouldn’t emulate them. But in most cases, we emulate the bugs so that
we interoperate completely with the Microsoft implementation.”
Tridgell makes an important point about Microsoft in this context.
“They consider themselves the only implementation of this protocol,
even though they know they’re not. But they control the server and they
control the client, so they can just stick the code at the client and the
server and it will work, and they don’t really think much beforehand
about designing it appropriately.” This analysis sounds a note of warning
of what might happen should Microsoft ever attain a dominant position
on the Web, able to use—and abuse—its power on the client and server
sides to mask bad coding. It emphasizes once again the importance of in-
dependent projects, such as Apache and Mozilla, dedicated to open stan-
dards.
Microsoft’s attitude to the SMB protocols means that as it has extended
the basic standard, Samba has had to track it. Samba’s constant shadow-
Linux as a file server and 3.7 times faster as Web server.” Once more,
Ziff-Davis’s NetBench and WebBench were used for the benchmarks,
running on a PC with four Intel Xeon processors and 1 Gigabyte of
memory (almost identical to the PC Week test machine of 1 February).
Mindcraft was set up in 1985 by Bruce Weiner. “Our current business
has really been focused on performance testing over the last four or five
years,” Weiner explains. This comes in two forms. “One is performance
testing for vendors to validate that their products perform the way they
expected under circumstances they expected to work in,” he says, “and
that work is generally not published. The other side is performance mar-
keting; it’s nearly always for publishing.”
Almost immediately after the press release appeared, an item posted to
the Linux Today Web site by Dave Whitinger pointed out one extremely
relevant fact from the benchmark report not mentioned in the press re-
lease. Under the heading “Mindcraft Certification,” Whitinger noted,
was the following phrase: “Mindcraft, Inc. conducted the performance
tests described in this report between March 10 and March 13, 1999. Mi-
crosoft Corporation sponsored the testing reported herein.”
Of this significant omission, Weiner explains, “Initially the contract
had a clause that precluded us from discussing the test in any detail.
Other details were approved.” Weiner was well-aware that it would have
been better to admit this from the start. “I suggested that,” he says, and
explains Microsoft’s about-face as follows: “They’re a big company, and I
think things got communicated incorrectly or misunderstood some-
where along the way as to what happened. When that became clear, they
said, ‘Well, sure, go ahead and tell them.’ I guess maybe somebody high
enough up heard the noise, whatever.”
That noise was considerable. “I expected a few things: ‘They don’t
know what they’re talking about’—that kind of thing,” Weiner says. “But
the firestorm that occurred was unbelievable; it’s like I killed a baby or
something. I didn’t kill it, I spanked it, and I said, ‘No, no, don’t do that,
learn how to do it right.’” But the open source community felt that even a
spanking was a mortal insult, and Microsoft must have been taken aback
by the vehemence of the reaction.
Of Microsoft’s initial approach, Weiner says, “They contacted us some
short time before; it wasn’t months and months.” This put a certain
amount of pressure on Mindcraft, because “typically there’s a lot of setup
goes on in these tests ahead of time,” he says. Mindcraft and Microsoft
agreed that the Ziff-Davis WebBench and NetBench programs would be
used. “It’s an OK thing to run,” Weiner explains, because “nobody can
say you rigged the test.”
Weiner says that once the open benchmarks were suggested, the atti-
tude of the hackers changed. “It was ‘OK, guys, let’s call your bluff,’” he
recalls. And “all of a sudden, the Linux community was backing away,”
he adds. In one respect, this was hardly surprising; after all, when the
first test results were released, there was no mention that Microsoft had
paid for them, nor that they were conducted in a Microsoft lab. The tun-
ing was not optimal for the GNU/Linux system, and even with the sec-
ond round of tests, representatives of the open source world were refused
access to the equipment.
In another respect, however, Weiner may have been right about a cer-
tain backing away. After the initial shock and outrage had begun to die
down, the top hackers got down to some serious analysis of just what
was going on. One of the reasons this took some time was the nature of
the Ziff-Davis benchmarks.
As Andrew Tridgell explains, “The problem with NetBench is that it’s a
very closed benchmark, in the sense that anyone can download it, but
you’d need a half-million-dollar lab in order to run it seriously. You can’t
just go and run it at home and expect to get a result that’s at all meaning-
ful. There was really only one location available to the open source com-
munity to test this sort of configuration,” he continues, “and that was
inside SGI, where Jeremy [Allison] was working. And so though he
could do that, that was an enormous bottleneck, and it meant that peo-
ple like Alan [Cox], Linus, and David Miller couldn’t really do the tests
themselves.” In this respect, NetBench and WebBench were taking the
community into a new realm where the open source techniques, based on
the input of the many users running software on their personal computers,
were no longer enough.
But hackers are nothing if not resourceful. Using SGI’s facilities, Alli-
son ran NetBench for just one PC client, and all the details of the data’s
flowing over the network were recorded. Tridgell then replayed that data
to simulate hundreds of clients running simultaneously, employing just
two fast GNU/Linux machines linked together. In this way, it was possi-
ble to run NetBench without having that half-a-million-dollar labora-
tory—and hence start investigating what was happening in the
Mindcraft benchmarks.
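The capture-and-replay trick can be sketched in a few lines of Python. This is a hypothetical illustration, not the actual Samba or NetBench tooling: a single client's request trace, recorded once, is replayed concurrently by many simulated clients against a stand-in server, approximating a big benchmark lab on modest hardware.

```python
# Hypothetical sketch of capture-and-replay load simulation: one client's
# recorded request trace is replayed by many threads at once.
import socket
import threading

def recv_exact(sock, n):
    """Read exactly n bytes from a socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            break
        buf += chunk
    return buf

def accept_loop(srv, stop):
    # Trivial stand-in for the file server under test: echo every request.
    def handle(conn):
        with conn:
            while True:
                data = conn.recv(4096)
                if not data:
                    break
                conn.sendall(data)
    while not stop.is_set():
        try:
            conn, _ = srv.accept()
        except socket.timeout:
            continue
        threading.Thread(target=handle, args=(conn,), daemon=True).start()
    srv.close()

def replay_client(port, trace, results, idx):
    # Replay the captured request trace and collect the server's replies.
    with socket.create_connection(("127.0.0.1", port)) as s:
        got = b""
        for req in trace:
            s.sendall(req)
            got += recv_exact(s, len(req))
        results[idx] = got

def simulate(port, trace, n_clients):
    # Bind before any client starts, so no connection is refused.
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(128)
    srv.settimeout(0.2)
    stop = threading.Event()
    threading.Thread(target=accept_loop, args=(srv, stop), daemon=True).start()
    results = [None] * n_clients
    clients = [threading.Thread(target=replay_client,
                                args=(port, trace, results, i))
               for i in range(n_clients)]
    for c in clients:
        c.start()
    for c in clients:
        c.join()
    stop.set()
    return results

if __name__ == "__main__":
    # A pretend single-client capture; real SMB traffic is far richer.
    trace = [b"OPEN file1", b"READ 0 4096", b"CLOSE"]
    replies = simulate(9099, trace, 50)
    print(sum(r == b"".join(trace) for r in replies), "of 50 clients replayed cleanly")
```

Replaying one recorded trace from fifty threads is, of course, only an approximation of fifty independent clients, but it is enough to drive a server into the contended regimes where scaling problems show up.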
Tridgell also carried out another important piece of work. “One of the
things you’re doing with benchmarking is you’re trying to work out
which subsystem on the server is the bottleneck,” he says. “So what I did
was I created three benchmarks,” each one of which tested a different
part of the GNU/Linux-Samba combination. The problem was revealed
when the NetBench simulation was run using these three benchmarks.
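The idea of isolating the bottleneck by timing each subsystem separately can be illustrated with a toy harness. This is a hypothetical sketch with made-up workloads, not Tridgell's actual benchmarks: each candidate subsystem gets its own timed micro-workload, and the slowest one points at the bottleneck.

```python
# Hypothetical bottleneck isolation: time each subsystem's stand-in
# workload separately and report the slowest.
import os
import tempfile
import time

def bench(fn, reps=5):
    """Return the best-of-reps wall-clock time for fn()."""
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - t0)
    return best

def cpu_work():
    # Stand-in for protocol processing: pure in-memory computation.
    sum(i * i for i in range(200_000))

def fs_work():
    # Stand-in for file-serving I/O: write and read back a small file.
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "blob")
        with open(path, "wb") as f:
            f.write(b"x" * 1_000_000)
        with open(path, "rb") as f:
            f.read()

def slowest(benchmarks):
    """Time each named workload; return the slowest name and all timings."""
    times = {name: bench(fn) for name, fn in benchmarks.items()}
    return max(times, key=times.get), times

if __name__ == "__main__":
    name, times = slowest({"cpu": cpu_work, "filesystem": fs_work})
    print("bottleneck candidate:", name, times)
```

Running the same per-subsystem timings under the full simulated load is what separates "Samba is slow" from "the kernel is slow", which is exactly the distinction the next paragraph turns on.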
The difficulty was not with Samba, as Tridgell was pleased to find; the
tests ran slowly even without it. It turned out that it “was a kernel prob-
lem,” he explains. “Basically, we weren’t scalable to a four-way SMP.”
This was a variant on the old “Linux does not scale” problem, in this
case deep within the kernel, that was showing up only under the ex-
treme circumstances of the Mindcraft test. Once he knew what the prob-
lem was, Tridgell came up with a way of fixing it. “It was a very ugly
hack,” he says with the typical disdain a top coder has for such rough-
and-ready solutions. “That sped us up by about a factor of two. It wasn’t
of course valid, it wasn’t a patch that you could really put into a produc-
tion kernel.”
The good news was that Tridgell’s analysis and the rough fix meant
that the problem he had located could be fixed after a little serious hack-
ing. The bad news was that a proper patch would not be available for the
open benchmark that Weiner was proposing. As Allison says, “I knew we
were going to lose, because I’d already done the benchmarks on Linux in
SGI’s internal lab. I knew what the bottleneck was, and it wasn’t a Samba
issue. Basically, the idea at that point was just trying to do damage con-
trol as much as we could.”
In this sense, Weiner is probably right in that the open source com-
munity were backing away from the idea of a rematch: They knew what
was wrong, and that at this stage “it was all politics,” as Tridgell com-
ments. “It was Mindcraft trying to save face.” But “saving face” was im-
portant to Weiner; the “firestorm” that the benchmarks had provoked
had cast aspersions on his company. Whatever flaws the first two bench-
marks may have had, Mindcraft needed to establish that it had not
cheated. If doubts remained on that score, Mindcraft would have no fu-
ture customers.
Weiner originally made his open benchmark invitation on 4 May
1999, and revised it on 18 May to address some of the concerns raised.
The first phase of the test had as its stated purpose to “reproduce the re-
sults of Mindcraft’s second test.” As Weiner says, “The reason we wanted
that one done is that we wanted Mindcraft to be vindicated that we had-
n’t rigged a damn thing.”
The second phase was where “Linux experts use anything available at
the time of Mindcraft’s second test. This will show how much better
Mindcraft could have done at the time.” The final phase aimed to “get
the best performance using today’s software.” Weiner says that allowing
this third phase “was a pretty brave thing for Microsoft to say at that
point,” with some justice. Microsoft had little to gain and much to lose
by allowing the GNU/Linux platform to be tuned to the limit.
Ewel explains that “it was a joint decision. I talked to Mike Nash and
Jim Allchin”—one of the most senior Microsoft executives—“about it. It
was taking a chance, we knew that.” But “we know we have a good prod-
uct and we were pretty confident that [the Linux hackers] weren’t going
to be able to do a benchmark special to beat us.” This was because he
and his colleagues had “sat down with our best technologists,” he says,
to weigh the risks and the rewards of taking part in such an open bench-
mark. The last point is interesting, because it suggests that Microsoft had
been examining GNU/Linux and Samba in great detail, to the point that
it knew how much work was required to deal with issues the Mindcraft
benchmarks had surfaced. Nonetheless, credit should be given to Ewel
and his team for having the courage to take this calculated risk.
Not that Microsoft did not exploit the situation to the full in other
ways. On 12 May, it posted a document to its Web site called “Update on
Windows NT server vs. Linux performance and capabilities.” In this, a
typically well-researched Microsoft document, it laid out the previous
Mindcraft benchmarks, along with other independent tests; for example,
tests that had just appeared in PC Week on 10 May, and in PC Magazine
on 11 May, all of which tended to corroborate Mindcraft’s results.
As well as allowing representatives of the open source world to be pre-
sent, the open benchmark tests sought to address other concerns. For
example, it turns out that Samba works better with PCs running Windows
NT than with the Windows 95 and 98 clients used in the first two
Mindcraft tests. “I personally wasn’t aware of that at all at the time,” Weiner
says. But the open benchmark would be run using both Windows 95/98
and Windows NT clients. Moreover, the tests would be run using first
one processor and then four processors, with corresponding amounts of
memory, because GNU/Linux was designed to run better on leaner sys-
tems.
Weiner’s final concession was to allow these open benchmarks to be
run at an independent laboratory. Weiner explains, “PC Week and ZD
Labs said, ‘Why don’t you do it here,’ because obviously it was a story.
And we said, ‘Great, we’d love to do it.’” The benchmarks having origi-
nated at Ziff-Davis, a certain justice prevailed in this choice for the scene
of the final shoot-out. All parties agreed to participate on this basis, and
the bench testing began on 14 June. “We wanted it to happen much ear-
lier,” Weiner says, “but [the head of ZD Labs] couldn’t arrange that be-
cause of their lab schedule.”
The event proved anticlimactic after the drama of the previous three
months. As everybody knew it would, Windows NT beat GNU/Linux
running Apache and Samba. Henry Baltazar and Pankaj Chowdhry re-
The Mindcraft saga has done more than give rise to a few handy
patches. “It gave a real focus,” Tridgell believes. “In particular, from a
technical point of view, it made us focus on the larger-scale benchmarks
as compared to the microbenchmarks. To a reasonable extent, the kernel
developers had been using microbenchmarks like lmbench,” which was
Larry McVoy’s software for tuning at a very low level. In this area,
Tridgell contends, “Linux was so far ahead of all the other operating
systems that it’s just not funny.”
But lmbench “didn’t stress the operating system under a load of fifty
processors running simultaneously,” he says. “Getting scalable kernel-
level SMP is a much more difficult task, and nobody had really worked
on it. So what this [episode] did was it focused a lot of effort on that.” In
a sense, then, the Mindcraft benchmarks did Linux a great service.
Linus eventually realized this too, but not before even he lost his usual
equanimity and coolheadedness. Weiner says, “I got the firestorm from the
top ranks” in the open source world, and Allison recalls that “Linus sent
[Weiner] this just unbelievably nasty e-mail calling him a fair few names.”
Similarly, Ewel says that he and Linus “exchanged some fairly heated
e-mails.” For a moment, Linus had forgotten that bug reports from users
in new situations, pushing and stressing the software in new ways, had
always helped make Linux better. And the Mindcraft benchmarks had
done this by showing what needed to be done if GNU/Linux was to be-
come an enterprise solution. One concrete result of this input was revealed
by Red Hat’s announcement on 15 August 2000 that Dell servers running
GNU/Linux and Red Hat’s own new high-performance Web server, Tux,
were “the fastest Web solutions of those tested” and more than twice as
powerful as Windows 2000 on the same four-processor system.
The fueling of future improvements in Linux was one way the exercise
backfired for Microsoft. The bigger blunder, however, was something
more profound. By arranging for GNU/Linux, Apache, and Samba to be
benchmarked against Windows NT, Microsoft said in the most emphatic
manner possible that these were rivals; after all, there is no point bench-
marking things that are not in some sense comparable. This was a signif-
icant shift from Microsoft’s previous stance that GNU/Linux was not up
to enterprise-level tasks, and nobody was using it anyway.
The Mindcraft benchmarks gave the lie to this position more effec-
tively than anything the open source community could have done. At a
stroke, Microsoft had anointed GNU/Linux as an official rival. Moreover,
through its inept initial handling of the resulting furor, it hammered
home this message for over three months, just in case anybody missed
the first announcement in April.
The third myth that Microsoft seeks to demolish is that “Linux is free.”
It rightly points out that “the cost of the operating system is only a small
percentage of the overall total cost of ownership (TCO). In general Win-
dows NT has proven to have a lower cost of ownership than UNIX.” But
its statement that “there is no reason to believe that Linux is significantly
different than other versions of UNIX when it comes to TCO” conve-
niently forgets that Unix’s high costs were largely caused by the frag-
mented nature of the sector, which in turn led to higher software and
training costs.
The next point addresses security. Microsoft asserts that the “Linux se-
curity model is weak.” But many believe it is the Microsoft security
model that is fundamentally flawed. One such person is Bruce Schneier,
author of the classic book Applied Cryptography, and widely respected as
an independent authority on computer security. In the September 1999
issue of his free monthly newsletter Crypto-Gram, Schneier examined
the relative security of closed and open source software. He wrote, “As a
cryptography and computer security expert, I have never understood the
current fuss about the open source software movement. In the cryptogra-
phy world, we consider open source necessary for good security; we have
for decades.” Schneier concluded, “Comparing the security of Linux
with that of Microsoft Windows is not very instructive. Microsoft has
done such a terrible job with security that it is not really a fair compari-
son.”
Microsoft’s last point in the Linux Myths document states that “Linux
as a desktop operating system makes no sense. A user would end up with
a system that has fewer applications, is more complex to use and man-
age, and is less intuitive.” The comment about GNU/Linux’s being less
intuitive has been overtaken to a large extent by enormous progress in
the KDE and GNOME projects, which offer most of the benefits of Mi-
crosoft Windows. However, GNU/Linux support for PC hardware tech-
nologies such as Plug and Play (allowing automatic configuration of
devices), PCMCIA (credit-card-sized peripherals commonly used with
notebooks) and USB (another easy way of adding hardware) has un-
doubtedly lagged behind Windows, which does indeed make the system
more difficult to use and manage.
There is also much truth in the comment about software availability.
Tens of thousands of applications run under Windows 95/98 and NT;
GNU/Linux cannot yet come close. But things are changing rapidly, as
the growing flood of announcements on sites such as Freshmeat indi-
cates. Moreover, the picture could change almost at a stroke if a long-
running open source project called Wine comes to fruition.
Michael Tiemann, who became Red Hat’s CTO after it acquired Cygnus,
gave a course on the GNU development tools that Cygnus had ported to
the IA-64, at a conference for developers on the new platform. “Mi-
crosoft was nowhere to be seen at the conference,” he recalls. “And I
asked, did they actually manage to demo Windows 2000 booting on IA-
64? And what people told me was they got up on stage on the first day of
the conference, they showed it booting, they shut it down, and their ma-
chine went back into cold storage.
“However, there was a third-party solutions booth where NEC and SGI
and IBM and Compaq and everybody else [were] demonstrating IA-64.
Those machines were running all day long, all throughout the confer-
ence, and they were all running Linux.” The first publicly available
GNU/Linux distribution for IA-64 was delivered by TurboLinux on 20
March 2000, just six weeks after the source code of the Linux port was
released to developers—a considerable achievement.
Microsoft doesn’t talk much about GNU/Linux’s lead here. Instead, it
prefers to characterize open source as “kind of chasing taillights” as Jim
Ewel puts it. There is some justice in this because, as Augustin had
noted, GNU/Linux developers were denied timely access to proprietary
hardware specifications.
Now, though, Intel had not only provided details about the Itanium
chip to the GNU/Linux community, it had even placed what were once
tightly guarded trade secrets about the IA-64 architecture online for all to
see—a stunning conversion to open source ideas. Tiemann says, “IA-64
is the first time where we got an even start instead of being behind by six
months or a year.” And Augustin points out, “It’s the first time that we’ve
seen a major launch of a new processor where Linux was the default op-
erating system. That to me is just a tremendous milestone, it moves
Linux from being the number two player to being the number one
player.”
It is not only in the new market defined by the Intel Itanium chip that
GNU/Linux is reshaping the computing landscape. Parallel to the move
to 64-bit enterprise systems, practically every sector of the industry is
reinventing itself as a result of GNU/Linux and open source. And this
was revealed nowhere more dramatically or visibly than in the extraordi-
nary announcements made by IBM at the beginning of 2000.
17
Tomorrow’s Hothouse
all its hardware, it immediately unites all its systems into a single family
and offers customers perfect scalability.
When IBM has delivered on its promise to make all its server platforms
“Linux friendly,” the same piece of software that is created on a desktop
PC, tested on an RS/6000 system, and then run on an AS/400 minicom-
puter can, as the company grows, be moved onto an S/390 without
changing a line of the code. “We’ve been so trapped into the paradigm
that says once you develop the application on a platform, you are
trapped,” Wladawsky-Berger says. “And porting from that platform to
some other platform is a custom effort that takes a lot of manpower. All
of a sudden, that can start becoming something from the past.”
He emphasizes again that, although radical, this approach is not un-
precedented.
Before TCP/IP came along, the notion of connecting different systems was
considered a major custom job. You’d hire very specialized programmers
because you usually had different systems that talked different networking
languages and then they had to very carefully handcraft a way for the sys-
tems to work together. Once TCP/IP became a de facto standard for any
system that wanted to be part of the Internet, then all of a sudden every
system supported TCP/IP and it no longer was an issue how do you con-
nect to the network; you just connect.
Now what’s happening with Linux is this cultural shift that says that if
you can agree on standards for applications then you can develop an appli-
cation and then separately choose your deployment platform depending on
what works best. The whole notion of separating application development
from the underlying deployment platform has been a Holy Grail of the in-
dustry because it would all of the sudden unshackle the application devel-
opers from worrying about all that plumbing. I think with Linux we now
have the best opportunity to do that. The fact that it’s not owned by any
one company, and that it’s open source, is a huge part of what enables us to
do that. If the answer had been, well, IBM has invented a new operating
system, let’s get everybody in the world to adopt it, you can imagine how
far that would go with our competitors.
IBM’s plans represent as big a boost to the open source movement as its
initial support for Apache. They will also have enormous repercussions,
even for companies that might not share Big Blue’s sudden enthusiasm
for GNU/Linux. For example, one small but important element of IBM’s
strategy is to add GNU/Linux compatibility interfaces to its variant of
Unix, AIX; this will allow users to employ programs written for
tivity suite, which it acquired in August 1999, under the GNU GPL in
October 2000, and the adoption of GNOME 2.0 as the default desktop
for Solaris.
Although Sun’s relationship with GNU/Linux and open source may have
been bumpy, as a manufacturer of hardware it has been able to find alter-
native models of revenue generation even while giving away its entry-level
Solaris 8 operating system. Other companies are not so fortunate.
A case in point is Berkeley Software Design, Inc. (BSDI), formed in
1991 by many of the key developers of the original Berkeley Unix. On 14
December 1998, BSDI announced that it would be adding Linux applica-
tion support to its version of Unix, called BSD/OS; but in addition to pro-
viding their customers with a means to run GNU/Linux applications, the
move also lessens the incentive to port to BSD/OS, its strengths notwith-
standing. The announcement on 10 March 2000 that BSDI would merge
with Walnut Creek CDROM, the distributor and sponsor of the popular
FreeBSD operating system (which also derives from Berkeley Unix) is ev-
idence that BSD/OS is feeling the effect of GNU/Linux. This move allows
BSD/OS development to tap into the enthusiasm and creativity of the
skilled and dedicated FreeBSD hackers, and provides more formal sup-
port for FreeBSD.
The situation for the other major software-only Unix vendor, SCO,
was more problematic—and ironic. In the early 1980s, SCO excelled as a
company in almost the same space as GNU/Linux would in the early
1990s: that of Unix running on Intel processors. SCO’s achievements
were crowned in 1995 when it purchased UnixWare from Novell, which
had, in turn, bought it from AT&T in 1993. In acquiring the rights to
Thompson and Ritchie’s original Unix work of 1969, SCO became, in ef-
fect, the keeper of the sacred Unix flame. But hackers have no piety and
little pity when it comes to technology; for them, SCO’s products were
old proprietary versions of something that would reach its culmination
in GNU/Linux.
SCO tried to fight back. It had been providing “students, educators
and Unix system enthusiasts” with free copies of its latest software for
some years, but in binary form only. Then, on 12 May 1998, it offered
low-cost licenses for an “ancient” release of Unix, including the source
code. But despite these moves, GNU/Linux continued its march into the
hearts of hackers and onto the desktops of companies. Along with the
other vendors of proprietary Unixes, SCO bowed to the inevitable, and
on 2 March 1999 “affirmed its long-standing support for the Linux and
Open Source movements by adding Linux application support to its
award-winning UnixWare 7 platform.”
thanked various people who had helped in this epic endeavor, but con-
cluded with “no thanks” to “Steve Jobs—for refusing to provide any
[Macintosh] documentation.”
Given this starting point, and its subsequent tepid enthusiasm for ef-
forts to port GNU/Linux to later Macintosh hardware, Apple’s announce-
ment on 16 March 1999 that it was releasing what it called Darwin, the
foundation of its new Mac OS X Server operating system, as open source
was an important step. With elements taken from Berkeley Unix and mi-
crokernel technologies of the kind Andrew Tanenbaum (the creator of
Minix) favored, Darwin is a close relative of GNU/Linux.
Although Apple’s move, and its subsequent release of other code as
open source, may well help “Apple and their customers to benefit from
the inventive energy and enthusiasm of a huge community of program-
mers,” as the original March press release hoped, it fails to address a
more fundamental problem for the company.
This had already been raised in a prescient essay titled “A Modest Pro-
posal,” written by Don Yacktman in 1998. He noted, “There is a finite
amount of ‘serious’ development happening at any given time. Of that
development, very little happens for non-Windows operating systems.
Apple is competing with Linux to obtain third-party support. As Linux
overtakes Mac OS, it is likely to absorb the non-Windows development
pool. If third party support for Mac OS were to dwindle, then Apple’s fu-
ture could be in serious jeopardy,” whether or not the company uses
open source development techniques for its operating system. Fewer
new applications means fewer users. SGI understood this, which is why
it switched emphasis from Irix to GNU/Linux. As Yacktman correctly
pointed out, the same could happen to Apple if GNU/Linux becomes vi-
able as a desktop system.
The foundations for this viability have been laid with the creation of
the KDE and GNOME environments. One of the first software vendors
to take advantage of this was the Canadian company Corel, which ported
its well-known Windows programs using an innovative approach based
on technology from the open source Wine project. Corel’s founder,
Michael Cowpland, has described GNU/Linux as “the lever that has fi-
nally pried open the single-vendor situation,” breaking Microsoft’s stran-
glehold on the desktop. But that “single-vendor situation” does not
obtain in plenty of sectors in the computer market, where the prospects
for GNU/Linux look particularly rosy.
For example, GNU/Linux is already flourishing in the server appliance
area where the idea is to create a low-cost system with minimal configu-
with its own solution, called eCos, which it released in September 1998
as open source. Although eCos was not based on Linux, Cygnus’s next
project in the embedded space, EL/IX, was, but with a twist. In Septem-
ber 1999, the company proposed not just another embedded
GNU/Linux, of which there were by now many, but a way for competing
products to preserve compatibility among themselves. The press release
called EL/IX “a major step to pre-empt the fragmentation of embedded
Linux.”
This sounds fine in principle, but some, like Lineo’s Sparks, have their
doubts. “We think it’s nonsense,” he says. “There’s no technology there at
all; it’s just marketing slicks.” Despite his disdain for EL/IX, Sparks has
plenty of respect for Cygnus/Red Hat in the embedded space. At the very
least, he says, “we see them as a competitor in the things that they could
do.”
The danger of fragmentation within the embedded space is probably
less than it is for the server or desktop sectors. By their very nature, em-
bedded devices address certain needs and usually run custom software.
As a result, detailed compatibility between the many platforms that exist
is less important than for servers, say.
There is another interesting consequence of this inherently balkanized
nature of the embedded sector. The Linux kernel has flourished and
grown because so many hackers have been able to try out the code, find
bugs, and make suggestions. Because embedded systems are specialized,
it is unlikely that many hackers will be playing around with code written
for them; this means that the classic open source development method-
ology has little to offer in the embedded space, or in any other domain
where few people look at the code, namely, specialized niche markets.
Their differences in philosophy notwithstanding, Bryan Sparks and
Michael Tiemann, cofounder of Cygnus and now CTO of Red Hat, agree
on one thing: the huge potential of the area. Tiemann says, “I believe that
the real name of the game is going to be Internet appliances. The PC
form factor is going to go the way of the universal motor.” Sparks adds,
“We believe that the next wave of Linux is embedded, and we think that
the next wave in embedded is Linux.”
Some big names agree. In May 2000, Lineo received $37 million from
Mitsubishi, Motorola, and Samsung; in September Motorola invested a
further $22.5 million, while Samsung and Lineo formed an embedded
systems joint development in Korea. Earlier in 2000, Lynx Real-Time
Systems, another embedded systems company, received investments
from Intel, Motorola, and TurboLinux. Lynx launched its embedded
Linux product BlueCat in February 2000, and in May changed its name
Let’s think about the Net for a change as a collection of pipes and switches,
rather than thinking of it as a thing or a space.
There’s a lot of data moving through those pipes, and the switches deter-
mine who gets which data and how much they have to pay for it down-
stream. And of course those switches are by and large what we think of as
digital computers.
The basic media company theory at the opening of the twenty-first cen-
tury is to create a leak-proof pipe all the way from production studio to
eyeball and eardrum. The switch that most threatens that pipe is the one at
the end. If the switch closest to your eyeball and your eardrum is under
your complete technical control, the whole rest of the aqueduct can be as
leak-proof as they like, and it won’t do them any good. And the switch is
under your control, of course, if the software is free software.
That these are not simply the interesting but theoretical musings of an
academic who has spent too much time playing with GNU software and
reading the GNU GPL has been demonstrated by the case of two open
source programs that circumvent the Content Scramble System (CSS)
used to block unauthorized viewing of DVDs.
Because the GNU/Linux platform was not a supported platform, it was
not possible to use that system to watch DVDs. In the usual way, hackers
started LiViD, the Linux Video and DVD Project, to craft some code so
they could view DVDs on their computers. Even though it has no evi-
dence, the film industry sees the software that has grown out of the pro-
ject as an attempt to subvert the encryption system for the purpose of
piracy; hackers regard it as freedom to watch their DVDs under
GNU/Linux.
This case is particularly interesting because DVD software for both
GNU/Linux and Windows (called css-auth and DeCSS, respectively) has
been released under the GNU GPL. As such it can—and has—been
copied all over the world; this puts the film industry in a difficult posi-
tion. As Moglen notes, “Now they have to go to some court somewhere
and say, ‘Give us an injunction to make everybody in the whole world
stop talking about this.’ And that’s not going to work, because courts are
going . . . to see that this is an impossibility. They’re going to recognize
that there’s no factual way to accomplish it, and that there’s something
repugnant about being asked to do what’s impossible.”
Although the U.S. film industry appears to fear this software, its poten-
tial impact on the intellectual property landscape pales into insignifi-
cance beside two other open source projects. Called Gnutella—a
combination of “GNU” and “Nutella,” an Italian chocolate and hazelnut
spread—and Freenet, these programs create a means for locating files
that may be distributed anywhere across a collection of computers linked
by the Net and running the Gnutella or Freenet software.
The key feature of both is that no central directory lists where those
files are held, unlike the more famous Napster file-sharing system. In-
stead, requests for an item are passed on from computer to computer; if
the item is found, it is then sent to the originator of the request. The lack
of a central node means that it is extremely hard to track down where
the file is held on the Gnutella network; Freenet is specifically designed
to make this impossible and also preserves the anonymity of the origina-
tor of the request.
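In miniature, this request-forwarding scheme works something like the following sketch. It is a toy illustration only, not the actual Gnutella or Freenet protocol, which runs over network sockets and adds time-to-live headers, caching, and anonymity measures; all names here are invented:

```python
# Toy model of decentralized search: a query is passed from peer to
# peer; whichever node holds the file answers, and no central
# directory is ever consulted.
class Peer:
    def __init__(self, name, files=()):
        self.name = name
        self.files = set(files)
        self.neighbors = []          # direct links to other peers

    def search(self, filename, ttl=5, seen=None):
        """Forward a request hop by hop; return the holder's name."""
        seen = seen if seen is not None else set()
        if self.name in seen or ttl == 0:
            return None              # already visited, or hop limit reached
        seen.add(self.name)
        if filename in self.files:
            return self.name         # found: the answer travels back to the requester
        for peer in self.neighbors:
            holder = peer.search(filename, ttl - 1, seen)
            if holder:
                return holder
        return None

# A tiny four-node chain a - b - c - d, with the file held on d.
a, b, c, d = Peer("a"), Peer("b"), Peer("c"), Peer("d", ["song.ogg"])
a.neighbors, b.neighbors, c.neighbors = [b], [c], [d]
print(a.search("song.ogg"))   # prints: d
```

Because each peer knows only its immediate neighbors, no single node holds a map of what is stored where; a request simply ripples outward until it finds what it is looking for or its hop limit expires.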
The way these new programs nullify attempts to police copyright in-
fringement is particularly interesting. Both Gnutella and Freenet overlay
a kind of virtual network on top of the current Internet; this severs all
links between files and their physical locations. Content is just “on” the
network; “where” is irrelevant to the user. It seems appropriate that the
distributed development methodology known as open source, born of
the Internet and largely powering it, should also lead to its reinvention in
an even purer, more powerful form.
18
Here, then, was a critical test of what happens when commerce col-
lides with the community development process. Cygnus’s solution was
fairly extreme: It decided to fork the GCC code. With its new EGCS (Ex-
perimental GNU Compiler System, pronounced “eggs”), it could incor-
porate all the changes that it needed for its products. But, in effect, it was
hijacking the GCC standard. “We tried to do it in a way that could be
reconciled, but we were forceful in our leadership,” Tiemann says.
“What was very gratifying was that basically the entire Net community
was behind us.”
This last point is crucial. Had Cygnus simply created EGCS to satisfy
its own commercial requirements, it would have become a fork divorced
from the main GCC user community. This would have been perfectly
valid and legal in itself, but it would have meant that Cygnus would have
lost all the benefits of the open source development methodology that
provided much of its strength. By creating a fork that had large-scale user
support, it put pressure on the main branch—controlled ultimately by
Stallman—to reach an accommodation (which was why Cygnus tried to
“do it in a way that could be reconciled,” as Tiemann puts it).
Tiemann explains what happened after the decision to fork. “While
EGCS was alive,” he says, “Stallman was basically trying to evaluate what
would be the consequences of endorsing this Cygnus-sponsored fork. At
the end of the day, I think the integrity of the people who are currently
working on the GNU compiler [at Cygnus] was so unimpeachable that
he was able to make the very risky decision and say, ‘Wow, I’m going to
put all of the eggs in this basket which is controlled by a commercial
company.’”
That is, Stallman healed the fork by using the Cygnus version as the
basis for future developments of the official GCC, renamed GNU Com-
piler Collection (from GNU C Compiler) to reflect the expanded goals
of the project. This was obviously a courageous step, but not entirely a
leap in the dark. As Tiemann correctly emphasizes, the EGCS fork was
not controlled so much by Cygnus as by its engineers. On the basis of
their past work—and their “integrity” in the context of GCC develop-
ment—Stallman was confident that they would continue to develop
GCC for the good of the community, while also addressing Cygnus’s
concerns.
This suggests one way that a company such as Red Hat, say, might as-
sume a more active role in determining the future course of GNU/Linux
without hijacking it in any way. Just as Stallman—hardly a man to com-
promise—was willing to place the development of GCC in the hands of
its key developers, and to depend upon their “integrity” to ensure its in-
dependence, so Linus might not be too concerned to see Red Hat emerge
as the dominant force in the Linux world. Alan Cox and Dave Miller are
employed by the company, and nobody would think of calling into ques-
tion their integrity. In a sense, their presence acts as a guarantee that Red
Hat will always be a responsible guardian of the Linux community’s in-
terests, just as Cygnus was for GCC.
The power of the users, realized through such top hackers as Cox and
Miller, who amount almost to the community’s representatives in the
commercial distribution world, is one major reason that Red Hat and
companies like it will never be able to impose their will on the free-software
movement. But there is another; this flows from the unique dynamics
of business based around open source software.
To understand why Red Hat will never become another Microsoft, con-
sider what a company has to do to go head-to-head with the Redmond
giant. Because the Windows operating system is proprietary and closed,
producing a good clone of it is a mammoth undertaking. A perfect clone
is probably impossible. As a result, Microsoft will always have the advan-
tage of current compatibility and future innovation.
In the world of GNU/Linux, the situation is different. Because, to its
credit, Red Hat releases all its work under the GNU GPL, anybody can take
it and use it. This means that where the barriers to entry for competitors to
Microsoft Windows are so high they are effectively insurmountable, they
are more or less nonexistent for rivals to Red Hat. It is noteworthy that this
dynamic allows not only current competitors of Red Hat to catch up with
it technically very quickly but new entrants, too. The rise of MandrakeSoft
shows that this is no mere theoretical possibility.
In 1995, Gaël Duval, a 22-year-old Frenchman from Caen, Normandy,
who was studying computer science, was looking for a Unix to put on
his 386 PC—a familiar enough story. “I searched through the Internet
and naturally found Linux, [which] I grabbed on fifty diskettes,” he re-
calls. As many before him had been, Duval was impressed with what he
saw.
Again, as with countless others before him, Duval tracked Linux’s
steady improvements, trying out Red Hat and Debian GNU/Linux alongside the
Slackware distribution he had downloaded first. He particularly liked
Red Hat’s distribution format and its installation procedure. Then, when
the KDE desktop was announced at the end of 1997, Duval started to fol-
low this project, too. “I’m a graphical operating systems lover,” he con-
fesses.
But now he had a problem. Because KDE used Trolltech’s Qt libraries
(which initially were not open source), Red Hat refused to ship it. But
Duval wanted to use KDE while keeping the Red Hat distribution he had
come to know. The solution was obvious for a hacker: He would just cre-
ate his own distribution that put together Red Hat and KDE.
Following the numbering of the Red Hat version it was based on, Du-
val called his distribution Linux-Mandrake 5.1. In July 1998, Duval
placed this on an FTP server for others to download—as Linus had done
with his original Linux code. Just as Linus had been encouraged by the
feedback he received to his early kernel, so Duval was spurred on by the
response to his home-brew distribution. It was “incredible,” he says. “I
immediately got hundreds of e-mails [from] people who enjoyed the
project and the product. I also got first patches and ideas.” He was even
more struck by two e-mails from the United States and one from Aus-
tralia. “The most incredible,” he says, “were three companies who an-
nounced that they had started to sell GPL CDs of Mandrake.”
This opened his eyes to a possibility he had not considered. When he
created Linux-Mandrake, he had no thought of selling it. “I just ended
my studies at Uni,” he says, “I had no money and I was looking for a job
as Unix systems administrator or programmer.” But now, “because of the
demand, I started to sell some Mandrake CDs.” Duval teamed up with
two other Frenchmen, Jacques Le Marois and Frederic Bastok, and to-
gether they founded MandrakeSoft in December 1998, its headquarters
in Paris.
Although things began small—“we started with private fundings plus
auto financing by selling Mandrake powerpacks”—MandrakeSoft grew
rapidly. By the end of its first year it had fifty employees; and soon after,
one hundred. One of MandrakeSoft’s first coups was to sign a deal with
the U.S. publisher Macmillan to provide a retail distribution, launched in
June 1999. “Macmillan is present in many retail stores in the U.S.A. and
several European countries,” Duval says. “Macmillan was looking for an
alternative to Red Hat; they thought that Mandrake was the best choice
for them.” According to a press release dated 27 October 1999, Macmil-
lan’s Mandrake-based product “accounted for 52% of total Linux soft-
ware units sold at retail” in the United States during August 1999, less
than a year after Linux-Mandrake was put together.
Macmillan was not alone in singling out this young company and its
new distribution. Also in August 1999, Linux-Mandrake 6 won two Lin-
uxWorld Editors’ Choice Awards, for Product of the Year and Distribu-
tion/Server (and was runner-up in the Distribution/Client category). The
same month, the investment fund of AXA, “one of the world’s leading in-
surers and asset managers,” with nearly $655 billion of assets under
management, took an equity position in MandrakeSoft. Nor was that the
end of the investments: The total venture capital obtained in one year
was $15 million.
Like its longer-established rivals, MandrakeSoft has started hiring top
hackers, and even bought projects so that it can release them as open
source software. As a result, MandrakeSoft is emerging as a showcase ex-
ample of all the strengths of the open source model in commerce. The
GNU GPL allowed a more or less penniless computer student to create a
new distribution that has taken on the well-financed Red Hat, and to win
in many cases. It also means that others can take Mandrake’s product as
the starting point for yet more new distributions. “That happened several
times,” Duval says. He welcomes it: “It’s good for us because it makes us
still more known and spread”—just as Mandrake promotes the Red Hat
distribution.
These counterintuitive dynamics act as a brake on the dominance of
any one distribution. They also ensure that user needs are soon met. If
none of the existing distributions—there are currently more than one
hundred—serves a market well enough, it is an easy matter for someone,
with only the most basic of resources, to create that distribution and pos-
sibly rise to become the next MandrakeSoft, or even Red Hat.
MandrakeSoft chose to begin as a Red Hat clone, and still maintains
compatibility with Red Hat. That is, “when a software company releases
an application . . . targeted to Red Hat systems, it will install on Man-
drake,” Duval explains. However, this feature of Linux-Mandrake does
reveal a continuing issue for the GNU/Linux world: cross-distribution
incompatibilities.
These have to do with how software is selected and packaged, not with
forks in the programs themselves. For example, decisions about which
version of the kernel will be adopted, how many patches will be applied,
and where a library will be placed when it is installed can make all the
difference as to whether an application package will run or not on a par-
ticular distribution. Many software vendors have created four or more
slightly different versions of their products to cope with the tiny diver-
gences between distributions.
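To see how small these divergences can be, consider library placement alone. The sketch below, with hypothetical paths and an invented library name, shows why a package built against one distribution's layout may simply fail to find what it needs on another:

```python
# A toy check: does this system provide, in an expected place, the
# shared library an application was packaged against?
import os

# Typical locations - but distributions differ, which is the point.
SEARCH_PATHS = ["/lib", "/usr/lib", "/usr/local/lib"]

def find_library(soname, paths=SEARCH_PATHS):
    """Return the first directory containing `soname`, or None."""
    for directory in paths:
        if os.path.exists(os.path.join(directory, soname)):
            return directory
    return None
```

An installer that hard-codes one distribution's layout breaks on the next, even though no line of program code has forked.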
Recognizing the inefficiency of this approach, and the danger that
such differences might grow, Bruce Perens, author of the original Open
Source Definition and the second leader of the Debian project, created
the Linux Standard Base (LSB) in 1998. The main aim of the LSB is “to
be able to have an application move from one distribution to another
without being recompiled,” Perens says. Unfortunately, he adds, “it’s tak-
ing longer than I would have liked.”
would cast a huge chill over the world of open source because it depends
on the GNU GPL to regulate much of its operation.
So far, enforcement has not proved a problem. “About a dozen times a
year,” Moglen says, “somebody does something [that] violates the GPL.
Most of the time, they’re doing so inadvertently, they haven’t thought
through what the requirements are. And I call them up and I say, ‘Look,
you’re violating the GPL. What you need to do is do this. Would you
help us?’” The answer is invariably yes, he says.
“What is true,” Moglen admits, “is that no large American software
company has engaged in a public controversy with us over the enforce-
ability of the GPL.” And although some might conclude “that means . . .
there’s something about the GPL [that] is not enforceable, I would turn
that proposition around,” Moglen says. “There have been no such con-
troversies because nobody thinks they’re going to win them.”
But, he adds, “I think that sometime it’s probably going to become nec-
essary, in order to dispel a little FUD [Fear, Uncertainty and Doubt] on
these subjects, for us to choose to take the judicial enforcement route
with a case [that] we would otherwise feel comfortable working out in
our traditional way.” This would be to demonstrate, once and for all, the
validity of the GNU GPL and hence of the foundation of free software.
In one respect, the open source world is fortunate there are companies
with the resources needed to back a long-drawn-out legal tussle with
some well-financed violator. As Moglen says, “Our advantage from the
great elopement of Wall Street with anarchism” in 1999, when compa-
nies such as Red Hat and VA Linux enjoyed highly successful IPOs, is
“that . . . some money [is] lying around that can be used for these pur-
poses.”
But only “within limits,” he adds, and notes that “each one of those
enterprises looks the way it does because it has a zero cost of manufac-
ture. Each of their models depends upon the idea that this is a very inex-
pensive business to be in, because your product is manufactured for you
for nothing by other people.” Moglen’s observation touches on perhaps
the most serious question mark hanging over open source: Will the busi-
ness models based on it work in the long term?
One person who doubts it is Larry McVoy. “I believe that the next
problem that the open source community is going to face is how to make
a revenue stream appear and stick around,” he says. As the author of the
Sourceware proposal, which foresaw many of the features of today’s oper-
ating system landscape, his words carry a particular weight.
“The problem with open source,” he explains, “is that the things that
meet the Open Source Definition are all focused on rights for the end
users, to the detriment of any rights for the vendor.” The end result,
McVoy believes, is that companies based around open source software
can never equal Netscape, Sun, and Microsoft as industry behemoths.
McVoy also thinks such companies are laboring under a fundamental
misapprehension about the open source development model. “It’s easy to
do the easy work,” he says. “You can have a weekend hack attack, or
even a month-long hack attack and do something significant and get a
lot of praise from the community.” But he points out that after the easy
work comes the grind.
McVoy doesn’t even allow that unpaid hackers will be useful at finding
bugs in code written by paid professionals. If “college students sitting in
their dorm room on your stuff that you’ve spent fifteen man-years work-
ing on, spend a week, or a weekend, or a day, or an hour working on it,
they can’t possibly grasp how it all fits together. There are companies
who are betting that part of their development costs will be gone because
of the open source model,” McVoy says. “And I think they’re betting on a
failing model.”
McVoy’s litany of open source woes continues. He believes not only
that such companies are doomed but that in the process of failing they
will take swathes of the existing software industry with them. As the up-
take of GNU/Linux and other open source programs increases, it will in-
evitably drive out solutions from other vendors—particularly in the
Unix arena. McVoy contends, however, that these vendors paid for what
we now call open source.
“The preceding [software] ecosystem had a built-in pad,” he explains.
“It charged more money than what the stuff you were getting was worth.
The extra money paid for having engineers sitting around at Sun labs,
and HP labs, and the rest of these places. The open source in the world
didn’t happen [thanks to] volunteers,” he insists. “It happened [thanks
to] engineers at places like Sun.
“This is a discussion that [Red Hat’s] Bob Young and I have all the
time,” McVoy continues, “where he’s worried as hell about this. Because,
basically, Red Hat is well on its way to putting [Sun’s software division]
out of business. Well, that’s kind of cool for Red Hat, except for one
problem. Red Hat doesn’t generate enough revenues to pay for the soft-
ware that they get—and they actually get some of that software from
Sun. This is known in the industry as killing your supplier. It’s not a
good idea.” McVoy’s reasoning raises critical questions about the viability
of the companies based around free software. If he’s right, the new open
source companies whose flames have burned so brightly in recent years
are destined to gutter into oblivion and, in the process, starve the fires of
other companies that depend on Unix.
McVoy’s analysis appears to underestimate the significant contribution
students make to free software, whether in their spare time (the GIMP
and Gtk), or through course projects (the LyX document processor)—
or as both, like Linux, which began as an idle bedroom hack and
turned into material for a master’s thesis. But even if students’ contributions
are relatively small, and if open source companies really do drive
some traditional software vendors out of business before following them
into oblivion, it still may not matter.
As the previous chapter described, the open source revolution has
moved on from the pioneers. Today, mainstream companies—IBM, HP,
Compaq and SGI—have all taken up open source in various ways. They
depend critically neither on Unix, as Sun does, nor on open source, as
Red Hat and the other distributions do. Instead, they use both as ele-
ments of a broader strategy; for example, selling hardware and services.
Against this background, then, whether or not the new open source
companies succeed is in some senses a secondary concern, although not
for them and their shareholders. Development of GNU/Linux and open
source within other companies will almost certainly continue, anyway.
McVoy’s own analysis explains why. The “built-in pad” he refers to
means that traditional computer companies “can afford to have some
percentage of their engineers working on stuff that they don’t understand
and they don’t agree with,” he notes. The reason this goes on is because
“the engineers [are] passionate enough about it that they say, ‘Look, I’m
going to do this or I quit.’ And [the companies] say, ‘OK, you can do it.’”
Given that at least some of these companies will thrive as businesses—
however great the impact of open source, it is not possible that the entire
software business will disappear—those passionate engineers will still be
working in their midst. The fruits of their passion will supplement the
work of new generations of students in educational establishments, or of
enthusiastic hobbyists programming at home, to say nothing of such
dedicated hackers as Richard Stallman and the FSF who do not code to
live, but live to code. This, ultimately, is the reason free software does not
depend on open source companies for its own survival: It exists beyond
the market.
The central issue, then, is not whether open source companies can
flourish and blossom into multibillion dollar concerns but whether
free software can continue to grow and progress as it has for the last
decade and a half. Another scaling challenge presents itself: How can
dence that IBM has chosen to set up a GNU/Linux center in India as part
of its move to embrace open source. “One of the appeals of Linux is that
not only is there a hotbed in the U.S.,” says IBM’s Wladawsky-Berger,
“but that there are even bigger hotbeds in Europe, India, and China. You
can imagine there are lots and lots of wonderful Indian engineers.”
There are likely to be even more such engineers if an Indian open
source project called the Simputer—from SIMPle compUTER—suc-
ceeds. As a press release of 26 May 2000 explains, the Simputer is a “mo-
bile computing device for the common man . . . based entirely on free
software,” principally GNU/Linux, Perl, and Tk. It is designed as a
“shared computing device for a local community of users” and to “make
available the benefits of Information Technology to the rural masses” in
India. The unit’s low price—around $200—is crucially important if it is
to be widely deployed to these users, and that price would be impossible
to achieve without open source software.
Other countries, too, are likely to become major contributors to the
free-software world—Mexico, for example, thanks to a major distance
learning project that is already underway there. Details about it first ap-
peared in October 1998, when Arturo Espinosa Aldama posted to a mail-
ing list for GNOME—a project to which he has close personal links. As
Miguel de Icaza explains, “He was my roommate in Mexico,” and in Au-
gust 2000, Espinosa joined de Icaza’s company Helix Code.
Espinosa wrote in 1998:
I work as the project leader of the “Scholar Net,” a program that aims to
bring computers and the net to every elementary and mid-level school in
Mexico. We expect to install from 20 to 35 thousand labs per year to a total
of 140,000 centers in the next five years.
At the time, this message led to wild assertions in some quarters that
140,000 Mexican schools would soon be using GNU/Linux.
But Espinosa emphasizes, “Our Linux solution is just an option. It is not
something that is being installed by decree. What happens is that every
state in the country decides by itself what’s going to be its technological so-
lution” for implementing Scholar Net. So far, he says, six out of thirty-two
Mexican states have shown an interest in adopting his GNU/Linux ap-
proach. “Realistically, I think that in a year we will have something near
five states,” he says. “So that’s around maybe 1,500 schools installed.”
Although far fewer than the 140,000 touted by some, this number is
significant and would represent an important test-bed for the use of
GNU/Linux in schools. Moreover, if even a small percentage of the chil-
dren involved became free-software hackers, this could have a major im-
pact on GNU/Linux. As de Icaza says, this and similar projects mean that
“there are going to be more people who do not depend on proprietary
technologies, who know how to use the free software alternatives, and
know how to develop the free-software alternatives.”
In a sense, though, such schemes to introduce GNU/Linux-based sys-
tems into schools are inefficient as a way of fostering new coders to drive
free-software development; they place the onus on the individual—the
child or young person—to get to know more about the underlying soft-
ware, to learn programming, and to start hacking. Alongside such wor-
thy undertakings, a more formal approach to teaching programming that
would naturally lead people to contribute to open source projects is
needed. This is one of the aims of a project called Computer Program-
ming For Everybody (CP4E), the brainchild of another key figure in the
world of open source.
Guido van Rossum was born in 1956, in the Netherlands. In the late
1980s, while living in Amsterdam, he worked on a branch of the Amoeba
project, the main research interest of Andrew Tanenbaum, the creator of
Minix, who is a professor at the Free University in the city. “I remember
Tanenbaum asking me once, ‘Are you a researcher or a developer?’” van
Rossum says. “And I still don’t know the answer. I guess I’m a bit of both.”
While he was working on the Amoeba operating system, van Rossum
came up with the idea for a new language. “One of the things we were
trying to do was to make [Amoeba] a sufficiently usable system that we
could run it on our workstations and use it for our day-to-day use,” he
says. “We realized that one of things that made it difficult was that we
didn’t have a good scripting language”—a method for creating programs
quickly and easily. And so van Rossum wrote one, which he called
Python after the BBC TV comedy series Monty Python’s Flying Circus. “It
was one of my favorite TV shows at the time,” he explains.
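The phrase “quickly and easily” is best shown rather than told. Here is the kind of small job a scripting language makes almost effortless (a modern, hypothetical example, not van Rossum’s own code):

```python
# Tally the words in a file and report the most common ones - a
# typical small "scripting" chore, done in a dozen lines.
import collections
import sys

def word_counts(text, top=3):
    """Return the `top` most frequent words in `text`."""
    return collections.Counter(text.lower().split()).most_common(top)

if __name__ == "__main__":
    with open(sys.argv[1]) as f:
        for word, n in word_counts(f.read()):
            print(word, n)
```

The same chore in C would mean managing buffers, hash tables, and memory by hand; in a scripting language it collapses into a few declarative lines.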
As he later wrote in the foreword to Programming Python: “In Decem-
ber 1989, I was looking for a ‘hobby’ programming project that would
keep me occupied during the week around Christmas.” Note that he
called this a “hobby,” the term Linus employed to describe his kernel
when he announced it to the world; and that van Rossum wrote it over
Christmas 1989, just as Linus wrote the Virtual Memory code during
Christmas 1991, and for the same reasons: to stave off boredom.
Python was released in February 1991, and it soon enjoyed a devoted
following. Python is in many ways a rival to Perl—the leading scripting
language—but van Rossum emphasizes that no animosity exists between
him and Perl’s creator, Larry Wall. “I think Larry is a great guy, and Perl
Most of the main open source projects have already tackled this issue.
Apache’s core group allows new members to be added as others drop
out. Larry Wall has stepped back from the day-to-day running of Perl so
that others can shoulder the burden in turn. Projects such as Sendmail
have companies with teams of paid engineers that can move the stan-
dards forward while preserving their openness. But questions of succes-
sion remain unanswered for the two key free-software projects: Linux
and GNU.
Because Linux has a high profile and more and more companies are
betting their futures on its continued development, the question of what
happens after Linus has been the more discussed of the two. Opinions
within the open source world differ. Larry McVoy, for example, believes
that “Linux is held together by Linus, and he is a unique human being.
The way I always put it is, great programmer, great [software] architect,
nice guy—pick any two: You can’t have all three in one human being”—
except in Linus.
Stephen Tweedie is more sanguine. “There are so many other people
involved,” he says, “who have the ability to take on increasing parts of
the workload, that I really can’t see there being a long-term problem.”
Another senior hacker, Ted Ts’o, agrees: “There are a number of people
who can do the job. Someone would have to get designated as the suc-
cessor, or we might decide to use a different model—such as perhaps
something like what Apache, or Perl, development communities use—
it’s not clear what we would do.” And Ts’o emphasizes, “A lot of it has to
do with enough people in the developer community being mature.”
Fortunately, the Linux community seems to have both strength-in-
depth and the requisite maturity to cope with any situation that might
arise. For example, Alan Cox and Dave Miller are leading candidates to
take over should this be necessary. Alan Cox is already running the sta-
ble branch of the code, leaving Linus to concentrate on the development
version. Although Alan Cox is an obvious choice, he believes that “there
are several people who could take over.” He adds, “I’m told Linus has di-
rections for the event filed away somewhere. I’ve never seen the contents
and hopefully will never need to.”
Dave Miller, too, has been handling an alternative kernel source tree
that is held on the vger machine: “That’s a huge tranche of development
which is now being carried out entirely independently of Linus,” as
Tweedie points out. Nonetheless, even someone as gifted and ambitious
as Miller, who almost took the momentous step of forking the Linux
code during the 1998 “Linus does not scale” incident, recognizes that
the time is not yet right for him to contemplate taking over. “I don’t
think I’m personally 100 percent mature enough to do it right at this mo-
ment,” he says. “If I had to do it, I would. Maybe a year or two from now
I could keep a grip on such a large project and make intelligent and well
thought-out decisions. But I hope I never have to.” And he adds, with
knowing understatement, “Linus is pretty good at the task.”
The two positions—that Linus is unique and that he is replaceable—
are not, ultimately, irreconcilable. Linus’s greatest achievement is not his
software: One day, Linux will be superseded. As Alan Cox says, “There
will eventually come a point where someone is trying to do something
[that] requires you fundamentally break Linux. So you will end up with
something [that] replaces Linux, built just using Linux as kit of useful
parts to get you going.”
Linus’s enduring contribution, rather, lies in his perfecting of the de-
velopment model that he inherited in slightly different forms from the
hacker cultures of Berkeley, the GNU project, and Minix. Linus opened
the process to be inclusive: Anyone can download the latest kernel, any-
one can send in patches. Linus added the vital element of rapid review—
sometimes producing updated kernels on an almost daily basis—that
took the open source process to the next level. Linus allowed his lieu-
tenants progressively to take over parts of the kernel, thus ensuring its
greater scalability and long-term viability.
Linus is unique because he was able to serve as a focal point for all
these advances to come together to create a complete methodology that
is now central to the continuing success of the open source movement
and that offers the first plausible alternative to the current—and creak-
ing—model of software development. But Linus is also replaceable be-
cause of this methodology, which allows programming and architectural
decisions to be delegated to specialized circles of experts; and thanks to
this methodology, even his leadership style—that of power wielded in
subservience to the user base—can be distributed more widely.
This devolution of power has been possible because Linus is a pragmatist
and the Linux development process is inherently consensus-driven. In
the purist wing, however, things are not so easy. Compromise is not an
option when it involves fundamental issues of principle, as selecting a
future leader of the GNU movement inevitably does. Given that there
can be no fuzziness in the GNU project’s goals, so leadership by commit-
tee—an inherently fuzzy approach—is hardly practical. What is needed
is someone able to command unqualified respect within the purist
hacker community not just for technical ability but also for an unswerv-
ing record of commitment to the ideals that stand at the heart of the
GNU GPL and the GNU project.
Afterword
The preceding chapters portray the state of the free software and open
source worlds in the period leading up to the end of 2000. They
therefore reflect the sense of excitement of that time, when it seemed that the
open source approach not only would change the way software was
written but might lead to the emergence of a significant new business sector
based around these free programs.
Since then, the dot-com bubble has burst, wiping out many of the
highly fragile New Economy start-ups and humbling even the bellwether
online names. Against this background, it would have been
extraordinary if open source companies, which had benefited as much as any from
the preceding enthusiasm, had emerged unscathed. Intimately entwined
with the now out-of-favor Internet culture, and predicated as they were
on novel business plans and unconventional software, such companies
looked doubly dubious to nervous investors. As a result, all of the
businesses based around GNU/Linux or other open source software have
seen their valuations slashed and additional funding evaporate, and
several have shut down completely.
Few have fallen further than the brightest star in the open source IPO
firmament, VA Linux. Not only was the company shunned by investors
because of its free software affiliation, but its sales collapsed as many of
its best customers—the Internet start-ups—were themselves wiped out
in the dot-com crash.
That same strength of open source can be observed in the wake of the
closure of Collab.net’s SourceXchange. As a notice on the site dated 6
April 2001 explained, SourceXchange “simply did not achieve the
volume of business necessary to maintain the site and evolve the offering to
meet the needs of sponsors and developers.” But the open source
software brought into being as a result of SourceXchange commissions is
still freely available.
This lack of dependence of open source code on the company that
creates it must have afforded the free software community a certain
consolation when a high-profile start-up employing many top hackers began to
have problems. In April 2000, Linuxcare announced the abrupt
departure of its chief executive officer as well as the deferral of its IPO. It laid
off workers in May 2000, but things improved with news of additional
investments from Dell, Motorola, and Sun in August.
The announcement, on 21 February 2001, of TurboLinux’s acquisition
of Linuxcare seemed to signal the creation of an important new force in
the world of open source, which was clearly in need of consolidation.
But the perplexing reversal of this fusion a few months later—even
though the two businesses had been operating increasingly as one—
indicates how even in the mainstream GNU/Linux distribution sector an
apparently well-funded and successful company like TurboLinux can
suddenly struggle to find a way forward.
The first signs that the situation was changing were redundancies in
May 2000; in June, cofounder Cliff Miller stepped aside as CEO. A filing
in October 2000—for an IPO that was later withdrawn—noted that as
part of a separation agreement Miller and his wife ceased to be officers of
the company in July 2000 and resigned in October 2000. More
redundancies followed at the time of the proposed Linuxcare merger, and yet
another CEO was appointed in July 2001.
SuSE seemed to be weathering the storm better than TurboLinux,
although there are certain similarities in its trajectory through the
increasingly difficult financial conditions facing companies selling GNU/Linux
distributions. Like TurboLinux, SuSE announced staff cuts, in February
2001, the result of what it called a “decision to realign its US business.”
And just as TurboLinux’s Cliff Miller had moved on, so cofounder
Roland Dyroff, CEO of SuSE for eight years, relinquished this post to
join the SuSE board in July 2001. With Dyroff’s elevation, the era of the
hacker bosses had ended.
The relative newcomer among the main GNU/Linux distributions,
MandrakeSoft, also acquired a new CEO, but in the form of the current
chairman, Jacques Le Marois, cofounder of the company. The surprise
departure of the previous CEO and other senior managers in April 2001
was the “result of a divergence of views regarding the company’s strategic
outlook,” according to Le Marois. More positively, MandrakeSoft entered
the Paris Euronext Marché Libre on 31 July 2001, raising over four million
Euros (about $4 million) through its initial offering—far less than
previous open source IPOs but a real achievement given the market conditions.
Caldera, too, has had mixed results. On 7 May 2001 it completed its
acquisition of SCO’s Unix operations, which allowed it to describe itself
as the “leader in unifying Unix with Linux for business.” Less than a
month later, the renamed Caldera International reported a second-quarter
loss of $11.7 million on revenues of just $1.6 million.
Despite the respect it gains for bringing Unix into the GNU/Linux
fold, Caldera’s sales are worryingly small. This is especially obvious
when they are compared to the sales of the sector’s undisputed leader,
Red Hat, which reported fourth-quarter revenue of $27 million in March
2001, up from $13.1 million the previous year. The latter company also
achieved a rough break-even for this period.
Red Hat’s relative stability amidst the growing troubles of its fellow
GNU/Linux distributors has allowed it to continue to branch out in new
fields. On 25 June 2001, Red Hat launched the Red Hat Database, an
open source product based on the existing free PostgreSQL software, a
descendant of the early Ingres database project that Sendmail’s Eric
Allman worked on.
Red Hat is not the only free software company active here. A
well-established open source database solution, which is being marketed with
increasing vigor, is MySQL, begun by Michael “Monty” Widenius in
1995. Although MySQL had been available earlier under a liberal license,
the adoption of the GNU GPL in June 2000 brought it firmly into the
free software mainstream.
Nor are databases the only new area where open source is increasingly
visible. Now that Apache is indisputably the leading Web server—its
market share hovers around the 60 percent level according to Netcraft—
companies are starting to explore complementary open source tools in
preference to proprietary solutions.
For example, Microsoft’s Active Server Pages (ASP) technology allows
Web pages to be created on the fly by populating templates with dynamic
content, but a powerful alternative is the open source PHP. Created by
Rasmus Lerdorf in 1995, and originally an acronym for Personal Home
Page because it was designed for his personal Web page, PHP now stands
recursively for PHP: Hypertext Preprocessor. Commercial support for
PHP is available from the company Zend.
tant for raising the profile of the open source sector than for
undermining the foundations of the entire computer industry, as some suggested at
the time.
IBM’s massive investment also tends to confirm the thesis of the
preceding chapter that free software will thrive somewhere, whatever
happens to the pioneer open source companies. Indeed, the indications are
that despite the dramatic downturn in the fortunes of all open source
companies, and the concomitant precarious employment of many of the
hacker stars, the underlying development of free software has continued
exactly as before.
The best demonstration of how open source has stayed aloof from the
surrounding financial turmoil was provided by the appearance of the
long-awaited version 2.4 of the Linux kernel on 4 January 2001. Its
accompanying message showed that Linus, for one, has remained
refreshingly untouched by the doom and gloom of the dot-com world:
In fact, 2.4 had been expected for over a year rather than just “too
many months.” But as Linus’s message indicates, the delay was a
consequence of the open source development model—people testing it “over
and over again”—not of the economic situation. As a result, in addition
to offering significant advances over 2.2, particularly for enterprise
users, version 2.4 proved to contain very few major bugs.
Alongside this sometimes slow but always steady progress, open
source has continued to take on new challenges. None promises to be
more important than an audacious project to match Microsoft’s own
boldest move for years: the .Net initiative.
INDEX
ACC Bookstore, 102, 103, 224–226 Augustin, Larry, 101–102, 167, 170,
ActiveX, 262 250–251, 287–288
Adabas D database, 100
Ajuba Solutions, 246 Baker, Mitchell, 196–197
Allison, Jeremy, 277, 293 Baltazar, Henry, 281–282
Allman, Eric, 120–123, 241–244 Barksdale, Jim, 189, 194
Alpha processor, 110–111, 294 Barry, James, 206–211, 249
Amazon.com, 138 Bash (Bourne Again Shell) program,
America Online (AOL), 201–202, 298 24–25
Amsterdam Compiler Kit (ACK), Bastok, Frederic, 309
21–22 Bechtolsheim, Andy, 243
Andreessen, Marc, 183–186, 190–191, Behlendorf, Brian, 125–130, 161, 199,
249 209–210, 249
Apache (computer program), 127–130, Ben-Halim, Zeyd M., 213
182 Berkeley Internet Name Domain (BIND),
and IBM, 205–219, 289–290 125, 182
versus Microsoft Internet Information Berkeley Software Design, Inc. (BSDI),
Server, 270, 282 295
Apple computers, 183, 268 Berkeley Systems Distribution (BSD),
and image manipulation, 252–253 65–67, 73, 84–85, 122–125
and GNU/Linux, 296–297 and GNU/Linux, 295
Applied Cryptography, 285 and TCP/IP, 72
Applix (computer program), 98 Berners-Lee, Tim, 6, 125, 182–183, 185,
ARPAnet, 120–123 200, 244–245
Artificial Intelligence (AI) Laboratory, Biro, Ross, 73–74, 77
15–16, 17–19 Boich, Michael, 268
Art of Computer Programming, The, 155 Bolzern, Mark, 96–97, 101,
Art of Unix Programming, The, 153, 103–105
155–159 Boot and root disks, 87
AT&T, 66, 73, 124 Bostic, Keith, 149
Brooks, Fred, 14, 77, 148, 151 and Mozilla Public License (MPL),
Browsers, 129–130 197–198
and Netscape, 187 and Netscape Public License (NPL),
open source, 182–183 197–198
wars, 187–191 and open source code, 196–198
Bugs, 58–60, 83, 249 and Open Source Definition, 167–168
and Sendmail, Inc., 243–244
Caccamo, Wayne, 220–222, 248–249 and the GNU Emacs General Public
Caldera (company), 99–100 License, 26–29, 49, 79
and corporate customers, 228–230 and University of California at Berkeley,
and proprietary software, 229–230 124
Network Desktop (CND), 99 and X license, 63
OpenLinux, 99–100 mixed free and proprietary, 259–260
support for GNU/Linux, 113, 114 Cox, Alan, 72, 75–77, 80–82, 85, 103,
Carr, Eric, 269 179, 263, 308, 321
Case, Steve, 201–202 and Apple computers, 296–297
Cathedral and the Bazaar, The, 148, 150, and multiprocessor support for Linux,
165, 190 114
C (computer program), 21–22 Crusoe processor, 162, 298
and the Amsterdam Compiler Kit Cryptozilla, 199–200
(ACK), 21–22 Currie, Peter, 189
and the GNU C Compiler (GCC), 24, Cutler, Dave, 6
37, 237–240 Cygnus Solutions, 237–241, 299–300,
CD-ROMs, 94–95 306–307
Charles, Philippe, 211–212
China, 317 Daemon, 125, 233
Chowdhry, Pankaj, 281–282
Clark, Jim, 186, 199 D’Cruze, Patrick, 103
Clustering, 233 Debian (computer program), 89–94,
Cobalt Networks, 298 167–168
Code-forking, 75, 171, 305–306 de Icaza, Miguel, 260–268, 298, 323
Code Morphing Software, 164 DeCSS, 303
ColdStorage (website), 250–251 Delivermail (computer program), 121–123
Collabra (company), 189, 198 Dell, 222, 249, 298
Commodore Vic–20 microcomputer, 8–9 Demetriou, Chris, 84
Communicator (computer program) de Raadt, Theo, 84
see Netscape Communications Digital (company), 6, 13, 107
Compaq Computer Corporation, 223, Ditzel, Dave, 162, 164
293–294 Document Object Model (DOM), 200
Computer Associates, 213 Domain Name System (DNS), 124–125
Computerworld, 170 Dougan, Cort, 173
Concurrent Version System (CVS), 173 Duck Pond, the, 8, 10, 181
Connolly, Dan, 182–183 Duval, Gaël, 308
Content Scramble System (CSS), 303 DVDs, 303
Control Data Corporation, 121 Dyroff, Roland, 234–235
Convex (company), 183
Copyright Eazel (company), 268
and free software, 302–303 Eckhardt, Drew, 47, 56, 61–62, 66
and Linux, 44, 48–49, 112 eCos, 300
EGCS, 307 image manipulation, 252–255
Eich, Brendan, 200 making money from, 237–251,
Ellison, Larry, 215 315–316
Emacs (Editing Macros), 16–17, 22–23, Netscape Communications and,
171–172, 184, 237, 305 166–167, 182
E-mail, 121–124, 150 Sendmail, 123, 182
see also Sendmail see also Shareware
Eng, Eirik, 257–258 Freeware Summit, 171, 199
Ericsson (company), 298–299 Freshmeat (Web site), 250
Espinosa, Arturo, 318 Future of Linux Meeting, 215, 287
Ettrich, Matthias, 255–260, 316
European Center for Nuclear Research, 6 Gardner, Ray, 147
Evans, Bruce, 37, 39, 68–69 Garloff, Kurt, 174–175
Ewel, Jim, 278, 281 Gates, Bill, 5, 188
Ewing, Larry, 112 see also Microsoft
Ewing, Marc, 97–98, 223, 225–226 Gateway computers, 223, 298
eXtensible Markup Language (XML), 199, Gecko, 200
246 General Image Manipulation Program
(GIMP), 252–255
File Transfer Protocol (FTP), 41 Germany, 234
and development of Linux, 69 Ghostscript, 233
Filo, Dave, 101–102 Gilmore, John, 239
Finland, 6–7, 68 GNU Image Manipulation Program,
Fintronic, 101–102 252–255
Flagship (computer program), 96 GNU/Linux
Forbes, 170 adaptations of, to different applications,
Foresight Institute, 167 60–61
Forking advantages over other operating
see Code-forking systems, 65–67
Freax, 42 advantages over Windows, 58, 269–270
FreeBSD, 84, 214–215, 295
Freely Redistributable Software and Alan Cox, 75–77, 80–82, 85
Conference, 148 and Caldera (company), 99–100
Freenet, 303–304 and CD-ROMs, 94–95
Free Software Foundation (FSF), 24, 91, and code-forking, 75, 305–306
92, 148, 149, 302, 312–313 and embedded systems, 299–301
Free software, 97–98, 145 and Hewlett-Packard (HP), 220–223
and code-forking, 171 and IBM, 222–223, 289–290
and DVDs, 303–304 and image manipulation, 253–254
and IBM, 206 and networking, 74–75, 80
and Netscape, 187 and Oracle, 213–216
and networking, 272 and proprietary software, 229–230
and Open Source Definition, 167–168 and Red Hat (company), 97–98,
and proprietary software, 259–260 217–219, 220–223, 228
and Slashdot.org, 302 and Samba, 270–275
and Trolltech (company), 258 and Ted Ts’o, 82–84
Apache, 127–130 and the Internet, 68–70, 71–72
Eric Allman and, 121–122, 160–161 and Virtual Memory (VM), 47
finding coders of, 316 and Wine (computer program), 286
and XFree86, 64 Matthias Ettrich on, 256–257
and X Window system, 61–63, 73 Mobile, 163–164, 298
boot and root disks, 87 multiarchitecture support, 113–114
bug fixes, 58–60, 83, 249 multiprocessor support, 113–114
business services, 225–226, 234–235, myths, 284–286
247–248 Network File System (NFS), 59–60
commercial ventures, 237–251, 284, networking code, 77
305, 309 Networking-Howto, 72, 74–75
competition with Berkeley Systems patches, 173–181
Distribution (BSD), 65–67 portability of, 107–108, 113–114, 115,
competition with Minix, 53–54, 286
55–56 Pro, 96–97
contributions to, by other hackers, Redundant Array of Inexpensive Discs
46–47, 56–58, 74, 80–81 (RAID) code, 261
control of, 84 release frequency, 67–68, 85
control of, by Torvalds, 79–80 security, 285
copyright, 44, 48–49, 79 Softlanding Linux System (SLS)
corporate applications, 100–101 releases of, 88
cost of operating, 285 software availability, 285
customers, 247–248 Standard Base (LSB), 310–311
distribution by Manchester Computing technical support, 247–248
Centre (MCC), 88 testing against Windows, 282–284
Documentation Project, 72 Torvalds’s plan for, 46–47, 57
document processor, 255–256 Turbo, 233
early development of, 40–42, 111–112 upgrades, 97
Eric Raymond on, 150 users, 119, 224–225, 247–248
flexibility of, 298–299 version 0.01, 42–44
fragmentation of and problems with, version 0.02, 45
305–306 version 0.10, 46
growth in applications and distribution version 0.11, 46
of, 88–89 version 0.12, 47
growth of, 118–119, 289–295 version 0.95, 55
High Availability, 301 version 1.0, 85
in Germany, 234 version 1.2, 112
in India, 317–318 version 2.0, 112
in Japan, 232–233 version 2.2, 274
in Mexico, 318–319 Video and DVD Project (LiViD), 303
installation instructions, 48 GNU Network Object Model Environment
integration with Debian, 92–93 (GNOME), 262–267, 295
integration with Minix, 57–58 GNU project, 20–30
International, 103–105 and commercial software, 237–241
kernel development, 111–112 and Cygnus, 237–241
Kernel Version History, 67 and GNUPro, 240
Kongress, 165 and Miguel de Icaza, 261
logo and mascot, 112–113 and Netscape Public License (NPL),
making money from, 309 197
management of, 178–181, 285 and open source code, 212–215
Mandrake 5.1, 309 and the GNU Emacs General Public
marketing, 104, 225–226 License, 26–29, 49, 301, 312–313
Hurd, 45–46, 149 HyperText Transport Protocol daemon
integration with Linux, 57–58 (HTTPd), 125
Gnutella, 303–304
Google, 284, 301 IA–64 processor, 287–288
Gophers, 126 IBM, 205–219
Graphical user interfaces, 254, 258, 262 and Linux, 222–223, 289–290, 311
Greenblatt, Richard, 17–18 OS/2 Warp operating system, 257
Gtk+ toolkit, 254–255, 262 S/390 mainframes, 289–290
Image manipulation, 252–255
Hackers, 3, 15–16, 17, 155, 315–316 India, 317–318
and code-forking, 171–172 Informix, 213
and DVDs, 303 Infoworld, 213, 226
and Helix Code (company), 267–268 Initial Public Offerings (IPOs), 228
and Initial Public Offerings (IPOs), and Caldera (company), 230
235–236 hackers and, 235–236
and Linux source code, 58 and Pacific HiTech, 233
and Mindcraft (company), 279 and Red Hat, 228
and proprietary software, 229–230, 243 and SuSE, 234–235
and source code, 250–251 and VA Linux, 235
and SourceXchange, 249 see also Venture capital
and VA Linux, 250–251 Intel
contributions to Linux by, 56–58, 74, 80386 processor, 35–36, 162–163
80–81 and development of Linux, 42–44, 50,
employment of, 248, 250–251 64
management of, 320–321 and Red Hat (company), 227–228
Hackers (the book), 16 Itanium chip (IA–64), 287–288
Hahn, Eric, 189–191, 194–195, 201, 205 Internet
Hall, Jon, 107, 116, 167 and Sendmail, Inc., 242
Harnois, Michael, 172 and source code, 250
Hecker, Frank, 192–194, 200–201, 249 and TurboLinux, 233
Helix Code (company), 267–268 Internet Engineering Task Force (IETF),
Helsinki (Finland), 6–7 128, 311
Henkel-Wallace, David, 239 Internet Explorer, 188, 189
Hertzfeld, Andy, 268 Internet, the, 6, 102
Hewlett-Packard (HP), 220–223, 248, 249 and Domain Name System (DNS),
Hohndel, Dirk, 62 124–125
Homer, Mike, 189, 194 and free software, 182
Homesteading the Noosphere, 155 and HyperText Transport Protocol
HotBot (Web site), 301 (HTTP), 125
HotWired magazine, 126–127, 130 and Internet Engineering Task Force
Hubbard, Jordan, 84 (IETF), 128
Hurd (computer program), 45–46, 149 and Requests for Comments (RFCs),
HyperText Markup Language (HTML), 128
126 and the development of Linux, 71–72
and Emacs, 184–185 and the success of Linux, 68–70
early development of, 147–148 development of, 120–124
tags, 185
HyperText Transport Protocol (HTTP), Jakarta (computer program), 289
128 Japan, 231–233
Jargon File, 146 Lisp (List Processing) Machine
Java (computer program), 6, 189–190, Incorporated (LMI), 17–19
193, 211–212 Literate Programming, 156
Jeeves (computer program), 124 Lonnroth, Magnus, 214
Jikes (computer program), 211–212, 289 Love, Ransom, 229–231, 264
Jobs, Steve, 183, 297 Lu, H.J., 88
Johnson, Michael, 103 Lycos (Web site), 301
Jolitz, Bill, 65, 66 LyX (computer program), 255–257
Jolitz, Lynne, 65
Journalism, open, 301–302 MacDonald, Peter, 88, 234
Joy, Bill, 124, 243 Macmillan Publishing, 309
Julliard, Alexandre, 286 Mailing lists, electronic
Junius, Martin, 88 Apache, 127–128
Linux-kernel, 172
Katz, Roberta, 194 WWW-Talk, 126–127
K Desktop Environment (KDE), 258–262, Malda, Rob, 302
308–309 Manchester Computing Centre (MCC),
and KDE Free Qt Foundation, 264 88
Kerbango Radio, 298 MandrakeSoft (company), 309–310
Kerberos, 311–312 Mares, Martin, 174, 175
Kernels, 41, 44 Massachusetts Institute of Technology
design, 50 (MIT), 145
Linux, 111–112 and X license, 63
monolithic versus micro, 52 see also Stallman, Richard
Kimball, Spencer, 252–255 Mattis, Peter, 252–255, 262
Kirch, Olaf, 71, 103 McCool, Robert, 125
Kleiner Perkins, 247 McNealy, Scott, 141
Knuth, Donald, 155–161, 255, 317 McVoy, Larry, 140–144, 178–181, 240,
261, 298, 313–315
Lai, Glenn, 63 Media attention, 170
Le Duke, Dave, 247 Metzenthen, Bill, 66
Le Marois, Jacques, 309 Mexico, 318–319
Lemmke, Ari, 41, 42, 69, 183 Microsoft, 5–6, 94
Levy, Steven, 16, 17 ActiveX, 262
Licenses and Kerberos, 311–312
see Copyright and Mindcraft (company), 275–284
Lieber, Derek, 55–56 and Red Hat (company), 218–219
LiGNUx, 92, 99 and ResNova, 206
Lineo (company), 299 and SMB protocol, 273–274
Linux and Transmeta, 162–163
see GNU/Linux browsers, 188
see also VA Linux competition with Red Hat, 226–228
Linuxcare (company), 246–248, 270 Internet Information Server, 130
Linux Journal, 89, 94, 102, 103 myths about Linux, 284–286
Linux Kernel Version History, 67 Office Suite, 217
Linux Network Administrator’s Guide, 71, response to Hewlett-Packard support of
103 Linux, 221–222
Linux News, 35 security, 285
Linux Today, 277 success of, 194–195
support for World Wide Web National Center for Supercomputing
Consortium standards, 200 Applications (NCSA), 125, 126–127,
use of Linux by, 101 183–184, 245
Windows emulators, 286 see also Mosaic (computer program)
see also Windows Natural languages, 134–135
Miller, Cliff, 231–233 Navigator (computer program)
Miller, Dave, 108–110, 172, 174, 179–180, see Netscape Communications
261, 298, 308, 321–322 NetBench, 270, 279
Mindcraft (company), 275–284 NetBSD, 84
Miner, Allen, 214, 215–216, 223 Netscape Communications, 125, 127, 130,
Minix (computer program), 32–34, 219
36–40 and Caldera (company), 228–230
and Linux version 0.02, 45–46 and distribution of Mozilla,
and the Intel 8086 chip, 64–65 202–204
competition with Linux, 49–50, and IBM, 205
53–56 and Netscape Public License (NPL),
complaints about, by Linus Torvalds, 197–198
51 and the Open Directory, 301
Mockapetris, Paul, 124–125 browser source code, 166–167, 182,
Moglen, Eben, 302, 312–313 190–191, 194–198, 205–206
Molnar, Ingo, 240, 282 Communicator, 205–206
Monni, Tove, 116 Heresy documents, 189–190
Mosaic (computer program), 183–184, Navigator browser, 187
185, 245 support for World Wide Web
Communications company, 186 Consortium standards, 200
Netscape browser, 186–187 Network File System (NFS), 59–60
see also National Center for Networking
Supercomputing Applications and electronic mail, 121–124
(NCSA) and Linux, 71, 74–75, 77, 80
Motif (computer program), 95–96, 185, and Windex program, 270–271
254, 262 New Hacker’s Dictionary, The,
Motorola, 108, 300–301 146–147
Mozilla, 192, 197–204, 205–206, 210, 211, and Mindcraft (company), 276
212, 214, 215, 227, 229 and Perl, 136–137
and America Online, 201–202 and Mindcraft (company), 276
and distribution of Mozilla, 202–204 and Perl, 136–137
and Mozilla Stabilization Schedule, New York Unix, 102
200 NeXT computer, 183
party, 199 Next Generation Layout (NGL),
Mozilla Public License (MPL), 197–198 200
MS-DOS (operating system), 36 Noorda, Ray, 99
advantages of GNU/Linux over, Nord, Haavard, 257–267
Novell, 99, 223, 229–231, 232, 249
Multiarchitecture support, 113
Multiprocessor support, 113 Ockman, Sam, 167
Multisoft, 96 Olson, Greg, 242–243
Murdock, Ian, 89–93 OpenBSD, 84
Murphy, Tim, 55 Open Directory, The, 301
Mythical Man-Month, The, 14, 77 OpenProjects (Web site), 250–251
Open source code, 167–168, 193–196, and embedded systems, 300
212–217, 248, 265–266, 301–302, and GNU Network Object Model
312–313 Environment (GNOME), 263
see also Source code and Hewlett-Packard (HP), 220–223
Open Source Solutions Operation (OSSO), and MandrakeSoft, 310
221 and open source code, 241
Operating Systems: Design and and proprietary software, 228–230
Implementation, 51 competition with Microsoft, 226–228
Oracle, 213–216, 223, 249 employment of hackers by, 248
O’Reilly & Associates, 103, 139, 248–249 growth of, 226–228
O’Reilly, Tim, 165–166, 169, 249 initial public offering, 228, 241
Ousterhout, John, 244–246 profitability of, 314–315
promotion of GNU/Linux by, 307–309
Pacific HiTech (company), 231–233 software marketing by, 225–226
Palmisano, Sam, 291 Reisler, Kurt, 107
Patch (computer program), 132 Requests for Comments (RFCs), 128
Patches, 173–181, 310 ResNova (company), 206
Pathworks, 271–272 Richter, Adam, 95, 239
Pauling, Linus, 7 Ritchie, Dennis, 13–14
PC Magazine, 281 rn newsreader, 132–133
PC Week, 281, 282 Roell, Thomas, 63
Penguin mascot, 113 Running Linux, 57, 103
PenguinRadio, 298 Rutgers University, 172
Perens, Bruce, 91–92, 166, 167–168, 264,
310 Safford, Dave, 88
Perl (computer program), 135–139, 182, Salus, Peter, 13, 148
319–320 Salzenberg, Chip, 137
Peterson, Christine, 167 Samba (computer program), 270–275, 293
Plattner, Hasso, 216–217 San Francisco Gate, 170
Posix standards, 40–41 Santa Cruz Operation, The (SCO), 140,
Prince of Persia (computer game), 35–36 295, 296
Programming SAP (company), 216–218, 223
as an art form, 156 Schneier, Bruce, 285
employment, 248–249 SCO (company), 295–296
Eric Raymond on, 151 Screen Phone, The, 298
philosophies, 151–152 Scriptics (company), 246
profitability of, 315–316 Security, computer, 285
Unix, 153–154 Sendmail (computer program), 123, 182
Python (computer program), 319–320 and proprietary software, 241–244
early development of, 242
Qt (computer program), 257–260, 264 marketing, 243
Quarter Century of Unix, A, 13 Sequoia Capital, 235
Quasar Technologies, 257–258 Server Message Block (SMB), 271
SGI, 109, 223, 242, 261, 277, 279, 280,
Raymond, Eric, 144–155, 165, 177, 199, 284, 287, 288, 293, 297, 315
235–236, 264 Shan, Yen-Ping, 207–211
Red Hat (company), 96–98, 102, 217–219 Shareware, 44–45, 256
and Compaq Computer Corporation, Shearer, Dan, 272
293–294 Shields, David, 211–212
Sifry, Dave, 246–248 and Open Source Definition,
Silicon Graphics, Inc., 186, 223, 293 168–169
Silicon Valley Linux User Group, 167, and profitability of free software,
169–170 237–238
Simputer, 318 and the Artificial Intelligence (AI)
Sinclair QL microcomputer, 9–10 Laboratory, 15–17
Sinclair, Sir Clive, 9 and the C compiler, 23–24
Slackware, 93–94, 234 and the Debian program, 90–92
Sladkey, Rich, 56–60, 65 and the Freeware Summit, 166
Slashdot.org, 301–302, 312 and the GNU Emacs General Public
Sm@rt Reseller, 269, 274 License, 26–29
Softbank Comdex, 104 and the GNU project, 20–30
Softlanding Linux System (SLS), 88–89 and Trolltech (company), 261
Software AG (company), 100 and Tcl, 245
Software, proprietary and Unix, 19–20
and business-to-business e-commerce coding method of, 23–24
(B2B), 246 see also Massachusetts Institute of
and free software, 259–260 Technology (MIT)
and GNU Network Object Model Sun Microsystems, 6, 140–141, 239, 249,
Environment (GNOME), 267 294–295
and Red Hat (company), 228–230 and Tcl (computer program), 245–246
and Sendmail, Inc., 241–244 use of GNU/Linux by, 108
and the GNU project, 238–239 SunOS (operating system), 41, 141
hackers and, 229–230, 243 SunScript (company), 245–246
in Japan, 232–233 Sun World, 170
outside the West, 317 SuSE (company), 234–235, 289, 294
Solaris, 141, 294, 295 Sustained Development Networking
Soundblaster, 72 Programme (SDNP), 317
Source code, 58, 87, 88 Symbolics, 18–19
and distribution of Mozilla, 202–204 Symmetric Multi-Processor (SMP), 114
and Freshmeat (Web site), 250 Systems Development Corporation (SDC),
and Mosaic, 186 132–133
and Netscape, 187–188, 194–198,
205–206 Tanenbaum, Andrew, 22, 32–34
and Open Source Solutions Operation and Linus Torvalds, 50–53
(OSSO), 221 on control of Linux, 78–79
and Red Hat (company), 225–228 on the Internet, 71
and SAP, 216–218 Tcl (computer program), 244–246
and SourceForge (website), 251 TCP/IP, 72, 124, 270–271
and VA Linux, 250–251 and deployment platforms, 292
see also Open source code Pathworks program for, 271–272
SourceForge (Web site), 251 TECO (Text Editor and Corrector), 16
SourceXchange, 249 Texas A&M University, 88
Sparks, Bryan, 99–100, 119, 230–231, TEX (computer program), 157, 255
299, 300 Thompson, Ken, 13–14, 52
Spyglass (company), 186 386BSD, 65-67, 72, 76, 84, 85
Stallman, Richard, 14–30, 145, 323 Tiemann, Michael, 237–241, 300,
and code-forking, 171–172 306–307
and Cygnus, 307 Time, 170
TiVo, 298 and image manipulation, 253–254
Torvalds, Linus Benedict and Novell, 99
and Andrew Tanenbaum, 50–53 and Posix standards, 40–41
and early PCs, 35–39 and Samba (computer program), 293
and the Commodore Vic–20 and X license, 63
microcomputer, 8–9 at University of California at Berkeley,
and the Sinclair QL microcomputer, 121
9–10 development of, 13–14
army service, 11–12 fragmentation of and problems with,
college career, 10–11, 114–115 142–144, 296
complaints about Minix, 51 newsletters, 224
early years, 7–9 programming, 153–154
employment with Transmeta, 117–118, Sun Microsystems and, 140–141
163–164 version 7, 32–33
first work on Linux, 40–42 Uytterhoeven, Geert, 172, 173
involvement with business activities,
106–107 VA Linux, 235–236, 287–288
leadership of Linux, 321–322 and ColdStorage (website), 250–251
master’s thesis, 114–115 and Intel Itanium chip (IA–64),
media attention to, 170–171 287–288
move to United States, 116–118 and OpenProjects (website), 250–251
newsgroup postings by, 36–39 and Slashdot.org, 302
on GNU Network Object Model and SourceForge (website), 251
Environment (GNOME), 265–266 employment of hackers by, 248,
on K Desktop Environment (KDE), 250–251
265–266 see also VA Research
on the Internet, 71 Valloppillil, Vinod, 311
on Vger, 173–178 van Kempen, Fred, 74–75, 77, 80
plan for Linux, 46–47 van Rossum, Guido, 319–320
teaching experience, 114–115 VA Research, 101, 102, 167, 170, 235
Torvalds, Patricia Miranda, 116 see also VA Linux
Transmeta (company), 117–118, 162–165, Vaughan-Nichols, Steven J., 269, 276
298 Venture capital, 247, 310
Tribble, Bud, 268 see also Initial Public Offerings (IPOs)
Tridgell, Andrew, 113, 270–284 Vera, James, 102
Troan, Erik, 165, 261 Vger, 172–173
Trolltech (company), 257–267, 298, Virtual Memory (VM), 47
308–309 Visix (computer program), 229
Tsillas, Jim, 63 Visual Basic, 5
Ts’o, Ted, 46–47, 56, 82–84, 175–176, 321 Vixie, Paul, 125
TurboLinux, 233, 249, 289 Volkerding, Patrick, 94
Tweedie, Stephen, 84, 321
Tyde, Arthur, 247 Wall, Larry, 130–139, 161, 165, 319–320
Walnut Creek (software company), 94,
UltraSparc architecture, 294
Ultrix (operating system), 13 Weaving the Web, 125
UniForum, 104 WebBench, 270
Unix (operating system), 6, 107, 254 WebSphere Application Server, 205
and electronic mail, 124–125 Weiner, Bruce, 275–284
Welsh, Matt, 57, 103 and Freshmeat (Web site), 250
Wexelblatt, David, 63 and HyperText Transport Protocol
Whitinger, Dave, 275 (HTTP), 125
Widgets, 254, 258 and Mozilla, 199–201
Williams, Riley, 67 Consortium (W3C), 200
Windex (computer program), 270–271 WWW-Talk mailing list, 126–127, 182,
Windows (computer program), 5–6, 217 184–185
advantages of GNU/Linux over, 58,
269–270 Xanadu project, 147–148
and Wine (computer program), 286 X Consortium, 63
browsers, 188 Xenix (operating system), 33
competition with Red Hat, 218–219, XFree86, 64
226–228 X Window system, 61–63, 73, 182–183,
emulators, 286 270
Matthias Ettrich on, 256–257 and image manipulation, 254
networking, 273
NT, 142–143, 273–275, 286–287 Yacktman, Don, 297
portability of, 286 Yahoo, 102, 130, 138, 235
web server, 206 Yang, Jerry, 101–102
see also Microsoft Yggdrasil Computing, 95, 148, 239
Wine (computer program), 286 Young, Bob, 102, 103, 223–230, 314
Wired magazine, 126
Wirzenius, Lars, 10–12, 31–32, 69, 87, 92 Zawinski, Jamie, 171, 191–192, 193, 197,
Wladawsky-Berger, Irving, 290, 311 199, 201–204
World Wide Web, 6, 125, 126–127, Zborowski, Orest, 62, 73
182–183, 244–245 Ziff-Davis, 269–270