er impressive forms of apparent
cleverness.
Nevertheless, computers STILL are profoundly brittle and stupid; they
are simply vastly more subtle in their stupidity and brittleness. The
computers of the 1990s are much more reliable in their components than
earlier computer systems, but they are also called upon to do far more
complex things, under far more challenging conditions.
On a basic mathematical level, every single line of a software program
offers a chance for some possible screwup. Software does not sit still
when it works; it "runs," it interacts with itself and with its own
inputs and outputs. By analogy, it stretches like putty into millions
of possible shapes and conditions, so many shapes that they can never
all be successfully tested, not even in the lifespan of the universe.
Sometimes the putty snaps.
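The arithmetic behind that claim is easy to sketch. Suppose, for the sake of argument, a modest program containing just three hundred independent two-way decisions, and a test rig that could somehow run a billion tests every second for the entire age of the universe. The numbers in the little Python sketch below are illustrative assumptions, not measurements, but the conclusion does not depend on them much:

    # A back-of-the-envelope sketch, with assumed (hypothetical) numbers:
    # 300 independent two-way decisions, a billion tests per second,
    # and roughly 4.4e17 seconds in the age of the universe.
    paths = 2 ** 300                                 # distinct execution paths
    budget = 10 ** 9 * 44 * 10 ** 16                 # total tests we could ever run
    print(f"paths:           {paths:.1e}")           # about 2.0e90
    print(f"test budget:     {budget:.1e}")          # about 4.4e26
    print(f"fraction tested: {budget / paths:.1e}")  # about 2.2e-64

Even under those absurdly generous assumptions, the fraction of the program's possible shapes that ever gets exercised is, for all practical purposes, zero.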
The stuff we call "software" is not like anything that human society is
used to thinking about. Software is something like a machine, and
something like mathematics, and something like language, and something
like thought, and art, and information.... But software is not in fact
any of those other things. The protean quality of software is one of
the great sources of its fascination. It also makes software very
powerful, very subtle, very unpredictable, and very risky.
Some software is bad and buggy. Some is "robust," even "bulletproof."
The best software is that which has been tested by thousands of users
under thousands of different conditions, over years. It is then known
as "stable." This does NOT mean that the software is now flawless,
free of bugs. It generally means that there are plenty of bugs in it,
but the bugs are well-identified and fairly well understood.
There is simply no way to assure that software is free of flaws.
Though software is mathematical in nature, it cannot be "proven" like a
mathematical theorem; software is more like language, with inherent
ambiguities, with different definitions, different assumptions,
different levels of meaning that can conflict.
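One concrete, well-documented case of such conflicting assumptions is the midpoint step of binary search. The formula mid = (low + high) / 2 is provably correct over the unbounded integers of mathematics, yet it overflows the fixed-width integers of actual machines once low + high exceeds the machine's limit, a flaw that hid for years in published, carefully "proven" code. Here is a minimal sketch in Python, which must simulate the 32-bit wraparound because Python's own integers never overflow:

    # The same formula, evaluated under two different assumptions.
    def as_int32(x):
        # Wrap x the way a 32-bit two's-complement machine would.
        x &= 0xFFFFFFFF
        return x - 0x100000000 if x >= 0x80000000 else x

    low, high = 1, 2_147_483_647              # high = largest 32-bit signed integer
    mid_math = (low + high) // 2              # the mathematician's answer: 1073741824
    mid_machine = as_int32(low + high) // 2   # the machine's answer: -1073741824
    print(mid_math, mid_machine)

Both results follow faithfully from the same line of code; only the unstated assumption about what an "integer" means has changed.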
Human beings can manage, more or less, with human language because we
can catch the gist of it.
Computers, despite years of effort in "artificial intelligence," have
proven spectacularly bad in "catching the gist" of anything at all.
The tiniest bit of semantic grit may still bring the mightiest computer
tumbling down. One of the most hazardous things you can do to a
computer program is try to improve it--t