David Weinberger
KMWorld Archive
This column is part of an archive of David Weinberger's columns for KMWorld. Used with permission. Thanks, KMWorld!

 


The metaphors of technology

01 October 2000

Societies tend to understand what it is to be human in terms of the technology they use every day. For example, when mechanical clocks were invented, the universe started looking like a grand clockwork. When steam engines transformed industry, we started understanding our psyches in terms of various pressures, and we started to talk about "venting." In the age of computers, we take inputs, process information and produce outputs.

As with any metaphor, there is the danger that we'll draw inappropriate conclusions based on the model. For example, there's at best conflicting evidence that "letting off steam" actually lowers the likelihood of committing violence as opposed to putting one more in the mood for it; soccer hooligans spring to mind.

The computer model also can easily lead us astray. For example:

We sometimes think that we can control human "output" by controlling the inputs--"garbage in, garbage out," as the computer saying goes. But humans aren't that deterministic, or at least the causal factors are so many and so varied that we can't predict them with the reliability of billiard balls on a collision course. Otherwise, everyone who listened to a Judas Priest album would be up on the stand with the morons who claimed that heavy metal turned them into murderers. The computer model of consciousness "dumbs down" our understanding of human motivation. In fact, motivation is profoundly different from causation.

Similarly, the computer model might lead us to think that we're programmed. But such a belief would have us "dumb down" our educational system, substituting programming for teaching, and being programmed for learning.

Finally--although there are many other possible examples--computers may model rationality (they don't, actually), but they sure don't touch emotion. The computing metaphor treats emotion as a mere epiphenomenon, an accidental byproduct like the heat generated by a TV set. As "information" appliances, computers are already biased against emotion, preferring a "just the facts, ma'am" world. But emotions are about what things mean to us and thus enable information to matter. They are the engines of personhood, not a byproduct.

Now for the hard part. Suppose for the moment that the Web is as defining of the coming age as the steam engine and computers were of theirs. How are we going to understand ourselves in light of the Web? We can already begin to hear ourselves thinking of a memory lapse as a "broken link." Will we view ourselves as loosely bound, full of play? Will we replace our view of the self as an M&M, its value hidden inside a hard shell, with a sense that we get our value from our outward-bound involvement with others?

Give me a call in a hundred years and let me know ...