CT551--Week 4--Lecture notes


Who Does Technology Benefit?


In Volti's Ch. 2, Winners & Losers: The Differential Effects of Technological Change, he stresses that technological change alters a great many things--the ways people do things, the ways people think about what they do, the ways they think about each other. Science alone can have similar effects. In the July 2000 Scientific American, Ernst Mayr ("Darwin's Influence on Modern Thought") tells us that one of the major impacts of Darwin's views was a shift from "typological" thinking (groups such as races are distinct and can be characterized in terms of "types") to more statistical thinking (groups vary over broad ranges on any character you choose). In time this did a great deal to make racist thinking indefensible.

Not all such effects are foreseeable--later on, we will look at the impact of the printing press. It was such a simple thing, really. Who could have guessed it would help create a new religion (Protestant Christianity) and stimulate the slaughter of millions in the American, French, Soviet, and other revolutions?

Consider the use of ammonium nitrate fertilizer and fuel oil. Alone, each has important and beneficial uses. Put them together, and you have a powerful explosive, as Timothy McVeigh showed the nation in 1995 when he used the mix to destroy the Murrah Federal Building in Oklahoma City.

Some people feel that technology can have such drastic effects in so many ways--supporting war and terrorism, disrupting society, challenging cherished beliefs--that it should be very carefully regulated. On this view, we should attempt to foresee the effects of every new technology, and if we do not like what we foresee, we should try to strangle the technology in its cradle--that is, ban technologies we think will be dangerous. Recall Bill Joy's article in last week's readings.

Should we ban Darwin's ideas? The printing press? Ammonium nitrate fertilizer?

Banning ammonium nitrate fertilizer might mean going back to using animal manures, and in Ye Olden Days, governments would confiscate manure piles in time of war--the reason being that under those piles one found deposits of saltpeter, potassium nitrate, an essential ingredient in gunpowder. There is no escaping the destructive potentials; the creative mind can find them anywhere.

Some effects are more subtle. Consider Volti's tale of the Yir Yoront. Missionaries introduced steel axes, thinking they were doing these Australian Aborigines a favor. But the Yir Yoront already had an ax-making technology which structured society and defined the roles and values of men and women. Now axes had to be bought or traded for, and women could own them too. "A certain kind of freedom was achieved, but only at the expense of confusion and insecurity. ... The result was rapid cultural disintegration...." In our own society, we see technology as providing freedoms, but it is hard to deny that some people see it as a source of confusion and insecurity, a threat to the older way of doing things. It is thus a subversive force, where "subversive" means "undermining or threatening the established order." (This is admittedly a more general and less political definition than people usually employ.)

Are there other technologies with similarly subversive potential? How about TV? Birth control pills (THE Pill)? What will be the impact of the morning-after pill? How about the battery-powered Segway Human Transporter? The Mindflex mentioned in last week's lecture?


What is the proper response to the threat of technologically induced change? Some authors suggest forethought, deliberation, and regulation, in an effort to avoid the negative effects. Others suggest a very different approach--don't LET yourself be victimized by change; go to school, make sci-tech yours too. If the Yir Yoront had taken that attitude, they might well have adapted much more quickly to the new technology.

How about the Luddites? The term is usually used to refer to people who wish to destroy technology. Originally, it referred to English workers who felt their jobs threatened at a time when the economy gave them few alternatives. When the economic climate improved after the defeat of Napoleon, the protests stopped. In other words, both problem and solution really lay outside the technology.

Modern "neo-Luddites" seem to have more complex motivations, born partly of disappointment at technology's failure to live up to its most overblown promises. Those promises held that technology could fix social problems, and "techno-fixers" have tried, but many problems are worse than ever. "Techno-fixers" as technocrats have presented themselves as the proper shapers of "scientific management," government, and child-rearing, and they have not succeeded, surely because they insisted on treating people as machines (the "time and motion" movement, Skinnerian psychology, and so on).

The technocrats haven't given up, though. "We have a new vision of the machine people are," they say. "People are not like wind-up clocks but like computers, loaded with feedback and interconnections and information exchanges and so on. Now that we understand that, we can get it right."

Maybe. What do you think? Take a look at the technocracy site.


Florman seems less concerned than many that things will go wrong with our technologies. Of course they will! But "When you fall down, get up. If something goes wrong, fix it." The story of technology may be a tragedy, but it is not disaster that defines tragedy; it is the struggle against fate, the struggle to achieve, and the struggle that defines humanity at its finest. This is an attitude very likely to alarm--who? Consider what Florman says near the end of his essay about "evil."

DDT is a product of technology with an interesting history. As an insecticide, it saved a great many lives from insect-borne diseases before it was realized that it had unforeseen effects on the environment. Those effects were bad enough that the use of DDT was banned (with public-health exceptions). The next round of unforeseen effects involved the resurgence of malaria in tropical nations, where the disease now infects some 300 million people and kills one to three million per year. Nevertheless, says Worldwatch's Anne Platt McGinn, alternative mosquito-control measures are to be preferred. Donald Roberts thinks malaria is more of a threat than the environmental effects of DDT, and besides, DDT is, when sprayed on house walls, a very effective repellent. Used this way, the environmental threat is minimal.

The environmental effects of DDT could reasonably have been foreseen. So could the rise of resistant insects and, when DDT was banned, the resurgence of malaria. Should we have refrained from using DDT in the first place? Should we have refrained from banning it once it was in use? Or perhaps, in line with Florman's thinking, should we have gone ahead, fallen down, learned from the experience, and gotten up again?

What will happen if we keep getting up? Florman can make no promises, but Ceruzzi says that we can wind up in astonishing places. He tells us that computers were once thought to have very limited applications. Great Britain, in 1951, said, "We've got three. We might need one more." If we had stopped developing the technology then--as we might have, out of complacency, or out of fear generated by an attempt to look ahead at potential problems--we would have missed a lot. And we would have started missing things very soon.

In 1953, the IBM 650 sold by the thousands, surprising even IBM. Thereafter cost and size came down rapidly (aided by transistors and, eventually, integrated-circuit chips), people (especially at universities) found new things to do with computers (games as well as spreadsheets), and they spread widely.

The main key was the realization that, with stored programs, a single computer could do more than one thing. Another was the realization that words, music, and images could be expressed as strings of numbers that a computer could manipulate. Easier programming languages helped, too. And in due course, we reached today, when we can hold in one hand more sheer computing power than all of England had fifty years ago!
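The point about words and images being "strings of numbers" can be made concrete in a few lines of Python. This is a modern illustration, not part of the original lecture: the character codes and the tiny "image" below are just examples chosen for the sketch.

```python
# Words become numbers: each character has a standard numeric code.
text = "ax"
codes = [ord(c) for c in text]
print(codes)  # [97, 120]

# Once text is numbers, a computer can manipulate it with arithmetic:
# shifting each code by one turns "ax" into "by".
shifted = "".join(chr(n + 1) for n in codes)
print(shifted)  # by

# An image is likewise just a grid of numbers (here, grayscale
# pixel values from 0 = black to 255 = white).
image = [
    [0, 255, 0],
    [255, 0, 255],
]

# "Inverting" the image is simple arithmetic on those numbers.
inverted = [[255 - p for p in row] for row in image]
print(inverted)  # [[255, 0, 255], [0, 255, 0]]
```

Everything a computer does to text, music, or pictures ultimately reduces to arithmetic of this kind on long strings of numbers.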





Questions for Discussion

1. We need some more examples of socially subversive technology besides steel axes. We might expand here upon TV, the Pill, or RU-486, mentioned in the lecture notes, but what about the computer? In what ways is it a subversive technology? Bear in mind that "subversive" is not a synonym for evil or immoral, but rather for "undermining of established authority."

2. Ceruzzi says that the inventors of the first computers did not really understand what they were making. What did those inventors think a computer was? (Or was for?) What did we eventually come to realize it really was? (Or was for?)

3. Take another look at that Mindflex toy. Is it just a toy? Is it a precursor to something more ambitious? Will people someday call this technology subversive? And what will the impact of mental control of computers eventually mean in the classroom?