Technology is broadly defined as the human means of manipulating and changing the environment. It encompasses everything from simple tools, such as a crowbar or a wooden spoon, to complex machines, such as a particle accelerator or a space station. It can also refer to less tangible things, such as computer software and business methods. In this article, however, we focus on technology in its narrower sense: the application of scientific knowledge to create tools and devices that achieve a desired practical outcome.
Technological advances have allowed humans to better understand the world around them, enabling the creation of new products and services. For example, the control of fire improved the preparation and nutritional value of food, while the invention of the wheel made transportation easier. Other innovations, such as the printing press, the telephone, and the Internet, have lowered barriers to communication and allowed humans to interact on a global scale. Not every technological development has been benign, however: the pursuit of weapons of ever greater destructive power has been a constant throughout history.
The word technology derives from the ancient Greek term techne, which originally denoted skill in working with wood. At first this meant chiefly the building of houses, but the concept was later broadened to cover crafts, science, and even rhetoric. By the time of Plato, however, disputes had arisen over whether medicine, too, counted as a form of techne.
The term's currency in English is often traced to Jacob Bigelow's 1829 book Elements of Technology and, later, to the naming of the Massachusetts Institute of Technology, although Schatzberg argues that neither was the decisive moment when the word entered common use (Bigelow's turgid compendium had little influence, and later editions even dropped "technology" from the title in favor of The Useful Arts). Moreover, the word never settled on a single meaning, referring sometimes to applied science and at other times to the industrial arts in general. This ambiguity between broad and narrow interpretations is one reason discussions of technological change are so often muddled and intellectually impoverished.
One reason for this muddle is that technologies, by their nature, prioritize certain paths to an end while neglecting others. This prioritization means that they inherently involve both instrumentality and a kind of deliberate choice. The emergence of digital cameras, for instance, sidelined analogue photography, with its inefficient, labor-intensive workflows and its darkroom culture of painstakingly retouching images.
This is an important point to keep in mind because, as the history of these debates shows, technology is both complex and slippery. Attempts to analyze it have been pulled between two sharply diverging traditions of talking about it, which Schatzberg labels a "narrow technical rationality" and an "explicitly moral approach." Each has its defenders, and each is flawed. If we want to rethink how we think about technology, we must avoid these intellectual dead ends. Fortunately, a different and more promising direction is open to us.