We, in the present, are in some sense both our past and our future. What we have done and what has been done to us contributes to who we are; and what we intend to do, what we are ordered towards doing, instantiates within us anticipatory relations and dispositions. We resolve–without reducing–past and future into the present, as relations constitutive of the self.
One consistent thread in the temporal relations thus constitutive is the technological. That is, technology is woven throughout our lives, and not only our lives as individuals, but as societies. As has been the case since the “philosophy of technology” (as a specific field of philosophical study) germinated in the 19th century, one finds two prevailing attitudes: first, that of the Luddite, who sees in technology a continual threat to human well-being; second, that of the Futurist, who sees in technology a means to increased human efficacy, power, and freedom. We can observe tendencies towards the first in the writings of Martin Heidegger, Richard Weaver, Jacques Ellul, and Theodore Kaczynski (the “Unabomber”); and outright commitment to the second in Transhumanists such as Ray Kurzweil and Nick Bostrom, and implicit hope for such in many Marxist thinkers as well (including Marx himself, in some fashion). If one were to ascribe fictional works to each attitude, Tolkien’s Lord of the Rings espouses a love of harmonious living with the land against a torturous subjugation of it productive of abominations (consider: the Shire in contrast to the machinations of Saruman), while the Deus Ex video game series–though attentive to the difficulties posed by new technologies–ultimately places technology as the means to enhancing human capacity and quality of life… though a more realistic depiction of the human consequences might be found in Aldous Huxley’s Brave New World.
Some might think that a balanced approach (“A little from column A, a little from column B”) might ease the tension which grows as stronger technological possibilities continually traverse the horizon. This, I think, is a mistake. The question of integrating technology into our lives is not one of balance, but one instead of nature.
What is technology?
Any number of objects may leap to mind when we are asked, “What is technology?” Typically, these are the newest devices at our disposal, or those most recently brought to our attention, indicating an association between “innovation” and “technology”. At present, this likely includes many digitally-oriented devices (i.e., internet-connected, data-processing things, such as tablets, appliances, car computer systems, etc.) as well as, say, transportation technology (electric and self-driving cars), “autonomous” robots (the nightmares DARPA dreams up), weapons and other military systems, and medical equipment.
With further consideration, we might arrive at thinking about older technologies: televised broadcasting, radio, automobiles in general, air-conditioning (my personal favorite), and, probably the most important, the technologies which allow the harnessing and application of electricity. Even more remotely we might touch upon coal or steam power as antiquated (although still used) technological means. For the most part, however, we seldom stretch our thinking to devices prior to the Industrial Revolution, or if we do, we do so insofar as they anticipate said revolution. We do not tend to think of the hammer, the hacksaw, the nail, or the wheel as technology, except perhaps in an analogous sense.
Most thinkers who have undertaken a philosophical inquiry on technology have wondered why this is: after all, is there really an essential difference between the hammer swung by hand and the hydraulic-powered hammer? The latter, certainly, is more efficient and will not tire out like an arm (although ill-suited for finesse work); but is there a difference in kind between the two, or only a difference of degree–of power, of control, of embodied-connection between the instrument and the agent? The combustion engine can get us from one location to another much faster than a leg-powered bicycle, but are they not doing the same thing?
These questions are tired; not because they have been discussed endlessly, but because they have been discussed poorly. That is, there have been many theorists concerned with the effects of technology in the past seventy years (Lewis Mumford, Martin Heidegger, Richard Weaver, Jacques Ellul, Herbert Marcuse, E.F. Schumacher, Don Ihde, Andrew Feenberg, Langdon Winner, etc.), but few who have provided a truly penetrating analysis of its nature, despite their many collective insights. As a consequence, what we truly mean by the term technology remains opaque–especially as the difference between pre-industrial and post-industrial technologies is acknowledged, but not distinguished as to its precise character.
To attain this precision, the distinction between pre-industrial and post-industrial technologies requires not only a causal resolution–i.e., understanding technology in terms of a resolutio secundum rem, which identifies the causes of technology and shows how it and its effects follow from those causes–but also a resolution of intelligibility, a resolutio secundum rationem, the discovery of what commonly unifies the meaning we intend by the word “technology”. In other words, what the distinction is cannot be understood without a clearer understanding of technology as a whole. The above correspondences–digital, transportation, military, etc.–are mere instances, not a definition, and therefore provide no common basis for understanding all the instances. If there is to be a distinction identified between pre- and post-industrial technology, it will be discovered in one of these two resolutions.
That is, while historical facts can give us many clues as to the development of technology, and even indicate a distinction between pre- and post-industrial technology, they cannot explain this difference. Rather, we need a properly philosophical reflection. First, we need a dialectical consideration–a tentative searching for the intelligible accounts given by common reason and by other thinkers. Second, we need a resolutio secundum rationem from this dialectic, resolving the various intelligible accounts into an intelligible unity, to provide a definition for what technology is. Third, we need a resolutio secundum rem, that is, identifying the causal factors which produce technology. Because this essay aims only at an introduction, we will not be engaging in this process in full; rather, I will only posit a provisional definition of technology (based upon a dialectic not shown here) and give a brief explanation of its merits:
“Technology” names any product of human artifice which extends a human capacity for control over specifically-determined, non-theoretical, use-oriented disclosure.
This definition, I believe, avoids being too broad or too particular. Some have too closely aligned technology with technique, which is the specifically human function of turning know-how into doing; others have proposed notions of technology as any extension of human capacity, which necessarily turns any instrumental means into an instance of technology–and thereby, I think, loses the specific character of what makes something technological. Others still have made technology consist entirely in its relation to the user, while some have contrariwise eliminated the user from consideration altogether. On the contrary, the relation to the user is essential for a thing’s technological designation–and yet the relation is not a relatio rationis (i.e., one that is “made up” by the user), but depends intimately upon properties of the thing too.
(As an aside, this relationality of technology entails that any instrument can but need not be technological, whereas every technology necessarily is instrumental.)
Necessarily, technology has an influence on the user, not only in any given moment, but in terms of altering the way we think. Understanding technology, therefore, requires understanding something of human nature.
What is human nature in respect to technology?
While the past seventy years have seen much written about the nature of technology, human nature has been discussed for millennia. Consequently, I am going to enter the dialectic even less (since it is even more complex), and instead cut right to the chase: what is meant by “human nature” is not the biological given, but rather the essential kind that belongs to every human individual, identified through typical actions manifested in necessary properties. In short: all the things that make us “semiotic animals”.
This entails not only the biological, but also the cultural–for which the biological serves as a basis. That is, we are not only social animals, but cultural ones; we engage not only in relations with our contemporaries, but our ancestors and our potential progeny, enmeshed in a web not only of hic et nunc praxis, but also ubicumque et quandocumque understanding, which two elements intertwine throughout our lives, both as individuals and as parts of social groups. Technology, it needs to be seen, is an inevitable consequence–or, even, an inevitable corollary–of our natures as biological, social, cultural, practical, intellectual beings.
We can glimpse this in three instances: the development of a particular language, the construction of buildings, and the expansion of communication-based technologies.
Language, as an innate and distinctive character of human beings–defined by Aristotle as zoon logon echon, the animal having language–and which we might succinctly describe as the capacity to articulate the intelligible meanings of objects we experience, is not a technology, but a property of human beings. This or that particular language, on the other hand, is a technology: that is, it is an extension of our innate capacity for language which provides a controlled means for disclosure. (How the proximate use-orientation of linguistic disclosure may be subordinated to an understanding-orientation is a complex discussion–too complex for here.)
Building is an extension which begins from a biological fact: humans are ill-adapted to prolonged exposure in many, if not most, earthly climates. While early humans were likely better-adapted, they would not have undertaken to build had they been very well adapted. But what begins as a compensatory extension for the sake of protection and survival becomes oriented to further uses: privacy, sanitation, comfort, commerce, governance, education, and so on. Construction has become increasingly specific in its use-oriented developments.
Communication technologies comprise a wider range of items than we might initially realize: from writing (and therefore all writing implements) to the printing press, radio to television, the internet to smartphones, it is true that our social lives have become increasingly permeated by access to means of communication–themselves all either extensions of languages or of what we can call “post-linguistic” structures (i.e., things themselves not linguistic but which would not be what they are without languages). Every communication technology stems ultimately from the innate capacity for language, as well as its innate usefulness in praxis inasmuch as we are creatures who find difficulty in surviving individually and therefore flourish only in society (not to mention that whole “needing another human of complementary gender for reproducing the species”).
But, as has recently been brought to the attention of many, there is a pernicious danger in the potential misuse of our communication technologies: that they may grant to certain parties an undue control over the distribution of information. As creatures who naturally seek knowledge (and therefore, the truth), being systematically or maliciously misinformed does violence to the fulfillment of our nature. What precipitated this public awareness–potential abuse of social media (“fake news”)–however, was a rather minor instance of systematic misinformation compared to the decades-long (if not centuries-long) spread of falsehoods about nature and human understanding. That is, communication technologies can be and have been bent towards shaping beliefs in a way dissonant with an understanding of reality, to the malformation of human cognitive faculties themselves–not at the level of “changing our natures”, but certainly at the level of changing our “second natures” (i.e., deeply-ingrained habits).
What communication technologies show in a particular instance extends to the larger question concerning technology today: namely, to what extent, and with what freedom, are we responsible for the direction of our own natures?
A Brief Ethics Lesson
David Hume (1711–1776), at whose door I lay much blame for bad thinking today, wrote in his Treatise of Human Nature (1739–40) that we cannot derive an “ought” from an “is”; in other words, nothing we can know about what a thing is tells us how that thing (or anything else for that matter) ought to be. The consequence of this: morality becomes either arbitrary or grounded only in sentiment (and thus lacks all force of necessity). Hume’s challenge to what was then a traditional notion emerges from a bad study of how human beings know, and in what our knowledge consists–essentially, the view that our knowledge is composed from the aggregation of atomistic sensations upon which our minds operate in order to produce ideas.
The opposite, albeit lesser, mistake to the so-called “is-ought fallacy” is to believe that every “is” necessitates itself as an “ought”; in other words, that how a thing is determines the only right way for it to be. This is-ought determinism fails to accord not only with evolution but, more fundamentally, with the nature of material things: that is, material things are inherently capable of being faulty.
Ethics, although it gives guidelines for discerning “right” and “wrong”, must deal first with “good” and “bad”. That is, we determine actions right or wrong based upon whether they result in good or bad (not just in their ultimate consequences, but along the way, as well; the ends cannot justify the means). The Humean “is-ought fallacy” necessitates morality as either arbitrary or grounded in sentiment because it denies any way of knowing from the objects of experience (as opposed to the having of experience, what today is frequently called “lived experience”) what is good or bad for those objects–including ourselves. Attempts have been made to systematize rules which apply for everyone based upon common sentiments, but sentiments change over time not only in individuals but across generations and societies, and the rules therefore become obsolete–and when the sentiments not only change, but diverge (as, e.g., with sentiments about the civil marriage rights of homosexual couples), the rule of law becomes strained.
In contrast, a moderate realist theory of knowledge–one which states that we do in fact know something not only about the incidentals of objects, but about their natures–constantly seeks the “ought” from the “is”; indeed, it sees the “ought” as part of the “is”, which must be worked out no less and no differently than any other property of the thing. Discerning the “ought” in the “is” requires understanding the thing in its basic dimensions of potentiality–that is, how it could be otherwise. Moreover, it requires understanding what is good for that thing–which likely includes ways in which it could be otherwise.
Our present technologies unquestionably alter our mundane relations to the world around us far more than those of even the Industrial Revolution, and especially more than the technologies prior to it (the most impactful such changes wrought by pre-industrial technology arguably being the invention of writing and the printing press). Today, the impact of rapidly disseminated digital technology is still being realized: and so we struggle with cognitive presence affected by device immersion; the immediacy of gratification and reduction of effort for common activities (think: Amazon Prime or even Amazon Prime NOW, the agony of internet buffering, and so on); social media information curation and manipulation; source validity and authenticity; big data; “artificial intelligence”; and what I like to call the “lowest common denominators of taste and interest”–that is, despite the accessibility of the new and different through the internet, what we have in common are typically the things most appealing to our basic desires.
To deal with these–to understand how they can, do, and should impact our good as human beings–we need a stronger understanding not only of technology, but also of human nature. Barring this latter understanding, our culture will cancerously develop, I think, even more towards the dystopia of a Brave New World. Perhaps the first step is to cease taking the soma.