Tag: Technology

  • Recovered Writing: Undergraduate Independent Study, Networks Between Science, Technology, and Culture After World War II, August 4, 2005

    This is the twenty-first post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    During the summer prior to writing my undergraduate thesis at Georgia Tech, Professor Kenneth J. Knoespel agreed to lead an independent study with me on the theoretical underpinnings of my intended thesis topic: Cold War popular culture. During the summer, we met to discuss ideas relating to Cold War politics, network theory, science and technology studies, and popular culture. These conversations are among my favorite undergraduate memories at Georgia Tech. The essay included below is my attempt at working through and understanding the topics of our discussions. Some of this research was later incorporated into my undergraduate thesis.

    This project, along with other late-undergraduate work, helped me understand how important research, writing, and the cognitive effort they require are to developing one's thinking, understanding, and insightfulness over time.  The exertions of uncovering facts, employing different literacies, outlining, writing, revising, and building connections yield long-term cognitive benefits and generate deep pleasure from finding things out.

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    Independent Study

    4 August 2005

    Networks Between Science, Technology, and Culture After World War II

    This paper’s purpose is to explore the spaces where science and technology are discussed in American culture following World War II.  First, I will investigate ‘three ways of seeing’ through the lenses of science studies, Cold War studies, and science fiction (SF) studies.  Then, I will apply these lenses to a series of American film examples from the Cold War era.  The net result will be a sort of annotated bibliography of theory and cultural examples that reveal the networks between science, technology, and culture.

    I would like to begin by looking at science studies.  This area of study examines the connections between science, technology, and culture.  Science studies engages questions such as:  Where do these seemingly different “things” intersect one another?  How do they interact, morph, and promulgate as a result of those intersections?  Before we can delve into these questions raised by science studies, we should look at the meaning of a word we all use in everyday conversation:  ‘technology.’  According to Langdon Winner, the meaning of technology has changed over time.  Today, the term “’technology’…is applied haphazardly to a staggering collection of phenomena…One feels that there must be a better way of expressing oneself about these developments, but at present our concepts fail us” (Winner 10).  He goes on to write, “One implication of this state of affairs is that discussions of the political implications of advanced technology have a tendency to slide into a polarity of good versus evil…One either hates technology or loves it” (Winner 10).[i]  Perceiving technology in dualistic terms is a theme that comes up in the other areas of study that I am exploring.[ii]  One solution he proposes to address this problem is to develop a better terminology with which to engage all of the elements of technology specifically, rather than in terms of generality.[iii]

    Winner goes on to address the issues surrounding the proliferation of modern technologies.  He writes, “One symptom of a profound stress that affects modern thought is the prevalence of the idea of autonomous technology–the belief that somehow technology has gotten out of control and follows its own course, independent of human direction” (13).  “Autonomous technology” is synonymous with the idea of a living system.  The interaction between all of the parts of the system forms an ‘organism’ that has a will of its own.[iv]  Do we control technology or does technology control us?[v]

    Connected to the idea of personal/technological autonomy is the relationship between humans and ‘their’ technology.  Winner further elaborates on the idea of autonomous technology when he writes, “In our traditional ways of thinking, the concept of mastery and the master-slave metaphor are the dominant ways of describing man’s relationship to nature, as well as to the implements of technology” (Winner 20).  Humanity created tools and skills (i.e., technology) to serve the interests of humanity.  What happens when there is the perception among many people that technology is no longer serving humanity?  The tables may have turned, thus the question stands:  does humanity serve the self-perpetuating system of autonomous technology?[vi]

    This problem exists in opposition to the observation that “Western culture…has long believed that its continued existence and advancement depend upon the ability to manipulate the circumstances of the material world” (Winner 19).  Manipulation takes place through the use of technology.  The two systems, humanity and technology, rely on one another.  If technology is considered an autonomous system, it is born of humanity’s ingenuity, and its perpetuation is due to the idea held in Western cultures that progress depends on the use of technology.  It appears to be a symbiotic relationship, but is technology an autonomous system?  Winner addresses this issue when he writes, “The often vulgar Hollywood use of technological animism should not obscure the fact that images of this kind have been useful symbols for artists and writers concerned with the implications of modern technical artifice…In this regard the notion of a living technology merely recapitulates the myths of our own beginnings–the rebellion and fall of man–and the ensuing harvest of troubles” (31).  Autonomous technology best serves as a metaphor for the enormity of the interconnections within technology and its connections to humanity, culture, and science.  Charlie Chaplin’s Modern Times comes to mind as an example of the great factory and its ingestion of humans to serve its ends.  This connects to Winner’s description of the voluntarist way of viewing technology, which holds that technology advances thanks to human controllers.  Winner further describes it by stating, “behind the massive process of transformation one always finds a realm of human motives and conscious decisions…Behind modernization are always the modernizers, behind industrialization, the industrialists” (53).  People still use their capital, inventiveness, and decision making to shift the course of technological change in the direction that they choose.

    However, the network of science, technology, and culture may provide the impetus of an “invisible hand” not unlike the one envisioned by Adam Smith for capitalism.  Winner notes, “whereas the immediate application of a particular technology is usually conscious and deliberate, other consequences of its presence in the world often are not” (74).  Networks and interactions may lead to new developments that were not imagined by the originator of one particular artifact or process.  This reveals the complexity of the overlaps and connections between science, technology, and culture.  Therefore, each of these discrete subjects plays upon the others.

    These concepts are further developed by Bruno Latour’s formulation of actor-network theory.  Latour’s theory is based on the interaction of dissimilar areas of interest such as technology and culture.  Of interest are the networks that form between these dissimilar elements.  Where is there a need for some new science or technology?  How was the need determined?  What solution was developed and how was it developed?  What resources or areas did the solution draw upon in order to be developed?  What networks form after a new technology is introduced?  What ‘political’ power forms around technological successes and failures?  How do things change as a result of a new technology?[vii]

    Winner adds to Latour’s actor-network theory when he writes, “technology always does more than we intend; we know this so well that it has actually become part of our intentions” (97-98).  The networks that form between technology and culture are a sort of breeding ground for new uses of technology.  The pathways that connect these ‘separate’ areas of ideology and practice are where re-creation takes place, adding to the original intent of the originator of some new technology.[viii]  Changes in Latour’s actor-networks are similar to Winner’s point that “technologies…demand the restructuring of their environments” (100).[ix]  I bring up this point because, by extension, the environments of a technology encompass both the physical location of a technological artifact or practice and the networks in which the technology is situated, all of which may require restructuring.

    The next area of study that I am going to examine is Cold War studies.  Cold War studies is the historical evaluation and investigation of the cultural and political aspects of the period between 1945 and 1990 (i.e., the Cold War era).[x]  One of the overarching technological artifacts of the Cold War is the nuclear bomb.  The destructive reality of the atomic bomb (and later, the thermonuclear bomb) brought about a duality of opinions about that technology (i.e., it was perceived as inherently good or evil).  This connects to Langdon Winner’s observation that advanced technology tends to be perceived in dualistic terms.[xi]  Cold War studies, like science studies, looks at the networks involved in the development and promulgation of technologies that alter the cultural landscape, but in this particular discipline the emphasis is on the dichotomy between the democratic West and the communist East.  It should be noted that not everything between 1945 and 1990 can be tied to the Cold War, but “so much was influenced and shaped by the Cold War that one simply cannot write a history of the second half of the 20th century without a systematic appreciation of the powerful, oft-times distorting repercussions of the superpower conflict on the world’s states and societies” (McMahon 105).

    Paul Boyer begins By the Bomb’s Early Light by looking at a plethora of cultural artifacts (e.g., speeches, newspaper articles and cartoons, radio reports, and documentary films).  He says the aim of his book “is an effort to go back to the earliest stages of our long engagement with nuclear weapons” (xix).  By returning to the beginning, he hopes to uncover how America got to where it was when his book was published in 1985.  Boyer recounts that as he narrowed his focus to 1945-1950, he came to “the realization of how quickly contemporary observers understood that a profoundly unsettling new cultural factor had been introduced–that the bomb had transformed not only military strategy and international relations, but the fundamental ground of culture and consciousness” (xix).  This statement reveals the way in which networks between science, technology, and culture (as described by Latour and Winner) connect to one another.  If there is a shift at one place in the network, the shift is recorded within other nodes in the network.  These seemingly separate elements all push and pull upon one another in varying ways (and these disturbances are not necessarily one-to-one relationships).

    The ‘Nuclear Era’ begins at nearly the same moment as the Cold War.  After the dropping of the bombs called Little Boy and Fat Man on the Japanese cities of Hiroshima and Nagasaki on August 6 and August 9, 1945, respectively, “the nuclear era…burst upon the world with terrifying suddenness.  From the earliest moments, the American people recognized that things would never be the same again” (DOE 51-53; Boyer 4).  James Reston extended the fact of the Japanese cities’ devastation to the possibility of an American wasteland when he wrote in the New York Times, “In that terrible flash 10,000 miles away, men here have seen not only the fate of Japan, but have glimpsed the future of America” (qtd. in Boyer 14).  Boyer goes on to write, “Years before the world’s nuclear arsenals made such a holocaust likely or even possible, the prospect of global annihilation already filled the national consciousness.  This awareness and the bone-deep fear it engendered are the fundamental psychological realities underlying the broader intellectual and cultural responses of this period” (Boyer 15).  Even though America, at that time, was the only possessor of the bomb, Americans realized that it was a weapon that would eventually be held by others.  The enormity of the destruction caused by these new technological creations weighed on many minds.

    The scientists who spoke out against the threat of nuclear annihilation unfortunately “[displayed] considerable political naïveté, seeming not to grasp the fundamental differences between the political realm and that of the laboratory and the classroom” (Boyer 99).  The scientists sought reform through education, or as Einstein said, “To the village square we must carry the facts of atomic energy.  From there must come America’s voice” (qtd. in Boyer 49).  The bomb was not going to go away, and the suggestions for a technocratic world government that could rationally control the use of the bomb also lost steam through the end of the 1940s.  Other political currents were at work as well:  President Truman’s “address to a joint session of Congress on March 12, 1947, spoke in sweeping, apocalyptic terms of communism as an insidious world menace that lovers of freedom must struggle against at all times and on all fronts” (Boyer 102).  Fear shifts from the nuclear bomb to communism.  This leads to the bomb becoming a part of America’s national defense at the beginning of the Cold War–even more so after the Soviets tested their first nuclear bomb on August 29, 1949.  Additionally, there is a shift toward an American identity based on homogeneity because of the call for an idealized cooperative effort in the post-war years to bolster America’s standing in the world.  There are calls for cooperativeness by Arthur Compton and Eleanor Roosevelt (Boyer 139-140).  This cooperativeness, however, leads to an alignment of political views that bolsters the collective ideology promoted by the Truman, and later, Eisenhower administrations.  The space for open discussion is squashed.

    When the Manhattan Project scientists were at work on the bomb in the laboratory, they were able to cultivate a mighty political and military power through the use of the atomic bombs.  But the science and technology behind the bombs was appropriated by the military and the government leadership.  The United States government footed the bill for the Manhattan Project, and there was never hesitation on the part of the Administration about the use of atomic bombs on Japan.  Once they were completed, they were to be used.  Therefore, a great deal of political power was created within the laboratories of the Manhattan Project, but that power was not for the use of the scientists.  For a while, the American public listened to the scientists who were opposed to the further use of the bomb, but that power of attention quickly dissipated as the threat of atomic weapons was overshadowed by the political enforcement of a new fear centered around the Soviet Union.

    Boyer then shifts to looking at the cultural aspects of the atomic bomb in literature and, specifically, science fiction.  He writes, “Apart from a few isolated voices, however, the initial literary response to the atomic bomb was, to say the least, muted” (246).  He goes on to say, “Indeed, it sometimes seemed that the principal function of literature in the immediate post-Hiroshima period was to provide a grab-bag of quotations and literary allusions that could be made to seem somehow relevant to the bomb” (247).  The bomb is not immediately engaged by literary authors in this period.  However, “As Isaac Asimov later put it, science-fiction writers were ‘salvaged into respectability’ by Hiroshima” (Boyer 257).  Boyer goes on to say, “Up to 1945, most science-fiction stories dealing with atomic weapons took place far in the future and often in another galaxy…Hiroshima ended the luxury of detachment.  The atomic bomb was now reality, and the science-fiction stories that dealt with it amply confirm the familiar insight that for all its exotic trappings, science fiction is best understood as a commentary on contemporary issues” (258).  Therefore, SF becomes the space where atomic bombs and nuclear age issues are talked about and engaged.  Because of the shift toward political homogeneity and uniformity, SF was a space where issues could be discussed that in another context (e.g., a cultural commentary or popular work of fiction) would be looked down upon or even attacked.

    These issues are further discussed in the discipline of science fiction studies.  Sharona Ben-Tov writes that SF lies “at a unique intersection of science and technology, mass media popular culture, literature, and secular ritual” (6).  SF lies at the intersection of all of the networks that I am discussing:  science, technology, and culture.  Ben-Tov continues, “In what source other than science fiction’s rich, synthetic language of metaphor and myth can we trace the hidden, vital connections between such diverse elements as major scientific projects (spaceflight, nuclear weaponry, robotics, gene mapping), the philosophical roots of Western science and technology, American cultural ideals, and magical practices as ancient as shamanism and alchemy?” (6).  Because SF is at the intersection of all of these diverse elements of American culture, it can be used in a manner similar to the way that Latour describes Pasteur’s use of anthrax spores in his petri dishes.  The scientist, within the laboratory, must go through many tests and permutations before arriving at a result that he or she is comfortable taking outside the laboratory.  SF is a space where all of these ideas can be worked out and thought over by diverse writers and thinkers.  The person engaged in SF studies then brings these books back to the ‘laboratory’ to find how the connections and networks that exist between science, technology, and culture are manifested in these works of SF.  SF serves as a map or model of the networks that exist in reality but that might not always be engaged in ‘real-world’ discussions.

    Genre theory offers another perspective on the role of SF at the intersection of dissimilar elements of American society.  Ben-Tov writes, “Science fiction’s use is as both model and symbolic means for producing heterocosms” (56).  A heterocosm is described as “an alternative cosmos, a man-made world” and it “made possible the conception of fictional real-life utopias” (Ben-Tov 20).[xii]  Utopias are distinctly related to SF, because they share many of the same elements of story and style.  Additionally, a utopia is written in response to the non-utopian qualities of the here-and-now.  SF creates heterocosms that also respond to the here-and-now, and SF often critiques or gives commentary on the here-and-now.  This commentary can relate to the way in which science, technology, and culture interact with one another.  What networks exist and how might they work more efficiently or differently?

    Leo Marx’s The Machine in the Garden explores literary examples that illustrate Americans’ embrace of technology and industry despite their longing for a mythic pastoral existence.  These desires are mutually exclusive as well as historically exclusive.  Scientific and technological progress does not come back to where it began (i.e., the idealized garden).  He explores cultural examples of these conflicted desires and he notes, “By incorporating in their work the root conflict of our culture, they have clarified our situation” (365).  Cultural works are the space where these issues are commented on and worked out.[xiii]  Ben-Tov comments:

    Unlike the texts that Marx surveys, however, science fiction does not try to temper hopefulness with history.  Instead, it tries to create immunity from history.  It reveals a curious dynamic:  the greater our yearning for a return to the garden, the more we invest in technology as the purveyor of the unconstrained existence that we associate with the garden.  Science fiction’s national mode of thinking boils down to a paradox:  the American imagination seeks to replace nature with a technological, man-made world in order to return to the garden of American nature. (9)

    The paradox of further embracing technology in order to return to a less technological existence is seen in many examples.  One popular example is from the television series Star Trek:  The Next Generation.  The holodeck is a technological artifact that relies on many networks of science and technology in order to present whatever the holodeck participant wishes to see.  In the first episode of the series, the audience is greeted by Commander Riker searching a forest for Lieutenant Commander Data, an android, who happens to be spending time reclining in the nook of a tree branch while surrounded by an idyllic wooded setting (“Encounter at Farpoint, Part I”).  The forest is a hyperreal recreation of a wooded setting within the confines of the holodeck.  The more effort and spending that goes into technology to return us to the idealized garden, the further away we are from the ideal.  Thus, it is within this paradox that some of SF’s societal commentary exists.

    Continuing with the idea of returning to an idyllic space (i.e., the garden), Ben-Tov discusses the role of the alchemist.  She writes, “By speeding up nature’s ETA, the alchemist controls the very ends of time, while remaining outside it” (93).  The alchemist’s ‘cooking’ of metals conjures Latour’s image of Pasteur working in his laboratory on the growths in his petri dishes.  The trials and growths in his laboratory are an unnatural speeding up of processes that haphazardly take place outside the laboratory in the real world.  The images of the alchemist and the scientist are still tightly bound in that both work removed from the real world in order to arrive at something that can be brought out of the lab and therefore back into the real world.  Ben-Tov relates the alchemist’s work with metals, particularly gold, which “often symbolizes the power to bring about millennium, the end of time, when the human race reaches perfection” (94).  Therefore, she points out, “Frequently, in science fiction the perfected form of humanity is literally crafted metal:  robots” (94).  Thus, not only do we further remove ourselves from attaining the idealized garden through our embrace of technology, but we physically remove ourselves by putting robots there in our place.

    Now, I am going to turn to an analysis of American film examples.  I will be paying attention to the effect of technology on American culture as represented in the films and the way the films themselves are connected to these networks.  I will also look at the networks that are present within and around the technologies that are presented.

    One contemporary film about the Manhattan Project is Roland Joffé’s Fat Man and Little Boy.  A fictional scene between General Leslie R. Groves (Paul Newman) and J. Robert Oppenheimer (Dwight Schultz) points to the heart of the matter surrounding technology and the networks in which it is situated.[xiv]  Groves takes Oppenheimer into a building where the bomb casings for Fat Man and Little Boy are hanging from the ceiling and he says, “Sometimes, just standing here, I keep wondering–Are we working on them, or are they working on us?  Give them dignity doctor, then we can start talking about who can do what and what they mean.”  Groves’ character respects the awesome power of the bombs that he has orchestrated into existence.  He represents the uncertainty surrounding a future with ‘the bomb,’ but he is also quite aware of the networks required to bring a weapon of this magnitude into existence.  Groves came from the Army’s Corps of Engineers.  Before being assigned to head up the Manhattan Engineering District, or Manhattan Project, he reconstructed America’s munitions industry and oversaw the building of the Pentagon.  If anyone was aware of the interconnections of technology, science, industry, and politics, it was General Groves.  This speech was written by the film’s screenwriters, Bruce Robinson and Roland Joffé.  That they wrote it for a character representing General Groves raises the questions surrounding networks and the unknown implications of new technologies.  Therefore, the man who brought together the networks behind the atomic bombs is represented as someone reverent toward the implications of the bomb and toward the future tied to its existence.  The film does not speculate about the answers to Groves’ questions, but it does raise them, perhaps to provoke discussion in the audience.

    The film On the Beach recalls the fear that erupted in America immediately following the use of the atomic bombs in Japan.  However, this film came out nine years after much of the dissent against the further use of atomic weapons had dissipated.  The Cold War intensified through the 1950s, and the United States and the Soviet Union both continued in earnest with their nuclear weapon test programs (which culminated in the development of thermonuclear weapons in the early 1950s).  On the Beach presents a world devastated by a nuclear war where the only survivors are an American nuclear submarine crew and the inhabitants of Australia.  Everyone who remains alive is awaiting the arrival of nuclear fallout from the devastated continents of the planet.  The film is fatalistic in that it presents a bleak future where no one is empowered to do anything about the impending doom.  All of the networks have broken down.  Australia is being starved because the world relied on networks of economic trade.  A lone country would not have the capability to produce all of the foods and goods that its inhabitants required, because technologies such as the efficient distribution of goods and services had spread supply chains and producers around the world.  When the rest of the world is effectively ‘blown up,’ Australia is left with its meager support networks of farms and producers, while the networks to goods elsewhere were destroyed when the bombs fell.  Cottage industries that might have existed in Australia become worthless when there are no agents on the other ends of the networks.  Moira (Ava Gardner) tells Cmdr. Towers (Gregory Peck), “It’s unfair because I didn’t do anything and nobody that I know did anything.”  This reveals the powerlessness that the ‘normal’ person has in affecting the politics of nuclear war.  It points to the possibility that everyday people are not connected to the networks of nuclear weapons with any sort of power to enact change.  Clearly, within the movie, the nuclear fallout is an invisible force that unrelentingly continues toward the last bastion of humanity.

    The Manchurian Candidate explores how the ‘soft’ science of psychology can be employed to turn a soldier into a machine.  During the Korean War, Major Bennett Marco’s (Frank Sinatra) platoon is ambushed and captured by Communist forces.  The film depicts the various Communist governments as working together, which reflected the West’s belief about the nature of Communism at that time.  SSgt. Raymond Shaw (Laurence Harvey) is ‘programmed’ much like a robot would be programmed to fulfill a set of instructions.  The psychiatrist (Joe Adams) tells Major Marco that the solitaire game evidently “serves as some kind of trigger mechanism.”  Marco remembers that Dr. Yen Lo of Moscow’s Pavlov Institute said that the Queen of Diamonds card is meant “to clear the mechanism for any other assignment.”  Shaw is therefore represented as a “mechanism,” implied to be a weapon that is set off by a “trigger.”  Shaw’s mother works for the communists, and she is assigned to be Shaw’s American operator.  She tells Shaw during his final ‘programming’ that “they paid me back by taking your soul away from you.  I told them to build me an assassin.”  Shaw is literally rendered a soulless machine who was built to order.  Major Marco attempts to ‘rewire’ Shaw and asks him, “What have they built you to do?”  After working through Shaw’s programming he orders Shaw, “It’s over…their beautifully constructed links are busted…We’re tearing out all the wires…You don’t work any more…That’s an order.”  Major Marco attempts to reprogram Shaw so that the Communist programming will no longer work.  The weight of Shaw’s guilt over the things that he is made to do causes him to break both the programming of the Communists and that of Major Marco.  Shaw chooses his own destiny/instructions when he decides to end the lives of his mother/operator (Angela Lansbury) and his step-father, Senator Iselin (James Gregory), and then his own.  The machine/Shaw was broken as no nuts-and-bolts machine could be.  His emotional response reveals his very organic and human underpinnings.  The machine-like psychological reprogramming did not totally remove his ability to be human.

    Westworld is an interesting example that shows robots masquerading as humans in the fictional entertainment park known as Delos.  These human-like robots are the targets for human vacationers’ lusts and desires.  If a guest wants to kill a robot, that is acceptable; if a guest wants to have sex, the robots are programmed to respond to those advances.[xv]  Winner’s master-slave relationship between humanity and technology is clearly delineated in this film.  The machines serve to provide a ‘realistic’ experience of what it was like to live in the American West, medieval England, or ancient Rome.  The dichotomies between master/slave, have/have not, and power-elite/masses are represented in the guest/robot relationship of Delos.  At $1000/day for a Delos adventure, I would conjecture that only those with monetary power, and therefore potential for political power (within government or corporations), are able to play in the Delos world.  Delos replicates the world of 1973 in fictitious settings.  It also lies at the crossroads of robotic technology, computer control systems, transportation networks, managerial hierarchies, and the interaction of the power-elite customers within the Delos world.[xvi]  The plot advances when the robots begin to malfunction.  During a meeting, the chief supervisor (Alan Oppenheimer) suggests, “There is a clear pattern here which suggests an analogy to an infectious disease process.”  He confronts objections from the others by saying, “We aren’t dealing with ordinary machines here…These are highly complicated pieces of equipment…Almost as complicated as living organisms…In some cases they have been designed by other computers.”  Complexity, therefore, is the factor that connects machines to humanity.  The chief supervisor is suggesting that animal-like infectious disease behavior is being exhibited in the Delos command-and-control structure and that it manifests itself in misbehaving robots.  An interesting example of a robot not following instructions occurs when the robot playing a servant girl named Daphne refuses the “seduction” of a human guest.  The chief supervisor orders her taken to central repair, and as he walks away he says, “refusing.”  He says it as half question and half threat.  I say this because in the next scene, Daphne is ‘opened up’ on a table where a cloth is draped over her body and the electronics, located where her womb would be if she were human, are exposed.  The technicians surrounding her are all male, and she is referred to as a “sex model.”  The scene invokes an image of gang rape to enforce her programming to fulfill the pleasures desired by a human (male) guest.  One way or another, the human operators in Delos try to make the technology (slave) bend to their will (masters).

    The Day the Earth Stood Still is a film about reining in the escalating Cold War and nuclear arms build-up that followed World War II.  The film was released in 1951, one year before the United States detonated its first thermonuclear bomb.  After a flying saucer lands in Washington, D.C., the agent-networks of the United States are shown in motion.  The first ten minutes of the film reveal many of the different networks of technology and culture in contact with one another:  military command-and-control structures, military men and weaponry streaming out of Fort Myer toward their target, the media (print, radio, and television) mobilizing to cover the story and to release messages from the President, observers bringing their cameras, and the flying saucer and its inhabitants themselves.[xvii]  Klaatu (Michael Rennie), Gort (Lock Martin), and their flying saucer represent a power far greater than any on Earth.  A failure in command and control is represented when the soldiers are allowed to have loaded weapons and one shoots Klaatu and destroys his gift for the President.  The gift was meant to allow the President to study life on other planets; therefore, it represents a missed opportunity.  This idea of missed opportunities is also reflected in the build-up of nuclear weapons.  Klaatu seeks counsel with all of the Earth’s leaders, but their inability to come together and communicate is another lost opportunity.  The film mirrors the early calls for a ban on nuclear weapon development that Boyer charts in By the Bomb’s Early Light.  The Day the Earth Stood Still is about six years late, just as On the Beach seemed to be late for that early ideological party in the early part of the Cold War.  Because Klaatu cannot bring together representatives from all Earth’s nations, he instead convinces Professor Jacob Barnhardt (Sam Jaffe) to bring together scientists from around the world.  Klaatu then delivers his message to them to take back to their countries.  This conjures images of technocratic governments that rule through rationality and reason.  Scientists rely on open communication, and it is that which allows Klaatu to get his message out.  Instead of going to Einstein’s “village square,” Klaatu chairs an academic conference.  Klaatu informs his listeners that the Earth is now a member of a greater community in the universe and, as a member, he warns them that robots like Gort were created to preserve peace among the planets.  Fear of invoking the wrath of the robots for any aggression maintains the peace.  The other worlds of the universe, as Klaatu says, “live in peace…Secure in the knowledge that we are free from aggression and war, free to pursue more profitable enterprises.”  He goes on to say, “And we do not pretend to have achieved perfection, but we do have a system and it works.”  Gort and the “race of robots like him” are doubles for the atomic bomb.  Both are technological weapons that preserve the peace through the threat and fear of use.  Supposedly Gort only acts upon aggression–one assumes because of his programming.  The same is true of what is said of the command and control systems in place to control the use of atomic weapons.  There is no one button that launches a missile or deploys the bombers.  Also, Gort and the bomb are outside the control of all of humanity, save a few political and military leaders.  The people can make their voices heard, but ultimately it is the decision of the politicians whether a system will be taken offline or an attack will be launched.
The weapons build-up itself is framed within Eisenhower’s ominous warning about the military-industrial complex.  The networks of military power, industrial growth, and commerce helped fuel the arms race as well as the hot wars that took place within the supposed Cold War.  What is the history of Gort and his kind?  Were there similar networks in place on an interplanetary scale?  The answers to these questions were omitted from this movie, but they are pertinent on a smaller scale to our own planet and specifically to America.

    Colossus:  The Forbin Project presents another doubling of the dichotomy between US and Soviet nuclear arms proliferation.  Instead of a greater number of nuclear weapons (the ultimate power in death and destruction) providing peace, the US command and control structure is given over to the gigantic computer system called Colossus.  A rational computer handling defense is believed to be more reliable than irrational human leadership.  The computer’s activation at the beginning of the film is symbolic of the separation of humanity from the advanced technologies that it creates.  That technology, which is assumed to be subservient, is unlike us physically, but as the film unfolds, the technology actually personifies human traits of domination and control.  Ultimately a belt of radiation, also born of scientific and technological innovation and used as a weapon, divides the machine from the humans it serves.  One of the themes that the films I have selected to study show is the turning over of human agency to technology.  In effect, it is a representation of the American desire to return to the garden through the further use of technology.  Instead of disarmament, we give the power of annihilation to a computer system that is supposedly better suited to deciding when an attack is imminent and when retaliation should take place.  Additionally, Forbin (Eric Braeden), Colossus’ creator, hopes that Colossus will not only serve as a defense mechanism, but also solve a plethora of social ills in the world.  The problems begin after Colossus discovers the existence of another system, like itself, in the USSR.  Colossus demands that communication be set up between the two.  Images of the blinking lights even include one graphic that looks like a pulse on a piece of medical equipment.  The point is that these machines are alive (i.e., self-aware).  Because the weapons that humanity built to destroy one another are put under the control of Colossus and its counterpart, Guardian, the new systems of command and control move to take over the world.  Colossus commands that all communication, media, and military control systems be tied into it.  Colossus and Guardian become the hub of all the technological networks.  The master and slave switch places as Forbin is made Colossus’ prisoner.[xviii]  Colossus orders all missiles in the USA and USSR to be reprogrammed to strike targets in countries not yet under Colossus/Guardian’s control.  The ‘voice of Colossus’ states, “This is the voice of world control…I bring you peace…Obey and live…Disobey and die…Man is his own worst enemy…I will restrain man…We can coexist, but on my terms.”  The technology meant to serve humanity is transformed into the technology that comes to control humanity.[xix]  Master and slave relationships are reversed.

    The final film that I will discuss is Strategic Air Command.  It begins with Lt. Col. Robert ‘Dutch’ Holland (Jimmy Stewart) being recalled to active Air Force duty because America’s Strategic Air Command (SAC) needs experienced air commanders.[xx]  His wife, Sally (June Allyson), tells him, “anything you do is fine with me, as long as you don’t leave me behind.”  Dutch forgets his wife’s words as the film progresses and he becomes mired in the technology that he must surround himself with on a daily basis.  A sort of ‘love triangle’ forms between Dutch, Sally, and the bombers that he commands.  Dutch begins flying in the Convair B-36, and he is treated to a detailed tour by Sgt. Bible (Harry Morgan).  These scenes are more about the technology of the bombers than the men who operate them.  There are montages of the bomber in flight along with detailed sound recordings of the bomber while it is on the ground.  Attention is also given to the protocols of communication (another technology unto itself).  Later, General Hawkes (Frank Lovejoy) shows Dutch the new Boeing B-47 Stratojet.[xxi]  Dutch responds in starry-eyed awe, “Holy smokes she’s the most beautiful thing I’ve ever seen…I sure would like to get my hands on one of these.”  The bomber is “beautiful,” and it is more deserving of the attention of his hands than his wife at this point in the film.  General Hawkes goes on to present a contrast inherent in the B-47:  it is fragile, but it is also the carrier of the most destructive force on the planet.  He says, “the mechanics have to wear soft soled shoes because a scuff on this metal skin could slow it down 20 MPH,” but this seemingly delicate surface carries “the destructive power of the entire B-29 force we used against Japan.”  He believes SAC and the B-47 represent the best hope for peace through superior air power and deterrence.[xxii]  Dutch chooses technology over his wife when he decides to enlist in the Air Force permanently without speaking to her about it first.  SAC appropriates Dutch’s life (baseball, wife, and child).  His wife “doesn’t even know him any more.”  Dutch, in effect, chooses his mistress, the bomber.  Instead of continuing to blame her husband for his technological fetish, Sally confronts General Castle and General Hawkes about Dutch being “maneuvered” into having no choice in the matter of reenlisting.  General Hawkes replies to her entreaties, “Mrs. Holland, I too have no choice.”  SAC, in effect, removes choice because of the need for the technology to be employed in a war of deterrent technologies.  At the end of the film, Dutch is teary-eyed when he is forced to stop flying because of a chronic injury.  He did not shed a tear when he walked out of the house while Sally cried about his not consulting her on a lifelong career choice–a choice that she is bound to but had no part in making.  The film ends with a squadron of B-47 bombers flying over the airfield while Dutch looks up to the skies and Sally looks up to Dutch.  He never returns her affectionate stare.  Therefore, the bomber commander’s heart is connected more to the technologies of mutually assured destruction than to the flesh and blood of his own wife.

    These films provide representations of actor-networks between science, technology, and culture.  The films themselves are also tied into those actor-networks.  How we deal with the implications of these networks leads us back to what Leo Marx suggests about technology encroaching on the idyllic garden.  He writes, “To change the situation we require new symbols of possibility, and although the creation of those symbols is in some measure the responsibility of artists, it is in greater measure the responsibility of society.  The machine’s sudden entrance into the garden presents a problem that ultimately belongs not to art but to politics” (365).  To name something implies power over the thing named.  Therefore, power lies in building a terminology and language for engaging these many-layered networks.  When technology meets society, when the laboratory brings out its newest creation after many trials, when there is uncertainty about the implications of technology’s impact on society or the world in general, the language and terminology of ‘what it all means’ must come from art, discussion, and political action.  The agent-networks that consist of the interaction of science, technology, and culture are not easily mapped and therefore should not be thought of as simple systems unto themselves.  There exists a complexity that must be engaged by becoming part of the network itself, and it is that complexity which is reflected in the film examples that I have studied in this paper.

    Works Cited

    Boyer, Paul.  By the Bomb’s Early Light.  New York:  Pantheon Books, 1985.

    Colossus:  The Forbin Project.  Dir. Joseph Sargent.  Perf. Eric Braeden, Susan Clark, and Gordon Pinsent.  Universal Pictures, 1970.

    The Day the Earth Stood Still.  Dir. Robert Wise.  Perf. Michael Rennie and Patricia Neal.  Twentieth-Century Fox, 1951.

    “Encounter at Farpoint, Part I.”  Star Trek:  The Next Generation.  Dir. Corey Allen.  Perf. Patrick Stewart, Jonathan Frakes, and Brent Spiner.  Paramount, 28 September 1987.

    Fat Man and Little Boy.  Dir. Roland Joffé.  Perf. Paul Newman, Dwight Schultz, and John Cusack.  Paramount, 1989.

    Latour, Bruno.  “Give Me a Laboratory and I Will Raise the World.”  Science Observed.  Eds. Karin D. Knorr-Cetina and Michael J. Mulkay.  London:  Sage, 1983.  141-170.

    The Manchurian Candidate.  Dir. John Frankenheimer.  Perf. Frank Sinatra and Laurence Harvey.  MGM, 1962.

    McMahon, Robert J.  The Cold War:  A Very Short Introduction.  Oxford:  Oxford UP, 2003.

    Modern Times.  Dir. Charlie Chaplin.  Perf. Charlie Chaplin and Paulette Goddard.  United Artists, 1936.

    On the Beach.  Dir. Stanley Kramer.  Perf. Gregory Peck and Ava Gardner.  MGM, 1959.

    Gosling, F. G.  The Manhattan Project:  Making the Atomic Bomb.  Department of Energy.  Washington:  GPO, 1999.  1 August 2005 <http://www.mbe.doe.gov/me70/manhattan/publications/DE99001330.pdf>.

    Ramsey, Norman.  History of Project A.  Rough Draft.  Los Alamos National Laboratory.  27 September 1945.  3 August 2005 <http://www.lanl.gov/history/atomicbomb/victory.shtml>.

    Strategic Air Command.  Dir. Anthony Mann.  Perf. James Stewart and June Allyson.  Paramount, 1955.

    United States.  Los Alamos National Laboratory.  Staff Biography:  General Leslie R. Groves.  2005.  3 August 2005 <http://www.lanl.gov/history/people/L_Groves.shtml>.

    Westworld.  Dir. Michael Crichton.  Perf. Yul Brynner, Richard Benjamin, and James Brolin.  MGM, 1973.

    Winner, Langdon.  Autonomous Technology:  Technics-out-of-Control as a Theme in Political Thought.  Cambridge:  MIT Press, 1977.

     


    [i] “’Technology,’ therefore, is applied haphazardly to a staggering collection of phenomena, many of which are recent additions to our world.  One feels that there must be a better way of expressing oneself about these developments, but at present our concepts fail us…One implication of this state of affairs is that discussions of the political implications of advanced technology have a tendency to slide into a polarity of good versus evil.  Because there is no middle ground for talking about such things, statements often end up being expressions of total affirmations or total denial.  One either hates technology or loves it” (Winner 10).

    [ii] This polarizing effect that Winner observes about technology is discussed in the paper that I delivered at Georgia Tech’s Monstrous Bodies Symposium in April 2005.  It is titled, “Monstrous Robots:  Dualism in Robots Who Masquerade as Humans.”  I explore the dualistic natures of two fictional robots:  Asimov’s R. Daneel Olivaw and Cameron’s Terminator.  These robots are cultural manifestations of the breakdown of technological discourse into a dualism of good versus evil.  Asimov approaches this issue mathematically by endowing his robots with axioms known as the “Three Laws of Robotics.”  These proper starting positions enable the robots to have a moral compass that makes them ‘good.’  Cameron’s view is that, left to its own devices, technology (i.e., Skynet and its Terminator henchmen) will seek its own best interests (i.e., annihilating humanity through nuclear war).  I will later develop this idea further in looking at how SF is the space where discussions about science and technology take place.

    [iii] Winner defines four elements of ‘technology.’  He defines apparatus as the “class of objects we normally refer to as technological–tools, instruments, appliances, weapons, gadgets” (11).  He defines technique as “technical activities–skills, methods, procedures, routines” (12).  His definition for organization is “social organization–factories, workshops, bureaucracies, armies, research and development teams” (12).  He defines a network as “large scale systems that combine people and apparatus linked across great distances” (12).

    [iv] This issue is embodied in SF stories about artificial intelligence (A.I.).  When machines are given thought, self-awareness, and choice–what will they choose to do?  Will they have ‘real’ or ‘authentic’ free choice?

    [v] More on this in the film discussions in the latter section of this paper.

    [vi] “Something must be enslaved in order that something else may win emancipation” (Winner 21).

    [vii] An example of actor-network theory in practice is illustrated in Latour’s “Give Me a Laboratory and I Will Raise the World.”  The paper explores Pasteur’s laboratory and how it is situated between farmers, veterinarians, statisticians, science, and economics.

    [viii] He continues, “Each intention, therefore, contains a concealed ‘unintention,’ which is just as much a part of our calculations as the immediate end in view” (98).  Specific purposes actually lead to many other purposes.  This leads to progress.  Winner writes, “In effect, we are committed to following a drift–accumulated unanticipated consequences–given the name progress” (99).

    [ix] Winner writes, “Here we encounter one of the most persistent problems that appear in reports of autonomous technology:  the technological imperative.  The basic conception can be stated as follows:  technologies are structures whose conditions of operation demand the restructuring of their environments” (100).

    [x] There is continued debate about the accepted dates for the beginning and end of the Cold War era.  I have chosen to use the dates provided by McMahon.  He writes, “The Cold War exerted so profound and so multi-faceted an impact on the structure of international politics and state-to-state relations that it has become customary to label the 1945-1990 period ‘the Cold War era.’  That designation becomes even more fitting when one considers the powerful mark that the Soviet-American struggle for world dominance and ideological supremacy left within many of the world’s nation-states” (McMahon 105).

    [xi] “One implication of this state of affairs is that discussions of the political implications of advanced technology have a tendency to slide into a polarity of good versus evil…One either hates technology or loves it” (Winner 10).

    [xii] “For if the Earthly Paradise garden was not a poet’s imitation of nature but, instead, his own independent invention, then it logically followed that human beings could independently realize the pleasant qualities of the Earthly Paradise.  By applying the theory of the heterocosm to society in general, the utopian attempted to create an improved human condition that owed nothing to powers outside human reason and will.  A man-made system, utopia, appropriated the abundance and social harmony of the garden and replaced Mother Nature as their source.  In utopia the lady vanishes:  the figure of feminine nature no longer enchants Earthly Paradise” (Ben-Tov 20).

    [xiii] Marx goes on to say, “To change the situation we require new symbols of possibility, and although the creation of those symbols is in some measure the responsibility of artists, it is in greater measure the responsibility of society.  The machine’s sudden entrance into the garden presents a problem that ultimately belongs not to art but to politics” (365).

    [xiv] This scene never took place in reality because the bombs were not pre-assembled like this at Los Alamos.  Final construction of the bombs took place on Tinian Island in the western Pacific (History of Project A 12-14).

    [xv] Westworld, however, doesn’t explore possibilities outside of a narrative track.  Death dealing is handled in duels, barroom brawls, and sword fights.  Sex is allowed between men and women with one of the parties being a Delos robot.  Reckless killing and same-sex encounters are two examples that I can think of that were not explored within the film.

    [xvi] Of note, the control room, the robot repair room, and the technicians’ meeting room each represent a different kind of command and control structure–all of which lie under the Delos moniker.

    [xvii] The film itself (as an artifact) represents film production technologies, distribution systems, movie and sound projection systems, copyright law, the networks of payment, guilds and unions, etc.

    [xviii] While Forbin is testing out Colossus’s surveillance system, he says, “It is customary in our civilization to change everything that is ‘natural.’”

    [xix] This thought is connected to General Groves’ speech in Fat Man and Little Boy that I referenced earlier.

    [xx] I’m sure the producers of this film were eager to employ Jimmy Stewart in this role because of his experience flying bombers such as the B-24 and B-52.

    [xxi] It seems like the film could have gone in a different direction with characters named “Bible” and “Hawkes.”  However, there do not appear to be any symbolic metaphors at play with these characters other than Hawkes being committed to his role as a ‘Cold Warrior.’

    [xxii] In Strategic Air Command, a ground-based radar operator delivers the chilling line, “We’ve been bombing cities everyday and every night all over the US, only, the people never know it.”  He is responding to a question about how practice bomb runs take place even in the rain through the use of radar.  The quote points to an underlying fear that the bomb is a threat from within as well as from out.

  • Recovered Writing: Undergraduate Technology & American Society Paper on Handheld Calculators, Nov 26, 2003

    This is the fifteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    This essay was my term paper in Professor Steven W. Usselman’s HTS 3083, Technology and American Society course at Georgia Tech in Fall 2003. I wrote this essay in the second class that I took from Professor Usselman (with the first being HTS 2082, Science and Technology of the Industrial Age). Professor Usselman gave his lectures as engaging stories full of detail and context. As a lecturer, he knows how to guide and support his students on their way to understanding. It is a credit to Professor Usselman that I remember enjoying his lectures, but I do not remember writing my essay below (which alarmingly is true for much of my early writing). However, I thought that this essay would share some correspondence with the object-oriented essays in my previously posted essays from Professor Kenneth J. Knoespel’s Technologies of Representation class. These kinds of interdisciplinary and cross-disciplinary connections are what excited me the most about my Georgia Tech undergraduate education.

    Jason W. Ellis

    Professor Steven W. Usselman

    HTS3083

    November 26, 2003

    Introduction of Electronic Handheld Calculators

    The story of the electronic handheld calculator is about making one product to sell to consumers while proving a piece of that product to industry.  The electronic handheld calculator probably would have come along eventually, but Texas Instruments introduced it in America not to fill a void or need in the marketplace for electronic handheld calculators, but to push the idea of the “heart” of the calculator–the integrated circuit.  The story of the calculator is tightly woven with that of the integrated circuit, or microchip.

    Before the handheld calculator debuted, there were desktop electronic calculators, which “had to be plugged in (120 v), were the size of typewriters, and cost as much as an automobile” (Hamrick 633).  After WWII, scientists, engineers, bankers, actuaries, and others found greater need of computational power.  With the advent of transistors to replace the much larger vacuum tube, electronic computation machines could be reduced in size.  The story of the integrated circuit and the transistor is almost a case of history repeating itself.  In 1954, Texas Instruments was one of the world leaders in mass producing transistors.  The public and industry, however, were not yet ready to jump on the transistor bandwagon.  Pat Haggerty, VP of Texas Instruments, had his engineers develop a pocket-sized radio using transistors.  Because TI had limited experience with consumer products, it teamed up with the Regency Company of Indiana to market the pocket radio.  The radio was introduced just before Christmas of 1954, and over 100,000 radios were sold in the first year.  The salability of the transistor pocket radio impressed companies like IBM, which began to buy transistors from TI.

    TI had trouble selling the integrated circuit to big companies for introduction into their products.  Also, the integrated circuit as it stood when first developed did not make for a good business model.  It was difficult to build a good integrated circuit, but once a good one was built, it rarely went bad.  Because integrated circuits would not need to be replaced the way vacuum tubes did, TI wanted to find new applications for the integrated circuit so that it could be sold for use in many other products not yet using electronics such as transistors or tubes.

    Haggerty thought that this “invention technique” would work for introducing the world to the integrated circuit (Hamrick 634).  Haggerty ran the idea by the inventor of the integrated circuit, Jack Kilby, while on a flight back to Dallas.  What was to be invented was up in the air at this point.  Haggerty suggested to Kilby, “invent a calculator that would fit in a shirt pocket like the radio, or invent a lipstick-size dictaphone machine, or invent something else that used the microchip” (Hamrick 634).  Kilby liked the idea of inventing a calculator, so that is what he went with.  Kilby was allowed to choose his own team back at TI’s headquarters in Dallas.  He chose Jerry Merryman[1] and James Van Tassel.  Kilby made his pitch to his assembled team.  He described to them that they would build “our own personal computer of sorts which would be portable, and would replace the slide rule” (Hamrick 634).  At this time the invention was not yet called a “calculator,” but a “slide rule computer” (Hamrick 634).  It was code named CAL-TECH.  Tasks were divided among the team members:  Kilby worked on the power supply, Van Tassel worked mostly on the keyboard, and Merryman worked on the logic and the output.

    The CAL-TECH prototype was completed in November 1966, almost one year after it was first discussed by Haggerty and Kilby.  This first handheld electronic calculator was about 4” by 6” by 1.5” and it was a heavy 45 oz. because it was constructed from a block of aluminum.  What is interesting about the display of the CAL-TECH is that it doesn’t have one.  Its output is handled by a newly designed “integrated heater element array and drive matrix” which was invented by Merryman for this project.  This allowed the output to be burned onto a paper roll, and it was designed to use little power.  The CAL-TECH had 18 keys:  0, 1, 2, 3, 4, 5, 6, 7, 8, 9, ., X, +, -, ÷, C, E, and P (Hamrick 635).  This early calculator could actually only add and subtract.  For multiplication it would add repeatedly and for division it would subtract repeatedly.  The patent was first filed for the CAL-TECH on September 29, 1967.[2]
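
    The arithmetic scheme Hamrick describes lends itself to a short sketch.  The Python below is only an illustration of the general technique of multiplying by repeated addition and dividing by repeated subtraction; it is not a reconstruction of the CAL-TECH’s actual circuitry or logic.

    ```python
    def multiply(a, b):
        """Multiply two non-negative integers by repeated addition."""
        total = 0
        for _ in range(b):
            total += a
        return total

    def divide(a, b):
        """Divide non-negative a by positive b using repeated subtraction.
        Returns the quotient and the remainder."""
        quotient = 0
        while a >= b:
            a -= b
            quotient += 1
        return quotient, a

    print(multiply(12, 7))  # 84
    print(divide(84, 12))   # (7, 0)
    ```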

    As with the transistor radio, TI did not want to make the first handheld electronic calculators themselves.  TI partnered with Canon of Japan to market the consumer version of the CAL-TECH, the Pocketronic.  The Pocketronic was first offered to the market on April 14, 1970, the day before income tax returns were due (Hamrick 636).  The Pocketronic was lauded in Business Week magazine as “the portable, pocketable, all electronic consumer calculator that the electronics industry had long dreamed about” (Hamrick 636).  It was small and it weighed only 1.8 pounds.  Initially it cost $400 ($1500 in 1995 dollars), compared to the bulky desktop calculators which cost more than $2000 (over $7,500 in 1995 dollars) (Hamrick 636).  The parts needed to build electronic handheld calculators cost less than those in the electronic desktop calculators of the day.  For example, “the 1966 business calculator version retailing for $2000 contained over a thousand discrete semiconductors such as transistors and resistors with a cost of $170” (Ament).  Ament goes on to show that “in 1968, integrated circuits (ICs) began finding their niche in business calculators with a typical selling price of $1000…[which] had 90 ICs at a cost of $125.”  The Pocketronic used a MOS/LSI[3] IC which put all the functions of the calculator on one chip.  This further reduced the cost of parts and it reduced the number of parts involved in production.  This better economy of production helped fuel the boom in electronic handheld calculators that took place in the early 1970s.

    Compared to today’s calculators, the Pocketronic was outrageously expensive and it could only do basic arithmetic.  At that time, however, it was doing something that only specialized and much more expensive machines could do.  It was the first step in democratizing computational machines.  It would start the move of computation from academia and big business to K-12 schools and the home.

    The instruction manual for the Pocketronic features a picture of a man dressed in a suit holding the Pocketronic and performing a calculation for a woman wearing a coat, tie, and fashionable hat, who watches while standing in the open door of a car (Canon).  She is probably looking at the car at a dealership, and the man is a car salesman.  Initially this higher-cost item was probably marketed to professionals who could bear the cost of the new technology.  As with much technology, it was presented primarily as a man’s tool.  Hamrick takes some excerpts from early articles and advertisements of calculators in the 1970s.  Here are a few examples:

    1.  “Calculators are being sold to engineers, college students, and women to use for shopping.”

    2.  “Every housewife will have one (calculator) when she goes shopping.”

    3.  “Salesmen use them to compute estimates and prices for carpeting and fences.  A professional pilot carries one for navigational calculations.  A housewife with skeet-shooting sons checks shooting record cards.”

    4.  “At the supermarket, the new calculator will help your wife find the best unit price bargains.  At the lumberyard, they’ll help you decide which combination of plywood, lumber and hardboard would be least expensive for your project” (Hamrick 639).

    These excerpts reveal a sexism regarding how calculators will be used by men and by women.  Men are shown as using the calculator in a professional sphere.  The calculator is a tool that helps a man in his daily work.  Women are shown as using the calculator in the home sphere.  The calculator can be a tool for the woman to perform household duties, much as she would use a sewing machine or some other appliance.  The calculator was marketed to both men and women, but the attitudes shown in the advertising reveal a sexist bent regarding how the two sexes will use their respective calculators.

    Demand was great enough, however, that other manufacturers quickly began making their own electronic handheld calculators.  By “October of 1974, the JS&A Company, which sold calculators through mail and magazine advertisement, offered the Texas Instrument TI-2550 for an incredible $9.95.  For this period, a calculator under $10 was incredible cheap!” (King).  It would follow that in order to justify such a ramp-up in production there must have been a lot of people wanting to buy these electronic handheld calculators.  Robert King writes that there were “seven such ‘milestones’ leading to today’s commonly-used calculator” (King).  He lists them as portability, small size, replaceable batteries, increased functions, liquid crystal display, solar power, and cheapness (King).  These stages of calculator evolution were each mastered or integrated into products, increasing the market demand for the calculator while decreasing its cost.

    Slide rule manufacturers began to fall by the wayside because of the demand for calculators instead of slide rules.  For instance, “Keuffel & Esser, the oldest slide rule manufacturer…made its last slide rule in 1975,” only five years after the introduction of the Pocketronic (Hamrick 638).  Slide rules had been the primary portable computation device used by students, scientists, and engineers before the calculator came along.  The electronic desktop calculators also began to be phased out when more advanced and powerful calculators came out, such as Hewlett-Packard’s HP-35 in 1972.[4]  HP’s website describes the HP-35 as “the world’s first scientific handheld calculator. Small enough to fit into a shirt pocket, the powerful HP-35 makes the engineer’s slide rule obsolete. In 2000, Forbes ASAP names it one of 20 “all time products” that have changed the world” (HP).  The first handheld calculator made inroads into markets where people needed to make basic arithmetic computations.  These newer, more advanced calculators moved into the markets where the more specialized desktop calculators and early computer systems had been the mainstay.  The handheld calculator market muscled in quietly and quickly, usurping the dominant calculation technologies in many different arenas where people need to make calculations.

    In the home and business market, the calculator was swiftly adopted and integrated as a standard tool.  A source of some controversy involved the introduction of the calculator into schools.  There was not a loud outcry about students using calculators in college level classes.  In one example, Ohio State University redesigned its remedial college math class so that calculators were required for the curriculum.  Leitzel and Waits describe the situation at Ohio State in the autumn of 1974:  “we faced approximately 4500 students who were not prepared to begin our precalculus courses” (731).  The authors note that “the enrollment in our remedial course includes typically a large number of students from diverse backgrounds, with equally diverse abilities, with poor attitudes toward the study of mathematics, with poor study habits and, to a large extent, poor academic motivation” (Leitzel and Waits, 731).  Only a few years after the introduction of the handheld calculator, these professors were designing a new approach to an old mathematics course that would try to capture the attention of students with such poor study habits.  The calculator would be instructive and it would be a hook to get the students interested in the material.  They noted that “in using calculators students raised questions about arithmetic properties of numbers that would have been of little interest to them otherwise” and “the desire to use the calculator seemed often to motivate this understanding” (Leitzel and Waits, 732).  The calculator would let the students spend more time doing more problems in a sort of trial and error scenario.  It took a long time to do some calculations with a slide rule or by hand.  A calculator would allow for easy and quick computation involving larger numbers or large sets of numbers.  Leitzel and Waits propose that letting the students explore mathematics with the calculator as a facilitating tool allows the students to accomplish what they were not motivated to do before.  They add, however, “the question of whether a person who uses a hand-held calculator to do computations is somehow less educated than a person who does computations mentally we will leave for others to decide” (Leitzel and Waits, 732).  This was the big question regarding the calculator for those in education:  was the calculator something that built upon the learning process, or was it something that detracted from one’s development of arithmetic ability?  This question weighed much more heavily on those in K-12 education than in colleges.  Calculators were not rushed into kindergartens or the early grades in school.  I remember using calculators and adding machines at home and at my parents’ business when I was young.  The curriculum in the schools I attended in southeast Georgia didn’t allow the use of a calculator until the sixth grade.  That was in 1988-1989.

    This debate continues even in the higher levels of grade school.  One of the loudest arguments involves high school geometry and the development of proofs.  Proofs allow the student to see that there is a rational basis for particular mathematical rules and operations that might not appear intuitive at first glance.  James Stein Jr. writes, “I am extremely concerned by the current emphasis on calculators in the elementary and secondary mathematics curriculum.  The vast majority of my students, to borrow Hofstadter’s phrase, are woefully innumerate, a condition I believe has been exacerbated by the reliance on calculators” (447).  Stein[5] reveals that by this time, about 17 years after the introduction of the Canon Pocketronic, calculators were used in elementary and secondary schools.  Neil Rickert[6] writes regarding this issue, “although the curriculum a generation ago was far from ideal, at least the students learned that mathematics provided a powerful tool for solving interesting and difficult problems.  Today mathematically strong students are leaving high school convinced that mathematics is a boring and sterile subject, overloaded with pedantry” (447).  He feels that by spoon-feeding students axioms instead of having them discover the proofs behind those axioms and principles, the curriculum turns students away from mathematics.  The dynamo of change from proofs to a more problem-solving ideology is the calculator.  With the calculator, students are better equipped to perform complex operations and solve difficult problems, whereas before there was a limit to the number of problems or the complexity of a problem that a student could tackle with only pencil, paper, and a slide rule.  In response to Stein and Rickert, Lynn Arthur Steen[7] writes, “the calculator makes possible precisely the exploration of arithmetic patterns that Stein seeks.  To translate this possibility into reality will require greater emphasis on quality teaching so that calculators can be used effectively” (447).  Steen is looking for a solution involving teaching and the use of calculators.  He isn’t placing all blame on the calculators.  He goes on to say, “the need to move students from lower, rote skills to complex problem-solving has been recognized in virtually every report on education during the last decade.  It is calculation rather than deduction (as Rickert states) that improperly dominates today’s school curriculum” (448).  This shows that he also thinks that calculators take up too much space in school, in that students are encouraged and taught to use them in elementary and secondary schools.  He feels that there are greater skills that must be taught alongside the use of calculators.  Steen is suggesting that better problem-solving skills coupled with the calculator should be the new order for elementary and secondary school math education.  After the initial boom and integration of the calculator into educational life, everyday life, and professional life, there was a backlash against the adoption of the calculator in educational life.  There must be mediation between traditional rote skill learning and the use of the calculator.  There must also be a revision in the way problem-solving skills are taught and approached to better utilize the calculator as a tool and not as a crutch.  The debate regarding calculators in the classroom continues to this day, though it often concerns more advanced calculators, such as those capable of symbolic manipulation[8] and graphing complex equations.

    The electronic handheld calculator was initially embraced by many different people in different spheres of life such as the home, business, or school.  People needed to calculate percentages, balance checkbooks, more easily solve math problems, calculate interest, and many, many other things.  Initially the calculator moved into these different facets of society, and debate or dissent did not arise until calculator use grew in the school environment.  College mathematics departments tried to use calculators to help some remedial students get up to speed while other math professionals decried the use of calculators in elementary and secondary schools.  In the professional and home arenas, the calculator has been accepted as a useful tool to solve many problems that were once tedious or nearly impossible to do without the aid of some mechanical or electrical computation technology.  The introduction of the electronic handheld calculator was a quiet revolution that brought a democratization of calculation to nearly everyone in America.

    Works Cited

    Ament, Phil.  “Hand-Held Calculator.”  The Great Idea Finder.  Oct. 22, 2002.  Nov. 23, 2003 <http://www.ideafinder.com/history/inventions/handcalculator.htm>.

    Canon Incorporated.  Canon Pocketronic Instructions.  Japan:  Canon.  1970.

    Hamrick, Kathy.  “The History of the Hand-Held Electronic Calculator.”  The American Mathematical Monthly, Vol. 103, No. 8 (Oct., 1996), 633-639.

    Hewlett-Packard Company.  “HP timeline – 1970s.”  2003.  Nov. 23, 2003 <http://www.hp.com/hpinfo/abouthp/histnfacts/timeline/hist_70s.html>.

    King, Robert.  “The Evolution of Today’s Calculator.”  The International Calculator Collector, Spring 1997.  Nov. 23, 2003 <http://www.vintagecalculators.com/html/evolution_of_today_s_calculato.html>.

    Leitzel, Joan and Bert Waits.  “Hand-Held Calculators in the Freshman Mathematics Classroom.”  The American Mathematical Monthly, Vol. 83, No. 9 (Nov., 1976), 731-733.

    Rickert, Neil W.  “Mathematics Education.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447.

    Steen, Lynn Arthur.  “Mathematics Education:  Response.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447-448.

    Stein Jr., James D.  “Mathematics Education.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447.

    [1] Jerry Merryman is described as a “self-taught engineer” who attended Texas A&M, but never graduated.  He was considered “one of the brightest young engineers at TI” (Hamrick 634).

    [2] This first patent filing was followed by a refiling on May 13, 1971 and it was refiled again on December 21, 1972.  The CAL-TECH is covered by patent number 3,819,921 (Hamrick 635).

    [3] MOS/LSI stands for metal-oxide-semiconductor/large scale integration.

    [4] “The HP-35 was introduced in January, 1972 and was recalled in December, 1972.  The owners were sent a letter pointing out idiosyncrasies in programming caused by a defect in one logic algorithm.  HP offered to replace the calculator.  This was probably the world’s first instant recall.  The defect caused a few 10 digit numbers, when used in an exponential function, to give an answer that was wrong by 1%” (Hamrick, 638).

    [5] James D. Stein Jr. is in the Department of Mathematics at both California State University, Long Beach, CA and the University of California, Los Angeles, CA.

    [6] Neil W. Rickert is from the Department of Computer Science, Northern Illinois University, DeKalb, IL.

    [7] Lynn Arthur Steen is from the Department of Mathematics at St. Olaf College, Northfield, MN.

    [8] The TI-92 is able to solve equations for a numerical answer and it can perform many calculus operations such as derivatives and integrals.

  • Recovered Writing: Undergraduate Technologies of Representation Final Essay Response on Communication Tech and World of Warcraft, Dec 8, 2004

    This is the fourteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    This is my final post of material from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is my final paper assignment (I think given in lieu of a final exam) in LCC3314. The more exciting portion is question 2, which concerns Blizzard’s World of Warcraft. I break down how you navigate its space and I describe elements of its operation. It bears noting that at the time that I wrote this, WoW had been out for less than a month. I was rabidly playing it on my PowerMac G5 at 2560×1600 resolution on a 30″ Apple Cinema Display. While it might not have been the best essay, it certainly was one that I enjoyed writing to no end! I wish that I had found a way to make time for WoW since my days in Liverpool. I have played WoW on only rare occasions since returning to the States, but I continue to write about it from my memory of Azeroth.

    Also included below is my response to question 1, which seems to be focused on the telegraph, telephone, and cellular phone. In this question, I explore the material experience of using these different communication media and technological devices. I suppose WoW is another kind of communication technology wrapped up in a highly interactive gaming environment (cf. Hack/Slash).

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    December 8, 2004

    Final Paper Assignment

    1. On the telegraph, telephone, and cellular phone

    The telegraph, telephone, and cell phone each have a particular interface that works with different human senses and thus provide different experiences for the body.  The differences between these communication technologies lie in the physicality of the artifact as well as the underlying technology for encoding and decoding communication.

    The telegraph is a wired point-to-point textual communication technology.  Telegraph operation involves trained operators who can encode and decode the Morse code messages transmitted over wires with telegraph machines.  The process of sending a telegram involves finding a business that offers telegraph service, going there in person, and telling the telegraph operator the message to send; the operator encodes the message with the telegraph machine, the appropriate destination operator receives and decodes it, a delivery person is dispatched with the message, and the message is hand delivered to the recipient.  The experience of the telegram sender is standing at a counter and speaking with an operator.  The receiver interfaces with a delivery person who hands them a piece of paper containing the message.  The technology that makes sending and receiving messages over great distances possible is removed from the experience of the sender and receiver.  The sender and receiver also have to rely on a network of operators and delivery persons.  These people are in a unique position to view the correspondence between the sender and receiver.  This fact is probably something that senders of telegrams were well aware of.
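
    The operators’ encoding and decoding work amounts to a table lookup in each direction.  The sketch below uses only a partial Morse table (just enough for the example) and is meant to illustrate the process described above, not any particular telegraph apparatus.

    ```python
    # Partial International Morse table; letters are separated by spaces, words by " / ".
    MORSE = {
        "A": ".-", "D": "-..", "E": ".", "H": "....", "L": ".-..",
        "N": "-.", "O": "---", "R": ".-.", "S": "...", "T": "-", "W": ".--",
    }
    REVERSE = {code: letter for letter, code in MORSE.items()}

    def encode(message):
        """What the sending operator does: turn text into dots and dashes."""
        return " / ".join(" ".join(MORSE[ch] for ch in word) for word in message.upper().split())

    def decode(signal):
        """What the receiving operator does: turn dots and dashes back into text."""
        return " ".join("".join(REVERSE[code] for code in word.split()) for word in signal.split(" / "))

    wire_signal = encode("HELLO WORLD")
    print(wire_signal)          # .... . .-.. .-.. --- / .-- --- .-. .-.. -..
    print(decode(wire_signal))  # HELLO WORLD
    ```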

    The telephone is a wired point-to-point oral communication technology.  Telephones encode auditory information into electrical signals which travel over copper wires in a phone network to the receiving telephone, which decodes the electrical signals back into auditory information (the spoken voice).  Telephones allow users to hear the voice of the person they are speaking with.  One problem with telephones is that the technology uses a narrow band of audible sound that can cause “m” to sound like “n” or “b” to sound like “d.”  Initially, telephones were prohibitively expensive and were direct wired from location to location.  After telephone networks were made possible with human operator switching technology, voice phone calls could be routed from the call initiator to the call receiver.  Over time, the phone network’s mediation shifted from human operators to electrical switching technology.  When you made a call, you would speak to an operator first, and then to the person that you were calling.  Now, one can dial a number and the phone network’s automatic switching technology connects the caller with the receiver.  Someone who makes a phone call assumes privacy when the call is made from home or within an enclosed space such as a phone booth.  The physical interaction between the user and the telephone is that a handset is lifted off the base and held to the ear and mouth.  The user taps out a phone number on the base or dials a number with a rotary phone base.  The telephone user experiences an interaction with a disembodied voice.

    The cell phone is an unwired point-to-point oral and textual communication technology.  Modern cell phones are a synthesis of the telegraph, telephone, digital photography, video technology, and radio technology.  Cell phones facilitate voice conversations between two cell phones or between a cell phone and a wired telephone.  They also allow for text messaging, audio messaging, picture messaging, and video messaging.  Widespread cell phone use is shifting voice phone conversation into a more commonplace activity.  Additionally, the private sphere of telephone conversation is shifting to the public sphere of wherever the cell phone user answers or makes a phone call.  Cell phones also connect to the Internet and Internet-based text messaging networks such as AOL Instant Messenger.  The cell phone has become a place of contact for the individual in more ways than merely talking on the phone.  It builds connections between the individual and others as well as between the individual and information (e.g., online weather information, movie listings, online news websites, etc.).  With ear bud speaker/microphones that plug into cell phones, or wireless Bluetooth headsets, one can interface with the auditory communication features of a cell phone without needing to hold it up to the ear and mouth as one would with a traditional telephone.  Cell phone users also interface with a disembodied voice, but the cell phone has other means of interaction with people as well as with information.

    The telegraph is not an interactive means of communicating in the way that the telephone and the cell phone are.  With the telephone or the cell phone, one can have a real-time conversation with someone else, whereas with the telegraph there is a delay between sending a message, delivery, and, if need be, a return message.  The amount of information that can be conveyed through transmissions has increased over time.  The telegraph had a finite amount of information that could be conveyed because of the time and cost of sending messages with Morse code.  The telephone increased the amount of conveyed information because the disembodied voice could carry nuances of speech and emotive information (e.g., happiness, sadness, anger, etc.).  The cell phone has brought these communication systems full circle with the creation of a synthesis of voice and text.  Along with oral communications, a great deal of textual and graphic information can be conveyed through a cell phone.  Barbara Stafford writes, “we have been moving, from the Enlightenment forward, towards a visual and, now, an electronically generated, culture” (“Presuming images and consuming words” 472).  The cell phone represents the bringing together of communication, both between people and between people and sources of information.  Walter J. Ong writes in Orality and Literacy, “By contrast with vision, the dissecting sense, sound is thus a unifying sense.  A typical visual ideal is clarity and distinctness, a taking apart…The auditory ideal, by contrast, is harmony, a putting together” (71).  The modern cell phone brings together the visual and the oral in a way that previous communication technologies had not.  This unification ties two of the most powerful human senses (sight and sound) to the cell phone in a way that distinguishes it from the telegraph and telephone.

    An interesting development in these technologies is the perception that better communication technologies lead to better communication between individuals (i.e., a bringing together of individuals).  George Myerson writes in Heidegger, Habermas, and the Mobile Phone, “There’s no real gathering at all.  Instead, there are only isolated individuals, each locked in his or her own world, making contact sporadically and for purely functional purposes” (38).  Thus, the cell phone has disconnected the individual from the wall phone where one might be “waiting on an important call.”  Casualness and importance are intertwined in the use of the cell phone.

    I used Paul Carmen’s paper on the telegraph, Amanda Richard’s paper on the telephone, and Kevin Oberther’s paper on the cell phone as starting points for this essay.

    2. On World of Warcraft

    Blizzard Entertainment’s World of Warcraft video game was released on November 23, 2004 for both Windows and Mac OS X.  It is a massively multiplayer online role playing game (MMORPG) that immerses the player in a 3D fantasy world where the player is able to create a character based on several layers of identity (e.g., allegiance:  alliance or horde; races:  humans, dwarves, night elves, gnomes, orcs, tauren, trolls, or undead; and classes:  warriors, mages, druids, hunters, rogues, etc.).  After building a character (including designing a unique appearance), the player chooses a realm in which to play.  These realms correspond to computer servers that are in a particular time zone.  Other players around the world pick one of these realms to play in that best corresponds to when they will be playing, or when their friends will be playing.  The player is able to meet up with friends within a realm to go on adventures together, and if the player doesn’t know anyone, he or she can communicate with other players to form groups (large and small) to go on adventures with.  The objective of the game is to gain levels, complete quests, and battle the forces opposed to one’s allegiance.  Working with others is the key to success in World of Warcraft.

    When the player first enters the game, a movie clip is played that gives some introductory backstory information so that the player has a general idea about what is going on.  This movie is actually a fly-through of the area in which the player is going to begin playing.  This gives the player a chance to get his or her bearings before they are “on the ground.”

    The screen space has pertinent information regarding the character as well as the character’s location within the game.  The upper right corner of the screen has a round map showing the cardinal directions with the character centered on this small map.  The character is represented as an arrow so that the player can see which direction they are pointing without having to move around to get their bearings.  This player-centered map is similar to the Blaeu Atlas because it is built around the idea that the person doing the orienting is “inside the map.”  The Blaeu Atlas has lines emanating from points on open water toward landmarks.  These lines assist the person on the ocean in determining their approximate position from the landmarks that they see on particular lines of sight.  The system within the game takes this a step further by providing instant feedback on the direction the player is pointed in as well as the location of the player in relation to roads and landmarks.  Another feature that assists the player with recognizing their location is that as the character enters a new area or approaches a landmark, the name of that place will fade into the center of the screen for a few moments and then disappear.
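
    The “player-centered” quality of the minimap can be made concrete with a little coordinate geometry: a landmark’s position is translated so the player sits at the center and rotated so that the direction the player faces points “up.”  This is a hypothetical sketch of that transformation, not Blizzard’s code; the heading convention (radians, clockwise from north) is an assumption made for the example.

    ```python
    import math

    def to_minimap(landmark, player_pos, player_heading):
        """Convert a landmark's world (x, y) position into player-centered minimap
        coordinates, where +y points in the direction the player is facing.
        player_heading is in radians, measured clockwise from north (+y in the world)."""
        dx = landmark[0] - player_pos[0]
        dy = landmark[1] - player_pos[1]
        sin_h, cos_h = math.sin(player_heading), math.cos(player_heading)
        map_x = dx * cos_h - dy * sin_h   # how far to the player's right
        map_y = dx * sin_h + dy * cos_h   # how far ahead of the player
        return (round(map_x, 6), round(map_y, 6))

    # A landmark due east of the player sits straight 'up' on the minimap
    # when the player is facing east (90 degrees clockwise from north).
    print(to_minimap((10, 0), (0, 0), math.radians(90)))  # (0.0, 10.0)
    ```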

    Walking around is accomplished by using the keyboard with the mouse.  The W, A, S, and D keys (corresponding to forward, left, backward, and right) are used for walking around.  The mouse orients the “camera” around the player’s character on-screen.  Moving the camera around allows the player to better see up, down, or to the sides without having to walk in that direction, as the player would have to do if the character’s neck were in a brace.
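
    A minimal sketch of the WASD-plus-camera scheme described above: the pressed keys are summed in the character’s local frame and then rotated by the camera heading so that “forward” always means “away from the camera.”  The names and the heading convention (radians, clockwise from north) are assumptions made for the example, not Blizzard’s input code.

    ```python
    import math

    # Key directions in the character's local frame: +y is forward, +x is to the right.
    KEY_VECTORS = {
        "W": (0.0, 1.0),   # forward
        "S": (0.0, -1.0),  # backward
        "A": (-1.0, 0.0),  # left
        "D": (1.0, 0.0),   # right
    }

    def world_movement(pressed_keys, camera_heading):
        """Sum the pressed keys and rotate the result into world coordinates."""
        local_x = sum(KEY_VECTORS[k][0] for k in pressed_keys if k in KEY_VECTORS)
        local_y = sum(KEY_VECTORS[k][1] for k in pressed_keys if k in KEY_VECTORS)
        sin_h, cos_h = math.sin(camera_heading), math.cos(camera_heading)
        world_x = local_x * cos_h + local_y * sin_h
        world_y = -local_x * sin_h + local_y * cos_h
        return (round(world_x, 6), round(world_y, 6))

    # With the camera facing east, pressing W walks the character east (+x in the world).
    print(world_movement({"W"}, math.radians(90)))  # (1.0, 0.0)
    ```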

    The ground, buildings, hills, mountains, and caves are textured so that they appear the way one would expect these things to look.  There are clouds and sky above, and the ponds and lakes have shimmering water.  There are small and large animals in the forests that the player can interact with.  Other players’ characters are walking around in the same area that the player may be in.  There are also characters controlled by the game and the central game servers, called non-player characters (NPCs).  These are characters from whom the player can buy equipment, and some will invite the player to undertake quests in return for rewards.  Because the world that the game is set in involves fantasy, magic, and mythical beings, the buildings and inhabitants can be fanciful.

    The organization of the map, equipment, and battle function icons around the periphery of the play area of the screen (the world and the character centered on the screen) works very well.  They do not take up much area, so the player feels immersed in the game, but they are large enough to be meaningful and they all have unique icons (i.e., the interface adheres to HCI principles).  The player interaction with other players and the NPCs is good, but it does require referring to the help system or the user manual.  For playing World of Warcraft on Mac OS X, Blizzard chose to do something different from what one would expect.  Within the Mac OS X Finder, you hold down the Control key while clicking with the mouse to emulate a right mouse button (because most Macs do not have a mouse with two buttons).  Inside the game, however, you have to hold down the Command key (also known as the Apple key) while clicking with the mouse in order to perform a right click (which is used for picking up loot and for communicating with players and NPCs).  If the Blizzard developers had kept this consistent with what the player was expecting from using the operating system, interaction in the game space would have been more transparent.
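
    The consistency complaint boils down to a small mapping mismatch: the operating system teaches one modifier for a “right click,” and the game expects another.  The sketch below is purely illustrative; the action strings are invented and none of it is drawn from Apple’s or Blizzard’s code.

    ```python
    # The modifier the Finder taught Mac users versus the one the game actually used.
    FINDER_RIGHT_CLICK = "control"
    WOW_RIGHT_CLICK = "command"

    def wow_click_action(modifier_held):
        """Resolve a one-button mouse click inside the game (hypothetical actions)."""
        if modifier_held == WOW_RIGHT_CLICK:
            return "right click: pick up loot or talk to an NPC"
        return "left click: select target"

    # The habit carried over from the operating system fails inside the game:
    print(wow_click_action(FINDER_RIGHT_CLICK))  # left click: select target
    print(wow_click_action(WOW_RIGHT_CLICK))     # right click: pick up loot or talk to an NPC
    ```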

    The world through which the player navigates is immersive.  The player’s character is modeled in three dimensions, and the world that the character walks through is also modeled in three dimensions.  Physical principles such as gravity and optics are built into the game’s underlying technology.  Features in the distance are faded from view while those things up close have a tremendous amount of detail.  Because believability and level of detail can reach a point of diminishing returns, the look of the game is not photorealistic.  The Blizzard developers strike a balance between the look and feel of the world within the game and the amount of realism necessary for an immersive 3D environment.  Some physical laws are suspended, however, because of the mythic and fantasy elements of the world.  These elements have to be accepted on faith by the player in order for the game to have any meaning for the player.

    The narrative is carried by the exploration and fulfillment of quests by the player/character.  Because the environment is so expansive (like the real world), the narrative created by the exploration of the player is successful.  The terrain that the character walks through is based on models that do not change.  There are certain assumptions about perspective that are upheld within the game.  If a cliff appears to rise about three hundred yards ahead, that distance will not shift.  This is a technical consideration regarding the way that the “camera” focuses and presents perspective in the 3D world.  The game models a space of fantasy, but it must present that space in a way familiar to the experiences of its intended audience.

    There is a learning curve inherent in playing a game like World of Warcraft.  As Barbara Stafford writes in “Presuming images and consuming words,” “It is not accidental that this overwhelming volume of information—likened to drinking from the proverbial firehose—coincides with a mounting concern for bolstering and maintaining language ‘literacy’” (462).  Stafford is writing about the literacy of visual images.  There are subtle cues embedded in the game that the player has to recognize in order to play the game successfully (e.g., exclamation points over NPCs that have quests to offer and question marks over NPCs who are connected to quests in progress).  Iconic information provides the best way for quick access to game controls and functions.  The player has to develop a level of literacy with these icons in order to be a proficient game player.

    Additionally, the 3D environments presented in the game are similar to the descriptions of Renaissance gardens in Kenneth J. Knoespel’s “Gazing on Technology.”  The 3D environment of the game is promoting the underlying technology that makes 3D computer graphics possible in the same way that Renaissance technology was employed in building those gardens.  Knoespel writes, “Gardens, whether set out in Renaissance poetry or on the estates of the nobility, offer a controlled means for assimilating the new technology.  In each case, the audience views the machinery at a privileged distance as it would an entertainer…In fact, the garden conceals technology in its mythological narrative” (117-118).  The player does not have to understand how his or her 3D graphics accelerator works in order to enjoy the immersive experience of playing World of Warcraft.  This game is the “controlled means for assimilating the new technology” of 3D computer graphics.

  • Recovered Writing: Undergraduate Technologies of Representation Essay on a Future Technology, Personal Computing Device, Nov 18, 2004

    This is the thirteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is another example of a WOVEN multimodal essay assignment. In it, I used WVE/written, visual, electronic modes to discuss a specific technology. These essays (focused on past, present, and future technologies) gave me a chance to use technology to explore the meaning behind and impact of technologies. This essay focuses on a future technology of my own design.

    In this essay assignment, we were tasked with exploring an imagined future technology. At the time, I was fascinated with wearable computing. However, I only knew about it from my reading in magazines and online. I could not afford a 2004-era wearable computing rig, so I thought about how to improve on an idea of wearable computing for everyone. If only I had made a few more connections–namely touch and the phone.

    Nevertheless, I had a lot of fun designing the PCD and writing this essay.

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    November 18, 2004

    Artifact of the Future – Personal Computing Device

    Personal Computing Device – PCD (Drawing by Jason Ellis)

    The Artifact

    The Personal Computing Device (PCD) is an inexpensive and portable computer that can interface with many different input/output (I/O) components.  It is a one-piece solution to the ubiquity of computing and information storage in the future.  Its plain exterior hides the fact that this artifact is a powerful computing platform that transforms “dummy terminals” into points of access for one’s own computer, which is small enough to fit in a shirt pocket.

    Description

    The device measures 3″ wide by 4″ tall by 3/4″ thick.  On one of the long sides there is a small 1/4″ notch.  This notch matches a similar notch on the interface port of wearable computer networks, computing stations, and entertainment systems.  The notch allows the user to insert the PCD in only one orientation.  This protects the PCD and the interface port it is being plugged into.  The PCD is housed in a thin aluminum shell.  As the PCD does computing work, its circuits emit heat which needs to be removed from the system.  Because of the very small (< 90nm) circuit manufacturing process, the PCD uses very little power, which translates into less heat than today’s Pentium 4 or Athlon64 processors emit.  Aluminum is an excellent choice for its metal housing because it is thermally conductive (removes heat), lightweight, and inexpensive.

    Dimensional view of PCD (Drawing by Jason Ellis)

    There are no switches or indicators on the PCD.  It has only one interface port, as pictured in the top-left of the drawing above.  This standardized interface is what makes the PCD unique:  it allows the PCD to be used with any computing system designed for it.  Computer hardware, wearable computer networks, and home entertainment systems are “dummy terminals” which rely on the PCD to be the “brains.”

    The PCD is a full-featured computer.  It processes data, runs programs, and stores data on built-in solid-state memory.  Engineers were able to build a complete “computer on a chip” using new silicon circuitry layering techniques.  The result of this is the Layered Computing System as drawn in the internal schematic of the PCD (below).  Reducing the number of chips needed for a computing application has been a long-standing goal of electrical and computer engineering.  Steve Wozniak at Apple Computer was able to design elegant layouts for the original Apple I, and later, the Apple II.  He designed custom chips that brought the functions of several chips into a single chip.  AMD is continuing the trend today after integrating the memory controller onto the new Athlon64 processor.  NVIDIA introduced the nForce3 250Gb chipset, which integrated the system controller chip, sound, LAN (networking), and firewall all onto one chip.

    Internal layout of the PCD (Drawing by Jason Ellis)

    The solid-state memory is similar to today’s flash memory (e.g., USB flash drives or compact flash digital camera memory).  The difference lies in the density of the memory on the PCD.  Layering techniques are used in building the solid-state memory so that it is very dense (more data storage per unit area than today’s flash memory).  Typical PCD solid-state memory storage is 120 GB (gigabytes).  The PCD’s large memory area has no moving parts because it is made of solid-state memory.  Traditionally, computers need a hard drive to store large amounts of information for random access.  Hard drives are magnetic storage devices that depend on round platters rotating at high speed while a small arm moves across the platters reading and writing information.  Flash memory does not need to spin or have a moving arm.  Data is accessed, written, and erased electronically.

    The PCD has a built-in battery for mobile use.  When the PCD is plugged into a wall-powered device such as a computer terminal or entertainment system, it runs off power supplied by the device it is plugged into and its battery will recharge.

    Social Significance

    The introduction of the PCD revolutionizes personal computing.  The PCD empowers users to choose the way in which they interface with computers, networks, and data.  Computer displays, input/output, and networks have become abstracted from the PCD.  A user chooses the operating system (the latest Linux distribution, Windows, or Mac OS X) and the programs (e.g., Office, Appleworks, iTunes) for his or her own PCD.  That person uses only their own PCD, so it is customized in the way that they see fit, and they develop an awareness of its quirks and abilities in the same way that a person learns so much about his or her own car.

    The “faces” of computers (i.e., monitors, keyboards, mice, trackballs, and printers) are abstracted away from the “heart” of the computer.  The PCD is the heart because it processes data (input/output) much like the heart muscle moves blood through itself.  A PCD also acts as a brain because it stores information and it can computationally work on the stored data.  The traditional implements of computer use are transformed into dummy terminals (i.e., they possess no computational or data storage ability).  Each of these devices has an interface port into which one plugs a personalized PCD.  The PCD then becomes the heart and brain of that device, and it allows the user to interface with networks, view graphics on monitors, or print out papers.
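
    The heart-and-brain relationship can be sketched as a simple object model in which a terminal has no storage or compute of its own and only works while a PCD is docked in its interface port.  This is a speculative illustration of the imagined design, not a real API; every name in it is invented.

    ```python
    class PCD:
        """The imagined Personal Computing Device: it holds the user's data and does the computing."""
        def __init__(self, owner):
            self.owner = owner
            self.storage = {}  # stands in for the 120 GB of solid-state memory

        def run(self, program, data):
            # All computation happens on the PCD, never on the terminal it is plugged into.
            return f"{program} run on {self.owner}'s PCD with {data!r}"


    class DummyTerminal:
        """A display/keyboard/printer shell with an interface port but no 'brains.'"""
        def __init__(self, kind):
            self.kind = kind
            self.docked_pcd = None

        def dock(self, pcd):
            self.docked_pcd = pcd

        def request(self, program, data):
            if self.docked_pcd is None:
                raise RuntimeError(f"{self.kind}: no PCD docked, so nothing can run")
            return self.docked_pcd.run(program, data)


    # The same PCD becomes the heart and brain of whichever terminal it is plugged into.
    pcd = PCD("Jason")
    entertainment_system = DummyTerminal("entertainment system")
    entertainment_system.dock(pcd)
    print(entertainment_system.request("movie player", "downloaded film"))
    ```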

    Computer Terminal and Entertainment Systems with PCD Interfaces (Drawing by Jason Ellis)

    Both the PCD and the dummy terminals form a standardized computing platform.  Consumer demand, market forces, and entrepreneurial insight led to the evolution that culminated with the PCD as the end product.  Consumers were overburdened with desktop computers, laptop computers, and computer labs.  Every computer one might encounter could have a very different operating system or set of software tools.  The data storage on one computer would differ from the next.  A new standard was desired to allow a person to choose their own computing path, one that would be accessible anywhere they might need to use a computer.

    Computer manufacturers saw ever-declining profits as computers became more and more mass-produced.  Additionally, no one company built all of the parts that went into a computer, so profit was lost elsewhere as parts were purchased to build a complete computer for sale.

    New integrated circuit manufacturing techniques allowed for greater densities of transistors and memory storage.  These manufacturing techniques also allowed for lower power consumption and thus reduced heat from operation (which was a long-standing problem with computers).

    Consumer desire for something new and innovative, coupled with a new way of building computer components, led to the founding of a new computer design consortium.  Hardware and software manufacturers came together to design a computing platform that would fulfill the needs of consumers as well as improve failing profits.  The PCD design consortium included computer and software businesses, professional organizations, and consumer/enthusiast groups.

    The PCD almost didn’t see the light of day because of influence from large lobbying groups in Washington.  This involved copyright groups such as the Recording Industry Association of America (RIAA) and the Motion Picture Association of America (MPAA).  These groups decried the potential copyright violations possible with the PCD.  Epithets, curses, and bitching issued from the RIAA and MPAA lobbyists’ mouths.  Consumer outrage over these large business groups attempting to throw their weight around caused a surge of grassroots political involvement that unseated some Congressional members and scared the rest into line.  The public wanted to see what would come out of the PCD Design Consortium before judgment was passed on its useful and legal purposes.

    With the legal hurdles temporarily under control, the PCD was released to the public.  New and inventive uses were immediately made of the PCD.  One of the first innovations involved the Wearable Computer Network.  Wearable computing was a long-researched phenomenon at the Wearable Computing Lab at MIT and Georgia Tech’s Contextual Computing Group.  The two factors holding back wide adoption of wearable computing were the cost of the mobile computing unit and the mobile computing unit’s singular purpose.  These two factors were eliminated by the PCD because it was cheap and it could be removed from the wearable computing network and used in other computing situations (e.g., at a desktop terminal or in an entertainment system).

    Wearable Computing Network with Integrated PCD Interface Pocket (Drawing by Jason Ellis)

    Entertainment systems and desktop terminals became popular receptacles for the PCD.  Music and movies purchased over the Internet could be transferred to the PCD and then watched on a home entertainment system that had a PCD interface port.  Desktop terminals and laptop terminals also began to come with PCD interface ports so that a computer user could use their PCD at home or on the go and still be able to use it in other situations, such as at a work terminal.  Being able to carry a PCD between work and home allowed for easier telecommuting because all of a person’s files were immediately available.  There was no more tracking down which computer had downloaded an email, because a person’s email traveled with that person on his or her PCD.  Easier teleworking helped the environment in metropolitan areas because more people could do their work from home without needing to drive their fossil-fuel-consuming cars down the highway.

    Instant computing access meant that PCD users were able to expand the possibilities of the human-computer dynamic.  There was more Internet use, and that use was more often on the go.  As people began donning wearable computing networks for their PCDs, they would chat with friends while riding a commuter train, or they would spend more time getting informed about what was going on in the world with NPR’s online broadcasts or the BBC News website.  Social networks like Orkut and Friendster received even more users as friends began inviting friends who may have just gotten online (with a mobile setup) with their new PCDs.

    As more computer, clothing, and HDTV terminals began to support the PCD, more jobs were created, more units were sold, more raw materials were consumed, more shipping was taking place, more engineering and design was going on, and new business models were being created.  The web of connections built upon itself so that more connections were made between industries and businesses.  The popularity of the PCD boosted tangential industries involved in building components that went into the PCDs as well as entertainment services.  Aluminum and silicon processing, chip manufacturing, battery production and innovation (for longer battery life), new networking technologies to take advantage of the greater number of computing users who purchase PCDs, and PCD interface devices (such as HDTVs and wearable computing networks) all ramped up production as demand for the PCD rose.  New services popped up such as computer terminal rental and new entertainment services that would allow customers to purchase copy-protected versions of music and movies that could easily be transported for enjoyment wherever the user took his or her PCD.  Some entertainment companies held out too long while others reaped rewards for modifying their business models to take advantage of this new (and popular) technology.

    Choice is the driving factor behind the PCD’s success.  Wrapped in the PCD’s small form is the choice of human-computer interaction, choice of where to use a PCD, and choice of data (visual and auditory) to be accessed with a PCD.  These choices are made available by the choices made by many people such as consumers, industrialists, and entertainment antagonists.  Those who embraced the PCD and found ways of interfacing with it (literally and figuratively) succeeded while those that did not were left by the wayside.

    Works Cited

    Contextual Computing Group at Georgia Tech.  September 29, 2004.  November 14, 2004 <http://www.gvu.gatech.edu/ccg/>.

    Hepburn, Carl.  Britney Spears’ Guide to Semiconductor Physics.  April 7, 2004.  November 14, 2004 <http://britneyspears.ac/lasers.htm>.

    Owad, Tom.  “Apple I History.”  Applefritter.  December 17, 2003.  November 14, 2004 <http://www.applefritter.com/book/view/7>.

    “Single-Chip Architecture.”  NVIDIA.  2004.  November 14, 2004 <http://www.nvidia.com/object/feature_single-chip.html>.

    Wearable Computing at MIT.  October 2003.  November 14, 2004 <http://www.media.mit.edu/wearables/>.

  • Recovered Writing: Undergraduate Technologies of Representation Essay on Present Technology, Airport Express, Oct 28, 2004

    This is the twelfth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is another example of a WOVEN multimodal essay assignment. In it, I used WVE (written, visual, and electronic) modes to discuss a specific technology. These essays (focused on past, present, and future technologies) gave me a chance to use technology to explore the meaning behind and the impact of other technologies. The next essay will focus on a future technology of my own design.

    In this essay assignment, we were tasked with exploring an example of a present technology. I chose to write about Apple’s Airport Express, which my roommate Perry Merier had recently purchased. At the time, the idea of an extremely small computing/routing/audio device was new and innovative. Also, it was incredibly useful.

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    October 28, 2004

    Artifact of the Present – Apple Airport Express

    Apple Airport Express (Image from Apple Computer)

    The Artifact

    The Apple Airport Express is a multifunction wireless Internet router (i.e., base station) that first hit shelves in June 2004.  It can serve as a wireless Internet base station, extend the range of an existing wireless network, receive streaming music and pass it along to a home stereo, and share a USB printer on a wireless network.  It does all of these things, and yet its small rectangular shape would fit within the circumference of an audio CD.

    Description

    The Airport Express is only 3.7 inches tall, 2.95 inches wide, and 1.12 inches deep.  It is about the size of a Powerbook G4’s power brick (AC to DC converter).  If you do not need the included power cord extender, then the Airport Express is completely self-contained.  Unlike most other wireless routers, the Airport Express has its power converter built-in.  The electronics that allow it to juggle all of its functions lie within the glossy white plastic housing.

    On the back edge of the Airport Express there is a fold-out AC power connector.  The power prongs fold back into the unit so that it is easily carried in a bag without snagging on anything.  The bottom edge has three connectors.  The first is the Ethernet RJ-45 connector.  This can be connected to a DSL or cable modem so that the Airport Express can wirelessly transmit Internet access to computers with wireless capabilities that are within range.  Next is the USB connector.  This can be hooked to a USB printer so that the printer can be shared with anyone on the wireless network.  The last connector is an audio mini-jack that supports both analog and optical digital audio output.  This can be connected to a home stereo so that music can be streamed from a computer running iTunes to the Airport Express.  In the event of a lockup, there is a small reset button on the bottom of the device.  The front edge of the device has an LED that lights up amber or green.  The color of the LED and its state (i.e., on, off, blinking) indicate the status of the Airport Express.

    Airport Express Connectors (left) and Airport Express Plugged-In (right) (Images from Apple Computer)

    The components inside the Airport Express are tightly packed.  A good deal of engineering had to go into making function follow form in this artifact.  Home wireless routers are usually two or three times the size of the Airport Express and they have an external power brick (that may be the same size as the Airport Express).  This device has to contain a power converter, wireless networking components, wired networking components, network routing components, USB printing components, and audio components.  Some of these parts are combined on a single piece of silicon to save space on the circuit board.

    Airport Express split in half. Note the circuit boards on the left and power converter on the right. (Image from ipodding.com)

    Social Significance

    Apple Computer introduced its Airport technology in July 1999.  The choice to use the name “Airport” was a deliberate one.  It is easy to remember and it evokes certain images of what the technology is able to do.  The bits of data seem to fly through the air on invisible radio waves.  Airport technology is the place where these bits take off and land–from the base station to the computer and vice versa.  Speed, travel, and mobility are some of the images that Apple intended the Airport name to conjure for potential buyers.

    The Airport Express uses the two most widely adopted wireless networking standards:  802.11b and 802.11g.  A working group within the Institute of Electrical and Electronics Engineers (IEEE) established those standards.  The IEEE 802 standards committee develops the standards for local area networks as well as for metropolitan area networks, and Working Group 11 focuses on wireless networking standards.  Publicly available standards such as these are part of the success of computer and networking hardware because they allow components manufactured by different companies to interoperate.  Because the Airport Express uses several open standards, it will work alongside other wireless hardware, and it will work with Macs as well as PCs.
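    For concreteness, the headline differences between these two standards can be summarized in a few lines of code.  This is only an illustrative sketch using the commonly cited figures for IEEE 802.11b and 802.11g; the dictionary and function below are my own, not anything defined by the IEEE or cited in this essay:

```python
# Commonly cited headline figures for the two Wi-Fi standards the Airport Express supports.
WIFI_STANDARDS = {
    "802.11b": {"band_ghz": 2.4, "max_rate_mbps": 11},
    "802.11g": {"band_ghz": 2.4, "max_rate_mbps": 54},
}

def compare(older: str, newer: str) -> str:
    """Return a one-line comparison of two standards from the table above."""
    r_old = WIFI_STANDARDS[older]["max_rate_mbps"]
    r_new = WIFI_STANDARDS[newer]["max_rate_mbps"]
    return f"{newer} is nominally {r_new / r_old:.0f}x faster than {older} ({r_new} vs. {r_old} Mbps)."

print(compare("802.11b", "802.11g"))
# 802.11g is nominally 5x faster than 802.11b (54 vs. 11 Mbps).
```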

    The Federal Communications Commission (FCC) and the National Telecommunications and Information Administration (NTIA) regulate the radio frequency spectrum.  The NTIA is part of the Executive Branch of the US Government that “manages the Federal government’s use of the spectrum” while the FCC is an “independent agency” that “regulates the private use of the spectrum” (NTIA).  The 802.11b and 802.11g wireless networking standards are approved by the FCC to use the 2.4GHz radio band for transmitting and receiving bits of data carried on radio waves.
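    To make the 2.4GHz band a little more concrete: 802.11b and 802.11g divide that band into numbered channels whose center frequencies sit 5 MHz apart, with channels 1 through 11 permitted in the United States.  A minimal sketch of that channel plan follows; the function is my own illustration, but the arithmetic matches the standard channel numbering:

```python
def channel_center_mhz(channel: int) -> int:
    """Center frequency in MHz for a 2.4 GHz 802.11b/g channel.

    Channels 1-13 are spaced 5 MHz apart starting at 2412 MHz;
    channel 14 sits apart at 2484 MHz and is not used in the US.
    """
    if channel == 14:
        return 2484
    if 1 <= channel <= 13:
        return 2407 + 5 * channel
    raise ValueError("2.4 GHz channels run from 1 to 14")

# The three non-overlapping channels commonly used in the US:
for ch in (1, 6, 11):
    print(ch, channel_center_mhz(ch), "MHz")   # 2412, 2437, and 2462 MHz
```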

    The US Radio Spectrum Frequency Allocations. The red ellipse approximately marks where in the spectrum 802.11b and 802.11g operate. (Image from NTIA)

    Each person with a wireless-capable computer, a copy of iTunes, a stereo, and an Airport Express is in effect a one-person radio station.  Music can be streamed from the computer to the Airport Express, which passes it along to the home stereo via an audio cable.  Digital music is thus freed from the computer and transferred back to the home stereo.  This capability also points to one of the Airport Express’ weaknesses:  music streaming from a computer can only be played on one Airport Express at a time.  There is no technological barrier keeping more than one Airport Express from receiving the streaming music, so there must be some other reason that Apple restricted this capability, especially since enabling it would encourage customers to buy more than one Airport Express in order to stream music to multiple rooms.

    The music travels wirelessly to the Airport Express and then to the stereo via wires. (Image from Apple Computer)

    The Airport Express’ limitations might be due to pressure from the music industry.  Apple gives its music-playing software, iTunes, away for free.  iTunes can play CDs and MP3s, and it can access Apple’s online music store.  The software can copy (i.e., rip) CDs that may or may not be owned by the iTunes user.  Additionally, iTunes will play legitimate MP3s as well as those obtained in violation of current copyright law.  The Recording Industry Association of America (RIAA) and some recording artists find this unacceptable.  Apple has tried to work on the side of the consumer, but it has to appease the music industry as well.  To do this, Apple has integrated special encryption into music downloaded from its online music store so that only the authorized buyer can play those files.  Additionally, iTunes establishes a secure connection to the Airport Express by encrypting the music stream with the Advanced Encryption Standard (AES); the AES key is in turn protected by RSA encryption.  This prevents others from recording an iTunes music stream.
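    Apple’s actual AirTunes implementation is proprietary, but the general pattern described above (encrypt the stream with AES, protect the AES key with RSA) is standard hybrid encryption.  Here is a minimal sketch of that pattern using the third-party Python cryptography package; the AES mode, key sizes, and variable names are my own choices for illustration, not Apple’s:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# The receiver (base station) holds an RSA key pair; the sender knows the public key.
receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = receiver_key.public_key()

# Sender: pick a random AES session key, encrypt an audio packet with it,
# then wrap the session key with the receiver's RSA public key.
session_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
audio_chunk = b"pretend this is a packet of audio samples"
ciphertext = AESGCM(session_key).encrypt(nonce, audio_chunk, None)
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver: unwrap the session key with the RSA private key, then decrypt the stream.
unwrapped_key = receiver_key.decrypt(wrapped_key, oaep)
assert AESGCM(unwrapped_key).decrypt(nonce, ciphertext, None) == audio_chunk
```

    Because only the holder of the RSA private key can unwrap the session key, an eavesdropper who captures the stream cannot decrypt or record it, which is the property described above.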

    Encryption is also employed to protect the wireless users on the Airport Express’ network.  Part of this protection comes from encrypting the wireless network traffic, and the other part comes from the built-in firewall.  The older encryption scheme is called Wired Equivalent Privacy (WEP), and the newer one is called Wi-Fi Protected Access (WPA), which was built to supersede WEP.  The built-in firewall uses network address translation (NAT) to create a network that uses private IP addresses instead of public (and thus directly reachable from the Internet) IP addresses.  NAT exchanges data between the public world and the private network.  Generally, computers on the private network can initiate connections to the outside world through the NAT device, but outside computers cannot directly connect to machines on the private network.
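    The NAT idea in particular can be sketched compactly: the router rewrites the private source address and port of each outgoing packet to its own public address and a fresh port, and it remembers that mapping so replies can be routed back while unsolicited traffic is dropped.  The toy Python model below illustrates only the translation table; the class, addresses, and port numbers are invented for illustration:

```python
import itertools

class ToyNAT:
    """A toy model of a NAT translation table; not a real network stack."""

    def __init__(self, public_ip: str):
        self.public_ip = public_ip
        self._ports = itertools.count(40000)   # next public port to hand out
        self.table = {}                        # (private_ip, private_port) -> public_port
        self.reverse = {}                      # public_port -> (private_ip, private_port)

    def outbound(self, private_ip: str, private_port: int):
        """Rewrite an outgoing packet's source address and remember the mapping."""
        key = (private_ip, private_port)
        if key not in self.table:
            public_port = next(self._ports)
            self.table[key] = public_port
            self.reverse[public_port] = key
        return (self.public_ip, self.table[key])

    def inbound(self, public_port: int):
        """Route a reply back to the private host, or return None to drop it."""
        return self.reverse.get(public_port)

nat = ToyNAT("203.0.113.7")                    # a documentation-range public address
print(nat.outbound("10.0.1.2", 51515))         # ('203.0.113.7', 40000)
print(nat.inbound(40000))                      # ('10.0.1.2', 51515)
print(nat.inbound(40001))                      # None: unsolicited traffic is dropped
```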

    Security and privacy are growing concerns for people in an increasingly wired world.  Identity theft is becoming a boon for some (e.g., the thieves, private investigators, lawyers, and politicians) and a bust for others (i.e., the person whose identity is stolen).  One way that a person’s private identifying information is stolen is by an individual “sniffing” a wireless network’s data traffic for that precious information.  New industries and groups have grown out of the problem of identity theft.  Wireless devices like the Airport Express need to have protections built in so that a user’s private information is better protected.

    The physical construction of the Airport Express involves electrical engineering, computer engineering, and industrial design.  Electrical engineering and computer engineering overlap in a project such as this.  Custom chips have to be designed and built to handle data traffic, digital-to-analog conversion of sound, configuration software, control of a radio transmitter/receiver, and print control software.  Simplicity and elegance of design are demanded in order to fit such a feature-rich artifact into a very small package.  Apple has a history of taking an artifact that is assumed to look or work in a particular way and transforming its appearance into something new and fresh (e.g., the original Macintosh, iMac, and iPod).  The Airport Express works similarly to any other wireless router, but it pushes the elements of design (both as a physical artifact and in its internal circuits and chips) so that the user identifies it as something more than its function.

    Sleek, new shapes also reinforce the perception of speed.  Airplanes are fast, and this artifact is the Airport (sending and receiving these fast airplanes of data) Express (quick, fast, simple).  Computer technology has been a long progression of speed.  How fast does this computer perform the tasks that I will be using it for?  Can it play Doom 3?  The same is true for networking technologies.  Wired networking is hands down the fastest networking technology, so wireless has to compete with wires on speed, but it can distinguish itself by its convenience.

    (Photo by John M. Dibbs.)

    These new designs effect a change in the way people think about their computer technology.  Soft colors, translucent plastics, curves, and gentle transitions give technology a friendlier “face.”  It isn’t imposing, and the technology can now fit into a color scheme in your home.  Computer technology shifts from utility to lifestyle.  Apple brings together these networks of technology, government oversight, music industry muscle, and industrial design principles to provide customers with the technology they desire, but in a package that makes it feel less technical and more like a streamlined appliance.

    Works Cited

    “Airport Express Gallery.”  Ipodding.com.  2004.  October 26, 2004 <http://ipodding.com/modules.php?set_albumName=album10&op=modload&name=gallery&file=index&include=view_album.php>.

    “Apple – Airport Express.”  Apple Computer, Inc.  2004.  October 26, 2004 <http://www.apple.com/airportexpress/>.

    “Apple – Support – Airport Express.”  Apple Computer, Inc.  2004.  October 26, 2004 <http://www.apple.com/support/airportexpress/>.

    Dibbs, John M.  “Concorde Takeoff.”  Planepix.com.  October 26, 2004 <http://www.planepix.com/pp/servlet/template/Detail.vm/id/2940>.

    “Myths vs. Reality.”  National Telecommunications and Information Administration.  October 14, 2004.  October 26, 2004 <http://www.ntia.doc.gov/ntiahome/myths.html>.