
Recovered Writing: Undergraduate Technology & American Society Paper on Handheld Calculators, Nov 26, 2003

This is the fifteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

This essay was my term paper in Professor Steven W. Usselman’s HTS 3083, Technology and American Society course at Georgia Tech in Fall 2003. I wrote this essay in the second class that I took from Professor Usselman (with the first being HTS 2082, Science and Technology of the Industrial Age). Professor Usselman gave his lectures as engaging stories full of detail and context. As a lecturer, he knows how to guide and support his students on their way to understanding. It is a credit to Professor Usselman that I remember enjoying his lectures, but I do not remember writing my essay below (which alarmingly is true for much of my early writing). However, I thought that this essay would share some correspondence with the object-oriented essays in my previously posted essays from Professor Kenneth J. Knoespel’s Technologies of Representation class. These kinds of interdisciplinary and cross-disciplinary connections are what excited me the most about my Georgia Tech undergraduate education.

Jason W. Ellis

Professor Steven W. Usselman

HTS3083

November 26, 2003

Introduction of Electronic Handheld Calculators

The story of the electronic handheld calculator is about making one product to sell to consumers while proving a piece of that product to industry.  The electronic handheld calculator probably would have come along eventually, but Texas Instruments did not introduce it in America to fill a void or need in the marketplace for electronic handheld calculators.  It was introduced to push the idea of the “heart” of the calculator–the integrated circuit.  The story of the calculator is tightly woven with that of the integrated circuit, or microchip.

Before the handheld calculator debuted there was the desktop electronic calculator, which “had to be plugged in (120 v), were the size of typewriters, and cost as much as an automobile” (Hamrick 633).  After WWII, scientists, engineers, bankers, actuaries, and others found greater need of computational power.  With the advent of transistors to replace the much larger vacuum tube, electronic computation machines could be reduced in size.  The story of the integrated circuit and the transistor is almost a case of history repeating itself.  In 1954, Texas Instruments was one of the world leaders in mass-producing transistors.  The public and industry, however, were not yet ready to jump on the transistor bandwagon.  Pat Haggerty, VP of Texas Instruments, had his engineers develop a pocket-sized radio using transistors.  TI had limited experience with consumer products, so it teamed up with the Regency Company of Indiana to market the pocket radio.  The radio was introduced just before Christmas of 1954, and over 100,000 radios were sold in the first year.  The salability of the transistor pocket radio impressed companies like IBM, which began to buy transistors from TI.

TI had trouble selling the integrated circuit to big companies for introduction into their products.  The nature of the integrated circuit as it stood when first developed also made for a poor business model.  It was difficult to build a good integrated circuit, but once a good one was built, it rarely went bad.  Because integrated circuits did not need regular replacement the way vacuum tubes did, TI wanted to find new applications for them so that they could be sold for use in many products not then using electronics such as transistors or tubes.

Haggerty thought that this “invention technique” would work for introducing the world to the integrated circuit (Hamrick 634).  Haggerty ran the idea by the inventor of the integrated circuit, Jack Kilby, while on a flight back to Dallas.  What was to be invented was up in the air at this point.  Haggerty suggested to Kilby, “invent a calculator that would fit in a shirt pocket like the radio, or invent a lipstick-size dictaphone machine, or invent something else that used the microchip” (Hamrick 634).  Kilby liked the idea of inventing a calculator, so that is what he went with.  Kilby was allowed to choose his own team back at TI’s headquarters in Dallas.  He chose Jerry Merryman [1] and James Van Tassel.  Kilby made his pitch to his assembled team, describing the device as “our own personal computer of sorts which would be portable, and would replace the slide rule” (Hamrick 634).  At this time the invention was not yet called a “calculator,” but a “slide rule computer” (Hamrick 634).  It was code-named CAL-TECH.  Tasks were divided among the team members:  Kilby worked on the power supply, Van Tassel worked mostly on the keyboard, and Merryman worked on the logic and the output.

The CAL-TECH prototype was completed in November 1966, almost one year after it was first discussed by Haggerty and Kilby.  This first handheld electronic calculator measured about 4” by 6” by 1.5” and weighed a hefty 45 oz. because it was constructed from a block of aluminum.  What is interesting about the display of the CAL-TECH is that it does not have one.  Its output is handled by a newly designed “integrated heater element array and drive matrix,” invented by Merryman for this project, which burned the output onto a paper roll and was designed to use little power.  The CAL-TECH had 18 keys:  0, 1, 2, 3, 4, 5, 6, 7, 8, 9, ., X, +, -, ÷, C, E, and P (Hamrick 635).  This early calculator could actually only add and subtract.  For multiplication it would add repeatedly, and for division it would subtract repeatedly.  The patent for the CAL-TECH was first filed on September 29, 1967 [2].
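
The CAL-TECH’s reduction of multiplication and division to repeated addition and subtraction can be sketched in a few lines of Python.  This is a modern illustration of the idea only, not TI’s actual hardware logic, and it assumes nonnegative integer operands for simplicity:

```python
def multiply(a, b):
    """Multiplication as repeated addition, as the CAL-TECH did it."""
    total = 0
    for _ in range(b):  # add a to itself b times
        total += a
    return total

def divide(a, b):
    """Division as repeated subtraction; returns (quotient, remainder)."""
    if b == 0:
        raise ZeroDivisionError("division by zero")
    quotient = 0
    while a >= b:       # subtract b until less than b remains
        a -= b
        quotient += 1
    return quotient, a
```

For example, `multiply(6, 7)` adds 6 to a running total seven times to reach 42, and `divide(17, 5)` subtracts 5 three times, leaving a remainder of 2.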

As with the transistor radio, TI did not want to make the first handheld electronic calculators itself.  TI partnered with Canon of Japan to market the consumer version of the CAL-TECH, the Pocketronic.  The Pocketronic was first offered to the market on April 14, 1970, the day before income tax returns were due (Hamrick 636).  The Pocketronic was lauded in Business Week magazine as “the portable, pocketable, all electronic consumer calculator that the electronics industry had long dreamed about” (Hamrick 636).  It was small and weighed only 1.8 pounds.  Initially it cost $400 ($1500 in 1995 dollars), compared with the bulky desktop calculators, which cost more than $2000 (over $7,500 in 1995 dollars) (Hamrick 636).  Falling parts costs made electronic handheld calculators cheaper to produce than the electronic desktop calculators of the day.  For example, “the 1966 business calculator version retailing for $2000 contained over a thousand discrete semiconductors such as transistors and resistors with a cost of $170” (Ament).  Ament goes on to show that “in 1968, integrated circuits (ICs) began finding their niche in business calculators with a typical selling price of $1000…[which] had 90 ICs at a cost of $125.”  The Pocketronic used a MOS/LSI [3] IC, which put all the functions of the calculator on one chip.  This further reduced the cost of parts and the number of parts involved in production.  This better economy of production helped fuel the boom in electronic handheld calculators that took place in the early 1970s.

Compared to today’s calculators, the Pocketronic was outrageously expensive and it could only do basic arithmetic.  At that time, however, it was doing something that only specialized and much more expensive machines could do.  It was the first step in democratizing computational machines.  It would start the move of computation from academia and big business to K-12 schools and the home.

The instruction manual for the Pocketronic features a picture of a man dressed in a suit holding the Pocketronic and performing a calculation while a woman wearing a coat and fashionable hat watches from the open door of a car (Canon).  She is probably looking at the car at a dealership, and the man is a car salesman.  Initially this higher-cost item was probably marketed to professionals who could bear the cost of the new technology.  As with much technology, it was presented primarily as a man’s tool.  Hamrick excerpts early articles and advertisements for calculators in the 1970s.  Here are a few examples:

1.  “Calculators are being sold to engineers, college students, and women to use for shopping.”

2.  “Every housewife will have one (calculator) when she goes shopping.”

3.  “Salesmen use them to compute estimates and prices for carpeting and fences.  A professional pilot carries one for navigational calculations.  A housewife with skeet-shooting sons checks shooting record cards.”

4.  “At the supermarket, the new calculator will help your wife find the best unit price bargains.  At the lumberyard, they’ll help you decide which combination of plywood, lumber and hardboard would be least expensive for your project” (Hamrick 639).

These excerpts reveal sexist assumptions about how calculators would be used by men and by women.  Men are shown using the calculator in the professional sphere; it is a tool that helps a man in his daily work.  Women are shown using the calculator in the home sphere; it is a tool for performing household duties, much as a woman would use a sewing machine or some other appliance.  The calculator was marketed to both men and women, but the attitudes in the advertising show a sexist bent regarding how the two sexes would use their respective calculators.

Demand was great enough, however, that other manufacturers quickly began making their own electronic handheld calculators.  By “October of 1974, the JS&A Company, which sold calculators through mail and magazine advertisement, offered the Texas Instrument TI-2550 for an incredible $9.95.  For this period, a calculator under $10 was incredible cheap!” (King).  To justify such a ramp-up in production, there must have been many people wanting to buy these electronic handheld calculators.  Robert King writes that there were “seven such ‘milestones’ leading to today’s commonly-used calculator” (King).  He lists them as portability, small size, replaceable batteries, increased functions, liquid crystal display, solar power, and cheapness (King).  Each of these stages of calculator evolution was mastered and integrated into products, increasing market demand for the calculator while decreasing its cost.

Slide rule manufacturers began to fall by the wayside as demand shifted from slide rules to calculators.  For instance, “Keuffel & Esser, the oldest slide rule manufacturer…made its last slide rule in 1975,” only five years after the introduction of the Pocketronic (Hamrick 638).  Slide rules had been the primary portable computation device used by students, scientists, and engineers before the calculator came along.  Electronic desktop calculators also began to be phased out as more advanced and powerful handheld calculators appeared, such as Hewlett-Packard’s HP-35 in 1972 [4].  HP’s website describes the HP-35 as “the world’s first scientific handheld calculator. Small enough to fit into a shirt pocket, the powerful HP-35 makes the engineer’s slide rule obsolete. In 2000, Forbes ASAP names it one of 20 ‘all time products’ that have changed the world” (HP).  The first handheld calculator made inroads into markets where people needed to make basic arithmetic computations.  These newer, more advanced calculators moved into markets where specialized desktop calculators and early computer systems had been the mainstay.  The handheld calculator muscled in quietly and quickly, usurping the dominant calculation technologies in many different arenas.

In the home and business market, the calculator was swiftly adopted as a standard tool.  Its introduction into schools, however, was a source of some controversy.  There was not a loud outcry about students using calculators in college-level classes.  In one example, Ohio State University redesigned its remedial college math class so that calculators were required for the curriculum.  Leitzel and Waits describe the situation at Ohio State in the autumn of 1974:  “we faced approximately 4500 students who were not prepared to begin our precalculus courses” (731).  The authors note that “the enrollment in our remedial course includes typically a large number of students from diverse backgrounds, with equally diverse abilities, with poor attitudes toward the study of mathematics, with poor study habits and, to a large extent, poor academic motivation” (Leitzel and Waits, 731).  Only a few years after the introduction of the handheld calculator, these professors were designing a new approach to an old mathematics course to capture the attention of students with such poor study habits.  The calculator would be instructive, and it would be a hook to get the students interested in the material.  They noted that “in using calculators students raised questions about arithmetic properties of numbers that would have been of little interest to them otherwise” and “the desire to use the calculator seemed often to motivate this understanding” (Leitzel and Waits, 732).  The calculator let students spend more time doing more problems in a sort of trial-and-error scenario.  Some calculations took a long time with a slide rule or by hand; a calculator allowed easy and quick computation involving larger numbers or large sets of numbers.

Leitzel and Waits proposed that letting students explore mathematics with the calculator as a facilitating tool allowed them to accomplish what they had not been motivated to do before.  They add, however, “the question of whether a person who uses a hand-held calculator to do computations is somehow less educated than a person who does computations mentally we will leave for others to decide” (Leitzel and Waits, 732).  This was the big question regarding the calculator for those in education:  was the calculator something that built upon the learning process, or was it something that detracted from one’s development of arithmetic ability?  This question weighed much more heavily on those in K-12 education than in colleges.  Calculators were not rushed into kindergartens or the early grades.  I remember using calculators and adding machines at home and at my parents’ business when I was young.  The school curriculum in the schools I attended in southeast Georgia did not allow the use of a calculator until the sixth grade.  That was in 1988-1989.

This debate continues even in the higher levels of grade school.  One of the loudest arguments involves high school geometry and the development of proofs.  Proofs allow the student to see that there is a rational basis for particular mathematical rules and operations that might not appear intuitive at first glance.  James Stein Jr. writes, “I am extremely concerned by the current emphasis on calculators in the elementary and secondary mathematics curriculum.  The vast majority of my students, to borrow Hofstadter’s phrase, are woefully innumerate, a condition I believe has been exacerbated by the reliance on calculators” (447).  Stein [5] reveals that by this time, about 17 years after the introduction of the Canon Pocketronic, calculators were used in elementary and secondary schools.  Neil Rickert [6] writes regarding this issue, “although the curriculum a generation ago was far from ideal, at least the students learned that mathematics provided a powerful tool for solving interesting and difficult problems.  Today mathematically strong students are leaving high school convinced that mathematics is a boring and sterile subject, overloaded with pedantry” (447).  He feels that by spoon-feeding students axioms instead of having them discover the proofs behind those axioms and principles, schools turn students away from mathematics.  The dynamo of change from proofs to the more problem-solving ideology is the calculator.  With the calculator, students are better equipped to perform complex operations and solve difficult problems, whereas before there was a limit to the number or complexity of problems a student could tackle with only pencil, paper, and a slide rule.  In response to Stein and Rickert, Lynn Arthur Steen [7] writes, “the calculator makes possible precisely the exploration of arithmetic patterns that Stein seeks.  To translate this possibility into reality will require greater emphasis on quality teaching so that calculators can be used effectively” (447).  Steen looks for a solution involving both teaching and the use of calculators; he does not place all the blame on the calculators.  He goes on to say, “the need to move students from lower, rote skills to complex problem-solving has been recognized in virtually every report on education during the last decade.  It is calculation rather than deduction (as Rickert states) that improperly dominates today’s school curriculum” (448).  This shows that he also thinks calculators occupy too much school space, in that students are encouraged and taught to use them in elementary and secondary schools.  He feels that there are greater skills that must be taught alongside the use of calculators.  Steen suggests that better problem-solving skills coupled with the calculator should be the new order for elementary and secondary school math education.  After the initial boom and integration of the calculator into educational, everyday, and professional life, there was a backlash against its adoption in educational life.  There must be mediation between traditional rote-skill learning and the use of the calculator.  There must also be a revision in the way problem-solving skills are taught and approached, so that the calculator is used as a tool and not as a crutch.  The debate regarding calculators in the classroom continues to this day, though it now often concerns more advanced calculators capable of symbolic manipulation [8] and graphing complex equations.

The electronic handheld calculator was initially embraced by many different people in different spheres of life, such as the home, business, and school.  People needed to calculate percentages, balance checkbooks, solve math problems more easily, calculate interest, and do many other things.  The calculator moved into these different facets of society, and debate or dissent did not arise until its growing use in the school environment.  College mathematics departments tried to use calculators to help some remedial students get up to speed, while other math professionals decried the use of calculators in elementary and secondary schools.  In the professional and home arenas, the calculator has been accepted as a useful tool for solving many problems that were once tedious or nearly impossible without the aid of some mechanical or electrical computation technology.  The introduction of the electronic handheld calculator was a quiet revolution that brought a democratization of calculation to nearly everyone in America.

Works Cited

Ament, Phil.  “Hand-Held Calculator.”  The Great Idea Finder.  Oct. 22, 2002.  Nov. 23, 2003 <http://www.ideafinder.com/history/inventions/handcalculator.htm>.

Canon Incorporated.  Canon Pocketronic Instructions.  Japan:  Canon.  1970.

Hamrick, Kathy.  “The History of the Hand-Held Electronic Calculator.”  The American Mathematical Monthly, Vol. 103, No. 8 (Oct., 1996), 633-639.

Hewlett-Packard Company.  “HP timeline – 1970s.”  2003.  Nov. 23, 2003 <http://www.hp.com/hpinfo/abouthp/histnfacts/timeline/hist_70s.html>.

King, Robert.  “The Evolution of Today’s Calculator.”  The International Calculator Collector, Spring 1997.  Nov. 23, 2003 <http://www.vintagecalculators.com/html/evolution_of_today_s_calculato.html>.

Leitzel, Joan and Bert Waits.  “Hand-Held Calculators in the Freshman Mathematics Classroom.”  The American Mathematical Monthly, Vol. 83, No. 9 (Nov., 1976), 731-733.

Rickert, Neil W.  “Mathematics Education.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447.

Steen, Lynn Arthur.  “Mathematics Education:  Response.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447-448.

Stein Jr., James D.  “Mathematics Education.”  Science, New Series, Vol. 238, No. 4826 (Oct. 23, 1987), 447.

Notes

1. Jerry Merryman is described as a “self-taught engineer” who attended Texas A&M, but never graduated.  He was considered “one of the brightest young engineers at TI” (Hamrick 634).

2. This first patent filing was followed by a refiling on May 13, 1971 and again on December 21, 1972.  The CAL-TECH is covered by patent number 3,819,921 (Hamrick 635).

3. MOS/LSI stands for metal-oxide-semiconductor/large-scale integration.

4. “The HP-35 was introduced in January, 1972 and was recalled in December, 1972.  The owners were sent a letter pointing out idiosyncrasies in programming caused by a defect in one logic algorithm.  HP offered to replace the calculator.  This was probably the world’s first instant recall.  The defect caused a few 10 digit numbers, when used in an exponential function, to give an answer that was wrong by 1%” (Hamrick, 638).

5. James D. Stein Jr. is in the Department of Mathematics at both California State University, Long Beach, CA and the University of California, Los Angeles, CA.

6. Neil W. Rickert is from the Department of Computer Science, Northern Illinois University, DeKalb, IL.

7. Lynn Arthur Steen is from the Department of Mathematics at St. Olaf College, Northfield, MN.

8. The TI-92 is able to solve equations for a numerical answer, and it can perform many calculus operations such as derivatives, integrals, etc.

Recovered Writing: Undergraduate Technologies of Representation Final Essay Response on Communication Tech and World of Warcraft, Dec 8, 2004


This is my final post of material from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

This is my final paper assignment (I think given in lieu of a final exam) in LCC3314. The more exciting portion is question 2, which concerns Blizzard’s World of Warcraft. I break down how you navigate its space and I describe elements of its operation. It bears noting that at the time that I wrote this, WoW had been out for less than a month. I was rabidly playing it on my PowerMac G5 at 2560×1600 resolution on a 30″ Apple Cinema Display. While it might not have been the best essay, it certainly was one that I enjoyed writing to no end! I wish that I had found a way to make time for WoW since my days in Liverpool. I have played WoW on only rare occasions since returning to the States, but I continue to write about it from my memory of Azeroth.

Also included below is my response to question 1, which seems to be focused on the telegraph, telephone, and cellular phone. In this question, I explore the material experience of using these different communication media and technological devices. I suppose WoW is another kind of communication technology wrapped up in a highly interactive gaming environment (cf. Hack/Slash).

Jason W. Ellis

Professor Kenneth J. Knoespel

LCC3314 – Technologies of Representation

December 8, 2004

Final Paper Assignment

1. On the telegraph, telephone, and cellular phone

The telegraph, telephone, and cell phone each have a particular interface that works with different human senses and thus provide different experiences for the body.  The differences between these communication technologies lie in the physicality of the artifact as well as the technology underlying the technology for encoding and decoding communication.

The telegraph is a wired, point-to-point textual communication technology.  Telegraph operation involves trained operators who can encode and decode the Morse code messages transmitted over wires with telegraph machines.  Sending a telegram involves finding a business that offers telegraph service, going there in person, and telling the telegraph operator the message to send; the operator encodes the message with the telegraph machine, the destination operator receives and decodes it, and a delivery person is dispatched to hand the message to the recipient.  The experience of the telegram sender is standing at a counter and speaking with an operator.  The receiver interfaces with a delivery person who hands them a piece of paper containing the message.  The technology that makes sending and receiving messages over great distances possible is removed from the experience of the sender and receiver.  The sender and receiver also have to rely on a network of operators and delivery persons.  These people are in a unique position to view the correspondence between the sender and receiver, a fact of which senders of telegrams were probably well aware.
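
The operator’s encoding step can be illustrated with a short Python sketch.  The table below is only a small subset of International Morse Code chosen for illustration; a real operator’s repertoire covered the full alphabet, digits, and procedural signals:

```python
# A small subset of International Morse Code (letters only, for illustration).
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "H": "....", "L": ".-..", "O": "---", "S": "...", "T": "-",
}

def encode(message):
    """Encode a message letter by letter; ' / ' separates words."""
    words = message.upper().split()
    return " / ".join(
        " ".join(MORSE[ch] for ch in word if ch in MORSE)
        for word in words
    )

print(encode("hello"))  # .... . .-.. .-.. ---
```

Decoding is the same table read in reverse, which is exactly the skilled, symmetrical labor the receiving operator performed by ear.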

The telephone is a wired, point-to-point oral communication technology.  Telephones encode auditory information into electrical signals, which travel over copper wires in a phone network to the receiving telephone, which decodes the electrical signals back into auditory information (the spoken voice).  Telephones allow users to hear the voice of the person they are speaking with.  One problem with telephones is that the technology uses a narrow band of audible sound, which can cause “m” to sound like “n” or “b” to sound like “d.”  Initially, telephones were prohibitively expensive and were direct-wired from location to location.  After telephone networks were made possible with human operator switching, voice calls could be routed from the call initiator to the call receiver.  Over time, the phone network’s mediation shifted from human operators to electrical switching technology.  Where a caller once spoke to an operator first and then to the person being called, one can now dial a number and the network’s automatic switching connects the caller with the receiver.  Someone who makes a phone call assumes privacy when the call is made from home or within an enclosed space such as a phone booth.  The physical interaction between the user and the telephone is that a handset is lifted off the base and held to the ear and mouth, and the user taps out a phone number on the base or dials it on a rotary dial.  The telephone user experiences an interaction with a disembodied voice.

The cell phone is an unwired, point-to-point oral and textual communication technology.  Modern cell phones are a synthesis of the telegraph, telephone, digital photography, video technology, and radio technology.  Cell phones facilitate voice conversations from cell phone to cell phone or from cell phone to wired telephone.  They also allow for text messaging, audio messaging, picture messaging, and video messaging.  Widespread cell phone use is shifting voice phone conversation into a more commonplace activity.  Additionally, the private sphere of telephone conversation is shifting to the public sphere of wherever the cell phone user answers or makes a phone call.  Cell phones also connect to the Internet and Internet-based text messaging networks such as AOL Instant Messenger.  The cell phone has become a place of contact for the individual in more ways than merely talking on the phone.  It builds connections between the individual and others as well as between the individual and information (e.g., online weather information, movie listings, online news websites, etc.).  With ear bud speaker/microphones that plug into cell phones, or with wireless Bluetooth headsets, one can interface with the auditory communication features of a cell phone without holding it up to the ear and mouth as one would with a traditional telephone.  The cell phone user also interfaces with a disembodied voice, but the cell phone has other means of interaction with people as well as with information.

The telegraph is not an interactive means of communication in the way that the telephone and the cell phone are.  With the telephone or the cell phone, one can have a real-time conversation with someone else, whereas with the telegraph there is a delay between sending a message, its delivery, and, if need be, a return message.  The amount of information that can be conveyed in a transmission has increased over time.  The telegraph conveyed a finite amount of information because of the time and cost of sending messages in Morse code.  The telephone increased the amount of conveyed information because the disembodied voice could carry nuances of speech and emotive information (e.g., happiness, sadness, anger, etc.).  The cell phone has brought these communication systems full circle by creating a synthesis of voice and text.  Along with oral communications, a great deal of textual and graphic information can be conveyed through a cell phone.  Barbara Stafford writes, “we have been moving, from the Enlightenment forward, towards a visual and, now, an electronically generated, culture” (“Presuming images and consuming words” 472).  The cell phone represents the bringing together of communication, both between people and between people and sources of information.  Walter J. Ong writes in Orality and Literacy, “By contrast with vision, the dissecting sense, sound is thus a unifying sense.  A typical visual ideal is clarity and distinctness, a taking apart…The auditory ideal, by contrast, is harmony, a putting together” (71).  The modern cell phone brings together the visual and the oral in a way that previous communication technologies had not.  This unification of two powerful human senses (sight and sound) distinguishes the cell phone from the telegraph and telephone.

An interesting development surrounding these technologies is the perception that better communication technologies lead to better communication between individuals (i.e., a bringing together of individuals).  George Myerson writes in Heidegger, Habermas, and the Mobile Phone, “There’s no real gathering at all.  Instead, there are only isolated individuals, each locked in his or her own world, making contact sporadically and for purely functional purposes” (38).  Thus, the cell phone has disconnected the individual from the wall phone where one might be “waiting on an important call.”  Casualness and importance are intertwined in the use of the cell phone.

I used Paul Carmen’s paper on the telegraph, Amanda Richard’s paper on the telephone, and Kevin Oberther’s paper on the cell phone as starting points for this essay.

2. On World of Warcraft

Blizzard Entertainment’s World of Warcraft video game was released on November 23, 2004 for both Windows and Mac OS X.  It is a massively multiplayer online role playing game (MMORPG) that immerses the player in a 3D fantasy world where the player is able to create a character based on several layers of identity (e.g., allegiance: alliance or horde; race: human, dwarf, night elf, gnome, orc, tauren, troll, or undead; and class: warrior, mage, druid, hunter, rogue, etc.).  After building a character (including designing a unique appearance), the player chooses a realm in which to play.  These realms correspond to computer servers that are in a particular time zone.  Other players around the world pick one of these realms to play in that best corresponds to when they will be playing, or when their friends will be playing.  The player is able to meet up with friends within a realm to go on adventures together, and if the player doesn’t know anyone, he or she can communicate with other players to form groups (large and small) for adventuring.  The objective of the game is to gain levels, complete quests, and battle the forces opposing one’s allegiance.  Working with others is the key to success in World of Warcraft.

When the player first enters the game, a movie clip is played that gives some introductory backstory so that the player has a general idea about what is going on.  This movie is actually a fly-through of the area in which the player is going to begin playing.  This gives the player a chance to get his or her bearings before being “on the ground.”

The screen space has pertinent information regarding the character as well as the character’s location within the game.  The upper right corner of the screen has a round map that has the cardinal directions with the character centered on this small map.  The character is represented as an arrow so that the player can see which direction the character is pointing without having to move around to get one’s bearings.  This player-centered map is similar to the Blaeu Atlas because the person doing the orienting is “inside the map.”  The Blaeu Atlas has lines emanating from points on open water toward landmarks.  These lines assist the person on the ocean in determining his or her approximate position from the landmarks seen along particular lines of sight.  The system within the game takes this a step further by providing instant feedback about the direction the player is pointed in as well as the location of the player in relation to roads and landmarks.  Another feature that assists the player in recognizing his or her location is that as the character enters a new area or approaches a landmark, the name of that place fades into the center of the screen for a few moments and then disappears.
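The player-centered minimap described above can be sketched in a few lines.  The function names and the north-up convention here are my own assumptions for illustration, not Blizzard’s actual implementation:

```python
import math

def to_minimap(player_pos, landmark_pos):
    """Translate a landmark's world coordinates into the player-centered
    minimap frame: the player sits at the origin, and the map stays
    north-up while the arrow rotates to show the character's facing."""
    return (landmark_pos[0] - player_pos[0], landmark_pos[1] - player_pos[1])

def facing_arrow(facing_deg):
    """Return the unit vector for the player's arrow (0 degrees = north,
    90 degrees = east), mirroring a compass heading."""
    rad = math.radians(facing_deg)
    return (math.sin(rad), math.cos(rad))
```

Because the landmark offsets are recomputed continuously, the player gets the instant feedback described above rather than the fixed lines of sight of the Blaeu Atlas.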

Walking around is accomplished by using the keyboard with the mouse.  The W, A, S, and D keys (corresponding to forward, left, backward, and right) are used for walking around.  The mouse orients the “camera” around the player’s character on-screen.  Moving the camera around allows the player to better see up, down, or to the sides without having to walk in that direction (as one would have to if the character’s neck were in a brace).
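The WASD scheme amounts to mapping each key to a direction vector and summing whichever keys are currently held.  This is a hypothetical sketch of that idea (the names are my own, not the game’s code):

```python
# Map each movement key to a direction relative to the camera's facing.
KEY_VECTORS = {
    "W": (0, 1),   # forward
    "S": (0, -1),  # backward
    "A": (-1, 0),  # left
    "D": (1, 0),   # right
}

def movement_vector(pressed_keys):
    """Sum the vectors of all held keys so that, e.g., W+D moves
    diagonally and W+S cancels out to no movement."""
    x = sum(KEY_VECTORS[k][0] for k in pressed_keys if k in KEY_VECTORS)
    y = sum(KEY_VECTORS[k][1] for k in pressed_keys if k in KEY_VECTORS)
    return (x, y)
```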

The ground, buildings, hills, mountains, and caves are textured so that they appear as one would expect these things to look.  There are clouds and sky above, and the ponds and lakes have shimmering water.  There are small and large animals in the forests that the player can interact with.  Other players’ characters walk around in the same areas as one’s own character.  There are also characters controlled by the game and the central game servers called non-player characters (NPCs).  These are characters from whom the player can buy equipment, and some will invite the player to undertake quests in return for rewards.  Because the world that the game is set in involves fantasy, magic, and mythical beings, the buildings and inhabitants can be fanciful.

The organization of the map, equipment, and battle function icons around the periphery of the play area of the screen (the world and the character centered on the screen) works very well.  They do not take up much area, so the player feels immersed in the game, but they are large enough to be meaningful and they all have unique icons (i.e., they adhere to HCI principles).  The player interaction with other players and the NPCs is good, but it does require referring to the help system or the user manual.  When bringing World of Warcraft to Mac OS X, Blizzard chose to do something differently than one would expect.  Within the Mac OS X Finder, you hold down the Control key while clicking with the mouse to emulate a right mouse button (because most Macs do not have a mouse with two buttons).  Inside the game, however, you have to hold down the Command key (also known as the Apple key) while clicking with the mouse in order to perform a right click (which is used for picking up loot and for communicating with players and NPCs).  If the Blizzard developers had kept this consistent with what the player was expecting from using the operating system, interaction in the game space would have been more transparent.

The world through which the player navigates is immersive.  The player’s character is modeled in three dimensions, and the world that the character walks through is also modeled in three dimensions.  Physical principles such as gravity and optics are built into the game’s underlying technology.  Features in the distance are faded from view while those things up close have a tremendous amount of detail.  Because believability and level of detail can reach a point of diminishing returns, the look of the game is not photorealistic.  The Blizzard developers strike a balance between the look and feel of the world within the game and the amount of realism necessary for an immersive 3D environment.  Some physical laws are suspended, however, because of the mythic and fantasy elements of the world.  These elements have to be accepted on faith by the player in order for the game to have any meaning for the player.

The narrative is carried by the player/character’s exploration and fulfillment of quests.  Because the environment is so expansive (like the real world), the narrative created by the player’s exploration is successful.  The terrain that the character walks through is based on models that do not change.  There are certain assumptions about perspective that are upheld within the game.  If a cliff appears to rise about three hundred yards ahead, that distance will not shift.  This is a technical consideration regarding the way that the “camera” focuses and presents perspective in the 3D world.  The game models a space of fantasy, but it must present that space in a way familiar to the experiences of its intended audience.

There is a learning curve inherent in playing a game like World of Warcraft.  As Barbara Stafford writes in “Presuming images and consuming words,” “It is not accidental that this overwhelming volume of information—likened to drinking from the proverbial firehose—coincides with a mounting concern for bolstering and maintaining language ‘literacy’” (462).  Stafford is writing about the literacy of visual images.  There are subtle cues embedded in the game that the player has to recognize in order to play the game successfully (e.g., exclamation points over NPCs that have quests to offer and question marks over NPCs who are connected to quests in progress).  Iconic information provides the best way for quick access to game controls and functions.  The player has to develop a level of literacy with these icons in order to be a proficient game player.

Additionally, the 3D environments presented in the game are similar to the descriptions of Renaissance gardens in Kenneth J. Knoespel’s “Gazing on Technology.”  The 3D environment of the game is promoting the underlying technology that makes 3D computer graphics possible in the same way that Renaissance technology was employed in building those gardens.  Knoespel writes, “Gardens, whether set out in Renaissance poetry or on the estates of the nobility, offer a controlled means for assimilating the new technology.  In each case, the audience views the machinery at a privileged distance as it would an entertainer…In fact, the garden conceals technology in its mythological narrative” (117-118).  The player does not have to understand how his or her 3D graphics accelerator works in order to enjoy the immersive experience of playing World of Warcraft.  This game is the “controlled means for assimilating the new technology” of 3D computer graphics.

Honda Asimo Robot Presentation That I Recorded on Sept 25, 2004 at Georgia Tech

I am posting this as an example of multimodal Recovered Writing.

When I was an undergraduate at Georgia Tech, I went to two presentations made by Honda and its semi-autonomous robot Asimo in 2004.

If you’ve been reading my blog, you know that I am interested in robots. I own a Robie Sr. and I enjoy building simple robots with Lego. At the time when I made this video, I was gobsmacked by Asimo’s capabilities. I thought to myself that it seemed far more social and practical than R2-D2.

The video below is from the second presentation that I attended. I sat in the front row with my friend’s Sony camcorder to capture the action taking place on stage in the Ferst Center for the Arts where the presentations took place. Later, I edited the video and burned a DVD of it with my PowerMac G5. Today, I ripped the DVD with HandBrake and uploaded it to YouTube as an MP4 video file. Now, the video exists online:

Followup to Adventures with a CustoMac: Installing Mac OS X Mavericks on Asus P8Z77-V PC

Mavericks installed on CustoMac. NB: MBPr on desk and PowerMacintosh 8500/120 on right.


Last summer, I wrote about my experiences installing Mac OS X 10.8 Mountain Lion on my Asus P8Z77-V and Intel i7-2700K PC here. What I neglected to say at the time was that an alarming number of creeping instabilities led me to ultimately abandon running Mountain Lion on my PC and return to Windows 7.

I later learned that some of these instabilities were likely linked to a bad PSU and video card–both of which were replaced by the manufacturers under warranty (awesome kudos to Antec and EVGA). With the new PSU and video card, my PC returned to 100% stability under Windows 7. This made me wonder if I could try rolling out a Mavericks installation on my PC.

Also, I wanted to use Mac OS X’s superior file content search technology and other third-party textual analysis tools in my research. I have a MacBook Pro 15″ retina (MBPr), but it lacks the hard drive capacity for my accumulated research files. The comfort that I feel in the MacOS environment and the need for lots of fast storage led me to turn my attention back to turning my PC into a CustoMac (aka “hackintosh”).

This time, I wanted to streamline and simplify my setup as much as possible and incorporate components that should work out of the box (OOB).  Toward this end, I reduced my hardware configuration from this:

  • ASUS P8Z77-V LGA 1155 Z77 ATX Intel Motherboard (disabled on-board Intel HD 3000 video and Asus Wi-Fi Go! add-on card)
  • Intel Core i7 2700K LGA 1155 Boxed Processor
  • Corsair XMS3 Series 16GB DDR3-1333MHz (PC3-10666) CL 9 Dual Channel Desktop Memory Kit (Four 4GB Memory Modules)
  • evga 01G-P3-1561-KR GeForce GTX 560 Ti 1024MB GDDR5 PCIe 2.0 x16 Video Card
  • Antec High Current Gamer 750W Gamer Power Supply HCG-750
  • Corsair Vengeance C70 Gaming Mid Tower Case Military Green
  • Cooler Master Hyper 212 Plus Universal CPU Cooler
  • Samsung 22X DVD±RW Burner with Dual Layer Support – OEM
  • Intel 128 GB SATA SSD
  • Western Digital Caviar Green WD10EARX 1TB IntelliPower 64MB Cache SATA 6.0Gb/s 3.5″ Internal Hard Drive – Bare Drive
Using on-board video and no ASUS wifi card.


to this:

  • ASUS P8Z77-V LGA 1155 Z77 ATX Intel Motherboard (using on-board Intel HD 3000 video and removing Asus Wi-Fi Go! add-on card)
  • Intel Core i7 2700K LGA 1155 Boxed Processor
  • Corsair XMS3 Series 16GB DDR3-1333MHz (PC3-10666) CL 9 Dual Channel Desktop Memory Kit (Four 4GB Memory Modules)
  • evga 01G-P3-1561-KR GeForce GTX 560 Ti 1024MB GDDR5 PCIe 2.0 x16 Video Card (removed to simplify the setup and save power–who has time for gaming?)
  • Antec High Current Gamer 750W Gamer Power Supply HCG-750
  • Corsair Vengeance C70 Gaming Mid Tower Case Military Green
  • Cooler Master Hyper 212 Plus Universal CPU Cooler
  • Samsung 22X DVD±RW Burner with Dual Layer Support – OEM
  • Intel 128 GB SATA SSD
  • Three Western Digital HDDs for file storage and work space. 
IoGear GBU521 and TP-Link TL-WDN4800 from Microcenter.


Also, I added two new components that were recommended from the TonyMacx86 Forums:

  • TP-Link 450Mbps Wireless N Dual Band PCI Express Adapter (TL-WDN4800). It works in Mavericks OOB.
  • IoGear Bluetooth 4.0 USB Micro Adapter (GBU521). It works in Mavericks OOB.

ASUS’s Wi-Fi Go! card works great in Windows 7, but it caused problems with my Mavericks installation.

As noted above, I physically removed my 560 Ti video card, because I wanted to simplify my setup for installation purposes.  Also, I removed the ASUS Wi-Fi Go! add-on card, because despite disabling it in BIOS, the Mavericks installer seemed to hang on a wi-fi device while attempting to set its locale (a setting that determines what radio settings to use based on the country that you happen to be in).  After I removed the Wi-Fi Go! card, I had a nearly flawless Mavericks installation process (NB: removing the Wi-Fi Go! card required removing the motherboard, turning it over, removing a screw holding in the Wi-Fi Go! card, turning the motherboard over, and unplugging the Wi-Fi Go! card).

These are the steps that I used to install Mavericks on my PC:

  1. Follow TonyMac’s Mavericks installation guide for making an installation USB drive and installing Mavericks.
  2. Following installation of Mavericks, boot from your USB drive, select your new Mavericks installation drive, arrive at the desktop, and run Multibeast.
  3. Select these settings in Multibeast:
    1. Quick Start > DSDT Free (I left all pre-selected options as-is. Below are additional selections that I made.)
    2. Drivers > Audio > Realtek > Without DSDT > ALC892
    3. Drivers > Disk > 3rd Party SATA
    4. Drivers > Graphics > Intel Graphics Patch for Mixed Configurations
    5. Drivers > Misc > Fake SMC
    6. Drivers > Misc > Fake SMC Plugins
    7. Drivers > Misc > Fake SMC HWMonitor App
    8. Drivers > Misc > NullCPUPowerManagement (I don’t want my machine to go to sleep)
    9. Drivers > Misc > USB 3.0 – Universal
    10. Drivers > Network > Intel – hank’s AppleIntelE1000e
    11. Customize > 1080p Display Mode
    12. Build > Install
  4. Repair Permissions on Mavericks drive from /Applications/Utilities/Disk Utility
  5. Reboot
  6. Run Chameleon Wizard (this will fix a problem that you might have with connecting to the App Store)
  7. Click SMBios > Edit > Premade SMBioses > choose MacPro 3,1 > Save
  8. Reboot
  9. CustoMac should now be fully operational!
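For reference, the SMBios step in Chameleon Wizard ultimately writes a property list telling the bootloader which Mac model to impersonate.  Assuming Chameleon’s default /Extra/smbios.plist location, the relevant fragment looks roughly like this (the exact keys your build needs may differ, and Chameleon Wizard fills in others such as serial numbers):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<!-- Impersonate a MacPro3,1 so Mavericks and the App Store accept the machine -->
	<key>SMproductname</key>
	<string>MacPro3,1</string>
</dict>
</plist>
```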

In order to arrive at the above instructions, I read a lot of first hand experiences and third party suggestions on TonyMac’s forums. I owe a tremendous debt of gratitude to the amazing community of CustoMac builders who take the time to share their thoughts and lessons and equally so to the tool-builders who create amazing software including UniBeast, Multibeast, and Chameleon Wizard!

I would suggest that you remember that there is not always one path to a successful build.  I distilled a lot of posts into my successful build.  Your experience with similar hardware might take a different path.  Reading others’ experiences and trying their suggestions experimentally can lead to your own successful discoveries.  Thus, I took the time to try out different configurations of hardware until settling on the stripped down approach with on-board video and OOB networking gear.  I tried several different installations: a failed Mavericks installation with kernel panics (Wi-Fi Go! card installed and wrong Multibeast configuration), a successful Mountain Lion installation (barebones and correct Multibeast configuration), and a successful Mavericks installation (detailed above).

Obviously, Mac OS X can run on a wide range of PC hardware given the correct drivers, configuration information, etc.  Apple could do great things if only Tim Cook and others would think differently and move beyond the tightly integrated hardware-software experience.  Apple’s engineers could build better operating systems that adapt to a person’s hardware.  Given the chance, they could challenge Microsoft and Google with a new Mac OS X that is insanely great for everyone–not just those who can afford to buy new hardware.

Now, back to using some of the tools that I use in my research on a computing platform that I enjoy:

Precipitous Drop in DynamicSubspace.net Readers in 2013, Recommitting to the Site in 2014

DynamicSubspace.net Site Stats by Year.


As you can see in the chart image to the left, dynamicsubspace.net had a significant drop in readers this past year.  From 2007 to 2012, my blog steadily gained visitors, rising from 3,772 in 2007 to 91,530 in 2012, with each year in between adding visitors.  However, the number of visitors in 2013 was 55,933, only about three-fifths of the 2012 total.  This is a significant drop in visitors to the site, and I assume it means far less outreach than I had achieved in the previous two years.

DynamicSubspace.net average visitors per day.


Of course, as you can see in the chart to the right, this means that the visitors per day declined from a high of 250 in 2012 to 153 in 2013. Far fewer visitors were stopping by dynamicsubspace.net per day.
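The per-day figures follow directly from the yearly totals; a quick check (using 366 days for leap-year 2012):

```python
# Derive the average visitors per day from the yearly totals above.
yearly_visitors = {2012: 91530, 2013: 55933}
days_in_year = {2012: 366, 2013: 365}  # 2012 was a leap year

averages = {year: round(total / days_in_year[year])
            for year, total in yearly_visitors.items()}
# averages -> {2012: 250, 2013: 153}
```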

While I blogged more about my teaching and pedagogical matters this past year (in a total of 63 blog posts), I was less frequently updating the site with some of my other interests: neurohumanities, computing, gaming, and our rights online (e.g., free speech and privacy).  In fact, as you can see in the list below, only one page in the top ten (the second most visited, a post about installing Ubuntu on a MacBook Pro) was originally published in 2013.  All of the other most-visited pages were published before 2013 (or are permanent, as in the case of the Archives page).  Unfortunately, none of my pedagogically-oriented pages (including my posts about teaching Science Fiction LMC3214 or LMC3403, Technical Communication: Lego, Haptics, and Instructions) made the top ten:

  1. Home page / Archives
  2. Steps for Installing Mac OS X 10.8 Mountain Lion and Ubuntu 13.04 Raring Ringtail in Dualboot Configuration
  3. Safari Web Content Hogging RAM and CPU Time, Thread on Apple Support Communities
  4. 1080p Trouble with Windows 7, Nvidia, and Samsung LCD HDTV
  5. The United States and Canada Declare War on Japanese Manga and Lolicon
  6. On Forced Deep Throat in Aliens Vs. Predator Requiem
  7. Enable TRIM in Mac OS X 10.6 Snow Leopard for Speed and Longer SSD Life
  8. Lego Launch Command Sets For Sale
  9. Xbox 360 Controller Driver for Mac OS X, Works Like a Dream
  10. Lego Star Wars Luke’s Landspeeder 8092

I hope to expand my readership in the coming year with my recently announced Recovered Writing initiative. I have a lot of writing squirreled away that I believe should be made publicly available on my blog. Perhaps others will find some value in these works that helped me develop as a writer, scholar, and teacher.

Despite the time required for teaching and research, I also intend to record my thoughts and provide commentary on dynamicsubspace.net more than I did in 2013.  I believe blogging is one important way for scholars and professors to be public intellectuals.  The blog is a medium for multimodal composition, audience engagement, and rigorous elaboration.  While Twitter and other social media have their useful affordances, blogging gives us the technology and space to explore, engage, and discuss matters more fully than we might otherwise be able to do in other forms of social media.  We should use a variety of writing and interaction technologies in our work as scholars and public intellectuals.  I commit to redoubling my efforts on dynamicsubspace.net in the new year–more scholarship, more pedagogy, and more engagement!

DevLab’s End of Semester Best Computing Practices Workshop, Wed, Dec 4, 2013, 4-5PM

S is for Security!


Our computers and other computing devices store some of our most important belongings: photos, videos, music, syllabi, research, and manuscripts. We owe it to ourselves to maintain and protect these things through best practices in computer maintenance, security, backups, and training. During the upcoming winter break, I would like to encourage everyone to spend some time putting your cyber-house in order before the spring semester begins.

To help you with this and to promote best practices, I will hold a workshop in DevLab on Wednesday, Dec. 4 from 4:00-5:00PM before D-Ped. Workshop participants are encouraged to bring their Mac or PC to the meeting. Tablets are also welcome.

Before or after the workshop, you can download the first version of my best practices guide from here: ellis-jason-best-computing-practices-v1.pdf

If you have a question for the workshop that I cannot answer off the top of my head, we can use the workshop as an opportunity to learn something new together.

See you in DevLab!

My Poster for SAMLA 2013: The Brittain Fellowship’s DevLab: Space, Resources, Expertise, and Collaboration

My DevLab Poster.


This year, Georgia Tech’s Writing and Communication Program and its Brittain Fellows had a strong presence at the annual South Atlantic Modern Language Association meeting in Atlanta, Georgia.

I presented a poster on the program’s R&D unit, DevLab. To compose the poster, I took a panoramic photo of DevLab’s main space (we also have an external recording booth). My students and fellow Brittain Fellows are pictured doing work and collaborating at various events over the past few months.

Standing next to my poster in the Buckhead Marriott.


We also had posters on the Communication Center, our pedagogical research, and our scholarly research. Here’s a list of all posters from the official program:

10. Georgia Institute of Technology Brittain Fellowship,

Poster Series I

The Marion L. Brittain Postdoctoral Fellowship at the Georgia Institute of Technology

a. Jason Ellis, DevLab: Research and Development Lab Facility

b. CommLab: Tutoring Center for Multimodal Communication

11. Georgia Institute of Technology Brittain Fellowship, Poster Series II

WOVEN: Multimodal Communication in the Classroom

a. Joy Bracewell

b. Jennifer Lux

c. Julia Munro

12. Georgia Institute of Technology Brittain Fellowship, Poster Series III

Intersections between Scholarship and Pedagogy

a. Aaron Kashtan

b. Jennifer Orth-Veillon

c. Aron Pease

13. Georgia Institute of Technology Brittain Fellowship, Poster Series IV

Changing Higher Education

a. Mirja Lobnik, World Englishes Committee

b. Multiple Presenters, Curriculum Innovation Committee

c. Arts Initiatives Committee

Besides participating in the poster session, I also took notes from N. Katherine Hayles’ plenary lecture on Friday afternoon. I will post my notes from that talk here soon.

Next year, I will propose a paper for SAMLA and hopefully present an updated version of the DevLab poster. See you there!