Category: Computers

  • Digital Archives and Vintage Computing @ Georgia Tech, Co-Presentation by Wendy Hagenmaier and Jason W. Ellis, VCF 2.0

    On May 4, 2014 at 11AM, Wendy Hagenmaier and I will give a co-presentation on Digital Archives and Vintage Computing @ Georgia Tech at the Vintage Computing Festival 2.0 in Roswell, Georgia. This post includes a supporting video embedded below, a link to our PowerPoint presentation, and a transcript of our talk.

    During my part of the presentation, I will discuss this Google Glass-captured demo of the Voyager Expanded Books series ebook of William Gibson’s Sprawl Trilogy on a Powerbook 145:

    We have provided a transcript of Jason’s part of the presentation below (and Wendy’s follows):

    Digital Archives and Vintage Computing at Georgia Tech

    Jason W. Ellis and Wendy Hagenmaier

    Jason:

    [OPENING SLIDE-COMPUTERS]

    Hello and welcome to our presentation on Digital Archives and Vintage Computing at Georgia Tech. I am Jason Ellis, a Marion L. Brittain Postdoctoral Fellow, and this is Wendy Hagenmaier, Digital Collections Archivist at the Georgia Tech Library.

    In the first part of our presentation on digital archives and vintage computing at Georgia Tech, I will describe how these fit into my research and teaching before suggesting how the library can fulfill those needs for the communities it serves. Wendy will conclude with a discussion of the trajectory of the Georgia Tech Library as a place of research, learning, and making beyond the traditional image of a library.

     

    [JASON W. ELLIS]

    My primary work at Tech is to teach first year composition, tech comm, and occasionally, science fiction.

    [HOW I CAME TO FOLD VINTAGE COMPUTING INTO MY WORK]

    I have long considered myself a computer hobbyist, and I was an IT professional before going back to school to finish my degrees. I have leveraged that interest in computer technology and the human brain to do innovative research on the interplay between the digital and the biological. This work raises issues of accessing digital culture on older media and making meaning from these significant forms of culture. These issues are important to my research, but I want to enrich my teaching and help my students develop their digital literacies, too.

    [AUTHOR’S AFTERWORD]

    What specifically led me down this path professionally was that I needed to find a citation for a text I found online. It was an intriguing article attributed to the cyberpunk SF writer William Gibson on a Russian website (cyberpunk.ru). In it, he talks about the ephemerality of technologies—a very interesting idea in light of the fact that he wrote his novel Neuromancer on a typewriter. The afterword seemed ephemeral, too, because I couldn’t find a trace of it in any printed book. A friend of mine tweeted Gibson (@GreatDismal) and gave me a lead on a floppy disk-based ebook by the Voyager Company. After a search in Worldcat, the massive library database, I found a copy at the Michigan State University Library: the pictured Voyager Expanded Book series floppy disk of Gibson’s Sprawl Trilogy (Neuromancer, Count Zero, and Mona Lisa Overdrive). Unfortunately, I had no way of reading it.

    [POWERBOOK 145]

    After calling around northeast Ohio area schools and libraries without any luck finding a Macintosh with a 3.5” floppy disk drive, I turned to eBay, where I acquired this Powerbook 145 (one much like the first computer I carried to Georgia Tech as a freshman in 1995). While I could have purchased an external floppy disk drive that connects with USB to access the ebook software, I wanted to experience the ebook as it was meant to be experienced.

    [VOYAGER EBOOK SOFTWARE]

    With my Powerbook 145 and the Voyager Expanded Books floppy disk, I copied the self-expanding archive’s contents to the Powerbook’s 80 MB hard drive. I observed that the Voyager ebook software is HyperCard-based. While it is made for the Macintosh Portable, it works fine on the later-model Powerbook 145.

    You can navigate the complete text of the novels and the afterword with the trackball or arrow keys. While it has a global search box, you can also search by clicking on a word to see where else the word appears (much like Apple’s iBooks today). It supports annotations and bookmarking with virtual paperclips—an example of remediation.
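
    To make the click-a-word search concrete, here is a minimal sketch, in Python, of the kind of word-occurrence index such a reader could use. It is my own illustration (with invented sample pages), not Voyager’s HyperCard implementation:

    ```python
    # Minimal sketch of a click-a-word search: map each word to the pages it
    # appears on, then look a clicked word up. Illustrative only; this is not
    # Voyager's HyperCard implementation.
    import re
    from collections import defaultdict

    def build_word_index(pages):
        """Map each lowercased word to the set of page numbers it appears on."""
        index = defaultdict(set)
        for page_number, text in enumerate(pages, start=1):
            for word in re.findall(r"[a-z']+", text.lower()):
                index[word].add(page_number)
        return index

    def occurrences(index, clicked_word):
        """Return the sorted page numbers on which the clicked word appears."""
        return sorted(index.get(clicked_word.lower(), set()))

    if __name__ == "__main__":
        # Invented sample pages for the illustration.
        pages = [
            "Page one mentions the afterword and the trilogy.",
            "Page two mentions the Sprawl for the first time.",
            "Page three returns to the Sprawl once more.",
        ]
        index = build_word_index(pages)
        print(occurrences(index, "Sprawl"))  # -> [2, 3]
    ```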

    [AUTHOR’S AFTERWORD IN VOYAGER EBOOK]

    This was the prize that I was looking for—the original author’s afterword available only in this ebook. In fact, Gibson did not even include it in his recent collection of nonfiction writing—Distrust That Particular Flavor. If you visit my blog at dynamicsubspace.net, you can watch an experimental video that I made with the Powerbook 145, Gibson’s ebook, an iPad Air, and my Google Glass.

    [LET ME DO THAT FOR YOU]

    Besides my research with and on vintage computing, I believe that these technologies should be an important part of teaching. Our students and young people need to have an idea about how the technology we enjoy today came to be the way that it is and to know that the past is full of ideas that might be repurposed, retried, or rediscovered as we continue developing ever new digital technologies.

    For example, when I was researching Philip K. Dick in the Eaton Science Fiction Collection at the University of California at Riverside—the largest SF collection in the US if not the world—I had to stop a young special collections librarian-in-training from jamming a one-of-a-kind cassette tape interview into a VHS machine on the AV cart. I directed her attention to the record/cassette combo on the bottom rack and offered, “Let me show you how to do that.” These issues of use, operation, and support are passed on through teaching and first-hand experience.

    [HOW I CONNECT RESEARCH AND TEACHING]

    In my research, I have built a personal “Retrocomputing Lab” of Macs and PCs that supports my research on the development of reading on screens just prior to and after the widespread adoption of the Internet. You can learn more about these on dynamicsubspace.net.

    Most recently, I have embarked on a new way of sharing my research with others. In addition to writing essays for publication in journals and online, I am using Google Glass to record my experiences as a raw dataset that I can share on YouTube to support my scholarship and connect with others.

    In my teaching, I encourage my freshman students to learn how our computing technologies, past and present, influence our neurobiology—put another way, how we create computers with our brains and how computing technologies change the way that we think over time. In Tech Comm, I have students research problems on the Tech campus that can be fixed with a technical communication solution. In one case, students resurrected an online printing solution that had died before they were students. Finally, in Science Fiction, I invite students to read Gibson’s afterword on the Powerbook and play the DOS video game interpretation of Neuromancer on an IBM-compatible PC.

    [A VISION FOR THE FUTURE OF GEORGIA TECH]

    My suspicion is that the need for accessing older media, studying vintage computing hardware and software, and teaching others how to use and preserve these technologies is not limited to literary and cultural studies. Obviously, computing is an interdisciplinary endeavor—specifically, I am thinking of what Steve Jobs said about Apple being at the intersection of technology and the liberal arts—and I think that this is a long tradition in computing not confined to the fine work at Apple.

    I told Wendy, Sherri Brown, Alison Valk, and Elizabeth Rolando about my hopes for the Georgia Tech Library to serve as a synthesis of vintage computing research and teaching. The library’s archival mission can maintain access to knowledge while preserving hardware and software as important artifacts of study. The library’s learning mission can support theoretical issues such as archival work and the history of science and technology alongside practical issues of training, using, and making. The library can do this through acquisition and ongoing support, providing space for this kind of work, coordinating across institutions and the private sector, outreach, and more. Already, the Georgia Tech Library is a nexus of research and teaching that evolves to meet the research and learning needs of the communities that it serves. Wendy will tell us more about that in the next part of our presentation.

    We have provided a transcript of Wendy’s part of the presentation below:

    Hi everyone, I’m Wendy Hagenmaier, the Digital Collections Archivist at the Georgia Tech Library. I’m responsible for digital archives (similar to the work Al and Anne have discussed).

     

    Reimagining the Georgia Tech Library

    In light of Jason’s insights, I want to talk about some exciting changes happening at the Georgia Tech Library—changes we’ve been referring to as “reimagining the Library.” Though some of these changes are unique to Georgia Tech, many of them reflect how libraries everywhere are evolving to anticipate the needs of future library users, including people like Jason and all of you, the attendees here today.

    The GT Library is transforming into a technological research library for the 21st century, but its mission remains the same: to be a creative partner and essential force in the learning community and the Institute’s programs.

    At the GT Library lately, we’ve been asking ourselves: How can we support the research and teaching needs of faculty like Jason and inspire the scholarship of our broader community? And how can we invite the community to explore the past and design the future? As an archivist, I’m always interested in what the past can teach us about the future, so let’s take a quick look at the GT Library of long ago…

    The Georgia Tech Library of the Past

    Welcome to the Library of the 1960s.

    Like many research libraries of the era, the GT Library provided services to support traditional, print book and journal-based research. The emphasis was on creating the most massive collection of print material possible, to position the library as a secluded, exclusive repository of knowledge that could only be found within a print collection. Imagine the shushing librarian, no food, no drink, no talking.

    This worked well for a while, but radical changes in research and daily life on campus—mobile/ubiquitous/wearable technologies, Massive Open Online degrees, flipped classrooms, project based learning, digital repositories, university history now enacted on YouTube and Twitter—have made it essential that the Library undergo its own transformation. Print book checkouts are declining, but the number of visitors to the Library is exploding and users are accessing our e-resources over a million times a year. So here we are, at the Georgia Tech Library of the Present:

    The Georgia Tech Library of the Present

    In light of the cultural shifts I mentioned, the Library is presently planning its own shifts, both literally and metaphorically, on several fronts:

    Here’s the first literal shift: the GT Library and Emory Libraries are partnering to construct a large climate-controlled facility to house the majority of our collection. This means we’re moving perhaps as much as 90% of our print collection to Emory’s Briarcliff campus. Books will be delivered to users on demand, and traditional browsing of physical library stacks will have to be translated into the digital realm.

    Another shift: the Library is conducting user research with students and faculty, including focus groups, interviews, and surveys, to develop a shared vision for the Library’s future.

    The walls of our 1960s buildings are now covered with post-it notes from dozens of internal brainstorming sessions, where we’re defining and innovating future services.

    And another literal shift: we’re working with an architectural team to completely redesign the interiors of our buildings over the next five years.

    Through reimagined spaces and services, the Library is becoming an interdisciplinary platform for scholarship, an integrated network of human and technological resources, and a champion of innovation.

    The Georgia Tech Library of the Future

    My colleague Sherri Brown and I interviewed Jason a few months ago as part of the Library’s user research, and he brought up the idea that the GT community has unmet retrocomputing needs. Faculty members from all sides of campus are encountering the need to access information stored on outdated media and to teach their students about the history of technology.

    This academic interest in retrocomputing parallels the digital archaeology work being conducted in libraries and archives—everywhere from Emory’s Digital Archives to the New York Public Library. Archivists at these institutions are using old hardware and software to access and preserve content created with obsolete technologies (such as Salman Rushdie’s manuscripts saved on floppy disks). To date, however, all of the retrocomputing work in the library world has been conducted by library staff. These digital archaeology labs are not accessible to the libraries’ user communities.

    My colleagues Jason, Sherri, Alison Valk, Lizzy Rolando and I are trying to imagine how we might do something different at the GT Library: offer our technologically-savvy patrons a chance to use the retrocomputing equipment typically restricted to library staff.

    This might take the form of one or two retrocomputing consoles—or perhaps a larger lab—within the Library, which would be available to users who would be vetted by Library staff.

    The idea is to take the digital forensics and archaeology work occurring behind the scenes in archives, plus the rise of hacker and makerspaces in libraries, plus collaborations with campus and community partners (perhaps even you?)…to imagine creating a retrocomputing lab. This space would not only serve as a hands-on historical reference point; it could activate new ideas about future technology and preservation of tools and ideas.

    So how could we make this space happen, and how might we collaborate? Collectors, experts, and community organizations like the Atlanta Historical Computing Society could support an idea like this through:

    -equipment sourcing

    -IT support and expertise, knowledge of the history of computing

    -and mentorship

    In return, a project like this might someday offer collectors, experts, and community organizations:

    -a collaborative meeting and hacking space, for making connections with like-minded people and hacking the past, present and future

    -space dedicated to preservation (libraries specialize in preservation environments in a way that most individuals and community groups can’t)

    -as well as infrastructure, branding, and support for community organizations seeking institutional allies

    In many ways, the retrocomputing space we’re envisioning resembles the high tech computing lab of Georgia Tech’s past, which once seemed so futuristic and advanced, bringing us full circle, so that imagining the future of our Library becomes an act of reimagining our past.

  • Lego Building Experiments with Google Glass, Thoughts on Its Potential for Interdisciplinary Humanities Research

    Since I received my Google Glass last week, I have been learning how to wear and use it. Ultimately, I want to incorporate Glass into my Retrocomputing Lab research workflow. I am interested in the experience of using computer hardware and software (something that I have been interested in for a long time and wrote about as an undergraduate), so Glass will provide a way of capturing some of my phenomenal experience–perspective, vision, and sound. I can provide oral commentary on my haptic and olfactory experiences (yes, computers have unique smells–something that helps store/recall memories and emotions) while also recording thoughts, memories, and asides that enrich my shared video experience. As one component of the digital humanities, I want to create an archive of my raw research of working with computers and their software that others can use, draw inspiration from, or comment on through their own research, writing, and teaching.

    For the work that I do in my personal Retrocomputing Lab, I will use Glass as one more tool among a variety of other technologies that enable my research. Glass will add another data layer–itself richly textured and layered with audio/video/Internet/software capabilities–to the research that I do. Due to the ease of sharing images and video in real time, I can immediately make my in-process research available on YouTube, Twitter, and here on dynamicsubspace.net. Furthermore, my research will be usable by others–hobbyists, students, and other researchers in many interdisciplinary fields. Glass will join my non-real-time distribution of data on paper, computer-written notes (though I could make these freely viewable in real time on, say, Google Drive), and published research.

    Finally, I am interested in the mixing of old and new technologies. Glass meets its ancestors in the IBM PC and Macintosh. Glassware meets DOS, Windows 3.1, and System 7. I want to explore how the intermingling of these technologies leads to new insights, connections, and elaborations. While I am only speculating, I strongly believe that Glass and similar wearable computing technologies will elevate the outcomes and knowledge produced in humanities research–conceptualized as interdisciplinary like mine or not.

    The videos included in this post were tests of the manually extended video recording feature. They don’t involve the Retrocomputing Lab, because using Glass to record that work will require more and different kinds of planning. Instead, I used what I had at hand to test out Glass’s video capabilities in the videos below.

    Glass Video from Apr 22, 2014, Lego Build of The Batman Tumbler 30300 Polybag

    Glass Video from Apr 24, 2014, Target Exclusive Lego 30215 Legolas Greenleaf Polybag

  • Recovered Writing: Undergraduate Technologies of Representation Final Essay Response on Communication Tech and World of Warcraft, Dec 8, 2004

    This is the fourteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    This is my final post of material from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is my final paper assignment (I think given in lieu of a final exam) in LCC3314. The more exciting portion is question 2, which concerns Blizzard’s World of Warcraft. I break down how you navigate its space and I describe elements of its operation. It bears noting that at the time that I wrote this, WoW had been out for less than a month. I was rabidly playing it on my PowerMac G5 at 2560×1600 resolution on a 30″ Apple Cinema Display. While it might not have been the best essay, it certainly was one that I enjoyed writing to no end! I wish that I had found a way to make time for WoW since my days in Liverpool. I have played WoW on only rare occasions since returning to the States, but I continue to write about it from my memory of Azeroth.

    Also included below is my response to question 1, which seems to be focused on the telegraph, telephone, and cellular phone. In this question, I explore the material experience of using these different communication media and technological devices. I suppose WoW is another kind of communication technology wrapped up in a highly interactive gaming environment (cf. Hack/Slash).

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    December 8, 2004

    Final Paper Assignment

    1. On the telegraph, telephone, and cellular phone

    The telegraph, telephone, and cell phone each have a particular interface that works with different human senses and thus provides a different experience for the body.  The differences between these communication technologies lie in the physicality of the artifact as well as in the underlying technology for encoding and decoding communication.

    The telegraph is a wired point-to-point textual communication technology.  Telegraph operation involves trained operators who can encode and decode the Morse code messages transmitted over wires with telegraph machines.  The process of sending a telegram involves finding a business that offers telegraph service, going there in person, and telling the telegraph operator the message to send; the operator encodes the message with the telegraph machine, the destination operator receives and decodes it, and a delivery person is dispatched to hand deliver the message to the recipient.  The experience of the telegram sender is standing at a counter and speaking with an operator.  The receiver interfaces with a delivery person who hands them a piece of paper containing the message.  The technology that makes sending and receiving messages over great distances possible is removed from the experience of the sender and receiver.  The sender and receiver also have to rely on a network of operators and delivery persons.  These people are in a unique position to view the correspondence between the sender and receiver.  This fact is probably something that senders of telegrams were well aware of.
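
    As a concrete illustration of the encoding and decoding work those operators performed, here is a minimal sketch in Python using a small subset of International Morse code. It is my own illustration, not part of the original essay:

    ```python
    # Minimal sketch of the operators' encode/decode step, using a small subset
    # of International Morse code. Illustrative only; not part of the original essay.
    MORSE = {
        "A": ".-", "E": ".", "H": "....", "L": ".-..", "O": "---",
        "S": "...", "T": "-",
    }
    REVERSE = {code: letter for letter, code in MORSE.items()}

    def encode(message):
        """Encode a message; letters are separated by spaces, words by ' / '."""
        return " / ".join(
            " ".join(MORSE[ch] for ch in word if ch in MORSE)
            for word in message.upper().split()
        )

    def decode(signal):
        """Decode a signal produced by encode()."""
        return " ".join(
            "".join(REVERSE[code] for code in word.split())
            for word in signal.split(" / ")
        )

    if __name__ == "__main__":
        wire = encode("HELLO")
        print(wire)          # .... . .-.. .-.. ---
        print(decode(wire))  # HELLO
    ```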

    The telephone is a wired point-to-point oral communication technology.  Telephones encode auditory information into electrical signals which travel over copper wires in a phone network to the receiving telephone, which decodes the electrical signals back into auditory information (the spoken voice).  Telephones allow users to hear the voice of the person they are speaking with.  One problem with telephones is that the technology uses a narrow band of audible sound that can cause “m” to sound like “n” or “b” to sound like “d.”  Initially, telephones were prohibitively expensive and were direct-wired from location to location.  After telephone networks were made possible with human operator switching technology, voice phone calls could be routed from the call initiator to the call receiver.  Therefore, over time the phone network’s mediation shifted from human operators to electrical switching technology.  When you made a call, you would speak to an operator first, and then to the person that you were calling.  Now, one can dial a number and the phone network’s automatic switching technology connects the caller with the receiver.  Someone who makes a phone call assumes privacy when the call is made from home or within an enclosed space such as a phone booth.  The physical interaction between the user and the telephone is that a handset is lifted off the base and held to the ear and mouth.  The user taps out a phone number on the base or dials a number with a rotary phone base.  The telephone user experiences an interaction with a disembodied voice.

    The cell phone is an unwired point-to-point oral and textual communication technology.  Modern cell phones are a synthesis of the telegraph, telephone, digital photography, video technology, and radio technology.  Cell phones facilitate voice conversations between two cell phones or between a cell phone and a wired telephone.  They also allow for text messaging, audio messaging, picture messaging, and video messaging.  Widespread cell phone use is shifting voice phone conversation into a more commonplace activity.  Additionally, the private sphere of telephone conversation is shifting to the public sphere of wherever the cell phone user answers or makes a phone call.  Cell phones also connect to the Internet and Internet-based text messaging networks such as AOL Instant Messenger.  The cell phone has become a place of contact for the individual in more ways than merely talking on the phone.  It builds connections between the individual and others as well as between the individual and information (e.g., online weather information, movie listings, online news websites, etc.).  With ear bud speaker/microphones that plug into cell phones or wireless Bluetooth headsets, one can interface with the auditory communication features of a cell phone without needing to hold it up to the ear and mouth as one would with a traditional telephone.  The cell phone user also interfaces with a disembodied voice, but the cell phone has other means of interaction with people as well as with information.

    The telegraph is not an interactive means of communicating in the way that the telephone and the cell phone are.  With the telephone or the cell phone, one can have a real-time conversation with someone else, whereas with the telegraph, there is a delay between sending a message, delivery, and if need be, a return message.  The amount of information that can be conveyed in a transmission has increased over time.  The telegraph had a finite amount of information that could be conveyed because of the time and cost of sending messages with Morse code.  The telephone increased the amount of conveyed information because it was a disembodied voice that could carry nuances of speech and emotive information (e.g., happiness, sadness, anger, etc.).  The cell phone has brought these communication systems full circle with the creation of a synthesis of voice and text.  Along with oral communications, there is so much textual and graphic information that can be conveyed through a cell phone.  Barbara Stafford writes, “we have been moving, from the Enlightenment forward, towards a visual and, now, an electronically generated, culture” (“Presuming images and consuming words” 472).  The cell phone represents the bringing together of communication, both between people and between people and sources of information.  Walter J. Ong writes in Orality and Literacy, “By contrast with vision, the dissecting sense, sound is thus a unifying sense.  A typical visual ideal is clarity and distinctness, a taking apart…The auditory ideal, by contrast, is harmony, a putting together” (71).  The modern cell phone brings together the visual and the oral in a way that previous communication technologies had not.  This unification ties two of the most powerful human senses (sight and sound) to the cell phone, and it distinguishes the cell phone from the telegraph and telephone.

    An interesting development in these technologies is that the perception is that better communication technologies lead to better communication between individuals (i.e., a bringing together of individuals).  George Myerson writes in Heidegger, Habermas, and the Mobile Phone, “There’s no real gathering at all.  Instead, there are only isolated individuals, each locked in his or her own world, making contact sporadically and for purely functional purposes” (38).  Thus, the cell phone has disconnected the individual from the wall phone where one might be “waiting on an important call.”  Casualness and importance are intertwined in the use of the cell phone.

    I used Paul Carmen’s paper on the telegraph, Amanda Richard’s paper on the telephone, and Kevin Oberther’s paper on the cell phone as starting points for this essay.

    2. On World of Warcraft

    Blizzard Entertainment’s World of Warcraft video game was released on November 23, 2004 for both Windows and Mac OS X.  It is a massively multiplayer online role playing game (MMORPG) that immerses the player in a 3D fantasy world where the player is able to create a character based on several layers of identity (e.g., allegiance:  alliance or horde, races:  humans, dwarves, night elves, gnomes, orcs, tauren, trolls, or undead, and classes:  warrior, mages, druids, hunters, rogues, etc.).  After building one’s character (including designing a unique appearance), you choose a realm in which to play.  These realms correspond to computer servers that are in a particular time zone.  Other players around the world pick one of these realms to play in that best corresponds to when they will be playing, or when their friends will be playing.  The player is able to meet up with friends within a realm to go on adventures together, and if the player doesn’t know anyone, he or she can communicate with other players to form groups (large and small) to go on adventures with.  The objective of the game is to gain levels, complete quests, and to battle the forces opposite of your allegiance.  Working with others is the key to success in World of Warcraft.

    When the player first enters the game, a movie clip is played that gives some introductory backstory information so that the player has a general idea about what is going on.  This movie is actually a fly-through of the area in which the player is going to begin playing.  This gives the player a chance to get his or her bearings before they are “on the ground.”

    The screen space has pertinent information regarding the character as well as the character’s location within the game.  The upper right corner of the screen has a round map that has the cardinal directions with the character centered on this small map.  The character is represented as an arrow so that the player can see which direction they are pointing without having to move around to get one’s bearings.  This player-centered map is similar to the Blaeu Atlas because it is built around the idea that the person who needs to do the orienting is “inside the map.”  The Blaeu Atlas has lines emanating from points on open water toward landmarks.  These lines assist the person on the ocean to determine their approximate position from the landmarks that they see on particular lines of sight.  The system within the game takes this a step further by providing instant feedback on the direction the player is pointed in as well as the location of the player in relation to roads and landmarks.  Another feature that assists the player with recognizing one’s location is that as the character enters a new area or approaches a landmark, the name of that place will fade into the center of the screen for a few moments and then disappear.
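
    To illustrate the player-centered mapping described above, here is a minimal sketch of projecting a world landmark onto a north-up minimap centered on the character. It is my own sketch, with assumed coordinate conventions, and not Blizzard’s code:

    ```python
    # Minimal sketch of a north-up, player-centered minimap projection.
    # Coordinate conventions are assumptions for this sketch; not Blizzard's code.
    import math

    def minimap_position(player_xy, landmark_xy, map_radius_px=50.0, world_radius=100.0):
        """Project a world landmark onto a circular minimap centered on the player.

        Returns pixel offsets from the minimap's center (x east, y north), or
        None if the landmark lies beyond the minimap's range.
        """
        dx = landmark_xy[0] - player_xy[0]
        dy = landmark_xy[1] - player_xy[1]
        if math.hypot(dx, dy) > world_radius:
            return None
        scale = map_radius_px / world_radius
        return (dx * scale, dy * scale)

    def arrow_angle(player_heading_rad):
        """The player's arrow simply shows the character's facing direction."""
        return player_heading_rad

    if __name__ == "__main__":
        # Player at the origin; a landmark 40 units east and 40 units north.
        print(minimap_position((0.0, 0.0), (40.0, 40.0)))      # (20.0, 20.0)
        print(math.degrees(arrow_angle(math.radians(90.0))))   # 90.0
    ```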

    Walking around is accomplished by using the keyboard with the mouse.  The W, A, S, and D keys (corresponding to forward, left, backward, and right) are used for walking around.  The mouse orients the “camera” around the player’s character on-screen.  Moving the camera around allows the player to better see up, down, or to the sides without having to walk in that direction (i.e., as if the character’s neck were in a brace).
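
    Similarly, here is a minimal sketch of how W, A, S, and D input can be turned into a movement vector relative to the mouse-controlled camera. Again, the conventions are assumptions of mine rather than anything from the game:

    ```python
    # Minimal sketch of turning W, A, S, D input into a movement vector relative
    # to the mouse-controlled camera yaw. Axis and yaw conventions are assumptions;
    # this is not Blizzard's input code.
    import math

    def move_vector(keys_down, camera_yaw_rad, speed=1.0):
        """Return a world-space (x, y) movement vector for the pressed keys."""
        forward = ("w" in keys_down) - ("s" in keys_down)
        strafe = ("d" in keys_down) - ("a" in keys_down)
        if not forward and not strafe:
            return (0.0, 0.0)
        # Normalize so diagonal movement is not faster than straight movement.
        length = math.hypot(forward, strafe)
        forward, strafe = forward / length, strafe / length
        fx, fy = math.cos(camera_yaw_rad), math.sin(camera_yaw_rad)  # camera-forward axis
        rx, ry = fy, -fx                                             # camera-right axis
        return (speed * (forward * fx + strafe * rx),
                speed * (forward * fy + strafe * ry))

    if __name__ == "__main__":
        print(move_vector({"w"}, math.radians(0.0)))       # straight ahead
        print(move_vector({"w", "d"}, math.radians(0.0)))  # diagonal forward-right
    ```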

    The ground, buildings, hills, mountains, and caves are textured so that they appear the way one would expect these things to look.  There are clouds and sky above, and the ponds and lakes have shimmering water.  There are small and large animals in the forests that the player can interact with.  Other players’ characters walk around in the same area that you may be in.  There are also characters controlled by the game and the central game servers, called non-player characters (NPCs).  These are characters that you can buy equipment from, and some will invite you to undertake quests in return for rewards.  Because the world that the game is set in involves fantasy, magic, and mythical beings, the buildings and inhabitants can be fanciful.

    The organization of the map, equipment, and battle function icons around the periphery of the play area of the screen (the world and the character centered on the screen) works very well.  They do not take up that much area, so the player feels immersed in the game, but they are large enough to be meaningful and they all have unique icons (i.e., the interface adheres to HCI principles).  The player interaction with other players and the NPCs is good, but it does require referring to the help system or the user manual.  In World of Warcraft on Mac OS X, Blizzard chose to do something different from what one would expect.  Within the Mac OS X Finder, you hold down the Control key while clicking with the mouse to emulate a right mouse button (because most Macs do not have a mouse with two buttons).  Inside the game, however, you have to hold down the Command key (also known as the Apple key) while clicking with the mouse in order to perform a right click (which is used for picking up loot and for communicating with players and NPCs).  If the Blizzard developers had kept this consistent with what the player was expecting from using the operating system, interaction in the game space would have been more transparent.

    The world through which the player navigates is immersive.  The player’s character is modeled in three dimensions, and the world that the character walks through is also modeled in three dimensions.  Physical principles such as gravity and optics are built into the game’s underlying technology.  Features in the distance are faded from view while those things up close have a tremendous amount of detail.  Because believability and level of detail can reach a point of diminishing returns, the look of the game is not photorealistic.  The Blizzard developers strike a balance between the look and feel of the world within the game and the amount of realism necessary for an immersive 3D environment.  Some physical laws are suspended, however, because of the mythic and fantasy elements of the world.  These elements have to be accepted on faith by the player in order for the game to have any meaning for the player.

    The narrative is carried by the exploration and fulfillment of quests by the player/character.  Because the environment is so expansive (like the real world), the narrative created by the exploration of the player is successful.  The terrain that the character walks through is based on models that do not change.  There are certain assumptions about perspective that are upheld within the game.  If a cliff appears to rise about three hundred yards ahead, that distance will not shift.  This is a technical consideration regarding the way that the “camera” focuses and presents perspective of the 3D world.  The game models a space of fantasy but it must present it in a familiar way to the experiences of its intended audience.

    There is a learning curve inherent in playing a game like World of Warcraft.  As Barbara Stafford writes in “Presuming images and consuming words,” “It is not accidental that this overwhelming volume of information—likened to drinking from the proverbial firehose—coincides with a mounting concern for bolstering and maintaining language ‘literacy’” (462).  Stafford is writing about the literacy of visual images.  There are subtle cues embedded in the game that the player has to recognize in order to play the game successfully (e.g., exclamation points over NPCs that have quests to offer and question marks over NPCs who are connected to quests in progress).  Iconic information provides the best way for quick access to game controls and functions.  The player has to develop a level of literacy of these icons in order to be a proficient game player.

    Additionally, the 3D environments presented in the game are similar to the descriptions of Renaissance gardens in Kenneth J. Knoespel’s “Gazing on Technology.”  The 3D environment of the game is promoting the underlying technology that makes 3D computer graphics possible in the same way that Renaissance technology was employed in building those gardens.  Knoespel writes, “Gardens, whether set out in Renaissance poetry or on the estates of the nobility, offer a controlled means for assimilating the new technology.  In each case, the audience views the machinery at a privileged distance as it would an entertainer…In fact, the garden conceals technology in its mythological narrative” (117-118).  The player does not have to understand how his or her 3D graphics accelerator works in order to enjoy the immersive experience of playing World of Warcraft.  This game is the “controlled means for assimilating the new technology” of 3D computer graphics.

  • Recovered Writing: Undergraduate Technologies of Representation Essay on a Future Technology, Personal Computing Device, Nov 18, 2004

    This is the thirteenth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is another example of a WOVEN multimodal essay assignment. In it, I used WVE/written, visual, electronic modes to discuss a specific technology. These essays (focused on past, present, and future technologies) gave me a chance to use technology to explore the meaning behind and impact of technologies. This essay focuses on a future technology of my own design.

    In this essay assignment, we were tasked with exploring an imagined future technology. At the time, I was fascinated with wearable computing. However, I only knew about it from my reading in magazines and online. I could not afford a 2004-era wearable computing rig, so I thought about how to improve on an idea of wearable computing for everyone. If only I had made a few more connections–namely touch and the phone.

    Nevertheless, I had a lot of fun designing the PCD and writing this essay.

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    November 18, 2004

    Artifact of the Future – Personal Computing Device

    Personal Computing Device – PCD (Drawing by Jason Ellis)

    The Artifact

    The Personal Computing Device (PCD) is an inexpensive and portable computer that can interface with many different input/output (I/O) components.  It is a one-piece solution to the ubiquity of computing and information storage in the future.  Its plain exterior hides the fact that this artifact is a powerful computing platform that transforms “dummy terminals” into points of access where one may reach his or her own computer, which is small enough to fit in a shirt pocket.

    Description

    The device measures 3″ wide by 4″ tall by 3/4″ thick.  On one of the long sides there is a small 1/4″ notch.  This notch matches with a similar notch on the interface port of wearable computer networks, computing stations, and entertainment systems.  The notch allows the user to insert the PCD in only one orientation.  This protects the PCD and the interface port it is being plugged into.  The PCD is housed in a thin aluminum shell.  As the PCD does computing work, its circuits emit heat which needs to be removed from the system.  Because of the very small (< 90 nm) circuit manufacturing process, the PCD uses very little power, which translates to it emitting less heat than today’s Pentium 4 or Athlon64 processors.  Aluminum is an excellent choice for its metal housing because it is thermally conductive (removes heat), it is lightweight, and it is inexpensive.

    Dimensional view of PCD (Drawing by Jason Ellis)

    There are no switches or indicators on the PCD.  It has only one interface port as pictured in the top-left of the drawing above.  This interface makes the PCD unique.  This standardized interface allows the PCD to be used on any computing system that is designed for the PCD.  Computer hardware, wearable computer networks, and home entertainment systems are “dummy terminals” which rely on the PCD to be the “brains.”

    The PCD is a full-featured computer.  It processes data, runs programs, and stores data on built-in solid-state memory.  Engineers were able to build a complete “computer on a chip” using new silicon circuitry layering techniques.  The result of this is the Layered Computing System as drawn in the internal schematic of the PCD (below).  Reducing the number of chips needed for a computing application has been a long-standing goal of electrical and computer engineering.  Steve Wozniak at Apple Computer was able to design elegant layouts for the original Apple I, and later, the Apple II.  He designed custom chips that brought the functions of several chips into a single chip.  AMD is continuing the trend today after integrating the memory controller onto the new Athlon64 processor.  NVIDIA introduced the nForce3 250Gb chipset, which integrated the system controller chip, sound, LAN (networking), and firewall all onto one chip.

    Internal layout of the PCD (Drawing by Jason Ellis)

    The solid-state memory is similar to today’s flash memory (e.g., USB flash drives or compact flash digital camera memory).  The difference lies in the density of the memory on the PCD.  Layering techniques are used in building the solid-state memory so that it is very dense (more data storage per unit area than today’s flash memory).  Typical PCD solid-state memory storage is 120 GB (gigabytes).  The PCD’s large memory area has no moving parts because it is made out of solid-state memory.  Traditionally, computers need a hard drive to store large amounts of information for random access.  Hard drives are magnetic storage devices that depend on round platters rotating at high speed while a small arm moves across the platters reading and writing information.  Flash memory does not need to spin or have a moving arm.  Data is accessed, written, and erased electronically.

    The PCD has a built-in battery for mobile use.  When the PCD is plugged into a wall-powered device such as a computer terminal or entertainment system, it runs off power supplied by the device it is plugged into and its battery will recharge.

    Social Significance

    The introduction of the PCD revolutionizes personal computing.  The PCD empowers users to choose the way in which they interface with computers, networks, and data.  Computer displays, input/output, and networks have become abstracted from the PCD.  A user chooses the operating system (the latest Linux distribution, Windows, or Mac OS X) and the programs (e.g., Office, Appleworks, iTunes) for his or her own PCD.  That person uses only their own PCD, so it is customized as they see fit, and they develop an awareness of its quirks and abilities in the same way that a person learns so much about his or her own car.

    The “faces” of computers (i.e., monitors, keyboards, mice, trackballs, and printers) are abstracted away from the “heart” of the computer.  The PCD is the heart because data (input/output) moves through it much like the heart muscle moves blood through itself.  A PCD also acts as a brain because it stores information and it can computationally work on the stored data.  The traditional implements of computer use are transformed into dummy terminals (i.e., they possess no computational or data storage ability).  Each of these devices has an interface port into which one plugs a personalized PCD.  The PCD then becomes the heart and brain of that device, and it allows the user to interface with networks, view graphics on monitors, or print out papers.

    Computer Terminal and Entertainment Systems with PCD Interfaces (Drawing by Jason Ellis)

    Both the PCD and the dummy terminals are a standardized computing platform.  Consumer demand, market forces, and entrepreneurial insight led to the evolution that culminated with the PCD as the end product.  Consumers were overburdened with desktop computers, laptop computers, and computer labs.  Every computer one might encounter could have a very different operating system or set of software tools.  The data storage on one computer would differ from the next.  A new standard was desired that would allow a person to choose his or her own computing path and have it accessible at any place where a computer might be needed.

    Computer manufacturer businesses saw ever declining profits as computers were becoming more and more mass-produced.  Additionally, no one company built all of the parts that went into a computer so profit was lost elsewhere as parts were purchased to build a complete computer for sale.

    New integrated circuit manufacturing techniques allowed for greater densities of transistors and memory storage.  These manufacturing techniques also allowed for lower power consumption and thus reduced heat from operation (which was a long-standing problem with computers).

    Consumer desire for something new and innovative, coupled with a new way of building computer components, led to the founding of a new computer design consortium.  Hardware and software manufacturers came together to design a computing platform that would fulfill the needs of consumers as well as improve failing profits.  The PCD design consortium included computer and software businesses, professional organizations, and consumer/enthusiast groups.

    The PCD almost didn’t see the light of day because of influence from large lobbying groups in Washington.  This involved copyright groups such as the Recording Industry Association of America (RIAA) and the Motion Picture Association of America (MPAA).  These groups decried the potential copyright violations possible with the PCD.  Epithets, curses, and bitching issued from the RIAA and MPAA lobbyists’ mouths.  Consumer outrage over these large business groups attempting to throw their weight around caused a surge of grassroots political involvement that unseated some Congressional members and scared the rest into line.  The public wanted to see what would come out of the PCD Design Consortium before judgment was passed on its useful and legal purposes.

    With the legal hurdles temporarily under control, the PCD was released to the public.  New and inventive uses were immediately made of the PCD.  One of the first innovations involved the Wearable Computer Network.  Wearable computing was a long-researched phenomenon at the Wearable Computing Lab at MIT and Georgia Tech’s Contextual Computing Group.  The two factors holding back wide adoption of wearable computing were the cost of the mobile computing unit and the mobile computing unit’s singular purpose.  These two factors were eliminated by the PCD because it was cheap and it could be removed from the wearable computing network and used in other computing situations (e.g., at a desktop terminal or in an entertainment system).

    Wearable Computing Network with Integrated PCD Interface Pocket (Drawing by Jason Ellis)

    Entertainment systems and desktop terminals became popular receptacles for the PCD.  Music and movies purchased over the Internet could be transferred to the PCD and then watched on a home entertainment system that had a PCD interface port.  Desktop terminals and laptop terminals also began to come with PCD interface ports so that a computer user could use a PCD at home or on the go and still be able to use it in other situations, such as at a work terminal.  Being able to carry a PCD between work and home allowed for easier telecommuting because all of a person’s files were immediately available.  There was no more tracking down which computer had downloaded an email, because a person’s email traveled with that person on his or her PCD.  Easier teleworking helped the environment in metropolitan areas because more people could do their work from home without needing to drive their fossil fuel-consuming cars down the highway.

    Instant computing access meant that PCD users were able to expand the possibilities of the human-computer dynamic.  There was more Internet use, and that use was more often on the go.  As people began donning wearable computing networks for their PCDs, they would chat with friends while riding a commuter train or spend more time getting informed about what was going on in the world with NPR’s online broadcasts or the BBC News website.  Social networks like Orkut and Friendster received even more users as friends began inviting friends who may have just gotten online (with a mobile setup) with their new PCDs.

    As more computer, clothing, and HDTV terminals began to support the PCD, more jobs were created, more units were sold, more raw materials were consumed, more shipping was taking place, more engineering and design was going on, and new business models were being created.  The web of connections built upon itself so that more connections were made between industries and businesses.  The popularity of the PCD boosted tangential industries involved in building components that went into the PCDs as well as entertainment services.  Aluminum and silicon processing, chip manufacturing, battery production and innovation (for longer battery life), new networking technologies to take advantage of the greater number of computing users who purchase PCDs, and PCD interface devices (such as HDTVs and wearable computing networks) all ramped up production as demand for the PCD rose.  New services popped up such as computer terminal rental and new entertainment services that would allow customers to purchase copy-protected versions of music and movies that could easily be transported for enjoyment wherever the user took his or her PCD.  Some entertainment companies held out too long while others reaped rewards for modifying their business models to take advantage of this new (and popular) technology.

    Choice is the driving factor behind the PCD’s success.  Wrapped in the PCD’s small form is the choice of human-computer interaction, choice of where to use a PCD, and choice of data (visual and auditory) to be accessed with a PCD.  These choices are made available by the choices made by many people such as consumers, industrialists, and entertainment antagonists.  Those who embraced the PCD and found ways of interfacing with it (literally and figuratively) succeeded while those that did not were left by the wayside.

    Works Cited

    Contextual Computing Group at Georgia Tech.  September 29, 2004.  November 14, 2004 <http://www.gvu.gatech.edu/ccg/>.

    Hepburn, Carl.  Britney Spears’ Guide to Semiconductor Physics.  April 7, 2004.  November 14, 2004 <http://britneyspears.ac/lasers.htm>.

    Owad, Tom.  “Apple I History.”  Applefritter.  December 17, 2003.  November 14, 2004 <http://www.applefritter.com/book/view/7>.

    “Single-Chip Architecture.”  NVIDIA.  2004.  November 14, 2004 <http://www.nvidia.com/object/feature_single-chip.html>.

    Wearable Computing at MIT.  October 2003.  November 14, 2004 <http://www.media.mit.edu/wearables/>.

  • Recovered Writing: Undergraduate Technologies of Representation Essay on Present Technology, Airport Express, Oct 28, 2004

    This is the twelfth post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.

    In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.

    This is another example of a WOVEN multimodal essay assignment. In it, I used WVE/written, visual, electronic modes to discuss a specific technology. These essays (focused on past, present, and future technologies) gave me a chance to use technology to explore the meaning behind and impact of technologies. The next essay will focus on a future technology of my own design.

    In this essay assignment, we were tasked with exploring an example of a present technology. I chose to write about Apple’s Airport Express, which my roommate Perry Merier had recently purchased. At the time, the idea of an extremely small computing/routing/audio device was new and innovative. Also, it was incredibly useful.

    Jason W. Ellis

    Professor Kenneth J. Knoespel

    LCC3314 – Technologies of Representation

    October 28, 2004

    Artifact of the Present – Apple Airport Express

    Apple Airport Express (Image from Apple Computer)

    The Artifact

    The Apple Airport Express is a multifunction wireless Internet router (i.e., base station) that first hit shelves in June 2004.  It can serve as a wireless Internet base station, extend the range of an existing wireless network, receive streaming music and transfer it to a home stereo, and share a USB printer on a wireless network.  It can do all of these things, and yet its small rectangular shape can be inscribed within the circumference of an audio CD.

    Description

    The Airport Express is only 3.7 inches tall, 2.95 inches wide, and 1.12 inches deep.  It is about the size of a Powerbook G4’s power brick (AC to DC converter).  If you do not need the included power cord extender, then the Airport Express is completely self-contained.  Unlike most other wireless routers, the Airport Express has its power converter built-in.  The electronics that allow it to juggle all of its functions lie within the glossy white plastic housing.

    On the back edge of the Airport Express there is a fold-out AC power connector.  The power prongs fold back into the unit so that it is easily carried in a bag without snagging on anything.  The bottom edge has three connectors.  The first is the Ethernet RJ-45 connector.  This can be connected to a DSL or cable modem so that the Airport Express can wirelessly transmit Internet access to wireless-capable computers within range.  Next is the USB connector.  This can be hooked to a USB printer so that the printer can be shared with anyone on the wireless network.  The last connector is an audio mini-jack that supports both analog and optical digital audio output.  This can be connected to a home stereo so that music can be streamed from a computer running iTunes to the Airport Express.  In the event of a lockup, there is a small reset button on the bottom of the device.  The front edge of the device has an LED that lights up amber or green.  The color of the LED and its state (i.e., on, off, blinking) indicate the status of the Airport Express.

    Airport Express Connectors (left) and Airport Express Plugged-In (right) (Images from Apple Computer)

    The components inside the Airport Express are tightly packed.  A good deal of engineering had to go into making function follow form in this artifact.  Home wireless routers are usually two or three times the size of the Airport Express and they have an external power brick (that may be the same size as the Airport Express).  This device has to contain a power converter, wireless networking components, wired networking components, network routing components, USB printing components, and audio components.  Some of these parts are combined on a single piece of silicon to save space on the circuit board.

    Airport Express split in half. Note the circuit boards on the left and power converter on the right. (Image from ipodding.com)

    Social Significance

    Apple Computer introduced its Airport technology in July 1999.  The choice to use the name “Airport” was a deliberate one.  It is easy to remember, and it evokes certain images of what the technology is able to do.  The bits of data seem to fly through the air on invisible radio waves, and Airport technology is the place where these bits take off and land–from the base station to the computer and vice versa.  Speed, travel, and mobility are some of the images that Apple intended the Airport name to conjure for potential buyers.

    The Airport Express uses the two most widely adopted wireless networking standards:  802.11b and 802.11g.  A working group within the Institute of Electrical and Electronics Engineers (IEEE) established those standards.  The IEEE 802 standards committee develops the standards for local area networks as well as for metropolitan area networks, and Working Group 11 focuses on wireless networking.  Publicly available standards such as these are part of the success of computer and networking hardware because they allow components manufactured by different companies to interoperate.  Because the Airport Express uses several open standards, it works alongside other wireless hardware and with Macs as well as PCs.

    The Federal Communications Commission (FCC) and the National Telecommunications and Information Administration (NTIA) regulate the radio frequency spectrum.  The NTIA is part of the Executive Branch of the US Government that “manages the Federal government’s use of the spectrum” while the FCC is an “independent agency” that “regulates the private use of the spectrum” (NTIA).  The 802.11b and 802.11g wireless networking standards are approved by the FCC to use the 2.4 GHz radio band for transmitting and receiving bits of data carried on radio waves.

    The US Radio Spectrum Frequency Allocations. The red ellipse approximately marks where in the spectrum 802.11b and 802.11g operate. (Image from NTIA)
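
    As a back-of-the-envelope illustration of where those channels sit in the 2.4 GHz band, here is a short Python sketch using the standard channel-spacing rule (channel n is centered at 2407 + 5n MHz for channels 1 through 13, with the Japan-only channel 14 offset at 2484 MHz).  The function name is my own, used only for illustration.

        def channel_center_mhz(channel: int) -> int:
            """Center frequency of a 2.4 GHz 802.11b/g channel, in MHz."""
            if channel == 14:               # Japan-only channel, offset from the rest
                return 2484
            if 1 <= channel <= 13:
                return 2407 + 5 * channel
            raise ValueError("802.11b/g channels run from 1 to 14")

        for ch in (1, 6, 11):               # the familiar non-overlapping US channels
            print(f"channel {ch}: {channel_center_mhz(ch)} MHz")
        # channel 1: 2412 MHz
        # channel 6: 2437 MHz
        # channel 11: 2462 MHz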

    Anyone with a wireless-capable computer, a copy of iTunes, a stereo, and an Airport Express is in effect a one-person radio station.  Music can be streamed from the computer to the Airport Express, which passes it along to the home stereo via an audio cable.  Digital music is freed from the computer and transferred back to the home stereo.  This capability also points to one of the Airport Express’ weaknesses:  music streaming from a computer can only be played on one Airport Express at a time.  There is no technical barrier keeping more than one Airport Express from receiving the stream, so Apple appears to have deliberately restricted this capability.  If simultaneous streaming were enabled, customers would likely buy more than one Airport Express so that they could stream music to multiple rooms.

    The music travels wirelessly to the Airport Express and then to the stereo via wires. (Image from Apple Computer)

    The Airport Express’ limitations might be due to pressure from the music industry.  Apple gives its music-playing software, iTunes, away for free.  It can play CDs and MP3s, and it can access Apple’s Online Music Store.  This software can copy (i.e., rip) CDs that may or may not be owned by the iTunes user.  Additionally, iTunes will play legitimate MP3s as well as those that are obtained in violation of current copyright law.  The Recording Industry Association of America (RIAA) and some recording artists find this unacceptable.  Apple has tried to work on the side of the consumer, but it has to appease the music industry as well.  To do this, Apple has integrated special encryption into music downloaded from the Apple Online Music Store so that only the authorized buyer can play those files.  Additionally, iTunes establishes a secure connection to the Airport Express by encrypting the music stream with Advanced Encryption Standard (AES) encryption, which is in turn protected by RSA encryption.  This prevents others from recording an iTunes music stream.
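
    To make the AES-plus-RSA arrangement described above concrete, the following is a minimal Python sketch (using the third-party cryptography library) of generic hybrid encryption:  a random AES key encrypts the data, and RSA protects that key in transit.  This illustrates the general technique only, not Apple’s actual AirTunes implementation, and every name in it is hypothetical.

        # Hybrid encryption sketch: AES for the bulk data, RSA to wrap the AES key.
        # Illustrative only; not Apple's AirTunes/AirPort Express protocol.
        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        # The receiver (standing in for the base station) holds an RSA key pair.
        receiver_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)

        # The sender (standing in for iTunes) picks a random AES-128 key and nonce,
        # encrypts a chunk of audio, and wraps the key with the receiver's RSA key.
        aes_key, nonce = os.urandom(16), os.urandom(16)
        audio_chunk = b"\x00" * 4096                    # stand-in for PCM audio
        encryptor = Cipher(algorithms.AES(aes_key), modes.CTR(nonce)).encryptor()
        ciphertext = encryptor.update(audio_chunk) + encryptor.finalize()
        wrapped_key = receiver_key.public_key().encrypt(aes_key, oaep)

        # The receiver unwraps the AES key with its private RSA key and decrypts.
        session_key = receiver_key.decrypt(wrapped_key, oaep)
        decryptor = Cipher(algorithms.AES(session_key), modes.CTR(nonce)).decryptor()
        assert decryptor.update(ciphertext) + decryptor.finalize() == audio_chunk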

    Encryption is also employed to protect the wireless users on the Airport Express’ network.  Part of this protection comes from encrypting the wireless network traffic, and the other part comes from the built-in firewall.  The older encryption scheme is called Wired Equivalent Privacy (WEP), and the newer one is called Wi-Fi Protected Access (WPA), which was designed to supersede WEP.  The built-in firewall uses network address translation (NAT) to create a network that uses private IP addresses instead of public (and thus directly connected to the Internet) IP addresses.  NAT exchanges data between the public world and the private network.  Generally, only the NAT device can connect directly to the computers on its private network; a computer in the outside world cannot.
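
    The NAT idea is easy to see in miniature.  The toy Python sketch below keeps a translation table that maps a public-side port back to a private address and port; replies arriving on a port with no mapping are simply dropped.  The addresses and function names are hypothetical, and real NAT happens in the router’s firmware rather than in user code.

        import itertools

        PUBLIC_IP = "203.0.113.7"              # example public address (documentation range)
        _ports = itertools.count(40000)        # pool of public-side ports
        nat_table = {}                         # public port -> (private ip, private port)

        def outbound(private_ip: str, private_port: int) -> tuple[str, int]:
            """Rewrite an outgoing packet's source to the router's public address."""
            public_port = next(_ports)
            nat_table[public_port] = (private_ip, private_port)
            return PUBLIC_IP, public_port

        def inbound(public_port: int) -> tuple[str, int] | None:
            """Return the private destination for a reply, or None to drop it."""
            return nat_table.get(public_port)

        # A laptop at 10.0.1.2:51515 opens a connection; the reply is translated back,
        # while unsolicited traffic to an unmapped port goes nowhere.
        src = outbound("10.0.1.2", 51515)
        print(src)                             # ('203.0.113.7', 40000)
        print(inbound(src[1]))                 # ('10.0.1.2', 51515)
        print(inbound(40001))                  # None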

    Security and privacy are growing concerns for people in an increasingly wired world.  Identity theft is becoming a boon for some (e.g., the thieves, private investigators, lawyers, politicians) and a bust for others (i.e., the person whose identity is stolen).  One way that a person’s private identifying information is stolen is by an individual “sniffing” a wireless network’s data traffic for that precious information.  New industries and groups have grown out of this problem of identity theft.  Wireless devices like the Airport Express need to have safeguards built in so that a user’s private information is better protected.

    The physical construction of the Airport Express involves electrical engineering, computer engineering, and industrial design.  Electrical engineering and computer engineering overlap in a project such as this.  Custom chips have to be designed and built that handle data traffic, digital-to-analog conversion of sound, configuration software, control of a radio transmitter/receiver, and print control software.  Simplicity and elegance of design are demanded in order to fit such a feature-rich artifact into a very small package.  Apple has a history of taking an artifact that is assumed to look or work in a particular way and transforming its appearance into something new and fresh (e.g., the original Macintosh, iMac, and iPod).  The Airport Express works similarly to any other wireless router, but it pushes the elements of design (both as a physical artifact and in its internal circuits and chips) so that it is identified by the user as something more than its function.

    Sleek and new shapes also reinforce the perception of speed.  Airplanes are fast, and this artifact is the Airport (sending and receiving these fast airplanes of data) Express (quick, fast, simple).  Computer technology has been a long progression toward greater speed:  how fast does this computer perform the tasks I will be using it for?  Can it play Doom 3?  The same is true for networking technologies.  Wired networking is hands down the fastest networking technology, so wireless has to compete with wires on speed, but it can distinguish itself by its convenience.

    (Photo by John M. Dibbs.)

    These new designs effect a change in the way people think about their computer technology.  Soft colors, translucent plastics, curves, and gentle transitions give technology a friendlier “face.”  It isn’t imposing, and the technology can now fit into a color scheme in your home.  Computer technology shifts from utility to lifestyle.  Apple brings together these networks of technology, government oversight, music industry muscle, and industrial design principles so as to provide customers with the technology they desire, but in a package that feels less technical and more like a streamlined appliance.

    Works Cited

    “Airport Express Gallery.”  Ipodding.com.  2004.  October 26, 2004 <http://ipodding.com/modules.php?set_albumName=album10&op=modload&name=gallery&file=index&include=view_album.php>.

    “Apple – Airport Express.”  Apple Computer, Inc.  2004.  October 26, 2004 <http://www.apple.com/airportexpress/>.

    “Apple – Support – Airport Express.”  Apple Computer, Inc.  2004.  October 26, 2004 <http://www.apple.com/support/airportexpress/>.

    Dibbs, John M.  “Concorde Takeoff.”  Planepix.com.  October 26, 2004 <http://www.planepix.com/pp/servlet/template/Detail.vm/id/2940>.

    “Myths vs. Reality.”  National Telecommunications and Information Administration.  October 14, 2004.  October 26, 2004 <http://www.ntia.doc.gov/ntiahome/myths.html>.