Since I received my Google Glass last week, I have been learning how to wear and use it. Ultimately, I want to incorporate Glass into my Retrocomputing Lab research workflow. I am interested in the experience of using computer hardware and software (something that has interested me for a long time and that I wrote about as an undergraduate), so Glass will provide a way of capturing some of my phenomenal experience–perspective, vision, and sound. I can provide oral commentary on my haptic and olfactory experiences (yes, computers have unique smells–something that helps store and recall memories and emotions) while also recording thoughts, memories, and asides that enrich my shared video experience. As one component of the digital humanities, I want to create an archive of my raw research on working with computers and their software that others can use, draw inspiration from, or comment on through their own research, writing, and teaching.
For the work that I do in my personal Retrocomputing Lab, I will use Glass as one more tool among a variety of other technologies that enable my research. Glass will add another data layer–itself richly textured and layered with audio/video/Internet/software capabilities–to the research that I do. Due to the ease of sharing images and video in real time, I can immediately make my in-process research available on YouTube, Twitter, and here on dynamicsubspace.net. Furthermore, my research will be usable by others–hobbyists, students, and other researchers in many interdisciplinary fields. Glass will join my non-real-time distribution of data on paper, computer-written notes (though I could make these freely viewable in real time on, say, Google Drive), and published research.
Finally, I am interested in the mixing of old and new technologies. Glass meets its ancestors in the IBM PC and Macintosh. Glassware meets DOS, Windows 3.1, and System 7. I want to explore how the intermingling of these technologies leads to new insights, connections, and elaborations. While I am only speculating, I strongly believe that Glass and similar wearable computing technologies will elevate the outcomes and knowledge produced in humanities research–conceptualized as interdisciplinary like mine or not.
The videos included in this post were tests of the manually extended video recording feature. They don’t involve the Retrocomputing Lab, because recording my work there with Glass will require more and different kinds of planning. Instead, I used what I had at hand to test Glass’s video capabilities in the videos below.
Glass Video from Apr 22, 2014, Lego Build of The Batman Tumbler 30300 Polybag
Glass Video from Apr 24, 2014, Target Exclusive Lego 30215 Legolas Greenleaf Polybag
Over the weekend, I launched a new page under the “Research” menu on DynamicSubspace.net for my Retrocomputing Lab.
I use the Retrocomputing Lab’s hardware and software resources in my continuing research on human-computer interaction, human-computer experiences, and human-computer co-influence. So far, its primary focus is the shift from the pre-Internet early 1990s to the post-Internet late 1990s and early 2000s.
During that time, technological and cultural production seems to accelerate. Imagine all of the stories yet to be recovered from that time. How do we untangle the long shadow of that time from the innovations and disruptions of the present passing into the future?
The computer hardware includes Macs and PCs. There are laptops and desktops. There are different add-on cards and peripherals to enhance and change experiences. There are 3.5″ floppy disks, CD-ROMs, and DVDs. There are many different kinds of software ranging from games to interactive encyclopedias to operating systems to word processors. There are different motherboards that can be swapped out in various computer cases (AT and ATX). The machines can be temperamental, but each configuration reveals its own indelible soul (for lack of a better word, but it is a word that I quite like in this context).
My research focuses on reading on screens, depictions of electronically facilitated reading, and the cognitive effects of reading on screens (of course, there are a multitude of screens and interfaces–a worthy complication) as opposed to other forms of non-digital media (and their multitude).
The Retrocomputing Lab continues to grow and new research possibilities abound. If you are interested in collaborating on a project with Retrocomputing Lab resources, drop me a line at jason dot ellis at lmc dot gatech dot edu.
This is the eleventh post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.
In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.
In this essay assignment, we were tasked with exploring an example of a past technology. I chose to write about the Altair 8800–the first personal computer. Coincidentally, I am re-watching Robert X. Cringely’s Triumph of the Nerds, which discusses and demonstrates the Altair 8800 in the first episode.
I enjoyed writing this essay, because it was one of the first that permitted me to combine words and images (thinking about WOVEN). I had done this before on webpages, but not in an essay that I would hand in to my professor.
Jason W. Ellis
Professor Kenneth J. Knoespel
LCC 3314 – Technologies of Representation
September 28, 2004
Artifact from the Past – The Altair 8800
The Altair 8800 is credited as the first personal computer. H. Edward Roberts invented the Altair 8800 after being approached by the magazine Popular Electronics to build a kit computer that could be sold through the magazine. It utilized a microprocessor as its central processing unit and a bus on which “signals and power traveled from one part of the machine to another” (Ceruzzi 228). When it was introduced in 1975 by Roberts’ company, MITS, you could purchase an Altair as a kit for $397 or assembled for $498.
The exterior of the Altair 8800 is a steel enclosure. The front faceplate is black, with two rows of lights and two rows of flip switches, each of which is labeled. The back has an opening for cooling and the power plug connector.
The first Altair 8800 included a very small amount of computer memory (256 bytes–not kilobytes). Also, when the computer was turned off, anything in the computer memory was lost. This means that each time you used the Altair 8800, you had to input the program you were going to use and any data that the program was going to work with. Input was handled by flipping switches on the faceplate. The lights indicated the status of the computer during input, and the lights would later reveal the output of the laboriously entered program. If the power went out while the Altair 8800 was being programmed, the program was lost and had to be reentered when power was restored.
In a sense, the Altair 8800 was as self-contained as a modern-day iMac. The difference is that teletypes and display technology were prohibitively expensive for the computer hobbyist. When the hobbyist had completed the construction of the Altair, there was only the Altair 8800 in its steel enclosure and a power cord that plugged into a wall outlet. Input and output were handled through the lights and switches on the faceplate.
The inside of the Altair contained the electronics of the faceplate, the open bus, a CPU card, a memory card, and the power supply. The open bus and the CPU chosen for the Altair 8800 are what ignited the possibility for the upcoming personal computer boom.
The open bus (also called the S-100) was unique in that it was a board, attached to the bottom of the inside of the enclosure, that had four card connectors on it. The open bus allowed for expansion possibilities, and it was an open architecture, which meant that others could build cards that would work in anyone’s Altair 8800. Additionally, others could copy the open bus architecture so that they could build their own branded computer systems that would use parts interchangeable with the Altair 8800 and other “clones.”
The Altair 8800 used Intel’s latest microprocessor, the 8080. The 8080 distinguished itself from the older Intel microprocessor, the 8008, because “it had more instructions and was faster and more capable than the 8008” (Ceruzzi 228). The 8080 required fewer supporting chips than the 8008 to make a functional system, it could address more memory than the 8008, and it used the “main memory for the stack, which permitted essentially unlimited levels of subroutines instead of the 8008’s seven levels” (Ceruzzi 228). The 8080 was the first microprocessor powerful enough to run this early iteration of the personal computer.
The Altair 8800 was a hobbyist computer. The kit that one could buy for about $400 was a box full of individual components that had to be skillfully soldered and connected together. MITS did offer a pre-built Altair 8800, but even a completed Altair required a good deal of expertise to make it do anything. This first model handled all input and output through the lights and switches on the front panel. The “front panel of switches…controlled the contents of internal registers, and small lights [indicated] the presence of a binary one or zero” (Ceruzzi 228). This was light-years away from MS-DOS, and even further from the GUI of the Macintosh, but it was able to perform calculations on data using programmed instructions. The representation of the program was stored (temporarily, while the power was on) in an integrated circuit. The output was displayed in a series of lights in the same location where the program and data were entered earlier. The output was given in the same format in which it was received: binary code (i.e., ones and zeros). Input required encoding into binary, and output required decoding from binary into results that the computer user could more concretely understand. The computer user had to have command of the encoding and decoding process in order to use the Altair.
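As a modern aside, the switch-and-light workflow described above can be sketched in a few lines of code. This is a toy illustration of the encoding and decoding the user performed by hand, not an emulation of the actual hardware; the function names and the sample 8080 bytes are mine, chosen only for demonstration:

```python
# Toy sketch of Altair-style front-panel data entry (illustrative, not
# a hardware emulation). Each byte is "toggled in" as eight switch
# positions and "read back" as a row of eight lights.

def switches(byte):
    """Encode a byte as 8 switch positions, most significant bit first."""
    return [(byte >> bit) & 1 for bit in range(7, -1, -1)]

def lights(bits):
    """Decode a row of 8 lights (1 = lit) back into the byte it shows."""
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value

# The original Altair shipped with 256 bytes of volatile RAM.
memory = [0] * 256

def deposit(address, byte):
    memory[address] = byte

# Hand-enter a tiny 8080 fragment: MVI A, 0x2A (bytes 0x3E 0x2A), then HLT (0x76).
program = [0x3E, 0x2A, 0x76]
for address, byte in enumerate(program):
    deposit(address, lights(switches(byte)))  # toggle in; the lights echo it back

assert memory[:3] == [0x3E, 0x2A, 0x76]
```

The round trip through `switches` and `lights` mirrors what the essay describes: every value passed through a human doing binary encoding on the way in and binary decoding on the way out.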
The open bus allowed others to follow in MITS’s footsteps by building computers that were similar in design to the Altair 8800. Also, hobbyists and other companies could build add-in cards that would interface with any computer based around the S-100 open bus that the Altair employed. This meant that an aftermarket industry was created for the Altair and its clones. More electrical components, memory chips, circuit boards, solder, and etching materials would be sold and used in the creation of these add-on products. More research and development took place both on the hobbyist’s workbench and in corporate research labs. Some creations were sold as final products, whereas others would have been talked about at user group meetings or published as “how-to” guides in magazines like Popular Electronics. A dynamic cycle of innovation was introduced to the personal computer that had not been present before. This is what led to the personal computer becoming something other than an elitist computing device. The critical mass was building for what led to the first Apple computer and the IBM PC.
Within this creative cycle was Roberts’ choice to use the Intel 8080 microprocessor. Intel had been selling this microprocessor for $360.00 when ordered in small quantities, but MITS was able to buy them for $75.00 each. If MITS had not been able to secure this low price, the Altair would have failed because of its much higher cost. Because MITS could buy these processors at the lower price, it could sell the Altair to customers at a price they were willing and able to pay. When the Altair took off, each kit sold meant another Intel 8080 CPU sold, so Intel started selling many more of these new microprocessors that, up until that time, it really didn’t know how to market. Intel began to see that microprocessors weren’t just for expensive business computers, but also for smaller, personal computers. When Intel saw that there was demand, it began to further develop and diversify its microprocessor line over time. Later, other companies began to adopt the S-100 bus, which meant that they, too, were buying Intel’s microprocessor for their computers. Every computer had to have a CPU, and at the time these particular computers had to have an Intel microprocessor. Then other companies, such as AMD, reverse-engineered the Intel 8080 and began selling their own models that were functionally identical to Intel’s offering. Money was being made, and more innovation and work was taking place as a result.
Along with all of this building, research, and development, new construction methods had to be developed and new distribution networks had to be employed. The Altair was designed to be built at home by the buyer, but MITS also offered a pre-built turn-key system. MITS did not anticipate the demand, and customers quickly had to endure waits of up to a year for their Altair computers. MITS (and others) learned from these delays. Also, new buying and distribution channels had to be established. MITS was buying microprocessors from Intel, and the many other components had to be purchased from other companies and distributors. Parts had to be ordered and processed in order to send out kits and turn-key systems to customers. Additionally, Intel had to be prepared to have microprocessors ready to sell to MITS and other companies. When demand rose for the Altair, it impacted each company that supplied the individual pieces that comprised the finished product. Ordering systems, packing, and shipping had to be arranged to get the Altair from MITS’s headquarters to the customer’s home. This involved shipping materials, personnel, and the logistics of order processing.
MITS tried to market the Altair 8800 as a business computing solution after it saw how popular the machine was. This was made easier once teletypes, CRT displays, disk drives, punch card rolls, and other computing technology were developed for the Altair and S-100 bus systems. Businesses liked easier interaction with the computer and dependable memory storage. Even so, these business systems were not very successful because there was no “killer app” for the platform at that time. MITS changed hands several times until its last remnant disappeared.
The Altair 8800 began the desktop computing revolution. Initially, it was very complicated and elitist. The very first kits had to be built and used by people skilled in electronics and computer science. The hardware had to be constructed from individual elements, and then software had to be devised that would run on this built-from-scratch computer. The Altair became more user-friendly over time because the aftermarket, MITS, and the clone manufacturers wanted to attract more customers. The potential customers formed a triangle, with the most knowledgeable at the peak and a gradation of less knowledgeable customers toward the bottom. The early adopters of the Altair were at the top of this triangle, but their numbers were few. This meant that new computers with new input and output and new features had to be devised to entice the greater number of potential computer users to buy these products. This cycle continues to this day in the personal computer market: Apple, Microsoft, Sony, HP, and many other companies continually work at making something feature-rich yet easier and easier to use. Note the utopian artwork below that was used for an early Altair advertisement. It recalls Soviet artwork, utopian imagery, and an Altair on every desk. The Altair was going to level the computing playing field so that all could take part in the use of computers.
Along with this cycle, there are those who are intrigued by the new technology and learn more about it on their own or through school. This bolstered the book industry, which sold computer programming and electrical engineering books (or today, the plethora of “Dummies” guides). Schools began to introduce computers into the classroom. At first, it was strictly computer science and programming classes. Later, computers were added for other things such as graphic design, CAD, and word processing. Universities saw more computer science, electrical engineering, and computer engineering majors. These universities added more professors, classroom space, and equipment to compensate for this demand. State and federal spending was sought to cover some of these expenses. Private enterprise was also asked to help through different kinds of agreements that would assist the business while helping the school’s students in need of projects and equipment. This work done through school research could in turn help the businesses with their products to be sold on the open market.
The Altair 8800 introduced computer enthusiasts to the possibility of working with digital information on their desktops. Time sharing on large mainframes and minicomputers was still the primary interaction people had with computers in business and in schools. With the flip of switches and the monitoring of lights, one could work problems and evaluate data at home or in the office. There were early games, calculating problems, logarithms, and other numerical manipulation. The early adopters questioned what other things could be manipulated with a personal computer. With the introduction of new input and output systems, the list expanded a great deal because human-computer interaction became easier with the connection of a CRT monitor and a keyboard or punch card reader. Also, the binary code and bits of information that were only ones and zeros to the computer could be made to represent abstractions rather than mere numbers.
The Altair 8800 was the pebble that began rolling down the snow-covered mountain, gathering mass figuratively and literally in its growing user base. The concept of the personal computer gained mass and momentum that could not be stopped. The development of the first microprocessor-based personal computer created new networks and new demands that were met by computer enthusiasts, students, researchers, and business people.
“Altair 8800.” Old-Computers.com. Accessed October 6, 2004.
After my students took their second exam yesterday, I lectured on cyberpunk to accompany their readings: William Gibson’s “Burning Chrome” and Bruce Sterling’s “Preface” to Mirrorshades. I talked about its historical and cultural moment, proto-cyberpunk examples in the SF genre, and the movement itself. In particular, I contextualized the cyberpunk movement in terms of postmodernism and post-industrial society. We ran out of time while I was talking about Gibson’s contributions to the development of the cyberpunk movement. Besides my enjoyment of talking about cyberpunk, I was happy that my former professor Dr. Carol Senf was in attendance to observe my teaching.
Today, we watched “Kill Switch,” the William Gibson and Tom Maddox penned episode of The X-Files. Released in 1998, approximately 16 years after “Burning Chrome,” it is one of the best examples of cyberpunk in a visual medium–especially in that it takes place in the here-and-now instead of the near future.
Then, I lectured on The X-Files and cyberpunk film/television before returning to my notes on Gibson, Bruce Sterling, and Pat Cadigan.
After the lecture, I launched into a retrocomputing demonstration with emulation and my personal collection of resurrected computer gear. I showed my students how to use the http://www.virtualapple.org website to see what cutting-edge computing looked like in the early 1980s. Most of my students were born in the early to mid-1990s, so I wanted them to experience firsthand how much extrapolation was being done on the part of the cyberpunks, and Gibson in particular (of course, I told them about his Hermes 2000 typewriter with its celluloid keys, and his recollection of getting inspiration for the cyberspace deck from the Apple IIc–something his memory likely colored, given that the IIc was released the same year as Neuromancer). Also, I brought in an Apple Powerbook 145 with Gibson’s Voyager Company ebook of the Sprawl trilogy pre-loaded and a Pentium-I PC with old software including Neuromancer (for DOS), Star Wars: Dark Forces (DOS), and the Star Trek Interactive Technical Manual (Windows). I took the U-shaped sheet metal case off my PC so that they could see the insides.
I had to lug everything across campus in my carry-on sized suitcase with the PC strapped to the handle with nylon straps. I felt like Case in Neuromancer returning from his shopping expedition.
Tomorrow: Taiwanese SF and review for the third exam.
This morning, Becky, Robyn, Aaron, Chris, and Colin joined me for the Play | Retrocomputing 9:30am session at THATCamp SE 2013 at Georgia Tech. Aaron recorded our lively and interesting conversation on the shared Google Doc available here (along with notes from all of the sessions).
Above, you can see pictures that I took while we were playing, working, and talking. Our conversation veered from materiality of experiencing old software on original computing hardware to archiving/preserving old computer and software artifacts.
The computers that I brought to kickstart our conversation were a Powerbook 145 and Powerbook 180c.
Other conversations from THATCamp SE 2013 are on Twitter with the #thatcampse13 hashtag.