My desktop PC, whose build and benchmarks I wrote about previously, has performed very well since I built it late last year. However, I built it on a budget, so I wasn’t able to outfit it as well as I would have liked. After deciding to use the desktop as my primary computer, I upgraded it with the components that I needed most: more storage space and more RAM.
The most pressing need was additional hard drive space. The original ADATA 128GB SSD was adequate when I was testing the system and deciding if I wanted to use it as my primary computer. When I wanted to do more than just the bare necessities and have access to my data more quickly than an external backup hard drive or flash drive could provide, I added two hard disk drives.
First, I picked up a Toshiba 5400rpm 2TB OEM drive when Microcenter had them on sale. I had had good luck with Toshiba drives from Microcenter in previous computer builds, so I was comfortable using a larger-capacity one in this computer. Due to the limited warranty on OEM drives, I put the drive through its paces to ensure that it wasn’t a lemon: I performed a low-level format on the drive, and then I began the laborious task of moving files to the drive via USB and over the network. Then, I culled the copied files to remove duplicates. Finally, I erased the free space to stress test the drive again.
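The heart of that break-in routine is simple: write known data across the drive, read it back, and confirm nothing changed. I used the format and erase utilities for the heavy lifting, but the read-back idea can be sketched in a few lines of Python, assuming the new drive is mounted and you pass a file path on it (the function name and sizes here are illustrative, not a substitute for the vendor’s diagnostics):

```python
import hashlib
import os

def write_and_verify(path, size_mb=64, chunk_mb=1):
    """Write a repeating pseudo-random pattern to `path`, then read it
    back and confirm the data on disk matches what was written."""
    chunk = chunk_mb * 1024 * 1024
    # One pseudo-random 32-byte seed, tiled to fill a whole chunk.
    block = os.urandom(32) * (chunk // 32)
    written = hashlib.sha256()
    with open(path, "wb") as f:
        for _ in range(size_mb // chunk_mb):
            f.write(block)
            written.update(block)
    # Read the file back and hash what actually landed on disk.
    read_back = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            data = f.read(chunk)
            if not data:
                break
            read_back.update(data)
    return written.hexdigest() == read_back.hexdigest()
```

A mismatch on a fresh drive would be a strong hint that it is, in fact, a lemon and should go back while the limited warranty still applies.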
Second, I waited for another sale at Microcenter and purchased a Western Digital Blue 5400rpm 4TB drive. After adding it to the computer, which required routing the power and SATA cables differently than I had done before, I stress tested the new drive with a low-level format (this took all evening to perform!) and then copied everything from the Toshiba 2TB drive to the WD 4TB drive.
Another important need was additional RAM for the software that I use: multiple productivity applications, Wolfram Mathematica, and games. The Gigabyte B250-DS3H mATX motherboard supports four sticks of DDR4 RAM. I bought the computer’s first dual-channel pair of Crucial DDR4-2400 4GB RAM sticks at an amazing discount. Unfortunately, DDR4 RAM prices rose and have stayed elevated since that time. When a more modest discount came around, I took it. Now, all four DDR4 slots are filled with two pairs of Crucial DDR4-2400 RAM for a total of 16GB.
I dabbled with VR before video card prices went through the roof. For this experiment, I upgraded the video card and PSU. I don’t have the video card any longer, but I kept the Corsair CX650M PSU so that I can switch out video cards for something more powerful in the future.
After these upgrades, my computer’s stats are:
Gigabyte B250-DS3H mATX Motherboard
Asus Radeon Rx-550 4GB GDDR5 Video Card
Crucial 16GB (4×4GB) DDR4-2400 RAM
ADATA SU800 128GB 3D-NAND 2.5 Inch SATA SSD
2TB Toshiba OEM HDD
4TB WD Blue HDD
Corsair CX650M PSU
ROSEWILL Micro ATX Mini Tower Computer Case, FBM-01
Matthew Kirschenbaum constructs a compelling and interesting argument in his book Mechanisms: New Media and the Forensic Imagination (2008). He argues that while new media and computer software might seem ephemeral and intangible, they in fact have physicality, a many-layered history, and emerging archaeological protocols (developed by Kirschenbaum and many others).
However, one section titled “Coda: CTRL-D, CTRL-Z” attracted my attention, because its use of the term “recover” in a story about the debut of the Apple Disk II seemed to imply computer disk data recovery instead of what historically happened, which was the manual rewriting of the software that had been accidentally overwritten during a botched disk copy operation.
Kirschenbaum uses the story of Steve Wozniak and Randy Wigginton’s development of software to control the reading and writing of data to Apple’s Disk II, which was based on Shugart’s 5 1/4″ floppy disk drive, before its unveiling at the 1978 CES in Las Vegas to establish an analogy: “Nowadays we toggle the CTRL-D and CTRL-Z shortcuts, deleting content and undoing the act at a whim. Gone and then back again, the keyboard-chorded Fort and Da of contemporary knowledge work” (Kirschenbaum 69). The idea is that computers facilitate a kind of gone-and-back-again play as described by Freud. Of course, the keyboard shortcuts that he refers to are not universal across platforms or software, but the concept is pervasive. Nevertheless, my focus is not on that concept per se but instead on the Apple Disk II debut anecdote, the terminology surrounding what actually happened, and how that relates to the kinds of work that we do in new media archaeology.
After introducing the story of the Apple Disk II’s debut at CES, Kirschenbaum cites a passage from Steven Weyhrich’s Apple II History website:
“When they got to Las Vegas they helped to set up the booth, and then returned to working on the disk drive. They stayed up all night, and by six in the morning they had a functioning demonstration disk. Randy suggested making a copy of the disk, so they would have a backup if something went wrong. They copied the disk, track by track. When they were done, they found that they had copied the blank disk on top of their working demo! By 7:30 am they had recovered the lost information and went on to display the new disk drive at the show.” (Weyhrich par. 13, qtd. in Kirschenbaum 69).
First, it should be noted that Weyhrich uses the term “recovered” to describe the way that the “lost information” was brought back from the brink of the overwritten disk. Then, Kirschenbaum reads Weyhrich’s account above in the following way:
“Thus the disk handling routines that took the nascent personal computer industry by storm were accidentally overwritten on the very morning of their public debut–but recovered and restored again almost as quickly by those who had intimate knowledge of the disk’s low-level formatting and geometry” (Kirschenbaum 69).
Weyhrich uses the term “recovered” to refer to the software Wozniak and Wigginton had lost during the bad copy operation. Kirschenbaum borrows Weyhrich’s “recovered” and adds “restored” to describe the final state of the software on Wozniak and Wigginton’s floppy disks for use on the CES show floor. When I first read Kirschenbaum’s book, his reading seemed unnecessarily ambiguous. On the one hand, Kirschenbaum does not directly say that the two Apple engineers used their knowledge of controlling the disk drive and reading low-level information on the floppy disks to “recover” the lost data–i.e., use the drive and disk technology to salvage, rescue, or retrieve what remains on the disk but otherwise might seem lost to someone with less advanced knowledge. On the other hand, Kirschenbaum’s reading of the incident–“recovered and restored again almost as quickly”–is implicitly aligned with his own project of the physicality of data stored on new media storage devices. One could mistakenly believe that Wozniak and Wigginton had restored the lost data from the overwritten floppy disk.
Steven Wozniak writes about this episode in his autobiography, iWoz: Computer Geek to Cult Icon (2006). Before turning to Wozniak’s later recollection of this 1978 event, I would like to look at the two sources that Weyhrich cites for the passage that Kirschenbaum quotes in his argument.
Weyhrich’s first of two footnotes on his passage points to page 168 of Gregg Williams and Rob Moore’s 1985 interview with Steve Wozniak titled, “The Apple Story, Part 2: More History And The Apple III” in the January 1985 issue of Byte magazine. In the interview, Wozniak tells them:
“We worked all night the day before we had to show it [the disk drive] at CES. At about six in the morning it was ready to demonstrate. Randy thought we ought to back it up, so we copied the disk, track by track. When we were done, he looked down at them in his hands and said, ‘Oh, no! I wrote on the wrong one!’ We managed to recover it and actually demonstrated it at CES” (Williams and Moore 168).
In this primary source, we see Wozniak using the term “recover” to indicate that they were able to get the demonstration operational in time for CES that day, but what form the “recovery” took is not explained. Was it data recovery in the technical sense or data recovery in the hard-work sense of rewriting the code?
Weyhrich’s second footnote on his passage points to Paul Freiberger and Michael Swaine’s “Fire In The Valley, Part Two (Book Excerpt)” in the January 1985 issue of A+ Magazine. While I have been unable to find a copy of this magazine, I did refer to the book that this excerpt was taken from: Freiberger and Swaine’s Fire in the Valley (1984). On page 286, they write in regard to Wozniak and Wigginton’s disk dilemma at CES:
“Wigginton and Woz arrived in Las Vegas the evening before the event. They helped set up the booth that night and went back to work on the drive and the demo program. They planned to have it done when the show opened in the morning even if they had to go without sleep. Staying up all night is no novelty in Las Vegas, and that’s what they did, taking periodic breaks from programming to inspect the craps tables. Wigginton, 17, was elated when he won $35 at craps, but a little later back in the room, his spirits were dashed when he accidentally erased a disk they had been working on. Woz patiently helped him reconstruct all the information. They tried to take a nap at 7:30 that morning, but both were too keyed up” (Freiberger and Swaine 286).
Unlike Wozniak’s “recover” in the Williams and Moore interview above, Freiberger and Swaine use the term “reconstruct” in their narrative about the pre-CES development of the Disk II demonstration software. Unlike the term recover, which means to regain what is lost, reconstruct means to build something again that has been destroyed. Freiberger and Swaine’s selection of this term seems more accurate when considering what Wozniak says about this episode in his autobiography:
“We set up in our booth and worked until about 6 a.m., finally getting everything working. At that point I did one very smart thing. I was so tired and wanted some sleep but knew it was worth backing up our one good floppy disk, with all the right data. . . . But when I finished this backup, I looked at the two unlabeled floppy disks and got a sinking feeling that I’d followed a rote pattern but accidentally copied the bad floppy to the good one, erasing all the good data. A quick test determined that this is what happened. You do things like that when you are extremely tired. So my smart idea had led to a dumb and unfortunate result. . . . We went back to the Villa Roma motel and slept. At about 10 a.m. I woke up and got to work. I wanted to try to rebuild the whole thing. The code was all in my head, anyways. I managed to get the good program reestablished by noon and took it to our booth. There we attached the floppy and started showing it” (Wozniak and Smith 218-219).
In this account, Wozniak says that he was responsible for overwriting the good disk with the bad (as opposed to what he said to Williams and Moore in the 1985 Byte magazine interview), but most important are the terms that he uses to describe how he made things right: “I wanted to try to rebuild the whole thing.” He “reestablished” the program by reentering “the code . . . in [his] head” into the computer that they had on hand. Wozniak’s word choice and description make it clearer than in his earlier interview that he had to remake the program from memory instead of attempting to “recover” it from the overwritten media itself. Recovering data from the disk might have been theoretically possible for someone so well versed in the mechanism: by that point Wozniak had had a significant hand in redesigning the original Shugart drive mechanism and controller card, and he had developed with Wigginton the software that controlled the hardware to read and write floppy disks in the Apple Disk II system (computer, controller card, disk drive). But Wozniak, who presents himself throughout his autobiography as an engineer who works things out meticulously in his head before putting his designs into hardware or software, took the easiest path to the solution of this new media problem: write out the software again from memory.
Memory, of course, is another tricky element of this story. It was my memory of Wozniak’s exploits that drew me to this passage in Kirschenbaum’s book. My memory of Kirschenbaum’s argument informed the way that I interpreted what I thought Kirschenbaum meant by using this episode as a way of making his Fort-Da computer analogy. Kirschenbaum’s memory of the episode as it had been interpreted secondhand in Weyhrich’s history of the Apple II informed how he applied it to his argument. Wozniak’s own memory is illustrated as pliable through the subtle differences in his story as evidenced in the 1985 Byte magazine interview and twenty-one years later in his 2006 autobiography.
Ultimately, the episode as I read it in Kirschenbaum’s Mechanisms was caught in an ambiguous use of language. The terms that we use to describe the work that we do in new media–in its development, implementation, or daily use–shape how that work is understood by others, lay audience or otherwise. Due to the kind of ambiguity illustrated here, we have to strive to select terms that accurately and explicitly describe what it is we are talking about. Of course, primary and secondary accounts contribute to the possibility of ambiguity, confusion, or inaccuracy. Sometimes, we have to dig more deeply through the layers of new media history to uncover the fact that illuminates the other layers, or triangulate between differing accounts to establish a best educated guess about the topic at hand.
Freiberger, Paul and Michael Swaine. Fire in the Valley: The Making of the Personal Computer. 2nd ed. New York: McGraw-Hill, 1984. Print.
Kirschenbaum, Matthew G. Mechanisms: New Media and the Forensic Imagination. Cambridge: MIT Press, 2008. Print.
Weyhrich, Steven. “The Disk II.” Apple II History. Apple II History, n.d. Web. 13 Sept. 2015.
Williams, Gregg, and Rob Moore. “The Apple Story, Part 2: More History And The Apple III.” Byte Jan. 1985: 167-180. Web. 13 Sept. 2015.
Wozniak, Steve and Gina Smith. iWoz: Computer Geek to Cult Icon. New York: W. W. Norton & Co., 2006. Print.
Thanks to City Tech’s Stanley Kaplan, I now have a substantial new collection of early personal computers including IBM PCs, Radio Shack TRS-80s, a Commodore PET, Texas Instruments TI-99s, ATARI 800, and a number of other computers and peripherals in my office in Namm 520. Some of the smaller items are locked in my filing cabinet, but as you can see from the photos included in this post, I have the larger items arranged around my desk and on a new set of Edsal steel shelves that I purchased on Amazon.com. Now, I have to make some additional room for a large, removable magnetic disk from a TRIAD Computer System (c. late-1970s~early-1980s, the drive that reads this disk was about the size of a washing machine) and an Apple Macintosh Centris 650, which I shipped to myself from Brunswick when I recently visited my parents. In the coming months, I will catalog these machines, see what works, and plan how to use them (research, pedagogy, and exhibits). If you have older computers, disks, or user manuals and would like to donate them for use in my research and teaching, please drop me a line at dynamicsubspace at gmail dot com.
Over the weekend, I launched a new page under the “Research” menu on DynamicSubspace.net for my Retrocomputing Lab.
I use the Retrocomputing Lab’s hardware and software resources in my continuing research on human-computer interaction, human-computer experiences, and human-computer co-influence. So far, its primary focus is on the shift from the pre-Internet, early-1990s to the post-Internet, late-1990s and early-2000s.
During that time, technological and cultural production seemed to accelerate. Imagine all of the stories yet to be recovered from that time. How do we untangle the long shadow of that time from the innovations and disruptions of the present passing into the future?
The computer hardware includes Macs and PCs. There are laptops and desktops. There are different add-on cards and peripherals to enhance and change experiences. There are 3.5″ floppy disks, CD-ROMs, and DVDs. There are many different kinds of software ranging from games to interactive encyclopedias to operating systems to word processors. There are different motherboards that can be swapped out in various computer cases (AT and ATX). The machines can be temperamental, but each configuration reveals its own indelible soul (for lack of a better word, but it is a word that I quite like in this context).
My research focuses on reading on screens, depictions of electronic-facilitated reading, and the cognitive effects of reading on screens (of course, there are a multitude of screens and interfaces–a worthy complication) as opposed to other forms of non-digital media (and their multitude).
The Retrocomputing Lab continues to grow and new research possibilities abound. If you are interested in collaborating on a project with Retrocomputing Lab resources, drop me a line at jason dot ellis at lmc dot gatech dot edu.
This is the eleventh post in a series that I call, “Recovered Writing.” I am going through my personal archive of undergraduate and graduate school writing, recovering those essays I consider interesting but that I am unlikely to revise for traditional publication, and posting those essays as-is on my blog in the hope of engaging others with these ideas that played a formative role in my development as a scholar and teacher. Because this and the other essays in the Recovered Writing series are posted as-is and edited only for web-readability, I hope that readers will accept them for what they are–undergraduate and graduate school essays conveying varying degrees of argumentation, rigor, idea development, and research. Furthermore, I dislike the idea of these essays languishing in a digital tomb, so I offer them here to excite your curiosity and encourage your conversation.
In the next few Recovered Writing posts, I will present my major assignments from Professor Kenneth J. Knoespel’s LCC 3314 Technologies of Representation class at Georgia Tech. LCC 3314 is taught in many different ways by the faculty of the Georgia Tech’s School of Literature, Media, and Communication, but I consider myself fortunate to have experienced Professor Knoespel’s approach to the course during the last phase of my undergraduate tenure. The ideas that we discussed in his class continue to inform my professional and personal thinking. Also, I found Professor Knoespel a great ally, who helped me along my path to graduation with side projects and independent studies.
In this essay assignment, we were tasked with exploring an example of a past technology. I chose to write about the Altair 8800–the first personal computer. Coincidentally, I am re-watching Robert X. Cringely’s Triumph of the Nerds, which discusses and demonstrates the Altair 8800 in the first episode.
I enjoyed writing this essay, because it was one of the first that permitted me to combine words and images (thinking about WOVEN). I had done this before on webpages, but not in an essay that I would hand in to my professor.
Jason W. Ellis
Professor Kenneth J. Knoespel
LCC 3314 – Technologies of Representation
September 28, 2004
Artifact from the Past – The Altair 8800
The Altair 8800 is credited as the first personal computer. H. Edward Roberts invented the Altair 8800 after being approached by the magazine Popular Electronics to build a kit computer that could be sold through the magazine. It utilized a central processing unit microprocessor and a bus on which “signals and power traveled from one part of the machine to another” (Ceruzzi 228). When it was introduced in 1975 by Roberts’ company, MITS, an Altair could be purchased as a kit for $397 or assembled for $498.
The exterior of the Altair 8800 is a steel enclosure. The front faceplate is black, with two rows of lights and two rows of flip switches. Each light and switch is labeled. The back has an opening for cooling and the power plug connector.
The first Altair 8800 included a very small amount of computer memory (256 bytes–not kilobytes). Also, when the computer was turned off, anything in the computer memory was lost. This means that each time you used the Altair 8800 you had to input the program you were going to use and any data that the program was going to work with. The input was handled through flipping different switches on the faceplate. The lights indicated the status of the computer during input, and the lights would later reveal the output of the program that was laboriously entered. If the power went out during the programming of the Altair 8800, the program was lost and would have to be reentered when power was restored.
In a sense, the Altair 8800 was as self-contained as a modern day iMac. The difference is that teletypes and display technology were prohibitively expensive for the computer hobbyist. When the hobbyist had completed the construction of the Altair, there was only the Altair 8800 in its steel enclosure and a power cord that plugged into a wall outlet. Input and output were handled through the lights and switches on the faceplate.
The inside of the Altair contained the electronics of the faceplate, the open bus, a CPU card, a memory card, and the power supply. The open bus and the CPU chosen for the Altair 8800 are what ignited the possibility for the upcoming personal computer boom.
The open bus (also called S-100) was unique in that it was a board that was attached to the bottom of the inside of the enclosure that had four card connectors on it. The open bus allowed for expansion possibilities and it was an open architecture which meant that others could build cards that would work in anyone’s Altair 8800. Additionally, others could copy the open bus architecture so that they could build their own branded computer system that would use parts that were interchangeable with the Altair 8800 and other “clones.”
The Altair 8800 used Intel’s latest microprocessor, the 8080. The 8080 distinguished itself from the older Intel microprocessor, the 8008, because “it had more instructions and was faster and more capable than the 8008” (Ceruzzi 228). The 8080 required fewer supporting chips than the 8008 to make a functional system, it could address more memory than the 8008, and it used the “main memory for the stack, which permitted essentially unlimited levels of subroutines instead of the 8008’s seven levels” (Ceruzzi 228). The 8080 was the first microprocessor powerful enough to run this early iteration of the personal computer.
The Altair 8800 was a hobbyist computer. The kit that one could buy for about $400 was a box full of individual components that had to be skillfully soldered and connected together. MITS did offer a pre-built Altair 8800, but even a completed Altair entailed a good deal of expertise to make it do anything. This first model handled all input and output through the lights and switches on the front panel. The “front panel of switches…controlled the contents of internal registers, and small lights [indicated] the presence of a binary one or zero” (Ceruzzi 228). This was light-years away from MS-DOS and it was even further away from the GUI of the Macintosh, but it was able to do calculations on data by using programmed instructions. The representation of the program was stored (temporarily, while the power was on) in an integrated circuit. The output was displayed in a series of lights in the same location where the program and data were entered earlier. The output was given in the same format in which it was received, through binary code (i.e., ones and zeros). Input required encoding into binary and output required decoding from binary into results that the computer user could more concretely understand. The computer user had to have command of the encoding and decoding process in order to use the Altair.
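That encoding and decoding step can be illustrated with a toy sketch: each byte maps to a row of eight switch positions going in, and a row of eight lights coming out. This is a schematic illustration of the binary translation the user performed, not a faithful Altair emulation:

```python
def to_switches(value, width=8):
    """Encode a byte as front-panel switch positions (True = switch up/1),
    most significant bit first."""
    return [bool((value >> bit) & 1) for bit in range(width - 1, -1, -1)]

def from_lights(lights):
    """Decode a row of lights (True = lit) back into a number,
    most significant bit first."""
    value = 0
    for lit in lights:
        value = (value << 1) | int(lit)
    return value
```

For example, entering the decimal value 5 meant setting the switches to 0-0-0-0-0-1-0-1, and reading a result meant performing the reverse translation on the lit and unlit lamps–mentally, for every byte of program and data.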
The open bus allowed others to follow in MITS’s footsteps in building a computer that was similar in design to the Altair 8800. Also, hobbyists and other companies could build add-in cards that would interface with any computer based around the S-100 open bus that the Altair employed. This meant that an aftermarket industry was created for the Altair and its clones. More electrical components, memory chips, circuit boards, solder, and etching materials would be sold and used in the creation of these add-on products. More research and development took place both on the hobbyist’s workbench and in corporate research labs. Some creations were sold as a final product whereas others would have been talked about at user group meetings or published as “how-to” guides in magazines like Popular Electronics. A dynamic cycle of innovation was introduced to the personal computer that had not been present before. This is what led to the personal computer becoming something different than an elitist computing device. The critical mass was building for what led to the first Apple computer and the IBM PC.
Within this creative cycle was Roberts’ choice to use the Intel 8080 microprocessor. Intel had been selling this microprocessor for $360.00 if ordered in small quantities. MITS was able to buy them from Intel for $75.00 each. If MITS had not been able to secure this low price, the Altair would have failed because of its much higher cost. Because MITS was able to buy these processors for the lower price, they were able to sell the Altair to customers for a price that customers were willing and able to pay. When the Altair took off, this meant that each one had an Intel 8080 CPU in the kit. This meant that Intel started selling a lot more of these new microprocessors that, up until that time, they really didn’t know how to market. Intel began to see that microprocessors weren’t just for expensive, business computers, but they were also for smaller, personal computers. When Intel saw that there was a demand, they began to further develop and diversify the microprocessor line over time. Later, other companies began to adopt the S-100 bus. This meant that other companies were buying Intel’s microprocessor to use in those computers. Every computer had to have a CPU, and at the time these particular computers had to have an Intel microprocessor. Then other companies, such as AMD, reverse engineered the Intel 8080 microprocessor and began selling their own model that was functionally identical to Intel’s offering. Money was being made and more innovation and work was taking place as a result.
Along with all of this building, research, and development, new construction methods had to be developed and new distribution networks had to be employed. The Altair was designed to be built at home by the buyer, but MITS also offered a pre-built turn-key system. MITS did not anticipate the demand, and customers quickly had to endure up to a one-year wait for their Altair computer. MITS (and others) learned from these delays. Also, new buying and distribution channels had to be established. MITS was buying microprocessors from Intel. The many other components had to be purchased from other companies and distributors. Parts had to be ordered and processed in order to send out kits and turn-key systems to customers. Additionally, Intel had to be prepared to have microprocessors ready to sell to MITS and other companies. When demand rose for the Altair, it would have impacted each company that supplied the individual pieces that comprised the finished product. Ordering systems, packing, and shipping had to be arranged to get the Altair from their headquarters to the customer’s home. This involved materials for shipping, personnel, and the logistics of order processing.
MITS tried to market the Altair 8800 as a business computing solution after they saw how popular it was. This was made easier when teletype, CRT displays, disk drives, punch card rolls, and other computing technology was developed for the Altair and S-100 bus systems. Businesses liked easier interaction with the computer and dependable memory storage. These business systems were not very successful because there was no “killer app” for the platform at that time. MITS changed hands several times until its last remnant disappeared.
The Altair 8800 began the desktop computing revolution. Initially it was very complicated and elitist. The very first kits had to be built and used by persons that were skilled in electronics and computer science. The hardware had to be constructed from individual elements and then software had to be devised that would run on this built-from-scratch computer. The Altair became more user friendly over time. The aftermarket, MITS, and the clone manufacturers wanted to attract more customers. The potential customers formed a triangle with the most knowledgeable at the peak with a gradation of less knowledgeable customers toward the bottom. The early adopters of the Altair were at the top of this triangle but their numbers were few. This meant that new computers with new input and output and new features had to be devised that would entice the greater number of potential computer users to want to buy their product. This cycle continues to this day in the personal computer market. Apple, Microsoft, Sony, HP, and many other companies continually work at making something feature rich, but easier and easier to use. Note the utopian artwork below that was used for an early Altair advertisement. It recalls Soviet artwork, utopian imagery, and an Altair on every desk. The Altair was going to offer a leveling of the computing playing field so that all could take part in the use of computers.
Along with this cycle there are those persons who are intrigued by the new technology and they learn more about it on their own or through school. This bolsters the book industry that may sell computer programming or electrical engineering books (or today, the plethora of “Dummies” guides). Schools began to introduce computers into the classroom. At first, it was strictly computer science and programming classes. Later, computers were added for other things such as graphic design, CAD, and word processing. Universities saw more computer science, electrical engineering, and computer engineering majors. These universities added more professors, classroom space, and equipment to compensate for this demand. State and federal spending was sought to cover some of these expenses. Private enterprise was also asked to help through different kinds of agreements that would assist the business while helping the school’s students in need of projects and equipment. This work done by school research could in turn help the businesses with their products that will be sold on the open market.
The Altair 8800 introduced computer enthusiasts to the possibility of working with digital information on their desktop. Time sharing on large mainframes and minicomputers was still the primary interaction people had with computers in business and in schools. With the flip of switches and the monitoring of lights, one could work problems and evaluate data at home or in the office. There were early games, calculating problems, logarithms, and other numerical manipulation. The early adopters questioned what other things could be manipulated with a personal computer. With the introduction of new input and output systems, the list expanded a great deal because human-computer interaction became easier with the connection of a CRT monitor and a keyboard or punch card reader. Also, the binary code and bits of information that were only ones and zeros to the computer could be made to represent abstractions rather than mere numbers.
The Altair 8800 was the pebble that began rolling down the snow-covered mountain (figuratively, and in terms of its user base, literally). The concept of the personal computer gained mass and momentum that could not be stopped. The development of the first microprocessor-based personal computer created new networks and new demands that were met by computer enthusiasts, students, researchers, and business people.
“Altair 8800.” Old-Computers.com. Accessed October 6, 2004.
Last summer, I wrote about my experiences installing Mac OS X 10.8 Mountain Lion on my Asus P8Z77-V and Intel i7-2700K PC here. What I neglected to say at the time was that an alarming number of creeping instabilities led me to ultimately abandon running Mountain Lion on my PC and return to Windows 7.
I later learned that some of these instabilities were likely linked to a bad PSU and video card–both of which were replaced by the manufacturers under warranty (awesome kudos to Antec and EVGA). With the new PSU and video card, my PC returned to 100% stability under Windows 7. This made me wonder if I could try rolling out a Mavericks installation on my PC.
Also, I wanted to use Mac OS X’s superior file content search technology and other third-party textual analysis tools in my research. I have a MacBook Pro 15″ retina (MBPr), but it lacks the hard drive capacity for my accumulated research files. The comfort that I feel in the Mac OS X environment and the need for lots of fast storage led me to turn my attention back to turning my PC into a CustoMac (aka “hackintosh”).
This time, I wanted to streamline and simplify my setup as much as possible and incorporate components that should work out of the box (OOB). Toward this end, I reduced my hardware configuration from this:
ASUS P8Z77-V LGA 1155 Z77 ATX Intel Motherboard (disabled on-board Intel HD 3000 video and Asus Wi-Fi Go! add-on card)
EVGA 01G-P3-1561-KR GeForce GTX 560 Ti 1024MB GDDR5 PCIe 2.0 x16 Video Card (removed to simplify the setup and save power–who has time for gaming?)
Antec High Current Gamer 750W Gamer Power Supply HCG-750
Corsair Vengeance C70 Gaming Mid Tower Case Military Green
Cooler Master Hyper 212 Plus Universal CPU Cooler
Samsung 22X DVD±RW Burner with Dual Layer Support – OEM
Intel 128 GB SATA SSD
Three Western Digital HDDs for file storage and work space.
Also, I added two new components that were recommended on the TonyMacx86 Forums:
TP-Link 450Mbps Wireless N Dual Band PCI Express Adapter (TL-WDN4800). It works in Mavericks OOB.
IoGear Bluetooth 4.0 USB Micro Adapter (GBU521). It works in Mavericks OOB.
As noted above, I physically removed my 560 Ti video card, because I wanted to simplify my setup for installation purposes. Also, I removed the ASUS Wi-Fi Go! add-on card, because despite disabling it in BIOS, the Mavericks installer seemed to hang on the wi-fi device while attempting to set its locale (a setting that determines which radio settings to use based on the country that you happen to be in). After I removed the Wi-Fi Go! card, I had a nearly flawless Mavericks installation process (NB: removing the Wi-Fi Go! card required removing the motherboard, turning it over, removing a screw holding in the Wi-Fi Go! card, turning the motherboard back over, and unplugging the Wi-Fi Go! card).
These are the steps that I used to install Mavericks on my PC:
In order to arrive at the above instructions, I read a lot of first hand experiences and third party suggestions on TonyMac’s forums. I owe a tremendous debt of gratitude to the amazing community of CustoMac builders who take the time to share their thoughts and lessons and equally so to the tool-builders who create amazing software including UniBeast, Multibeast, and Chameleon Wizard!
I would suggest that you remember that there is not always one path to a successful build. I distilled a lot of posts into my successful build, and your experience with similar hardware might take a different path. Reading others’ experiences and trying their suggestions experimentally can lead to your own successful discoveries. Thus, I took the time to try out different configurations of hardware before settling on the stripped-down approach with on-board video and OOB networking gear. I tried several different installations: a failed Mavericks installation with kernel panics (Wi-Fi Go! card installed and wrong Multibeast configuration), a successful Mountain Lion installation (barebones and correct Multibeast configuration), and a successful Mavericks installation (detailed above).
Obviously, Mac OS X can run on a wide range of PC hardware given the correct drivers, configuration information, etc. Apple could do great things if only Tim Cook and others would think differently and move beyond the tightly integrated hardware-software experience. Its engineers could build better operating systems that adapt to a person’s hardware. Given the chance, they could challenge Microsoft and Google with a new Mac OS X that is insanely great for everyone–not just those who can afford to buy new hardware.
Now, back to using some of the tools of my research on a computing platform that I enjoy.
Jennifer Schuessler looks at a current trend in one area of the digital humanities–studying the way published writers use computer technology to create their works–in her New York Times article, “The Muses of Insert, Delete and Execute.” The takeaway about the field is:
The study of word processing may sound like a peculiarly tech-minded task for an English professor, but literary scholars have become increasingly interested in studying how the tools of writing both shape literature and are reflected in it, whether it’s the quill pen of the Romantic poets or the early round typewriter, known as a writing ball, that Friedrich Nietzsche used to compose some aphoristic fragments. (“Our writing tools are also working on our thoughts,” Nietzsche typed.)