Tag: retrocomputing

  • Pat Ruddin’s Bondi Blue iMac at City Tech

    original bondi blue imac on a wood desk, whatnots and other computers on desk in background

Last year, Pat Ruddin, a kind and tenacious colleague, passed away. A fellow Macintosh aficionado, she kept this original Bondi Blue iMac in her office on a filing cabinet. When I came back from my sabbatical, it had made its way to my desk, where I maintain the Retrocomputing at City Tech collection of vintage computers. It’s a prestigious addition to the collection and a marvelous remembrance of Pat.

My first on-site job at NetlinkIP on St. Simons Island, Georgia, was to go to a big, fancy house and set up their original iMac. Soon thereafter, my friend Chris Lee got an iMac, too. I think that I still had my Power Macintosh 8500 at that time. When I got a job at Mindspring in Atlanta, I upgraded to a Blue and White G3, which I later traded to Chris for a Dual G4 (a surprising and gracious offer that rekindled our friendship after we had drifted apart).

    Pat’s iMac doesn’t boot up now, but I think it will make a great project for rejuvenation.

    Powerbook, Commodore PET, Apple iMac, and Macintosh 512K on an office desk
  • Enjoyed Alien: Romulus Despite Too Damn Loud IMAX and Other Customers Who Were Annoying

    xenomorph alien made out of paper in origami style. Image created with Stable Diffusion.

    Yesterday, Y and I took the subway to Manhattan to watch the film Alien: Romulus on the IMAX screen at the AMC 14 on 34th Street.

    I thought that Alien: Romulus was an interesting story that threaded the needle of connecting the origin film Alien (1979) via the first Xenomorph we saw and the android Ash (Ian Holm) to Prometheus (2012) and Alien: Covenant (2017) via the black liquid (hints of the black oil from The X-Files) and the Engineers. The retrocomputers, ASCII text, and a computer with a 3.5″ floppy disk drive made it feel like the same world as Alien. I felt that some of the lines were corny, over-the-top, and unnecessary fan service, but overall, it was an interesting and sometimes exciting addition to the series.

Unrelated to the film per se, I have some thoughts about the technologies of presentation and about communal engagement with the film.

First, movies shown in theaters, especially IMAX films, are shown with the volume far too loud. Y and I last went to an IMAX film over 10 years ago, but remembering how that experience hurt both of our ears, we planned ahead and brought foam ear plugs. Even our ear plugs, which work wonders at eliminating noise in other settings, were just barely up to the task of keeping the volume of the film at tolerable levels. Let me put that another way: while wearing ear plugs, I was able to hear the film’s dialog, sound effects, and music just fine, and sometimes a little not fine when it got loud enough to overpower the ear plugs. That’s too damn loud. It was only as we were leaving that Y thought we should have checked the decibel levels. Hindsight is 20/20.

Second, I know that to some I might sound like an old man yelling at kids to get off my lawn, but those who have known me a long time know that I’ve been deadly serious about this since I started going to films as a kid. That is, we owe other theatergoers our respect so that everyone can enjoy the film. Carrying on, talking, or using a phone during a movie disturbs others, so we shouldn’t do those things. Unfortunately, some of the other customers, who would have paid the same $30 per ticket we paid, have no regard for social norms or simple decency. It would be one thing if they were kids who didn’t know any better, but these were adults who acted like kids. Hell is other people, I suppose.

    Considering these things, I prefer to stay at home to enjoy a film without ear plugs or annoying guests. Of course, I am assuming the neighbors don’t act the fool, which I’ve tried my best to address following these tips.

  • Blue Polygonal Sculpture in Manhattan Titled “Jean-Marc”

    Low-resolution polygonal statue of a human figure standing on a sidewalk in Manhattan

This sculpture looks like a blue polygonal figure that has stepped out of a mid-1990s PlayStation game. Titled “Jean-Marc,” it was made by Xavier Veilhan and is located in Manhattan near MOMA.

  • Introduction to Piet Mondrian’s Neoplasticism Through Star Trek: The Next Generation

Piet Mondrian's "Tableau I" hangs on a wall between Lt. Cmd. Data standing and his daughter Lal sitting.

In “The Offspring,” the 16th episode of the third season of Star Trek: The Next Generation, we get to see Piet Mondrian’s “Tableau I” hanging on the wall of Lt. Cmd. Data’s quarters when he shows it to his daughter Lal. I think this might be the first time that a work by Mondrian really drew my attention. I found it quite striking as a work of art, and it seemed fitting that Data might be drawn to its ordered lines, even though Mondrian’s theory of neoplasticism holds that nature and emotion are the motivators for the artist’s composition.

    Lt. Cmd. Data seated next to Timothy. Mondrian's "Tableau I" is in the background.

Mondrian’s “Tableau I” appears again in Lt. Cmd. Data’s quarters, notably in “Hero Worship,” the eleventh episode of season five, in which Timothy, a young boy traumatized by the loss of his parents, apes Data’s mannerisms in order to erase his emotional response to his loss. In one scene, Data and Timothy paint in Data’s quarters, where “Tableau I” sits on an easel to the side.

    Screenshot from the Star Trek: The Next Generation Interactive Technical Manual of Lt. Cmd. Data's quarters where Mondrian's "Tableau I" is seen on an easel.

    In the Star Trek: The Next Generation Interactive Technical Manual, Mondrian’s “Tableau I” is on an easel in about the same place as pictured in “Hero Worship.”

    Yesterday, I was able to see some of Mondrian’s works in person at the Museum of Modern Art (MOMA) in Manhattan. Y and I went there to see our friends from Japan, Masaya and Saki. While I didn’t get to see “Tableau I,” because it hangs in the Kunstmuseum in The Hague, I did get to see some representative works of his neoplasticism.

    Painting of lines and colored rectangles by Piet Mondrian at MOMA.
    Painting of lines and colored rectangles by Piet Mondrian at MOMA.
    Painting of lines and colored rectangles by Piet Mondrian at MOMA.
    Painting of lines and colored rectangles by Piet Mondrian at MOMA.
    Painting of lines and colored rectangles by Piet Mondrian at MOMA.
  • Mark V. Shaney v1.0, a Probabilistic Text Generator for MS-DOS

    Mark V. Shaney v.1.0 running in DOSBox.

Of the text generators that I’ve discussed this past year, Mark V. Shaney v1.0 (MARKV.EXE) is by far the simplest to use, yet it is also one of the most advanced, thanks to the weighted probability tables (Markov chains, on which the program’s name is a pun) that underpin how it generates text. I was able to obtain a copy from the TextWorx Toolshed archived on the Internet Archive’s Wayback Machine.

MARKV.EXE (44,365 bytes) was developed in 1991 by Stefan Strack, who is now a Professor of Neuroscience and Pharmacology at the University of Iowa. In the MARKV.DOC (10,166 bytes) file that accompanied the executable, Strack writes, “Mark V. Shaney featured in the ‘Computer Recreations’ column by A.K.Dewdney in Scientific American. The original program (for a main-frame, I believe) was written by Bruce Ellis based on an idea by Don P. Mitchell. Dewdney tells the amusing story of a riot on net.singles when Mark V. Shaney’s ramblings were unleashed” (par. 2). Dewdney’s article on Mark V. Shaney appears in the June 1989 issue of Scientific American. The article that Strack mentions is available in the Internet Archive here. A followup with reader responses, including a reader’s experiment with rewriting Dewdney’s June 1989 article with MARKV.EXE, is in the January 1990 issue here.

The program works by “reading” a text that the user feeds into MARKV.EXE. This generates a hashed table of probabilistic weights for the words in the original text, which can be saved. The program then uses that table and an initial numerical seed value to generate text until it encounters the last word in the input text or the user presses Escape. The larger the input text (within available memory), the more interesting the output text, because more data allows the program to generate better probability weights for word associations (i.e., which word has a higher chance to follow a given word). Full details about how the program works can be found in the highly detailed and well-organized MARKV.DOC file included with the executable.
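The read-then-generate loop described above can be sketched in a few lines of Python. This is not Strack’s implementation (his uses a hashed table in a DOS executable); it is a minimal illustration of the same idea, in which each word maps to the list of words that followed it in the input, with duplicates in the list acting as probability weights.

```python
import random
from collections import defaultdict

def build_table(text):
    """Map each word to the list of words that follow it in the text.
    Repeated followers occur multiple times, acting as weights."""
    words = text.split()
    table = defaultdict(list)
    for current, following in zip(words, words[1:]):
        table[current].append(following)
    return table

def generate(table, start, seed, max_words=50):
    """Walk the table from a start word, picking each next word in
    proportion to how often it followed the current word. A fixed
    seed makes the output repeatable, like MARKV.EXE's seed value."""
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(max_words - 1):
        followers = table.get(word)
        if not followers:  # a word with no successor ends generation
            break
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)
```

Feeding in a larger text, as with MARKV.EXE, gives each word more recorded followers and therefore richer weights for the random walk to draw on.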

    Using DOSBox on Debian 12 Bookworm, I experimented by having MARKV.EXE read William Gibson’s “Burning Chrome” (1982). I pressed “R” for “Read,” entered the name of the text file (bchrome.txt), and pressed enter.

    The program reported “reading” for a few minutes (running DOSBox at default settings).

    After completing its “reading,” the program reported stats on the table that it created using bchrome.txt: 9167 terms (608,675 bytes).

I pressed “G” and the program began to generate text based on its table of probabilities built from bchrome.txt, the text file containing the short story. As the generated text flows across the screen, the user can press “Esc” to stop or any other key to pause.

After it completed writing the generated text to the screen, I pressed “S” to save it, and the program prompted me to type in a file name for the output: gibson.txt.

Pressing “S” also gives the user the option to save the table for future use. I went with the default name, MARKKOV.MKV (not to be confused with a modern Matroska container file). This file can be loaded in MARKV.EXE on subsequent runs by pressing “L” and entering the name of the table. When the user presses “Q”, the program exits back to DOS and displays a message, “The random number seed was x,” where x is a random number used in the generation of text. If repeatability is important, make a note of that number and use it with the -s modifier when running MARKV.EXE again (e.g., markv.exe -s2510).
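The role of the -s seed can be sketched in Python. The word list below is an invented stand-in for one row of a Mark V. Shaney probability table (the words that followed some word in an input text, with duplicates acting as weights); the point is only that seeding the pseudo-random number generator with the same value makes every weighted choice, and therefore the whole generated text, come out identically.

```python
import random

# Hypothetical stand-in for one table entry: followers of some word,
# where the repeated "ice" carries double weight.
followers = ["matrix", "ice", "ice", "console", "chrome"]

def next_word(seed):
    # Seeding the generator (like running markv.exe -s2510) makes
    # the "random" choice reproducible across runs.
    rng = random.Random(seed)
    return rng.choice(followers)
```

Calling next_word with the same seed always returns the same word, which is why noting the seed reported at exit lets a MARKV.EXE session be replayed.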

Mark V. Shaney’s implementation of a Markov chain that builds a table of next-word probabilities from a small text sample is one example of the predecessors of large language models (LLMs) like LLaMA and ChatGPT. However, its word-association probabilities are far simpler than the much more complicated neural networks of LLMs (especially considering attention), which have many orders of magnitude more parameters trained on gargantuan data sets. Nevertheless, Mark V. Shaney is one aspect of the bigger picture of artificial intelligence and machine learning development that led to where we are now.