An Idea for Aggregation of Student Online Artifacts Using Visual Rendering and Metadata Collection

Diagram of Visual Aggregation. Click image above to view full resolution version.

This afternoon, I participated in an online reunion with my colleagues at Georgia Tech–Nirmal Trivedi, Pete Rorabaugh, Andy Frazee, and Clay Fenlason–about the first-year reading program, Project One.

During the conversation, I thought of this idea for aggregating student online work in a database and presenting student work through a website.

This builds on Pete’s ideas about dispersed exploration and fragmented student artifactual creation. If our students are working online using any service, platform, or software, how can we bring their work together so that we can see–and more importantly, they can see–how their work fits together with the work of others? We can build a simple website that collects information (a URL; a brief, optional description; tags; and an affirmation that the linked content belongs to the student and is legal), generates a rendered image of the content, and presents those images as thumbnails, along with the collected information, on a visually dynamic website that supports different ways of arranging aggregated content (by date, by dominant color, by tags, etc.). Beyond making these aggregated student artifacts available through the presentation website, the archive of rendered images and supporting metadata can be dispersed once the project is over (dispersing the archive is an idea I received from a conversation with Bob Stein of the Institute for the Future of the Book).
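The “arrange by dominant color” idea could work by quantizing each thumbnail’s pixels into coarse color buckets and picking the most frequent one. A minimal sketch in pure Python (the function name is mine; in practice the pixel tuples would come from an imaging library such as Pillow reading the rendered thumbnail):

```python
from collections import Counter

def dominant_color(pixels, bucket=32):
    """Return the most frequent quantized (r, g, b) color.

    pixels: iterable of (r, g, b) tuples, e.g. extracted from a
    rendered thumbnail image.
    bucket: quantization step; larger buckets group similar shades
    together so near-identical colors count as one.
    """
    counts = Counter(
        (r // bucket * bucket, g // bucket * bucket, b // bucket * bucket)
        for (r, g, b) in pixels
    )
    return counts.most_common(1)[0][0]
```

Thumbnails could then be sorted by the hue or brightness of their dominant color to produce the color-ordered grid view.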

The image that leads this post illustrates my idea:

  1. Students log in to a collection site with Active Directory (no new account needed). The collection website asks for the URL of the student’s work, publicly available anywhere online; content tags or keywords (required); a brief description (optional, placed lower on the page than the tags); and a commitment that the content belongs to the student and is legal. The student’s name is automatically associated with the content after logging into the site with Active Directory.
  2. A service running on the site creates a JPG or PNG image of the rendered page at the URL supplied by the student, which is added to the content’s entry in the aggregation database. The site’s backend takes the URL, loads it in WebKit, and captures the rendered page as a JPG or PNG. CutyCapt does this kind of work.
  3. On the public-facing side of the aggregation website, the students’ work is presented either in a grid of images (with ordering options based on dominant color, date of publication, or tags) or in a word cloud of tags (each of which can be clicked to reveal the artifact thumbnails associated with that tag). Another possibility is showing co-occurrence between tags–visually depicting links between different tags, etc. In the visual presentation of artifacts, each square thumbnail enlarges as the user mouses over it to reveal a larger preview of the content, description, tags, student name, etc. (think of Mac OS X’s Dock animation). There are lots of different ways to use visualization techniques and technologies to make the presentation of student work interesting, engaging, and layered with additional meaning and context.
  4. Finally, after the project is completed, the archive of student work exists online on the website and is distributed among the students on flash drives. The content can be organized in directories for each aggregated student project, or packaged with a Java app that recreates the functionality of the website (Java could be used on the presentation site, too: the website connects to an online database, and the thumb-drive version connects to a local database).
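The rendering step (2) could be scripted by shelling out to CutyCapt, which takes the page URL and output file via its --url and --out options. A minimal sketch, assuming CutyCapt is installed and on the PATH; the function names and output layout here are my own illustration, not part of the original design:

```python
import subprocess
from pathlib import Path

def cutycapt_command(url, out_path):
    """Build the CutyCapt command line to render a URL to an image file."""
    return ["CutyCapt", f"--url={url}", f"--out={out_path}"]

def render_thumbnail(url, out_dir="thumbnails", entry_id="example"):
    """Render the submitted URL and return the path to the captured image.

    Intended to be called by the collection site's backend after a
    student submits an entry; the returned path would be stored
    alongside the entry's metadata in the aggregation database.
    """
    out_path = Path(out_dir) / f"{entry_id}.png"
    subprocess.run(cutycapt_command(url, out_path), check=True)
    return out_path
```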

Transformations of ‘Cyberspace’ Across Media: My Poster for 12th Annual City Tech Poster Session

Click my poster to see full version.

Today, the New York City College of Technology, CUNY (City Tech) hosted its 12th Annual Poster Session for Faculty and Student Research. I presented my poster, “Transformations of ‘Cyberspace’ Across Media,” one of more than 100 posters in the event.

My poster abstract sketches the project that I am currently developing: In this research project, I explore how the concept of cyberspace transforms based upon the medium in which it is expressed. Specifically, I focus on the term’s explication in William Gibson’s Neuromancer as a printed artifact (1984), a video game developed by Interplay (1988) for multiple personal computing platforms, and an eBook produced by the Voyager Company for the Apple PowerBook platform (1992). I examine the term’s transformation as originating in typewritten text, developing in an interactive game, and finally, joining print and computing in one of the first mass-marketed eBooks.

At the Poster Session, wearing Google Glass.

After having the opportunity to talk about my project with colleagues and students during the poster session today, my ideas began to crystallize further about the trace of meaning that I am following in the ‘cyberspace’ term as it appears in these early forms, transformations, and remediations of William Gibson’s Neuromancer. In particular, I am thinking about how each medium adds to the term’s meaning, but I am also thinking about how these media subtract from it–each in its own way. For example, the novel’s imaginative possibilities are visually realized in the interactive video game. The promise of the original story is made more real within the platform described by the term. However, the eBook confronts the reality of computing technology in the early 1990s and reveals the concrete limitations of the imaginative concept as it was presented in the novel and video game. Yet, it does this while illustrating other possibilities–perhaps more mundane but nonetheless important and interesting–that were not explored in the two earlier forms.

I am developing the ideas behind this poster into a publishable essay that I hope to have completed in the coming months and sent out for review.

Personal Reflection and Improved Battery Life on iPhone 4S with iOS 8 (Hint: It’s about Twitter and Technology Use)

I really liked my iPhone 4S after I received it on October 14, 2011. It had tremendously long battery life (2-3 days between charges initially), and it had a lot of get-up-and-go for apps, games, and online activities supported by my then-unlimited AT&T data plan. However, my attitude towards my phone soured after 12-18 months. It began needing recharging more frequently, and it lost its speed as the years passed, new versions of iOS were installed, and apps were updated.

I long thought that two things were conspiring against my iPhone 4S’s battery life. First, as iOS matured, it increased in complexity and became more feature-rich. Also, it seemed apparent that Apple was optimizing new iOS releases for correspondingly new iDevice hardware and CPUs. Put another way, my iPhone 4S’s A5 processor was not as efficient as the newer CPUs appearing in the iPhone 5, 5S, and 6. Unfortunately, Apple does not make it easy for phone owners to choose which compatible operating system to run. After a brief period following a new iOS release, you cannot downgrade to an earlier version. This meant that the biggest jump in my experience–upgrading from iOS 6 to 7–was not reversible, because I waited too long to downgrade my iPhone 4S.

The other issue had to do with the nature of lithium-ion batteries. While they are tremendously better than older battery technologies, they suffer from the same problem as those older batteries: the maximum storage capacity of the battery decreases over time with the number of recharge cycles. I thought that after two years, perhaps my battery needed to be replaced. By this point, I was having to recharge my phone once a day, so it seemed that its battery’s maximum capacity had been depleted. I purchased a battery replacement kit, but after installing it, I did not see any improvement in battery life.

In my search for a technological solution to my iPhone 4S’s battery life problem, I was ignoring a bigger piece of the puzzle: my behavior. After uninstalling the Twitter app on my iPhone 4S about a week ago, it occurred to me that my iPhone seemed to return to its halcyon days of needing a recharge only about every two days! At first, I wondered if the Twitter app itself had been sucking the battery dry, but then I reflected on what I had been doing differently during the day when I had the app installed.

Around the time that I got the iPhone 4S, I began using Twitter more than I had in the past. When I used Twitter, I usually accessed it on my phone many times each day. Each time I checked Twitter, I had to activate my phone (turn on the screen), unlock it, open the app, download data (over wifi, with less power draw, or cellular, with more), send a tweet, occasionally take a photo to attach to a tweet, etc. Essentially, I was using my phone more often, and the things that I was using it for were drawing a lot of power from the battery (data use, high screen brightness outdoors, using the camera).

While I still seem to use my phone a lot (text messaging, web browsing, phone calls, other app use), taking my behavior and phone use as a Twitter user out of the equation seems to have significantly improved my phone’s battery life. Additionally, it has helped me refocus my attention on more important (at least to me) work and reading.

Of course, someone might point out the obvious: using your phone less will prolong its battery life. However, as we make these technologies (mobile computing and social networking) a bigger part of our daily practices, it is easy to miss how our patterns of use change over time. It is easy to fall into the trap of thinking that I am using this technology the same way now as I did one or two years ago, when that belief might not be supported by empirical evidence.

This is why I recommend reflecting on your behavior as a technology user before assuming that there is a technological problem behind depleted battery life. While we shouldn’t rule out hardware or software sources as the root cause of a quickly discharged battery, my experience reveals how significantly our behavior and use patterns (and the ways those patterns imperceptibly change over time) impact the battery life of our rechargeable devices.

Furthermore, we should all reflect on our technology use for non-technical reasons: how we use these technologies, what effects our use of them has on our lives and interpersonal relationships, and how these technologies affect our learning, critical thinking, and decision-making abilities. Taking a time out to reflect might improve our human capacity to avoid “plugging in” as often as our devices might require.

Writing Advice From Neil Gaiman and Bruce Campbell, Written on My Old PowerBook G4, 2005

In 2005, I had the pleasure of meeting Neil Gaiman and Bruce Campbell during their separate book tours (Anansi Boys for Gaiman and If Chins Could Kill for Campbell). I asked each of these great people for advice on writing, which they committed to the front of my old 12″ PowerBook G4.

Campbell wrote, “Get Busy.”

Gaiman wrote, “Finish Things.”

Words that apply to all endeavors.

Words that drive me in mine.

My Brain in 3D: Rendered Videos and Images of My fMRI Scan Data

My brain (c. 2007).

Back in 2007, I made a deal with a friend to participate in his fMRI brain scan study at the University of Liverpool in exchange for a copy of the DICOM data from my scan. He agreed to the trade.

Since then, I occasionally pull my scan data off the shelf, dust off the cobwebs and disk errors, and import it into the DICOM viewer OsiriX (e.g., as I did in 2009). With the latest versions, I have had a lot of trouble importing the files, as they were given to me, into OsiriX. Luckily, I saved the installers for earlier versions, including the venerable version 3.5.1, which still runs fine on OS X Mavericks and Yosemite.
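One way to troubleshoot import problems like these is to check whether each file actually carries the standard DICOM Part 10 preamble: 128 filler bytes followed by the ASCII marker “DICM.” Older or exported files sometimes omit this, which can confuse viewers. A minimal stdlib-only sketch (the function names are my own):

```python
from pathlib import Path

def has_dicom_preamble(path):
    """Return True if the file starts with the standard DICOM Part 10
    preamble: 128 filler bytes followed by the marker b'DICM'."""
    with open(path, "rb") as f:
        header = f.read(132)
    return len(header) == 132 and header[128:132] == b"DICM"

def scan_for_dicom(directory):
    """List files under directory that carry the DICOM preamble."""
    return [
        p for p in sorted(Path(directory).rglob("*"))
        if p.is_file() and has_dicom_preamble(p)
    ]
```

Files that fail this check are not necessarily corrupt; they may simply be raw data sets without the Part 10 header, which some viewers handle and others do not.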

Using OsiriX’s many features, I created these four videos and an album of images of my 2007 brain. I wonder how it has changed since that time–completing my MA, then PhD, taking a postdoc at Georgia Tech, and now, working at City Tech. Also, I think about the technologies of representation that make it possible for me to see my brain without injury or invasion–OsiriX and the unseen software libraries for working with, manipulating, and displaying DICOM data, OS X and its technology APIs, my MacBook Pro with Retina display, disk and flash drives, email (how I originally received the scan data), the fMRI machine that I sat in for 30 minutes to an hour, the physical laws behind each technology, my own biology, etc. What do you think about when you see my brain represented below?

Final Videos

Draft Video (I had not yet removed all the tissues and bone around the brain)

Rendered Images

Teaching at City Tech, Fall 2014, Syllabi for ENG 1101 and ENG 3771

For my first teaching assignments at City Tech, I received two sections of ENG 1101 English Composition I and one section of ENG 3771 Advanced Career Writing (a professional and technical communication course for students in these majors: Legal Assistant Studies, Communication Design, Electrical Technology, and Telecommunications Engineering Technology). I created syllabi that meet and exceed the outcomes defined for these courses while carefully considering the material conditions of my students in and out of the classroom. You can find copies of my Fall 2014 syllabi here: ellis-jason-eng1101-syllabus and ellis-jason-eng3771-syllabus.

We are entering the fourth week today, so we are picking up momentum and getting good work done. Students in ENG 1101 are working on a brand new take on my “Writing the Brain” assignment, and the students in ENG 3771 are building job application portfolios while getting plenty of time to interact with one another and cooperate on the revision process. With a strong start, engaged students, and stimulating projects, I’m looking forward to what I believe will be a great first semester at City Tech.

Personal Digital Archaeology: Jason’s Icons 1.0, Feb. 7, 1997

I have been spending some time digging through my online past and conducting personal digital archaeology. While doing this research, I ran across a collection of Macintosh icons that I made and bundled on Feb. 7, 1997. I likely used ResEdit to make the icons (32 × 32 pixels).

You can download the collection in its original HQX/SIT container from here on the Info Mac Archive.

In the archive, I included a Read Me file with my reasoning behind making the icons set. Also, it reminded me of my first email address at Georgia Tech, which was replaced when I returned to complete my studies in 2001. The Read Me file includes this text:

Jason’s Icons v1.0

February 7, 1997

Dear Downloader, These are some icons that I created out of pure desperation to label the folder contents of one of my hard drive partitions. This is how I use them:  After careful consideration I have decided to let other people enjoy the fruit of my labors and perhaps spread a little happiness throughout the world. (Hey, I can dream!) If you do happen to use these icons and have any suggestions for a new set or would just like to say “hi,” please feel free to contact me at my email address listed below.

Sincerely, Jason Woodrow Ellis

I grouped the icons into these folders (some for reasons lost to me): Cameras, Enjoyment Icons, Internet Metaphor, Office Equipment, Tools of Torture, and Video Equipment.

Jason’s Icons: Enjoyment Icons

Jason’s Icons: Internet Metaphor

Jason’s Icons: Office Equipment

Jason’s Icons: Tools of Torture

Jason’s Icons: Video Equipment

Jason’s Icons: Cameras