Category: Computers

  • Computer Metamorphosing, or Upgrades as a Way of Life

    Gradient blend of my most recent desktop computer from before (left) to now (right).

    “In nova fert animus mutatas dicere formas / corpora.”
    “It is my design to speak of forms changed into new bodies.”
    –Ovid, Metamorphoses: Translated into English Prose, Published by G. And W.B. Whittaker, London, 1822, p. 1.

    As much as I wish that I had a hoard of computers in a basement or attic, I don’t. It’s not for lack of wanting to keep my old computers. It’s always been a financial consideration: sell the old to help finance the new (or the used, which is new to me). Aside from my first practical computer, an Amiga 2000HD, which was destroyed by an errant tree limb, I’ve been selling my old computers to help pay for newer ones and upgrades since high school, when I sold my 486 DX2/66MHz system before going off to Georgia Tech in 1995.

    For someone who values and enjoys working on vintage computer systems, it’s a bitter pill to swallow that I have to do this. However, it also means that my computers often take on a Frankenstein’s monster sort of existence, morphing from one system into another via upgrades and reconfigurations.

    I wanted to share some background on my most recent desktop computers from the past 10 years or so as a way to reflect on this practice of tinkering and changing that produces a more capable and powerful computer over time. Sometimes, a shift in architecture or new work requirements calls for a change. Sometimes, it’s wanting to try something new.

    2012: Intel i7-2700K in Corsair Case

    I wrote about turning this computer, which I had originally intended to use with Windows 7, into a “Customac” or “Hackintosh,” meaning a PC that ran Mac OS X, here and here. I built the computer using on-sale gear from the Microcenter in Duluth, GA. The 50 cal. ammo box case by Corsair and the green cold cathode light tubes were its two extravagances.

    2014: i7-2700K in Retro Sleeper Case

    Before moving to Brooklyn to start my job at City Tech, I asked my friend Mark for help finding a beige ATX case that I could transplant my i7-2700K system into: what the kids call a “sleeper” case, meaning a retro-styled case sporting contemporary computing kit. By this point, I had jettisoned the video card and relied on the CPU’s built-in graphics, as this simplified using it as a Hackintosh.

    After moving to Brooklyn, I switched from Mac OS X, which was becoming more troublesome on Hackintoshes because of Apple ID-connected software, to Linux Mint.

    I had a Sapphire video card of some sort, but I can’t recall what it was now.

    2016: Intel NUC 6I5SYH with i5-6260U CPU

    I carried the i7-2700K sleeper system to City Tech to use in my office space. This left me with only a MacBook to use at home. When I saw the Brooklyn Microcenter offer an i5-based NUC for sale, I thought that would fulfill my computing needs at home and be a new kind of miniature computing experience for me. I wrote about my initial setup of it here. I was surprised by its capabilities, but new computing needs led me to build a new computer.

    2017: Homebuilt Computer with i7-7700 CPU

    I wrote about building, pricing, and benchmarking the first iteration of this i7-7700 based computer here. Several needs prompted me to build this machine: I run my own self-hosted instance of vanilla World of Warcraft and explore some fan-built 3D experiences set in the Star Wars and Star Trek universes. The i5 NUC didn’t have the horsepower for this, so I sold it and built this new computer.

    Later, I wanted to try out virtual reality, so when Best Buy had a sale on the Oculus Rift, I purchased the headset along with a beefier NVIDIA GTX 1060 video card (I’ll write about this more soon).

    I wasn’t happy with the Oculus Rift in my small apartment space, so I sold it and the MSI Geforce GTX 1060 video card. Then, when Microcenter ran an insane deal on HP’s Mixed Reality headset, I picked it up and an EVGA Geforce GTX 1060 to try VR again (more on this soon).

    Long story short: I struck out with VR again, so I sold the 1060 video card and HP mixed reality headset and settled on the built-in video graphics, which is fine for most things on a day-to-day basis.

    2020: Pandemic and Upgrades

    Then, the pandemic hit in 2020 and I was doing everything with my computer–lecturing, video editing, running online symposia, etc. So, I used my first pandemic Economic Impact Payment to purchase a Powercolor Red Devil AMD RX5700XT video card and an MSI 32″ curved LCD monitor to support my online, video-focused existence at that time.

    My small micro ATX case wasn’t an ideal solution for the thermal needs of the RX5700XT video card, so I transplanted the computer into a more spacious Corsair Carbide Series 100R case.

    And, I added a cool 5.25″ drawer insert to keep my flash drives and other on-hand media.

    Before selling this computer, I swapped out the RX5700XT video card for an MSI AMD RX550 and sold the RX5700XT at a profit, thanks to the beginnings of the video card shortage brought on by the pandemic-era cryptomining boom.

    2021, early: Lenovo IdeaCentre 5 with Ryzen 4700G

    Even though the RX5700XT video card was great, I ran into some cases where GPU-based processing workflows produced results that I wasn’t happy with. I didn’t want to change software, so I figured the easier solution was to shift to tried-and-true CPU-focused workflows on a processor with more horsepower than the i7-7700. I opted for the least expensive Ryzen 7 system that I could find: a Lenovo IdeaCentre 5 with a Ryzen 4700G. It was easy to modify and make strategic upgrades to for my needs. I wrote about purchasing this system on sale and upgrading its CPU cooler here, and then improving its CPU cooling a few months later here.

    2021, late: Asus ROG G15DK with Ryzen 7 5800X

    Asus ROG G15DK desktop computer with side window revealing its interior components.

    While I enjoyed the Lenovo IdeaCentre 5, I began seeing new 3D demos and games released that I was interested in checking out. The Lenovo’s big shortcoming was its proprietary power supply. If I had been able to swap it for a more powerful one, I could have added a video card and made the upgrade. Unfortunately, there are tales across the Internet of a mismatched PSU or adapter killing someone’s Lenovo desktop. Therefore, I began looking for a good deal on a complete system with a similar 8-core/16-thread CPU and a beefy video card. Granted, this was at the height of the video card shortage, so I remained patient, studied the market, and waited to pounce when the right deal appeared.

    The Asus ROG G15DK came with a motherboard similarly specced to an Asus PRIME B550M-K, an AMD Ryzen 7 5800X 8-core/16-thread CPU, 16GB of RAM, a 512GB NVMe boot drive, WiFi (occupying the second NVMe slot), and an NVIDIA RTX 3070 8GB video card. I swapped the Asus’s 16GB of RAM for the 32GB from the Lenovo, pulled out the WiFi card to free up the second NVMe slot, and ripped out the disco lighting that was pre-installed in the case.

    Then, the next big upgrade that I made was swapping the inadequate three-heatpipe cooling tower supplied by Asus for a five-heatpipe Noctua NH-U9S, to which I added a second fan for a push-pull configuration.

    Later, I transplanted the computer into a less flashy case without a glass side panel–the Thermaltake Versa H17.

    As DDR4 RAM prices improved, I upgraded from 32GB to 64GB to 128GB. And, as SSD prices plummeted, I upgraded the system drive from the 512GB NVMe drive to a 2TB Samsung 970 EVO Plus, as I described earlier here.

    2023: Current Form with NVIDIA RTX A6000

    As I wrote here, the most significant upgrade to my computer–or any computer that I have ever owned for that matter–has been the NVIDIA RTX A6000 video card for AI and machine learning work that I am doing now.

    2023: Free i7-6700K Bonus System

    In early 2023, someone in my apartment building left this computer in the lobby with a Post-it note that said, “Works! No HD.” I didn’t look a gift horse in the mouth! I carried it up the four flights of stairs and got to work cleaning it up and checking it out. It had an i7-6700K CPU, 16GB of DDR4-3000 XMP RAM, and an EVGA GeForce RTX 2070 8GB video card on a Gigabyte GA-Z170X-Gaming 5 motherboard. I installed a spare SSD and HDD in it, ran memtest86+, and stress tested the still-impressive EVGA GeForce RTX 2070. Everything checked out! I sold the RTX 2070 on eBay to help pay for the A6000 video card in my primary system. And, I kept this computer to serve as a media center PC (the built-in graphics work great after making the fix for screen tearing found here). Thank you to whoever gave away such a wonderful machine!

    Reflections

    As Ovid shows us, things change form and function and purpose. This is very true in my experience of computers. I would have liked to have held on to my computers longer–changing them further through upgrades and reconfigurations. However, I always thought at the time that I had a good reason to do the things that I did–sell one computer to help pay for a new one, or switch from a larger computer to a smaller one (or vice versa). Nevertheless, I can see that sometimes my reasons might have been motivated more by a desire for change, that perhaps using or learning a new computer might move me forward in my work or curiosity or explorations. I don’t think that’s always been the end result, but it might have played a part in the musical chairs of my computing life.

    Another thing that I’ve noticed looking at these photos is how sloppy I have been with cable management. Perhaps this is a manifestation of other aspects of my life: a hurry to use rather than perfect the tools of my work, and a worry that too much tweaking of something that is already operational doesn’t bode well for future stability. I admit that I am nervous when working on computers because of problems with some of my earliest machines, some brought on by me and others instigated by other people. The money that I put into my computers is a lot for my meager salary in an extremely high cost of living environment. Every metamorphosing change that I’ve documented in this post cost me in dollars and time and energy, the latter two involving studying, considering, weighing options, etc. You can ask Y: I don’t rush into things that I buy for myself. I have to know that I’m making the best possible decision at that moment, after crushing days and weeks of self-doubt and second guessing.

    But, as you can see, I’ve had some adventures building, tinkering, and upgrading computers with this post showing the most recent 10 years or so. I’ll work on another post showing some of my earlier computers, but unfortunately, the record is not nearly as complete due to my not taking as many photos back then as I try to do now. When I do, I’ll write about my Amiga 2000HD, 486DX2/66, Powerbook 145B, PowerMacintosh 8500, Blue and White G3, Dual G4, and more. Stay tuned!

  • Buy Used Laptops to Save Money, Obtain Older High Tier Gear, and Reduce Ewaste

    Since Y and I moved to Brooklyn, we’ve focused our XP grind on budgeting, scrounging, and saving.

    At the core of making ends meet is identifying those things that are negotiable and those that are not. For example, coffee is negotiable for me. I don’t have to have a $4 Starbucks every day. Instead, I’m happy to get my caffeine fix from discount coffee brewed at home and carried in an efficient Zojirushi thermos. The A6000 video card that I use in my desktop computer is not negotiable. It’s an investment in my work that should pay dividends in the future.

    While my desktop computer fits into my non-negotiable category, my laptop computer, which I carry to work daily and use for remote work and classroom instruction, is negotiable. My only requirements for a laptop are that it is stable, has a good keyboard and trackpad, and weighs 3 pounds or less. Certainly, there are many new laptops that fit this bill, but there are many used laptops that fit it, too. In fact, a used, well-cared-for laptop can have a powerful feature set, albeit a few generations old, that holds its own against today’s computing rigors. This means that a used laptop with high-tier features might cost a fraction of what it cost new. Furthermore, getting additional life out of a used laptop keeps it from winding up as ewaste too soon, which is a bonus for the environment and for our collective health, both of which are affected by ewaste and the industrial burden of processing it.

    My First Used Laptop: ThinkPad X230

    I purchased this ThinkPad X230 on eBay in 2018 (and wrote a review of it here). It looked and worked as if it were brand new. I used it for my remote work, classroom instruction, and travel until early 2020 (just before the pandemic began). By that point, it felt like it was getting long in the tooth for some of my software (e.g., Wolfram Mathematica), so I was thinking about selling it. One day, my colleague Aaron Barlow saw me using it at City Tech and asked me to let him know if I heard of any similar machines available for sale. I offered him this one, which he bought a week later after I had wiped the drive and reinstalled Windows 10 for him. He got some use out of it for his writing, and his partner continued using it after he passed away.

    My Second Used Laptop: Lenovo ThinkPad X270

    My Lenovo ThinkPad X270 open to the Debian 12 xfce desktop.

    During the long at-home stretch of the pandemic, I didn’t rely on a laptop; I just used my desktop for work and remote instruction, and I read on my Microsoft Surface Go tablet. Then, when it looked like things would be opening up again, I got a Lenovo ThinkPad X270 from a seller on eBay in December 2021. It was slightly lighter and slimmer than the X230 I had before. Also, my computing needs had changed, so I ran Linux Mint on it from the beginning (though I recently switched to Debian 12 Bookworm on it and my desktop). It was also easy to upgrade it to a 1TB NVMe drive and 16GB of DDR4 RAM.

    NB: After upgrading your computer’s RAM, remember to run a full diagnostic test with memtest86+. Being in a rush, I installed the 16GB RAM module and went directly to work. I then occasionally experienced random errors and reboots. I should have tested the RAM before using the laptop for work. Once I identified the faulty RAM as the cause, I was able to exchange it for a new module that passed memtest86+ successfully.

    Lenovo Thinkpad X270 disassembled on my desk.

    How to Find Your Own Top-Tier Used Laptop

    • Spend time identifying your non-negotiable and negotiable features for a laptop. Think about how and where you use the laptop. If power outlets are at a premium where you are, or you simply don’t want to lug around an AC adapter, you will want to prioritize battery capacity. Or, you might need more computing horsepower and have easy access to power outlets, so a speedier model with less battery capacity might be okay for you. Another important consideration is video output (HDMI, mini HDMI, USB-C + dongle/adapter, etc.). I would suggest writing these lists down in two columns so that you don’t overlook a non-negotiable feature or miss a negotiable feature that would be nice to have (I sketch one way of turning this checklist into code after this list).
    • With your non-negotiable and negotiable lists in hand, look through Wikipedia, Google searches, and computer seller websites to get a sense of what laptops were available several generations back. With model numbers, you can also search Google, Reddit, and other social media for reviews. You want to be careful to avoid error-prone models (e.g., a model that was known to have problem X).
    • While there are deals to be found on Craigslist or Facebook Marketplace, there is more risk purchasing from someone through those services than eBay. The longstanding online auction house has several features built-in to help protect us buyers. First, buyers and sellers rely on the feedback system. You can see what a seller’s feedback is like (switch to their seller feedback to get the best picture of what matters to you as a buyer), and you can see other metrics about what other buyers thought of the seller’s communication, speed to ship, etc. Second, eBay offers buyer protection through their “eBay Money Back Guarantee.” Third, many (but not all) sellers offer returns on the items that they sell. However, you will want to read their terms and conditions carefully before bidding or purchasing an item. And, that is also a general rule: If you have a question about a product, you should message the seller before bidding or purchasing the item.
    • Study listings carefully. While you are looking at all of these listings, spend time studying the photos and descriptions. If a seller is too lazy to write a description of the item for sale, I pass. Similarly, if a seller takes too few or blurry photos, I pass, too. If a seller says that the item being sold is similar to but not the item pictured, I pass. If a listing has one of these issues but you are still interested in the item, that’s the time to message the seller for more details. You can ask for more photos or a fuller description of the item. If the seller responds to your inquiry, that is a good sign, but if they don’t, you should pass.
    • Be patient. Finding a good deal that meets your non-negotiable parameters usually doesn’t happen right away. You need to educate yourself about the going prices for the particular hardware that you are looking for. On eBay, you can do this by filtering your searches to “Sold Items.” This shows you what others are paying for similar items and gives you a benchmark for spotting a deal that falls below the going price.
    • When you get your new, used laptop, feel good about saving some money, getting solidly capable computing equipment, and keeping a computer from joining the ewaste environmental catastrophe earlier than its time.
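
    For what it’s worth, here is a small Python sketch of the two-column checklist idea from the first item above. The feature names and the simple pass/bonus scoring are purely illustrative placeholders rather than a system I actually use; the point is that writing the lists down, whether in a notebook or in code, keeps you honest when a tempting listing is missing a non-negotiable.

      # A hypothetical sketch of the two-column laptop checklist as code.
      # The feature names below are illustrative placeholders, not a definitive list.
      NON_NEGOTIABLE = {"stable", "good keyboard and trackpad",
                        "3 pounds or less", "HDMI or USB-C video output"}
      NICE_TO_HAVE = {"16GB RAM", "1TB NVMe", "long battery life", "USB-C charging"}

      def evaluate(listing_features: set[str]) -> tuple[bool, int]:
          """Return (meets every non-negotiable, count of nice-to-haves matched)."""
          meets_all = NON_NEGOTIABLE.issubset(listing_features)
          bonus = len(NICE_TO_HAVE & listing_features)
          return meets_all, bonus

      # Example: the features gleaned from a hypothetical eBay listing.
      candidate = {"stable", "good keyboard and trackpad", "3 pounds or less",
                   "HDMI or USB-C video output", "16GB RAM"}
      print(evaluate(candidate))  # (True, 1) -> worth watching
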
  • Mirrored Moment of Computing Creation: KPT Bryce for Macintosh

    Outer space scene rendered in KPT Bryce 1.0.1 on Mac OS 7.5.5.

    A conversation on LinkedIn yesterday with a former Professional and Technical Writing student about user experience (UX) and generative artificial intelligence (AI) technologies reminded me of the UX innovations around an earlier exciting period of potential for computers creating art: KPT Bryce, a three-dimensional fractal landscape modeling and ray-tracing program released for Mac OS in 1994. It was one of the first programs that I purchased for my PowerMacintosh 8500/120 (I wrote about donating a similar machine to the Georgia Tech Library’s RetroTech Lab in 2014 here). Much like today when I think about generative AI, my younger self thought that the future had arrived, because my computer could create art with only a modicum of input from me, thanks to this new software that brought together 3D modeling, ray tracing, fractal mathematics, and a killer user interface (UI).

    Besides KPT Bryce’s ability to render scenes like the one that I made for this post (above), what was great about it was its user interface, which made editing and configuring your scene before rendering intuitive and easy to conceptualize. As you might imagine, 3D rendering software in the mid-1990s was far less intuitive than today (e.g., I remember a college classmate spending hours tweaking a text-based description of a scene that would then take hours to render in POV-Ray in 1995), so KPT Bryce’s ease of use broke down barriers to 3D rendering software and opened new possibilities for average computer users to leverage their computers for visual content creation. It was a functionality and UX revolution.

    Below, I am including some screenshots of KPT Bryce 1.0.1 emulated on an installation of Mac OS 7.5.5 on SheepShaver (N.B. I am not running SheepShaver on BeOS–I’ve modified my Debian 12 Bookworm xfce installation to have the look-and-feel of BeOS/Haiku as I documented here).

    KPT Bryce 1.0 program folder copied to the computer’s hard drive from the KPT Bryce CD-ROM.
    KPT Bryce 1.0 launch screen.
    KPT Bryce initial scene randomizer/chooser. Note the UI elements on the lower window border.
    KPT Bryce’s scene editor opens after making initial selections.
    KPT Bryce’s rendering screen. Note the horizontal dotted yellow line indicating the progression of that iterative ray tracing pass on the scene.
    KPT Bryce rendering completed. It can be saved as an image by clicking on File > Save As Pict.

  • All In on Artificial Intelligence

    An anthropomorphic cat wearing coveralls, working with advanced computers. Image generated with Stable Diffusion.

    As I wrote recently about my summertime studying and documented on my generative artificial intelligence (AI) bibliography, I am learning all that I can about AI: how it’s made, how we should critique it, how we can use it, and how we can teach with it. As with any new technology, the more that we know about it, the better equipped we are to master it and debate it in the public sphere. I don’t think that fear and ignorance about a new technology are good positions to take.

    I see AI, as many others do, as an inevitable step forward in how we use computers and what we can do with them. However, I don’t think that these technologies should be solely under the purview of big companies and their (predominantly) man-child leaders. Having more money and market control does not mean one is a more ethical practitioner of AI. In fact, it seems that some industry leaders are calling for more governmental oversight and regulation not because they have real worries about AI’s future development, but because they hold a leadership position in the field and can likely shape how the industry is regulated through their connections with would-be regulators (i.e., the revolving door between industry and government seen in other regulatory agencies).

    Of course, having no money or market control in AI does not mean one is potentially more ethical with AI either. But, ensuring that there are open, transparent, and democratic AI technologies creates the potential for a less skewed playing field. While there’s the potential for abuse of these technologies, having these available to all creates the possibility for many others to use AI for good. Additionally, if we were to keep AI behind locked doors, only those with access (legally or not) will control the technology, and there’s nothing to stop other countries and good/bad actors in those countries from using AI however they see fit–for good or ill.

    To play my own small role in studying AI, using generative AI, and teaching about AI, I wanted to build my own machine learning-capable workstation. Before I made any upgrades, I spent the past few months maxing out what I could do with an Asus Dual RTX 3070 8GB graphics card and 64GB of RAM. I experimented primarily with Stable Diffusion image generation models using Automatic1111’s stable-diffusion-webui and LLaMA text generation models using Georgi Gerganov’s llama.cpp. An 8GB graphics card like the NVIDIA RTX 3070 provides a lot of horsepower with its 5,888 CUDA cores and high bandwidth to its on-board memory. Unfortunately, that on-board memory is too small for larger models or for adjusting models with multiple LoRAs and the like. For text generation, you can split the model between the graphics card’s memory and your system’s RAM, but this is inefficient and slow compared to having the entire model loaded in the graphics card’s memory. Therefore, a video card with a significant amount of VRAM is a better solution.
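
    If you want to experiment with this kind of splitting yourself, here is a minimal sketch using the llama-cpp-python bindings to llama.cpp. This is not the exact setup described in this post; the model filename, layer count, and thread count below are placeholders that you would adjust for your own hardware.

      # Minimal sketch of CPU-only vs. GPU-offloaded inference with the
      # llama-cpp-python bindings (pip install llama-cpp-python, built with CUDA).
      # The model filename and numbers below are placeholders, not my actual setup.
      from llama_cpp import Llama

      llm = Llama(
          model_path="models/guanaco-65b.Q4_K_M.gguf",  # placeholder path
          n_gpu_layers=-1,  # 0 = keep everything in system RAM (CPU-only);
                            # a partial count splits layers between VRAM and RAM;
                            # -1 offloads every layer when the card has enough VRAM
          n_ctx=2048,       # context window
          n_threads=16,     # CPU threads for any layers left in system RAM
      )

      output = llm(
          "Define science fiction for a middle-school audience.",
          max_tokens=256,
      )
      print(output["choices"][0]["text"])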

    Previous interior of my desktop computer with air cooling, 128GB RAM, and Asus Dual Geforce RTX 3070 8GB graphics card.

    For my machine learning-focused upgrade, I first swapped out my system RAM for 128GB of DDR4-3200 (4 x 32GB Corsair, shown above). This allowed me to load a 65B-parameter model into system RAM, with my Ryzen 7 5800X 8-core/16-thread CPU performing the operations. The CPU usage while llama.cpp is processing tokens looks like an EEG:

    CPU and memory graphs show high activity during AI inference.

    While running inference on the CPU was certainly useful for my initial experimentation, and the CPU usage graph looks cool, it was exceedingly slow. Even an 8-core/16-thread CPU is ill-suited for AI inference, partly because it lacks the massive parallelism of graphics processing units (GPUs), but perhaps more importantly because of the system memory bottleneck, which is only 25.6 GB/s for DDR4-3200 RAM according to Transcend.
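
    As a rough sanity check on that figure, here is the back-of-the-envelope arithmetic. This is my own sketch: real-world throughput is lower than the theoretical number, and a dual-channel configuration roughly doubles the single-module figure.

      # DDR4-3200 theoretical bandwidth, per module/channel:
      # 3,200 million transfers per second x 8 bytes (64-bit bus) per transfer.
      transfers_per_second = 3200 * 10**6
      bytes_per_transfer = 8  # 64-bit memory bus
      bandwidth_gb_s = transfers_per_second * bytes_per_transfer / 10**9
      print(f"DDR4-3200 per channel: {bandwidth_gb_s:.1f} GB/s")  # 25.6 GB/s

      # Generating each token streams the model's weights through the CPU, so
      # memory bandwidth sets a rough floor on the time per token. Generously
      # assuming dual-channel RAM (~51.2 GB/s) and the ~48GB model size
      # discussed below:
      model_size_gb = 48
      print(f"~{model_size_gb / (2 * bandwidth_gb_s):.2f} seconds per token, at best")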

    Video cards, especially those designed by NVIDIA, provide specialized parallel computing capabilities and enormous memory bandwidth between the GPU and its video RAM (VRAM). NVIDIA’s CUDA is a very mature system for parallel processing that has been widely accepted as the gold standard for machine learning (ML) and AI development. CUDA is, unfortunately, closed source, but many open source projects have adopted it due to its dominance within the industry.

    My primary objective when choosing a new video card was that it had enough VRAM to load a 65B LLaMA model (roughly 48GB). One option for doing this is to install two NVIDIA RTX 3090 or 4090 video cards with each having 24GB of VRAM for a total of 48GB. This would solve my needs for running text generation models, but it would limit how I could use image generation models, which can’t be split between multiple video cards without a significant performance hit (if at all). So, a single card with 48GB of VRAM would be ideal for my use case. Three options that I considered were the Quadro 8000, A40, and RTX A6000 Ampere. The Quadro 8000 used three-generation-old Turing architecture, while the A40 and RTX A6000 used two-generation-old Ampere architecture (the latest Ada architecture was outside of my price range). The Quadro 8000 has memory bandwidth of 672 GB/s while the A40 has 696 GB/s and the A6000 has 768 GB/s. Also, the Quadro 8000 has far fewer CUDA cores than the other two cards: 4,608 vs. 10,572 (A40) and 10,752 (A6000). Considering the specs, the A6000 was the better graphics card, but the A40 was a close second. However, the A40, even found for a discount, would require a DIY forced-blower system, because it is designed to be used in rack mounted servers with their own forced air cooling systems. 3D printed solutions that mate fans to the end of an A40 are available on eBay, or one could rig something DIY. But, for my purposes, I wanted a good card with its own cooling solution and a warranty, so I went with the A6000 shown below.

    NVIDIA RTX A6000 video card.

    Another benefit of the A6000 over the gaming performance-oriented 3090 and 4090 graphics cards is that it requires much less power: only 300 watts at load (vs. ~360 watts for the 3090 and 450 watts for the 4090). Even so, I only had a generic 700-watt power supply in the system. I wanted to protect my investment in the A6000 and ensure it had all of the power that it needed, so I opted to go with a recognized name-brand PSU, a Corsair RM1000x. It’s a modular PSU that can provide up to 1,000 watts to the system (it only provides what is needed; it isn’t using 1,000 watts constantly). You can see the A6000 and Corsair PSU installed in my system below.

    new computer setup with 128GB RAM and A6000 graphics card

    Now, instead of waiting 15-30 minutes for a response to a long prompt run on my CPU and system RAM, it takes mere seconds to load the model into the A6000’s VRAM and generate a response, as shown in the screenshot below of oobabooga’s text-generation-webui using the Guanaco-65B model quantized by TheBloke to provide definitions of science fiction for three different audiences. The tool running in the terminal in the lower right corner is NVIDIA’s System Management Interface, which can be opened by running “nvidia-smi -l 1”.
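
    If you would rather log GPU activity over time than watch it scroll by, nvidia-smi’s CSV query mode can be wrapped in a few lines of Python. The snippet below is just one way to do it; the fields queried and the one-second polling interval are my own choices, not anything required by the tools mentioned above.

      # Sketch: poll nvidia-smi once per second and print VRAM use and GPU load.
      # Uses nvidia-smi's CSV query mode; fields and interval are my own choices.
      import subprocess
      import time

      QUERY = [
          "nvidia-smi",
          "--query-gpu=timestamp,name,memory.used,memory.total,utilization.gpu",
          "--format=csv,noheader",
      ]

      while True:
          result = subprocess.run(QUERY, capture_output=True, text=True, check=True)
          print(result.stdout.strip())  # one CSV line per GPU
          time.sleep(1)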

    Text-generation-webui running on the A6000 video card.

    I’m learning the programming language Python now so that I can better understand the underlying code for how many of these tools and AI algorithms work. If you are interested in getting involved in generative AI technology, I recently wrote about LinkedIn Learning as a good place to get started, but you can also check out the resources in my generative AI bibliography.

  • Summer Studying with LinkedIn Learning

    An anthropomorphic cat taking notes in a lecture hall. Image created with Stable Diffusion.

    I tell my students that I don’t ask them to do anything that I haven’t done or will do myself. A case in point is using the summer months for a learning boost. LinkedIn Learning offers new users a free trial month, which I’m taking advantage of right now.

    While I’ve recommended that students use LinkedIn Learning for free via the NYPL, completion certificates for courses accessed that way don’t include your name and can only be downloaded as PDFs, meaning you can’t easily link course completion to your LinkedIn Profile. Due to these constraints on library patron access to LinkedIn Learning, I opted to try out the paid subscription so that it links to my LinkedIn Profile. However, I wouldn’t let these limitations hold you back from using LinkedIn Learning via the NYPL if that is the best option for you; just be aware that you will need to download your certificates and plan how to record your efforts on your LinkedIn Profile, your resume, and your professional portfolio.

    After a week of studying, I’ve earned certificates for completing Introduction to Responsible AI Algorithm Design, Introduction to Prompt Engineering for Generative AI, and AI Accountability Essential Training. And, I passed the exam for the Career Essentials in Generative AI by Microsoft and LinkedIn Learning Path. I am currently working on the Responsible AI Foundations Learning Path. These courses support the experimentation that I am conducting with generative AI (I will write more about this soon), the research that I am doing into using AI pedagogically and documenting on my generative AI bibliography, and my thinking about how to use AI as a pedagogical tool in a responsible manner.

    For those new to online learning, I would make the following recommendations for learning success:

    1. Simulate a classroom environment for your learning. This means finding a quiet space in which to watch the lectures. Don’t listen to music. Turn off your phone’s notifications. LinkedIn courses are densely packed with tons of information. Getting distracted for a second can mean you miss something vital to the overall lesson.
    2. Have a notebook and pen to take notes. While watching the course, pause it to write down keywords, sketch charts, and commit other important information to your notes. The act of writing notes by hand has been shown to improve your memory and recall of learned information. Don’t take notes by typing, as this is a less information-rich way of learning than writing your notes by hand.
    3. Even though a course lists X hours and minutes to completion, you should budget at least 50% more time on top of that for note taking, studying, quizzes, and exams (for those courses that have them).
    4. While not all courses require you to complete quizzes and exams for a completion certificate, you should still take all of the included quizzes and exams. Research shows that challenging ourselves to recall and apply what we’ve learned via a test helps us remember that information better.
    5. After completing a course, you should add the course certificate to your LinkedIn Profile, post about completing the course (others will give you encouragement and your success might encourage others to learn from the same course that you just completed), add the course certificate to your resume, and think about how you can apply what you’ve learned to further integrate your learning into your professional identity. On this last point, you want to apply what you’ve learned in order to demonstrate your mastery over the material as well as to fully integrate what you’ve learned into your mind and professional practices. This also serves to show others–managers, colleagues, and hiring personnel–that you know the material and can use it to solve problems. For example, you might write a blog post that connects what you’ve learned to other things that you know, or you might revise a project in your portfolio based on what you’ve learned.
    6. Bring what you’ve learned into your classes (if you’re still working toward your degree) and your professional work (part-time job, internship, full-time job, etc.). Learning matters most when you can use what you’ve learned to make things, solve problems, fulfill professional responsibilities, and help others.