
Rubenerd: Weird Al had 100 gigs of RAM

source link: https://rubenerd.com/weird-al-had-100-gb-of-ram/


By Ruben Schade in s/Singapore/Sydney/. 🌻

I’m old enough to remember when Weird Al Yankovic’s It’s All About the Pentiums first came out. Every few years I like to listen back and see how much more has changed from those heady days of the late 1990s!

Today I gravitated to these two lines:

Defraggin’ my hard drive for thrills;
I got me a hundred gigabytes of RAM

Defragmentation isn’t necessary on an SSD, and if anything it adds wear for no benefit. But I do miss those animations.

The bigger observation there is memory. A hundred gigs seemed ridiculous and unobtainable at the time. Consumer-level machines still measured memory in megabytes, and people had memories of a decade prior when this was kilobytes, or even less. I still remember a kid at my school being amazed that my Commodore 16 from eBay didn’t have 16 MiB of memory.

But memory has felt like an exception to Moore’s Law for a while, at least in practice. While many of the song’s bombastic lines have long been superseded, most people still don’t have 100 GiB of memory in 2022, more than two decades later. In my experience people are rarely running with more than 16 GiB, or roughly a sixth of that boast.

I think that’s interesting, and makes me wonder why.

The first, and most obvious reason, was that the song was supposed to be a bit silly! Earnestly analysing satire puts me right back into high school extension English class, with all the unsubstantiated certainty that comes from saying that “closed curtains represent the passage of time” (my personal favourite).

And yet, plenty of the song’s other boasts have long since been met or surpassed, including T1 network connectivity, monitor sizes, 32-bit binaries, sending faxes, and Y2K compliance. Why not memory?

The easiest, and most widely-accepted, advice for people complaining of slow performance at the time was to add more memory. Swapping memory to IDE or early SCSI hard drives was a miserable experience, even if you had a fancy 10,000 RPM Caviar device. Heck, I had a spreadsheet at a part-time job in high school that was so massive, it ground my Vaio laptop to dust. These days we have SSDs with excellent random access performance, and fat buses with lower latency, to make swap a bit less frustrating. OSs like FreeBSD, and macOS with OpenZFS, also have the ARC to keep frequently-accessed data in memory before falling back to slower storage.
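For the curious, here’s a minimal Python sketch using the psutil library to see how much memory and swap a machine is actually using. The GiB conversion and output format are just illustrative choices of mine, not anything from the song or from specific tooling mentioned here:

    # Minimal sketch: report physical memory and swap usage in GiB,
    # using the cross-platform psutil library (pip install psutil).
    import psutil

    def gib(n_bytes):
        # Convert bytes to GiB for readability.
        return n_bytes / (1024 ** 3)

    mem = psutil.virtual_memory()
    swap = psutil.swap_memory()

    print(f"RAM:  {gib(mem.used):.1f} of {gib(mem.total):.1f} GiB in use ({mem.percent}%)")
    print(f"Swap: {gib(swap.used):.1f} of {gib(swap.total):.1f} GiB in use ({swap.percent}%)")

If the swap line is anything other than near zero while you’re just browsing, that’s the modern, quieter version of the grinding hard drive we used to hear.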

But we also have new pressures. People are running hundreds of tabs in their browsers, which have become de facto operating systems in their own right. A decade or more of efficiency gains have been eaten by the likes of Electron applications. Fire up a few conferencing or chat applications, and that’s much of your memory gone.

The other issue is the trend towards consolidation. Manufacturers like Apple are packaging memory directly alongside their CPUs to increase performance and energy efficiency, which also removes the ability to upgrade. It’s not new behaviour from Cupertino, but I worry about the signal it sends to an industry that so regularly copies what Apple does while pretending to criticise it.

Memory keeps getting faster, and our software keeps soaking it up. Yet take a glance at OEM and system integrator websites, and most machines are still shipping with 8 GiB standard. I had this on my mid-tier MacBooks and ThinkPads a decade ago.

The general public isn’t asking for a hundred gigs, but I’d love to see the baseline rise up a bit. It doesn’t feel like we’ve budged meaningfully here for years. Or is that just me?

