Toxic E-masculinity

Mistaking a Mountain for a Molehill

Or "How I heard about the global semiconductor shortage."

My computer is just over six years old, born in March of 2015.

Computers age faster than large dogs; they are obsolete as soon as they leave the assembly line. Moore's Law still carries some weight, even if CPUs now cope in other ways (multiple cores per CPU, multiple threads per core), and graphics processing power just keeps climbing. Unfortunately, that's not all because of the gamers.

Show me the quan

Enter Bitcoin (and many, many others). Graphics cards turned out to be really good at the kind of math needed to process blockchain transactions. Imagine a computer with hundreds, or even thousands, of graphics cards all churning at full power. One would hope that the enormous amount of energy consumed by such a rig would be turning out the most stunning VR of all time. Pixar-quality graphics in a home video game. Right?

The only "game" here is racing to process blockchain transactions in order to be paid in more blockchain transactions for your trouble.

Once, a decent home office computer could perform these calculations, and often did. Early adopters could easily retire to a private island at the current prices of some of these cryptocurrencies. Now entire data centers packed with GPUs (graphics cards) do the work, effectively preventing the small-timer from ever joining in the "fun" while really helping along that whole global warming thing with their massive power drain.

But that's all background noise to the real drama.

My six-year-old graphics card inexplicably died. I fought against the idea for a long time, eventually even reinstalling my whole system to wipe out the possibility of any bugs from cruft left over from previous versions or even a prior OS clogging up my filesystem.

It's the only way to be sure.

Photo of shocking graphics card abuse: a crypto mining rig.

Photo by Marco Krohn, used under a CC BY-SA 4.0 license.

Easy come, sleazy go

At this point I resigned myself to spending money I did not want to spend on the single most expensive part of any gaming PC. I had been holding onto some money for lots of other things, but all those plans evaporated rather quickly. Ah, well, I'll just hop online and... Sold out? That's weird. Well surely Neweg-- Whaaat?

It was the same story everywhere: every video card from every manufacturer in every legitimate store I could find on the entire Internet was gone. And that was also the day I learned my motherboard does not actually have any integrated video at all, a new experience for me.

With no card I of course had no monitor, and with no monitor I couldn't work. It was Friday by then.

Deadline was Monday.

Long distance relationship

I live about forty miles from "town". That is to say, the nearest point of civilization, if one does not count the Dollar General, is a small burg 20 minutes away that has a few fast food joints, a real grocery store, and a historic downtown district. It's also the county seat. "Town" is an honest-to-goodness city an hour away that has a university, a music scene, and a dedicated foodie culture in the middle of all these cornfields. It's also the only place within a relatively comfortable driving distance that has more than one computer shop.

I proceeded to call all of them.

All I got was "Sorry."

What the hell happened, I eventually demanded. Yes, I know it's a global pandemic, but come on!

What happened was three things: Trump's tariffs, Covid, and stimulus checks.

There were only two video cards available in a city with a six-figure population, they were both the same card, and both places were charging a premium price for a five-year-old card. I took the lower of the two premiums because I had no time to mess around at this point. I had to have the card immediately, and there were no other options within hours of drive time even if I did not have to make it back to my day job.

It turned out my premium purchase was even used. Two of the foil pins were partially scraped away, but not so much that the card failed to function; it worked at least well enough for me to do page layout. I could even game with it in 3D, as long as I turned every setting down as low as it could possibly go, and even then just barely.

The winter of our missed content

The only way to get a real card at that time was to pay at least double MSRP and, for the most expensive part of a computer, that ended up being an incredibly significant amount of money that only increased over time. I found a site with prices that remained, while high, mostly reasonable. The upside was that they claimed to never artificially inflate prices. The downside was backorder, ETA unknown. I put in my credit card information and waited.

During this time the latest expansion for Stellaris, Nemesis, dropped, an update we had been waiting some time for. This was April. I had not played Stellaris, one of my favorite multiplayer games of all time, since the loss of my GPU.

Like the infant I am, I refused to play it in low graphics mode. I want to see the new content pretty! Change my dydee! I want my ba ba!

Axl Rose whistling

Okay, so I waited six months on a preorder for the only reasonably priced Big Navi on Earth, I think, that did not involve purchasing an entire computer to get one. Six frustrating, low-graphics-having months. I did eventually cave and play Stellaris with terrible graphics. As it's not really a graphical game, not like a first-person shooter, it actually was not bad at all, and I felt pretty dumb for having held off. Really, as long as you can clearly read all the text, the stairstepping of empire boundaries is not that big a deal. I didn't have any huge space battles, so I don't know how that would have gone, but I certainly had a couple of large ones that ran pretty okay. The rest of my system is still pretty beefy despite its age, however, and that helps.

Now, the Big Navi (an AMD RX 6900 XT, specifically) is actually a PCIe 4.0 card, and my motherboard is only a 3.0, as it's getting on in years. Maybe you don't know, but your hardware has version numbers just like your software does. It just tends to change less often because it's still challenging to download a motherboard, but I digress. The point is that my computer would actually have held the card back, since a PCIe 4.0 card in a 3.0 slot only gets 3.0 bandwidth. I had to convince myself of the technical truth of this despite knowing full well that when I upgraded the rest of the box, the card would have been right at home.
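
If you want rough numbers for that, here's a back-of-the-envelope sketch (my own, assuming only the standard per-lane transfer rates and 128b/130b encoding) of what a full x16 slot can move per generation:

```python
# Rough theoretical x16 slot bandwidth, assuming the published per-lane
# transfer rates and 128b/130b encoding for PCIe 3.0 and 4.0.
GT_PER_SEC = {"PCIe 3.0": 8.0, "PCIe 4.0": 16.0}  # gigatransfers/sec per lane

for gen, gt in GT_PER_SEC.items():
    gb_per_lane = gt * (128 / 130) / 8  # GB/s per lane after encoding overhead
    print(f"{gen} x16: ~{gb_per_lane * 16:.1f} GB/s")
# Prints roughly 15.8 GB/s for 3.0 and 31.5 GB/s for 4.0.
```

In practice the real-world gaming penalty is usually small, but "half the bandwidth" is a hard number to unsee.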

Commoditized casual racism

I hesitate to use the term "scalper" as it has an unfortunate history, but that's the lingua franca of modern capitalism. Basically, you buy up just anything you can that seems like it might be popular and then charge extortionate prices for those things. "Mmm, good business sense on little Bobby depriving hardworking people of the opportunity to shop for desirable items at MSRP!"

The MSRP of my desired card was actually supposed to be $999 at launch. The price I signed up to pay was a low, low $1500 that crept north of $1600 while I waited. The price set by the very good businesspeople on eBay was around $3000, or maybe a mere $2500 with $500 shipping in the smaller font with the lower contrast. So if I waited indefinitely, I could get the card of my dreams for a mere 160% of its actual price! Yay, capitalism!

I continued to pursue my goal of AMD graphics, as AMD actually has open source drivers that are native to Linux and so will not "taint the kernel" as the nVidia modules do. (If you check dmesg, you can see this notice; if you want to check your own kernel, there's a tiny sketch after the next paragraph. What it means is that the nVidia proprietary kernel modules are non-free code in an otherwise free space. So, you know: bad.) By looking at older cards instead, I could get one for a much lower grossly inflated price. I'm embarrassed to say how much I actually spent on a card nearly as aged as my computer, but it was far less than the Big Navi and I didn't have to continue waiting on "ETA Unknown".

Instead I only waited for shipping from the other side of the world.
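
Here's that tiny sketch I promised (my own, nothing to do with any driver's tooling): the kernel exposes its taint status as a bitmask, and bit 0 is the "P" flag that gets set when a proprietary module loads.

```python
# Minimal sketch: decode /proc/sys/kernel/tainted (Linux only). Bit 0 ("P")
# means a proprietary (non-free) module, such as nVidia's, has been loaded.
with open("/proc/sys/kernel/tainted") as f:
    mask = int(f.read().strip())

if mask == 0:
    print("kernel is not tainted")
elif mask & 1:
    print("kernel tainted by a proprietary (non-free) module")
else:
    print(f"kernel tainted for other reasons (mask {mask:#x})")
```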

Six and a half months after losing my video card, my video games are pretty again.

Mmm, look at that sumptuous water and the light blooms at the base of the turbines. The tall buildings at the horizon, blurred by distance, form a lovely backdrop to this shot of the river's edge. At least the pollution is downstream.

It may be difficult to pick out in this image, but the depth-of-field effects really thrill me in this game. The building nearest the camera on the far right is blurred due to its proximity and angle. I can assure you the aliasing in this lower-res image is far more extreme than that of the actual game.

Size comparisons

So, somewhat crudely (where "somewhat" translates to "totally"), I explained to my sons that the graphics card is the computer's dick.

This wasn't so much an effort on my part to further cement the juvenile ideas about manhood and masculinity that get bandied about in high school, but rather to leverage that attitude in order to instantly clarify the situation. My beloved computer had had its manhood torn out. Not really the thing that makes a computer a computer, any more than having a penis makes you a man, but rather the sense of primal potency it represents to a person raised in a society totally obsessed with phalli but too deeply frightened to ever actually see one.

Monuments and monoliths and monarchs' scepters all thrusting up into the sky to represent the mighty, tumescent power of the obviously fair laws of the land.

Full frontal male nudity in a Hollywood film, though? The horror!

Measuring up

Even the most basic integrated graphics available today are absolutely astounding by the standards of yesteryear. Modern computing power is actually shocking to me because I remember a time when you loaded programs off of an audio cassette tape and when listening to your favorite song on the 8-track was really difficult because you could only fast forward. The level of visuals I was lamenting from my slightly overpriced temporary-fix graphics card is actually outstanding, unthinkable even only a few years prior to its original release date.

However, one must understand the level of downgrade I am talking about, best illustrated by the "Simba stepping in Mufasa's pawprint" image below:

An nVidia GeForce GT 710 dwarfed by the cutout in the foam for the Radeon RX 570 8G, over twice as wide and twice as long.

In the footsteps of giants

That is an image of the temporary fix nVidia card, a GeForce GT 710, sitting in the box for a Gigabyte RX 570 8G. This is what I intended my crude comparison to embody for my boys in a single phrase: It was like going from Ron Jeremy to Michelangelo's David, from Stonehenge to Spinal Tap.

The physical size difference here is very much an appropriate measure of power. Despite computer technology becoming ever more minuscule, these GPUs don't really shrink much. In fact, they tend to grow!

More and more processing cores and memory are packed into these units, and the power they burn through produces massive amounts of heat as a side effect. To keep things from catching fire, cooling fans pull that heat off the board. Note that the stand-in, above, has one fan. The new card has two. The newest card, the one going for triple MSRP that I waited six months for before settling on a lower-powered card more befitting the age of my system? Three fans.

The angles of the photographs do not show it well, but the older card has eight pins for power connection and the Big Navi has sixteen. Where the RX 570 can potentially use up to 420 watts, the Big Navi can draw up to 100 more.

Yak loin

Again, miniaturization continues apace, yet all this active male energy just keeps growing bigger and hotter (and harder?). You can buy an almost ridiculously tiny computer to sit on your desktop with more power than you can possibly imagine (it's powered by dead Jedi), but you will find that gaming geeks tend to keep having bigass boxen taking up far too much space in their desk area. The reason is that you need a whole lotta case to contain all that luscious.

Nano PCs look at high power GPUs the same way women look at the aforementioned performer in erotic theater: Oh, Hell no! You are not putting all of that in here!

Gigabyte Radeon RX 570 8G card turned to show two cooling fans, pictured next to its box.
AMD Radeon RX 6900 XT card turned to show three cooling fans.

Afterglow

I made a lot of references both overt and oblique in this little article. If you grokked them all, congratulations: you probably have a bad back and carpal tunnel. Give yourself a pat on the back and a Tylenol.