A few years later, around 2004/2005, I dug that same beige computer out of the closet, bought some extra RAM (I think 256MB was the most it could fit), and used it to host a private Lineage 2 server, which is how I got into databases and software development in the first place. With a whole bunch of tuning I could run ~50 people concurrently on that machine without terrible lag.
Eventually enough people donated that I could upgrade to a newly released Athlon X2 stuffed into a rackmount case, which I sent to a colo.
PCs were evolving so fast at the time that half my paychecks were spent on discounted upgrades before I ever saw the paper. EDO RAM? Sign me up. 512K of pipelined L2 cache? Yes please. HX chipset? Of course. Dual-socket Pentium Pros? I need a raise.
I remember in the mid-to-late '90s, you could build a computer for someone and walk away with enough for an upgraded system for yourself. Of course the churn on performance was very real. IIRC, 1992 maxed out with a 486 DX2 @ 66MHz. Around 2000 we crossed the 1GHz mark from both Intel and AMD. We went from OG Doom, which couldn't cut it full screen, to Half-Life and Quake 3 Arena on Voodoo 3 and early Nvidia cards.
That stopped being effective sometime before 2010. Instead I'd recommend buying a decent enough machine and sticking a graphics card in it.
4th-gen Core was the longest I'd held onto a single PC (close to 5 years total on a 4790K); I did a mid-cycle GPU and NVMe upgrade and that was it. Since then I've bumped to a 3950X, then a 5950X, and now a 9950X. AM4 is really the first socket in a long time where I'd done an in-place CPU upgrade: my daughter went from a Ryzen 2400 to a 5000 series, and my own build went 3600 -> 3950X -> 5950X (the 3600 was a placeholder, since I couldn't get a 3950X for a few months).
I couldn't even name half the CPUs I ran from 1998 to 2005 or so... it was such a blur of upgrades every 6-12 months. I'd upgrade my computer, my wife's, my son's... etc. Then things just completely stagnated. I mean, there's been progress, but it comes over the course of years now, not the 2-3x jumps in under a year we used to see.
Then of course there was the huge "replace everything with SSDs ASAP" performance bump, but between the later Core generations and the M1, everything felt incremental. Nothing like the "Wolfenstein 3D to Quake Glide in 5 years" era.
Holy shit it was only 5 years - the M1 was released 6 years ago!
(Fresh out of college while the dot-com crash was still in effect, I briefly took a job at a local phone company. Their primary income was from people who were still paying 1996-ish prices for T1 lines, at hundreds and hundreds of dollars a month. Meanwhile I would go home to my cable modem, which was about 4 times faster for ~$50/month. Now, technically, the T1s were dedicated bandwidth and of course my cable modem was shared, but it was still a terrible deal for them. And they weren't even getting subsidized computers out of it!)
FWIW, I haven't been to the Phoenix Microcenter yet, mostly because I'm afraid of how much I might otherwise spend there.
A few years later my mom finally let us get one with buffer underrun protection (and some multiplier on the write speed) so I could make mix CDs with music off Napster for my girlfriend and life was good.
She still listens to it when working.
I didn't realize until right now that it had no relationship to Hewlett Packard. I guess I always assumed that it was HP's budget line.
Ugh, I despised dealing with that gear.
Now add the money they were making on those mandatory dialup subscriptions and you had a money printer.
I only remember a couple of CompUSAs, Circuit City, and Best Buy selling computers growing up. I don't remember visiting any independent computer stores in the mid '90s.
But talking to those in my parents' generation, most of them bought their computers from some local small shop (and sometimes went back there for computer training!).
I count St. Louis lucky for at least having a Micro Center today, otherwise all my parts would have to come from online stores.