About a year and a half ago, I re-entered the realm of PC gaming, buying a high-end NUC and dropping in an NVIDIA card. It has served me well, providing countless hours of enjoyment playing my favorites and some new titles.
But recently I have decided to wade back into the Fallout universe. I played a bunch of Fallout 4 on my Xbox, and wanted to get the same experience on my gaming rig.
Now, I know that the graphics card I have is about 4 years old (tech-wise) and it is about as good as I am going to get for my NUC. It has been a trooper so far, delivering an excellent experience in everyday use.
But Fallout 4 is challenging its suitability.
What am I seeing? If I choose the native resolution of my monitor, a 2K unit (2560×1440), I get a paltry 30 FPS, and the game is mostly unplayable.
Fortunately, if I drop the resolution to HD (1920×1080) I get a solid 60 FPS, and the game is super playable.
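For the curious, a little back-of-the-envelope arithmetic shows why the resolution drop buys so much. This is just a rough sketch, assuming the game is GPU-bound (which, at 1440p on my card, it clearly is); real scaling is never perfectly linear:

```python
# Rough sanity check: on a GPU-bound game, frame rate scales very
# roughly with the inverse of the pixel count being rendered.
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
pixels_1080p = 1920 * 1080   # 2,073,600 pixels

ratio = pixels_1440p / pixels_1080p
print(f"1440p pushes {ratio:.2f}x the pixels of 1080p")  # ~1.78x

# Frame budget: how long the GPU gets to render each frame.
print(f"30 FPS budget: {1000 / 30:.1f} ms per frame")  # ~33.3 ms
print(f"60 FPS budget: {1000 / 60:.1f} ms per frame")  # ~16.7 ms
```

So dropping to 1080p hands the GPU almost 1.8× fewer pixels per frame, which is right in line with the jump from 30 to 60 FPS I am seeing.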
But that has me wondering when I will truly need more horsepower. That will entail building a new PC and acquiring a better graphics card.
Fortunately, the supply-chain constraints and the hogging of GPUs by the pond-scum crypto miners have been alleviated, so when the time comes, I will not have to spend an assload of $$$ to upgrade.
But that is for another day…
I remember back in the 1990s when, every 18 months or so, I would build a new computer to keep up with the rapid changes of technology. ISA bus -> PCI -> AGP (a dedicated graphics bus) -> PCIe, and 386 -> 486 (the 33 MHz, then the DX2 66, then the DX4 100) -> Pentium -> Pentium II, 3dfx Voodoo -> Riva TNT, and literally annual new graphics cards to keep up with the Joneses.
Back then, as 3D became mainstream, all those upgrades were required to reach the magic of 30 frames per second. Any game where I hit that frame rate, I considered kick-ass.
But today? Bah, 30 FPS is practically unplayable.
Yep, we’ve gotten spoiled.