Careful when upgrading GPUs in old(er) systems

Four years ago I bought an Alienware gaming PC. It has an i7 processor, 64GB of RAM, and it came with (2) Nvidia TitanX Pascal GPUs. It was (at the time) the perfect rig for working with DAZ Studio.

Last year, shortly after COVID started, I upgraded the original TitanX Pascals to (2) 2080 Tis. After a few months of instability and other wonkiness that I now attribute to Nvidia driver issues, late last year I was finally able to replace the TitanX Pascals with the 2080 Tis permanently, and I'd been running that setup for the past 4 or 5 months without any problems.

Then, sometime in late January, I think Nvidia tweaked something in the driver, because the computer started shutting itself down without warning as soon as Iray kicked in, any time I tried to render a reasonably complex or otherwise resource-intensive scene (lots of characters, textures, special effects, etc.), whether as a preview or as a final render.

I knew (and know) what was happening: the power supply is tripping because its overcurrent protection is kicking in.

The question, though, is why, and why only starting in January, after several months of things being fine?

So now I’m looking at spec’ing and building out another computer, this time spending almost as much as a car would cost on an AMD Ryzen system with 128GB of RAM and (3) 3090 GPUs. I’m hoping I can get at least another 5 or more years out of the rig before I’ll need to replace it. At least it’s not as much as a Mac!

Still, I was hoping to get another year or more out of my current rig.

So I think what I’m going to do is put the original Titan Xps back in the machine and donate it after I wipe the drive.

Which means I’m going to have (2) 2080 Tis that I’ll probably sell here pretty soon. But with the 3090s out now, I don’t think anybody is even selling the 2080s any longer.

The purpose of this post is to caution you to be careful when attempting to upgrade GPUs in your older systems! These newer GPUs draw a LOT of current, and if your power supply isn’t spec’d for the components you’re now running, you are more than likely going to have problems.
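If it helps anyone, here’s a rough sketch (in Python) of the kind of back-of-the-envelope power budget I mean. The wattage figures below are approximate published TDPs and rough allowances, not measured numbers from my system, and the 80% headroom figure is just a common rule of thumb; plug in your own parts and the rating printed on your PSU.

```python
# Rough power-budget sanity check before dropping new GPUs into an old system.
# All wattages are approximate TDPs / allowances (assumptions, not measurements);
# check your own components and PSU label before trusting the result.

COMPONENT_TDP_WATTS = {
    "CPU (high-end desktop i7)": 140,        # approximate
    "Motherboard, RAM, drives, fans": 100,   # rough allowance
    "GPU 1 (RTX 2080 Ti)": 260,              # reference/FE TDP; partner cards can draw more
    "GPU 2 (RTX 2080 Ti)": 260,
}

PSU_RATED_WATTS = 850      # example rating; use whatever is printed on your PSU
HEADROOM_FACTOR = 0.80     # rule of thumb: keep sustained load under ~80% of the rating

total_draw = sum(COMPONENT_TDP_WATTS.values())
safe_limit = PSU_RATED_WATTS * HEADROOM_FACTOR

print(f"Estimated sustained draw: {total_draw} W")
print(f"Comfortable limit for a {PSU_RATED_WATTS} W PSU: {safe_limit:.0f} W")

if total_draw > safe_limit:
    print("Over budget: expect overcurrent protection trips under heavy Iray loads.")
else:
    print("Within budget on paper, but transient GPU spikes can still trip protection.")
```

Even when the totals look fine on paper, modern cards can spike well above their TDP for brief moments, which is exactly the sort of thing that trips overcurrent protection mid-render.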

I can’t upgrade the power supply in this Alienware rig, so my only real option is to spec out and build a new system, and if I’m going to do it, I might as well make it as future-proof as I can.

It would be cheaper for me to buy the parts myself off Newegg or Amazon, for sure, but I just turned 49, and while I used to be really into building my own PCs 20 years ago, today none of that holds any kind of appeal for me, I’m afraid. I want to unbox a device, plug it in, and have it “just work”.

And frankly, this will probably be my last PC purchase for 3D rendering if I’m being honest.

Until then, I’ve now got one 2080 disabled in Device Manager, and I’m contemplating putting the original TitanXps back into the case.


I am in the same boat. I was lucky enough to get a gaming computer last year, because I am a product tester. The machine handled DAZ Studio rendering excellently, up until the latest updates, which are so GPU-reliant.

I dropped my version of Studio back down to 4.11, and after also reverting my Nvidia drivers, I am golden again.

This machine won’t take a new card, nor do I really want to upgrade it. It is a loud, noisy beast and not the best thing on the market, but it is great for now.

I will likely buy one new computer before retirement, and then I will have other priorities, so I am looking to see what kind of ready-made gaming machine with a warranty I can afford.

I am really frustrated that the store is letting NVIDIA drive the bus so much, forcing people to rely heavily on the GPU when GPUs are simply overpriced and not available. Releasing G8.1 and forcing people to upgrade to use it seems really out of touch to me.

But I am satisfied if I can render my G8 men in peace, and I am willing to let the DAZ store drive itself off the bridge by supporting only high-end users with high-end cards.

That said, I wouldn’t mind a better computer, one that doesn’t sound like a plane taking off when it starts rendering… So I am thinking about it.
