can my power supply support two NVIDIA GPUs?

Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6588
Credit: 311907715
RAC: 124459

RE: So my current guess is

Quote:
So my current guess is that the seemingly excessive margin often advocated is not there to account for "all those cheap, low quality PSUs" but rather for time-variation.

Well, I've always gone for higher-rated supplies for different reasons (myth?): primarily to keep them in the sweet spot/band for the sake of efficiency, secondarily for the longevity of the device. If pressed I couldn't justify it at an engineering level (as I ain't one of them), but that's been my belief.

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Anonymous

It's amazing what you can learn

It's amazing what you can learn by reading your motherboard manual. I found something there that addresses power supply issues and provides a most useful link for calculating PSU requirements for those of us using ASUS motherboards and the equipment we plug into them. It lets you specify multiple devices, e.g., two GTX 650 Tis.

Here is the link: ASUS MB ONLY - Recommended Power Supply Wattage Calculator

FalconFly
FalconFly
Joined: 16 Feb 05
Posts: 191
Credit: 15650710
RAC: 0

Wow, that ASUS website WAAAY

Wow, that ASUS website WAAAY overrecommends PSU sizes.

It recommends a 300W PSU for a System with peak 90W consumption.
It recommends a 400W PSU for a System with peak 150W consumption.

For my config (375W consumption @ full load) it recommends a 900W (!!) PSU.

All this calculator does is sum known hardware TDP figures and add a huge safety buffer for cheap no-name PSUs.
This has been common knowledge among hardware experts for many years ("what PSU do I need for my new GPU?" is an old, often-answered question that comes up again with every new generation of GPUs).

In the end, you'll pay a lot of money for a very highly rated PSU you don't need, running in a rather inefficient region of operation (<50% load)...
And if you also went the cheap route, you'll pay again for every percent of the PSU's own inefficiency, which compounds the inefficient low-load operation.

Again, nothing beats a manual measurement or a realistic assessment of component power usage, then making up your own mind about how far you want to load your HIGH QUALITY PSU (or how much headroom you'd like, e.g. for future hardware upgrades).
Once you know where their inflated numbers come from, the manufacturers' recommendations become basically useless, in many cases even downright wrong or ludicrous.
Their figures are essentially intended as a failsafe for users with no hardware knowledge whatsoever.
It's your choice: inform yourself and be smart about your investment, or go the easy (aka expensive) way.
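The headroom arithmetic above can be sketched in a few lines. Note the 30% margin and the 80% "sweet spot" ceiling are my own illustrative assumptions, not fixed rules, and no substitute for measuring actual draw at the wall:

```python
# Illustrative PSU sizing from a measured full-load draw.
# margin and sweet_spot_max are assumed values for the sketch.

def recommend_psu(full_load_w, margin=1.3, sweet_spot_max=0.8):
    """Suggest a PSU rating that adds a modest safety buffer and
    keeps full load at or below ~80% of the rating."""
    with_margin = full_load_w * margin            # modest safety buffer
    at_sweet_spot = full_load_w / sweet_spot_max  # load <= 80% of rating
    return max(with_margin, at_sweet_spot)

print(round(recommend_psu(375)))  # 488 -> i.e. a ~500 W unit, not 900 W
```

Compare that with the calculator's 900 W recommendation for the same 375 W load.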

If you really want to dig into the matter (and the differences between a good and a bad PSU), there are a number of high-quality tests/reviews out there that highlight them. Some vividly show what happens when cheap PSUs hit their limits, which is often far below their advertised output, versus high-quality PSUs that can often even handle short overload peaks or brief power surges.

MAGIC Quantum Mechanic
MAGIC Quantum M...
Joined: 18 Jan 05
Posts: 1885
Credit: 1397154774
RAC: 1136288

That is true Falcon, The

That is true, Falcon.

The most I paid was $150 for my 850 W Ultra X4 "80+ Bronze". It runs a complete desktop with a GTX 660 Ti SSC, and I also have a GTX 650 Ti OC plugged into it from another desktop next to it, which still uses its cheap stock PSU to run the rest.

And I only paid $100 (each) for my other three 750 W Ultra X4 "80+ Bronze" units.

Which of course is cheaper than any of my nVidia cards.

They always work and the best part is they have a lifetime warranty.

Anonymous

RE: Here is the link:

I wish I had not posted this link. I believe it will only cause confusion, something we don't need. According to this site I would be about 300 watts short of supporting an additional GPU with my system's current configuration. I'm not buying that.

Anyway, I will be "adding" another GPU. Probably not a GTX 770, though. I am currently awaiting a 200mm replacement for a top-mount exhaust fan that seems to be having issues. I gave it some CPR yesterday to get it going, but it's not going to last.

Anonymous

RE: RE: Here is the

Quote:

I wish I had not posted this link. I believe it will only cause confusion, something we don't need.

Quote:
According to this site

Clarification: "This site" in the above statement is referring to the ASUS site and not this forum.

ExtraTerrestrial Apes
ExtraTerrestria...
Joined: 10 Nov 04
Posts: 770
Credit: 576046900
RAC: 184189

Robl, what PSU do you

Robl, what PSU do you currently have? If it's decent quality it will take a GTX770 without problems.

Which wouldn't automatically mean this is the best buy. I suppose it's for your i7 3770? If so - good for Einstein, as both cards will be running 8x PCIe 3. And will that card "only" run Einstein?

MrS

Scanning for our furry friends since Jan 2002

Anonymous

RE: Robl, what PSU do you

Quote:
Robl, what PSU do you currently have? If it's decent quality it will take a GTX770 without problems.

The PSU is a Cooler Master 750 W.

Quote:
Which wouldn't automatically mean this is the best buy. I suppose it's for your i7 3770? If so - good for Einstein, as both cards will be running 8x PCIe 3. And will that card "only" run Einstein?

Yes, it is for the i7 3770. About a week ago I "reorganized" my PCs with respect to the projects they would support. This particular PC is solely E@H now.

Can you explain "both cards will be running 8x PCIe 3"? I am not quite clear on your meaning.

I have two PCIe x16 slots on the motherboard, one rated at 3.0 and the other at 2.0. I assume I would place the 770 in the 3.0 x16 slot and the 650 Ti in the 2.0 x16 slot.

At the moment I am "thinking" about the GTX 770. It is a bit pricey, but I would sure like to have it. I have my eye on the "virtual toaster in the virtual gift catalog". I am currently waiting on a 200mm top fan to replace the current one, which is misbehaving.


ExtraTerrestrial Apes
ExtraTerrestria...
Joined: 10 Nov 04
Posts: 770
Credit: 576046900
RAC: 184189

I wouldn't buy a Cooler

I wouldn't buy a Cooler Master PSU (or heatsink-fan), but they're not rubbish either. So power-wise you're fine :)

Regarding the slots: the 16 PCIe 3 lanes built into the CPU can be split into 2 x 8 lanes. Many boards offer two physical x16 PCIe slots, but when the second slot is populated, both slots switch to "only" 8 lanes each. Since it's PCIe 3, this is still pretty good. I assumed you had such a mainboard.

If your 2nd slot is PCIe 2, it's probably "16 lanes physically (like those other slots I talked about) with only 4 lanes wired electrically". That would be the norm, and it's actually about the maximum the connection between chipset and CPU can handle. If you want to know for sure, this can easily be looked up given the full mainboard name.

In a 4x PCIe 2 slot, Einstein performance does suffer. I cannot say for sure by how much, but you could try it out.
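For a rough feel of what those slot configurations mean in bandwidth terms, here is a back-of-envelope sketch (theoretical per-direction peaks from the PCIe specs; real throughput is lower):

```python
# Theoretical per-direction bandwidth for the PCIe link configs above.
# gen2: 5 GT/s with 8b/10b encoding  -> 500 MB/s per lane
# gen3: 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane
PER_LANE_MBS = {2: 500.0, 3: 984.6}

def link_bandwidth_gbs(gen, lanes):
    """Peak one-way bandwidth of a PCIe link in GB/s."""
    return PER_LANE_MBS[gen] * lanes / 1000.0

print(f"PCIe 3.0 x8:  {link_bandwidth_gbs(3, 8):.1f} GB/s")   # 7.9 GB/s
print(f"PCIe 2.0 x16: {link_bandwidth_gbs(2, 16):.1f} GB/s")  # 8.0 GB/s
print(f"PCIe 2.0 x4:  {link_bandwidth_gbs(2, 4):.1f} GB/s")   # 2.0 GB/s
```

So two cards at PCIe 3 x8 each keep roughly the bandwidth of a full PCIe 2 x16 slot, while a PCIe 2 x4 link has only a quarter of that.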

MrS

Scanning for our furry friends since Jan 2002

FalconFly
FalconFly
Joined: 16 Feb 05
Posts: 191
Credit: 15650710
RAC: 0

It's also best to check out

It's also best to check the board specification details on the manufacturer's website.
They usually show which PCIe configurations exist and which slots offer what bandwidth when occupied (including recommendations for multi-GPU configs).

Otherwise, a tool like GPU-Z can show the currently used PCIe config (that's how I found out mine when I checked it for the first time)...
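GPU-Z is Windows-only; with NVIDIA drivers the same link info can also be queried from the command line with `nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.width.current --format=csv,noheader`. Below is a small sketch that parses that CSV output; the sample line is made up for illustration since I can't know your exact cards:

```python
# Parse the CSV emitted by an nvidia-smi PCIe link query.
# `sample` stands in for real nvidia-smi output (hypothetical values).
import csv
import io

sample = "GeForce GTX 650 Ti, 3, 8\n"  # name, link gen, link width

def parse_pcie_link(text):
    """Return (gpu_name, pcie_gen, lane_width) tuples from query output."""
    rows = []
    for name, gen, width in csv.reader(io.StringIO(text), skipinitialspace=True):
        rows.append((name, int(gen), int(width)))
    return rows

for name, gen, width in parse_pcie_link(sample):
    print(f"{name}: PCIe {gen}.0 x{width}")
```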
