Now I do NOT own one of those fancy 12-pin GPU cables, so maybe the older cables could handle more than that?
Are you referring to the 16-pin cables (12VHPWR) that are starting to be implemented? If so, I can say that I have not been a fan of using them. In some ways they make things easier, and in other ways, not at all.
Here is a nice little thing I have been trying to figure out. Nvidia just recently announced the new Ada workstation cards, and I was looking at the datasheet for the RTX 4000 Ada (see here: https://www.nvidia.com/en-us/design-visualization/rtx-4000/ ). Now, I might be crazy, but is that a 16-pin (12VHPWR) connector on it? If so, WHY? It's a 130 W GPU. A different article I read said that it used an 8-pin cable, so I really have no idea.
When I blow up the picture it gets blurry, but it does look like a 4-pin plug and then a second plug with a double row of pins that are more than 4 across.
Yes, it's a 16-pin connection: 2 rows of 6 pins each, plus the 4 sense pins.
I think that is NVIDIA's new connection standard for any and all new GPUs.
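For a rough sense of why that little connector can carry so much, here is some back-of-envelope math. The 600 W rating and the count of six 12 V supply pins are my assumptions (the commonly cited numbers for 12VHPWR), not anything from the datasheet above:

```python
# Back-of-envelope: per-pin current on a 12VHPWR connector.
# Assumes the commonly cited 600 W maximum and six 12 V supply pins
# (the other six large pins are grounds, plus the 4 small sense pins).
RATED_WATTS = 600
VOLTS = 12.0
SUPPLY_PINS = 6

total_amps = RATED_WATTS / VOLTS          # 50 A total
amps_per_pin = total_amps / SUPPLY_PINS   # ~8.3 A per pin

print(f"{total_amps:.1f} A total, {amps_per_pin:.2f} A per pin")
```

That per-pin load is why the terminal quality and a fully seated connector matter so much on these cables.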
I understand the benefits of this connection for high-end GPUs with huge power draws and spikes. This GPU is not even close to that, so I just don't see any benefit. I know these connections can "talk" with PSUs better (as long as they are ATX 3.0), but that usually has to do with power spikes and such. These professional GPUs can't be overclocked (easily) and usually just sit at their wattage maximum when being pushed (assuming no throttling). Even if it did spike, being a 130 W card, would it matter? Just confusing...
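As I understand the sideband part of the standard (treat the exact table below as my assumption; the authoritative source is the PCIe CEM 5.0 / ATX 3.0 documents), the "talking" on the four small sense pins is very simple: they just advertise an initial power limit to the card. A sketch:

```python
# Sketch of how the four sideband (sense) pins advertise a power limit.
# The SENSE0/SENSE1 table is per my reading of the PCIe CEM 5.0 / ATX 3.0
# documents -- treat the exact values as an assumption, not gospel.
# "gnd" = pin tied to ground, "open" = not connected.
SENSE_TO_WATTS = {
    ("gnd",  "gnd"):  600,
    ("gnd",  "open"): 450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,
}

def initial_power_limit(sense0: str, sense1: str) -> int:
    """Power the card may draw at boot, based on the sense pins."""
    return SENSE_TO_WATTS[(sense0, sense1)]

print(initial_power_limit("open", "open"))  # prints 150
```

Which is the point of the confusion: a 130 W card would be fine even at the most conservative setting, so the sideband buys it very little.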
I agree with you that it shouldn't be necessary for a 130 W card, and no, it probably wouldn't even matter if it did spike, because the newer PSUs are more capable of handling spikes.
But think of it this way... NVIDIA is selling this professional GPU (not yet available) primarily to businesses that will likely be using the newer ATX 3.0 power supplies. With higher efficiency and more features, it is a standard worth setting. Hence, the 16-pin connector...
I want to reach out to my contact at Dell and see what their solution is to this power connector. They just (about 2 weeks ago) offered the RTX 6000 Ada in their workstations for the first time (I think they were trying to figure it out too). I want to see whether they are using an ATX 3.0 PSU or just lots and lots of adapters. None of the Dell PSUs are modular, so I am really curious.
I have been window shopping for RTX 3090 Ti water-cooled GPUs. Not too many out there.
It turns out there are a lot more RTX 3090 hybrids (water-cooled) available on eBay.
If it turns out I have to replace my cranky EVGA RTX 3080 Ti, I may end up with another EVGA RTX 3080 Ti FTW(?).
I prefer pre-built water-cooled GPUs to trying to DIY.
I would like to get at least one of my single-GPU systems competitive with Freewill's RTX 3090 on E@H. Ideally I would like to beat his RAC. More likely I can "come real close".
This "race" has been dubbed "The Single GPU Race". Freewill has been leading it (mostly) forever. It is open to any brand/model. You're limited to a single GPU.
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor)
Apparently my source for reliable Gen3 riser ribbon cables is flaky.
I need to try to buy some more.
I need a link too.
Thank you.
I am having trouble trying to run two RTX 3080 Tis in a system that easily overheats its CPU VRMs.
The VRMs have a tightly directed fan, so I probably can't improve that. The MB/CPU combo is not changeable or replaceable.
I can get an open-flow-type RTX 3080 Ti to run there.
I ran across some blower-type RTX 3080 Tis at competitive prices.
I am pretty sure blower-type GPUs take a performance hit.
I am confident the case will not run as hot.
Should I take the chance?
Tom M
Yes, as long as you can resell them for at least nearly what you paid for them if they don't do what you want; otherwise, maybe not.
I'm sure we've been down this path before, but I decided to give it another shot and am about to run out of steam again...
Running dual Nvidia (30x0) cards under Linux Mint BOINC with coolbits. If anybody wants to read my most recent foray:
https://forums.linuxmint.com/viewtopic.php?t=404389
I can actually get the xorg part working, but then it breaks Cinnamon :-(
Skip
I am assuming that if you start with a clean/working xorg.conf, you then do:
sudo nvidia-xconfig --thermal-configuration-check --cool-bits=28 --enable-all-gpus
Reboot.
Something breaks?
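For what it's worth, my understanding is that all that command should do is write an Option line into each Device section of /etc/X11/xorg.conf, roughly like this (a sketch; the Identifier and BusID values here are made up, not from your system):

```
Section "Device"
    Identifier "Device0"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"        # illustrative only, not your real bus ID
    Option     "Coolbits" "28"    # 4+8+16: manual fan + clock offsets + overvoltage
EndSection
```

If Cinnamon breaks after the reboot, comparing the generated Device/Screen sections against what a bare nvidia-xconfig run produces might show which option is the culprit.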
Tom M