I am starting this thread so that anyone running a system with more than 2 GPUs can discuss their system(s).
By my count, 12 out of the top 50 hosts are running 3 or more GPUs. It is possible that some of those video cards have more than one GPU on the same card; if that pushes your system to 3 or more GPUs and I missed it, please let us know.
I now have two: a 4-card box here, and a 9-card box here.
The 9-card box is not (yet) a member of the top 50, but then I just restarted it. It was a 7-GPU NVIDIA box; it is now a mixed NVIDIA/Radeon GPU server.
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
5x RTX 2080ti -
6x RTX 2080ti - https://einsteinathome.org/host/12803486
8x RTX 2070 - https://einsteinathome.org/host/12803503
7x RTX 2080 - https://einsteinathome.org/host/12803483
_________________________________________________________________________
1 X 1080Ti @ 215W
1 X 2080 @ 215W
2 X 2070 @ stock 175W
https://einsteinathome.org/host/12291110
All running GR tasks one at a time.
3 X 2080 @ 200W
https://einsteinathome.org/host/12444941
All running GR tasks one at a time.
1 X 1080Ti @ 215W
2 X 2070 Supers @ 200W
https://einsteinathome.org/host/4284634
All running GR tasks one at a time.
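For anyone curious how these caps get applied, here is a minimal sketch using nvidia-smi's power-limit flag (NVIDIA cards only; needs root, and the GPU index mapping below is just an example mirroring the first box's 215W/215W/175W/175W caps):

import subprocess

# Hypothetical index -> watts mapping, mirroring the caps listed above.
POWER_LIMITS = {0: 215, 1: 215, 2: 175, 3: 175}

for index, watts in POWER_LIMITS.items():
    # nvidia-smi -i <index> -pl <watts> sets the board power limit
    # (requires root; enabling persistence mode first is recommended).
    subprocess.run(["nvidia-smi", "-i", str(index), "-pl", str(watts)], check=True)

# Verify the limits actually took effect.
subprocess.run(["nvidia-smi", "--query-gpu=index,power.limit", "--format=csv"], check=True)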
So far setting my "store at
)
So far setting my "store at least 0.01 days of work" and "0.1" additional days has resulted in very modest GW "waiting to run" cpu queues on my Amd 2700x/4 Rx 5700 system.
I have previously had a lot of trouble trying to run E@H CPU tasks on the same machine as E@H gpu tasks.
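To put those settings in perspective, a quick back-of-the-envelope sketch (the ~18 hours per GW CPU task is what I am seeing on the 2700X; the rest is just arithmetic):

# What the BOINC work-cache settings above mean in practice.
MIN_BUFFER_DAYS = 0.01    # "store at least ... days of work"
EXTRA_BUFFER_DAYS = 0.1   # "store up to an additional ... days of work"
TASK_HOURS = 18.0         # approximate GW CPU task runtime on the 2700X

buffer_hours = (MIN_BUFFER_DAYS + EXTRA_BUFFER_DAYS) * 24
print(f"total buffer: {buffer_hours:.1f} hours of work")   # ~2.6 hours

# With 18-hour tasks, a ~2.6-hour buffer queues only a fraction of one
# extra task per running instance -- hence the modest "waiting to run" list.
print(f"queued tasks per instance: ~{buffer_hours / TASK_HOURS:.2f}")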
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
https://einsteinathome.org/host/12784895
It looks like Gaurav Khanna has switched the above machine from an i9-9900K CPU to an i9-9900X (different socket, 10c/20t).
Apparently, his co-location site had some kind of power outage which knocked his systems off their game.
I have switched my "gpu server" to GW GPU tasks. I have a CPU upgrade due next weekend, and I will switch back to running 9 GPUs then. The gpu server now sports an MSI B360-F Pro motherboard, so I don't expect to be limited (much) by the number of GPUs I can run.
If I drop the P106-90s offline, I will probably be able to go to 2 tasks per GPU. Maybe.
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
So what are the main benefits of limiting the power usage of a GPU?
You are probably limiting absolute production, unless you are hitting a thermal limit that is throttling your GPU's processing speed.
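One way to tell whether a card is actually hitting a thermal limit is to ask the driver for its active throttle reasons. A minimal sketch using nvidia-smi's query interface (NVIDIA only):

import subprocess

# sw_power_cap = held back by the power limit;
# hw/sw_thermal_slowdown = held back by temperature.
FIELDS = ",".join([
    "index",
    "temperature.gpu",
    "clocks_throttle_reasons.sw_power_cap",
    "clocks_throttle_reasons.hw_thermal_slowdown",
    "clocks_throttle_reasons.sw_thermal_slowdown",
])

result = subprocess.run(
    ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)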
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
Tom M wrote: So what are the main benefits of limiting the power usage of a GPU?
Increased power efficiency. You lose some production because the clock speeds don't boost as high once the card hits the power limit. I like to overclock on top of a power limit, which brings back some of the performance that would otherwise be lost, but overall performance per watt goes up.
In the case of my 8x RTX 2070 system, I've power limited the GPUs mainly to keep the total power draw of the system acceptable for a standard 15A 120V circuit. 8 x 150W = 1200W, plus the Epyc CPU at another 200W; factoring in PSU efficiency, that puts me close to the 80% continuous load guideline for the circuit. I don't really have a good way to add a dedicated 20A or 240V circuit at this location, so I'm stuck dealing with the 15A 120V.
Back when I had 10x RTX 2070 on a single machine without this power delivery constraint (it was on a 30A 240V circuit), I power limited them to 165W more for thermal reasons, since I had the GPUs packed pretty close together.
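To make the circuit math explicit, a quick sketch with the numbers above (the PSU efficiency is an assumption, roughly 80 PLUS Gold territory at this load; everything else is from the post):

# Circuit-load arithmetic for the 8x RTX 2070 box.
CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15
CONTINUOUS_FACTOR = 0.8    # standard continuous-load guideline

GPU_WATTS = 8 * 150        # eight RTX 2070s power limited to 150W
CPU_WATTS = 200            # Epyc CPU
PSU_EFFICIENCY = 0.92      # assumed (~80 PLUS Gold at this load)

circuit_watts = CIRCUIT_VOLTS * CIRCUIT_AMPS              # 1800W
continuous_watts = circuit_watts * CONTINUOUS_FACTOR      # 1440W
wall_watts = (GPU_WATTS + CPU_WATTS) / PSU_EFFICIENCY     # ~1520W

# Wall draw lands right around the 1440W guideline, which is
# exactly why the 150W caps matter on this circuit.
print(f"circuit capacity:    {circuit_watts}W")
print(f"80% continuous load: {continuous_watts:.0f}W")
print(f"estimated wall draw: {wall_watts:.0f}W")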
_________________________________________________________________________
Tom M wrote: It looks like Gaurav Khanna has switched the above machine from an i9-9900K CPU to an i9-9900X [...]
That system has always been on a 9900X, as long as I can remember. I don't think it was ever on a 9900K.
_________________________________________________________________________
Ian&Steve C. wrote: Tom M
)
Yet another detail I missed :(
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
The next multiple-GPU experiment will start once I have an i9-9900 installed in the "gpu server".
I have 2 XFX RX 580s, 4 XFX RX 570s, and 3 P106-90s that I want to crank up full-time, 1 task per GPU, on Gravitational Wave (GW) processing.
I expect to take the P106-90s offline as I get them sold, but until then I will use them.
I think the "final" goal is likely to be an all-XFX RX 570 8GB machine (9 GPUs) running 2 GW tasks per GPU, but even those GPUs are pretty expensive right now.
This is probably the only way I can give GW GPU tasks a fair shake (a GW-only machine).
So far I have (lately) been able to run GW CPU tasks on a GR GPU box without getting overrun by GW tasks (yet). The AMD 2700X is taking on the order of 18 hours per GW CPU task.
While I am pretty sure the AMD 3950X would crunch them faster, that box seems to be pretty stable right now and I am loath to play with the CPU mix.
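For a rough sense of what that plan would produce, a quick estimate (the GW GPU runtime is a purely hypothetical placeholder, since I have no measured RX 570 GW numbers yet; the 18-hour CPU figure is my observed one):

# Rough daily-throughput estimate for the planned 9-GPU GW box.
GPUS = 9
CONCURRENT_PER_GPU = 2     # the "final" goal of 2 GW tasks per GPU
HOURS_PER_TASK = 1.0       # hypothetical wall time per task when running 2x
CPU_TASK_HOURS = 18.0      # observed on the AMD 2700X

# Each GPU finishes CONCURRENT_PER_GPU tasks every HOURS_PER_TASK hours.
gpu_tasks_per_day = GPUS * CONCURRENT_PER_GPU * 24 / HOURS_PER_TASK

# Each CPU thread dedicated to GW finishes 24/18, about 1.3 tasks per day.
cpu_tasks_per_day = 24 / CPU_TASK_HOURS

print(f"GPU tasks/day (9 GPUs at 2x): ~{gpu_tasks_per_day:.0f}")
print(f"CPU tasks/day per thread:     ~{cpu_tasks_per_day:.1f}")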
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
Ian&Steve C. wrote:6x RTX
)
Ian&SteveC.
What are you crunching on the CPU side of things on these machines?
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!