Gag Me With A Spoon

Claggy
Joined: 29 Dec 06
Posts: 560
Credit: 2694028
RAC: 0

RE: Hi Hot. I specified to

Quote:

Hi Hot.

I specified to my builder that the PCIe slots needed to be 3.0 and have auxiliary power to run four GTX650s. He subsequently informed me that even so there was not enough power to supply four GTX650s, but he was mistaken. The reason he [and I] originally could not get more than two running together had to do with Windows video idiosyncrasies. Apparently each new card needs to be added with a monitor load attached - either resistors or an actual monitor. Then shut Windows down and do it again for the next card.

I have been running four GTX650s for a day now with good results. Error tasks occur like the plague if you unplug and replug a monitor, even if you shut down in between. I lost at least an hour's worth of work while Windows, stupid Windows, scratched its head and remembered that yes, yesterday I DID have a VGA monitor attached to this very same card. So now I have a $20 thrift-shop flat screen permanently attached to GTX650 number four. Thinking about applying super glue.



The other option is to extend the desktop onto each GPU in turn, and where Windows says 'no display detected', change it to 'try to connect anyway on: VGA'.
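
For anyone else wrestling with a multi-card box, it is also worth confirming that the driver itself reports every card before blaming Windows or BOINC. Here is a minimal sketch (the helper name is just for illustration; it assumes the NVIDIA driver and its nvidia-smi tool are installed and on the PATH):

import subprocess

def list_nvidia_gpus():
    # 'nvidia-smi -L' prints one line per GPU the driver can see, e.g.
    # "GPU 0: GeForce GTX 650 (UUID: GPU-...)"
    result = subprocess.run(["nvidia-smi", "-L"],
                            capture_output=True, text=True, check=True)
    return [line for line in result.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    gpus = list_nvidia_gpus()
    print(f"Driver reports {len(gpus)} GPU(s):")
    for gpu in gpus:
        print("  " + gpu)

If a card is missing from that list, no amount of BOINC fiddling will bring it back; it has to show up at the driver level first.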

Claggy

David Rapalyea
Joined: 3 Jan 13
Posts: 79
Credit: 63886821
RAC: 0

ANOTHER DAY, ANOTHER UPDATE -

ANOTHER DAY, ANOTHER UPDATE - on my four-GPU FrankenMonster PC

Although the 3x GTX650 + 1x GTX660 setup worked flawlessly, it did not live up to my productivity-per-watt expectations. My goal was 35-45w per 10k stones; it looked like it would come in at 46w. So I doubled down and managed to get four power-cordless GTX650s to work. It would require a science fiction writer to do the process justice. However, the results were as I originally hoped. First, the power draw dropped at least 30 watts, from 300-310 to 265-270. More importantly, production has stayed high.

I am hoping for an eventual RAC on this machine of about 70,000 stones at 270w, for about 38 watts/10k stones. That is nowhere near the 31w reported by one of the project gurus, but he uses words like 'overclocking' that are at least 18 months in MY future, if ever...
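
In case anyone wants to check the arithmetic, 'watts per 10k stones' is just wall power divided by RAC expressed in units of 10,000 credits per day. A quick sketch (the function name is mine, and the numbers are the estimates from this post, not measurements):

def watts_per_10k_stones(wall_power_watts, rac):
    # Wall power (W) per 10,000 credits/day of recent average credit.
    return wall_power_watts / (rac / 10_000.0)

# Hoped-for figures from above: ~70,000 RAC at 270 W
print(round(watts_per_10k_stones(270, 70_000), 1))   # -> 38.6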

Arecibo 19 Oct 2012
Just Because The Space Alien Is Green
Does Not Mean You Should Go

David Rapalyea
Joined: 3 Jan 13
Posts: 79
Credit: 63886821
RAC: 0

Claggy Thanks for an

Claggy

Thanks for an additional alternative. This monster has now been leashed, at least for the time being, but the more info the better. However, I would not recommend anyone try this at home. EYE might do it again, depending on how my next two-card machine works out. With some luck it might get under 35w per 10k stones.

I do disagree with those who say simpler and bigger is always best. As exhibit A I present my GTX660. It GPU benchmarks more than TWICE the GTX650, but I have never, ever, gotten it to live up to that, even by a close margin, in any machine or combination I have been able to devise. I would sooner order a home rattlesnake farm beginner's kit before getting another one of these dogs.

Arecibo 19 Oct 2012
Just Because The Space Alien Is Green
Does Not Mean You Should Go

mountkidd
Joined: 14 Jun 12
Posts: 175
Credit: 11009809067
RAC: 5445131

RE: It GPU Benchmarks more

Quote:
It GPU benchmarks more than TWICE the GTX650, but I have never, ever, gotten it to live up to that, even by a close margin.

Here's a user, 4l4r1, who has both a 650 and a 660 running under Windows on E@H only, and he does pretty much what the benchmarks show. Do note, though, that his 660 sits in an i7-3770K machine with PCIe 3.0 (which is what these cards really want). His overall credit is quite close to your original target of 50,000, and that is on one card. Methinks you have more work to do...

Gord

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5845
Credit: 109959093903
RAC: 31159989

RE: Hi, just some remarks

Quote:
Hi, just some remarks on the GTX 650: as far as I know they have no external 6-pin PCIe power connector.


Perhaps you can get a particular version without the 6-pin power input, but my understanding is that the standard version of the card comes with a 6-pin socket. The ones I own certainly do.

From a long-term stability perspective, if a person were intending to run multiple GTX650 cards, it would be wise to deliberately choose cards with the 6-pin option so as to minimise the power draw from (and the risk of thermal damage to) the motherboard. And, as you correctly point out, cards do start generating errors even if they are only slightly starved for power.
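
As a rough back-of-the-envelope illustration (the constants below are approximate published spec and rated figures, not measurements - a PCIe x16 slot is specified for roughly 75 W and a stock GTX650 is rated at around 64 W board power):

# Rough slot-power illustration; constants are approximate spec/rated values.
PCIE_SLOT_LIMIT_W = 75        # approximate per-slot limit in the PCIe spec
GTX650_BOARD_POWER_W = 64     # rated board power of a stock GTX 650
CARDS = 4

print(f"Headroom per slot-only card: ~{PCIE_SLOT_LIMIT_W - GTX650_BOARD_POWER_W} W")
print(f"Four slot-only cards route ~{CARDS * GTX650_BOARD_POWER_W} W through the motherboard;")
print("with 6-pin connectors, most of that load can shift to the PSU cables instead.")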

Cheers,
Gary.

Beyond
Joined: 28 Feb 05
Posts: 118
Credit: 1672881437
RAC: 5344313

RE: RE: It GPU Benchmarks

Quote:
Quote:
It GPU benchmarks more than TWICE the GTX650, but I have never, ever, gotten it to live up to that, even by a close margin.

Here's a user, 4l4r1, who has both a 650 and a 660 running under Windows on E@H only, and he does pretty much what the benchmarks show. Do note, though, that his 660 sits in an i7-3770K machine with PCIe 3.0 (which is what these cards really want). His overall credit is quite close to your original target of 50,000, and that is on one card. Methinks you have more work to do...


And here's a nice chart that would confirm that the 660 is at least 2x faster than a 650:

http://www.dskag.at/images/Research/EinsteinGPUperformancelist.pdf

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5845
Credit: 109959093903
RAC: 31159989

RE: And here's a nice chart

Quote:

And here's a nice chart that would confirm that the 660 is at least 2x faster than a 650:

http://www.dskag.at/images/Research/EinsteinGPUperformancelist.pdf


The only problem is that the 2x entry for a GTX650 in that chart is not as good as can readily be achieved. I have many GTX650s (at factory settings) that do sub-3600 secs for 2x even when all CPU cores are crunching CPU tasks. The 4340 number in the chart doesn't represent the true performance.

There is a similar discrepancy for GTX650 Tis. Mine does sub-3000 secs for 2x, compared to the listed value of 3430 secs. I don't believe that differences like these can be attributed to factors like OS differences (Windows vs Linux). I only run Linux these days but in the past I found quite small differences between Windows XP and Linux when I was running both.
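
To put those 2x times into throughput terms ('2x' meaning two tasks running concurrently on the one GPU), a rough comparison is just 2 * 86400 / (seconds per task). A small sketch using only the times quoted above, with a helper name of my own choosing (credit per task is deliberately left out since it varies between searches):

def tasks_per_day_at_2x(seconds_per_task):
    # With two tasks in flight, a GPU completes 2 tasks every 'seconds_per_task'.
    return 2 * 86_400 / seconds_per_task

for label, secs in [("GTX650, chart", 4340), ("GTX650, observed", 3600),
                    ("GTX650 Ti, chart", 3430), ("GTX650 Ti, observed", 3000)]:
    print(f"{label}: {tasks_per_day_at_2x(secs):.1f} tasks/day")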

Cheers,
Gary.

mountkidd
Joined: 14 Jun 12
Posts: 175
Credit: 11009809067
RAC: 5445131

RE: I only run Linux these

Quote:
I only run Linux these days but in the past I found quite small differences between Windows XP and Linux when I was running both.

Differences in which direction? Was Linux always faster? By how much? Were you running on identical hardware and running both OS configs in the same manner?

Gord

Beyond
Joined: 28 Feb 05
Posts: 118
Credit: 1672881437
RAC: 5344313

RE: RE: And here's a nice

Quote:
Quote:

And here's a nice chart that would confirm that the 660 is at least 2x faster than a 650:

http://www.dskag.at/images/Research/EinsteinGPUperformancelist.pdf


The only problem is that the 2x entry for a GTX650 in that chart is not as good as can readily be achieved. I have many GTX650s (at factory settings) that do sub-3600 secs for 2x even when all CPU cores are crunching CPU tasks. The 4340 number in the chart doesn't represent the true performance.

There is a similar discrepancy for GTX650 Tis. Mine does sub-3000 secs for 2x, compared to the listed value of 3430 secs. I don't believe that differences like these can be attributed to factors like OS differences (Windows vs Linux). I only run Linux these days but in the past I found quite small differences between Windows XP and Linux when I was running both.


Could you post your average completion times (along with OS) in the "CUDA and openCL Benchmarks" thread so that dskagcommunity can update his chart?

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5845
Credit: 109959093903
RAC: 31159989

RE: RE: I only run Linux

Quote:
Quote:
I only run Linux these days but in the past I found quite small differences between Windows XP and Linux when I was running both.

Differences in which direction? Was Linux always faster? By how much?


I first started loading Linux onto machines around 2007. At that time I had large numbers of pretty much 'hardware identical' machines, so I had the opportunity to observe the two systems side by side (subjectively - I didn't make the effort of working out a rigorous testing regime). I just used average crunch times over a period.

I would regard myself as a closet Linux zealot - I believe passionately that Linux is 'better' (in ways that are important to me) and that this will become more and more apparent to more and more people over time. However, I could never see myself 'manning the barricades' to push that point of view. The honest result of my subjective comparisons of crunching performance, either then or more recently, is that I could always achieve pretty much comparable performance irrespective of the OS, even though I had hoped to see Linux clearly 'winning' :-).

With regard to GPU crunching performance, the biggest factor I've seen (on the pretty limited range of GPUs I've tried - 550Ti, 650, 650Ti, HD7770) has been the particular motherboard model I've plugged the GPU into. Driver version, too, seems to be another factor at times. And it's not just PCIe1 vs PCIe2 vs PCIe3, although obviously that's another factor. I've found a couple of old PCIe1 motherboards (LGA775) which run just as fast as recent PCIe2 boards. One factor seems to be the use of DDR3 RAM; I have some old ASRock G41 chipset boards (that use DDR3) which are performing very well.

Quote:
Were you running on identical hardware and running both OS configs in the same manner?


I'm not quite sure what you mean by "in the same manner", but if you mean running a GUI on both, then yes. I've always used KDE on Linux and continue to do so, even though the machines run 99.9% of the time with no keyboard, mouse, or screen attached. BOINC runs as a daemon, so I can shut down X and log out if I wish, but I don't usually bother since it doesn't seem to make a measurable difference to crunching performance. I know it should make at least a small difference and maybe one of these days I'll test it more objectively and do something about it :-). The DE is just there for the odd occasions when I might decide to hook up the peripherals and do a bit of fiddling around.

Cheers,
Gary.
