GPU Price Performance Curve

MAGIC Quantum Mechanic
Joined: 18 Jan 05
Posts: 1705
Credit: 1070476048
RAC: 1262780

The GTX 460 (336 cores) sure looks like the best deal right now.

(work per $)

Of course if I order a couple that will probably change but that's ok.

I can just add the ATI Radeons to the parts bin.

(of course you guys can let us know what works the best since you tested many of them)

Robert
Joined: 5 Nov 05
Posts: 47
Credit: 318668854
RAC: 19373

Quote:

10. All the other questions I should have asked but failed to which would help people to decide whether to add CUDA, to buy the right stuff, and successfully to start up their new capability.

For me, the biggest surprise was the restriction against switching users under Windows: based on several posts scattered around, switching users causes the GPU work units to error out.

As far as the direction these questions take this thread goes, I feel that the more users can find the information they need in one spot, the better.

joe areeda
Joined: 13 Dec 10
Posts: 285
Credit: 320378898
RAC: 0

Quote:

The GTX 460 (336 cores) sure looks like the best deal right now.

(work per $)

Of course if I order a couple that will probably change but that's ok.

I can just add the ATI Radeons to the parts bin.

(of course you guys can let us know what works the best since you tested many of them)


I wonder what kind of experiment it would take to answer the question of which card is best, or which is the best deal, and how generally applicable the results would be. The encouraging thing about this thread is the consistency among the different ways of measuring and displaying price performance.

It seems NVIDIA is determined to blanket every price point between $30 and $3000 (Tesla) with an equally wide range of performance (which is in the eye of the beholder, as much as we try to quantify it).

I wonder how predictive using a program like BRP3 to measure performance will be of future GPU apps. It may very well turn out to be as accurate a benchmark as the ones BOINC runs on installation.

I guess the bottom line for me is that the current integrated (chipset) graphics from Intel and AMD are pretty good and very cost-effective for most of the applications I use. These CUDA devices add a significant performance boost to some 3D displays, video-processing apps, and the current E@H GPU-capable apps.

I think the 460 is an excellent choice but won't touch the concept of optimality unless someone is willing to fund a really cool but long and expensive project to measure total cost and long term throughput.
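
To make the "work per $" comparison concrete, here is a minimal Python sketch of the kind of ranking being discussed; all card names, credit rates, and prices are placeholders, not measurements from this thread.

# Rank GPUs by estimated work per dollar.
# All figures below are placeholders for illustration, not measured results.
cards = [
    # (name, estimated credits per day, street price in USD)
    ("Card A", 20000, 150.0),
    ("Card B", 28000, 250.0),
    ("Card C", 45000, 500.0),
]

for name, credits_per_day, price in sorted(cards, key=lambda c: c[1] / c[2], reverse=True):
    print(f"{name}: {credits_per_day / price:.0f} credits/day per dollar")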

Joe

FrankHagen
Joined: 13 Feb 08
Posts: 102
Credit: 272200
RAC: 0

Quote:
I wonder how predictive using a program like BRP3 to measure performance will be of future GPU apps. It may very well turn out to be as accurate a benchmark as the ones BOINC runs on installation.

BRP3 is not going to work as a benchmark for GPUs, because it still depends largely on CPU power.

If you want to know, run PPS-Sieve on PrimeGrid.

And BOINC does not run benchmarks on GPUs; as far as I know, it just estimates the speed.
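
For reference, the usual back-of-the-envelope GPU speed estimate is a theoretical peak-FLOPS figure derived from shader count and clock; the Python sketch below uses that common rule of thumb (two floating-point operations per shader per cycle via fused multiply-add) and is only an illustration, not BOINC's actual code.

def peak_gflops(shaders, shader_clock_mhz, ops_per_cycle=2):
    # Rough theoretical single-precision peak in GFLOPS.
    # ops_per_cycle=2 assumes one fused multiply-add per shader per cycle;
    # real application throughput is far below this number.
    return shaders * shader_clock_mhz * ops_per_cycle / 1000.0

# Example: a stock GTX 460 with 336 shaders at a 1350 MHz shader clock.
print(f"{peak_gflops(336, 1350):.0f} GFLOPS theoretical peak")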

Quote:
I think the 460 is an excellent choice but won't touch the concept of optimality unless someone is willing to fund a really cool but long and expensive project to measure total cost and long term throughput.

Cost effective?

We all have different rates on our power bills, so your mileage may vary.

The bottom line is that you'll always save energy when buying the latest technology.
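
Since rates differ so much, the running-cost arithmetic is easy to redo for your own bill; here is a minimal Python sketch with placeholder numbers.

def yearly_cost(avg_watts, price_per_kwh, hours_per_day=24.0):
    # Electricity cost per year for a load that runs hours_per_day every day.
    kwh_per_year = avg_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: a card adding 200 W, crunching around the clock, at $0.15/kWh.
print(f"${yearly_cost(200, 0.15):.0f} per year")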

MAGIC Quantum Mechanic
Joined: 18 Jan 05
Posts: 1705
Credit: 1070476048
RAC: 1262780

Quote:


Cost effective?

We all have different rates on our power bills, so your mileage may vary.

The bottom line is that you'll always save energy when buying the latest technology.

Well, I know the cost of the power used matters to some of us, but that doesn't matter to me; the main thing I like to know is which card does the most work for the best price.

Of course the new versions always cost more at first, and I am looking more at the ones that are still good for running certain tasks here, with prices more in the $100 to $200 range.

Well, unless the most expensive ones just appear on my desk for free!

Fred J. Verster
Joined: 27 Apr 08
Posts: 118
Credit: 22451438
RAC: 0

Eighteen months ago, I bought a GTX 480 from a friend who owns a computer shop, does repairs, etc. It cost €485.00! (And for the same amount, an ASUS P5E mobo + C2Q X9650 and a 650 W PSU.)
I got a GTX 470 for free, on one condition: he wanted to measure the difference in several ways of using the cards/GPUs.
Normal video (+ encoding/decoding), play/edit/render, fractals, gaming, and also GPGPU (like Einstein@Home, GPUgrid, SETI@home, Milkyway, etc.).
The difference is 448 vs. 480 CUDA cores, which is also noticeable.

Also, I'm aware of the amount of electricity I use: running 3 quads (+ GPUs) and an i7-2600 with 2 HD 5870 GPUs, with SETI on CPU & GPU and Milkyway on 2 GPUs, uses over 1500 W, and sometimes an AC is needed if the room temperature goes above 25 C, which is another 900 W.
(But I rented these rooms, as "all in", as compensation for not being allowed to use gas for heating or cooking! Only a change in contract can change that.)
(200-year-old buildings in the center of a little town.)

A great idea to put it all into a number of graphs, and thanks to those who actually did this...

Bikeman (Heinz-Bernd Eggenstein)
Moderator
Joined: 28 Aug 06
Posts: 3522
Credit: 689339551
RAC: 217634

>(But I rented these rooms, as "all in",

Fred, my friend...you still have some room left for a few of my hosts, right ???

:-) just kidding. Would like to see the face of the landlord when receiving the electricity bill, tho....

HB

DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 162

Quote:


>(But I rented these rooms, as "all in",

Fred, my friend...you still have some room left for a few of my hosts, right ???

:-) just kidding. Would like to see the face of the landlord when receiving the electricity bill, tho....

HB

*sigh* I get free heat and water, but have to pay for my own electric and cooling. *sigh* My computer farm has gradually ramped up in size, though, and I wonder if the landlord has noticed that his winter gas bill has fallen by >25% (there are 4 units, but since my windows are partially open most of the winter to vent excess heat, the net flow has to be out of mine and into the rest of the building). Unfortunately, whenever I move out I won't be able to see his/her reaction when the bill suddenly jumps up again.

archae86
Joined: 6 Dec 05
Posts: 3145
Credit: 7058264931
RAC: 1607402

The machine which I built inspired by this thread came to life yesterday, got the GTX 460 graphics card installed this morning, and finally started processing Einstein work a little before 6:00 p.m. Mountain Time today. The final obstacle was a dim-witted oversight on my part--I failed to enable the "use NVIDIA GPU" item on the Einstein preferences web page of my account for the location (venue) used by this host.

The CPU is an i5-2500K. The motherboard is a very low-end Z68 chipset board costing about $95 US. The power supply is a Nexus Silent 430, rated at 430 W, of which the four 12 V rails sum to a maximum just under 400 watts; it is claimed to exceed 80% efficiency across the full usage range of interest, peaking at about 85% efficiency near 50% load.

The specific model of GPU is the Gigabyte GV-N460SO-1GI, which is an NVIDIA GTX 460 claimed to be equipped with selected, better-performing copies of the graphics chip and better other components, including a fan system with lower acoustic noise at a given cooling level than the (already good) base Gigabyte 460 card. I read good reviews, was particularly pleased by the likely lower noise, and was driven to a decision by the current $50 rebate, which brought the as-delivered price down to US $150.

The case, while far from gamer-grade, has a bit more cooling capacity than low-end consumer ones, with a pair of 80 mm front fans pushing air in and one 120 mm rear fan pulling air out, with a possible assist from the 120 mm fan integrated into the power supply (though Nexus programs that to turn very slowly unless things get pretty hot). I have all the fans throttled back considerably below maximum speed and am currently seeing the reported GPU temperature and the hottest-reporting CPU core both right around 55 C. The box is far from silent, but a pretty good neighbor, and were I braver about temperatures, I could turn the fan speeds down to make it pretty quiet.

Here are some initial observations:

I was surprised at how low the system's power was before I put the graphics card in. Even though I had locked the clock and CPU voltage, the idle power of the system was 48 watts, and the power running four Einstein GW tasks was 100 watts.

Adding in the graphics card took the idle power up to about 75 watts. The operating power running four Einstein GW CPU tasks plus one SETI MB GPU task is about 255 watts, while the power running four Einstein GW CPU tasks plus one Einstein BRP GPU task is about 222 watts.
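
Working from those wall-socket readings, the increments attributable to the card are easy to separate out; the figures in this Python sketch are the ones quoted above, and note that the "increment" numbers also include the card's idle draw, since the 100 watt baseline was measured before the card was installed.

# Wall-socket readings reported above, in watts.
idle_no_gpu   = 48
cpu_only      = 100   # four Einstein GW CPU tasks, before the GTX 460 was installed
idle_with_gpu = 75
cpu_plus_seti = 255   # four GW CPU tasks + one SETI MB GPU task
cpu_plus_brp  = 222   # four GW CPU tasks + one Einstein BRP GPU task

print("GPU idle overhead:      ", idle_with_gpu - idle_no_gpu, "W")   # ~27 W
print("SETI GPU task increment:", cpu_plus_seti - cpu_only, "W")      # ~155 W over the CPU-only box
print("BRP GPU task increment: ", cpu_plus_brp - cpu_only, "W")       # ~122 W over the CPU-only box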

GPU monitoring applications show the GPU as about 70% loaded on the Einstein task, and about 85% on the SETI task. This suggests there is some room to get more Einstein GPU performance, possibly by lowering the CPU load so that the GPU is serviced more promptly, or, more likely, by using the anonymous platform (app_info) mechanism to process two GPU jobs simultaneously. It appears that I have enough power supply and cooling headroom to allow this to be tried. Even though the 2500K is not hyperthreaded, the CPU requirement of the single BRP GPU task is pretty high at a little over 30%, so GPU use will cut back CPU production somewhat.
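
A crude ceiling on what two concurrent GPU tasks could buy follows directly from that utilization figure; the sketch below ignores scheduling overhead and CPU contention, so treat it as an upper bound only.

single_task_gpu_load = 0.70  # reported GPU utilization with one Einstein BRP task

# If BRP throughput scaled linearly with utilization all the way to 100%:
max_speedup = 1.0 / single_task_gpu_load
print(f"At most ~{max_speedup:.2f}x BRP throughput from keeping the GPU fully fed")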

BRP elapsed times so far are tightly clustered around 1900 seconds. This is a bit faster than is reported by most of the other GTX 460s I checked that were running a single task at a time, suggesting that Gigabyte's claims of overclocking success on this card have some truth.

Here is the SETI listing of this host.

Here is the Einstein listing of this host.

A very rough initial estimate suggests that this host, if run pure Einstein, would generate a RAC a bit over 25000, with the $150 investment in this GPU over tripling what would otherwise be achieved. That particular relation arises from the current BRP implementation. Code for other tasks may have widely varying ratios between GPU and CPU performance, and projects may or may not find the investment of development in producing versions capable of running on a particular GPU worth their while.
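
For the curious, the rough arithmetic behind an estimate of that kind looks like the sketch below; the credits-per-task value is a placeholder for illustration, not an Einstein@Home figure I can vouch for.

seconds_per_day  = 86400
brp_elapsed_s    = 1900   # observed elapsed time per BRP GPU task
credits_per_task = 500    # placeholder value, for illustration only

tasks_per_day = seconds_per_day / brp_elapsed_s        # roughly 45 tasks per day
gpu_credit_per_day = tasks_per_day * credits_per_task  # GPU contribution alone
print(f"~{gpu_credit_per_day:.0f} credits/day from the GPU, before adding the CPU tasks")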

But if you confess to having bought a more capable CPU than you otherwise needed at all (I've done that at least three times) with BOINC output in mind, you need to ask yourself whether, under current conditions, an appropriate GPU addition would not get you further per unit of expenditure.

Rechenkuenstler
Joined: 22 Aug 10
Posts: 138
Credit: 102567115
RAC: 0

Quote:
Thanks to all who posted. I recently made this decision based on the "relative compute power" that nVidia publishes http://www.nvidia.com/object/graphics_cards_buy_now.html vs the price for the cards on Amazon.

Here is a link (German-language, but intuitively understandable) which lists graphics cards by performance, combined with technical data, and which is continuously updated.

http://www.pc-erfahrung.de/grafikkarte/vga-grafikrangliste.html
