My observations - GTX 260 Maxcore vs GTX 660 Ti Superclocked

dmike
dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0
Topic 196567

First, allow me to say that I'm no PC expert, and I'm even less experienced with BOINC. I'm not supporting or opposing any card, system, or brand; I'm just sharing my observations and my opinion based on them.

The same rig ran both cards at different times, beginning with the BFG GTX 260 Maxcore, which has 896 MB of RAM and 216 CUDA cores.

Basic system specs are as follows:
Phenom II X4 940 @ 3.0 GHz
4 GB DDR2-1066 RAM
Windows 7 x64

Under load, the 260 consumed about 150 watts. It was detected properly as having 216 cores. One WU takes between 46 and 49 minutes to complete. I have not done the math, but watching it crunch, it seems to progress at about 0.040% per second, with an occasional pause.

The 260 was removed and replaced with an EVGA GTX 660 Ti Superclocked. This card also consumes about 150 watts under load. It was detected incorrectly as having 0 CUDA cores. It has 2 GB of RAM and 1344 CUDA cores.
One WU takes between 24 and 25 minutes to complete. I have not done the math, but watching it crunch, it seems to progress at about 0.074% per second, with an occasional pause.

I surmise the following: power consumption notwithstanding, either the 260 is punching above its weight, or the 660 Ti is punching below it. Either way, the 660 Ti is faster while consuming no more power, which, other nuances aside, makes it a good upgrade.
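Doing the math that was skipped above, here is a quick sketch in Python using the midpoint figures from this thread (the 1344-core count is the GTX 660 Ti's published spec, not a number from the posts):

```python
# Rough comparison of the two cards from the figures in this thread.
gtx260 = {"cores": 216, "wu_minutes": (46 + 49) / 2, "pct_per_sec": 0.040}
gtx660ti = {"cores": 1344, "wu_minutes": (24 + 25) / 2, "pct_per_sec": 0.074}

speedup_by_time = gtx260["wu_minutes"] / gtx660ti["wu_minutes"]
speedup_by_rate = gtx660ti["pct_per_sec"] / gtx260["pct_per_sec"]
core_ratio = gtx660ti["cores"] / gtx260["cores"]

print(f"Speedup by WU time:  {speedup_by_time:.2f}x")
print(f"Speedup by progress: {speedup_by_rate:.2f}x")
print(f"Raw CUDA-core ratio: {core_ratio:.2f}x")
```

The roughly 1.9x speedup against a 6.2x raw core count is consistent with the "punching below its weight" impression, though core counts alone ignore clock and architecture differences, as noted later in the thread.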

I have moved the 260 into another Phenom II X4 box which is very similar; it replaced a Sapphire HD 4770, which is worthless for BOINC.

Overall, I'm glad I did the upgrade (I do play games as well), but I'm not certain the price was worth the performance gain. However, having two systems crunching, and upgrading the other box (used by the wife) for gaming, means that ultimately everyone is happy. But I wouldn't say "ecstatic".

archae86
archae86
Joined: 6 Dec 05
Posts: 3161
Credit: 7265021763
RAC: 1581762


What method of power measurement did you use?

dmike
dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0

Oh, I didn't measure it personally; I just looked online for the TDP ratings of each card, which they were close to under full load.

hotze33
hotze33
Joined: 10 Nov 04
Posts: 100
Credit: 368387400
RAC: 0

@dmike
The latest update of the BRP app (1.28) has changed a lot. Before, the app didn't fully utilize the GPU, so it was necessary to run more than one workunit at a time to get the maximum amount of work done and optimize efficiency. The older GeForce series (pre-Fermi) can't switch efficiently between two tasks, so running two workunits at the same time just doubled the calculation time per unit. Fermi, and now Kepler, can switch between tasks and do see benefits from running more than one workunit.
Because of the new app (thanks to the developers!), one task can now nearly saturate the GPU, so scaling depends mainly on the number of CUDA cores, PCIe bandwidth, and of course the CPU (the app still depends heavily on it).
You also have to keep in mind that Nvidia changed not only the core count but also the clock frequency of the shaders, so comparing different architectures is not so easy.
Hell, even my 9800GX2s are back in the game now. I just need a faster CPU and two full PCIe x16 slots and they can compete with my GTX 470.
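For anyone wanting to try the multiple-workunits-per-GPU approach described above: recent BOINC clients (7.0.40 and later) read an `app_config.xml` file placed in the project's directory. A minimal sketch follows; the application name shown is an assumption, so verify the exact name in your client's event log or `client_state.xml` before using it:

```xml
<!-- app_config.xml in the Einstein@Home project folder.
     gpu_usage 0.5 lets two GPU tasks share one card.
     The <name> below is a guess; check your event log
     for the app's real internal name. -->
<app_config>
  <app>
    <name>einsteinbinary_BRP4</name>
    <gpu_versions>
      <gpu_usage>0.5</gpu_usage>
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```

After saving the file, choose "Read config files" in BOINC Manager (or restart the client) for the change to take effect.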

Dirk Broer
Dirk Broer
Joined: 10 Sep 05
Posts: 13
Credit: 35196214
RAC: 26510

Quote:
I have moved the 260 into another phenom II x4 box which is very similar, and it replaced a Sapphire HD4770 which is worthless for BOINC

I use a HD4770 for MilkyWay@Home and it performs great!

Dirk Broer
Dirk Broer
Joined: 10 Sep 05
Posts: 13
Credit: 35196214
RAC: 26510

Quote:
Quote:
I have moved the 260 into another phenom II x4 box which is very similar, and it replaced a Sapphire HD4770 which is worthless for BOINC

I use a HD4770 for MilkyWay@Home and it performs great!

Furthermore, it performs far better than my GTX 260 for MilkyWay (twice as good, around 60-75k credits a day). And when MilkyWay is out of work, I switch to Collatz, Moo!, POEM, or DistrRTgen. You must install the SDK to get the HD 4770 rolling, but once it rolls, almost nothing touches it in credits/Watt.

I especially like its good double-precision performance per watt (that is why it is so good at MilkyWay).
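The credits/Watt claim can be sketched from the thread's own figures. The ~80 W board power for the HD 4770 is an assumption (not stated in the thread); the GTX 260's ~150 W figure and the "twice as good" ratio come from the posts above:

```python
# Back-of-the-envelope credits/day/Watt from the thread's figures.
# ASSUMPTION: HD 4770 board power ~80 W (not stated in the thread).
hd4770_credits_per_day = (60_000 + 75_000) / 2      # "around 60-75k a day"
gtx260_credits_per_day = hd4770_credits_per_day / 2  # "twice as good"

hd4770_cpw = hd4770_credits_per_day / 80   # assumed ~80 W
gtx260_cpw = gtx260_credits_per_day / 150  # ~150 W per the thread

print(f"HD 4770: {hd4770_cpw:.0f} credits/day/W")
print(f"GTX 260: {gtx260_cpw:.0f} credits/day/W")
```

Under those assumptions the HD 4770 comes out several times ahead on credits per watt for double-precision work, which is consistent with the claim above.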

dmike
dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0

Sorry, I should have said E@H rather than BOINC.
