Nvidia drivers underclock 6xx GPU?

dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0


Quote:
Well, of course we are talking "under load" here. Who cares about idle, that never happens w/ Einstein@Home ;-). The question is whether the clock frequencies displayed by nvidia-settings under Load in Linux are for real (that would be underclocking under load), or not.

But we were never told whether the 705 MHz reading was taken under load; the issue was that it showed 705 MHz at all.
The OP stated that the base clock should be 905 MHz but isn't. At the time he didn't have any system reporting use of a 680, so it could be either case. We don't even know whether it was a 680M, which doesn't go above 705 MHz on battery.

Bikeman (Heinz-Bernd Eggenstein)
Moderator
Joined: 28 Aug 06
Posts: 3522
Credit: 765241077
RAC: 1083003


I don't get it. The model is listed in the left-hand pane of the nvidia-settings dialog, isn't it? See the screenshot in the original message at the thread start. The dialog should enumerate all the available performance levels; lower clocking at idle would appear as a lower performance level being highlighted, with the other, higher levels still visible. At least that's the way nvidia-settings worked in the past. Strange.
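For reference, here is a minimal sketch of checking the same information from the command line instead of the GUI. It assumes a running X server and that this driver generation exposes the GPUPerfModes and GPUCurrentClockFreqs attributes (names as documented for nvidia-settings; availability varies by driver version):

    import subprocess

    def query(attribute, gpu=0):
        # Ask nvidia-settings for one attribute of the given GPU, terse output only.
        out = subprocess.run(
            ["nvidia-settings", "-q", "[gpu:%d]/%s" % (gpu, attribute), "-t"],
            capture_output=True, text=True, check=True)
        return out.stdout.strip()

    # GPUPerfModes lists every available performance level with its clock ranges;
    # GPUCurrentClockFreqs reports the graphics/memory clock pair in use right now.
    print("Performance levels:", query("GPUPerfModes"))
    print("Current clocks:    ", query("GPUCurrentClockFreqs"))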

Cheers
HB

HenkM
Joined: 29 Sep 09
Posts: 32
Credit: 279008202
RAC: 0


@Astrocrab
You have to read your screenshot very carefully.
Graphics Clock is not the same as GPU Clock.
You can see this in the screenshot Jeroen posted, where the graphics clock is also 705 MHz but the processor clock is 1411 MHz. The processor clock is not shown in your screenshot, which probably means you have an older version of that particular program. Update it!

You can also see that the memory clock is 3004 MHz, while the memory speed according to the specifications is 6008 MHz.
Nvidia makes cards where clocks and speeds mostly differ by a factor of 2 or 4.
If you look at the memory speed of a 600 series card in GPU-Z, you will see a memory clock of 1502 MHz.
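To make the factor of 2 or 4 explicit, here is a toy illustration using only the numbers quoted above; nothing here is read from real hardware:

    def effective_speed(memory_clock_mhz, factor=2):
        # The "memory speed" in the specifications is the real memory clock
        # multiplied by a small factor (2 or 4, depending on how it is reported).
        return memory_clock_mhz * factor

    print(effective_speed(3004))       # 6008.0 -> the figure in the specifications
    print(effective_speed(1502, 4))    # 6008.0 -> the 1502 MHz that GPU-Z reports, times 4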

The proof of the pudding is in the eating.
Go to the list of top computers: http://einstein.phys.uwm.edu/top_hosts.php
Look for a system that is comparable to yours.
You should have comparable results, that is: your system must not be a factor of 4 slower.
Some systems are considerably faster than others because of overclocking.
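As a rough sanity check in that spirit (purely hypothetical numbers, not taken from any host in the list):

    def slowdown(my_seconds, reference_seconds):
        # How many times slower your host is than a comparable reference host.
        return my_seconds / reference_seconds

    # e.g. your per-task run time vs. a comparable top host's run time
    print("%.1fx slower" % slowdown(5000, 3600))   # start worrying when this approaches 4x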

Happy crunching and don’t worry!!

astrocrab
Joined: 28 Jan 08
Posts: 208
Credit: 429202534
RAC: 0


Of course we are talking about the frequency under load, and the screenshot was taken under full load. What's the point of talking about idle clocks?

astrocrab
Joined: 28 Jan 08
Posts: 208
Credit: 429202534
RAC: 0


No, that is not it. The 500 series has a processor clock that is double the graphics clock,
but the newer 600 series has a processor clock exactly the same as the graphics clock.
A GTX 680 never has a clock of 1411 MHz; its maximum is 1006 MHz.
The reason Jeroen's screenshot reports 1411 is that he has a 295.xx driver version, and I have a 310.xx version, which reports more precisely for the 600 series.
I agree about the memory clock.
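A toy illustration of that relationship, using only the figures mentioned in this thread (nothing queried from real hardware):

    def processor_clock(graphics_clock_mhz, series):
        # 500 series (Fermi): the shader/processor clock runs at twice the graphics clock.
        if series == 500:
            return 2 * graphics_clock_mhz
        # 600 series (Kepler): the processor clock equals the graphics clock.
        if series == 600:
            return graphics_clock_mhz
        raise ValueError("only the 500 and 600 series are discussed here")

    print(processor_clock(705, 500))   # 1410 -> roughly the 1411 MHz in Jeroen's screenshot
    print(processor_clock(705, 600))   # 705  -> what a 600 series card should report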

HenkM
Joined: 29 Sep 09
Posts: 32
Credit: 279008202
RAC: 0


@Astrocrab
Drivers never change clock speeds.
They are just an interface between the OS and, in this case, the graphics card.
It would be a serious programming mistake if drivers changed parameters of the card or the OS.

I have looked at your data, especially the system with the single graphics card.
Even if you have 3 concurrent WUs on your card, 5000 seconds of processing time is far too much.
With 3 concurrent WUs on a 560 Ti, the processing time should be less than 1 hour.
Maybe you tried to overclock the card too much; then it is not unthinkable that it falls back to a very low clock speed.
If the overclocking program has a function like "Reset to Default", then you should use that function.
If it does not, and you have been overclocking, then it is better to remove the overclocking program from your system.
The graphics card always starts with the default factory settings.

It is hard to say anything about the other system as long as we don't know how many concurrent WUs both cards have in memory.

astrocrab
Joined: 28 Jan 08
Posts: 208
Credit: 429202534
RAC: 0


I'm not having a problem with my 560 Ti :) and 5000 sec is for 5 WUs at a time, so the 560 Ti runs at its best.
If you take a look at my initial screenshot, you will see that I'm talking about the 660 Ti.

Quote:
It would be a deadly programmers mistake if drivers would change parameters of the card or the OS.


There was an option in earlier versions of the Nvidia drivers to enable clock controls, so you could overclock or underclock the GPU.
And I can overclock my current ATI card using a driver option. So yes, a driver can change parameters of the card.
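For the ATI side of that claim, a minimal sketch of reading the clocks through the Catalyst driver's own OverDrive interface; the exact flags depend on the Catalyst version, so treat the option name as an assumption:

    import subprocess

    # aticonfig ships with the Catalyst driver package; --od-getclocks is the
    # read-only counterpart of its clock-setting (overclocking) options.
    out = subprocess.run(["aticonfig", "--od-getclocks"],
                         capture_output=True, text=True)
    print(out.stdout or out.stderr)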

dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0


Quote:
There was an option in earlier versions of the Nvidia drivers to enable clock controls, so you could overclock or underclock the GPU.
And I can overclock my current ATI card using a driver option. So yes, a driver can change parameters of the card.

The driver isn't changing the speed in this example; the user is operating software that does this through the driver.

Remember, Nvidia uses a unified driver. If it were the driver doing it, then all cards of all architectures using the same driver would show varying clock speeds.

astrocrab
Joined: 28 Jan 08
Posts: 208
Credit: 429202534
RAC: 0


Are nvidia-settings and aticonfig driver components or user applications?

dmike
Joined: 11 Oct 12
Posts: 76
Credit: 31369048
RAC: 0


Both are included in the package. One part interfaces the hardware with the OS (the driver). Another part gives the user software-level control over the hardware (the application).
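A small sketch of that split on Linux (the file path and tool name are the usual ones for the Nvidia package, but take them as assumptions for any particular install):

    import shutil
    from pathlib import Path

    # The kernel module (the "driver" half) exposes its version here.
    driver_info = Path("/proc/driver/nvidia/version")
    if driver_info.exists():
        print("kernel driver:", driver_info.read_text().splitlines()[0])

    # nvidia-settings is an ordinary user-space program shipped in the same
    # package; it only asks the driver to do things on the user's behalf.
    print("user tool:", shutil.which("nvidia-settings") or "not found in PATH")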
