Nvidia Pascal and AMD Polaris, starting with GTX 1080/1070, and the AMD 480

Daniels_Parents
Joined: 9 Feb 05
Posts: 101
Credit: 1877689213
RAC: 0


Release Notes 367 Graphics Drivers for Windows, Version 368.39 (June 7, 2016)

Page 12:

Windows 10 Fixed Issues

... GeForce GTX 1080 Founders Edition cards spin fan up and down rapidly.
[1771960] ...

... Stutter occurs during full-screen playback of YouTube videos in Edge
browser. [1769515] ...

... System hangs during transition from monitor sleep to system sleep. [1757517] ...

Arthur

I know I am a part of a story that starts long before I can remember and continues long beyond when anyone will remember me [Danny Hillis, Long Now]

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0


..and some news about the RX 460 and 470 here

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0


Maxwell GPUs are getting a price drop this week; the GTX 980 Ti drops 125 USD.

I suspect this will not be the first.

Sid
Joined: 17 Oct 10
Posts: 164
Credit: 971808924
RAC: 426395


Quote:

I mentioned the power efficiency of the 750 cards previously. I have one dual-750Ti PC, which runs just under 140,000 credit/day at 173.1 wall watts operation, up from about 74 watts idle.

How many WUs are you running simultaneously on one 750 ti card?

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7225624931
RAC: 1051220


Quote:
How many WUs are you running simultaneously on one 750 ti card?


I currently run 2X on all cards in my flotilla (one 970, one plain-Jane 750, and four 750 Ti SC). I also overclock all those cards save one in both core clock and memory clock. The 750s gain less than the 970, and one 750 Ti is running stock.

Oops, after I wrote that I finally remembered that my newest PC, the only one running a single card (a 750 Ti SC), is currently running 4X. That is a recent change, part of my preparation for an initial evaluation of a GTX 1070 on this host. As part of that work I did a quick comparison of 2X, 3X, and 4X productivity on this 750 Ti for Einstein BRP6 CUDA55, and observed rather small differences, with 3X being the best, but on too small a sample size and too informal an observation to be at all sure.
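For anyone wanting to reproduce this kind of 2X/3X/4X comparison, task multiplicity is set through BOINC's app_config.xml rather than anything on the card. A minimal sketch; the application name einsteinbinary_BRP6 is an assumption here, so check client_state.xml for the exact name on your own host:

```
<!-- app_config.xml: place in the Einstein@Home project directory,
     then use Options -> Read config files in the BOINC Manager.
     gpu_usage 0.25 means four tasks share one GPU; 0.5 would be 2X. -->
<app_config>
  <app>
    <name>einsteinbinary_BRP6</name>
    <gpu_versions>
      <gpu_usage>0.25</gpu_usage>
      <cpu_usage>0.2</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
```

The cpu_usage value only affects scheduling bookkeeping, not how much CPU the task actually uses.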

On a matter closer to the primary thread topic, Amazon has set the status of my MSI GTX 1070 FE card to shipping today, arriving Monday 6/20. When it arrives I initially plan to swap it for the 750 Ti currently running in my most recently built host, which has a quad-core Haswell on which I currently allow only one CPU task. That 750 Ti is my laggard--it would not overclock at more than a modest level, and as it is factory set to a higher voltage than any of my other 750 cards, it gets into power limitation (before it actually fails) at overclocks all my other 750 cards can run.

[edited to correct false assertion that all cards are running 4X]

Sid
Joined: 17 Oct 10
Posts: 164
Credit: 971808924
RAC: 426395


Quote:


I currently run 2X on all cards in my flotilla (one 970, one plain-Jane 750, and four 750 Ti SC). I also overclock all those cards save one in both core clock and memory clock. The 750s gain less than the 970, and one 750 Ti is running stock.


Thank you, archae86.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7225624931
RAC: 1051220


My MSI GTX 1070 Founders Edition Pascal card arrived today, and has been running in the system I call Stoll9 for about an hour. I first set it the task of finishing a few work units which I had already run to about 90% completion at 4X on a 750 Ti, so don't be alarmed at the long elapsed times. The good news is that work performed at 1X, 2X, 3X, and 4X has already validated, so it at least gets right answers when set the task of completing a WU.

Initial impressions:
It was easy to install, even giving me a no-nonsense tongue-lashing for my error of failing to hook up the 8-pin PCIe power connector. The very first characters I saw on the screen were a large-font admonition to power the machine down and hook up the power cable. Maybe this has been universal for a long time, but it is the first time I've seen it. The mechanism it used must have been a very low-level, brutal one, as my BIOS reported an overclocking failure on the next boot (my host system is not actually overclocked). Just ignoring it and rebooting was the right answer.

Power consumption seems a bit less than I expected, with roughly 100 watts added to wall-socket power draw over the base when running BRP6 at 4X (or 3X or 2X). I assume this means the card itself is pulling no more than 90 watts.
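A quick sanity check on that inference: if the power supply is roughly 90% efficient at this load (an assumed figure, not a measurement), about nine tenths of the 100 W wall-socket delta actually reaches the card.

```python
# Back-of-envelope: DC power delivered to the card, estimated from the
# extra AC draw measured at the wall. psu_efficiency is an assumption;
# real supplies vary with load (often roughly 85-92% in this range).
def card_power_from_wall_delta(wall_delta_watts, psu_efficiency=0.90):
    """Estimated DC watts reaching the card."""
    return wall_delta_watts * psu_efficiency

print(card_power_from_wall_delta(100))  # about 90 W
```

The real number could be a little higher or lower depending on the supply's efficiency curve, but it supports the "no more than 90 watts" reading.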

Despite this low power consumption, the non-aggressive default fan scheduling is letting the GPU temperature get quite high: about 80C in a room at 83F ambient, with the fan displaying only 53% speed. At that speed the fan is not annoying in tone, nor loud compared to a case I regard as pretty quiet.

The core clock rate starts at about 1500 MHz, but soon spikes way up, then sags a bit. It seems willing to go slightly over 1900 MHz when the GPU is very cool, but quickly sags back to about 1835 MHz at my current conditions. By contrast the memory clock has never varied while running BRP6; it sits at what GPU-Z reports as 1901.2 MHz. As the spec would lead me to expect 2000, I suspect this board has a mild form of the distributed-computing memory clock limitation we saw in pretty severe form on Maxwell2 cards. As many reviewers report being able to overclock memory on 1070s by substantial amounts, and historically we have thought the Einstein BRP applications to be particularly hungry for memory speed, there may be a fancy overclocking opportunity here again.
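For context on those numbers: for GDDR5, GPU-Z reports the memory clock, and the effective transfer rate is four times that (double data rate on a double-pumped write clock). A small sketch of the shortfall the 1901.2 vs. 2000 figures imply:

```python
# GDDR5 moves data at 4x the reported clock, so GPU-Z's MHz figure
# maps to an effective rate in MT/s as follows.
def effective_gddr5_rate_mtps(reported_mhz):
    """Effective transfer rate in MT/s from the GPU-Z reported clock."""
    return reported_mhz * 4

observed = effective_gddr5_rate_mtps(1901.2)   # observed on this card
spec     = effective_gddr5_rate_mtps(2000.0)   # 8000 MT/s per spec
shortfall_pct = 100 * (1 - observed / spec)    # about 5% below spec
print(observed, spec, round(shortfall_pct, 1))
```

So the limitation described above costs roughly 5% of rated memory bandwidth, which is mild compared to the Maxwell2 behavior referenced.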

The high GPU temperature, and what I think is likely a strong adjustment of core clock in response to temperature, suggest that people buying 1070 cards for BOINC may be well advised to avoid the Founders Edition in favor of cards reviewed as having more effective cooling. Adjusting the fan speed control curve may also yield actual performance improvement at default settings, as may better case ventilation, all the way up to the extreme "caseless with a window fan" approach. Putting one of these in a tiny case with whimsical case fans may well lose some available performance, even though it is only consuming 100 W or less.

I'll have more to say later about performance observations. My short-term plan is to run full WUs (not previously started) at 1X, 2X, 3X, and 4X, then settle for the short-term on one configuration and let it run for a while, not yet touching any OC knobs, case fan speeds, or such.

Keith Myers
Joined: 11 Feb 11
Posts: 4964
Credit: 18748686812
RAC: 7072788


From the 10X0 board reviews I've seen so far, the common thread on loaded temps is that the cards are set up for low noise and NOT for maintaining loaded core and memory clock speeds. The simple solution for most reviewers is a 60% manual fan profile, which maintains GPU Boost core and memory clock speeds with a still-acceptable fan noise profile. I assume that Nvidia Inspector, EVGA Precision, or MSI Afterburner are still capable of boosting memory clocks back to stock. Is the card running in the P0 or P2 power state when running the Einstein tasks? Is the card gimped on purpose for GPGPU work by Nvidia, as with the Maxwell 2 cards?

 

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7225624931
RAC: 1051220


Quote:
Is the card running in P0 or P2 power state when running the Einstein tasks?


I just started up Nvidia Inspector to examine this point, and I believe it indicated I was running in the P2 state.

There may be an opportunity there. It has been about 18 months since I worked through this matter on my 970, and I've forgotten a lot, and the nvidia inspector user interface looks a bit different. I'll put off this particular joy for a little while yet.

Keith Myers
Joined: 11 Feb 11
Posts: 4964
Credit: 18748686812
RAC: 7072788


You can either use the overclocking interface or do as I do and use the command-line method of adjusting core and memory clocks. Either way, remember you can only adjust the clocks when the card is not loaded by the BOINC client. Exit the client and manager, adjust the clocks, then restart the Manager. Your clock boosts will then stick.

Using the overclocking interface, you just adjust the sliders for core and memory for the P0 pulldown setting, then apply the settings. Go back to the pulldown tab and select the P2 state. Push the memory slider all the way to the right and apply. That will make the P2 memory clock stick at its new adjusted setting.

I can't tell you what the stock clock speeds are supposed to be for your card. They all differ depending on whether it's a reference design or factory overclocked.

Einstein always responds best to memory overclocks rather than core clocks.

This is an example of my command line settings for my two GTX 970 cards. I used the Inspector's [Create Clocks Shortcut] tool to make a shortcut for each 970 on my Desktop. When I start the computers up I just double-click the shortcuts before I start the Client and Manager and my overclocks are automatically applied to the cards.

nvidiaInspector.exe -setBaseClockOffset:0,0,40 -setMemoryClockOffset:0,0,100 -setMemoryClock:0,2,3605

nvidiaInspector.exe -setBaseClockOffset:1,0,40 -setMemoryClockOffset:1,0,100 -setMemoryClock:1,2,3605

These settings overclock the core clocks by 40 MHz and the memory clocks by 100 MHz. The first parameter enumerates the card (only necessary if you have multiple Nvidia cards in your system). The second number identifies the power state. The third number sets the clock offset boost/buck or sets the target clock frequency. These command-line parameters do the same thing as using the sliders, but in one step.

You could also use Nvidia Inspector to set the 60% fan speed target I suggested.

 
