GPU usage at ~3% on RTX2070

Tom M
Joined: 2 Feb 06
Posts: 6461
Credit: 9586384537
RAC: 6874855

Tom M wrote:

Many Nvidia GPUs do not seem to run more than one gamma-ray task at a time effectively.

This might be true of gravity wave tasks too.

===edit===

The Nvidia GPU manager for Windows will tell you how loaded your GPU is.

Sorry. Conflated the Ubuntu Nvidia X-Server app with the Windows one.

----edit----

So will gpu-z.

Tom M

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)  I want some more patience. RIGHT NOW!

Tom M
Joined: 2 Feb 06
Posts: 6461
Credit: 9586384537
RAC: 6874855

Harri Liljeroos wrote:

Tom M wrote:

Many Nvidia GPUs do not seem to run more than one gamma-ray task at a time effectively.

This might be true of gravity wave tasks too.

The Nvidia GPU manager for Windows will tell you how loaded your GPU is.

So will gpu-z.

Tom M

So will the Windows 10 Task Manager. Select the Performance tab and then select your Nvidia GPU. There you see four different categories of load. Click the title of one of them (the down arrow) and select Cuda. This will show the computing load (which includes OpenCL as well) of that GPU.
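For checking the compute load from the command line instead of a GUI, `nvidia-smi` can report utilization in machine-readable CSV form. A minimal Python sketch; the helper function and the canned sample output are illustrative, not part of any E@H or NVIDIA tool:

```python
import subprocess

def gpu_utilization(sample=None):
    """Return GPU compute utilization percentages, one entry per GPU.
    Runs `nvidia-smi --query-gpu=utilization.gpu --format=csv,noheader,nounits`
    unless a sample output string is supplied (useful for testing)."""
    if sample is None:
        sample = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True).stdout
    return [int(line.strip()) for line in sample.splitlines() if line.strip()]

# Example with canned output from a single-GPU system reporting 3% load:
print(gpu_utilization(sample="3\n"))  # → [3]
```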

I hate it when I have a brain fart!

Yup.

Tom M

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)  I want some more patience. RIGHT NOW!

petri33
Joined: 4 Mar 20
Posts: 124
Credit: 4062285819
RAC: 6847609

Richie wrote:

Markus Windisch wrote:
How can I do staggered starts for GPU? All tasks are synced right now. Thanks in advance

If you have tasks in the queue already: suspend a task somewhere in the middle while it's running. Then another task should start. It's not easy to say exactly when the best moment to suspend a running task would be. Just try pausing them... and you will find good intervals.

Or you could set "no new tasks", let your queue run out, open up computing preferences and set 0 for "store at least X days of work" and "store up to an additional X days of work". Then set "allow new tasks" again. Right after the first task has begun running... set "no new tasks". Then wait... and at some point set "allow new tasks" again. Boinc should download another task and start it.
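That "no new tasks" / "allow new tasks" toggling can also be scripted against the local client with `boinccmd`, whose `nomorework` / `allowmorework` project operations do the same thing as the Manager buttons. A hedged sketch: the project URL is a placeholder for whatever URL your client is attached with, and the no-op runner is just for dry-running without a BOINC client:

```python
import subprocess

PROJECT_URL = "https://einsteinathome.org/"  # placeholder: use your attached project's URL

def set_work_fetch(allow, run=subprocess.run):
    """Toggle work fetch for one project via boinccmd.
    'allowmorework' / 'nomorework' are the boinccmd equivalents of the
    Manager's "allow new tasks" / "no new tasks" buttons."""
    op = "allowmorework" if allow else "nomorework"
    cmd = ["boinccmd", "--project", PROJECT_URL, op]
    run(cmd, check=True)  # talks to the local BOINC client
    return cmd

# Dry run with a no-op runner (no BOINC client needed):
print(set_work_fetch(False, run=lambda *a, **k: None)[-1])  # → nomorework
```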

That sounds nice.


Will that work forever without intervention?

I'd like to run 2 or N tasks at a time and have them communicating, or at least aware that only one of the tasks assigned to a particular GPU runs its GPU work at a time, alone: when it finishes, it flips a flag (a mutex) to say "now it's the other tasks' turn to run on this GPU".

With a CUDA application (not OpenCL) you could use NVIDIA-provided tools (for example the CUDA Multi-Process Service, MPS) to let many processes share a GPU at the highest throughput without changes to the code. And with the new generation of high-end cards you could virtualise the GPU and decide how big a proportion of it a given process gets.

I'm sure AMD has similar tools too.
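The flag/mutex idea above can be sketched with a per-GPU file lock: each task wraps its GPU-heavy phase in an exclusive lock, so only one task at a time actually drives the GPU while the others block. This is a hypothetical sketch (the lock path and the `work` callable are stand-ins), not how E@H applications actually coordinate:

```python
import fcntl
import os
import tempfile

# Hypothetical per-GPU lock file; one such file per physical GPU.
LOCK_PATH = os.path.join(tempfile.gettempdir(), "gpu0.lock")

def run_gpu_phase(work):
    """Serialize the GPU-heavy phase across tasks sharing one GPU:
    only the lock holder runs `work`; other tasks block until release."""
    with open(LOCK_PATH, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)   # wait for exclusive access to "the GPU"
        try:
            return work()                  # stand-in for the actual GPU kernel launches
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)

print(run_gpu_phase(lambda: "done"))  # → done
```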

--

Petri33.

Markus Windisch
Joined: 23 Aug 21
Posts: 61
Credit: 97881372
RAC: 0

Harri Liljeroos wrote:

So will the Windows 10 Task Manager. Select the Performance tab and select your Nvidia GPU there. There you see four different categories of load. Click the title of one of them (the down arrow) and select Cuda. This will show the computing load (which includes OpenCL as well) of that GPU.

Thanks a lot! I didn't know there was more!

eeqmc2_52
Joined: 10 May 05
Posts: 38
Credit: 3688740183
RAC: 910741

Nvidia driver 496.49 trashed my RTX3080Ti CUDA performance. I rolled back my driver to 496.13 and everything is performing at peak.

I did the upgrade to 496.49 twice and had the same results. It's not CUDA-friendly for E@H.

There are only 10 kinds of people in the world: those that understand binary and those that don't!
