Laptop running an Intel i7-8750H (6 cores, 2.2 GHz base, unboosted) and an Nvidia RTX 2060 Mobile. Compute prefs set to 25% of CPUs, 30% of the time. Also running Folding@Home on its Light setting. I'm not running any new apps or settings, and I haven't subscribed to any new distributed computing projects.
Since receiving hsgamma assignments, my CPU ramps right up to 100C because BOINC won't let me specify GPU preferences.
How can I scale it back and/or give resources from the dedicated GPU?
G J B wrote: Compute prefs set to 25% of CPUs, 30% of the time.
Please be aware that 30% of the time means running full bore for 1 second, then sitting idle for ~2.33 seconds. That is likely to create thermal cycling (expansion and contraction effects) as the cores heat up and cool down. I'm no expert, but that doesn't sound particularly good to me. Apart from that, unless you have a really good cooling solution (and 100C suggests you don't), there's quite a bit of danger in high-intensity compute operations on laptops.
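The duty-cycle arithmetic above can be sketched like this (a hedged illustration only; `throttle_cycle` is a made-up helper, not BOINC code, and the fixed 1-second burst is the approximation described in this post, not a guaranteed client behaviour):

```python
# Sketch of how a "use at most X% of CPU time" throttle plays out,
# assuming a fixed 1-second full-speed burst per on/off cycle.
def throttle_cycle(cpu_time_pct, burst_s=1.0):
    """Return (run_s, idle_s) for one throttle cycle.

    cpu_time_pct: the 'use at most X% of CPU time' preference value.
    """
    frac = cpu_time_pct / 100.0
    # run_s / (run_s + idle_s) == frac  =>  idle_s = run_s * (1 - frac) / frac
    idle_s = burst_s * (1.0 - frac) / frac
    return burst_s, idle_s

run_s, idle_s = throttle_cycle(30)
print(f"run {run_s:.2f}s, idle {idle_s:.2f}s")  # prints: run 1.00s, idle 2.33s
```

At 30%, that's roughly one second of full load followed by ~2.33 seconds of idle, over and over, which is exactly the repeated heat-up/cool-down pattern described above.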
With BOINC, you either use a GPU or you don't. You can't do the same throttling as you can with CPUs.
By telling BOINC not to use the internal GPU.
When BOINC starts, the event log shows it detecting the GPUs. By default, BOINC uses what it considers to be the 'best' GPU, and for some reason it must be deciding that the Intel GPU is 'best'. You can configure BOINC to change that. Read through the BOINC User Manual on Client Configuration for options like <ignore_intel_dev>N</ignore_intel_dev>, among others. You need to create a cc_config.xml file; there are full instructions in the documentation. Just read it carefully and ask specific questions if anything is unclear. I don't need this level of control myself, so I have no first-hand experience with these options for turning specific GPUs on or off.
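As a concrete illustration (a sketch, not a tested recipe — the device number 0 is an assumption; check the GPU-detection lines in your own event log for the Intel GPU's actual index), a cc_config.xml that tells the client to skip the Intel GPU would look something like:

```xml
<cc_config>
  <options>
    <!-- Ignore Intel GPU device 0 so BOINC falls back to the NVIDIA GPU.
         Replace 0 with the device number shown in your event log. -->
    <ignore_intel_dev>0</ignore_intel_dev>
  </options>
</cc_config>
```

The file goes in the BOINC data directory; after saving it, use Options → Read config files in BOINC Manager (or restart the client) and check the event log to confirm which GPUs are now being used.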
Cheers,
Gary.
It's also worth looking at the server log for the computer. So far, I've seen:
'CUDA' would be the RTX 2060 - it seems as if you're not even requesting work for that device.
That looks as if the RTX 2060 is too recent to be supported on this project - which would be surprising. But maybe this one's OK: