In my computing preferences I specified "use at most 25% of processors", and in my project preferences all my GPU utilization factors are 1.00. I'm only running two GPU applications, and I know from experience that each takes 25% of the CPU (I have a Core i5 processor, meaning 4 cores). My preference to use at most 25% of processors may be new; I really don't remember.
Also, I run a 2-CPU LHC@home application, so it takes 50% of the processor. I've had problems before with LHC tasks ending in computation errors. So when I allowed new tasks from Einstein I expected to download two tasks, but I expected only one of them to run. They used to start ~1 minute apart, and maybe that would have happened again. But the LHC task quickly dropped to 25%, and then, as I struggled to abort both Einstein tasks (I don't know why I didn't go for pause instead), LHC dropped to 0% and "Waiting to run". LHC tasks don't like to be interrupted.
My memory is only 6 GB (I have 8 GB more on order, which will give me 12 GB), and the amount that was free was just a few MB, although ~2900 MB was available. So I suspect it was a memory problem.
My question is this: if I hadn't had the LHC task running, given my Einstein preferences, would only one of the Einstein tasks have started? If the answer is yes, LHC had a memory problem. If the answer is no, why wouldn't my specifying "use at most 25% of processors" result in just one task running?
"Remember, nothing that's good works by itself, just to please you. You have to make the damn thing work." Thomas A. Edison
Steveplanetary wrote: In my …
Computing preferences are global and apply to all projects running on the machine. If you set them to different values on different projects, BOINC will use the most recently saved set, and those values will then propagate to the other projects. It's recommended to pick a single project site for setting your computing preferences so you don't get confused.
If you set the global prefs at Einstein and then allowed work, the new prefs would be downloaded along with the new work and applied. So the limit of using only 25% of the processors would also apply to LHC, and when one of the Einstein tasks started, the LHC task was paused.
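The arithmetic behind this can be sketched as follows. This is a simplification, not BOINC's actual scheduler code: it assumes the "use at most N% of processors" setting is applied by flooring the product and always leaving at least one core usable.

```python
import math

def usable_cpus(total_cores: int, max_pct: float) -> int:
    """Estimate how many cores a 'use at most N% of processors'
    preference allows (assumed floor-with-minimum-of-one behavior)."""
    return max(1, math.floor(total_cores * max_pct / 100.0))

# A 4-core Core i5 with the 25% limit from the post:
print(usable_cpus(4, 25))   # 1 core, so a 2-CPU LHC task no longer fits
print(usable_cpus(4, 50))   # 2 cores, enough for the 2-CPU LHC task
```

On this reading, the 25% limit leaves only one usable core, which would explain why the 2-CPU LHC task ended up paused once an Einstein task claimed it.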
Free memory = wasted memory
Windows uses free memory to cache programs and files for quicker access, but these cached pages can quickly be dropped from memory if a running program needs more. Look at the "available" number to see if you are close to running out of memory for your active programs.
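In other words, "available" is roughly the truly free pages plus the cached (standby) pages Windows can reclaim instantly. A toy illustration with numbers like those in the post (the function name and the exact split are illustrative, not Windows API values):

```python
def available_mb(free_mb: float, standby_mb: float) -> float:
    """Available memory ~= truly free pages plus standby (cached)
    pages the OS can hand to a program on demand."""
    return free_mb + standby_mb

# Only a few MB 'free', yet ~2900 MB available, as in the post:
print(available_mb(50, 2850))  # 2900.0
```

So a near-zero "free" figure by itself does not mean the machine was out of memory.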
Are you sure both Einstein tasks started to run at the same time?
Or did one start and then you aborted it and then the other one started?