Suggestion for E@H preferences: GPU Utilization

PsiberMan
Joined: 16 Apr 07
Posts: 11
Credit: 793546
RAC: 0
Topic 197953

Every time I go in to edit my E@H preferences, I shudder... What if I accidentally change something in any of the three GPU Utilization options...? Will I wake up one morning and find my machine a slag of melted metal and charred plastic...?

Sure, I exaggerate... But having just replaced a 0.5 TB hard drive due to (suspected) overheating (lots of cat hair blocking 95% of the massive heatsink's airflow), I know what heat can do. Seriously, this is a frightening message:

DANGEROUS! Only touch this if you are absolutely sure of what you are doing!
Wrong setting might even damage your computer! Use solely on your own risk!
Min: -1.0 / Max: 1.0 / Default: 1.0, negative values will disable GPU tasks of this type

In the interest of simplicity, and in consideration of technotards like me, why not have a "DO NOT USE ANY GPU UTILIZATION STUFF?" setting, followed by a binary YES/NO choice?

When I read that "dangerous" message, I am tempted to set them all to "-1" to disable, but since I am NOT ABSOLUTELY SURE of what I am doing, I leave them alone. I actively avoid even seeing them because I am not absolutely sure. Which almost guarantees that one day I will set one of the three "1" values to something like 1.3, or 2, or even 0, quite by accident. I shall lie awake at night, waiting for the telltale smell of melting plastic and oxidizing metal...

But rather than that, I'll just turn the machine off, or just stop running E@H altogether... And after 7 years of running Einstein@home, that would be a shame...

So, how about it? Add something like this to the E@H preferences?

Do NOT use any GPU Utilizations? YES|NO

A hard drive is a lot easier to replace than a GPU in an HP All-In-One machine.

Thanks!

-_=G.g

Phil
Joined: 8 Jun 14
Posts: 579
Credit: 228493502
RAC: 0

Quote:

Every time I go in to edit my E@H preferences, I shudder... What if I accidentally change something in any of the three GPU Utilization options...? Will I wake up one morning and find my machine a slag of melted metal and charred plastic...?

[...]

So, how about it? Add something like this to the E@H preferences?

Do NOT use any GPU Utilizations? YES|NO

It would be a shame to lose a cruncher that has been here this long. Hopefully I can make this easier for you.

The GPU utilization factor doesn't actually appear to be an issue for you at all: no GPU is listed for your computer. I'm guessing it's an older machine whose video hardware Einstein can't take advantage of for crunching, so it is running CPU units only, if I read your tasks right. There is nothing wrong with that at all; any work performed helps the project.

The settings in question can be deceiving unless someone has explained them. They simply tell your computer how many GPU (Graphics Processing Unit, i.e. video card) work units to run at the same time. A factor of 1 means run one at a time, a factor of 0.5 means run two at a time, a factor of 0.33 means run three at a time, and so on.

Running more than one work unit at a time can make your video card run hotter, hence the "danger" of changing a setting if you go too far with it. Most modern video cards can run 2 or 3 work units at a time with no problems. Some members here run 4 or more at a time.
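If it helps to see that arithmetic written out, here's a rough sketch in Python (purely illustrative, it is not anything the project or BOINC actually runs) of how the factor maps to the number of tasks run at once:

import math

def concurrent_tasks(utilization_factor):
    """How many GPU tasks run at once for a given factor (illustrative sketch only)."""
    if utilization_factor < 0:
        return 0  # negative values disable GPU tasks of this type, per the preference text
    if utilization_factor == 0:
        raise ValueError("0 is not covered by the preference text; avoid it")
    return math.floor(1.0 / utilization_factor)  # each task "uses" that fraction of the GPU

for factor in (1.0, 0.5, 0.33, 0.25):
    print(factor, "->", concurrent_tasks(factor), "task(s) at a time")  # 1, 2, 3, 4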

My personal machines are set to run 3 at a time with no problems with heat.

I personally would like to see these settings changed to say how many work units to run at the same time, instead of using a factor. That's what the setting means anyway, and it would be easier for non-programmers to understand.
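That change would just expose the reciprocal of the factor. A hypothetical helper (nothing the preferences page actually offers) shows the relationship:

def factor_for(tasks_at_once):
    """Hypothetical helper: the factor to enter for a desired number of simultaneous GPU tasks."""
    if tasks_at_once < 1:
        raise ValueError("need at least one task at a time")
    return round(1.0 / tasks_at_once, 2)  # e.g. 2 -> 0.5, 3 -> 0.33, 4 -> 0.25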

Phil

Gavin
Joined: 21 Sep 10
Posts: 191
Credit: 40643371168
RAC: 1582282

Whilst it's true a more in-depth explanation of the GPU utilisation factor may be helpful to some, I very much doubt you will accidentally change these settings now that you are aware and suitably scared (you shouldn't be, really ;-)). Accidentally going from the default setting of 1 to, say, 2, 3 or 657 will make no change, as the maximum value is, as the message states, 1. You would have to accidentally enter a value that's less than 1 but greater than 0 for any change (beneficial or detrimental) to take effect, i.e. a setting of 0.5 is required to run two tasks, 0.33 for three, and so on. For those who are interested, there is plenty of info related to this subject on Einstein's various sub-forums...
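To put that another way, here's a rough sketch (plain Python, based only on the limits quoted in the warning message: Min -1.0 / Max 1.0 / Default 1.0; it is not the project's actual validation code) of what happens to the stored setting when you type something in:

def apply_preference(entered, current=1.0, minimum=-1.0, maximum=1.0):
    """Value the preference ends up holding after an edit (illustrative sketch only)."""
    if entered < minimum or entered > maximum:
        return current   # out-of-range entries (2, 3, 657...) are rejected, so nothing changes
    return entered       # in-range values are stored; negative values disable GPU tasks of this type

print(apply_preference(657))   # 1.0  -> setting unchanged
print(apply_preference(0.5))   # 0.5  -> two tasks at a time
print(apply_preference(-1.0))  # -1.0 -> GPU tasks of this type disabled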

The disclaimer is there to make the user aware that changing the default settings can and will increase GPU/system loads and temperatures within their computer, and that the project cannot be held liable for a user pushing their system components to the limit of failure... for whatever reason, including an excessive build-up of cat hair :-)

Gavin.

PsiberMan
Joined: 16 Apr 07
Posts: 11
Credit: 793546
RAC: 0

Thanks, Phil!

I actually thought I had a rather modern machine ... It was new in March of 2012, anyway, an HP all-in-one 520-1080 (I think)...

All the same, I kind of grasp the idea of work units (having been a programmer for 24 years, until the turn of the century anyway...), like when I assign a resource share, for example. I forgot how programmers love to "eschew obfuscation", and I didn't grasp what the text was actually telling me. Of course, I suppose I could have read the BOINC documentation, but I never have, not since SETI started using it.

I was thinking it would be more along the lines of sharing video frequencies, or some such... And I thought my machine had an Intel video CPU. I know it's a dual processor, anyway.

I believe I will actually sleep better tonight, and will attempt to do so without the prescribed sleeping aid I've been on for the last 2 years. (insert wry grin and tongue-in-cheek here)...

Thanks again!

-_=G.g

PsiberMan
Joined: 16 Apr 07
Posts: 11
Credit: 793546
RAC: 0

Thanks, Gavin ... I shall indeed be resting easier tonight. In fact, it's past my bedtime now, but I think I'll listen to the gigacycle hum of my dual processor (sans GPU... :( ) ;)

-_=G.g

Phil
Joined: 8 Jun 14
Posts: 579
Credit: 228493502
RAC: 0

Just for grins and giggles...

http://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&uact=8&sqi=2&ved=0CB4QFjAA&url=http%3A%2F%2Fark.intel.com%2Fproducts%2F53480%2FIntel-Pentium-Processor-G620-3M-Cache-2_60-GHz&ei=eYbTVPbiK8ikgwSq7oHoCQ&usg=AFQjCNFr8EyY6I5_d1brLUUGr2SojQulBw&sig2=r28sB0UgupVnAZL5zNjT3w&bvm=bv.85464276,bs.1,d.cWc

Sorry for the long link, I just woke up and my brain is failing me.

You are right, your computer is not that old. It just uses a processor that was not in production that long.

Nothing wrong with that at all. If memory serves, one of our big crunchers 'round these parts uses some much older and slower gear, he just has a bunch of it. Keep that thing running until it blows up in a few years. All the data is useful.

If you have not been there yet, try the link to the Einstein home page. There you will find a link to the statistics pages, where you can look at top hosts, top teams, and such. I think it's fun to look at the equipment other folks are running, and you might find someone running a similar machine.

Phil

PsiberMan
Joined: 16 Apr 07
Posts: 11
Credit: 793546
RAC: 0

Yep. I look all the time. Well, I did. Then BOINC'ing got popular, and my 99th percentile turned into a 95, then an 88, then a 76 ... I got tired of being depressed, so I stopped looking... :)

And I have used as many as 4 older machines to crunch simultaneously, but then lost my apartment (this machine kept running, though). They're in storage right now, but probably won't ever run again on their own. Well, maybe one of them if I can ever dig it out from under the other flotsam and jetsam.

Wait ... those grins and giggles ... that info is for this (my) 520 machine?? Its two-year life cycle? Seriously?? Schinns 'n kikkles, more like it... ;)

I wonder what the life cycle was for the 8086, likewise the 486 & 586... or is THIS the 586... I lost track back around windows 3.1... ;)

-_=G.g

Phil
Joined: 8 Jun 14
Posts: 579
Credit: 228493502
RAC: 0

Quote:
I lost track back around windows 3.1.

Rofl. I can remember being so excited when I finally got 3.1 to put on a 286 machine and thinking this is the ticket!

These days I run only Mac and Linux.

Phil

PsiberMan
Joined: 16 Apr 07
Posts: 11
Credit: 793546
RAC: 0

Given the opportunity to do it all again? I'd make those same choices. Windows is the epitome of the definition of a virus, and Linux is to DOS as, say, VMS is to CP/M. And I really liked all of those. Well, CP/M maybe not so much, but...

-_=G.g

robertmiles
Joined: 8 Oct 09
Posts: 127
Credit: 20843095
RAC: 71763

One of my computers currently needs a limit on its GPU use if it is to do much at all with its GPU. The GPU heatsink is probably clogged, but it's so unreachable that I can't unclog it.

I've already bought a newer graphics board, but it came without the cables I'll need to supply its power connector. I had to order the power cables separately; they might arrive in about a week.

Until then, could you at least allow a setting that says send this computer CPU-only workunits, but no GPU workunits?

Claggy
Joined: 29 Dec 06
Posts: 560
Credit: 2694028
RAC: 0

Quote:
Until then, could you at least allow a setting that says send this computer CPU-only workunits, but no GPU workunits?


Every project has project preferences; included in those project preferences are 'Use CPU', 'Use NVIDIA GPU', 'Use AMD/ATI GPU' and 'Use Intel GPU' settings. Just deselect the ones you don't want.

And don't say that will affect your other computer: each project has four locations/venues, so you can set four different sets of computing or project preferences. Just move one of your computers to a different location/venue and set the preferences to suit it.
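As a toy illustration (hypothetical Python, just modelling the idea, not anything BOINC actually runs): each venue holds its own set of resource checkboxes and each host points at one venue, so editing one venue leaves the other computers alone.

# Toy model of BOINC-style locations/venues (illustrative only; names are made up).
venues = {
    "default": {"use_cpu": True, "use_nvidia_gpu": True, "use_amd_gpu": True, "use_intel_gpu": True},
    "home":    {"use_cpu": True, "use_nvidia_gpu": False, "use_amd_gpu": False, "use_intel_gpu": False},
}

hosts = {"clogged-gpu-box": "home", "healthy-box": "default"}

def wants_gpu_work(host):
    """Whether a host's venue asks for any GPU work at all."""
    prefs = venues[hosts[host]]
    return prefs["use_nvidia_gpu"] or prefs["use_amd_gpu"] or prefs["use_intel_gpu"]

print(wants_gpu_work("clogged-gpu-box"))  # False: CPU-only work for this host
print(wants_gpu_work("healthy-box"))      # True: unaffected by edits to the "home" venue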

Claggy
