CPU Life Expectancy

Division Brabant.Schaduwtje
Division Braban...
Joined: 29 Oct 08
Posts: 34
Credit: 5526816
RAC: 0
Topic 194058

I am relatively new to distributed computing projects. I started at about the beginning of the month with the Einstein@home project because it attracted me the most of the BOINC projects - and BOINC is a simple and convenient platform for me.

My motivation was that my machine is on 100% of the time anyway, because I use it as a server, e.g. ftp, svn, apache http and tomcat, etc. (besides the usual work of writing documents and typing emails on it). Running a project that is worth something to science seems like a good way to at least put the unused capacity of my machine to work. I have run Einstein at 100% for about the entire month, and it doesn't seem to affect the performance of the applications I use.

However, now I am wondering what this does to my CPU's life expectancy. Is it known whether fully loading the CPU all of the time has a negative effect? Would it, for instance, be better to load it only when I am not working on it, so that it gets 'rest' (e.g. cools off) while I work? Are any numbers available on the impact of distributed computing projects on CPU life? Or should the CPU simply handle the load without problems, since it is made for that? I did not overclock the machine or anything like that (of no use to me anyway). Does anyone have experience with dead CPUs, perhaps caused by running such distributed projects?
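For what it's worth, BOINC itself exposes computing preferences for exactly this trade-off, so you don't have to choose between "always 100%" and "off". A minimal sketch of a `global_prefs_override.xml` (the element names come from BOINC's preferences scheme; the values here are just examples, not recommendations):

```xml
<global_preferences>
  <!-- 0 = pause BOINC work while the keyboard/mouse are in use -->
  <run_if_user_active>0</run_if_user_active>
  <!-- minutes of idle time before computation resumes -->
  <idle_time_to_run>3</idle_time_to_run>
  <!-- cap CPU time used by BOINC tasks, in percent -->
  <cpu_usage_limit>100.000000</cpu_usage_limit>
  <!-- percentage of processor cores BOINC may use -->
  <max_ncpus_pct>100.000000</max_ncpus_pct>
</global_preferences>
```

The same settings are reachable through the BOINC Manager's computing preferences dialog, which is the safer route if you'd rather not edit the file by hand.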

tullio
tullio
Joined: 22 Jan 05
Posts: 2118
Credit: 61407735
RAC: 0

CPU Life Expectancy

Quote:

I am relatively new to distributed computing projects. I started at about the beginning of the month with the Einstein@home project because it attracted me the most of the BOINC projects - and BOINC is a simple and convenient platform for me.

My motivation was that my machine is on 100% of the time anyway, because I use it as a server, e.g. ftp, svn, apache http and tomcat, etc. (besides the usual work of writing documents and typing emails on it). Running a project that is worth something to science seems like a good way to at least put the unused capacity of my machine to work. I have run Einstein at 100% for about the entire month, and it doesn't seem to affect the performance of the applications I use.

However, now I am wondering what this does to my CPU's life expectancy. Is it known whether fully loading the CPU all of the time has a negative effect? Would it, for instance, be better to load it only when I am not working on it, so that it gets 'rest' (e.g. cools off) while I work? Are any numbers available on the impact of distributed computing projects on CPU life? Or should the CPU simply handle the load without problems, since it is made for that? I did not overclock the machine or anything like that (of no use to me anyway). Does anyone have experience with dead CPUs, perhaps caused by running such distributed projects?


I ran 2 BOINC projects (Einstein and SETI) 24/7 from 2004 to 2008 on a 400 MHz PII with no problem. Since January I have been running 6 projects 24/7 on an AMD Opteron 1210 at 1.8 GHz, also with no problem. I am not an overclocker.
Tullio

Holmis
Joined: 4 Jan 05
Posts: 1118
Credit: 1055935564
RAC: 0

I did run a P4 2.4 GHz for

I ran a P4 2.4 GHz for about 5 years on different BOINC projects and never had a problem with it.

Now I run an Intel Q9450 and it seems to work OK.

Pooh Bear 27
Pooh Bear 27
Joined: 20 Mar 05
Posts: 1376
Credit: 20312671
RAC: 0

If you keep the system clean,

If you keep the system clean and the cooling constant, you should not have a problem for years to come.

Many of us have run systems for years without any adverse effects on them.
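Keeping an eye on temperatures is the practical way to verify the cooling really is constant under full load. A minimal sketch that reads the thermal zones the Linux kernel exposes under sysfs (paths and zone names vary by kernel and hardware; this is an illustration, not a BOINC feature):

```python
from pathlib import Path


def millideg_to_celsius(raw: str) -> float:
    """Convert a /sys/class/thermal reading (millidegrees C) to degrees C."""
    return int(raw.strip()) / 1000.0


def read_cpu_temps(base: str = "/sys/class/thermal") -> dict:
    """Return {zone label: temperature in C} for every thermal zone present."""
    temps = {}
    for zone in sorted(Path(base).glob("thermal_zone*")):
        try:
            raw = (zone / "temp").read_text()
            name = (zone / "type").read_text().strip()
            temps[f"{zone.name} ({name})"] = millideg_to_celsius(raw)
        except (OSError, ValueError):
            continue  # zone disappeared or reported garbage; skip it
    return temps


if __name__ == "__main__":
    for label, temp in read_cpu_temps().items():
        print(f"{label}: {temp:.1f} °C")
```

Running it periodically (e.g. from cron) before and after enabling 24/7 crunching shows whether the load actually pushes the CPU near its thermal limits; tools like `lm-sensors` give the same information with less effort.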

Division Brabant.Schaduwtje
Division Braban...
Joined: 29 Oct 08
Posts: 34
Credit: 5526816
RAC: 0

Ok, thanks for the reassuring

Ok, thanks for the reassuring words. I'll just keep it running 24/7 now. :-)

mikey
mikey
Joined: 22 Jan 05
Posts: 11944
Credit: 1832479146
RAC: 216457

RE: I am relatively new to

Quote:

I am relatively new to distributed computing projects. I started at about the beginning of the month with the Einstein@home project because it attracted me the most of the BOINC projects - and BOINC is a simple and convenient platform for me.

My motivation was that my machine is on 100% of the time anyway, because I use it as a server, e.g. ftp, svn, apache http and tomcat, etc. (besides the usual work of writing documents and typing emails on it). Running a project that is worth something to science seems like a good way to at least put the unused capacity of my machine to work. I have run Einstein at 100% for about the entire month, and it doesn't seem to affect the performance of the applications I use.

However, now I am wondering what this does to my CPU's life expectancy. Is it known whether fully loading the CPU all of the time has a negative effect? Would it, for instance, be better to load it only when I am not working on it, so that it gets 'rest' (e.g. cools off) while I work? Are any numbers available on the impact of distributed computing projects on CPU life? Or should the CPU simply handle the load without problems, since it is made for that? I did not overclock the machine or anything like that (of no use to me anyway). Does anyone have experience with dead CPUs, perhaps caused by running such distributed projects?

I have been running DC projects since the late '90s, and the only reason I am not still running the original PCs is that they got to be too slow. A 133 MHz PC just won't cut it these days! Cooling is an issue, especially for laptops, but I use the stock CPU coolers on most of my machines and have no trouble.

Bikeman (Heinz-Bernd Eggenstein)
Bikeman (Heinz-...
Moderator
Joined: 28 Aug 06
Posts: 3522
Credit: 688992710
RAC: 211625

One reassuring thing about

One reassuring thing about CPU life expectancy under full load is that desktop and server CPUs are manufactured using the same process and technologies (server-grade CPUs might have different RAM interfaces or more cache, etc., but the manufacturing process is basically the same). So these chips are also used in servers and even supercomputers nowadays, which are designed to run under full load all the time.

CU
Bikeman
