Milestones IV

WolfK
Joined: 25 May 08
Posts: 19
Credit: 32757190
RAC: 6121

Third year of Einstein

Message 82401 in response to message 82336

Third year of Einstein crunching completed: 671,122 credits. Onwards!

Rechenkuenstler
Joined: 22 Aug 10
Posts: 138
Credit: 102567115
RAC: 0

I've just passed the 6

I've just passed the 6 million mark with Einstein@Home since joining in August last year.

all4naija
Joined: 19 May 10
Posts: 7
Credit: 19864
RAC: 0

Michael Karlinsky
Joined: 22 Jan 05
Posts: 888
Credit: 23502182
RAC: 0

2.5 million thanks to BRP and

2.5 million thanks to BRP and CUDA!

Michael

robertmiles
Joined: 8 Oct 09
Posts: 127
Credit: 29440881
RAC: 21040

From

Message 82405 in response to message 82358

From DanNeely:

Quote:

Einstein work units are memory intensive. The ABP2 ones aren't that bad at 50 MB each, but the S5GCE ones take about 250 MB each. On an i7 with HT enabled (8 threads × 250 MB) that's a memory footprint as high as 2 GB of RAM, and since Siran's computer only has 3 GB available, that leaves only 1 GB free for everything else. On a heavily used system that's not much.

Siran: at this point, unless you're willing to upgrade to a 64-bit OS and install more memory, I don't think Einstein will be a good choice to run on all your cores at once. With your current system I'd suggest either limiting Einstein to between 2 and 4 cores and running a different, lighter project on the rest, or switching Einstein out for something else entirely.

Is the method for restricting Einstein to only 2 or 4 cores usable for other BOINC projects as well? I have enough memory to run Einstein on all the cores of my computers at once, but not enough to do so for some of the other BOINC projects I participate in.

Mikie Tim T
Joined: 22 Jan 05
Posts: 105
Credit: 263777741
RAC: 0

As of today, Einstein just

As of today, Einstein just crossed 400 TFLOPS. Imagine how quickly it'll ramp up from here when they get an OpenCL app!

Luigi Naruszewicz
Joined: 18 Jul 05
Posts: 8
Credit: 220930158
RAC: 18056

Just got my 1st million

Just got my first million credits on Einstein; 2 million on BOINC overall is coming up shortly.

DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 0

RE: From

Message 82408 in response to message 82405

Quote:

From DanNeely:

Quote:

Einstein work units are memory intensive. The ABP2 ones aren't that bad at 50 MB each, but the S5GCE ones take about 250 MB each. On an i7 with HT enabled (8 threads × 250 MB) that's a memory footprint as high as 2 GB of RAM, and since Siran's computer only has 3 GB available, that leaves only 1 GB free for everything else. On a heavily used system that's not much.

Siran: at this point, unless you're willing to upgrade to a 64-bit OS and install more memory, I don't think Einstein will be a good choice to run on all your cores at once. With your current system I'd suggest either limiting Einstein to between 2 and 4 cores and running a different, lighter project on the rest, or switching Einstein out for something else entirely.

Is the method for restricting Einstein to only 2 or 4 cores usable for other BOINC projects as well? I have enough memory to run Einstein on all the cores of my computers at once, but not enough to do so for some of the other BOINC projects I participate in.

IIRC I was thinking of a hack in one of the XML files that control BOINC, so it should be applicable to any project; but after half a year I'm not sure exactly what I had in mind. One way to do it today is sketched below.
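For anyone looking for the concrete mechanism: one option in later BOINC clients (7.0.40 and newer, so possibly not the hack referred to above) is an app_config.xml file placed in the project's own directory. This is a minimal sketch of a per-project cap; the project_max_concurrent element limits how many of that project's tasks run at once, which in effect limits how many cores it can occupy:

<app_config>
    <!-- run at most 2 tasks of this project at once, leaving the other cores free -->
    <project_max_concurrent>2</project_max_concurrent>
</app_config>

Save it as, for example, projects/einstein.phys.uwm.edu/app_config.xml under the BOINC data directory, then restart BOINC or have the client re-read its config files. Since this is a standard client-side file, the same approach works for any BOINC project.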

Filipe
Joined: 10 Mar 05
Posts: 186
Credit: 407818694
RAC: 358906

First Million

First Million

Svenie25
Joined: 21 Mar 05
Posts: 139
Credit: 2436862
RAC: 0

RE: First Million

Message 82410 in response to message 82409

Quote:
First Million

Unbelievable.

After over 6 years, I've done the same. Now I'll have a look whether I can buy an ice cream with them.

No, really, I'm a little proud of myself.
