Binary Radio Pulsar Search (Perseus Arm Survey) "BRP5"

Beyond
Joined: 28 Feb 05
Posts: 121
Credit: 2332346212
RAC: 5231499

RE: I'd have thought the

Quote:
I'd have thought the time shown in the "time sent" column was the actual time sent for a particular task, and that the time shown as "time created" was the time created.


I think you're correct. All these years and I've never noticed that at the top of the page. Thanks for pointing it out.

Jeroen
Joined: 25 Nov 05
Posts: 379
Credit: 740030628
RAC: 0

I have been monitoring my

I have been monitoring my SSDs via smartctl for over two weeks now to determine how much data is being written over each 24-hour period. I set up a cron job to run at midnight every night and write the info to a log file. Back when my system was running all BRP4 tasks, the data written ranged from 24.1 to 45.2 GB per day. Since the same host finished its BRP4 tasks and is now running only BRP5 tasks, the data written has dropped to 3.7 - 4.3 GB per day, which is a very significant reduction.
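
A minimal sketch of such a logging job (my own illustration, not the poster's actual script): it assumes a drive at /dev/sda whose SMART attribute 241 (Total_LBAs_Written) counts 512-byte units; attribute names and units vary by SSD vendor, so check your drive's smartctl -A output first.

```shell
#!/bin/sh
# Log an SSD's lifetime-writes counter once per day via smartctl.
# Assumptions: device at /dev/sda, SMART attribute named
# Total_LBAs_Written, raw value in 512-byte units (vendor-dependent).

LOG=/var/log/ssd_writes.log
DEV=/dev/sda

# Convert a raw Total_LBAs_Written count (512-byte units) to whole GB.
lbas_to_gb() {
    echo $(( $1 * 512 / 1000000000 ))
}

# Read the lifetime-writes counter and append a dated line to the log.
log_writes() {
    lbas=$(smartctl -A "$DEV" | awk '$2 == "Total_LBAs_Written" {print $10}')
    echo "$(date -u +%F) $(lbas_to_gb "$lbas") GB written (lifetime)" >> "$LOG"
}

# Only act when invoked with "run", e.g. from a midnight cron entry:
#   0 0 * * * /usr/local/bin/ssd_writes.sh run
if [ "${1:-}" = "run" ]; then
    log_writes
fi
```

Since the logged counter is a lifetime total, subtracting consecutive days' entries gives the per-24-hour figures quoted above.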

In addition, I have been looking at MRTG graphs, and the amount of data being downloaded is now so small compared to before that I can barely see the bandwidth spikes on the graphs anymore. I used to limit BOINC downloads to a daily time range when bandwidth usage was not counted towards the monthly limit. Now I let the hosts download at any time.

Beyond
Joined: 28 Feb 05
Posts: 121
Credit: 2332346212
RAC: 5231499

RE: RE: I'd have thought

Quote:
Quote:
I'd have thought the time shown in the "time sent" column was the actual time sent for a particular task, and that the time shown as "time created" was the time created.

I think you're correct. All these years and I've never noticed that at the top of the page. Thanks for pointing it out.


Now I'm not so sure. Look at these resends that were validated this morning:

http://einsteinathome.org/task/383515626
http://einsteinathome.org/task/383498381
http://einsteinathome.org/task/383417678

All are listed as being created yesterday yet they're 4000 credit WUs. So the original creation date must have been much earlier, and the creation date listed doesn't seem to be so useful.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219384931
RAC: 975426

Beyond wrote:Now I'm not so

Beyond wrote:

Now I'm not so sure. Look at these resends that were validated this morning:

http://einsteinathome.org/task/383515626

All are listed as being created yesterday yet they're 4000 credit WUs. So the original creation date must have been much earlier, and the creation date listed doesn't seem to be so useful.

We are mixing two different creation dates here. The creation date which distresses you is listed on the task page and appears to be the task creation date, which for a resend is quite properly a long time after the work unit creation date. However, if you click on the work unit link conveniently shown on the task page, you will find your first task's parent work unit shown as having been created 28 May 2013 9:50:24 UTC, which is perfectly consistent with 4000 credits.

I was not previously clear that I was referring to the work unit creation date as shown on the work unit page--sorry about that. I had not noticed that a "creation date" also appeared on task pages with a different meaning, though that meaning seems quite reasonable now.

Beyond
Joined: 28 Feb 05
Posts: 121
Credit: 2332346212
RAC: 5231499

RE: I had not noticed that

Quote:
I had not noticed that the word creation date appeared on task pages, and had a different meaning


Seems confusing. Thanks for the explanation. Never paid attention to creation dates until this question came up above.

Maximilian Mieth
Joined: 4 Oct 12
Posts: 130
Credit: 10265084
RAC: 2814

I observed something strange

I observed something strange that might be one possible explanation for the differing BRP4/BRP5 running-time ratios that people have observed.

I am running 2 GPU tasks at once. Two BRP4 tasks take about 13,500 seconds each to complete on my GPU (NVIDIA 260M). Two BRP5 tasks take about 100,000 seconds each.

However, when I run one BRP4 and one BRP5 task at a time, BRP4 tasks seem to be computed much faster and last only about 9,500 seconds. In turn, the respective BRP5 tasks need much more time. I don't know exactly how much more, since I can't run BRP4 and BRP5 tasks simultaneously all the time. I observed a run time of up to 119,000 seconds for a BRP5 that first ran in parallel with a BRP4 and then, after that BRP4 finished, ran in parallel with another BRP5.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219384931
RAC: 975426

Maximilian Mieth

Maximilian Mieth wrote:
However, when I run one BRP4 and one BRP5 task at a time, BRP4 tasks seem to be computed much faster and last only about 9,500 seconds. In turn, the respective BRP5 tasks need much more time. I don't know exactly how much more, since I can't run BRP4 and BRP5 tasks simultaneously all the time. I observed a run time of up to 119,000 seconds for a BRP5 that first ran in parallel with a BRP4 and then, after that BRP4 finished, ran in parallel with another BRP5.


I've seen (and reported) somewhat similar behavior (speedup of BRP4 and slowdown of BRP5 when run simultaneously) on both of my GTX660 hosts, both of which are currently running three GPU tasks at a time, under Windows 7.

You report running two GPU tasks at a time on that GeForce 610M Windows 7 host. However, some others have reported not seeing such effects, so some difference in the GPU used, the host system, or the running parameters must be responsible. I surmise that something as simple as more frequent task switching in the BRP5 application (i.e. the moments when it needs host system service), as compared to the BRP4 application, may upset the "fair split" of actual active working time on the GPU.

Have you noticed the CPU support application using somewhat more CPU resource when you are running 2x BRP5 than when running 2x BRP4? I believe I have.

Horacio
Joined: 3 Oct 11
Posts: 205
Credit: 80557243
RAC: 0

RE: Maximilian Mieth

Quote:
Maximilian Mieth wrote:
However, when I run one BRP4 and one BRP5 task at a time, BRP4 tasks seem to be computed much faster and last only about 9,500 seconds. In turn, the respective BRP5 tasks need much more time. I don't know exactly how much more, since I can't run BRP4 and BRP5 tasks simultaneously all the time. I observed a run time of up to 119,000 seconds for a BRP5 that first ran in parallel with a BRP4 and then, after that BRP4 finished, ran in parallel with another BRP5.

I've seen (and reported) somewhat similar behavior (speedup of BRP4 and slowdown of BRP5 when run simultaneously) on both of my GTX660 hosts, both of which are currently running three GPU tasks at a time, under Windows 7.

You report running two GPU tasks at a time on that GeForce 610M Windows 7 host. However, some others have reported not seeing such effects, so some difference in the GPU used, the host system, or the running parameters must be responsible. I surmise that something as simple as more frequent task switching in the BRP5 application (i.e. the moments when it needs host system service), as compared to the BRP4 application, may upset the "fair split" of actual active working time on the GPU.

Have you noticed the CPU support application using somewhat more CPU resource when you are running 2x BRP5 than when running 2x BRP4? I believe I have.


On my GTX 560 Tis I've also noticed the speedup of BRP4 when paired with BRP5... but I have no conclusive data to say that the BRP5 took more time.
In the past I've seen interesting speedups when a BRP4 and a SETI task were running together (compared against the times of 2 BRPs or 2 SETI tasks)...
Not sure why, but I guess it has something to do with which hardware inside the GPU needs to be flushed and reloaded on every context switch made to run the apps concurrently... and this effect could differ greatly between models/brands of GPUs...

MAGIC Quantum Mechanic
Joined: 18 Jan 05
Posts: 1886
Credit: 1403721324
RAC: 1053006

RE: I observed something

Quote:

I observed something strange that might be one possible explanation for the differing BRP4/BRP5 running-time ratios that people have observed.

I am running 2 GPU tasks at once. Two BRP4 tasks take about 13,500 seconds each to complete on my GPU (NVIDIA 260M). Two BRP5 tasks take about 100,000 seconds each.

However, when I run one BRP4 and one BRP5 task at a time, BRP4 tasks seem to be computed much faster and last only about 9,500 seconds. In turn, the respective BRP5 tasks need much more time. I don't know exactly how much more, since I can't run BRP4 and BRP5 tasks simultaneously all the time. I observed a run time of up to 119,000 seconds for a BRP5 that first ran in parallel with a BRP4 and then, after that BRP4 finished, ran in parallel with another BRP5.

Yes Max, that is what you can expect on your laptop running the NVIDIA GeForce 610M with a quad-core CPU, since you also have Gamma Ray tasks running at the same time.

I have that same GPU running in my 8-core laptop; it never gets that extra free core, and I never saw it as worth doing with mine.

I still have BRP4s to do on mine, but I know I will be getting BRP5s after that, which will take over 100,000 seconds just like yours.

(mine runs 8 Gamma Rays, 2 BRPs on the 610M, and T4T tasks at the same time)

Maybe Bernd will make these tasks on our 610Ms get 7000 credits

Maximilian Mieth
Maximilian Mieth
Joined: 4 Oct 12
Posts: 130
Credit: 10265084
RAC: 2814

I had a typo in my post. As

I had a typo in my post. As Magic said, I have a 610M GPU.

Quote:
Maybe Bernd will make these tasks on our 610M's get 7000 credits

Actually I don't care so much about credits ;) For me 4000 credits seem just right for a BRP5. However, I won't complain about 5000...
