That's not how the validator works. Your result either is within the expected value range and has a wingman result that confirms it, or it isn't valid.
If they decide to rerun some tasks, these will get new task numbers; nothing is rerun under the original task name. Once a task is validated, it gets moved into the science database and purged from the working database.
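For what it's worth, here is a minimal sketch of that rule in Python. It is not the actual Einstein@Home validator code; the range, tolerance and function names are made up purely to illustrate the idea of "in range AND confirmed by the wingman".

```python
# A minimal sketch, not the real validator: a result is only valid if its value
# lies within the expected range AND a wingman's result confirms it within a
# tolerance. The range, tolerance and names here are made-up examples.

EXPECTED_RANGE = (0.0, 1000.0)   # hypothetical sanity bounds for a result value
TOLERANCE = 1e-4                 # hypothetical relative agreement tolerance

def in_expected_range(value: float) -> bool:
    low, high = EXPECTED_RANGE
    return low <= value <= high

def wingman_confirms(mine: float, wingman: float) -> bool:
    # relative comparison, so large and small values are treated consistently
    return abs(mine - wingman) <= TOLERANCE * max(abs(mine), abs(wingman), 1.0)

def is_valid(mine: float, wingman: float) -> bool:
    return (in_expected_range(mine)
            and in_expected_range(wingman)
            and wingman_confirms(mine, wingman))

print(is_valid(123.456, 123.455))   # True: in range and the wingman confirms it
print(is_valid(123.456, 200.000))   # False: the wingman does not confirm it
```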
Thank you for explaining.
I've noticed the uptick in invalid BRP7 results with AMD GPUs as well.
At a rate of 53,018 work units a day, BRP7 will be out of work in 23 days.
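As a rough check of that arithmetic: the daily rate is the number quoted above, while the remaining-work figure below is only an assumption back-derived from the 23 days, not something taken from the server status page.

```python
# Back-of-envelope estimate; everything except the daily rate is an assumption.
daily_rate = 53_018          # BRP7 work units per day (figure from the post above)
remaining_wu = 1_220_000     # assumed remaining BRP7 work units (approximate)

days_left = remaining_wu / daily_rate
print(f"about {days_left:.0f} days of BRP7 work left")  # roughly 23
```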
1. Sure, the validation depends on the data. A clear signal that sticks out is easy to identify and thus to validate. On data that contains only noise, possibly all at the same level, minor fluctuations in the computation (such as the order of instruction execution) can lead to larger differences in the results; see the sketch after this list. With the other searches I've seen so far, these differences in validation average out over a shorter time, resulting in a relatively stable validation rate. MeerKAT data seems to be a bit different there, though.
2. There's plenty of MeerKAT data; we'll add more soon (as we used to do for FGRP).
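To illustrate the point in 1., here is a small Python sketch (nothing project-specific): summing the very same floating-point numbers in a different order already gives slightly different totals, and on noise-only data such tiny differences can change which near-identical candidates two hosts end up reporting.

```python
# Minimal illustration of order-dependent floating-point arithmetic.
import random

random.seed(1)
values = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # noise-like data

ascending = sum(sorted(values))                 # one summation order
descending = sum(sorted(values, reverse=True))  # same numbers, other order

print(ascending, descending, ascending - descending)  # totals typically differ slightly
```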
BM
FGRP5 (cpu) has 40 days of work left according to the server status page. Will this be ending soon? Thanks.
This would indeed be nice to know.
New work for FGRP5 has been added.
BM
Thanks for the update, looks like it's enough for about a year.
Due to an I/O limitation on the server that handles the BRP7 results (einstein3) (see https://einsteinathome.org/goto/comment/220405), I moved that search to a different server. You don't need to do anything and shouldn't notice much, except that the progress counter for BRP7 will get out of sync: for a few weeks it will only count the results coming in on the old server.
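Schematically, the drift looks like this (all numbers below are purely illustrative, not real counts): results that still come in on the old server are counted, results arriving on the new server are not, so the displayed total falls behind until the counter is switched over.

```python
# Illustrative sketch only; the counts are made up.
counted_on_old_server = 1_000_000   # hypothetical BRP7 results still reported via einstein3
arriving_on_new_server = 50_000     # hypothetical results reported via the new server

displayed_progress = counted_on_old_server                        # what the counter shows
actual_progress = counted_on_old_server + arriving_on_new_server  # what was really returned

print(f"displayed: {displayed_progress:,}  actual: {actual_progress:,}")
```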
BM