Hi Mike! I believe you
Hi Mike!
I believe you said in another thread that the Einstein@home apps were purely linear processes that would not benefit from the parallel processing capabilities of the GPU, and yet they have developed one. What has changed that suddenly makes the effort worth doing?
Tomas
RE: Hi Mike! I believe you
G'day Tomas ... Did I? :-)
I think I said that to get maximum value from GPUs you'd have to be massively parallel, as manufacturers (NVidia at least) only quote significant advantages for thousands of threads. Plus there are other constraints related to addressing and such. But the problem, at least in part, has to lend itself to parallelism. I'm not a developer, so perhaps Bernd or Oliver could clue us in on the trick that turns 'n burns. My guess is fast Fourier transforms and compiler technology ...
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
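To put rough numbers on the point that the problem has to lend itself to parallelism, here's a back-of-the-envelope sketch using Amdahl's law. The fractions below are made-up illustrations, not measurements of the Einstein@home app: even with thousands of GPU threads, whatever serial fraction remains caps the overall speedup.

```python
# Illustration of why a mostly-serial workload gains little from a GPU,
# using Amdahl's law. The parallel fractions are hypothetical numbers,
# not measurements of any Einstein@home application.

def amdahl_speedup(parallel_fraction, workers):
    """Overall speedup when only `parallel_fraction` of the runtime
    can be spread across `workers` threads; the rest stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

# Even with 10,000 GPU threads, a half-serial workload barely doubles:
print(round(amdahl_speedup(0.50, 10_000), 2))   # ~2.0

# A 95%-parallel workload fares far better, but still tops out near 20x:
print(round(amdahl_speedup(0.95, 10_000), 2))   # ~19.96
```

That would fit the FFT guess: a search built around many independent Fourier transforms has a large parallel fraction, which is exactly the kind of workload where thousands of GPU threads pay off.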
Thank you Mike! I take that
Thank you Mike!
I take that to mean the Einstein@home GPU apps will be close to 1:1 compared to the CPU applications.
RE: Thank you Mike! I take
Well, at the moment it's using 1 CPU + 1 GPU to run. You have to remember that this is a first beta, so expect some improvement.
My first test run, on a Core 2 Quad with a GTS 250 card, took 5 hrs 45 mins using the CUDA app compared to 8 hours for the normal app.
I have a second test running on a Core i7 with dual GTX 260 cards and will know later how it goes.
BOINC blog
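For reference, the speedup implied by the timings quoted above works out like this (a quick sanity check using only the numbers from that post):

```python
# Speedup implied by the Core 2 Quad / GTS 250 test quoted above:
# 8 hours for the normal CPU app vs 5 hrs 45 mins for the CUDA beta.

cpu_minutes = 8 * 60          # 480 min for the normal app
cuda_minutes = 5 * 60 + 45    # 345 min for the CUDA app

speedup = cpu_minutes / cuda_minutes
print(round(speedup, 2))      # ~1.39x faster overall
```

Modest for a first beta, consistent with the note above to expect improvement; and bear in mind the CUDA run also ties up a full CPU core while it works.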