Is there a GPU version of the app in the works?

MarkJ
Joined: 28 Feb 08
Posts: 437
Credit: 139,002,861
RAC: 1
Topic 194007

The guys over at Orbit were having a fair bit of discussion about GPU computing and BOINC. I was wondering: is a GPU version of the science app in the works?

I vaguely remember some comments from the recent BOINC conference suggesting that E@H might be looking at one.

Bikeman (Heinz-Bernd Eggenstein)
Moderator
Joined: 28 Aug 06
Posts: 3,522
Credit: 707,687,643
RAC: 712,162

Is there a GPU version of the app in the works?

Hi!

The video of Bruce Allen's talk and now also the slides (which were kind of difficult to read in the video) are available for download: Boinc 2008 workshop proceedings.

The slides contain a hint on GPU app plans.

CU
Bikeman

MarkJ
Joined: 28 Feb 08
Posts: 437
Credit: 139,002,861
RAC: 1

RE: Hi! The video of Bruce

Message 87101 in response to message 87100

Quote:

Hi!

The video of Bruce Allen's talk and now also the slides (which were kind of difficult to read in the video) are available for download: Boinc 2008 workshop proceedings.

The slides contain a hint on GPU app plans.

CU
Bikeman

Yes, that's the one. Any idea how far along things are?

Bernd Machenschalk
Moderator
Administrator
Joined: 15 Oct 04
Posts: 4,308
Credit: 249,992,904
RAC: 34,592

RE: RE: Hi! The video of

Message 87102 in response to message 87101

Quote:
Quote:

Hi!

The video of Bruce Allen's talk and now also the slides (which were kind of difficult to read in the video) are available for download: Boinc 2008 workshop proceedings.

The slides contain a hint on GPU app plans.

CU
Bikeman

Yes, that's the one. Any idea how far along things are?


We have some code that doesn't compile with NVidia's current SDK, and the last time it did compile it didn't give correct results. It might take a week of work to get this going at all, and even then the speedup is not that impressive. With some help from NVidia we have also developed a new, more promising approach that optimizes memory access, but that hasn't gone into actual code yet. At the moment nobody can find the time to push this further, because a lot of other things have higher priority. It's definitely a pending item on Oliver's and my to-do list, but I don't know when we'll find the time to get back to it. Hopefully later this year.
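
For readers wondering what "optimizing memory access" means on a GPU, here is a minimal, purely illustrative CUDA sketch (not the Einstein@Home search code; the kernel names are made up for this example): when consecutive threads read consecutive array elements, the hardware can coalesce the accesses into a few wide memory transactions, while strided access defeats that and wastes bandwidth.

// Toy example only: both kernels scale an array by a constant factor.
__global__ void scale_coalesced(const float *in, float *out, int n, float f)
{
    // Thread i touches element i: neighbouring threads access neighbouring
    // addresses, so the reads/writes coalesce into few memory transactions.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = f * in[i];
}

__global__ void scale_strided(const float *in, float *out, int n, float f, int stride)
{
    // Same arithmetic, but neighbouring threads access addresses that are
    // 'stride' elements apart; these accesses cannot be coalesced and the
    // effective memory bandwidth drops sharply on this generation of cards.
    int i = (blockIdx.x * blockDim.x + threadIdx.x) * stride;
    if (i < n)
        out[i] = f * in[i];
}

A real search code is of course far more involved; the point is only that rearranging data so threads can read it contiguously is often worth more than tuning the arithmetic itself.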

BM

MarkJ
Joined: 28 Feb 08
Posts: 437
Credit: 139,002,861
RAC: 1

RE: RE: RE: Hi! The

Message 87103 in response to message 87102

Quote:
Quote:
Quote:

Hi!

The video of Bruce Allen's talk and now also the slides (which were kind of difficult to read in the video) are available for download: Boinc 2008 workshop proceedings.

The slides contain a hint on GPU app plans.

CU
Bikeman

Yes, that's the one. Any idea how far along things are?


We have some code that doesn't compile with NVidia's current SDK, and the last time it did compile it didn't give correct results. It might take a week of work to get this going at all, and even then the speedup is not that impressive. With some help from NVidia we have also developed a new, more promising approach that optimizes memory access, but that hasn't gone into actual code yet. At the moment nobody can find the time to push this further, because a lot of other things have higher priority. It's definitely a pending item on Oliver's and my to-do list, but I don't know when we'll find the time to get back to it. Hopefully later this year.

BM

Thanks for the update, Bernd.

Annika
Joined: 8 Aug 06
Posts: 720
Credit: 494,410
RAC: 0

Just curious: Would this kind

Just curious: Would this kind of thing work on a Geforce 9?

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,586
Credit: 310,937,881
RAC: 69,097

I reckon it's just bonza that

I reckon it's just bonza that we're getting some input from NVidia on this. :-)

[aside]
And I like this: "We will provide all the screensaver code nicely packaged so that users can modify it or write their own. Based on new SDL graphics library (no more GLUT)."
[/aside]

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

jedirock
Joined: 11 Jun 06
Posts: 23
Credit: 1,517,411
RAC: 0

RE: Just curious: Would

Message 87106 in response to message 87104

Quote:
Just curious: Would this kind of thing work on a Geforce 9?


List of supported CUDA devices.

Annika
Joined: 8 Aug 06
Posts: 720
Credit: 494,410
RAC: 0

RE: RE: Just curious:

Message 87107 in response to message 87106

Quote:
Quote:
Just curious: Would this kind of thing work on a Geforce 9?

List of supported CUDA devices.

Thanks a lot :-) Looks like my card is on there... Is this Windows only, or will there be a Linux version?

jedirock
Joined: 11 Jun 06
Posts: 23
Credit: 1,517,411
RAC: 0

RE: RE: RE: Just

Message 87108 in response to message 87107

Quote:
Quote:
Quote:
Just curious: Would this kind of thing work on a Geforce 9?

List of supported CUDA devices.

Thanks a lot :-) Looks like my card is on there... Is this Windows only, or will there be a Linux version?


Looks like the CUDA SDK supports Windows, Mac, and Linux, so there's no reason a Linux app couldn't be made, provided you're running the NVidia drivers.
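
As a quick way to check that a usable card and driver are actually present, the CUDA runtime can be queried from a few lines of host code. This is just a generic sketch compiled with nvcc, not part of any Einstein@Home app:

#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int count = 0;
    cudaError_t err = cudaGetDeviceCount(&count);
    if (err != cudaSuccess) {
        // Typically means the NVidia driver or CUDA runtime is missing.
        std::printf("CUDA runtime/driver problem: %s\n", cudaGetErrorString(err));
        return 1;
    }
    if (count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, compute capability %d.%d, %lu MB global memory\n",
                    i, prop.name, prop.major, prop.minor,
                    (unsigned long)(prop.totalGlobalMem / (1024 * 1024)));
    }
    return 0;
}

If that reports your GeForce 9 card (compute capability 1.1), CUDA applications should in principle be able to run on it, whatever the operating system.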

Bernd Machenschalk
Moderator
Administrator
Joined: 15 Oct 04
Posts: 4,308
Credit: 249,992,904
RAC: 34,592

RE: RE: RE: Just

Message 87109 in response to message 87107

Quote:
Quote:
Quote:
Just curious: Would this kind of thing work on a Geforce 9?

List of supported CUDA devices.

Thanks a lot :-) Looks like my card is on there... Is this Windows only, or will there be a Linux version?


Actually the first version to come out will probably be for Linux. That's the platform the code has been developed on, both on our side and on NVidia's. The Mac version should also be rather easy to derive from this; how to build an app for Windows out of that code is still a mystery to me, so that will probably come out last.
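
Part of the reason Linux is the natural starting point is that building CUDA code there is trivial once the toolkit and driver are installed; something along these lines (file name invented for the example) is all it takes, whereas the Windows tool chain is a different story:

# hypothetical source file, just to show the shape of a Linux CUDA build
nvcc -o app_gpu app_gpu.cu
./app_gpu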

BM
