the waiting room

Mikkie
Mikkie
Joined: 2 Apr 07
Posts: 25
Credit: 242,066
RAC: 0
Topic 194354

Old school crunching is history. I'm in the waiting room until E@H starts CUDA crunching. Saw an example with ATI on the Milkyway@home project.

Look - see and shiver

"souls ain't born, souls don't die"

mikey
mikey
Joined: 22 Jan 05
Posts: 7,760
Credit: 623,127,097
RAC: 145,803

the waiting room

Quote:

Old school crunching is history. I'm in the waiting room until E@H starts CUDA crunching. Saw an example with ATI on the Milkyway@home project.

Look - see and shiver

I would guess that is part of what is making the projects shiver too: people being able to SCREAM thru the units faster than the project can make them, like Milky Way is experiencing now! Then of course all those results have to be used in some way, making for a TON of work on the project side. Hopefully it can be automated, but I have no idea.

Alinator
Alinator
Joined: 8 May 05
Posts: 927
Credit: 9,352,143
RAC: 0

Well, I don't think the

Well, I don't think the projects themselves are too concerned with the ability of the hosts to burn through the work like smelly stuff through a goose. They can produce and handle what they can produce and handle effectively, and that's all there is to it.

Strictly speaking, the hosts couldn't care less one way or the other. The 'problem' is with the pilots sitting behind the consoles. The solution is what it has always been: if you want to be reasonably assured your hosts will always stay 'hot', join more projects.

Also, as far as CPU crunching being dead, don't make me laugh. GPU crunching is still a small fraction of the DC crunching community, which as a whole is an almost infinitesimal percentage of all idle computing resources potentially available. I don't see that changing any time in the immediate future.

In addition, the GPGPU paradigm is still so new that many fundamental aspects of it have yet to be fully resolved. One big show stopper is how you efficiently integrate GPUs into the grand PC scheme of things and have them do that compute-intensive crunching without adversely affecting the host's user experience for its 'normal' tasks, without interfering with the main function of the primary adapter (or all of them if running SLI or Crossfire), which is to put pretty pictures on your screen as fast as possible, and without requiring the user to jump through hoops, constantly babysit the box, or play with settings. The simple truth here is that the overwhelming number of users pony up the big bucks (relatively) for these cards precisely because they have a graphics application in mind that they want all that horsepower for, and they have little tolerance for something which is supposed to be neither seen nor heard interfering with that.

Alinator

Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,221
Credit: 137,332,827
RAC: 24,309

Basically these GPUs are here

Basically these GPUs are here on account of the market for them. Which is fast graphics. They optimise the work flow by going massively parallel, hence the speed. Most of the output is related to the conceptual problem of rendering some virtual 3D space onto a 2D plane in a way that suits human eyes and brains for the purpose at hand. Frustums et al.

To hook into this blinding speed a project needs to have, or be able to massage matters into, a (sub)set of data upon which operations can be usefully parallelised by GPUs. Note that, currently, one GPU thread is not equivalent to one CPU thread, as there are other restrictions. The speed benefit of going parallel is only that vast if you use thousands of threads. So that comes back to the primary data a project has to work upon. There's probably no shortage of suitable datasets, but E@H and other projects use DC as part of a single total pipeline.

Also a given user, as per BOINC preferences and on a per computer basis, will be splicing that into their own personal workflow according to other projects they wish to support.

Quote:
And they may have even other, non-BOINC, work for that hardware! Yes, I know .... weird .... but I've heard some people actually do that :-) :-)


Hence you have to split a stream of some sequential data, like the LIGO IFO records, into many substreams already, and then re-form it coherently. This is the DC work allocation at present, and it involves a lot of housekeeping, which we discuss here regularly. That may also, and increasingly does these days, involve multiple-CPU/core hosts.

With GPUs even further splitting is to occur, now within the host and using a rather different programming paradigm from that of CPUs, with the data reformulated for return to the project in an assured & validated way. My observation of E@H would be that control of work quality is first rank, with work speed at lower priority.
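The split / process / re-form pattern described above can be illustrated with a minimal sketch. This is plain Python standing in for the real pipeline (no actual GPU or LIGO code): a sequential record is cut into independent chunks, each chunk is processed on its own, as a GPU thread or remote host would do, and the results are re-formed in order and validated against a straight sequential computation. The function names and the per-chunk squaring are invented for illustration only.

```python
def split_stream(record, chunk_size):
    """Cut a sequential record into independent substreams."""
    return [record[i:i + chunk_size] for i in range(0, len(record), chunk_size)]

def process_chunk(chunk):
    """Stand-in for the per-chunk science computation (here: squaring)."""
    return [x * x for x in chunk]

def reform(chunks):
    """Re-form the processed substreams coherently, preserving order."""
    return [x for chunk in chunks for x in chunk]

record = list(range(10))

# "Parallel" path: split, process each chunk independently, re-form.
parallel_result = reform([process_chunk(c) for c in split_stream(record, 4)])

# Reference path: process the whole record sequentially.
sequential_result = process_chunk(record)

# Validation: the split/re-formed result must match the direct computation.
assert parallel_result == sequential_result
```

The validation step at the end is the point: a project only trusts the distributed answer if re-forming the substreams reproduces what a single coherent computation would have given.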

Cheers, Mike.

( edit ) My goodness I've just realised, because my son keeps asking for it, that one ( + $$$$ ) can have two graphics cards in a system AND with some fancy private link between the two! Oh my, time for a lie down ....

I have made this letter longer than usual because I lack the time to make it shorter. Blaise Pascal

Paul D. Buck
Paul D. Buck
Joined: 17 Jan 05
Posts: 754
Credit: 5,385,205
RAC: 0

On the other hand, there are

On the other hand, there are those of us that custom build PCs to feed our BOINC addiction. Like the two i7 boxes I just constructed...

I even bought two new super cases in the anticipation that in 6 months or so I will likely junk the MB in the boxes, promote the CPU to the new MB and buy a couple more GPU cards.

tullio
tullio
Joined: 22 Jan 05
Posts: 2,081
Credit: 46,859,328
RAC: 11,816

For me graphic boards are

For me graphic boards are objects of art. I admire them as I admire a Ferrari. I don't have either and I am crunching on my Opteron CPU. I just completed a CPDN task in 1961 hours. I am running 6 BOINC projects and only one of them (SETI) uses a graphic board, so I don't have any excuse to buy one.
Tullio

Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,221
Credit: 137,332,827
RAC: 24,309

RE: On the other hand,

Message 92912 in response to message 92910

Quote:

On the other hand, there are those of us that custom build PCs to feed our BOINC addiction. Like the two i7 boxes I just constructed...

I even bought two new super cases in the anticipation that in 6 months or so I will likely junk the MB in the boxes, promote the CPU to the new MB and buy a couple more GPU cards.


Of course, Paul. Same here for my wee farm, 'tis an entertaining diversion and a terrific way to learn IT skills. But we are a minority, outliers of the pack. :-)

Tullio, I do understand the art. My son has a very nicely modded case: rather dark-side & Alien-esque, savage-looking grills/meshes for air intake, fluoro colors, amazing keyboard thru-the-keys lighting, an art deco window looking onto the face of the MB, sculpted heat sinks/piping in that lovely copper sheen, LEDs that shine into the rotating fan blades giving a cool curvy 'laser' effect, cable wrap and clips in a 'webbed' style ..... he keeps the case well polished with "Armour All", an automotive interior surface polish. It's a mean performer to boot, contributing ~ half my total RAC!! I think this is the 21st-century equivalent of 1950's hot-rodding/roadsters/dragsters .. :-)

Cheers, Mike

I have made this letter longer than usual because I lack the time to make it shorter. Blaise Pascal

Jord
Joined: 26 Jan 05
Posts: 2,952
Credit: 5,779,100
RAC: 0

RE: For me graphic boards

Message 92913 in response to message 92911

Quote:
For me graphic boards are objects of art. I admire them as I admire a Ferrari. I don't have either and I am crunching on my Opteron CPU. I just completed a CPDN task in 1961 hours. I am running 6 BOINC projects and only one of them (SETI) uses a graphic board, so I don't have any excuse to buy one.
Tullio


There are a couple of things about this post.

1. If you do not have a graphics board, then how do you see anything on screen, how did you make that post?
2. 6 BOINC projects, 1 graphics board; you make it sound as if you need 6 graphics boards if all 6 projects were doing something with it.

So then, explanations: it isn't just any "graphics board". It isn't a Matrox Parhelia, an ATI HD4850, a VIA S3 or something old; it needs to be able to do these special calculations on the GPU. It is an Nvidia videocard with CUDA capabilities.

If your motherboard supports that many, you can of course run 6 CUDA cards next to each other, but they won't be used independently for each of the 6 projects that can then utilize them. Same thing as with how your CPUs are used.

Anyway, I would guard against using the term graphics board so loosely. Yes, the nVidia cards with CUDA capability are graphics cards. It's just that all the other graphics cards out there at this moment, including older nVidia cards, aren't by any means capable of doing CUDA. Or CAL. Or OpenCL for that matter.

Edit: The above isn't true and isn't what I meant. I meant the older graphics cards for all brands, not the newer ones. Anyway.. (I was nitpicking)

tullio
tullio
Joined: 22 Jan 05
Posts: 2,081
Credit: 46,859,328
RAC: 11,816

RE: There are a couple of

Message 92914 in response to message 92913

Quote:


There are a couple of things about this post.

1. If you do not have a graphics board, then how do you see anything on screen, how did you make that post?
2. 6 BOINC projects, 1 graphics board; you make it sound as if you need 6 graphics boards if all 6 projects were doing something with it.

So then, explanations: it isn't just any "graphics board". It isn't a Matrox Parhelia, an ATI HD4850, a VIA S3 or something old; it needs to be able to do these special calculations on the GPU. It is an Nvidia videocard with CUDA capabilities.

If your motherboard supports that many, you can of course run 6 CUDA cards next to each other, but they won't be used independently for each of the 6 projects that can then utilize them. Same thing as with how your CPUs are used.

Anyway, I would guard against using the term graphics board so loosely. Yes, the nVidia cards with CUDA capability are graphics cards. It's just that all the other graphics cards out there at this moment, including older nVidia cards, aren't by any means capable of doing CUDA. Or CAL. Or OpenCL for that matter.


My SUN WS has a Radeon graphic chip ES1000 515E. I am able to see graphics on my Linux box in Einstein@home (but not in ABP1) and in QMC@home. The Astropulse app has no graphics in SETI, and the optimized MultiBeam in SETI also has no graphics. In CPDN I should upgrade my BOINC client, which is 5.10.45, to see its graphics. LHC@home has no graphics and Aqua@home has no graphics. So I would gain very little by installing a graphic board. I am no gamer. I think that the use of CUDA-capable graphic boards in SETI is making many users without CUDA-capable boards abandon SETI, and I would not like to see the same thing happen in Einstein@home. Most of CUDA is marketing-inspired by NVidia.
Tullio

Jord
Joined: 26 Jan 05
Posts: 2,952
Credit: 5,779,100
RAC: 0

RE: My SUN WS has a Radeon

Message 92916 in response to message 92914

Quote:
My SUN WS has a Radeon graphic chip ES1000 515E.


As I said in my edit of my last post, I was nitpicking. :-)

A graphics chip is a graphics card embedded on the motherboard. What's drawn on your monitor are graphics. 2D perhaps, but graphics nonetheless.

Those 3D graphics, whether they are OpenGL or DirectX 3D, are just features on the graphics card. There are still manufacturers that don't have OpenGL capabilities on their embedded chips. A while ago there were still manufacturers that only had 2D capability.

Perhaps it's better not to speak of graphics boards, but of 3D accelerator boards (if you go for D3D and OpenGL) and of co-processors (if you talk about CUDA, CAL, OpenCL). It would just be confusing to say that since you don't watch "graphics" you don't need a "graphics board". You will always need one, as otherwise it's quite difficult to even power up a computer these days.

(My uncle is completely blind, but in order for him to use his computer with braille board, he still has to buy a good videocard and monitor... go figure. ;-))

tullio
tullio
Joined: 22 Jan 05
Posts: 2,081
Credit: 46,859,328
RAC: 11,816

I said I admire those graphic

I said I admire those graphic cards, no matter what name you give them. My latest window-shopping object is the Radeon HD4770 with its 40 nm technology and its moderate price. Maybe someday I shall buy one, but I am waiting for OpenCL. Cheers.
Tullio
Edit: Arecibo Binary Pulsar Search has graphics. I can see them.
