Looking to get started with GPU processing

floyd
Joined: 12 Sep 11
Posts: 133
Credit: 186610495
RAC: 0

RE: RE: Your first major

Quote:
Quote:

Your first major decision point is what GPUs, after the screwdrivers. This will determine mobo, PSU, and even case size.

OK, looking at the Linux users among the top hosts, I see a lot of 7870/7950/7970/R9 280X cards, and that's comforting.

The 280X's seem like an OK price point for new: [url]http://www.newegg.com/Product/Product.aspx?Item=N82E16814150678[/url]

So, if there are no driver issues with that particular model, it seems good to me.

Any other issues I'm missing? If not, then what mobo/PSU/case would we be thinking of for two 280X's?

I'm running a similar system, so I can share some experiences. Actually it's a dual R9 280, but that's listed in the 7870/7950/7970/R9 280X class.

First, an R9 280 is long, and a 280X is even longer - the one you mentioned seems to be 295mm. That certainly won't fit in every case. You may need to be able to remove the hard disk mounts; even if that's not necessary, the GPUs will probably interfere with the HDD, and you may be forced to move it out of the air stream. That's the main reason why I went for the 280 instead of the 280X.

Later I was glad I made that choice because of the other main issue: heat. HEAT. Two 280s add up to a TDP of 400W; for two 280X's it's 500W, plus all the other little thingies in the case. You can't imagine how hot that gets. My case is an older model with one 120mm intake fan and one 120mm exhaust fan. I added a slot fan because nothing better would fit, and cooling is still insufficient. At least the PSU is at the top, pulling air from the case; many modern cases have the PSU at the bottom, pulling air from outside, and I'd certainly be lost with one of those. In short, you'll want as much cooling as you can get. Which immediately leads to two other issues: noise and heat in the room. There's probably not much you can do about those - just learn to live with them.
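
To put some rough numbers on that, here's a quick back-of-the-envelope sketch. The TDP values are approximate vendor figures and the "rest" figure is a guess, so treat the result as ballpark only:

[code]
# Rough heat-load estimate for a dual-GPU cruncher (Python).
# TDP values are approximate vendor figures; "rest" is a guess.
gpu_tdp = 250   # one R9 280X, watts
cpu_tdp = 95    # e.g. an FX-6300, watts
rest    = 50    # motherboard, drives, fans, PSU losses (rough)

total_draw = 2 * gpu_tdp + cpu_tdp + rest
print(f"Total draw: ~{total_draw} W")  # ~645 W

# Nearly all of that ends up as heat in the room.
# 1 W = 3.412 BTU/h, handy for comparing against heater/aircon ratings.
print(f"Heat output: ~{total_draw * 3.412:.0f} BTU/h")  # ~2201 BTU/h
[/code]

That's small-space-heater territory, which is why cooling dominates everything else here.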

I don't think there's much to mind with respect to the motherboard, except to make sure it can run two GPUs at full speed. Most boards with multiple PCIe slots share PCIe lanes between slots and can only run a single GPU at full speed, so you'll have to look in the higher price range.

PSU: you'll need one with four PCIe power connectors. Powerful PSUs usually have those, but there may be exceptions. GPUs often come with adaptors, but I'd prefer not to use those.

AMD GPUs need good CPU support. I'm running my dual 280s with an FX-6300, and all six cores are needed for best performance. With Intel you may need less, but I don't have any experience with that.

tbret
Joined: 12 Mar 05
Posts: 2115
Credit: 4812854158
RAC: 98380

RE: So, if there's no

Quote:

So, if there are no driver issues with that particular model, it seems good to me.

Any other issues I'm missing? If not, then what mobo/PSU/case would we be thinking of for two 280X's?

I'm not dissing an R9 280X. I wish I had several.

Look at this photo, though:

GTX 970 Blower-Fan

I don't know anything about that card. It's probably a good one, but I am *only* asking you to look at the "cooling" solution.

In general, these "blower-style" coolers are preferable in a multi-GPU system. Here's the paradoxical thing about that: blower-style fans usually aren't really able to keep things quite as cool as a multi-fan solution. If that's true, then why would they be preferable?

Well... it comes down to where they deposit the heat. A blower-style cooler blows air through the card, pushing *most* of the heat out of the back of the machine instead of dumping it into the case.

Now look at this: 970 w/bigger exhaust

All things being equal (and I'm sure they aren't), you would expect this card to be able to blow as much heat at a lower velocity out of this wider slot.

Okay, so why not just recommend the card with the widest slot and the biggest fan? Because all things are not equal. Notice that the first card has some sort of fancy fresh-air intake ducts while the second does not. I have no idea if that works, but it would seem to make sense.

Then, there is this take on the theme:

10% More Cooling!

ASUS is trying to solve the slot width problem with a little more height and a fancy-shmancy fan arrangement.

Why do you care?

Because SLI or Crossfire cables (which you do not need for crunching) are short, motherboard manufacturers put their fast slots next to one another. You may have seven PCIe slots, but if you want to run two cards in the fastest slots on the board, they have to be close to each other.

This next comment is only to be taken in general and is meant only as a thing for you to take into consideration.

Some of these cards that dump heat into the case with lots and lots of fans are too tall to be mounted right next to each other: the fans end up almost touching the backplate of the card immediately below. Obviously they can't pull much air that way. ...and if you put a case fan blowing against the side of the card (toward the motherboard), you can interfere with the hot air that is trying to come out of the side of the GPU and make things worse!

That leaves you trying to blow cool air "up the rump" of the card to provide the fans with some cool air (so it isn't recirculating heat from the card below it). That means your case needs fans INSIDE the case, accelerating air right up the backside of the cards.

With a blower-style, you might need a little of that as well.

A common, common, common misunderstanding of this situation is that ANY fan in any configuration will pull enough air to create a vacuum-cleaner-like suction and move air in from outside the case. GPU fans just don't have a Hoover-like isolated intake; they will not pull air except from right around them.

So why not just say something like, "Be sure to get a blower-style fan"?

Personally, I would say that and be done with it, to be on the safe side, but I know that isn't the only way to view the world. The chief problem with blower-style cards is that they do tend to be a little more irritatingly noisy. If you have to crank the fan up, they can sound like a hair-dryer running in the room. I have some that do and some that don't, and I really can't tell you what the difference is, except that the older, hotter cards require more RPM because they draw more power and have to do something with the extra heat.

Just to be clear: The FAN isn't that much louder. The velocity of the AIR moving through the exhaust is LOUD and hot.

So now you have to get into this whole "RAC/watt" thing, because even if your electricity were free, it is those watts that cause the heat.

You can't make that your main criterion, because doing so means you'll buy a wimpy GPU. But when comparing two cards with similar RAC, you will probably want to choose the one that consumes less electricity, simply to save yourself the heat.
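
Here's a toy illustration of that comparison; the card names and every number in it are invented purely to show the arithmetic:

[code]
# Compare two hypothetical cards by credit per watt (Python).
# All RAC and wattage figures are invented for illustration.
cards = {
    "Card A": {"rac": 100_000, "watts": 250},
    "Card B": {"rac":  90_000, "watts": 180},
}

for name, c in cards.items():
    print(f"{name}: {c['rac'] / c['watts']:.0f} RAC/W")
# Card A: 400 RAC/W
# Card B: 500 RAC/W -> similar output, but 70 W less heat in the room.
[/code]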

There is one more solution:

Hybrid Cooling

That lets you move the heat from the card and deposit it wherever you care to mount the fan and radiator. This is CLEARLY the best way to go short of a super-duper fully liquid cooled hyper-custom "oh yeah, mamma" $$$$$ behemoth.

In general, are you getting the impression that heat dissipation is a big deal? It is.

Oh, you asked if a 1kW PSU was "enough." Pick your cards and we can tell you.
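
In the meantime, here's the rough sizing habit I'd apply myself; it's a rule of thumb, not gospel, and the draw figure is just a ballpark guess for a dual-280X class system:

[code]
# Rule-of-thumb PSU sizing for a 24/7 cruncher (Python).
# Habit, not gospel: keep sustained draw at or below ~65%
# of the PSU's rated capacity.
sustained_draw = 650   # watts, rough guess for a dual-280X system
target_load = 0.65     # fraction of the PSU rating to load it to

print(f"Suggested PSU rating: ~{sustained_draw / target_load:.0f} W")  # ~1000 W
[/code]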

tbret
Joined: 12 Mar 05
Posts: 2115
Credit: 4812854158
RAC: 98380

RE: AMD GPUs need good

Quote:

AMD GPUs need good CPU support. I'm running my dual 280s with an FX-6300, and all six cores are needed for best performance. With Intel you may need less, but I don't have any experience with that.

My non-scientific observation, based on feeding R9 270X cards in four different machines, is that...

I can run two Parkes GPU work units and two CPU work units at the same time without slowing down my GPU work on my old i5 machines. I can't do *any* CPU work on an old Athlon II four-core without interfering, and the Athlon 64 dual-core is the same.

I *can* run simultaneous GPU/CPU work (just one or two CPU work units) on old Phenom II six-core CPUs while running four simultaneous GPU work units (two cards, two work units per card).

I cannot run any CPU work units on an eight core AMD FX CPU which is supporting 4 GPUs running two work units each.

On my FX-6300 six-"core" I can run one CPU work unit and two GPU work units. I might, might be able to push that to two CPU work units, but I haven't tried. If I had two cards in that machine, I would do as you are doing and not run any.

I know this isn't scientifically correct for every possible situation, but my rule of thumb is to leave one core free per GPU work unit on everything except the AMD FX CPUs. They seem to need me to leave them mostly free.
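
For anyone wondering how to actually apply a rule like that, BOINC reads an app_config.xml file in the project's directory. Here's a minimal sketch; the app name below is only an example (check your own client logs for the real names on your machine):

[code]
<!-- app_config.xml in the Einstein@Home project directory.
     App name is an example only; check your client for real names.
     gpu_usage 0.5 -> run two tasks per GPU
     cpu_usage 1.0 -> reserve one full CPU core per GPU task -->
<app_config>
  <app>
    <name>einsteinbinary_BRP6</name>
    <gpu_versions>
      <gpu_usage>0.5</gpu_usage>
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>
[/code]

Restart the client (or use "Read config files") after editing it.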

The Intel CPUs seem to do better against the current AMD FX offerings, whose funky "core" configuration shares an FPU between each pair of cores.

Now, having said all that, we have to take into consideration that the programming is different with the NVIDIA work (in CUDA) and the AMD GPU work (in OpenCL).

Would the AMD CPU be able to do better supporting the AMD GPUs? It doesn't look that way.

Would my two old i5s be better off supporting NVIDIA cards? I really don't know.

But I can conclude that the i5 is MUCH better at supporting R9 270X cards than an old Athlon II *and* can crunch some CPU work at the same time and *still* be superior.

An i5 also costs a lot more than an Athlon II.

I just wouldn't recommend an FX-41xx CPU to anyone who might want to support two GPUs running multiple work units simultaneously. The AMD FX CPUs are so cheap right now that it wouldn't take much more money to go to eight cores instead of four or six.

With my extremely limited experience with Intel CPUs, they seem to be far, far better than the current crop of FX CPUs in absolute terms. In financial terms, the AMDs seem to be the better "deal." The boards are cheaper, the RAM is cheaper, and the CPU is cheaper; you would expect to be giving up something, and I think you do.

...and now I leave the thread for others' comments and experiences. I have just about exhausted everything I know from experience. Almost anything else I say would be partially driven by fan-boy-type speculation and I don't want to get involved with that.

John Reed
Joined: 23 Oct 10
Posts: 25
Credit: 11079168
RAC: 0

Is this a good deal?

Zalster
Joined: 26 Nov 13
Posts: 3117
Credit: 4050672230
RAC: 0

RE: Is this a good

I'm only going to comment on this one since I have the same case. It seems a little light on RAM at 8GB; the minimum I would expect is 16, but that's easy enough to fix - just add another 8.

He doesn't mention the PSU at all. I would want to know the manufacturer and the rating, for example "EVGA 750W 80 Plus Gold".

That will tell you if you will need to upgrade the PSU in the future.

Let's see what the others say about this one and the other two.

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0

RE: With my extremely

Quote:

With my extremely limited experience with Intel CPUs, they seem to be far, far better than the current crop of FX CPUs in absolute terms. In financial terms, the AMDs seem to be the better "deal." The boards are cheaper, the RAM is cheaper, and the CPU is cheaper; you would expect to be giving up something, and I think you do.

Do you have any feel for their longevity - how long have they been running without a problem?

Quote:

...and now I leave the thread for others' comments and experiences. I have just about exhausted everything I know from experience. Almost anything else I say would be partially driven by fan-boy-type speculation and I don't want to get involved with that.

We could speculate about whether fans should run clockwise or anticlockwise...

tbret
Joined: 12 Mar 05
Posts: 2115
Credit: 4812854158
RAC: 98380

RE: Do have any feel for

Quote:

Do you have any feel for their longevity - how long have they been running without a problem?

I think you are referring to the AMD CPUs, I hope.

In my entire entirety I have had two CPUs die, and one of them had a glass of tea poured on top of it through the top case fan. That would have been my youngest, who put the tea exactly where I told her not to because, as I told her, accidents happen.

The motherboard is still living and crunching today, but the FX-8120 bit the dust. I thought that was pretty amazing, too.

The other CPU that died fit in a Socket 7 motherboard. I don't remember if it was a Cyrix or an AMD K6-2+.

I'm getting to that place in life (whether from busyness or age) where years run by so fast and melt together that I really cannot tell you how old any of my current crop of CPUs are, but some are quite old. (The Phenom II 1090T was a brand-new thing when I got mine; they were released in the spring of 2010. I had to look it up.) It has been running Einstein and SETI 24/7 for most of its life.

I'm sure some of the Athlon IIs are older. That Athlon 64 is from... wow...uh... 2008-ish. It hasn't crunched or supported crunching all of its life, but it has run 24/7, often doing nothing.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5842
Credit: 109395580043
RAC: 35791725

RE: Is this a good

Quote:
Is this a good deal?
....


You take a number of risks trying to buy complete systems:
- You're paying for stuff you don't need.
- The things you do need may not have the performance you desire.
- You risk critical items not being in proper condition for the high-stress life they are about to have.
- You don't know what previous (ab)use the machine has suffered.

You are best to confine purchases of used stuff to individual items that exactly match what you need, where there is a good reason why the owner wants to sell. GPUs are in this category because there is always a good supply of gamers who fairly quickly decide that what they have is no longer the 'best' and are just itching to upgrade to the latest and greatest.

Earlier, I asked you where you were going to house whatever machine(s) you end up acquiring. I think this should be your first decision. If it's going to be 'on display' and part of your everyday family life, you are going to want it to look nice and tidy and perhaps be a topic of conversation when friends/relatives visit. You also won't want it belching a lot of hot air with lots of fan noise. On the other hand, if you have a separate room away from normal living areas where you can close the door and shut out the noise and unsightliness, you have a completely different ball game and the ability to improve performance at a lower total cost. So you need to make that decision.

A number of responders have commented on the importance of heat dissipation. The problem can be eased considerably if you don't have to have everything in a fully closed-in case. If you decide that you could operate with the case permanently open, heat problems are much easier to solve.

Your total draw from the wall socket ends up as heat in the room, and you will need to decide how you are going to deal with that heat. It could serve as household heating in winter, but what are you going to do in summer? Aircon is an expensive way to deal with it. If you can house your stuff in a room with natural ventilation, you may have a much smaller problem. I know this from personal experience.

It will be very easy for you to get carried away with the excitement of possibly having a 'top 20' cruncher. Everything may turn out just as you hope, or you may make some expensive mistakes along the way. If you are very keen to end up with that sort of output, I would advise you to do so in two stages: initially, a modest but fully contained first-stage build where you keep costs under control, so that if things go wrong you won't have lost so much. I would run and fine-tune that first stage for a couple of months and experiment with it to give yourself the thing you most lack - experience.

While you are playing with that build (and hopefully enjoying the experience) you will be much better able to know what you really need to do for your top 20 build, if that's the way you ultimately decide to go. Your first stage build can be quite complete in itself and will not be wasted. It will just be supplementing your total output and may even be just as good or even better in the credit output per total cost of ownership stakes.

I've picked out some hosts from the top computers list for you to think about. Some of these might be suitable for a 'first stage' build. The first one however (currently ranked #6) might be suitable for the second stage. Please note that when looking at the top hosts list, there are several things to keep in mind. You may not know the other projects the host may be contributing to. You won't know if the host is crunching 24/7 and has reached its plateau credit-wise. You may not know if the host is crunching the optimum number of concurrent tasks for best performance. You may not know the exact mix of GPUs because only the most powerful type is listed if there are two or more different models from the same manufacturer.

With these points in mind here are some hosts worthy of inspection. I've actually tried to steer clear of hosts belonging to people contributing to this thread because (if you ask them) I'm sure they'll be willing to give you an honest evaluation of their machines. I've also kept away from hosts with high numbers of GPUs. These are way too complicated for newcomers. I think dual GPUs would be perfect for you.

1. #6 - ID=7181095 with dual 980Tis. I believe Jeroen posted about this machine some time ago. Current RAC of 440K.
2. #21 - ID=10192182, listed with dual Tahiti series, probably 7970s, with RAC of 275K. Jeroen used to have a system with dual 7970s and from memory I think he was getting about 320K or more at the peak.
3. #29 - ID=11709032, listed with dual 780Tis and RAC=242K. If there are gamers upgrading to 980Tis, you might get a bargain second-hand.
4. #61 - ID 3641777 with a single Tahiti series - probably 7970. Current RAC is 175K and the CPU is a lowly Pentium dual core. Compare this one with No. 2 above. A machine with a single GPU usually gets more than half the score of a machine with two such GPUs, as in this case, all other things being equal (which they are not). Just goes to show that you can cherry pick samples to show whatever you want :-).

Experience tells us that people often start out with much enthusiasm. It takes quite a bit of dedication to build and maintain a high performance machine. People's circumstances change, so unless you are absolutely sure you're in this for the long haul, I'd suggest not going over the top with your first machine. Many people find they have better things to do with their time once the novelty wears off.

So, is this machine going to be 'on show' or can you hide it away in the basement somewhere? :-).

Cheers,
Gary.

mikey
Joined: 22 Jan 05
Posts: 11889
Credit: 1828125638
RAC: 206202

RE: We could speculate

Quote:

We could speculate about whether fans should run clockwise or anticlockwise...

I don't care which way they run; I just set all mine to exhaust out of the case, except those in the front of the case or on the side panel above the CPU, which I set to blow into the case. Most of my cases don't have a front case fan or a side-panel fan, so mine are all set to pull air out of the case.

John Reed
Joined: 23 Oct 10
Posts: 25
Credit: 11079168
RAC: 0

Excellent advice as usual,

Excellent advice as usual, thanks Gary.

I'm in this for the long haul. I used to build my own computers long ago and I'm getting a taste for it again. I stopped because my gamer days ended circa 2000 and complete systems became too cheap to bother with custom builds for what I needed. I am thoroughly enjoying every bit of this planning process and I'm not in a hurry.

It would be a mistake for me to try to get into the top hosts right now. There are too many factors, as you pointed out, around machine placement, heat, noise, etc., and without something in hand to give me context, it'll be difficult to get right the first time. I'm trying to balance that against some great earlier advice not to get stuck in perpetual incremental builds and end up with multiple mediocre machines. That's a good tip too. Since I am in this for the long haul, I wouldn't actually mind ending up with two machines. I work in IT, have a strong interest in these things, these pet projects have always served me well, and I can always find something else to do with a machine if I have to. I like the idea of an intermediate build.

Yeah, I am uncomfortable buying not only a whole used system, but even a used GPU. That, along with not feeling the need to outperform everyone else, has me pondering a modest, but all-new, build.

Being inexperienced right now, it's harder to plan a build when obtaining used GPUs is uncertain by the nature of the supply (waiting for availability, losing auctions without getting into bidding wars, etc.). A single, but new, GPU for around $200 seems acceptable to me (so long as it's not so new as to be unsupported on Linux). It gives me comfort through low risk, and the certainty allows me to plan the rest of the components. A single GPU would also greatly simplify all of the other factors and definitely make for a pleasant build experience. Perhaps if I avoid the temptation to buy components with the intent of expanding that same machine in the future, I can benefit from the experience while avoiding the incremental-build mistake. Build a nice, shiny new, but moderate machine with right-sized components for that GPU, and leave it alone.

I posted those used machines for research. Your reply matches my gut feeling. I get such valuable feedback with every question I ask, though, so I put it out there anyway.

Quote:
Earlier, I asked you where you were going to house whatever machine(s) you end up acquiring. I think this should be your first decision. If it's going to be 'on display' and part of your everyday family life, you are going to want it to look nice and tidy and perhaps be a topic of conversation when friends/relatives visit. You also won't want it belching a lot of hot air with lots of fan noise. On the other hand, if you have a separate room away from normal living areas where you can close the door and shut out the noise and unsightliness, you have a completely different ball game and the ability to improve performance at a lower total cost. So you need to make that decision.

Both, with some limitations. My family is actually very tolerant of white noise, and I'm sure they'd enjoy a space heater. But I know how loud fans can get, and I'm sure there is a limit to that tolerance. I also have a garage where I can run caseless. In summer, however, it can hit 90 degrees in there if I recall correctly, so I'm guessing I'll be crunching nights only then.

Quote:

I've picked out some hosts from the top computers list for you to think about.


Thanks for the build samples. Again, I don't need to be that high up from the start. Even churning out 175K RAC with a lowly dual-core would make a fine experience right now. I'm thinking I'd bump up the CPU of that build and scale down the GPU to get a new one.

Any further insights still appreciated!

EDIT: The GTX 960 and some flavors of the R9 are in the $200 price range. Any feedback on those?
