should I add a 3rd GPU?

merle van osdol
Joined: 1 Mar 05
Posts: 513
Credit: 60,724,446
RAC: 0
Topic 197877

I have an ASRock Z87 Extreme4 motherboard.
An Intel Core i7-4790K with HT NOT used.
Two AMD R9 cards: a 280X and a 270X.

I run 2 WUs on each card.

It has been suggested to me that adding a third GPU to my system would probably not be worth it, because the PCIe bus is likely already near capacity for Einstein work.

Any comments would be appreciated. Thanks in advance.

merle

What is freedom of expression? Without the freedom to offend, it ceases to exist.

— Salman Rushdie

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 575,326,913
RAC: 185,102

should I add a 3rd GPU?

Your mainboard supports 2 cards in 8x / 8x mode, whereas 3 would run as 8x / 4x / 4x. PCIe 3 4x would hinder a high-end GPU at Einstein, although I can't give you precise numbers.
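
If you want a ballpark for what those splits mean in raw numbers, here's a minimal sketch. It only assumes the usual theoretical per-lane figures (roughly 0.5 GB/s for PCIe 2.0 and 0.985 GB/s for PCIe 3.0, per direction) and says nothing about how much of that the Einstein app actually uses:

# Rough per-slot bandwidth for the two slot layouts discussed above.
# Per-lane figures are theoretical maxima; real Einstein@Home throughput
# depends on the application and drivers, so treat this as a ballpark only.
GB_PER_LANE = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985}

def slot_bandwidth(gen, lanes):
    """Theoretical one-way bandwidth of a slot, in GB/s."""
    return GB_PER_LANE[gen] * lanes

for layout in ((8, 8), (8, 4, 4)):
    widths = " / ".join(f"x{n}" for n in layout)
    bw = " / ".join(f"{slot_bandwidth('PCIe 3.0', n):.1f} GB/s" for n in layout)
    print(f"{widths}:  {bw}")

So a PCIe 3 x4 slot still offers on the order of 4 GB/s each way; whether that's enough depends entirely on how much the app talks across the bus.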

If your "urge to upgrade" is not urgent, I'd wait for the next AMD chips and maybe replace the 270X with such one. The next big boy condenamed "Fiji" is rumored to have massive memory bandwidth, which would be good for Einstein.

Any particular reason you're not using HT? It's especially good for Einstein@CPU and for threads simply feeding GPUs.

You could probably also increase your contribution by using the iGPU, although it might reduce the throughput of the regular GPUs a bit.

And you'll probably get higher throughput from your 280X by running more than 2 tasks on it.

Edit: I know you didn't ask about those last points, but combined they may gain you as much throughput as another small GPU.
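
For the "more than 2 tasks per card" point: the usual knobs are, if I remember right, the GPU utilization factor in the project preferences on the website, or an app_config.xml in the Einstein@Home project directory. Here's a minimal sketch that generates such a file for three tasks per GPU; the app name is a placeholder you'd need to verify against client_state.xml, and the cpu_usage value is only a starting point:

# Print an app_config.xml that asks BOINC to run three tasks per GPU.
# Save the output as app_config.xml in the Einstein@Home project directory,
# then have BOINC re-read its config files (or restart the client).
TASKS_PER_GPU = 3
APP_NAME = "einsteinbinary_BRP5"  # placeholder -- check client_state.xml for the real name

print(f"""<app_config>
  <app>
    <name>{APP_NAME}</name>
    <gpu_versions>
      <gpu_usage>{1.0 / TASKS_PER_GPU:.2f}</gpu_usage>
      <cpu_usage>0.5</cpu_usage>
    </gpu_versions>
  </app>
</app_config>""")

A gpu_usage of 0.33 lets three tasks share one card; watch the run times over a few work units to see whether total throughput actually goes up.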

MrS

Scanning for our furry friends since Jan 2002

merle van osdol
Joined: 1 Mar 05
Posts: 513
Credit: 60,724,446
RAC: 0

RE: Your mainboard supports

Quote:

Your mainboard supports 2 cards in 8x / 8x mode, whereas 3 would run as 8x / 4x / 4x. PCIe 3 4x would hinder a high-end GPU at Einstein, although I can't give you precise numbers.

If your "urge to upgrade" is not urgent, I'd wait for the next AMD chips and maybe replace the 270X with such one. The next big boy condenamed "Fiji" is rumored to have massive memory bandwidth, which would be good for Einstein.

Any particular reason you're not using HT? It's especially good for Einstein@CPU and for threads simply feeding GPUs.

You could probably also increase your contribution by using the iGPU, although it might reduce the throughput of the regular GPUs a bit.

And you'll probably get higher throughput from your 280X by running more than 2 tasks on it.

Edit: I know you didn't ask about those last points, but combined they may gain you as much throughput as another small GPU.

MrS

Thanks ETA,
I am going to try a few of the things you have suggested. I don't use the iGPU because I use this computer for a few general tasks and I want a responsive screen. I don't run any CPU tasks, since the CPU (without HT) is busy about 67% of the time feeding the GPUs. So I'll try using HT and running more than 2 tasks per GPU.

The new AMD cards will probably cost a fortune, and I doubt I want to spend that much. My BOINC time is also limited by the weather: I only run BOINC from November through March. I live in the South, and the air-conditioning cost is too much for me.

merle

What is freedom of expression? Without the freedom to offend, it ceases to exist.

— Salman Rushdie

archae86
Joined: 6 Dec 05
Posts: 3,157
Credit: 7,208,824,931
RAC: 946,291

I lack experience on all the

I lack experience on all the major components of your configuration, but will offer this comment to you and others from my own experience and observation:

It is hard to predict the productivity increment arising from adding a particular GPU to a particular system running a particular application.

I recently engaged in a card shuffle, one consequence of which was that I had a GTX 750 available to add to my oldest system--which has a Westmere CPU on a motherboard providing only PCIe 2.0. The existing GPU on that system was a GTX 660.

From many comments posted here, I expected not only that I'd get far less added productivity than a GTX 750 can give when running without other GPUs, but more specifically that it would add less to this system's productivity than a GTX 750 Ti had added to a more modern Sandy Bridge system providing PCIe 3.0 (where the addition was alongside a GTX 660 of the same model). I was particularly influenced by comments here that the Einstein Perseus application is especially dependent on external communication in general, and on PCIe 3.0 vs. earlier in particular.

So imagine my surprise when I got a bigger increment from adding the 750 to the Westmere than from adding the (higher clock rate, more execution units) SC 750 Ti to the Sandy Bridge! Granted, it was a pleasant surprise, but I had not expected this result at all.

While I think it likely that adding the third card will give you a lot less than it would provide in a new system, I'll not guess by how much. Maybe you should only try it if you would be happy using the card in a new-build system, should it prove to add little or nothing as a third card.

merle van osdol
Joined: 1 Mar 05
Posts: 513
Credit: 60,724,446
RAC: 0

RE: I lack experience on

Quote:

I lack experience on all the major components of your configuration, but will offer this comment to you and others from my own experience and observation:

It is hard to predict the productivity increment arising from adding a particular GPU to a particular system running a particular application.

I recently engaged in a card shuffle, one consequence of which was that I had a GTX 750 available to add to my oldest system--which has a Westmere CPU on a motherboard providing only PCIe 2.0. The existing GPU on that system was a GTX 660.

From many comments posted here, I expected not only that I'd get far less added productivity than a GTX 750 can give when running without other GPUs, but more specifically that it would add less to this system's productivity than a GTX 750 Ti had added to a more modern Sandy Bridge system providing PCIe 3.0 (where the addition was alongside a GTX 660 of the same model). I was particularly influenced by comments here that the Einstein Perseus application is especially dependent on external communication in general, and on PCIe 3.0 vs. earlier in particular.

So imagine my surprise when I got a bigger increment from adding the 750 to the Westmere than from adding the (higher clock rate, more execution units) SC 750 Ti to the Sandy Bridge! Granted, it was a pleasant surprise, but I had not expected this result at all.

While I think it likely that adding the third card will give you a lot less than it would provide in a new system, I'll not guess by how much. Maybe you should only try it if you would be happy using the card in a new-build system, should it prove to add little or nothing as a third card.

Thanks archae86,
I'll keep this in mind. I too have a second machine with only PCIe 2.0; it has 2 older GPUs on it. I could add a third card to it, but I would have to use a riser and hang it somewhere. It seems we often need to just use trial and error to get a final answer.

merle

What is freedom of expression? Without the freedom to offend, it ceases to exist.

— Salman Rushdie

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 575,326,913
RAC: 185,102

Archae, it's the first time I

Archae, it's the first time I've read the story of this upgrade of yours in full. It's a bit off-topic, but: Sandy Bridge supports "only" PCIe 2, while Ivy Bridge can do PCIe 3 in the same boards. By adding the 2nd GPU you probably made the board run both GPUs at x8 PCIe 2. Some boards also offer only PCIe 2 x4 in the other or the 3rd slot, even though it's mechanically x16.

I don't know which slots your Westmere mainboard provides, but these "enthusiast" platforms generally offered at least two x16 PCIe 2 slots. They're not connected directly to the CPU but go through the chipset, though at least there's a reasonably fast QPI link between chipset and CPU. So all in all, I think your findings may actually support the importance of the PCIe link for Einstein :)

merle wrote:
The new AMD cards will probably cost a fortune, and I doubt I want to spend that much. My BOINC time is also limited by the weather: I only run BOINC from November through March. I live in the South, and the air-conditioning cost is too much for me.


In this case I think the answer is easy: don't add a 3rd GPU! The cost of running BOINC should never be something to worry about; otherwise you're probably investing too much in this hobby.

MrS

Scanning for our furry friends since Jan 2002

archae86
Joined: 6 Dec 05
Posts: 3,157
Credit: 7,208,824,931
RAC: 946,291

RE: Sandy Bridge supports

Quote:
Sandy Bridge supports "only" PCIe 2, while Ivy Bridge can do PCIe 3 in the same boards. ...
MrS


Thank you MrS. I'll stop telling that story as I have been telling it, pronto.

I did look at the motherboard claims before making my statements, but neglected to consider the effect of the actual CPU plugged in.

Just maybe, I might look into the possibility of dropping an upgraded CPU into that socket someday.

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 575,326,913
RAC: 185,102

You're welcome :) Out of

You're welcome :)

Out of curiosity: how are the actual lanes set up in your case? GPU-Z can read this out easily.
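
If you'd rather script the check than open GPU-Z, something like this should report the same numbers for the NVIDIA cards. It's a small sketch assuming the driver's nvidia-smi tool is on the PATH:

import subprocess

# Ask nvidia-smi for the current PCIe generation and link width of each GPU,
# which is roughly what GPU-Z shows. Read it while the cards are busy;
# at idle the link can drop back to a lower generation.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,pcie.link.gen.current,pcie.link.width.current",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)
print(result.stdout.strip())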

MrS

Scanning for our furry friends since Jan 2002

archae86
Joined: 6 Dec 05
Posts: 3,157
Credit: 7,208,824,931
RAC: 946,291

RE: how are the actual

Quote:
how are the actual lanes set up in your case? GPU-Z can read this out easily.


Again I am in your debt--I looked to see what CPU-Z could tell me, which was not what I wanted to know, but GPU-Z does.

Westmere CPU with ASRock X58 Extreme3 motherboard: both the GTX 660 and the GTX 750 are getting x16 PCIe 2.0.

Sandy Bridge: both the GTX 660 and the GTX 750 Ti are getting x8 PCIe 2.0.

Haswell: the (only) GTX 970 is getting x16 PCIe 3.0.

This speaks to another comparison, which was that the Haswell got just about the same performance out of the base model GTX 750 (run alone) as the Sandy Bridge did out of the GTX 750 Ti run alone.

While x8 vs. x16 was not an issue in this last case (I assume both got x16), the PCIe 3.0 on the Haswell probably had the GTX 750 running nearer to its true potential. So maybe my belief, based on that comparison, that folks here may be wasting money on Ti and overclocked variants of the GTX 750 is overstated. I still suspect they may be getting less advantage on the current Perseus application than they imagine.

ExtraTerrestrial Apes
Joined: 10 Nov 04
Posts: 770
Credit: 575,326,913
RAC: 185,102

RE: so maybe my belief

Quote:
so maybe my belief, based on that comparison, that folks here may be wasting money on Ti and overclocked variants of the GTX 750 is overstated. I still suspect they may be getting less advantage on the current Perseus application than they imagine.


You're right, adding more processing power yields diminishing returns if one is limited or bottlenecked elsewhere. GM107 has relatively little memory bandwidth compared to its powerful execution cores, similar to GM204. I can well imagine that the GTX 750 Ti does not achieve 5/4 of the performance of the GTX 750 (5/4 being the ratio of their shader counts, 640 vs. 512), in the same way that the GTX 980 does not achieve 16/13 of the performance of the GTX 970 (2048 vs. 1664 shaders).

Based on your results we can actually suspect that the performance difference between the GTX 750 and 750 Ti was exactly offset by the performance difference due to x16 PCIe 3 vs. PCIe 2. I'll leave it up to you whether this warrants any further measurements ;)

MrS

Scanning for our furry friends since Jan 2002

merle van osdol
Joined: 1 Mar 05
Posts: 513
Credit: 60,724,446
RAC: 0

RE: In this case I think

Quote:


In this case I think the answer is easy: don't add a 3rd GPU! The cost of running BOINC should never be something to worry about; otherwise you're probably investing too much in this hobby.

MrS

MrS,
My financial situation is none of your business. Thank you very much for the advice though.

merle

What is freedom of expression? Without the freedom to offend, it ceases to exist.

— Salman Rushdie
