end of XP, Maxwell and such

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7212814931
RAC: 974023
Topic 197315

With Microsoft not yet showing signs of blinking on the end of XP support, I have reason to build a new machine to replace my last XP host a few months into 2014. This is a general-use machine, mostly used by my wife for Internet browsing, light MS Office work, and much Solitaire, with a secondary mission of producing Einstein results.

The "computer table" in which the machine lives somewhat limits ventilation, and I am near my preferred household power limit, so I have so far been thinking of using a 2-core late-series Intel CPU (i3-3225, last time I looked), with, if I were building it today, an Nvidia GTX650 card, as to which Gary Roberts has provided good evidence of a reasonable trade of appreciable output with good economy in both purchase cost and power.

As always, the dangling offer of future products gives pause. In this case, I've seen a single marketing slide for the forthcoming Maxwell-series Nvidia parts that implies a spectacular jump in processing power per watt for CUDA applications, while the couple of other similar slides in circulation, which lack the CUDA notation, show a much more modest improvement over the current generation.

Does anyone here know anything about Maxwell behavior for Einstein applications? Or even a good reason to suspect?

For this build I personally won't be interested in their higher-end offerings, so my interest is in the sub-100-watt, sub-$150 end of the range.

If it seems likely that a Maxwell in that range really will be wonderful, possibly I'll just build the machine without a graphics card for use through the summer, then hope to add a suitable Maxwell card in the fall when my household (and room) power budget for computing goes up as cooling requirements drop.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5872
Credit: 117254645279
RAC: 36204539

end of XP, Maxwell and such

Quote:
... so I have so far been thinking of a dual-core late-series Intel CPU (i3-3225, last time I looked) paired with, if I were building it today, an Nvidia GTX650 card, for which Gary Roberts has provided good evidence of a reasonable trade: appreciable output with good economy in both purchase cost and power.


Whilst I'm still very happy with the GTX650 (the ones I have are still producing very well - RACs > 30K), I'm now playing with AMD HD7850s, which seem to be even better. One of my local computer stores is having a special on 2GB MSI cards running at a core clock of 900MHz. The price is what got my attention - $AU149, which translates to around $US135 at current exchange rates.

The processor I chose is the i3-3240, which is an Ivy Bridge HT dual core running at 3.4GHz and costing $AU130. The motherboard is a basic H61 chipset Asrock costing $AU40. I've built this machine in the last couple of days and I'm very happy with the performance.

It was running BRP5 4x on the GPU and 2 FGRP2 tasks on the CPUs. With 4 HT cores, 2 virtual cores are reserved automatically for GPU support. A batch of 4 GPU tasks was taking around 5.25hrs. I've now set the available cores to 75%, and freeing up a further virtual core has dropped the GPU time to a tad under 5hrs. The machine should end up with a RAC of close to 65K.
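
For anyone wanting to duplicate that setup, the 4x multiplicity comes from an app_config.xml in the Einstein project directory, along the lines of the sketch below. I'm quoting the BRP5 app name from memory, so treat it as an assumption and check the exact name your client reports in client_state.xml.

[pre]
<app_config>
  <app>
    <!-- app name as reported in client_state.xml - verify yours -->
    <name>einsteinbinary_BRP5</name>
    <gpu_versions>
      <!-- 0.25 GPUs per task, so 4 tasks share the one GPU -->
      <gpu_usage>0.25</gpu_usage>
      <!-- CPU budget per GPU task: 0.5 x 4 tasks = the 2 virtual cores -->
      <cpu_usage>0.5</cpu_usage>
    </gpu_versions>
  </app>
</app_config>[/pre]

The 75% figure is just the ordinary "use at most X% of the processors" computing preference, and the client picks up app_config.xml changes via "Read config files" or a restart.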

The most pleasing aspect is the power consumption. Years ago, when I was running a whole bunch of Tualatin PIIIs, the machines I bought at auction came with 175 watt Delta PSUs which could do 100W on the 12V rail. When I upgraded from PIIIs to Q8400 quads, I replaced these PSUs with 300W versions (OEM Seasonic) which I acquired for about $13 each.

On my latest GPU builds I've found I can go back and reuse the old Delta units, but use two in the one case. They are the small form factor SFX style and one powers the motherboard whilst the other powers the GPU. At the wall, under full load, the machine draws just over 150 watts. I haven't tried to measure each PSU separately but my gut feeling is that they each provide roughly half. Even after running for days at full load, the PSUs are barely warm to the touch. The mobo PSU is slightly warmer so I guess it's supplying slightly more than half.

Others will have to comment about Maxwell. However, if there are any good run-out specials on HD7850s in your area, it might be worth considering. My machine is running Linux with a full KDE desktop and even under full crunching load, it feels very snappy and responsive. I looked at the GPU stats earlier this morning. The temperature was 59C and the utilization was 96%. It's almost the middle of summer here in tropical Brisbane and most of my machines do labour at this time of year (none are air-conditioned) - but, apparently, not this one. I'm surprised at how cool this machine seems to be running.

Cheers,
Gary.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7212814931
RAC: 974023

RE: The processor I chose

Quote:
The processor I chose is the i3-3240, which is an Ivy Bridge HT dual core running at 3.4GHz


So your processor choice is a close cousin to my current candidate i3-3225, in that both are Ivy Bridge dual-core HT processors of 55W "TDP". The 3225 has HD4000 internal graphics, which I currently understand to be of modest use (far under capable graphics cards, but more than nothing) for one or more Einstein work types, while the 3240 has HD2500 internal graphics, which I understand to be plenty to drive the monitor for non-gaming work, but not currently useful for any Einstein work.
[pre]
i3-3240            i3-3225
Ivy Bridge         Ivy Bridge
dual-core HT       dual-core HT
55W TDP            55W TDP
3.4 GHz            3.3 GHz
HD2500 graphics    HD4000 graphics
$US121             $US140[/pre]
Practical question: do you drive the system monitor from the CPU chip graphics, or from the AMD graphics card? I have a Sandy Bridge i5-2500K host, which also has Intel on-chip graphics (HD3000) not suitable for Einstein, and on that host I drive the monitor from the CPU graphics, hoping for slightly less hesitation in user response, and slightly better Einstein performance, than if I plugged the monitor into the GTX660 I have on that system.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5872
Credit: 117254645279
RAC: 36204539

RE: ... Practical question:

Quote:
... Practical question: do you drive the system monitor from the CPU chip graphics, or from the AMD graphics card?


I'm driving the monitor (widescreen at high res) from the 7850 which is why I was pleasantly surprised about how snappy it felt. When I finish playing, the machine will normally run headless - unless I end up liking it so much that it becomes my daily driver :-).

I've had enough issues getting my preferred Linux distro + AMD GPU to run at 'proper' performance without trying to complicate things further by adding an Intel GPU to the mix. Maybe when I decide to try out Haswell (or whatever comes after that) I might get a bit more adventurous :-). My thinking has been that AMD GPUs (which in themselves have really good performance) suffer from needing a lot more CPU support than nVidia. I've had this sneaky feeling that using the internal GPU would likely cause too large a drop in external GPU performance. I'm basically waiting for someone else to do the experiments :-). Horror stories like that of Arvid don't tend to make me want to jump in the deep end :-).

Cheers,
Gary.

tbret
Joined: 12 Mar 05
Posts: 2115
Credit: 4861254633
RAC: 36453

The problem with waiting for

The problem with waiting for Maxwell is that we don't know when Maxwell will ship, and Nvidia usually leads with the high-end models. You might not get a reasonably priced Maxwell for many months after the top-of-the-line cards start to ship.

The problem with paying more and buying Maxwell early is that we still don't have applications that take full advantage of Kepler. How long after Maxwell is in the field will it be before we've got a fully optimized application?

It may be years before Maxwell owners get to experience applications that will deliver on the promise of "more work using less electricity."

I'm still running some old GTX 460s and they aren't being embarrassed by the 660Tis I bought because the "new architecture" held promise. Running four-version-old CUDA applications isn't delivering on that promise.

If I were buying today, in that price range, I also think I'd have to conclude that an HD 7850 looks like a good deal. I compared cards in that price range (and even above) and decided to buy a couple. Then I realized I needed more video cards like my bathroom needs a wildebeest and decided against buying anything.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5872
Credit: 117254645279
RAC: 36204539

RE: .... Then I realized I

Quote:
.... Then I realized I needed more video cards like my bathroom needs a wildebeest and decided against buying anything.


Gotta luv that mental image of a stampeding wildebeest in your bathroom ... :-).

Perhaps you could do a 'dry run' at Pamplona first and then you should be able to handle several ... :-).

Cheers,
Gary.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7212814931
RAC: 974023

tbret wrote: How long after

tbret wrote:
How long after Maxwell is in the field will it be before we've got a fully optimized application?


"Fully optimized" probably never. But your post made me ponder the meaning of the publicity graph showing far better improvement in CUDA flops/watt than their generic graphs. Most likely it means that they hope that new feature content in the Maxwell, if fully exploited, would give additional benefit far above the basic improvement coming from modestly improved design and later semiconductor process.

As a former long-term Intel employee, I am well aware that field experience often gives a far lower average improvement from functionality tweaks than the early published claims (and I am also aware of cases in which both marketing and engineering management published more optimistic views while suppressing more pessimistic views--including my own).

So, subject to revision, here is the plan: I start serious component selection in the new year, buy parts for a system with no add-in graphics card by March, get it fully commissioned and in service by XP drop day, and review the moderate-price, moderate-power graphics landscape about September 1. By then the actual performance of the early higher-end Maxwell products should be public, including actual Einstein experience, and visibility into the lower product entries should be much better than it is now. Or it will be clear that the whole line has suffered a major delay.

Meanwhile I like Gary's CPU choice better than mine if it turns out that the HD4000 graphics are not useful to me for Einstein, so I'll need to watch the evidence on that subject much more carefully. For the extra $20, and the loss of 0.1 GHz, the HD4000 graphics don't have to be wonderful to pay their way, but they do need to beat useless and return decent incremental Einstein output for the extra watts. My guess is that I'll go Gary's way.

I'd prefer to opt out of the wildebeest in the bathroom, even if not stampeding.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5872
Credit: 117254645279
RAC: 36204539

RE: Meanwhile I like Gary's

Quote:
Meanwhile I like Gary's CPU choice better than mine if it turns out that the HD4000 graphics are not useful to me for Einstein, so I'll need to watch the evidence on that subject much more carefully.


I've not really been looking at Haswell previously because of higher prices for apparently not much gain in performance. In particular, mobos were quite a bit more expensive the last time I looked. Admittedly, that was a while ago.

So, I spent a bit of time last night having a browse and found that prices have come down a bit from what I remember. I could get a basic H81 board and an i3-4130 (3.4GHz, HD4400 graphics) for about $20 more than the IB version previously mentioned. To go to the i3-4340 (3.6GHz, HD4600 graphics) would be a further $40 on top. From what I remember of posted comments about HD4x00 graphics, you need at least a free CPU core to support a single GPU task, and while it's possible to run two tasks in parallel, you don't gain from doing so. However, if you don't have an external GPU, the Intel GPU tasks run well enough to add something like 7-10K to your RAC - so quite worthwhile.
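
If I do end up crunching on the HD4x00, reserving that full support core can be done through the same app_config.xml mechanism sketched earlier in the thread. Again the app name is my assumption (BRP4 is what runs on intel_gpu, as far as I know - verify against client_state.xml):

[pre]
<app_config>
  <app>
    <!-- assumed name of the app with an intel_gpu version - check yours -->
    <name>einsteinbinary_BRP4</name>
    <gpu_versions>
      <!-- one task at a time on the Intel GPU -->
      <gpu_usage>1.0</gpu_usage>
      <!-- reserve a full virtual core to feed it -->
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>[/pre]

One caveat: gpu_versions applies to every GPU flavour of an app, so if the same app also had AMD or nVidia versions running on the host, they'd get the same per-task budgets.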

I'm going to build at least one more 7850 GPU system so I might spend the extra $20 just to see how much (or how little) extra performance can be squeezed out of the 7850 when running under Haswell.

Cheers,
Gary.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7212814931
RAC: 974023

RE: I'm going to build at

Quote:
I'm going to build at least one more 7850 GPU system so I might spend the extra $20 just to see how much (or how little) extra performance can be squeezed out of the 7850 when running under Haswell.


You may find a power consumption benefit as well. It took a long time, but the "power matters" message struck home at Intel quite a while ago and, after long propagation delays, is clearly arriving in customer hands in the form of both process and design attention.

My impression is that posted TDP is not very tightly correlated with actual power consumption of a given SKU.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7212814931
RAC: 974023

Gary Roberts wrote: I've not

Gary Roberts wrote:
I've not really been looking at Haswell previously because of higher prices for apparently not much gain in performance.


I missed even seeing the Haswell options in my previous looking. At this minute, on an "if I were ordering right now, without further study or change in the market" basis, I'm inclined toward the Haswell i3-4130 on an Asrock Z87 Extreme3.

As I'm a bit of a power nut, and I view the long-term Einstein performance as mostly coming from add-on graphics, I actually liked that a review I found rated that Asrock board lowest in power, though slightly slow in CPU performance compared to some others.

I'm out of the overclocking game, so the Haswell criticism that design and packaging choices make it less overclockable doesn't bother me.

tbret
Joined: 12 Mar 05
Posts: 2115
Credit: 4861254633
RAC: 36453

RE: I'd prefer to opt out

Quote:

I'd prefer to opt out of the wildebeest in the bathroom, even if not stampeding.

I'm afraid to go in there. I listen at the door and there's no sound.

I think he may be up to something gnu.
