Times (Elapsed / CPU) for BRP5/6/6-Beta on various CPU/GPU combos - DISCUSSION Thread

Mass
Joined: 14 Dec 11
Posts: 1
Credit: 46862494
RAC: 80934

I hope this isn't off-topic.

http://einsteinathome.org/host/4622518/tasks

I have completed 17 v1.5 work units atm.
3 have come back as validate errors,
2 as validation inconclusive,
and I'm still waiting for the rest to be compared.

Is this something I need to check on my end with regards to hardware, or should I just keep crunching some more? The GPU has some overclock on it but that has never been an issue before.

Here are my setup details and some result run times.

CPU: AMD
GPU: Nvidia GTX 580 (core 810 MHz, mem 4100 MHz)
GPU utilization seems to hold steady at 79-80%
Memory controller load: 54%
1 device, 1 work unit at a time

17 work units so far:
BRP6-Beta 1.5 - run time avg 4,180.6 s, CPU time avg 709.82 s

Stef
Joined: 8 Mar 05
Posts: 206
Credit: 110568193
RAC: 0

Here is my comparison with V1.39

mountkidd
Joined: 14 Jun 12
Posts: 175
Credit: 10890481001
RAC: 5517182

Hi Gary,

Quote:
Depending on what tools you have, your skill level with those tools and how much time you have available, there are bound to be many different ways to deal with the problem. I'm an absolute dummy when it comes to statistical presentation and how to get the nice graphs that show the problem at a glance - a picture really is worth 1000's of words - so I'll just lay out exactly what I've done to get my own results. I suggest that if you, the general reader - not Peter - think you're a dummy like me, this is a fairly painless way to contribute your information in a useful way.


There is a much less painful way to deal with the data collection & statistics... I will outline the method that I use in a Windows environment.

Microsoft Excel 2010 and newer (2007 with an MS add-in) has a 'Data from Web' facility that allows one to import web data into a spreadsheet:

1. Navigate to the E@H Tasks/Valid page of interest and copy the 'http://data address...' from the address bar.
2. Click on a spreadsheet cell (e.g., A10), then click on the Data tab and select 'From Web'.
3. Paste the data address into the address bar of the New Web Query dialog and click Go. The data will be displayed in the dialog along with several small arrows with a yellow background.
4. Click on the arrow nearest the data of interest, then click Import. An Import Data dialog will appear asking you to verify the cell address (where you do want to put the data).
5. Click OK and poof! The data gets loaded into your spreadsheet.

At this point you can select/edit/filter/import-more/do-whatever with the data and build graphs etc. Zero typing of data values, and 20 results are loaded at a time!

For those w/o Windows (Gary...) there is still hope! I checked a Kubuntu host w/Libre Office Calc and a similar data import facility exists there as well, although the steps to use it are slightly different. There may even be something similar for PClos...
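For anyone who would rather script this than click through a GUI, here is a minimal sketch of the same idea in Python with pandas. It is untested against the live site, and the table index and column label below are assumptions, so treat it as a starting point rather than a recipe:

[pre]
import pandas as pd

# Host page cited earlier in the thread -- point this at your own host's
# Tasks page. read_html() needs lxml (or bs4/html5lib) installed.
url = "http://einsteinathome.org/host/4622518/tasks"

# read_html() returns one DataFrame per HTML table on the page; assuming
# the first table is the task list (inspect the result to confirm).
tasks = pd.read_html(url)[0]

# "Run time (sec)" is an assumed column label -- check tasks.columns for
# the real one. Strip thousands separators and coerce the placeholders for
# in-progress tasks to NaN, then drop them.
elapsed = pd.to_numeric(
    tasks["Run time (sec)"].astype(str).str.replace(",", ""),
    errors="coerce",
).dropna()

print(elapsed.describe())  # count, mean, std, min, max and quartiles in one go
[/pre]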

Gord

archae86
Joined: 6 Dec 05
Posts: 3145
Credit: 7022934931
RAC: 1835474

Quote:
Is this something I need to check on my end with regards to hardware, or should I just keep crunching some more? The GPU has some overclock on it but that has never been an issue before.


That much validation trouble on just a few units says something is wrong.

The simplest hypothesis that fits the available facts is that the new application is a bit more demanding on your GPU than the previous work you had no issues with, and that your degree of overclock is too much for this work in your current setup. You might do yourself a favor, and generate useful information for the rest of us, by adjusting the overclock. Instead of inching down, I actually suggest that you go down substantially--like to zero overclock--and, if that is clearly better on the first twenty units (you should see zero problems--consider accepting one at most), that you then start back up, perhaps going half the distance back to your old operating point at each step. If you find a new, slower point that works about as well as before, please report here what change was required.
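(To make that concrete with made-up numbers: if you had been running the core 100 MHz over stock, drop straight back to stock; if the first twenty units then validate cleanly, try +50 MHz, then +75 MHz, and so on, stopping at the last setting that stays clean.)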

Good luck, and thanks for reporting this.

I, myself, have seen one validation error on what is now well over a hundred beta units, where before I would have expected zero. As it happens, the GPU in question is not nominally overclocked, but that does not mean it couldn't now find itself near the edge--so if I see appreciably more trouble I'll follow my own advice and back down some.

Jeroen
Joined: 25 Nov 05
Posts: 379
Credit: 740030628
RAC: 556

I have seen a nice boost in performance with the latest 1.50 Beta application for my NVIDIA card.

[pre]
CPU:            Intel Core i7 920 @ 4.0 GHz
Threads:        4 - HT disabled
PCIe slot:      x16, version 2.0
1st GPU:        EVGA GTX 780 Ti
RAM:            3 x 2 GB
Concurrency:    1 @ 0.2 CPUs + 1.00 GPUs
CPU tasks:      none
Free CPU cores: 3
OS:             Windows XP x64

Application              Elapsed (s)  CPU time (s)  Sample size
(Parkes PMPS XT) v1.39   6331         1986          25
(Parkes PMPS XT) v1.50   2717          579          37
[/pre]
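Working from those averages, v1.50 comes out at roughly 2.3x the speed in elapsed time (6331 / 2717 ≈ 2.33) while using about 3.4x less CPU time (1986 / 579 ≈ 3.43).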

The power draw of the card is peaking at 103%, which is significantly higher than I have seen with any of the previous BRP GPU applications.

Jeroen

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5842
Credit: 109376876219
RAC: 35985569

Quote:
There is a much less painful way to deal with the data collection & statistics... I will outline the method that I use in a Windows environment.


The ease of dealing with data collection and the level of pain you will suffer is likely to vary widely from person to person. My formal education was in the sixties and early seventies and PCs (and Microsoft) didn't exist then. I grew up with mainframes and later departmental workstations/servers in a University environment and grew to really dig the Unix way of doing things - e.g. Vi for editing and Troff for document processing. By the time Microsoft had started to dominate, I was in the fortunate position of being able to delegate word processing type tasks to underlings. I know what the screen images look like but I've never really 'used' any Microsoft Office component.

I wrote a lot of documentation of all types but I always preferred to use Unix/Troff if I had to do it myself. I had good staff who could do things that needed MS Office so I just kept using Unix tools for technical documentation I needed, right up to the point I retired. In the last few years of my working life, my business had Windows and FreeBSD machines - you can probably guess who was using which :-). Just after retirement, I switched to Linux and have been extremely happy ever since. I've never had a good reason to use any of the free office suites that come with Linux. I know where they are in the repos but I've never installed one.

These days I have access to a bunch of iMacs, courtesy of my daughter's business. The one I've 'commandeered' recently had a hard disk failure, so I worked out how to open it up and replace the hard drive. I set up a live USB external drive with the latest OS X Yosemite and have used it to do upgrades on other machines in the office and a fresh install on the one with the new hard drive. Sure beats 5GB downloads on each machine.

This morning I installed LibreOffice on it and have actually found the 'Import from the Web' function you talked about. So I'm sitting here looking at a spreadsheet with several imported pages of Einstein results, headings and all, all nicely lined up into rows and columns, but without much of a clue as to what to do next :-).

I vaguely remember that you can process numbers in columns by creating formulae or using functions, so by clicking on a thing called 'function wizard' and doing a bit of judicious reading of the help, combined with a generous dollop of trial and error (mainly error), I've now actually managed to get reasonable looking values for MIN(), MAX(), AVERAGE(), STDEV() and VAR() calculated and stored in the cells immediately below the two particular columns we are interested in. Of course, I am assuming that these actually are the right functions and that they are smart enough to ignore non-numeric stuff like headings and the double dashes in the columns for 'in progress' results, but at least nothing seemed to complain :-). As Dr Mike would say in his own inimitable style - WHOOPITY GOLF CLAP!!!!
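In case it saves anyone else a trip through the function wizard, the formulas end up looking something like this (the B2:B108 range is hypothetical - point it at wherever your imported column of elapsed times actually landed, and likewise for the CPU time column):

[pre]
=MIN(B2:B108)
=MAX(B2:B108)
=AVERAGE(B2:B108)
=STDEV(B2:B108)
=VAR(B2:B108)
[/pre]

These functions do ignore text within the referenced range, so the headings and the '--' placeholders for in-progress results shouldn't upset them.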

So a big thank you goes to Gord for shaming me into dipping my toe into the LibreOffice pond. The first host only took about 6 hours to process before finally coming up with the numbers for 107 data points. I'll add this new host to my list shortly. Now that I have some idea of what I'm doing, maybe the next one will have a chance of actually beating my manual method, particularly as the sample sizes are going to grow progressively day by day and the manual method will become a bit tedious :-).

EDIT: Host 05 is now posted in the Results thread. I decided that presenting the mean values away from the corresponding standard deviation and variance values was sub-optimal and confusing, so I restructured the template (just for the final host) and I think it looks better now. I'm sorry if this causes any inconvenience for anyone. If there are no cries of anguish, I'll probably go back and fix the first 4 hosts as well.

Cheers,
Gary.

Bikeman (Heinz-Bernd Eggenstein)
Moderator
Joined: 28 Aug 06
Posts: 3522
Credit: 686005872
RAC: 601570

Quote:

I have seen a nice boost in performance with the latest 1.50 Beta application for my NVIDIA card.

[...]
Jeroen

Interesting, this confirms my suspicion that NVIDIA cards will benefit much more from the recent optimization than AMD cards.

People who use both NVIDIA and AMD cards, on E@H as well as other GPU projects, might be able to give a rough indication of whether BRP6-Beta has leveled the playing field between NVIDIA and AMD, in the following sense:

For every project P_i, a given NVIDIA card of yours will perform at some factor k_i relative to a given AMD card of yours. How does the factor for E@H compare with the factors for your other projects?
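(A made-up illustration: if your NVIDIA card runs other projects about twice as fast as your AMD card (k ≈ 2.0), but under the old app managed only k ≈ 1.2 on E@H, and now reaches k ≈ 2.0 with BRP6-Beta, that would say the playing field here has indeed been leveled.)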

Cheers
HB

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0

I have just started to run a few of the Beta tasks; once I get rid of a few instabilities, I will post some details after a few days.

I am changing a few things at the moment so my figures will be variable, but here are some immediate comments.

I am running two GTX 460s with an i3 on Linux.

The CPU resident memory size of the beta tasks is much larger: 119 vs 72 MB. The GPU memory size seems the same.

The tasks are completing in about 75% of the time.

The reported CPU time is massively different (lower), although I have never put any value in that metric.

AgentB
Joined: 17 Mar 12
Posts: 915
Credit: 513211304
RAC: 0

Quote:

The CPU resident memory size of the beta tasks is much larger: 119 vs 72 MB. The GPU memory size seems the same.

Just to be more precise, both grow in size as the run progresses, but the beta more so.

The second GPU (in a PCIe1 x8 slot) comes very close to the performance of the card in the PCIe2 x16 slot (about 10% slower).

This is a very significant improvement (and temperature increase): run times look to be dropping by 50% or more, and the PCIe bandwidth utilization is reported at just 1%!

Wonderful.

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5842
Credit: 109376876219
RAC: 35985569

I've added an improved template, with placeholder strings suggesting what information to provide, to the opening post of the RESULTS thread. There is a set of step-by-step instructions on how to use the template. If you care to browse the opening post, you should get a good idea of what I've done. Please feel free to offer criticisms/suggestions for improvement if you see something you don't like.

I've also added HOST 06 to the bottom of the list.

Cheers,
Gary.
