Pascal again available, Turing may be coming soon

mmonnin
Joined: 29 May 16
Posts: 291
Credit: 3391016540
RAC: 2834040

There are quite a few GDDR chips around the missing GPU chip, maybe 1 GB each. I'd guess it's just a test sample card where they can push high current through those VRMs. Typically most of the VRMs are on the right side, and I'm not sure where the air is even going when it's pointed straight at the board.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219994931
RAC: 957543

I agree that the posted photo looks to me more like an engineering test-bed setup than an early test article of a production board. Mind you, my personal experience with production board development dates back nearly four decades, but I have been there and done that.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219994931
RAC: 957543

Another fresh leak, published at the same web site as the one that prompted me to start this thread. This one claims to be based on a leaked vendor e-mail containing the currently expected release dates and launch sequence for four 11xx cards, presumably Turing.

https://wccftech.com/nvidia-geforce-11-series-gtx-1180-gtx-1170-gtx-1160-release-date-leak/

Sometimes they start low, sometimes high.  This one starts near the top, with an 1180 purported to release on August 30.

I'm not betting my money nor holding my breath--just sharing rumors.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219994931
RAC: 957543

OK, so after leak upon leak, and conflicting rumors as to product IDs, family name, and introduction dates, today I got an e-mail from Newegg with specific listings for GTX 2080 Ti and GTX 2080 cards from multiple vendors.  So I guess that makes it official!

1. Family name does seem to be Turing, as long-rumored.

2. The gamer cards seem to be using ID's like 20n0.

3. Unlike the bottom-up introduction of the Maxwell cards (ah, the 750 of fond memory), this time they seem to be introducing higher-end cards first.

4. In the highest cards there is a lot of emphasis on better hardware support for ray-tracing, with the specific intent of allowing the real-time generation of more compelling graphics for games.  Obviously, one could in fact compute ray-tracing with the previous cards (or any CPU, for that matter), but far _slower_.  This will allow them to claim amazing performance improvement for those cases.
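
For context on that last point: the core of ray tracing is just geometry that any processor can evaluate, and the RT hardware simply runs this class of test in dedicated units, millions of times per frame. Here is a minimal, purely illustrative ray-sphere intersection in C++ (my own sketch, unrelated to any Einstein or NVIDIA code):

#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
Vec3   sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Returns true (and the hit distance t) if a ray with origin o and unit
// direction d intersects a sphere of radius r centered at c.
bool raySphere(const Vec3& o, const Vec3& d, const Vec3& c, double r, double& t) {
    Vec3 oc = sub(o, c);
    double b    = dot(oc, d);
    double disc = b*b - (dot(oc, oc) - r*r);
    if (disc < 0.0) return false;      // the ray misses the sphere entirely
    t = -b - std::sqrt(disc);          // distance to the nearest intersection
    return t > 0.0;
}

int main() {
    double t;
    if (raySphere({0,0,0}, {0,0,1}, {0,0,5}, 1.0, t))
        std::printf("hit at t = %.2f\n", t);   // expect t = 4.00
}

A game needs enormous numbers of such tests per frame against full scene geometry, which is why doing them in dedicated hardware rather than on shader cores (or a CPU) is the selling point.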

I'd be surprised if the Einstein applications, built on the software base they use, get anything beneficial out of the ray-tracing support hardware at all, especially at first, or even ever.  I'd also be surprised if the highest members of the family really give enough extra performance over (future introduction) mid-range members to pay off here.

I've not read enough of the performance or power-efficiency claims to have a well-founded opinion on whether the mid-range cards might make a compelling case for themselves here (as, in my opinion, the Pascal-generation 1060 and 1070 did).  But I think that is the way to bet.

Actually I'd love it if one of you got a 2080 and a 2080 Ti as soon as possible, ran Einstein under well controlled conditions allowing comparison for speed and power consumption with Pascal cards, and posted the results here.  But the prices are quite high, and with my stated doubts, I'll probably wait for my first trial until I see mid-range cards available.

Anonymous

Nvidia 2080 Ti at Best Buy: $1199.00.  Out of my league.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219994931
RAC: 957543

It seems the first two cards out will be designated RTX 2080 Ti and RTX 2080, at princely prices.

Multiple sources assert that 2070 and 2060 cards will come along a month or two later.

There is some ambiguity as to whether the 2060 will carry the RTX prefix (where I assume the R refers to the ray-tracing enhancement).  It might be better for Einstein use if it does not, as the price would then perhaps not carry a premium for a capability I think we are unlikely to find beneficial.

The base price for the non-Founders Edition 2070 is asserted to be $499, so one hopes the 2060 will be considerably less.

On the alleged release sequence, I currently imagine I might buy a 2060 promptly on availability.  I have a box currently running a single 1050 which would surely be considerably upgraded by that.  As it is a single graphics card box, I could get a pretty direct comparison.  My other two boxes run two cards each, which would make comparisons rather muddier.

DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 0

I'll probably be buying a 2080 or possibly a 2080 Ti before the end of the year.  I game, and my 1080 doesn't really have the power needed at 4K.

That said, I also suspect the new ray-tracing and tensor cores will do nothing for E@H performance.  The latter at least sound like a specialized form of general-purpose compute, though, so hopefully some other project will come along that can utilize them without also wanting the rest of the GPU hardware.
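
For what it's worth, each tensor core is usually described as performing one small fused matrix multiply-accumulate per clock: D = A×B + C on 4×4 tiles, with FP16 inputs and FP16 or FP32 accumulation on Turing. A plain scalar C++ sketch of that single operation, just to show what kind of specialized compute it is (illustrative only, not how the hardware is actually programmed):

#include <array>
#include <cstdio>

using Tile = std::array<std::array<float, 4>, 4>;   // one 4x4 operand tile

// Scalar reference for the per-clock tensor-core operation: D = A * B + C.
Tile tensorOp(const Tile& A, const Tile& B, const Tile& C) {
    Tile D{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float acc = C[i][j];
            for (int k = 0; k < 4; ++k)
                acc += A[i][k] * B[k][j];   // multiply-accumulate chain
            D[i][j] = acc;
        }
    return D;
}

int main() {
    Tile A{}, B{}, C{};
    A[0][0] = 2.0f; B[0][0] = 3.0f; C[0][0] = 1.0f;
    std::printf("D[0][0] = %.1f\n", tensorOp(A, B, C)[0][0]);   // expect 2*3 + 1 = 7.0
}

Whether any distributed-computing project outside of machine learning can keep that kind of unit busy is exactly the open question.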

cecht
Joined: 7 Mar 18
Posts: 1533
Credit: 2898692235
RAC: 2136649

Here are some comparative specs on the new GeForce cards, from the TechPowerUp GPU database:

RTX model   Cores  TMUs  ROPs  Clock (MHz)  GDDR6 (MB)  Bus (bits)  FP32 (GFLOPS)  FP64 (GFLOPS)  TDP (W)  Base $
2060         1536   128    32         1410        8192         128          4,792          149.8      120     n/a
2070         2304   144    64         1410        8192         256          7,465          233.3      175     499
2080         2944   184    64         1515        8192         256         10,068          314.6      215     699
2080 Ti      4352   272    88         1350      11,264         352         13,448          420.2      250     999
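
For readers wondering where the GFLOPS columns come from: peak FP32 for these cards is normally quoted as shader cores × 2 FLOPs per clock (one fused multiply-add) × boost clock, and FP64 on Turing runs at 1/32 of the FP32 rate. The clock column above is the base clock, so it won't reproduce the numbers directly; the 1620 MHz boost figure in the sketch below is my assumption for the reference 2070.

#include <cstdio>

int main() {
    const int    cores    = 2304;   // RTX 2070 shader count from the table above
    const double boostGHz = 1.62;   // assumed reference boost clock in GHz
    const double fp32 = cores * 2.0 * boostGHz;   // ~7,465 GFLOPS, matching the table
    const double fp64 = fp32 / 32.0;              // Turing FP64 runs at 1/32 rate, ~233 GFLOPS
    std::printf("FP32 %.0f GFLOPS, FP64 %.1f GFLOPS\n", fp32, fp64);
}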

Ideas are not fixed, nor should they be; we live in model-dependent reality.

archae86
Joined: 6 Dec 05
Posts: 3157
Credit: 7219994931
RAC: 957543

Today was embargo release day for initial reviews of the 2080 and 2080Ti.  They do better on existing games than the extreme pessimists forecast.  They are awfully expensive.

One element of my personal situation is that the CPU support characteristics of the Windows OpenCL implementation are considerably less friendly to multi-card systems than the previous CUDA code was.  Of my three Einstein-contributing systems, the two dominant ones are currently dual-card Pascal-generation systems.
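
To make the multi-card point a bit more concrete, here is a minimal sketch (my own illustration, not Einstein's application code) of how an OpenCL host program enumerates the GPUs in a box. Each concurrent GPU task drives its own command queue, and my understanding is that on the Windows OpenCL runtime the host-side waits tend to keep a CPU core busy per task, which adds up quickly on dual-card systems.

#define CL_TARGET_OPENCL_VERSION 120   // silence header version warnings
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platform;
    cl_uint numPlatforms = 0;
    if (clGetPlatformIDs(1, &platform, &numPlatforms) != CL_SUCCESS || numPlatforms == 0) {
        std::printf("no OpenCL platform found\n");
        return 1;
    }

    cl_device_id devices[8];
    cl_uint numDevices = 0;
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 8, devices, &numDevices);

    for (cl_uint i = 0; i < numDevices; ++i) {
        char name[256] = {0};
        clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, nullptr);
        // One command queue (and, in practice, one busy host thread) per running GPU task.
        std::printf("GPU %u: %s\n", i, name);
    }
    return 0;
}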

If Turing is a major generational improvement in Einstein productivity per unit of power consumption (as the Maxwell and Pascal generations were before it), then I'm thinking about making the following substitutions:

 System Was       Becomes Comments
 Stoll7 1050+1060 2080    May increase power, more Einstein
 Stoll8 1050      2070    More power, much more Einstein
 Stoll9 1060+1070 2080Ti  Crazy expensive. Only warranted with data

This would all be far better informed by actual Einstein results in service.  I may supply that need, both for myself and others, by going ahead with a 2080 FE purchase quite soon.  I really don't know what to expect, either for Einstein performance or power consumption.

Keith Myers
Joined: 11 Feb 11
Posts: 4963
Credit: 18709277611
RAC: 6328777

Read the AnandTech benchmarks and pay attention to the Compute portion.  The Folding@Home, N-body physics, and OpenCL Geekbench 4 results are up and are pertinent to distributed-computing tasks.  The Geekbench 4 results are 200% better than Pascal's.

Actual finished tasks at each project will determine the real benefit of Turing.
