The Nvidia Pascal family cards (such as the GTX 1060, 1070, and 1080) were a substantial step forward in power efficiency when they arrived about two years ago. Not so many months ago, the more capable ones had been bought up by blockchain miners, so supply was limited and prices from those offering delivery were double list or more. (This was not Bitcoin, as the efficiency crown for that mining task had moved on to more customized silicon.)
I just happened to look at an availability tool a week or three ago, and noticed that the Pascals were rather broadly back in stock. I'm not sure whether the particular mining opportunities that had made them scarce are now serviced by other silicon more efficient for the task, or whether the widely publicized declines in various cryptocurrency prices lowered estimated returns on new purchases too much to justify them. Some stock market gossip has suggested that Nvidia card suppliers may be facing a substantial inventory overhang from the sudden drop in crypto related purchases.
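To make that "estimated returns" reasoning concrete, here is a minimal payback sketch; every number in it is a hypothetical placeholder, not a measurement of any real card or coin:

```python
# Back-of-the-envelope mining payback sketch. All inputs are hypothetical
# placeholders; plug in real hashrate, coin price, and power figures to use it.

def payback_days(card_price, hashrate, coins_per_hash_day, coin_price,
                 power_watts, price_per_kwh):
    """Days for net mining income to repay the card's purchase price."""
    revenue = hashrate * coins_per_hash_day * coin_price     # $/day
    power_cost = power_watts / 1000.0 * 24 * price_per_kwh   # $/day
    net = revenue - power_cost
    return float("inf") if net <= 0 else card_price / net

# The power bill doesn't fall with the coin price, so payback time
# grows faster than the price drops:
print(payback_days(700, 30e6, 2e-10, 500, 120, 0.12))  # ~264 days
print(payback_days(700, 30e6, 2e-10, 250, 120, 0.12))  # ~606 days
```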
I've just seen a credible-seeming rumor strongly suggesting that Nvidia will launch a Turing series of GPUs (probably with product names such as 1170 and 1180) in early August, with potential reviewers having received invitations to marketing activities at Gamescom.
I have zero opinion on whether the change from Pascal to Turing will be good or bad for Einstein use.
The true badge of success would be for Gary Roberts to do a major fleet upgrade with Turing cards. I don't expect that; if it did happen, it would probably mean Nvidia had underpriced the cards.
There was the announcement of custom ASIC silicon for Ethereum and similar cryptocurrencies a month ago. I think that might have had something to do with the drop-off in card demand, along with the depressed prices of coins.
Curious about your rumor. I have seen just the opposite: no new cards until next year, while they work off the excess inventory of Pascal chips.
I've seen that one also, on an investor's rumor-mill forum, and think it reflects a poor understanding of how businesses actually run semiconductor product cycles.
The more concrete reason to doubt a near-term release is that the Nvidia CEO himself not long ago threw cold water on the idea of an imminent announcement, though the language he used was generic enough to accommodate a wide range of outcomes. A month ago, when asked about a Turing announcement in a public forum, he replied, "It will be a long time from now. I'll invite you, and there will be lunch."
The specific article I alluded to is posted here:
https://wccftech.com/nvidia-launching-next-generation-graphics-cards-at-gamescom-2018/
Form your own opinion; better yet, comment here on what you consider the strengths and weaknesses of that report.
And yes, I am aware that there has been more than one previous specific rumor of a Turing release which proved false.
The ramp of GDDR6 memory says there will be new cards out soon. I'm hoping it will bring a bit of efficiency improvement to crunching. Even a 10% performance boost at 10% less power would be worth getting rid of my Maxwell cards. NV has basically made each x70 card about equal to the previous gen's x80 card.
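For what it's worth, those two 10% figures compound into more than either suggests alone; a quick sketch of the perf-per-watt arithmetic (Python, purely illustrative):

```python
# Perf-per-watt change from +10% throughput at -10% board power.
old_perf, old_power = 1.00, 1.00   # normalized baseline (say, a Maxwell card)
new_perf, new_power = 1.10, 0.90   # the hoped-for next-gen numbers

gain = (new_perf / new_power) / (old_perf / old_power) - 1
print(f"Efficiency gain: {gain:.1%}")  # 22.2% -- more than 10% + 10%
```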
But you fail to realize that
)
But you fail to realize that GDDR6 is NOT specifically targeted at GPU cards. It has multiple use cases.
Going Beyond GPUs With GDDR6
There are lots of other uses for GDDR6, and I can see that the new push for AI in automotive platforms is where the current production is going right now. And GDDR6 is just starting to ramp up, nowhere near normal production-run capacity.
Are you referencing this
)
Are you referencing this quote in the article?
“one of [the] necessary memory solutions” for a diverse set of use cases, including artificial intelligence, virtual reality, self-driving cars and 4K displays.
Because those are all applications that use graphics processors, and the ones that aren't typical graphics cards don't consume anywhere near as much memory as graphics cards do. The G in GDDR is for graphics.
GDDR production requires specific timing to meet the demands of graphics cards. Other types of applications can wait on production. All three major memory manufacturers wouldn't be ramping production at the same time for other small uses of GDDR6.
What I meant was that it was not targeted specifically at consumer graphics cards. There are lots of specialized graphics products needed for enterprise, supercomputing, AI, and automotive automated-driving systems.
As far as GDDR is concerned, there is no difference whether it goes into consumer or professional products. The new NV chip is triggering the production ramp of a new GDDR memory, and this will be the main reason for its release. The article and this thread are about NV's next-gen product, and that is the driver for GDDR6. Nothing else. Something else may end up using it, but if there were no new GPU generation there would be no GDDR6 ramp.
You may have a point. The usual YT tech channels have mostly been saying not to expect any new GPUs this year.
An interesting financial analysis of Nvidia over at Seeking Alpha produced this snippet, along with an image of an engineering sample of a supposed GTX 1180.
Why would they need four fans on that card? And when has Nvidia ever used three 8-pin power connectors on a commercial GPU? The only one in recent memory was the KFA2 GeForce 1080 Ti. Even the GTX 690 only used two 8-pins.
Well, notice that the actual GPU silicon is missing. The fans appear to be used solely for cooling the VRMs. I too wonder about the three 8-pin power connectors.
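On the connector question: per the PCIe power-delivery spec, each 8-pin plug is rated for 150 W and the x16 slot supplies up to 75 W, so three 8-pins would imply an in-spec ceiling well beyond any recent consumer card. A minimal sketch of that arithmetic (illustrative only):

```python
# Board power ceiling implied by the PCIe power-delivery spec.
SLOT_WATTS = 75        # PCIe x16 slot provides up to 75 W
EIGHT_PIN_WATTS = 150  # each 8-pin PCIe connector is rated for 150 W

def board_power_ceiling(n_eight_pin_connectors):
    """Maximum in-spec board power for a given 8-pin connector count."""
    return SLOT_WATTS + n_eight_pin_connectors * EIGHT_PIN_WATTS

print(board_power_ceiling(2))  # 375 W -- a typical dual 8-pin high-end card
print(board_power_ceiling(3))  # 525 W -- what that engineering sample implies
```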