General Lee
Member
Consumer Volta is unlikely to launch before SK Hynix gets GDDR6 going, and that's not before next year.
There's no reason why Volta can't launch with GDDR5X, especially the upper-mid-range one. GDDR5X's highest specced speed is well within that of GDDR6; it's entirely possible that Micron will be able to hit it this year.
I also can't remember a single time when NV waited for some memory tech to launch their new GPUs. They have updated previously launched GPUs with new memory a number of times though.
Take this with a grain of salt, but people on Reddit are pointing out that there were likely multiple tears per frame in the awful Vega Prey demo. This could be a CrossFire issue to do with the unfinished cards.
But it could also be that the framerate was much higher than the projector could output. The question then is: why didn't they just use a single card? Or show an FPS counter?
Anyway: The wait for Vega continues.
Edit: sources
https://www.reddit.com/r/Amd/comments/6edd3h/amd_rx_vega_computex_pray_demo_fps/
https://www.reddit.com/r/Amd/comments/6ecdfr/slug/di9irs6
I think Reddit was more concerned about needing to CrossFire Vegas when Prey zips through 4K on one 1080 Ti.
Not just Reddit's concern. A 980 Ti gets around 45 fps, I think. So if that Vega card is between the 980 Ti and 1080 Ti, then they should really only need one. It sent all the wrong signals.
This timetable you suggest for Volta sounds an awful lot like fanboyism without anything backing it up. Sure Nvidia can use older memory in cheaper products, but that's hardly what people are waiting for.
No one's saying Nvidia is "waiting" on memory. They're not releasing anything to replace 1080 Ti so soon after its release.
Suuuure. I've just told you what's backing it up, in addition to a whole lot of other factors. What sounds like fanboyism is your apparent inability to read what I've said.
Autumn is not "soon" after 1080Ti release.
If you call that backing up your statements you're delusional. And in terms of GPU generations these days, spring to autumn is a blink of an eye.
Stage 1.
Of course, I'm delusional - and not the guy who expects consumer Volta later than general GV100 availability in 4Q17.
Spring to autumn is 6-9 months (considering that the 1080 Ti launched in March). 6-9 months is not a "blink of an eye". GP104 launch to autumn is 15-18 months, and this is actually in the higher range of a typical NV product update schedule (the lower range is about 12 months). If you still can't see it then there's no reason to continue this discussion.
Looking forward to Ryzen + Vega APUs no matter what.
It was a risk nVidia didn't have to take.
AMD, on the other hand, being underfunded and an underdog had to gamble.
It seems it didn't quite work out.
My main concern is not that it doesn't take on the 1080 Ti, but that they are still not able to actually bring a product to market (faster than 1070/1080 is the biggest part of the high-end market anyhow).
Also, Raja saying that there are only a handful of GPU manufacturers because it's so hard to write drivers smells... like:
Lisa: dafuq Raja, why am I not impressed by how our HBM2 500mm2 chip performs?
Raja: Oh, that's... drivers! It rocks, but we don't have good enough drivers (points at the dudes responsible for software)
GeForce FX 5800.
What you lack is any proof of your suggested timetable for the launch of consumer Volta. Sorry if I don't just take things on blind faith based on no factual information. I've seen it time and time again: fanboys have these ridiculous notions of when next gen "should arrive", yet have nothing but vague assumptions based on when previous gens arrived. There's solid information out there from Nvidia's own GTC that GDDR6 is aimed at early 2018, and that's as close to a reliable hint as you'll get at this point. Could Nvidia launch some low-end Volta before that? Perhaps, but who cares; people talking Volta are expecting a 1080/Ti successor.
Ask Intel how hard it is. Seriously.
No one is expecting a Ti successor this year. Nvidia always releases their x80 & x70 cards first, with a Ti months later. I could easily see the 1070 & 1080 successors using GDDR5X.
GV104 is not TI.
I can't imagine any other explanation besides "performance is lacking".
Look at their compilers, as well as a bunch of other software they write.
You mean that company that is 5 times bigger than AMD + nVidia together?
Are you sure they'd say "we suck at writing software"?
This. We're talking about a Volta-based cutdown where the "1180" would outstrip the 1080Ti by a few percentage points, and an "1170" that would be positioned in the neighborhood of the 1080. Wouldn't be the first time Nvidia has used that approach on a similar timetable. The "1180Ti" would launch ~spring 2018, with the aforementioned GDDR6.
So that wasn't a great covfefe huh? So I should stay green?
What am I supposed to see? A fantastic compiler?
Graphics drivers require more effort than all their other drivers combined. They don't invest in those.
The key here is that it requires DIFFERENT expertise/experience and many iterations to progress (as well as making silicon for those). In my opinion, of course; I'm not claiming these are facts.
Okay.
Was there ever only a 12-month wait between x80-to-x80 launches? It's usually longer, no?
GDDR2 wasn't really a new memory type; it was basically just overclocked DDR1 - which is partially why the FX 5800 failed on power efficiency, as this OCed DRAM dissipated insane amounts of heat. For all that went wrong with the FX series, I don't think the DRAM was the reason for its late introduction.
Not sure why 60 is something to boast about.
The size of it is vastly irrelevant in the context of a monster company like Intel.
GDDR5X allows 384 GB/s on a 256-bit bus... and that's only if the manufacturer doesn't have 13/14 Gbps GDDR5X.
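For reference, the peak-bandwidth arithmetic behind these numbers is just per-pin data rate times bus width. A quick sketch (the configs below are just illustrative examples, not leaks):

```python
# Peak memory bandwidth: per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(12, 256))  # 384.0 GB/s -- the 256-bit figure above
print(peak_bandwidth_gbs(11, 352))  # 484.0 GB/s -- 1080 Ti's shipping config
```

So hitting 13/14 Gbps on 256 bits would land in the 416-448 GB/s range, which is why the chip speed matters as much as the bus width.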
Maybe if those cards were the 4 GB variant and coming in under $400? That would be a big deal if that were the case.
Rumors from a while back point to there being a somewhat budget Vega card close to the 1070 in terms of performance. So if those were the lower-end versions with only 4 GB of memory, running together in CrossFire at 4K 60 fps or well above, then that's impressive, as it's a little over 1080 Ti price for two cards. But all this is conjecture at this point without solid evidence to go by.
Sure:
- 480->580 in 8 months
- 680->780 in 14 months
But then 12 months from 1080 launch is right now so the proper question is - what's the biggest typical period between NV gen update? If we take the longest periods we're looking at:
- 980->1080 in 20 months (not their decision but a result of TSMC's failure with 20nm)
- 780->980 in 16 months
The second one is important because there are many similarities between Kepler to Maxwell and the upcoming Pascal to Volta switches.
So where does that put Volta? Obviously we've already passed the minimal 12m mark so it will be more. May'16+16 months is Sep'17. May'16+20 months is Jan'18.
Those who expect x80 Volta next year assume (for some reason) that Volta will be as late as Pascal was relative to Maxwell (or even later) - but Pascal was late because of the production process failure at TSMC, and nobody expects Volta to wait for 7nm (which is likely 1H19 for GPU production). That's why I really don't get people who expect Volta next year. It will most likely launch this autumn, starting with GV104 cards which will replace the 1080 and 1080 Ti, as usual.
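The gaps above are easy to sanity-check yourself; a small sketch (launch dates rounded to the month, so treat the results as approximate):

```python
# Rough month gaps between the NV x80-class launches cited above.
# Dates are rounded to the 1st of the launch month.
from datetime import date

launches = {
    "480": date(2010, 3, 1), "580": date(2010, 11, 1),
    "680": date(2012, 3, 1), "780": date(2013, 5, 1),
    "980": date(2014, 9, 1), "1080": date(2016, 5, 1),
}

def months_between(a: date, b: date) -> int:
    """Whole-month difference between two launch dates."""
    return (b.year - a.year) * 12 + (b.month - a.month)

print(months_between(launches["480"], launches["580"]))   # 8
print(months_between(launches["780"], launches["980"]))   # 16
print(months_between(launches["980"], launches["1080"]))  # 20
```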
Wait, what? Wouldn't you need like 12 GB of GDDR5X at 11-12 Gbps to get near 400 GB/s of bandwidth? And isn't a 12 GB config restricted to a 352-bit bus?
I don't think we will see a 16 GB 1170/1180. At those price points aren't they kind of stuck with 8 GB and a 256-bit bus?
Hard to disagree with dr_rus on timings, if 11xx does indeed use GDDR5x.
It seems like too soon only because underfunded underdog AMD is struggling to roll out high end cards (and even that is likely because they bet on hbm2)
I'm pretty sure if those were lesser Vegas they'd mention it. (Raja would in any case)
Also note the "not optimized for Prey" driver (heh).
Interesting timeline comparison. I wonder where the Titan fits in all this.
I'm considering upgrading from my 980Ti to the Volta Titan, it would be such a massive jump
This whole demonstration was bad. They kinda missed the moment when their hype attempts for Vega started to look weak and actually damage Vega's image. Hype can take you only so far without actual products or at least some solid benchmark results.
I also like how he just plainly admits that they're using software heavily optimized by them to demonstrate Vega's capabilities. They should just stop saying anything until the cards hit the market at this point.
You only need 12 GB of RAM for a 384-bit bus... a 256-bit bus allows 8 GB.
Reads to me like he's saying crossfire Vega is above a 1080 Ti... which doesn't mean anything.
Yeah, complex, so what? Is it complex to the point of "Intel's GPUs are underwhelming because Intel doesn't know how to write drivers"? (Raja's point) I doubt it, to be honest.
It's so complex that it's a contributing reason why Intel gave up on creating a high-end GPU with a revolutionary architecture (read up on Larrabee if you are unfamiliar with it).
What's going to be interesting to see is how the Frontier Edition will perform in games compared to the eventual RX Vega. There's no doubt going to be some game benchmarks for FE, but it might end up looking like a disappointment at first if the drivers aren't ready and the gaming version should have higher clocks.
The Prey demo certainly didn't make a whole lot of sense, other than showcasing how AMD will finally have a premium option for both CPU and GPU. I highly doubt ThreadRipper will be useful for gaming, even with CrossFire. Maybe if they had showcased working quad-CF...
Subjecting oneself to quad CF to play games sounds like a form of torture. Yes, I do want to burn myself to death in an oven fueled by my own wasted money and shattered dreams of actual CF or SLI support. How did you know?!
I think Reddit was more concerned about needing to crossfire Vega's when Prey zips through 4K in one 1080Ti.
Confirmed in a tweet that he meant above 60. Second part: "comfortably above *any* single GPU" is more interesting.
So is he implying above 1080ti then?
Well, except Prey runs better on nvidia cards.
Agreed on ThreadRipper.
Can't make sense of either this demo or Raja's comment...