
nVidia RTX 2080 Ti/2080/2070/2060 Official Announcement - Gamescom Conference Live Today 6PM CEST/12PM EDT/9AM PDT:

ruvikx

Banned
What an absolutely minuscule change that brings nothing to gameplay, while you have to sacrifice 1440p/4K and 60fps to get it.

"Nvidia does what AMDon't!"

Aka Sega's Blast Processing... 2018 edition. Seriously though, haven't they learned anything? Their HairWorks tech also had lots of "ooh-ahh" demos... until people realized the performance hit wasn't worth it. It's similar here, especially because it's just one element (shadows), i.e. there's a whole load of other stuff which needs to be improved across the entire screen for it to be a truly impressive 'generational leap'.
 

Three

Member
Wake me up when we have a single card solution for 4k 144 stable fps.

By the time that card comes out we will have 8K 240Hz monitors. There is no point in chasing an ever-moving goalpost; graphics cards are never going to hit the bleeding edge of other tech advancements. I am underwhelmed by this graphics card and the ray tracing tech so far, though, like you are.
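To put numbers on how far that goalpost moves, here's a back-of-the-envelope sketch in Python (raw pixel counts only, ignoring shading cost, bandwidth and CPU limits):

```python
# Raw pixel throughput needed for each target.
targets = {
    "4K @ 144 Hz": (3840, 2160, 144),
    "8K @ 240 Hz": (7680, 4320, 240),
}

rates = {}
for name, (w, h, hz) in targets.items():
    rates[name] = w * h * hz  # pixels to shade per second
    print(f"{name}: {rates[name] / 1e9:.2f} billion pixels/s")

# ~1.19 vs ~7.96 billion pixels/s: by the time a true "4K 144" card
# exists, the 8K 240 Hz goalpost sits another ~6.7x away.
print(f"goalpost moves {rates['8K @ 240 Hz'] / rates['4K @ 144 Hz']:.1f}x further out")
```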
 

Redneckerz

Those long posts don't cover that red neck boy
I am actually legit amazed that a lot of folks can't comprehend how impressive this stuff is. Note: it is supposed to not be noticeable. The only reason it is in Battlefield is because it's exaggerated, whereas in Metro you only notice it because there was a comparison.

True reflections can have implications for new gameplay though, especially for puzzles in adventure games where you can have those crazy mirror effects you see at exhibitions (hall of mirrors, and such).
 

thelastword

Banned
Pretty much the conference.....

when 500 bucks is entry level hahaha

these nvidia charts are always nonsense. can't wait for the golden-sample benchmarks they're gonna ship out to those review outlets, totally free of course, and early.

It's Nvidia after all, shady as hell company.
Money post.

In their chart they compare a GTX 1080 against an RTX 2080: the 1080 has 320 GB/s of bandwidth whilst the 2080 has 448 GB/s, similar to a 1080 Ti. So what do they do? They compare at the resolution which demands the most bandwidth (4K), so they could make the GTX 1080 look shitty. Was the 1080 ever a 4K card? No, it wasn't... Yet I have no bones about 4K tests, because that's where we're heading with traditional rendering... but they didn't pitch traditional rendering at 4K 60fps, did they? They pitched ray tracing for two hours, not at 4K, but at 1080p and sub-60fps...
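To put rough numbers on that (a purely back-of-the-envelope sketch; the two bandwidth figures are from the public spec sheets, everything else is simplified):

```python
# GTX 1080 vs RTX 2080 memory bandwidth, and how per-frame pixel work
# scales with the resolution Nvidia chose for its comparison.
GTX_1080_BW = 320  # GB/s (GDDR5X)
RTX_2080_BW = 448  # GB/s (GDDR6), close to the 1080 Ti's 484 GB/s

print(f"RTX 2080 bandwidth advantage: {RTX_2080_BW / GTX_1080_BW - 1:.0%}")

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    # More pixels per frame means more framebuffer/texture traffic,
    # so a 40% bandwidth gap matters most at the highest resolution.
    print(f"{name:>5}: {w * h / base:.2f}x the pixels of 1080p")
```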

So I've mainly been asking: why not show the settings for the games tested on the graph? Is it ultra, is it high, does it have AA? All I saw was 4K with HDR, but why HDR? Because everybody knows Pascal suffers with HDR on, whether against the competition or even against NV's newest cards. So at 4K the GTX 1080's limited bandwidth takes a tumble, and with HDR on even more so, because everybody is playing modern games at 4K with HDR on their 1080s ;) Two areas where the 1080 can suffer heavy perf loss...

I'm just confused by Nvidias strategy here, but apparently they can do whatever they want now: There is no competition in the high end sector after all.
I've heard a YouTuber say recently that NV is going to have the best card, that it's a foregone conclusion, so they could innovate and nothing could shake them. People said the same thing about CPUs, yet AMD was much further behind Intel in the CPU space, even more so than they are behind NV in the GPU space.

Vega was not fully realized in terms of engineering due to a manpower shift, but that won't be the case with Navi, as they are getting good R&D dollars for Navi by collaborating with Sony on that project. They now have Vega at 7nm and are giving Vega the love it needed at first light. So they could very well launch a 20.9 TF Vega this year, open up their own RT solution and offer folks a 4K 60fps solution with HDR, CB and FP16, and even expand on Vega's features.
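For context on where a figure like 20.9 TF could come from, it's just the standard peak-FP32 formula (2 ops per shader per clock). The 4096-shader count and the ~2.55 GHz clock below are assumptions plugged in for illustration, not confirmed specs:

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    # Peak FP32 throughput: each shader retires one fused multiply-add
    # (2 floating-point ops) per clock.
    return 2 * shaders * clock_ghz / 1000.0

# Today's Vega 64: 4096 shaders at ~1.55 GHz boost -> ~12.7 TF
print(f"Vega 64 @ 1.55 GHz: {fp32_tflops(4096, 1.55):.1f} TF")

# Hypothetical 7nm Vega keeping 4096 shaders but clocking ~2.55 GHz
print(f"7nm Vega @ 2.55 GHz: {fp32_tflops(4096, 2.55):.1f} TF")  # ~20.9 TF
```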

At this point AMD is not necessarily in a bad spot with their GPUs: mining helped them make some bank with Vega, and their RX 580s outperform the 1060 in many games, have more RAM and are now priced even cheaper than a 6GB 1060. Their 4GB 580s destroy 3GB 1060s, and the 570 runs all over the 1050 Ti. The Vega 56 is a very good card at its current price in the Vega lineup, and if you can get a Sapphire-based 56 or even 64, they are the best AIB Vega cards for performance, bar none; all the others are jokes in comparison.

Yet I think AMD now has a chance to topple NV in perf and resolution and even have an RT solution whilst at it. NV's best card was 11.3 TF and their best next-gen card is at 13.4 TF; that's not a huge boost. Clearly their focus was on RT cores and tensor cores, but everything looks a bit underwhelming there, with low perf and low resolution (the ray tracing push)... A re-engineered Vega will eliminate all the issues they had on first release (unfinished engineering, high wattage, low clocks). Clearly they will sort that out with Vega 7nm and even provide a better cooling solution out of the gate... and there's no way NV will beat them on the resolution and fps front with Turing if they do... Keeping in mind that the archaic DX11 is going the way of the dodo and everybody is on the Vulkan train now, which is where AMD's arch excels... Intel and AMD are fully committed to Vulkan, and even NV has no choice and is on the Vulkan train too...

I give everything a chance, so we will see how it pans out, but I think AMD is in a good position to make a move and gain lots of market share here... They don't have to follow NV; they can follow their own vision, and right now it looks gelled with the high-rez, high-fps transition. Not forgetting that games development will mostly be led on AMD hardware in the coming years due to consoles. It's also indicative of AMD's plan to push their hardware to developers even on the PC side to ease the transition (perhaps they knew about NV's focus on Turing)... They have recently shipped many Vega and Ryzen kits to PC developers just the same, to get these devs more familiarized and tuned to AMD hardware and arch as the new APIs take shape and gain momentum. PS5 development will only make this easier and extend it. I think they realized they had a chance here for more than a minute, and are putting everything in place... They are already at 7nm whilst the other teams struggle to get their nodes on, or try to do things which are not necessarily ready for primetime. It will be an interesting couple of months, that's for sure...
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Pretty much the conference.....

Money post.

In their chart they compare a GTX 1080 against an RTX 2080: the 1080 has 320 GB/s of bandwidth whilst the 2080 has 448 GB/s, similar to a 1080 Ti. So what do they do? They compare at the resolution which demands the most bandwidth (4K), so they could make the GTX 1080 look shitty. Was the 1080 ever a 4K card? No, it wasn't... Yet I have no bones about 4K tests, because that's where we're heading with traditional rendering... but they didn't pitch traditional rendering at 4K 60fps, did they? They pitched ray tracing for two hours, not at 4K, but at 1080p and sub-60fps...

So I've mainly been asking: why not show the settings for the games tested on the graph? Is it ultra, is it high, does it have AA? All I saw was 4K with HDR, but why HDR? Because everybody knows Pascal suffers with HDR on, whether against the competition or even against NV's newest cards. So at 4K the GTX 1080's limited bandwidth takes a tumble, and with HDR on even more so, because everybody is playing modern games at 4K with HDR on their 1080s ;) Two areas where the 1080 can suffer heavy perf loss...

I've heard a YouTuber say recently that NV is going to have the best card, that it's a foregone conclusion, so they could innovate and nothing could shake them. People said the same thing about CPUs, yet AMD was much further behind Intel in the CPU space, even more so than they are behind NV in the GPU space.

Vega was not fully realized in terms of engineering due to a manpower shift, but that won't be the case with Navi, as they are getting good R&D dollars for Navi by collaborating with Sony on that project. They now have Vega at 7nm and are giving Vega the love it needed at first light. So they could very well launch a 20.9 TF Vega this year, open up their own RT solution and offer folks a 4K 60fps solution with HDR, CB and FP16, and even expand on Vega's features.

At this point AMD is not necessarily in a bad spot with their GPUs: mining helped them make some bank with Vega, and their RX 580s outperform the 1060 in many games, have more RAM and are now priced even cheaper than a 6GB 1060. Their 4GB 580s destroy 3GB 1060s, and the 570 runs all over the 1050 Ti. The Vega 56 is a very good card at its current price in the Vega lineup, and if you can get a Sapphire-based 56 or even 64, they are the best AIB Vega cards for performance, bar none; all the others are jokes in comparison.

Yet I think AMD now has a chance to topple NV in perf and resolution and even have an RT solution whilst at it. NV's best card was 11.3 TF and their best next-gen card is at 13.4 TF; that's not a huge boost. Clearly their focus was on RT cores and tensor cores, but everything looks a bit underwhelming there, with low perf and low resolution (the ray tracing push)... A re-engineered Vega will eliminate all the issues they had on first release (unfinished engineering, high wattage, low clocks). Clearly they will sort that out with Vega 7nm and even provide a better cooling solution out of the gate... and there's no way NV will beat them on the resolution and fps front with Turing if they do... Keeping in mind that the archaic DX11 is going the way of the dodo and everybody is on the Vulkan train now, which is where AMD's arch excels... Intel and AMD are fully committed to Vulkan, and even NV has no choice and is on the Vulkan train too...

I give everything a chance, so we will see how it pans out, but I think AMD is in a good position to make a move and gain lots of market share here... They don't have to follow NV; they can follow their own vision, and right now it looks gelled with the high-rez, high-fps transition. Not forgetting that games development will mostly be led on AMD hardware in the coming years due to consoles. It's also indicative of AMD's plan to push their hardware to developers even on the PC side to ease the transition (perhaps they knew about NV's focus on Turing)... They have recently shipped many Vega and Ryzen kits to PC developers just the same, to get these devs more familiarized and tuned to AMD hardware and arch as the new APIs take shape and gain momentum. PS5 development will only make this easier and extend it. I think they realized they had a chance here for more than a minute, and are putting everything in place... They are already at 7nm whilst the other teams struggle to get their nodes on, or try to do things which are not necessarily ready for primetime. It will be an interesting couple of months, that's for sure...
Nobody is rooting for AMD more than me, but these statements are exaggerated. I can't comment on 1050 performance, because that's a card I'd never own, but the 580 and Vega GPUs are not winning against their Nvidia equivalents in most games. They are, at best, trading blows.

Let's say our dream came true and AMD dropped a 7nm Vega this year that was able to hit 4K 60fps and outperform a 1080 Ti: I'd consider it if the price was right.

I do need to see how the RTX 20 series performs against the GTX series in non-ray-tracing games, as that is what I truly care about. Until we see actual benchmarks, my expectation is that non-ray-tracing performance will maybe be 20% better than the 10 series.
 

thelastword

Banned
Nobody is rooting for AMD more than me, but these statements are exaggerated. I can't comment on 1050 performance, because that's a card I'd never own, but the 580 and Vega GPUs are not winning against their Nvidia equivalents in most games. They are, at best, trading blows.

Let's say our dream came true and AMD dropped a 7nm Vega this year that was able to hit 4K 60fps and outperform a 1080 Ti: I'd consider it if the price was right.

I do need to see how the RTX 20 series performs against the GTX series in non-ray-tracing games, as that is what I truly care about. Until we see actual benchmarks, my expectation is that non-ray-tracing performance will maybe be 20% better than the 10 series.
I said the 580 outperforms the 1060 in many games, and that it does; I don't see the exaggeration in that. If you use DX12 and Vulkan titles, even more so. The extra 2GB on the RX 580 is not an exaggeration either. The cheaper pricing is not an exaggeration either, if you care to check. Perhaps you should re-read what I said...
 
Last edited:

magnumpy

Member
sheesh, what's the deal with gamers being afraid of new technology? seems kind of anathematic.

i say bring on the new technology. it might be scary at first but that's just the way the cookie crumbles. I personally do not want to see technology frozen in place.
 

thelastword

Banned
TBF, ray tracing is not new technology, and Nvidia's approach is still hybrid. RTX is not ray tracing implicitly either; it could just be image enhancement too. The argument is simple: if this was ray tracing in the vein of "it just works" at 4K 60fps, I'd be stoked for these cards, even at these prices, but that's just not the case. I'm not stoked for some small sampled demo bite-sizes of games with RT at "1080p 30-50fps". I don't think the effect warrants the hit in frames and resolution at the asking price... You're paying a markup for a tech which may not even be prioritized by devs going forward... When people start going "this is too low-rez and too much of a hit on frames"... you will understand...
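For anyone wondering what 'hybrid' means in practice: rasterization still draws the frame, and rays are only spent answering specific questions, like whether a surface point can see a light. Here's a toy sketch of that one idea (the textbook ray-sphere test, nothing to do with Nvidia's actual RTX stack):

```python
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def shadow_ray_blocked(origin, direction, center, radius):
    # Textbook ray-sphere test: solve |o + t*d - c|^2 = r^2 for t,
    # with direction normalized so the quadratic's 'a' term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4  # nearest hit lies in front of the surface point

light = (5.0, 5.0, 0.0)
blocker = ((2.0, 2.0, 0.0), 0.5)  # a sphere sitting between point and light

# The raster pass already produced this surface point; the ray is only
# spent asking "can this point see the light?"
point = (0.0, 0.0, 0.0)
to_light = normalize([l - p for l, p in zip(light, point)])
print("in shadow" if shadow_ray_blocked(point, to_light, *blocker) else "lit")

# One such ray per pixel at 4K is ~8.3 million intersection queries per
# frame before any bounces -- hence the resolution drop when RT is on.
```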
 

magnumpy

Member
TBF, ray tracing is not new technology, and Nvidia's approach is still hybrid. RTX is not ray tracing implicitly either; it could just be image enhancement too. The argument is simple: if this was ray tracing in the vein of "it just works" at 4K 60fps, I'd be stoked for these cards, even at these prices, but that's just not the case. I'm not stoked for some small sampled demo bite-sizes of games with RT at "1080p 30-50fps". I don't think the effect warrants the hit in frames and resolution at the asking price... You're paying a markup for a tech which may not even be prioritized by devs going forward... When people start going "this is too low-rez and too much of a hit on frames"... you will understand...

what is unthinkable today becomes no big deal tomorrow. it's forward thinking, yes; untenable by current standards, yes; but that's just the way technology works. in 10 years RT will be no big deal.

or, we can just have graphics stay the same forever with no advancement ever again? seems kind of dull and boring...
 

octiny

Banned
sheesh, what's the deal with gamers being afraid of new technology? seems kind of anathematic.

It's not fear of new technology; it's that it raises the price by a substantial amount, and the majority of PC gamers buying these high-end cards (an already small minority) wouldn't want to be playing @ 1080p 60 FPS. So glad these cards are getting torched on enthusiast forums such as overclock.net, my 2nd home.






Looks like the resolution dropped to 720p, with slightly lower detail/geometry/AF on the hardware, once RTX is turned on; it's extremely noticeable in the background & ground even in off-screen capture. This is Mechwarrior 5, I believe.

Yikes.

Personally I'll be buying one for benchmarking purposes, but it won't be because of half-assed RT. The sad part is we'll be paying Titan pricing for a Ti-tier card ($1,150 currently being the cheapest 2080 Ti), which makes one wonder how much they're going to charge for the full-fat TU102 Titan card (300+ more CUDA cores + more memory).


Edit:

Pre-orders are clearly down overall, rightfully so, specifically for the RTX 2080 card (launch cards still available even on Nvidia's website). Tom's Hardware just put out an obvious shill piece, even blasting its own editors for saying "wait".

Just Buy It: Why Nvidia RTX GPUs Are Worth the Money

https://www.tomshardware.com/news/nvidia-rtx-gpus-worth-the-money,37689.html

Gamers Nexus response video to Tom's article:

 
Last edited:

thelastword

Banned
what is unthinkable today becomes no big deal tomorrow. it's forward thinking, yes; untenable by current standards, yes; but that's just the way technology works. in 10 years RT will be no big deal.

or, we can just have graphics stay the same forever with no advancement ever again? seems kind of dull and boring...
The thing is, there will be other RT solutions from the competition, and they won't carry such a high markup. I think the BF demo was really unrealistic in terms of how you approach and play that game; no one is going to look into a soldier's eyes whilst playing BF... Yet for a very long time now we've been asking for better framerates and higher resolutions.

I personally have been asking for better AI and physics... So ray tracing won't solve the main issues or deficiencies with our games atm; if anything, this should be approached in balance (RT, AI, physics, LOD), not by spending all your money and tech on RT in the hope that everyone falls into line at a high price... I watched the BF demo and all I saw was really low-rez fire; couple that with low frames and rez and, let's be honest, ray tracing won't look as good in the mix without the rest of the rendering getting similar uplifts like the lighting and reflections did... And you must understand, the more they try to upgrade the other rendering parts with these cards, the more they would have to sacrifice RT rendering to maintain perf and rez as a whole...
 
Last edited:

JohnnyFootball

GerAlt-Right. Ciriously.
Anybody pre-ordering these GPUs right now is a goddamn idiot.

Despite a few unverified benchmarks, we know literally nothing about their performance. Literally nothing.

Ultimately, for these cards to be worthwhile upgrades, they must provide excellent non-ray-tracing performance that exceeds the 10 series by 20-30%. If they don't do that, people are far better off saving their money and buying the 10 series if they don't have one already.

Now, I do appreciate that Nvidia is making ray tracing R&D a priority, and it will be great once it fully matures, but I don't believe the 20 series will provide that.
 
Anybody pre-ordering these GPUs right now is a goddamn idiot.

Despite a few unverified benchmarks, we know literally nothing about their performance. Literally nothing.

Depending on where you live, there's zero risk in pre-ordering. We know the NDA ends on the 14th while the cards release on the 20th, so there are a few days to cancel any preorder. And even if you do get one, in the EU you have 14 days to return it, losing only a few euros on sending it back.

Also, while the 2070/2080 are more of a risk, since we don't know how they match up against the 1080 Ti, the situation with the 2080 Ti is much simpler: if you already have a 1080 Ti and you need more power, there are only two things that can give it to you, a 2080 Ti or a Titan V.
 

magnumpy

Member
The thing is, there will be other RT solutions from the competition, and they won't carry such a high markup. I think the BF demo was really unrealistic in terms of how you approach and play that game; no one is going to look into a soldier's eyes whilst playing BF... Yet for a very long time now we've been asking for better framerates and higher resolutions.

I personally have been asking for better AI and physics... So ray tracing won't solve the main issues or deficiencies with our games atm; if anything, this should be approached in balance (RT, AI, physics, LOD), not by spending all your money and tech on RT in the hope that everyone falls into line at a high price... I watched the BF demo and all I saw was really low-rez fire; couple that with low frames and rez and, let's be honest, ray tracing won't look as good in the mix without the rest of the rendering getting similar uplifts like the lighting and reflections did... And you must understand, the more they try to upgrade the other rendering parts with these cards, the more they would have to sacrifice RT rendering to maintain perf and rez as a whole...

yes, I'm sure there will be other options eventually. I would guesstimate that nvidia has a year of exclusivity for this feature. but it could be less; I shouldn't count out amd here. they've got a lot of smart people working there too.

and this is really a ridiculous markup. $1,000 for a video card!? that's just too much. but the cost will come down as time goes by, as it has before. I'm personally not going to bother with any video card upgrades until christmas time, maybe later. and not for $1,000 :)
 

Redneckerz

Those long posts don't cover that red neck boy
The thing is, there will be other RT solutions from the competition, and they won't carry such a high markup. I think the BF demo was really unrealistic in terms of how you approach and play that game; no one is going to look into a soldier's eyes whilst playing BF... Yet for a very long time now we've been asking for better framerates and higher resolutions.

I personally have been asking for better AI and physics... So ray tracing won't solve the main issues or deficiencies with our games atm; if anything, this should be approached in balance (RT, AI, physics, LOD), not by spending all your money and tech on RT in the hope that everyone falls into line at a high price... I watched the BF demo and all I saw was really low-rez fire; couple that with low frames and rez and, let's be honest, ray tracing won't look as good in the mix without the rest of the rendering getting similar uplifts like the lighting and reflections did... And you must understand, the more they try to upgrade the other rendering parts with these cards, the more they would have to sacrifice RT rendering to maintain perf and rez as a whole...
  1. It was exaggerated to denote the difference. (Ray tracing done right is unnoticeable; see the Metro RTX demo and imagine not having seen the comparison shot.)
  2. These are alpha drivers done in a very small amount of time. Of course it's going to look off in a game that clearly was never built around the rendering principle of ray tracing, or taking it into account.
Again, it pains me that people are severely downplaying this on either price or "it does not look that great / the resolution is dropped". It's ray tracing; it's not faking anything there. The fact that we even have little aspects of the whole ray tracing model now, in an accelerated fashion, such that developers (and Nvidia) are considering it and putting it in games, is impressive indeed.

But it's also equally impressive that this kind of ray tracing existed in the demoscene years before, without dedicated hardware. But of course, that is custom to-the-metal code, procedurally generated for a hilariously tiny memory footprint, and as such a completely different angle compared to the RTX line.
 

thelastword

Banned
yes, I'm sure there will be other options eventually. I would guesstimate that nvidia has a year of exclusivity for this feature. but it could be less; I shouldn't count out amd here. they've got a lot of smart people working there too.

and this is really a ridiculous markup. $1,000 for a video card!? that's just too much. but the cost will come down as time goes by, as it has before. I'm personally not going to bother with any video card upgrades until christmas time, maybe later. and not for $1,000 :)
I don't think NV will have the RT market to themselves for that long. AMD already has a solution, and it seems Navi has its own focus besides RT, but they still have an RT solution which is open, which many can adopt and expand on; no aspect of their RT tech would be proprietary with GPUOpen...



You realize this is something they've been working on for a while now. If you check, this video was published in June 2016, so whatever they do with RT won't be a reaction to NV or something quickly cooked up... They will go after their own vision... and from what I'm hearing about Navi, they have their own focus...

https://gpuopen.com/gaming-product/radeon-rays/

https://pro.radeon.com/en/software/radeon-rays/

I would imagine that since 2016, when this was published, they have made some serious strides with their RT tech. It's also great to know that they recently declared they worked really hard to get 7nm going; they put all their resources into it, and things are looking positive enough that they have moved some of their 7nm products up to a 2018 launch from a prior-projected 2019... I think they have been doing some good work in the background tbh, following their vision to launch something great... or many great products, for that matter...

https://appuals.com/amd-7nm-process-cpu-gpu/

https://fudzilla.com/news/processors/44151-papermaster-says-7nm-is-a-tough-lfit

https://www.crn.com/news/components-peripherals/amd-cto-we-went-all-in-on-7nm-cpus
 
Last edited:
The problem is that AMD will make tech, tell us how great they are since they made it open, pat themselves on the back, and forget that the tech exists.

Meanwhile, Nvidia's developer relations team has already managed to get various forms of Turing features into around 21 titles.

High-end GPUs are a tiny fraction of the market, so the majority of devs won't bother spending resources on them; you have to give them an incentive.
 

nemiroff

Gold Member
Anybody pre-ordering these GPUs right now is a goddamn idiot.

I disagree. There's no risk. I haven't paid anything, and I have a 30-day no-questions-asked return policy. Last time, a lot of people had to wait a while for their 1080s; I want to put the power of my 2080 Ti to immediate use in VR in DCS, and I just can't wait...
 
Last edited:
Pretty much the conference.....

Money post.

In their chart they compare a GTX 1080 against an RTX 2080: the 1080 has 320 GB/s of bandwidth whilst the 2080 has 448 GB/s, similar to a 1080 Ti. So what do they do? They compare at the resolution which demands the most bandwidth (4K), so they could make the GTX 1080 look shitty. Was the 1080 ever a 4K card? No, it wasn't... Yet I have no bones about 4K tests, because that's where we're heading with traditional rendering... but they didn't pitch traditional rendering at 4K 60fps, did they? They pitched ray tracing for two hours, not at 4K, but at 1080p and sub-60fps...

So I've mainly been asking: why not show the settings for the games tested on the graph? Is it ultra, is it high, does it have AA? All I saw was 4K with HDR, but why HDR? Because everybody knows Pascal suffers with HDR on, whether against the competition or even against NV's newest cards. So at 4K the GTX 1080's limited bandwidth takes a tumble, and with HDR on even more so, because everybody is playing modern games at 4K with HDR on their 1080s ;) Two areas where the 1080 can suffer heavy perf loss...

I've heard a YouTuber say recently that NV is going to have the best card, that it's a foregone conclusion, so they could innovate and nothing could shake them. People said the same thing about CPUs, yet AMD was much further behind Intel in the CPU space, even more so than they are behind NV in the GPU space.

Vega was not fully realized in terms of engineering due to a manpower shift, but that won't be the case with Navi, as they are getting good R&D dollars for Navi by collaborating with Sony on that project. They now have Vega at 7nm and are giving Vega the love it needed at first light. So they could very well launch a 20.9 TF Vega this year, open up their own RT solution and offer folks a 4K 60fps solution with HDR, CB and FP16, and even expand on Vega's features.

At this point AMD is not necessarily in a bad spot with their GPUs: mining helped them make some bank with Vega, and their RX 580s outperform the 1060 in many games, have more RAM and are now priced even cheaper than a 6GB 1060. Their 4GB 580s destroy 3GB 1060s, and the 570 runs all over the 1050 Ti. The Vega 56 is a very good card at its current price in the Vega lineup, and if you can get a Sapphire-based 56 or even 64, they are the best AIB Vega cards for performance, bar none; all the others are jokes in comparison.

Yet I think AMD now has a chance to topple NV in perf and resolution and even have an RT solution whilst at it. NV's best card was 11.3 TF and their best next-gen card is at 13.4 TF; that's not a huge boost. Clearly their focus was on RT cores and tensor cores, but everything looks a bit underwhelming there, with low perf and low resolution (the ray tracing push)... A re-engineered Vega will eliminate all the issues they had on first release (unfinished engineering, high wattage, low clocks). Clearly they will sort that out with Vega 7nm and even provide a better cooling solution out of the gate... and there's no way NV will beat them on the resolution and fps front with Turing if they do... Keeping in mind that the archaic DX11 is going the way of the dodo and everybody is on the Vulkan train now, which is where AMD's arch excels... Intel and AMD are fully committed to Vulkan, and even NV has no choice and is on the Vulkan train too...

I give everything a chance, so we will see how it pans out, but I think AMD is in a good position to make a move and gain lots of market share here... They don't have to follow NV; they can follow their own vision, and right now it looks gelled with the high-rez, high-fps transition. Not forgetting that games development will mostly be led on AMD hardware in the coming years due to consoles. It's also indicative of AMD's plan to push their hardware to developers even on the PC side to ease the transition (perhaps they knew about NV's focus on Turing)... They have recently shipped many Vega and Ryzen kits to PC developers just the same, to get these devs more familiarized and tuned to AMD hardware and arch as the new APIs take shape and gain momentum. PS5 development will only make this easier and extend it. I think they realized they had a chance here for more than a minute, and are putting everything in place... They are already at 7nm whilst the other teams struggle to get their nodes on, or try to do things which are not necessarily ready for primetime. It will be an interesting couple of months, that's for sure...

I can't say much else aside from the fact that I appreciate this post. I didn't bother to watch the conference because the cynic in me already had a swami reading that this would be the case: extortionate prices, and bullshit to make their older cards look worse than they actually are against the newest offerings.

I found reading about the conference on here and various other tech sites much more enlightening, and a better use of time, than watching the actual thing.

I'll grab a 2080 Ti when the time is right, or after I see Navi. Because if Navi is priced right and looks good, I'm going Red for the first time in more than 12 years.

The Brits said it best: "Fuck right off, mate!"
 

octiny

Banned
I disagree. There's no risk. I haven't paid anything, and I have a 30-day no-questions-asked return policy. Last time, a lot of people had to wait a while for their 1080s; I want to put the power of my 2080 Ti to immediate use in VR in DCS, and I just can't wait...

+1.

While I stand by my previous post putting these cards in a negative light in regards to RT & pricing, there is absolutely nothing wrong with pre-ordering from somewhere that doesn't charge until they ship (like I did). Especially for the Ti version: supply is low due to yields on the nearly-4,400-CUDA-core chip, so those suckers will be sold out for a while. Benchmarks will officially surface on the 14th, possibly with leaks before then, at which point I & others can decide. So long as it can push 30%+ higher than a Titan Xp, which I have no doubt it can (from the CUDA core increase alone, though that may not be the case for the rest of the RTX series), then I can finally downsize my Titan Xp SLI rig into a single-card Dan A4.

Calling people "goddamn idiots" for making a smart play is ignorant & facepalm-worthy.
 

llien

Member
The problem is that AMD will make tech, tell us how great they are since they made it open, pat themselves on the back, and forget that the tech exists.
Mantle => Vulkan
FreeSync => in 100+ monitors out there, plus the HDMI and DisplayPort specs

So, uh, nope.
 

ruvikx

Banned
DICE are confused by the reaction to the Battlefield RTX demonstrations..

DICE: Battlefield V Demo was Running at “Rock Solid” 60fps in 1080p using NVIDIA’s RTX Ray Tracing Tech

https://wccftech.com/dice-battlefield-v-60fps-1080p-rtx/

I don't even need to click the article to say that 1440p & 4K are the base resolutions people want (especially after many PC gamers have invested in screens for that purpose); ergo, a straight 60fps at 1080p no longer cuts the mustard. I'm not saying ray tracing isn't good (it will be), but it's just somewhat counterintuitive in this market to sell a downgraded resolution as an exciting upgrade.
 

manuvlad

Neo Member
I found this on the Chaos Group blog (the company that makes V-Ray). Maybe for games it's not that important, but for me, planning to use these cards for 3D modeling/rendering work, it's great:

NVLink was introduced in 2016 and V-Ray GPU was the first renderer to support it officially in V-Ray 3.6 and newer versions. Until now, the technology has only been available on professional Quadro and Tesla cards, but with the release of the RTX series, NVLink is also available on gaming GPUs - specifically on the GeForce RTX 2080 and GeForce RTX 2080 Ti. Connecting two cards with NVLink requires a special NVLink connector, which is sold separately.

https://www.chaosgroup.com/blog/wha...ware-mean-for-ray-tracing-gpu-rendering-v-ray
 

Sargon

Member
What are the chances of a new GTX line coming out next with improved performance over the 1080 series, but without the RayTracing features? Or is RTX the future direction for all the Nvidia gaming cards?
 

octiny

Banned
What are the chances of a new GTX line coming out next with improved performance over the 1080 series, but without the RayTracing features? Or is RTX the future direction for all the Nvidia gaming cards?

0.1% for 2070 & up.

Rumors have it that the 2060 & below will stay as the GTX line, since those cards simply won't be powerful enough to make even minimal use of RT. At that point, a 1080 Ti would more than likely be the better deal.
 

Dargor

Member
Man, these things are way too expensive. Is the 2080 Ti really 1k dollars? That's the pricing for them Titans, man. How much will the new Titan cost, 2k? Yeesh...
 
Man, these things are way too expensive. Is the 2080 Ti really 1k dollars? That's the pricing for them Titans, man. How much will the new Titan cost, 2k? Yeesh...

$1,200 actually. And the Ti could be the new Titan, or maybe Titans will remain at the $3k price point like the Titan V.
 
Last edited:
Mantle => Vulkan
FreeSync => in 100+ monitors out there, plus the HDMI and DisplayPort specs

So, uh, nope.

Thanks to it being an open standard, people with new Ryzens have also managed to get Nvidia cards (e.g. a GTX 1060) working with FreeSync monitors. There's nothing special about G-Sync, other than Nvidia locking it down and charging a premium for the technology. Nvidia is the villain in this story. AMD called it FreeSync for a reason.
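The underlying mechanism really is that simple, which is part of the point. A toy illustration of adaptive sync's core rule (the 48-144 Hz window is a made-up but typical range; real panels also handle below-window rates with frame doubling, which this skips):

```python
def panel_refresh_hz(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    # Inside the VRR window the panel simply refreshes when the frame is
    # ready; outside it, it pins to the nearest edge of the window.
    return max(vrr_min, min(fps, vrr_max))

for fps in (30, 57, 91, 200):
    print(f"game at {fps:>3} fps -> panel refreshes at {panel_refresh_hz(fps):.0f} Hz")
```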
 
Last edited:

Dargor

Member
$1,200 actually. And the Ti could be the new Titan, or maybe Titans will remain at the $3k price point like the Titan V.

I hope they rework these prices for their next gen; I really feel like their prices are getting out of hand. I own a 1080 Ti, so I know what it's like paying a lot for performance, and I know these aren't the bang-for-your-buck type of deal, but I really think they are going overboard.
 

octiny

Banned
$1,200 actually. And the Ti could be the new Titan, or maybe Titans will remain at the $3k price point like the Titan V.

Unfortunately it's not a Titan replacement. The full die has 4608 CUDA cores, 576 tensor cores, 72 RT cores, 36 GUs, 288 TUs & 96 ROPs, vs 4352, 544, 68, 34, 272, and 88 on the 2080 Ti.

Knowing Nvidia, they will definitely launch a Turing Titan aimed at "gaming" 4-6 months down the line, if only to try & take some spotlight from AMD's 7nm launch. The only reason they always wait is that the full dies get dedicated to the top-end Quadro card, with the rejects going to the Ti card; once yields improve, they launch it. I could see official price drops on the Ti to $999 & under (for all Ti cards), and a $50-$100 cut on the 80/70 cards, while the Titan comes in @ $1,500.
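The binning economics behind that are easy to sketch with the classic exp(-D·A) yield model. The defect density below is an invented round number purely for illustration; the ~754 mm² die size is TU102's published figure:

```python
import math

def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    # First-order Poisson yield model: P(die has zero defects) = exp(-D * A).
    return math.exp(-defects_per_cm2 * area_mm2 / 100.0)

D = 0.4  # defects per cm^2 -- invented, for illustration only
for name, area in (("~300 mm^2 midrange die", 300.0), ("~754 mm^2 TU102", 754.0)):
    print(f"{name}: {die_yield(area, D):.0%} of dies come out fully clean")

# At these numbers only ~5% of TU102 dies are flawless: those go to the
# top Quadro, while dies with a defective SM fused off become the 2080 Ti.
# As the process matures, D falls and a full-die Titan becomes viable.
```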
 
Last edited:

llien

Member
So, can someone tell me, when do we expect 7nm chips from nVidia? Because for AMD it's Q1 or Q2 2019, I was told (consumer cards; for the pro market they already have that weirdo dual-chip 7nm Vega with passive cooling, meant for cooled racks).
 

dirthead

Banned
Thanks to it being an open standard, people with new Ryzens have also managed to get Nvidia cards (e.g. a GTX 1060) working with FreeSync monitors. There's nothing special about G-Sync, other than Nvidia locking it down and charging a premium for the technology. Nvidia is the villain in this story. AMD called it FreeSync for a reason.

To be fair, if it wasn't for Nvidia, variable refresh wouldn't even be a thing today. They're the ones who made it an issue and brought it into the limelight.

AMD just capitalized on all the attention that Nvidia threw at it (with an inferior solution).

Nvidia might be evil assholes, but you can't deny that they've been delivering in terms of actual performance/features.
 

ISee

Member
when do we expect 7nm chips from nVidia?

Probably with the next series of GPUs, so in 1.5-2 years.
About AMD 7nm: as far as I understand, they are currently heavily concentrating on bringing 7nm to next-gen consoles. I wouldn't expect new 7nm consumer cards from them before the PS5 and Xbox Next, tbh.
 

llien

Member
I wouldn't expect new 7nm consumer cards from them before the PS5 and Xbox Next, tbh.
Consumer 7nm from AMD looks inevitable in 2019; their roadmap states:

Vega 7nm sampling in 2018 (we saw it already) => 7nm Navi in 2019 (I'd say Q2 is quite likely, Q3 almost guaranteed) => next-gen chip on 7nm(+) in 2020

AMD's 7nm CPUs & GPUs To Be Fabbed by TSMC, on Track for 2018 - 2019

So, if "next nvidia" is at least 1.5 years away, AMD has chance to have very competitive mid range chips, perhaps rivaling 1080 at a fraction of price.
 
Last edited:

Mecha Meow

Member
I'll be using that 15% off eBay coupon later today to snag a regular 1080 for $350. I'm only doing that so my friend can get my 1070 sooner, to replace his old shitty R9 270.

I still get a decent upgrade bump to hold me over until AMD's 7nm Navi cards come out, plus a new backup GPU.
 
Last edited:

octiny

Banned
Consumer 7nm from AMD looks inevitable in 2019; their roadmap states:

Vega 7nm sampling in 2018 (we saw it already) => 7nm Navi in 2019 (I'd say Q2 is quite likely, Q3 almost guaranteed) => next-gen chip on 7nm(+) in 2020

AMD's 7nm CPUs & GPUs To Be Fabbed by TSMC, on Track for 2018 - 2019

So, if "next nvidia" is at least 1.5 years away, AMD has a chance to field very competitive mid-range chips, perhaps rivaling the 1080 at a fraction of the price.

+1

Fully expect 7nm AMD consumer cards sometime next year, with Nvidia 7nm quickly following suit by October. Nvidia is not going to wait around like they did with the Turing release (no competition) while AMD steals the 7nm thunder. Looking forward to some healthy competition. Consumers win.
 
Last edited:

CuNi

Member
I had been planning on getting a 2080 Ti at a speculated price of around 850€... The actual release price is 1250€... yeah, screw you, nvidia, on this one. I'll be waiting another gen then, or going Red if they manage to close the gap.
That price just isn't worth it, and I hope they make a loss with this greedy strategy. To all defenders: yes, RT is a great technology and I applaud them for the R&D on it. But honestly, they should have put it into workstation GPUs, or made some RT-only cards for rendering like they did with PhysX in the early days, and used the full die for a GTX upgrade. The RT features are barely noticeable and, for me, not worth the die space they use.
 
Mantle => Vulkan
FreeSync => in 100+ monitors out there, plus the HDMI and DisplayPort specs

So, uh, nope.

No one cared about Mantle, to the point where AMD had to pad the numbers with things like 3 unannounced Stardock games (hmm, I wonder what happened to those), until it morphed into something that worked on Nvidia cards.

FreeSync got adopted only because Nvidia created the market for it and showed there was demand for more-than-60Hz displays at all. The entire high-end gaming monitor market we have now exists because Nvidia made 3D Vision technology.


And I wonder why you ignored the other examples.

TressFX or TressFX 2.0 sure got widely supported by developers, right?
Or the super successful AMD TrueAudio.

Consumer 7nm from AMD looks inevitable in 2019; their roadmap states:

Vega 7nm sampling in 2018 (we saw it already) => 7nm Navi in 2019 (I'd say Q2 is quite likely, Q3 almost guaranteed) => next-gen chip on 7nm(+) in 2020

AMD's 7nm CPUs & GPUs To Be Fabbed by TSMC, on Track for 2018 - 2019

So, if "next nvidia" is at least 1.5 years away, AMD has a chance to field very competitive mid-range chips, perhaps rivaling the 1080 at a fraction of the price.

I guess some people haven't learned from the Vega release yet. This is a new process that hasn't seen any bigger dies yet, together with (hopefully) a new architecture and not another GCN rehash. It will be miraculous if there aren't any delays. Which means they will start with a smaller chip, which will be their RX 480 equivalent for the next generation.

GlobalFoundries just announced their capitulation on developing 7nm parts. This means the only source of chips made on this process is going to be TSMC, which means whatever production capacity is available will need to be taken away from the likes of Apple, Qualcomm or Sony (assuming the PS5 launches in 2019). And that's just the tip of the iceberg; there's going to be internal competition inside AMD itself for whatever 7nm wafers are available. They have to serve the professional market with die-shrink Vegas, then there's the next generation of Ryzen CPUs, then another high-priority market in APUs, where AMD has to reclaim part of the laptop market.

The optimistic scenario is a paper launch in Q3 and extremely limited availability in Q4.

So Nvidia will have almost a whole year to itself, and then, when AMD finally enters higher volumes of production, Nvidia will start gearing up for its own launch on a by-then more mature 7nm process, which likely means they will be able to afford bigger dies, giving them the performance crown again.

Last but not least, AMD is a generation behind, since Vega falls close to Maxwell in a lot of metrics. Whatever small-die GPU AMD releases is unlikely to close the gap completely, and in the lower end of the market Nvidia won't be putting tensor cores in 2060 cards, which means that, unlike their big 2070+ brothers, there won't be space wasted on ray tracing tech, so those GPUs might still be competitive against Navi on everything except die size.
 