
nVidia RTX 2080 Ti/2080/2070/2060 Official Announcement - Gamescom Conference Live Today 6PM CEST/12PM EDT/9AM PDT:

Solomeena

Banned
Does AMD have similar ray tracing? Or do they have their own ray tracing?

No one knows at this point. Considering how far behind AMD is relative to Nvidia when it comes to releasing hardware, it is hard to say. I'm sure AMD is working on it, but we don't know for sure if it is coming soon.
 

longdi

Banned
Does AMD have similar ray tracing? Or do they have their own ray tracing?

Considering that the RTX 2080 Ti is almost the full Quadro GPU, with tensor and RT cores, it's a beast: ~700mm2 in size.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.
 
I'd rather have some game-changing AI tech. Even if it were to take up another PCI-E slot.

Eyeball reflections reflecting reflections isn't all that impressive, going by the tech demos.
 

llien

Member
I did, hence the second part of my post (which you ignored). :)
I didn't, it just contradicts my understanding of "clearly". :)
I'd call it "they formally noted it was 'in ray tracing'".

I'm pretty sure the majority of those who bought an FE 2070 for the price of an AIB 1080 Ti misunderstood it.
 

Airbus Jr

Banned
This is what you get from no competition

Nvidia is free to do anything they like.

Hopefully we get a response from AMD and Intel to this, otherwise it's getting more and more ridiculous.
 

LostDonkey

Member
Been speaking within my circle of gaming friends and not one person is interested at that price. We all have 10 series cards and plan on waiting it out.
 

Dontero

Banned
As someone said, if this were the PS3 era it would be an amazing change. But at this point, with physically based rendering, you can already get pretty close to a really great image. Obviously with ray tracing it is a shitload easier and more correct, but people are already making games this way, so it is not that huge a change for them.

IMHO the real change for next gen will be physics. A dedicated piece of silicon for physics only could really fucking change our games a lot, while ray tracing is just a nice FX effect.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.

I don't think AMD cares what Nvidia does at this point. They basically hold 90% of the gaming market in their hands, and whatever custom silicon they come up with has the potential to really fucking hurt Nvidia in the gaming sphere.

Nvidia really made a fucking huge mistake when they lost both major consoles to AMD.

Imagine for a second that AMD gets their own PhysX and dedicates part of the silicon to it. Since they control both consoles, it means 90% of developers in the world will make their games fully use it. Which in turn means that almost every PC game later will look like shit on Nvidia GPUs because they can't run it, and some will not even be playable on Nvidia because developers make physics one of the main designs behind the game or various gameplay elements rely on it.

That is why I said Nvidia made a huge fucking mistake: the company that has both best-selling consoles will decide what games will use or not. Any custom silicon that a manufacturer puts into consoles will instantly be used as the norm among game creators. If Nvidia held both consoles, then we could have the ray tracing additions of RTX as the standard for next gen.

We just don't know what AMD is cooking yet. IMHO it will be physics.
 
Last edited:

Darak

Member
Unless there is a huge surprise on the horizon, the next console generation won't have anything to do with real-time ray tracing, which means AAA games will still target the current pipeline for quite a while. Sure, we'll have an RTX switch with better shadows and other gimmicks in selected titles, but the new technology really calls for games and game engines specifically made for it, and that's not going to happen anytime soon.

So, what's the value proposition here? Despite a long presentation and a massive marketing campaign, zero information was given about real-world performance in current titles compared to the previous generation, which essentially guarantees the comparison is not favorable to nVidia. With 4K displays starting to become commonplace, the thing customers really need right now is a faster, more affordable GPU which can bring 4K/60 fps to the common gamer. This announcement went the opposite way: expensive cards with dubious performance improvements and a completely new, alien technology which doesn't really translate to anything impressive on screen (judging by the presentation). This is not what anybody wanted and, despite nVidia saying they know better, this is not what anybody needs.

In addition, the huge focus on general computing tasks in this presentation makes me think the new cards are going to be stellar for cryptomining. It's likely that coins that are currently somewhat resistant to GPU mining are going to fit those new Turing cores much better. If this turns out to be true, we'll face yet another speculation bullshit craze, and prices can go even higher.
 

Kenpachii

Member
Unless there is a huge surprise on the horizon, the next console generation won't have anything to do with real-time ray tracing, which means AAA games will still target the current pipeline for quite a while. Sure, we'll have an RTX switch with better shadows and other gimmicks in selected titles, but the new technology really calls for games and game engines specifically made for it, and that's not going to happen anytime soon.

So, what's the value proposition here? Despite a long presentation and a massive marketing campaign, zero information was given about real-world performance in current titles compared to the previous generation, which essentially guarantees the comparison is not favorable to nVidia. With 4K displays starting to become commonplace, the thing customers really need right now is a faster, more affordable GPU which can bring 4K/60 fps to the common gamer. This announcement went the opposite way: expensive cards with dubious performance improvements and a completely new, alien technology which doesn't really translate to anything impressive on screen (judging by the presentation). This is not what anybody wanted and, despite nVidia saying they know better, this is not what anybody needs.

In addition, the huge focus on general computing tasks in this presentation makes me think the new cards are going to be stellar for cryptomining. It's likely that coins that are currently somewhat resistant to GPU mining are going to fit those new Turing cores much better. If this turns out to be true, we'll face yet another speculation bullshit craze, and prices can go even higher.

Next-gen consoles will already have a huge problem getting to 4K and 60 fps; ray tracing is far from something they care about.
 

DonJimbo

Member
You should upgrade every half year, actually.
That's horrible
PC gamers aren't required to upgrade their GPUs every year. :goog_what::goog_what::goog_what:
If they want to max out, they have to
Why would you buy a new GPU every year? I had my last one for 4 years and many people keep them for a lot longer than that.
Really?
My problem is that whenever I get a new GPU it becomes obsolete the next day. I have no luck.
 
Last edited:

Kamina

Golden Boy
Can someone who is well versed in GPU specs tell me how a GTX 1080 Ti holds up against a 2080 (not Ti)?
 

Silver Wattle

Gold Member
Great-looking cards. Dunno why people use this as a chance to dump on AMD; they are working on bringing 7nm chips out ASAP, while Nvidia is going to make us wait for 7nm.
 

McHuj

Member
I ended up pre-ordering the 2080, but only since I don't have to pay now and will wait for benchmarks to really decide. It's a placeholder until then.

However, it does look like the 2080 Ti is out of stock, while plenty of 2080s are available for pre-order. I don't remember that with Pascal. I thought everything was sold out immediately.
 

JohnnyFootball

GerAlt-Right. Ciriously.
Can someone who is well versed in GPU specs tell me how a GTX 1080 Ti holds up against a 2080 (not Ti)?
There are no benchmarks, so nobody knows. Given that Nvidia did not show any comparison benchmarks, I would be rather surprised if it is worth an upgrade for anyone who has a 1080 Ti.

I have a 1070 and I am taking a wait and see approach.
 
Last edited:

scydrex

Member
If I build a new gaming PC it will be AMD, 100%. I will not give Nvidia what they are asking for these new GPUs or for the upcoming 2060 at $350 or $399.
 
Last edited:
I’m fairly impressed, but I’ll just stick with my 1080 until I see how these really perform and plenty of games are out that incorporate RT. It’ll be interesting to see what results are really like in real people’s hands.
 

nkarafo

Member
Good that I'm playing on consoles; I don't like to upgrade my GPU every year.
You are right, it's good you are playing on consoles because you seem to not know how PCs work.


Interesting, and how is the CPU side?
If I have a quad-core processor?
Pretty good. I've had an i5-4670 since 2014 and it still lets me play most games at 60fps, unlike those shitty console CPUs you are stuck with for years. Actually, even an i3 from that era would be superior to what consoles have.
 
Last edited:
I used to be so hyped when a new set of graphics cards came out. I still remember when I got my GTX 560 Ti. Awesome card.

Nowadays it makes absolutely no sense to get any of these cards for the average gamer since games are being held back by consoles. I can't see any real reason this is a good way to spend your money instead of getting a console.

Wake me up when we have a single-card solution for stable 4K at 144 fps.
 
Last edited:

GoldenEye98

posts news as their odd job
TFLOPS at base/boost clocks:

1070 - 5.78 TFLOPS / 6.46 TFLOPS
1080 - 8.22 TFLOPS / 8.87 TFLOPS
1080 Ti - 10.6 TFLOPS / 11.34 TFLOPS

2070 - 6.49 TFLOPS / 7.64 TFLOPS
2080 - 8.92 TFLOPS / 10 TFLOPS
2080 Ti - 12.23 TFLOPS / 13.44 TFLOPS
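
For reference, these figures just fall out of the usual formula: FP32 TFLOPS = shader cores x 2 FLOPs per clock x clock speed in GHz. A quick sketch below; the core counts are the announced specs, and the clocks are the reference base/boost values as I understand them, so small differences from the numbers above come down to which clock you plug in.

Code:
# FP32 TFLOPS = CUDA cores * 2 FLOPs per clock (FMA) * clock in GHz, then /1000.
# Core counts per the announced specs; clocks are reference base/boost (my assumption).
cards = {
    "GTX 1070":    (1920, 1.506, 1.683),
    "GTX 1080":    (2560, 1.607, 1.733),
    "GTX 1080 Ti": (3584, 1.480, 1.582),
    "RTX 2070":    (2304, 1.410, 1.620),
    "RTX 2080":    (2944, 1.515, 1.710),
    "RTX 2080 Ti": (4352, 1.350, 1.545),
}

for name, (cores, base_ghz, boost_ghz) in cards.items():
    base  = cores * 2 * base_ghz  / 1000   # GFLOPS -> TFLOPS
    boost = cores * 2 * boost_ghz / 1000
    print(f"{name}: {base:.2f} / {boost:.2f} TFLOPS")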
 
Last edited:

Dontero

Banned
Can someone who is well versed in GPU specs tell me how a GTX 1080 Ti holds up against a 2080 (not Ti)?

We don't know. Both the tensor cores and the RT cores are separate parts of the silicon that will not be used in traditional rendering, which means they will sit idle when you play normal or older games.
So the real question is how many games will actually implement this in a meaningful way.

I simply don't think Nvidia is in a position to standardize it. Only AMD can standardize custom silicon now.
 

LordOfChaos

Member
Considering that the RTX 2080 Ti is almost the full Quadro GPU, with tensor and RT cores, it's a beast: ~700mm2 in size.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.

It doesn't change how the next gen consoles bode at all. A new larger part that wasn't going in the consoles at any reasonable console cost doesn't change the APU that's likely for it.

Navi was always going to be a mid-range part, and we have strong indications of the PS5 being at least Navi-akin; a new colossal part above it doesn't change anything for it. Likely something like enhanced Vega 64 graphics before this news, and likely something like enhanced Vega 64 graphics after this news.


I do however hope the next consoles have some ray tracing acceleration hardware, as that'll be taking off in the PC space in the meantime.
 
Last edited:

pawel86ck

Banned
The Infiltrator tech demo 4K result is very impressive: 78 fps on the 2080 Ti and 38 fps on the 1080 Ti. I wonder how they did it exactly, because the CUDA core count alone doesn't suggest that much difference in performance. Maybe the 2080 Ti was rendering at a much lower resolution, while the tensor cores were somehow used to reconstruct the remaining resolution?
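
If that theory is right, the napkin math lands surprisingly close. Rough sketch below; the 1440p internal resolution is purely my assumption for illustration, nothing Nvidia has confirmed.

Code:
# Napkin math for the "render lower, let tensor cores fill in the rest" theory.
# The internal resolution (1440p) is an assumption, not a confirmed spec.
native_4k   = 3840 * 2160            # pixels shaded per frame at native 4K
internal    = 2560 * 1440            # hypothetical internal render resolution
pixel_ratio = native_4k / internal   # ~2.25x fewer pixels to shade

fps_1080ti_native_4k = 38
# If shading cost scales roughly with pixel count (and the upscale itself is cheap
# because it runs on otherwise idle tensor cores), you land in this ballpark:
estimated_fps = fps_1080ti_native_4k * pixel_ratio
print(f"pixel ratio: {pixel_ratio:.2f}x, ballpark fps: {estimated_fps:.0f}")

That comes out to roughly 86 fps, in the same neighborhood as the 78 fps they showed.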
 
Last edited:

dirthead

Banned
Here's the reality of the situation:

* Every card in this lineup other than the 2080 Ti is going to be slower and more expensive than the 1080 Ti you can already buy today, and the 2080 Ti is only going to be marginally faster; so right out of the gate, this entire generation is basically worthless
* None of these cards is going to be able to drive 4k @ 120hz without dropping frames
* It gets better! We have confirmation that this card isn't going to support HDMI 2.1, which would have been the only useful feature it could have had compared to the 1080 lineup, so it'll be completely worthless for 2019 TVs with HDMI 2.1
* Nvidia's too greedy and evil to actually support HDMI 2.1 VRR; they'll cling to G-Sync module sales until they're absolutely forced out of it
* There's basically no point in being an early adopter on new GPU technologies since all games are designed around consoles anyway. Raytracing demos are a fucking joke until there's an actual console generation of hardware out there with it and games are designed from the ground up with it in mind. Video games aren't designed for PC hardware anymore. They're designed around consoles and released on the PC as a second thought.
* The real version of these cards is coming out next spring with the 7nm refresh, which will give them enough headroom to actually improve performance outside of bullshit bogus raytracing benchmarks that were invented to make themselves look good
* To top everything off, Nvidia's current AI ray tracing technique has an end result that looks like shit; because it's essentially just guessing, there are all kinds of visual anomalies and defects that make it arguably look just as bad as the errors you get with traditional rasterization anyway.

This entire generation of GPUs is shit. Here's hoping that Navi isn't a complete piece of crap. This is a real opportunity for AMD. They could be the only game in town that actually supports HDMI 2.1 VRR on 2019 OLED TVs. Even if their performance isn't as good as Nvidia's, that's a HUGE advantage. Nvidia's blowing it.
 

dirthead

Banned
It doesn't change how the next gen consoles bode at all. A new larger part that wasn't going in the consoles at any reasonable console cost doesn't change the APU that's likely for it.

Navi was always going to be a mid-range part, and we have strong indications of the PS5 being at least Navi-akin; a new colossal part above it doesn't change anything for it. Likely something like enhanced Vega 64 graphics before this news, and likely something like enhanced Vega 64 graphics after this news.


I do however hope the next consoles have some ray tracing acceleration hardware, as that'll be taking off in the PC space in the meantime.

If consoles don't have hardware ray tracing and there isn't an architecture agnostic API for it, it might as well be PhysX. Games are designed around consoles. Not PCs.
 

dirthead

Banned
Nvidia's benchmarks are the computer hardware equivalent of a fat girl's online dating profile.

I'm 600% faster at some phony raytracing metric that I made up.
I weigh 100 pounds on the moon.

 

LordOfChaos

Member
If consoles don't have hardware ray tracing and there isn't an architecture agnostic API for it, it might as well be PhysX. Games are designed around consoles. Not PCs.

I think at worst it'll be like Gameworks in adoption. Nvidia today has a heck of a lot more clout than in the PhysX era.
 

dirthead

Banned
I think at worst it'll be like Gameworks in adoption. Nvidia today has a heck of a lot more clout than in the PhysX era.

Ironically, despite their PC market dominance, I think they actually have less clout than they've ever had, overall, since all the consoles other than the Switch, which isn't even really a console but a portable, are on AMD's architecture. Like I said, games are made for consoles now. Not PCs. So people are writing games for AMD, basically. The only reason we saw these demos is because Nvidia paid off a few developers to slap them together to make this launch not seem like the complete scam that it is.
 
Considering that the RTX 2080 Ti is almost the full Quadro GPU, with tensor and RT cores, it's a beast: ~700mm2 in size.

AMD doesn't seem to have engineered anything close to this, as far as leaks go. Doesn't bode well for next gen console gaming.
The rumored MCM GPUs, though disavowed, would allow them to compete.

What I'm more excited about is the deep learning super sampling.

As for AMD, if they had AI acceleration, they would be fine.

I would hypothesize AI could even be trained on a per-game basis to convincingly fake a higher-quality ray-traced graphics implementation, using the developer's servers. That is, play the game with the equivalent of a high-end card on the developer's servers and use the results to train a NN to fake said lighting.

AI acceleration > RT acceleration
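
Very rough idea of what that per-game training could look like, purely as a toy sketch (PyTorch, made-up data, not any real pipeline): capture pairs of rasterized and ray-traced frames on the developer's servers, then fit a small network to map one to the other.

Code:
# Toy sketch: learn to approximate ray-traced lighting from rasterized frames.
# Model size, data, and training loop are all illustrative placeholders.
import torch
import torch.nn as nn

model = nn.Sequential(                      # tiny image-to-image conv net
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Stand-ins for captured frame pairs: rasterized input -> ray-traced target.
raster_frames = torch.rand(8, 3, 128, 128)
rt_frames     = torch.rand(8, 3, 128, 128)

for step in range(100):
    pred = model(raster_frames)             # network's guess at the RT-style lighting
    loss = loss_fn(pred, rt_frames)
    opt.zero_grad()
    loss.backward()
    opt.step()

The shipped game would then run the (much cheaper) trained network instead of tracing rays, which is roughly the kind of inference work tensor cores are there to accelerate.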
 
Last edited:
Strange time to release GPUs on 12nm FFN lithography when 7nm is already in production.

Ray tracing looks good in most examples. Just too early and too much of an early adopters tax for my taste.
 
Wasn't all that impressed with the demos, tbh. You would think they would do some sort of HDR tech since support is lacking on PC. I don't really see many people caring about running sub-60 fps at 1080p with a $1200+ GPU. Can only imagine how shitty it would be on a 2070/2060, which will be way more popular.

Would much rather see GPUs assisting with AI since it's so damn terrible in games now. Not sure what nVidia was thinking with RT at this point.

The reference design is awful as well.
 
Ray Tracing is fantastic and I love the way Nvidia keeps pushing new and innovative technology.

However I don't see the appeal in these cards right now. The price is way too high. They absolutely need to destroy Pascal performance in non RT games to be attractive at that price. That does not seem to be the case. Maybe they should have kept this tech in the oven for a bit longer. Especially when we have a situation where RT games have trouble reaching 60fps at 1080p. That is concerning.
 

LordOfChaos

Member
Ironically, despite their PC market dominance, I think they actually have less clout than they've ever had, overall, since all the consoles other than the Switch, which isn't even really a console but a portable, are on AMD's architecture. Like I said, games are made for consoles now. Not PCs. So people are writing games for AMD, basically. The only reason we saw these demos is because Nvidia paid off a few developers to slap them together to make this launch not seem like the complete scam that it is.



And that status quo already existed when they started pushing Gameworks, so I'm expecting about the same adoption: RTX enhancements in a number of AAA games.

That is, if their extensions are so incompatible with AMD's that the market gets split; the best case is that upcoming AMD hardware supports the same DX12/Vulkan extensions well enough and the consoles get them too.
 
Last edited:

DonJimbo

Member
You are right, it's good you are playing on consoles because you seem to not know how PCs work.



Pretty good. I've had an i5-4670 since 2014 and it still lets me play most games at 60fps, unlike those shitty console CPUs you are stuck with for years. Actually, even an i3 from that era would be superior to what consoles have.
Don't be harsh, buddy. I asked a question because I'm getting interested in investing in a new PC.
 
Ray Tracing is fantastic and I love the way Nvidia keeps pushing new and innovative technology.

However I don't see the appeal in these cards right now. The price is way too high. They absolutely need to destroy Pascal performance in non RT games to be attractive at that price. That does not seem to be the case. Maybe they should have kept this tech in the oven for a bit longer. Especially when we have a situation where RT games have trouble reaching 60fps at 1080p. That is concerning.

The deep learning supersampling allowed them to run the 4K Infiltrator demo at 78 fps, vs a 30-ish framerate on a 1080 Ti, with superior image quality.

Any game that supports that AI supersampling will see over 2X improvement in performance.
As an example, Huang showed off the Infiltrator demo, based on the UE4 engine, running at 78fps at 4K, instead of 38fps on a GeForce 1080 Ti. The main reason for the increase is due to the Tensor cores filling in the resolution through neural network training. -hexus.net
 
Last edited:

nkarafo

Member
Dont be harsh buddy i asked a question because im getting interested in investing in a new pc
I was mostly referring to your initial statement about having to "upgrade the GPU every year", which to me looked like the usual trolling from console users. Sorry if I misinterpreted it.
 

scydrex

Member
If I were building a PC right now I would buy a 1080 Ti or a used 1080 instead of this high-priced lineup... Nvidia is greedy as always.
 

BenBuja

Neo Member
The deep learning supersampling allowed them to run the 4K Infiltrator demo at 78 fps, vs a 30-ish framerate on a 1080 Ti, with superior image quality.

Any game that supports that AI supersampling will see over 2X improvement in performance.
If this really works well then this would be a huge selling point for me since I mainly game on a 4K TV. I'll still wait for a price drop though. No way I'm going to pay €1200+ for a single GPU.
 

bee

Member
If it's too expensive then it's not aimed at you? Can't say I'm super happy with the price, but I will buy a 2080 Ti closer to Xmas.

Being able to use elements of ray tracing in games in 2018 is pretty amazing to me. I see people now trying to spread FUD by claiming the games actually run worse than they do, or by downplaying the effect itself. Same shit as with 4K years ago: when consoles couldn't do it, 4K was lame; now every other game gets "where's mah 4K patch" (e.g. Shenmue HD). When AMD catches up to these cards in five years or so, it'll suddenly be very well received.
 