
Report: AMD expects “Big Navi” to compete with 3070, not 3080

VFXVeteran

Banned
And why not?

Because the form of reflection in that demo is just 1 ray cast along the reflection vector. That's why they are perfect mirrors. It's very cheap to compute and test for intersection.

illum9d.gif


Whereas in path tracing, you fire many rays along that reflection vector in a cone shape, giving you blurred reflections.

Specular-Reflection-Vector-Diagram.png
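To make the difference concrete, here's a rough Python sketch (my own toy illustration, not code from that demo or any engine) of the two approaches: one mirror ray from the standard reflect formula, versus many rays jittered inside a cone around that mirror direction, which is what produces the blur once you average their hits.

```python
import math
import random

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

def reflect(d, n):
    # Mirror reflection: r = d - 2*(d.n)*n. One ray per pixel -> perfectly sharp mirror.
    k = 2.0 * dot(d, n)
    return tuple(di - k * ni for di, ni in zip(d, n))

def cone_samples(axis, half_angle, count):
    # Glossy / path-traced style: jitter 'count' unit directions inside a cone
    # around 'axis'. Averaging the radiance from all these rays blurs the reflection.
    t = (1.0, 0.0, 0.0) if abs(axis[0]) < 0.9 else (0.0, 1.0, 0.0)
    b = normalize(cross(axis, t))
    t = cross(b, axis)
    samples = []
    for _ in range(count):
        cos_t = 1.0 - random.random() * (1.0 - math.cos(half_angle))
        sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
        phi = 2.0 * math.pi * random.random()
        local = (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)
        world = tuple(local[0] * t[i] + local[1] * b[i] + local[2] * axis[i] for i in range(3))
        samples.append(normalize(world))
    return samples

n = (0.0, 1.0, 0.0)                              # surface normal
d = normalize((0.3, -1.0, 0.2))                  # incoming view direction
mirror_ray = reflect(d, n)                       # 1 ray  -> perfect mirror (cheap)
glossy_rays = cone_samples(mirror_ray, math.radians(5.0), 16)  # 16 rays -> blurred (expensive)
```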
 

Durask

Member
The majority of PC gamers do not buy top-end cards. Isn't the 1060 the most common card on Steam these days?

If they can make a "3060 level" card and sell it at a competitive price, that's all they are going to need.
 

Ascend

Member
Because the form of reflection in that demo is just 1 ray cast along the reflection vector. That's why they are perfect mirrors. It's very cheap to compute and test for intersection.

illum9d.gif


Whereas in path tracing, you fire many rays along that reflection vector in a cone shape, giving you blurred reflections.

Specular-Reflection-Vector-Diagram.png
There are multiple blurred reflections in the demo.
 

thelastword

Banned
Are there any leaks about when the new cards will be revealed? I mean, they surely can't wait till October. Even if they launch at the end of October, it would be nice to know if waiting would be wise. :/
I would believe it's sometime this month. Maybe around Nvidia's launch or even later. Rumors are that the cards are to launch in October with a possibility that it may slide into early November. Yet I think the plan was always October, since AMD said they would launch the cards before consoles........

It's being teased already, so you will hear something soon....


AMD-Radeon-RX-6000-Big-Navi-RDNA-2-GPU-Teaser_Official_1.jpg


https://wccftech.com/amd-radeon-rx-6000-big-navi-graphics-card-teased-in-fortnite/
 

nochance

Banned
I would believe it's sometime this month. Maybe around Nvidia's launch or even later. Rumors are that the cards are to launch in October with a possibility that it may slide into early November. Yet I think the plan was always October, since AMD said they would launch the cards before consoles........

It's being teased already, so you will hear something soon....


AMD-Radeon-RX-6000-Big-Navi-RDNA-2-GPU-Teaser_Official_1.jpg


https://wccftech.com/amd-radeon-rx-6000-big-navi-graphics-card-teased-in-fortnite/
That would be such a stupid decision on AMD's part, especially if they do have a response to 3080. People have been waiting for a reasonable upgrade path since the 1080 Ti came out; the 3080 is going to accommodate this demand, and because of the hype and being better value it will probably get a lot of people that would normally buy a 3070-level card.
 

llien

Member
That would be such a stupid decision on AMD's part, especially if they do have a response to 3080.
Reactions like this puzzle me.
Exactly what would happen 1 month after GPU release? Everyone who wanted to upgrade would manage to upgrade and it would be too late for 6000 series?

I'm surprised he confuses allocated RAM and real usage
Regardless, the hit that the 2080 is taking at 4K is quite notable.
 

Armorian

Banned
I would believe it's sometime this month. Maybe around Nvidia's launch or even later. Rumors are that the cards are to launch in October with a possibility that it may slide into early November. Yet I think the plan was always October, since AMD said they would launch the cards before consoles........

It's being teased already, so you will hear something soon....


AMD-Radeon-RX-6000-Big-Navi-RDNA-2-GPU-Teaser_Official_1.jpg


https://wccftech.com/amd-radeon-rx-6000-big-navi-graphics-card-teased-in-fortnite/

phn0bwnobq7y.jpg


How that turned out...



I seriously think people are gonna get raped in the ass by the 3070 if they buy it for 4K or downsample.

My 8gig in my Vega is on the edge now and that's HBM2.


Avengers with ultra textures eats ~7.5GB of VRAM in 2560x1080, and ~14GB of RAM (first game that uses 16GB :messenger_grinning_smiling:)

Yeah, 8GB won't be enough for anything higher than 1440 native.

Regardless, the hit that the 2080 is taking at 4K is quite notable.

Difference is probably more about memory BW and ROPs than the amount of VRAM. Doom is caching a lot of stuff.
 

martino

Member
Reactions like this puzzle me.
Exactly what would happen 1 month after GPU release? Everyone who wanted to upgrade would manage to upgrade and it would be too late for 6000 series?


Regardless, the hit that the 2080 is taking at 4K is quite notable.
If real usage is not over 8 GB, the hit is not because of VRAM though
 

Papacheeks

Banned
phn0bwnobq7y.jpg


How that turned out...



Avengers with ultra textures eats ~7.5GB of VRAM in 2560x1080, and ~14GB of RAM (first game that uses 16GB :messenger_grinning_smiling:)

Yeah, 8GB won't be enough for anything higher than 1440 native.



Difference is probably more about memory BW and ROPs than the amount of VRAM. Doom is caching a lot of stuff.

This shows why they are trying to leverage DLSS so much. Feels like the chiplet design is still a way off for them. A chiplet GPU design will mitigate the issues we are seeing with VRAM usage.
 

Rikkori

Member
I'm surprised he confuses allocated RAM and real usage

He didn't confuse anything. There are real limits to VRAM and sometimes you hit them. It doesn't mean the game is unplayable, but it does mean at the very least lower averages, or usually stutter (from streaming and swapping). 8 GB for Doom Eternal at 4K is NOT enough for a great experience. Same story for other games. Too many people plug their ears because they don't want to hear it, lest it makes the new cards look gimped (as they are).

ui9hLnC.png
 

martino

Member
He didn't confuse anything. There are real limits to VRAM and sometimes you hit them. It doesn't mean the game is unplayable, but it does mean at the very least lower averages, or usually stutter (from streaming and swapping). 8 GB for Doom Eternal at 4K is NOT enough for a great experience. Same story for other games. Too many people plug their ears because they don't want to hear it, lest it makes the new cards look gimped (as they are).

ui9hLnC.png
yeah this graph is bad
 


I seriously think people are gonna get raped in the ass by the 3070 if they buy it for 4K or downsample.

My 8gig in my Vega is on the edge now and that's HBM2.

From what I understand, if you’re VRAM limited your performance drops off a cliff.

Looking at that, that’s an acceptable drop in performance for the 2080. It doesn’t look like it’s VRAM limited, but it may be limited in other ways.
 

Rikkori

Member
From what I understand, if you’re VRAM limited your performance drops off a cliff.

Looking at that, that’s an acceptable drop in performance for the 2080. It doesn’t look like it’s VRAM limited, but it may be limited in other ways.

It's not a choice of either okay or dead, there's steps in between. Depending on other factors you see different effects. It should be obvious that missing 0.5-1 GB isn't the same as needing another 5 GB.
 
That would be such a stupid decision on AMD's part, especially if they do have a response to 3080. People have been waiting for a reasonable upgrade path since the 1080 Ti came out; the 3080 is going to accommodate this demand, and because of the hype and being better value it will probably get a lot of people that would normally buy a 3070-level card.

Well, yes and no. Nvidia are doing well to release early, but yields on Samsung's botched 8nm are absolutely shite, so not many 3080s are going to be on store shelves by October/November anyway. So AMD releasing their 3080 competitor in October is fine, as long as they have stock.
 
It's not a choice of either okay or dead, there's steps in between. Depending on other factors you see different effects. It should be obvious that missing 0.5-1 GB isn't the same as needing another 5 GB.
How is that obvious?

I thought that the frame would need to be drawn twice if you're trying to use 9 GB or 13 GB on an 8GB card.

Whether it's 9 or 13 GB, it still needs 2 passes to draw the frame.
 

Rikkori

Member
How is that obvious?

I thought that the frame would need to be drawn twice if you're trying to use 9 GB or 13 GB on an 8GB card.

Whether it's 9 or 13 GB, it still needs 2 passes to draw the frame.

There's a hierarchy to what's loaded and what's used, then a hierarchy to where it comes from and to everything along the chain, and then engine-level decisions about whether lower-res assets get swapped in when your VRAM is choking (like what happened with FF Remake on PS4 in spring), and so on. We have lots of simplistic discussions on this forum, but there's A LOT going on behind the scenes and these things have been well thought out. It would make no sense in 2020 for game performance to completely plunge when running into a small VRAM choke; just at the rendering level, with culling and all that, it would be a very primitive way to make games. That may have been true a decade+ ago, but it is not true any longer.

Nonetheless, not having enough VRAM can still be detrimental to the experience; it's just not a complete catastrophe.
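To make the "lower-res assets get swapped in" part concrete, here's a toy sketch (entirely my own illustration, not taken from any real engine) of that kind of streaming decision: when the working set would blow past the VRAM budget, the streamer drops the least-recently-used textures to lower mips instead of letting the frame fall over.

```python
# Toy illustration only (not any real engine): keep textures resident under a
# VRAM budget by downgrading the least-recently-used ones to lower mips.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()          # texture id -> (mip level, size in MB)

    def used_mb(self):
        return sum(size for _, size in self.resident.values())

    def request(self, tex_id, full_size_mb):
        # A material needs this texture this frame: mark it most-recently-used
        # at full resolution (mip 0), then make everything fit the budget again.
        self.resident.pop(tex_id, None)
        self.resident[tex_id] = (0, full_size_mb)
        self._fit_budget()

    def _fit_budget(self):
        # Rather than stalling or crashing, shave memory off the coldest textures:
        # each mip drop roughly quarters the footprint; evict if nothing is left to drop.
        while self.resident and self.used_mb() > self.budget_mb:
            lru_id = next(iter(self.resident))   # least recently used
            mip, size = self.resident[lru_id]
            if mip >= 4:
                self.resident.pop(lru_id)
            else:
                self.resident[lru_id] = (mip + 1, size / 4.0)

streamer = TextureStreamer(budget_mb=8 * 1024)   # pretend this is an 8 GB card
streamer.request("hero_albedo", 512)
streamer.request("terrain_albedo", 6144)
streamer.request("city_albedo", 4096)            # over budget -> older textures get downgraded
```

The point is just that the fallback is lower-res assets and worse averages, not a hard wall.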
 
There's a hierarchy to what's loaded and what's used, then a hierarchy to where it comes from and to everything along the chain, and then engine-level decisions about whether lower-res assets get swapped in when your VRAM is choking (like what happened with FF Remake on PS4 in spring), and so on. We have lots of simplistic discussions on this forum, but there's A LOT going on behind the scenes and these things have been well thought out. It would make no sense in 2020 for game performance to completely plunge when running into a small VRAM choke; just at the rendering level, with culling and all that, it would be a very primitive way to make games. That may have been true a decade+ ago, but it is not true any longer.

Nonetheless, not having enough VRAM can still be detrimental to the experience; it's just not a complete catastrophe.

The 1080 Ti with 11GB of VRAM gets something like 10-15 more FPS than the 8GB 2080 in Doom Eternal at 4K with Nightmare graphics settings.
 

JohnnyFootball

GerAlt-Right. Ciriously.
These rumors never pan out. Big Navi barely beating 2-year-old hardware? I don't believe it.

With Raja gone, I expect this to be competing with the 3080.
Yeah, AMD is much better off without him. He seems to be bad luck; Intel has had major issues since he joined the company.
 

JohnnyFootball

GerAlt-Right. Ciriously.
In regards to the 3070 8GB vs. the 1080 Ti/2080 Ti 11GB: a lot of you are assuming that the smaller memory pool will bottleneck the 3070, but are failing to acknowledge how big of an advantage this could provide.
geforce-rtx-30-series-rtx-io-announcing-rtx-io-scaled-e1599045046160-2060x1130.jpg


It is quite possible that it could more than make up for the 8GB vs 11GB deficit, and the gap may even end up being non-existent for the 3080 10GB.
 
There's a hierarchy to what's loaded and what's used, then a hierarchy to where it comes from and to everything along the chain, and then engine-level decisions about whether lower-res assets get swapped in when your VRAM is choking (like what happened with FF Remake on PS4 in spring), and so on. We have lots of simplistic discussions on this forum, but there's A LOT going on behind the scenes and these things have been well thought out. It would make no sense in 2020 for game performance to completely plunge when running into a small VRAM choke; just at the rendering level, with culling and all that, it would be a very primitive way to make games. That may have been true a decade+ ago, but it is not true any longer.
You seem to be having a difficult time explaining something that "should be obvious" tbh.
 

Rikkori

Member
In regards to the 3070 8GB vs. the 1080 Ti/2080 Ti 11GB: a lot of you are assuming that the smaller memory pool will bottleneck the 3070, but are failing to acknowledge how big of an advantage this could provide.
geforce-rtx-30-series-rtx-io-announcing-rtx-io-scaled-e1599045046160-2060x1130.jpg


It is quite possible that it could more than make up for the 8GB vs 11GB deficit, and the gap may even end up being non-existent for the 3080 10GB.

That helps you to get assets into vram, but it does nothing for vram itself. If you need to hold 1.5L of water but you have a 1L bottle, it doesn't matter how much faster one tap is than the other, you still need a bigger bottle.
It's the same mistake people make with compression technology - it helps with bandwidth but more bandwidth by itself isn't enough to totally compensate for lack of space.
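Some back-of-the-envelope numbers (the figures below are my own assumptions, just to illustrate the bottle analogy): transfer speed caps how much you can swap per frame, not how much can sit in VRAM at once.

```python
# Rough, assumed numbers only -- to show why streaming speed doesn't substitute
# for capacity: it limits how much data you can swap in a single frame.
pcie4_x16_gbs = 32.0          # theoretical peak, GB/s; real-world throughput is lower
frame_time_s = 1.0 / 60.0     # 60 fps target

swap_per_frame_gb = pcie4_x16_gbs * frame_time_s
print(f"Max data you could move in one frame: {swap_per_frame_gb:.2f} GB")   # ~0.53 GB

vram_gb = 8.0                 # the "1L bottle"
working_set_gb = 10.0         # hypothetical 4K working set (the "1.5L of water")
overflow_gb = working_set_gb - vram_gb
print(f"Frames needed just to cycle the {overflow_gb:.0f} GB that doesn't fit: "
      f"{overflow_gb / swap_per_frame_gb:.1f}")
```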

Not to mention, we'll see the next GPU architecture by the time that gets more widely adopted, and by that point 20 GB & 16GB models will be common.

You seem to be having a difficult time explaining something that "should be obvious" tbh.

There I go being too optimistic about people's ignorance again. :)
 

JohnnyFootball

GerAlt-Right. Ciriously.
That helps you to get assets into vram, but it does nothing for vram itself. If you need to hold 1.5L of water but you have a 1L bottle, it doesn't matter how much faster one tap is than the other, you still need a bigger bottle.
It's the same mistake people make with compression technology - it helps with bandwidth but more bandwidth by itself isn't enough to totally compensate for lack of space.

Yes, but if you can fill, empty, and refill the one-liter bottle at a quicker rate than you can fill the 1.5L bottle, that difference may be irrelevant. Not to mention that Ampere could take advantage of PCI-Express 4.0 in a way that Turing couldn't.

I am not saying you're not correct, but I am open to seeing with my own eyes rather than outright dismissing.
 

Rikkori

Member
Yes, but if you can fill, empty, and refill the one-liter bottle at a quicker rate than you can fill the 1.5L bottle, that difference may be irrelevant. Not to mention that Ampere could take advantage of PCI-Express 4.0 in a way that Turing couldn't.

I am not saying you're not correct, but I am open to seeing with my own eyes rather than outright dismissing.

That's fine, we'll find out soon enough anyway. Crysis Remastered coming out a day later with 8K textures, maybe that will be a worthy test. :)
 

Ascend

Member
Low-res hierarchical geometry. Not blurred reflections, dude. I know what I'm looking at and what these machines are capable of, and it's not that.
No need to get frustrated. I'm simply trying to understand. Is there any way you can describe how you tell the difference?
 

VFXVeteran

Banned
No need to get frustrated. I'm simply trying to understand. Is there any way you can describe how you tell the difference?

Well, blurred reflections are extremely expensive, and that's just not going to be in a game that's even semi-complicated. It's really hard to describe, tbh. There are so many things that go into reflections.

One telltale sign is to just look at the reflection and see if it's very sharp near where the surface meets another surface and then blurs as it goes further out into the distance.

Here is a real path-traced reflection where the one on the left is a mirror and the one on the right is blurred in the vertical direction (never seen in a videogame).

6eed343b189eb36465c547debd24cf6c.jpg
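A quick way to see why that telltale sign happens (the numbers below are made up purely for illustration): for a glossy cone with half-angle theta, the blur footprint at the hit point grows roughly like 2·d·tan(theta), so the reflection stays sharp right at contact and smears out with distance.

```python
# Made-up numbers, purely to illustrate the "sharp at contact, blurry far away" cue:
# a glossy cone of half-angle theta has a footprint of roughly 2*d*tan(theta)
# at hit distance d, so the blur grows with distance from the reflecting surface.
import math

theta = math.radians(3.0)                      # assumed glossy lobe half-angle
for hit_distance_m in (0.1, 0.5, 2.0, 10.0):
    blur_width_cm = 100.0 * 2.0 * hit_distance_m * math.tan(theta)
    print(f"hit at {hit_distance_m:>5.1f} m -> blur width ~ {blur_width_cm:.1f} cm")
```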
 
If people use their brain they will realise that the 3080 is in fact the 3070. So that claim makes total sense.

Again use your brain.

Why don't you explain your thoughts instead of telling everyone they are stupid if they don't automatically believe you?
 

CrustyBritches

Gold Member
Turing was TU104 for the 2080S/2080/2070S/some weird 2060, and TU102 for the 2080 Ti and Titan. For Ampere, the 3080 is GA102 along with the 3090, and the 3070 is GA104. In this dynamic, the 3080 is cut down from the Titan card (3090), like the 2080 Ti was. By that measure, wouldn't the 3070 be the new xx80?
 

pullcounter

Member
Thanks to all ITT for the YouTube links, specifically the RT videos. Very interesting that RT seems to be coming along rather quickly for both AMD and NV.

Once I saw that CryEngine RT demo, I knew it was only a matter of time before we were going to see real-time RT on consoles/AMD GPUs.

For those who aren't familiar with what I'm talking about:



Now, this actually isn't doing much beyond RT reflections, and there is a bit of IQ reduction going on inside the reflections (geometry and resolution being scaled down), but it still looks quite good. I'd imagine this is what most PS5/XSX ray tracing will look like, and imo that's still quite amazing.
 

VFXVeteran

Banned
Now, this actually isn't doing much beyond RT reflections, and there is a bit of IQ reduction going on inside the reflections (geometry and resolution being scaled down), but it still looks quite good. I'd imagine this is what most PS5/XSX ray tracing will look like, and imo that's still quite amazing.

That's going to completely depend on what they decide to render within the reflections. For example, I don't like that Spiderman trailer where he's jumping with energy around his arms and you can't see this energy, nor the weave from his suit, in the reflection.

milesmorales-venom-ps5-legal.original.jpg


I can already surmise that they will use proxy geometry and shaders when returning a color back to the object reflecting. It might be missing so much detail that it will look like SSR, except it will be visible off camera.
 

llien

Member
I never said there wasn't a drop
The context of this discussion is "is 8GB sufficient". We already have examples demonstrating that it is not.
Who cares what name we give it; what would it change if it is "real" or "not real" RAM usage that is causing this issue?


but are failing to acknowledge how big of an advantage this could provide.
This is more of a "maybe this will make it suck less", which is quite a naive take, given that even RAM-to-GPU-RAM transfer hurts the 8GB 2080.

To me, personally, it is Nvidia's time bomb to force early Ampere buyers below the 3090 to upgrade shortly.
 

rofif

Can’t Git Gud
Even if Big Navi is a tiny bit faster than the 3070 at a similar price, I would still get the 3070.
It's not a good feeling to be locked out of RTX, DLSS and all other features and compatibility, even if you don't often use them.
 

llien

Member
a tiny bit faster
I would still get

Chuckle.

all other features

Don't forget DF uber exclusive objectivity karma, it improves satisfaction by up to 97%, leather jacket skill by 6% (stacks up to 18%)


On a serious note, Biggest NAVI (which is about 505mm2 or 485mm2 vs the 3090's 627mm2) is said to beat the 3070 so decisively that a 3070 Ti as a counter to it is not viable. Also likely to come with 16GB.
 
Chuckle.



Don't forget DF uber exclusive objectivity karma, it improves satisfaction by up to 97%, leather jacket skill by 6% (stacks up to 18%)


On a serious note, Biggest NAVI (which is about 505mm2 or 485mm2 vs the 3090's 627mm2) is said to beat the 3070 so decisively that a 3070 Ti as a counter to it is not viable. Also likely to come with 16GB.

A 3070 Ti has already been spotted with 16GB:


But they've taken the page down. It was in a Lenovo pre-built, so ready to launch soon. Not confirmed whether GA102 or GA104.

This is like a slap in the face of all those 10GB 3080 early adopter suckers.
 

Stooky

Member
In regards to the 3070 8GB vs. the 1080 Ti/2080 Ti 11GB: a lot of you are assuming that the smaller memory pool will bottleneck the 3070, but are failing to acknowledge how big of an advantage this could provide.
geforce-rtx-30-series-rtx-io-announcing-rtx-io-scaled-e1599045046160-2060x1130.jpg


It is quite possible that it could more than make up for the 8GB vs 11GB deficit, and the gap may even end up being non-existent for the 3080 10GB.
Did they mention anything about a minimum storage spec to take advantage of Direct Storage and RTX I/O?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Did they mention anything about a minimum storage spec to take advantage of Direct Storage and RTX I/O?
No. There are still a lot of unknowns. But I think it's premature to assume that memory size is the only factor. Even Steve from GamersNexus mentioned that there were other factors.
 