
Digital Foundry — RTX 4070 Super vs PS5: How Much Faster/Better Are Today's Mid-Range GPUs?

DavidGzz

Member
Sony and MS can afford to sell consoles at little to no profit because consoles require paid online. Something no one seems to take into account is that these consoles end up costing you twice as much if it's your only gaming device. Then there's the fact that buying games on PC, from sales to key sellers, makes the platforms much closer in price over the course of a generation.

Besides, a PC does so much more, and we have mods. It's not even close overall. Add in that a Steam Deck lets you take your gaming on the go and it gets even more one-sided.

Edit: Dogma 2 at 60fps too. *runs off*
 
Last edited:
WTF with the first two? Did you not know the state of Sony ports on PC, especially the worst port of the last 10 years, TLOU?

For the other two, I don't think so either, since those games can also favor AMD cards; maybe you saw a comparison with Nvidia cards? I need a receipt.
You just asked for any games because you were confident there were none, and I made sure to name more than one to show it wasn't an outlier. But fine, I'll name one more: Elden Ring.
 

Bojji

Member
Easy.

4060 - low end
4070 - mid-range
4080 - high end
4090 - ultra high end

I think xx60 cards are the definition of mid range.

We don't have xx50 or xx40 cards this gen (yet), but they are the low end. Too bad Nvidia fucked up the 4060 this gen; it's worse in VRAM-intensive games than the 3060 12GB.
 

Kenpachii

Member
Mid range gets decided by price not performance.

~150-200 is low end
~300 is mid range
~400+ is high end
~500+ is enthusiast.


Mid range is a 4060, as it's targeted at 300 euros.
4070 Super = 700-750 euro GPU, nothing midrange about it.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
Mid range gets decided by price not performance.

~150-200 is low end
~300 is mid range
~400+ is high end
~500+ is enthusiast.


Mid range is a 4060, as it's targeted at 300 euros.
4070 Super = 700-750 euro GPU, nothing midrange about it.
There’s a lot more than just $100 separating tiers. This isn’t 2010.
 
Of course the 4060 is the low end and the 4070 is the mid range. PC's entry price is higher than a console's because PCs simply offer much more; if NVIDIA could charge us 9€ per month to play online, for example, graphics cards would be much cheaper, but that's not how the PC market works.

PCs shouldn't even be close to the price/performance of a console like the PS5. That's just not the case right now because of an anomaly, since the consoles are more expensive in 2024 than they were in 2020, but that shouldn't last long, I guess. That's the price you pay to play online for free, get cheaper games, or play the "exclusive" games from Xbox, Switch, PS5, and PC on a single platform.
 

Kenpachii

Member
There’s a lot more than just $100 separating tiers. This isn’t 2010.

People need a reality check on what people spend on PC GPUs. No average gamer on PC is dropping 750 euros on a GPU alone, and the Steam survey shows this easily. People here have a completely out-of-touch view of what people really spend.

Prices can balloon like Nvidia wants you to believe, but the reality is, nobody buys any of those cards on PC.

Let alone thinking a 4070 Super is mid range; absolutely idiotic.
 
Last edited:

Bojji

Member
Mid range gets decided by price not performance.

~150-200 is low end
~300 is mid range
~400+ is high end
~500+ is enthusiast.


Mid range is a 4060.

I agree about the 4060 being mid range, but the prices are completely delusional.

Everybody thought the 3080's $700 MSRP was OK, so you can say that $700 is OK for a high-end card, 500-600 is mid-high, 300-400 is mid range, and everything below is low end. Everything above $700 is enthusiast.

But of course Nvidia screwed everyone this gen by offering heavily cut-down GPUs for ridiculous prices: the 3070 was ~55% (cores) of the 3090 Ti and offered ~58% of its performance for $500. The 4070 is just 36% of the 4090's cores and offers 50% of its performance for $600 (MSRP).
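The core-count ratios quoted above are easy to sanity-check. A minimal sketch, using Nvidia's published CUDA core counts (core fraction is only a rough proxy for performance):

```python
# Sanity check of the core-count ratios quoted above, using Nvidia's
# published CUDA core counts. Core fraction is only a rough proxy
# for performance.
cores = {
    "RTX 3070": 5888,
    "RTX 3090 Ti": 10752,
    "RTX 4070": 5888,
    "RTX 4090": 16384,
}

def core_fraction(card: str, flagship: str) -> float:
    """Fraction of the flagship's CUDA cores the smaller card carries."""
    return cores[card] / cores[flagship]

print(f"3070 vs 3090 Ti: {core_fraction('RTX 3070', 'RTX 3090 Ti'):.0%}")  # 55%
print(f"4070 vs 4090:    {core_fraction('RTX 4070', 'RTX 4090'):.0%}")     # 36%
```

So the ~55% and ~36% figures check out against the published specs.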
 

DragonNCM

Member
I'm mildly shocked..... a $600 GPU beats a $450 console.
 

Gaiff

SBI’s Resident Gaslighter
People need a reality check on what people spend on PC GPUs. No average gamer on PC is dropping 750 euros on a GPU alone, and the Steam survey shows this easily. People here have a completely out-of-touch view of what people really spend.

Prices can balloon like Nvidia wants you to believe, but the reality is, nobody buys any of those cards on PC.

Let alone thinking a 4070 Super is mid range; absolutely idiotic.
You need a reality check if you still think a paltry $100 separates GPU tiers.

In 2012, the 670 was $400 and the 680 $500. Do you seriously think $100 is still $100 12 years later? In addition, the higher you go up the product stack, the larger the price gaps.

It depends how you define mid-range too. Do you mean only current generation GPUs as in Lovelace and RDNA3? Or do you mean all somewhat modern GPUs that can play mainstream games? Because then this takes us back all the way to Pascal and the GTX 10 cards.

For Ada, the 4070 is mid-range because it’s not only based on the mid-range die but also in the middle of the product stack.

AD102 - 4090
AD103 - 4080
AD104 - 4070
AD106 - 4060

So it’s a mid-range Lovelace card but, in the entire market of GPUs that still sell, it’s a high-end product since you can still buy 3050s and 3060s.
 
Last edited:

Bojji

Member
I'm mildly shocked..... a $600 GPU beats a $450 console.

Not only beats it, it's 2x as powerful.

If someone has a PC with something like a 5600/11400, that person can buy a PS5 for $500 or get a 4070S for $600 and have a machine more powerful than a PS5 Pro (even less by selling the old card).
 

Kenpachii

Member
You need a reality check if you still think a paltry $100 separates GPU tiers.

In 2012, the 670 was $400 and the 680 $500. Do you seriously think $100 is still $100 12 years later? In addition, the higher you go up the product stack, the larger the price gaps.

It depends how you define mid-range too. Do you mean only current generation GPUs as in Lovelace and RDNA3? Or do you mean all somewhat modern GPUs that can play mainstream games? Because then this takes us back all the way to Pascal and the GTX 10 cards.

For Ada, the 4070 is mid-range because it’s not only based on the mid-range die but also in the middle of the product stack.

AD102 - 4090
AD103 - 4080
AD104 - 4070
AD106 - 4060

So it’s a mid-range Lovelace card but, in the entire market of GPUs that still sell, it’s a high-end product since you can still buy 3050s and 3060s.

Nvidia can make 10 dies and 30 cards out of them and price them all above a million; that doesn't make any of those cards mid range.

Mid range is based on pricing not naming or hardware specifications.

Nvidia's naming has been a joke for a long time now. Calling a 750 euro card mid range is beyond laughable.
 

DragonNCM

Member
Sony and MS can afford to sell consoles at little to no profit because consoles require paid online. Something no one seems to take into account is that these consoles end up costing you twice as much if it's your only gaming device. Then there's the fact that buying games on PC, from sales to key sellers, makes the platforms much closer in price over the course of a generation.

Besides, a PC does so much more, and we have mods. It's not even close overall. Add in that a Steam Deck lets you take your gaming on the go and it gets even more one-sided.

Edit: Dogma 2 at 60fps too. *runs off*
Console sales on games are as good as on PC.
 

Gaiff

SBI’s Resident Gaslighter
Nvidia can make 10 dies and 30 cards out of them and price them all above a million; that doesn't make any of those cards mid range.
Of course, it does.
Mid range is based on pricing not naming or hardware specifications.
And who decides the pricing? You?
Nvidia's naming has been a joke for a long time now. Calling a 750 euro card mid range is beyond laughable.
Which is irrelevant. The products available dictate what the different tiers are, not some arbitrary prices.

Once again, looking at the market as a whole, i.e., only current products that still sell, the 4070 is a high-end product. Looking strictly at Lovelace, it isn't, because only two GPUs are weaker while the majority are stronger. If you throw in RDNA3 as well, then it firmly sits in the mid-range. These aren't some numbers I just pulled out of thin air. These are objective tiers and how we quantitatively compare the different cards available. The 4070 is a high-end card, not a high-end Lovelace GPU.
 
Last edited:

DragonNCM

Member
Not only beats it, it's 2x as powerful.

If someone has a PC with something like a 5600/11400, that person can buy a PS5 for $500 or get a 4070S for $600 and have a machine more powerful than a PS5 Pro (even less by selling the old card).
How old is the RTX 4070? One and a half years?
And the PS5? Three years?
You do the math..... you are getting a $500 console that can decently play games for the next 7 to 10 years, or a $2k PC that can be top of the line this year and fall short in the next few years.
Digital Foundry is making useless comparisons, comparing oranges to apples. The best value in gaming is consoles; PC gaming costs you a lot of money to be in the top tier (a top-tier PC setup can cost you more than $3.5k with a top-tier monitor, mouse, and keyboard).
 

Gaiff

SBI’s Resident Gaslighter
How old is the RTX 4070? One and a half years?
And the PS5? Three years?
You do the math..... you are getting a $500 console that can decently play games for the next 7 to 10 years, or a $2k PC that can be top of the line this year and fall short in the next few years.
Digital Foundry is making useless comparisons, comparing oranges to apples. The best value in gaming is consoles; PC gaming costs you a lot of money to be in the top tier (a top-tier PC setup can cost you more than $3.5k with a top-tier monitor, mouse, and keyboard).
What do you mean comparing apples to oranges? This was a GPU benchmark. They reviewed the 4070S against other cards and against the most popular premium console. No one was trying to claim that it's a better value than the PS5, nor was Bojji alluding to that.

What's with console gamers taking so much offense to some comparisons? It just shows us how this GPU stacks up against other GPUs and consoles.

Also, no. A PC centered around an RTX 4070S wouldn't be top-of-the-line this year.
 
Last edited:

Bojji

Member
How old is the RTX 4070? One and a half years?
And the PS5? Three years?
You do the math..... you are getting a $500 console that can decently play games for the next 7 to 10 years, or a $2k PC that can be top of the line this year and fall short in the next few years.
Digital Foundry is making useless comparisons, comparing oranges to apples. The best value in gaming is consoles; PC gaming costs you a lot of money to be in the top tier (a top-tier PC setup can cost you more than $3.5k with a top-tier monitor, mouse, and keyboard).

A 2080 Ti, which is older than the PS5 (2018), can still play games, usually at higher settings than the PS5. It's not about year of release but actual power and features.

I would argue that the PS5 can't "decently play games" right now; that's what the 30 fps games tell us, not to mention what will happen in 7 years, lol. The PS6 will release in 2027.

I told you that if someone has a PC built in 2020, it only costs him $600 to get a better experience than a potential PS5 Pro, and you talk to me about a $3.5k PC including a monitor, etc. Hahaha. You need a TV for a console as well.
 
Last edited:
They are, but come on. The 780 sold for $650 11 years ago. You can't seriously tell us the RTX 4080 should be cheaper. Relatively speaking, the GTX 780 was almost 2/3 more expensive than the PS4, which would make the RTX 4080 $812 today. $1000 is too high and the insane $1200 launch price need not be mentioned, but $600? You could effectively build a rig with 2.5x the performance of the PS5 for 2x the price. That would make high-end PC gaming a better value than console gaming, which has never happened.
The 4080 is not 2.5x the PS5 in rasterization (it's about 2.15x).
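The $812 figure in the quoted post comes from simple ratio scaling. A quick sketch of that arithmetic, using the widely reported US launch MSRPs:

```python
# Scaling the GTX 780's launch price by its ratio to the PS4, then
# applying that ratio to the PS5. Prices are the widely reported US MSRPs.
gtx_780 = 649  # 2013 launch price
ps4 = 399      # 2013 launch price
ps5 = 499      # 2020 launch price (disc edition)

ratio = gtx_780 / ps4           # ~1.63, i.e. almost 2/3 more expensive
equivalent_today = ratio * ps5  # what a similarly positioned card "should" cost

print(f"780/PS4 ratio: {ratio:.2f}x")
print(f"PS5-relative equivalent: ${equivalent_today:.0f}")  # ~$812
```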
 

Gaiff

SBI’s Resident Gaslighter
The 4080 is not 2.5x the PS5 in rasterization (it's about 2.15x).
Yes, it is. This video alone has the 4070S at 2x the performance of the PS5. The 4080 isn't just 7.5% faster than the 4070S.

That's using the 6700 as a baseline. The 4080 is 2.53x ahead.

[relative performance chart with the 6700 as baseline, showing the 4080 at 2.53x]


Besides that, this is irrelevant because even if it were just 2.15x faster, it would still make a better value than the PS5 which hasn’t happened in a long time. High-end PC gaming is not more cost effective than console gaming.
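The relative-performance arithmetic in this exchange can be laid out explicitly. A small sketch using the index numbers quoted in the thread (2x for the 4070S, 2.53x for the 4080, with the 6700 as the PS5 stand-in):

```python
# Relative-performance indices quoted in the thread, with the 6700
# standing in for the PS5 as the 1.0x baseline.
rtx_4070s = 2.00  # per the DF video
rtx_4080 = 2.53   # per the chart

gap = rtx_4080 / rtx_4070s - 1  # the 4080's lead over the 4070S
print(f"4080 over 4070S: {gap:.1%}")  # ~26.5%, far more than 7.5%
```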
 
Last edited:
Yes, it is. This video alone has the 4070S at 2x the performance of the PS5. The 4080 isn't just 7.5% faster than the 4070S.

That's using the 6700 as a baseline. The 4080 is 2.53x ahead.

[relative performance chart with the 6700 as baseline, showing the 4080 at 2.53x]


Besides that, this is irrelevant because even if it were just 2.15x faster, it would still make a better value than the PS5 which hasn’t happened in a long time. High-end PC gaming is not more cost effective than console gaming.
You mean the Digital Foundry vid where they used a CPU nearly 3x the power of the PS5? And tested some games with RT? No one doubts the 4080 is like 4x the power of the PS5 in RT, but in raster it's absolutely not a 2.5x difference, certainly not when using an equal CPU. No worries though, I assume the 2.5 was a typo.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
You mean the Digital Foundry vid where they used a CPU nearly 3x the power of the PS5? And tested some games with RT? No one doubts the 4080 is like 4x the power of the PS5 in RT, but in raster it's absolutely not a 2.5x difference, certainly not when using an equal CPU. No worries though, I assume the 2.5 was a typo.
The chart here says 2.53x faster than the 6700. Where do you get 2.15x from?
 
Last edited:

Freeman76

Member
This would be a more productive video: what does it take to MATCH the console?
Eventually the PS5 PRO is going to be released and I would like to know if my 4070 can keep up.

Taking a 4070 super and dialing back the settings to match a console is just kind of, I don't know, dumb?
I'm not going to play that way. I'm not dialing back my 4070.
Dialing back the settings is something you do if your hardware is getting on in years.
There's no way a PS5 Pro is gonna outdo a 4070.
 

Gaiff

SBI’s Resident Gaslighter
There's no way a PS5 Pro is gonna outdo a 4070.
Why not? The 4070 isn’t very impressive. NVIDIA intentionally gutted it with a tiny bus that results in lower performance than the 3080. The 3080 will be 4 years old when the Pro drops. It can certainly compete with or even beat a weaker card in rasterization. RT is a different story.

I think they’ll be about even for the record but it’s possible for the Pro to pull ahead by about 5% or so at higher resolutions.
 
Last edited:

winjer

Gold Member
There's no way a PS5 Pro is gonna outdo a 4070.

In rasterization, it's not that difficult. The 4070 is slightly slower than a 7800 XT.
The question is about RT. If the PS5 Pro uses the RT units from RDNA3, then it's going to lag behind the 4070.
But RDNA4 is supposed to use newer RT units, so it might be able to narrow the gap.
The PS5 Pro is rumored to use a mix of tech from RDNA3 and RDNA4.
But at this point, we are all just speculating.
 

SABRE220

Member
There's no way a PS5 Pro is gonna outdo a 4070.
The 4070 is not some monster card... it's inferior to a 3080. Nvidia has basically butchered the 4xxx series dies to the point where the 4070 is what a 4050 Ti/4060 would have been in earlier generations. If you think a 200mm-die 3080 equivalent is a ridiculous benchmark by the time we are about to get the 5xxx series, then we might as well not even get the pro-gen consoles, because that jump is underwhelming in itself.

Even the PS4 Pro, which was a conservative upgrade, was a 2.25x compute jump, and the Series X was a massive bump.
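The 2.25x figure follows from the consoles' advertised FP32 compute. A quick check using the commonly cited TFLOPS numbers (a crude metric, but it's the one the claim rests on):

```python
# Checking the "2.25x compute jump" figure from the consoles' commonly
# cited FP32 compute numbers.
ps4_tflops = 1.84
ps4_pro_tflops = 4.2

jump = ps4_pro_tflops / ps4_tflops
print(f"PS4 -> PS4 Pro: {jump:.2f}x")  # ~2.28x
```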
 
Last edited:
The 4070 is not some monster card... it's inferior to a 3080. Nvidia has basically butchered the 4xxx series dies to the point where the 4070 is what a 4050 Ti/4060 would have been in earlier generations. If you think a 200mm-die 3080 equivalent is a ridiculous benchmark by the time we are about to get the 5xxx series, then we might as well not even get the pro-gen consoles, because that jump is underwhelming in itself.

Even the PS4 Pro, which was a conservative upgrade, was a 2.25x compute jump, and the Series X was a massive bump.

As a 3080 owner, I'd gladly trade it for a 4070. If rasterization mattered, people would mostly buy AMD cards on PC, but the truth is one of the reasons to go to PC is to escape from AMD's cards and technology.

A 4070 with DLSS 3 and FG destroys a 3080, and is an over 2x upgrade from a PS5, considering the 4070S from the DF video is like 2.5x to 3x the PS5's performance.
 

Freeman76

Member
The 4070 is not some monster card... it's inferior to a 3080. Nvidia has basically butchered the 4xxx series dies to the point where the 4070 is what a 4050 Ti/4060 would have been in earlier generations. If you think a 200mm-die 3080 equivalent is a ridiculous benchmark by the time we are about to get the 5xxx series, then we might as well not even get the pro-gen consoles, because that jump is underwhelming in itself.

Even the PS4 Pro, which was a conservative upgrade, was a 2.25x compute jump, and the Series X was a massive bump.
Where did I say the 4070 was a monster of a card? It's good enough to play anything current at 4K at around 90-120fps; if you think the PS5 'Pro' is gonna hit that kind of performance, you will probably be disappointed.
 
Last edited:

winjer

Gold Member
Sony desperately needs to develop their own DLSS tech.

Rumors say that the reason FSR2 doesn't use AI is because of Sony, since they decided not to have support for DP4A on the PS5.
We now have confirmation that AMD is adding AI to FSR3. And chances are that the PS5 Pro will have at least WMMA support.
 
Last edited:

Bojji

Member
Rumors say that the reason FSR2 doesn't use AI is because of Sony, since they decided not to have support for DP4A on the PS5.
We now have confirmation that AMD is adding AI to FSR3. And chances are that the PS5 Pro will have at least WMMA support.

When they developed it, DLSS was still in its infant/shit stages, so it's not surprising they didn't consider anything about it.

Xbox potentially supports DP4a, but so far there's nothing to show for it. XeSS on RDNA2 shows that the performance penalty can be much bigger than FSR2's.
 

winjer

Gold Member
When they developed it, DLSS was still in its infant/shit stages, so it's not surprising they didn't consider anything about it.

That is true. DLSS1 was a pile of crap.
And DLSS2 was only released in early 2020, so by that time the PS5 was already in production and the specs locked in.

Xbox potentially supports DP4a, but so far there's nothing to show for it. XeSS on RDNA2 shows that the performance penalty can be much bigger than FSR2's.

XeSS is optimized for Intel GPUs, even the DP4a path. If AMD were to optimize a similar AI upscaler, it would probably run a bit faster on RDNA2.
 

Zathalus

Member
Where did I say the 4070 was a monster of a card? Its good enough to play anything current at 4k around 90-120fps, if you think the ps5 'pro' is gonna hit that kind of performance u will probably be dissapointed
A 4070 or better in rasterization seems quite believable. We know the PS5 Pro is using a 60 CU GPU based on RDNA4. A 60 CU card using RDNA3 is almost 10% faster than a 4070.

I can't speculate on RT performance as we have no idea how well RDNA4 will do with that.
 
The chart here says 2.53x faster than the 6700. Where do you get 2.15x from?
The chart did similar testing to Digital Foundry's, which is with a 13900K, and it also tested some games with RT, not just rasterization performance. The only card that actually hits your figure is the 4090.
 
Last edited:

iHaunter

Member
A $599 RTX 4070 Super is superior to a 2020 APU??



Unfortunately, the graphics card doesn't work without RAM, CPU, motherboard, power supply, etc.

Pretty pointless video. Oh yes, that's right, otherwise you wouldn't have been able to bash the PS5. Why the PS5 and no comparison to the Xbox Series X!?
And that's JUST the GPU... no motherboard, case, HDD, RAM, coolers, power supply, or Windows license.

Going to be spending $2,000 to support that GPU.
 

Gaiff

SBI’s Resident Gaslighter
The chart did similar testing to Digital Foundry's, which is with a 13900K, and it also tested some games with RT, not just rasterization performance. The only card that actually hits your figure is the 4090.
The 6700 was also paired with a super fast CPU, not a 3600 so that changes absolutely nothing. Once again, where did you pull 2.15x from? What is your source?
 
The 6700 was also paired with a super fast CPU, not a 3600 so that changes absolutely nothing. Once again, where did you pull 2.15x from? What is your source?
When using a 5600X instead of a 13900K, that's where, and testing games without ray tracing, like CoD or Rift Apart in non-RT mode. I'd bring up The Last of Us, but that PC port is so bad the 4080 is barely 80% faster there.
 
Why not? The 4070 isn’t very impressive. NVIDIA intentionally gutted it with a tiny bus that results in lower performance than the 3080. The 3080 will be 4 years old when the Pro drops. It can certainly compete with or even beat a weaker card in rasterization. RT is a different story.

I think they’ll be about even for the record but it’s possible for the Pro to pull ahead by about 5% or so at higher resolutions.
These are very conservative estimates your estimates make sense if the pro is only 499 but considering it’s gonna be 599-699 I think you should be expecting better I’m expecting 4070 ti (maybe super) on average raster performance and 4060 ti rt performance (maybe in very fringe and lucky cases like last of us we will just about get 4080 performance in raster)
 

Gaiff

SBI’s Resident Gaslighter
When using a 5600X instead of a 13900K, that's where, and testing games without ray tracing, like CoD or Rift Apart in non-RT mode. I'd bring up The Last of Us, but that PC port is so bad the 4080 is barely 80% faster there.
That's straight up bullshit. They used a 5800X for the 6700 XT AND the RTX 4090.

[screenshots of the test configurations showing a 5800X paired with both the 6700 XT and the RTX 4090]


And I ask again, where did you pull 2.15x from? Are you gonna tell us or keep lying?
 
That's straight up bullshit. They used a 5800X for the 6700 XT AND the RTX 4090.

[screenshots of the test configurations showing a 5800X paired with both the 6700 XT and the RTX 4090]


And I ask again, where did you pull 2.15x from? Are you gonna tell us or keep lying?
Benchmarks. I should add the 5800X is still like 80% better than the PS5 CPU, but at least it's no more of that near-3x boost.
 

Gaiff

SBI’s Resident Gaslighter
Benchmarks. I should add the 5800X is still like 80% better than the PS5 CPU, but at least it's no more of that near-3x boost.
Irrelevant in this case. A 5800X will never bottleneck a paltry 6700. If anything, the 5800X could hold back the 4080 or 4090, underselling the actual performance disparity.

Once again, can I get your source for the 2.15x? Because I've asked several times and you still haven't provided it.
 
Irrelevant in this case. A 5800X will never bottleneck a paltry 6700. If anything, the 5800X could hold back the 4080 or 4090, underselling the actual performance disparity.

Once again, can I get your source for the 2.15x? Because I've asked several times and you still haven't provided it.
Of course it wouldn’t hold back the 5800x like how the ps5 cpu would likely hold back its gpu that’s the point I was making with the 80% comment really should be no more than a 5600x when comparing against a ps5. And I saw the 2.15x in rasterized benchmarks online
 