How much did AMD pay them to not include DLSS?
Ah, 8GB VRAM already obsolete then? Some people thought it would last for a couple of years into the next gen. Such naive people.
It will. Depends on your res and texture settings; you can be smart about things, ya know!
Try this more recent video for PS5.
It looks better than the PC one.
Yep, definitely better.
Start of the sequence: first I-frame
End of the sequence: last I-frame
From 1:12 to 1:13 I counted 58 frames between two (!) clear frames (I-frames?). Those 58 frames between the two "I-frames" were blurry, like after getting heavily drunk.
At one point, the whole screen was white. Motion blur and anime fans gonna like this
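Rather than counting frames by eye, ffprobe can tally the frame types for you. A minimal sketch, assuming ffprobe is installed and "video.mp4" stands in for whatever clip you saved:

```python
# Tally frame types (I/P/B) in a clip with ffprobe (must be on PATH).
# "video.mp4" is a placeholder for the downloaded clip.
import subprocess
from collections import Counter

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "frame=pict_type",
     "-of", "default=noprint_wrappers=1:nokey=1", "video.mp4"],
    capture_output=True, text=True, check=True,
).stdout.split()

print(Counter(out))
# At ~60 fps, one I-frame per second (i.e. ~59 P/B frames between I-frames)
# is a perfectly normal GOP length for web video.
```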
What GPU is 3x more powerful than the PS5?
Off the top of my head? 3080, 3090, 6800 XT, 6900, if we talk about TF.
PS5 is 10-point-something vs 30-36. I don't know the AMD GPUs' TF numbers, but they are on par with Nvidia's, so...
Wait, maybe you were talking about vram?
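For anyone who wants the paper math instead of eyeballing it: FP32 TFLOPS is just 2 × shader count × boost clock. A quick sketch using the public spec-sheet figures (and keep in mind, as someone points out further down, that the newer cards get less gaming performance per teraflop):

```python
# Paper FP32 TFLOPS = 2 ops/cycle * shader count * boost clock (GHz).
# Spec-sheet numbers; real-world gaming performance does NOT scale 1:1 with this.
specs = {
    "PS5":         (2304, 2.23),   # 36 CUs x 64 ALUs, up to 2.23 GHz
    "RTX 3080":    (8704, 1.71),
    "RTX 3090":    (10496, 1.70),
    "RTX 2080 Ti": (4352, 1.55),
}
tflops = {gpu: 2 * cores * ghz / 1000 for gpu, (cores, ghz) in specs.items()}
for gpu, tf in tflops.items():
    print(f"{gpu:12} {tf:5.1f} TF  ({tf / tflops['PS5']:.1f}x PS5)")
# PS5          10.3 TF  (1.0x PS5)
# RTX 3080     29.8 TF  (2.9x PS5)
# RTX 3090     35.7 TF  (3.5x PS5)
# RTX 2080 Ti  13.5 TF  (1.3x PS5)
```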
In a few weeks.
They said that the game utilizes AMD-optimized raytracing; I don't remember the name of the tech.
Godfall runs better at max settings on RTX 3080 with 10GB of VRAM than on the RX 6800XT because we done goofed and put Raytracing in this, raytracing whose cost goes up with resolution.
It is? Do you have a source for this?
So you can exceed your VRAM limit and it's fine and nothing happens? The 3070's 8GB should be enough too then right?
I didn't say PC highest settings. I said better textures.
None of those cards are 3x more powerful than the PS5. For argument's sake let's take the 5700XT. You think a 3080 is 3x more powerful than a 5700XT?
Like I said, if we talk only about teraflops, 30 and 36 are triple the number of TF inside a PS5.
If you talk about VRAM, then no, only the 3090 is barely 2 times better.
So you buy 3070 to play 4K, Ultra?
The same amount they must pay 90% of all other developers that don't include DLSS.
All these developers and publishers are getting rich.
If you just look at teraflops like you are, then the 3090 is over 2x more powerful than the 2080ti. Except it isn't.
Fair point.
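For reference, the paper gap being mocked there, using the spec-sheet TF from the sketch above; launch reviews put the real 4K gap at roughly 1.4-1.5x:

```python
# 3090 vs 2080 Ti: paper FP32 ratio only; games scale nowhere near this.
print(f"{35.7 / 13.5:.2f}x on paper")  # ~2.64x; real 4K gap is ~1.4-1.5x
```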
90% of games don't need DLSS. Maybe the game from a developer pounding their chest that it uses 12GB of VRAM does though.
Craig... please don't embarrass yourself. I don't think I can defend you.
The reality is there's not a single game out there that wouldn't benefit from DLSS since it improves image quality with no performance cost.
What is also a reality is that it requires some work on the developers' end, along with co-operation with Nvidia, to get it working, and that's not something everyone wants to do.
Unless Nvidia can find a way to get it implemented across the board at the driver level we are going to continue to see just a handful of games utilising the tech.
Why would they? Have you not seen re2 and other games that can "allocate" more than your GPU's vram size? Check out the video a couple posts up, about the difference between allocation and utilization. They are not the same.
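If you want to see what is actually committed on the card, rather than what a game's own meter estimates, the driver will tell you. A minimal sketch with the pynvml bindings (pip install pynvml), assuming an Nvidia card at index 0:

```python
# Read real VRAM figures from the Nvidia driver via NVML (pip install pynvml).
# "used" is memory actually committed on the device. In-game meters
# (RE2-style bars) usually show an estimated allocation budget instead,
# which is why they can "exceed" your VRAM while the game runs fine.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {info.total / 1024**3:.1f} GiB")
print(f"used:  {info.used / 1024**3:.1f} GiB")
print(f"free:  {info.free / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```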
I'm sure 3080 owners will be fine. They can just lower the texture quality.
If VRAM utilization is not higher than 10GB (and the 3080 has higher-bandwidth memory than the new consoles anyway), it wouldn't matter. Games do not usually come close to eating up even 60% of their VRAM allocation. So it's a complete non-issue.
'Twas just a poke. I'm aware.
PS5 should have enough VRAM for this.
But what of the Series X, which only has 10GB of fast RAM?
The newer cards have less gaming performance per teraflop.
Stop talking sense ffs you're ruining the thread.
It's a $500 card in 2020, so yes.
Are PC gamers going to shit up every thread on console games that will appear on PCs? We get it. The PC will be able to handle better graphics, even if the framerate is shitty. I'm not jealous. I just don't care to invest in the improvements. And I have disposable cash. Can we separate console and PC game threads going forward? Because this is kind of annoying.
this is a thread about pc requirements...
You literally came into a PC thread to bitch about people coming into a console thread. And even so, unless the game is specifically mentioned to be running on a console, 9 times out of 10 you are looking at PC footage of multiplat games.
It will be funny to see the PS5 version have better textures than the 3080.
You're an insecure fellow, aren't you?
Looked about the same to me.
Are console gamers going to shit up every thread on the PC version of games that are on both? This is getting annoying.
It's a mid-range card, aimed for 1440p. If you buy a mid-range card to game at 4K you need your head checked in the first place.
Keep in mind that VRAM limitations don't get better at low frame rates. Whether it's 30 or 60fps, the moment VRAM needs to swap, you will get massive stutters. That makes the 3070 inferior to next-gen consoles at 4K, despite being more powerful. Its low amount of VRAM is its bottleneck.
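That kind of swap stutter shows up in frame-time logs rather than in average fps. A rough sketch for flagging hitches in a PresentMon capture; the column name matches PresentMon's CSV output, and "capture.csv" is a placeholder:

```python
# Flag frame-time spikes (the VRAM-swap hitches described above) in a
# PresentMon CSV log. 3x the median frame time is an arbitrary but
# reasonable threshold for a visible hitch.
import csv
import statistics

with open("capture.csv", newline="") as f:
    times = [float(row["msBetweenPresents"]) for row in csv.DictReader(f)]

median = statistics.median(times)
spikes = [t for t in times if t > 3 * median]
print(f"median frame time: {median:.2f} ms")
print(f"hitches over {3 * median:.1f} ms: {len(spikes)}")
# Average fps can look fine while these spikes make the game feel broken.
```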
Any game that uses raytracing needs to have DLSS as an option. If it doesn't, the developers come across as foolish, or as having been paid not to include it.
I love how $500 is midrange now. Nvidia certainly brainwashed people pretty well. Midrange should never be more than $300, like it used to be.
Who cares what you think about prices? The reality is that midrange now is $400-500. The RX 5700 XT was a midrange card starting at $449. Looks like AMD is also brainwashing?
No, it's not midrange. That would be the 2060 or the upcoming 3060. And no, that's not low range; that would be anything from the 1650 card or lower.