StereoVsn
> Damn, imagine.

Well, I guess I will take the next 20 years to finally get through my backlog... I mean library, that's it, library.
> Lot of people would be interested in a mid-to-top GPU.

A lot of people are braindead, buying the more expensive and slower RTX 3050 instead of an RX 6600/XT.
> Yup, if you're going to spend $1000-plus, why would you settle for significantly worse performance in the most demanding and graphically impressive games out there?

If you spend that kind of money you probably want to play CP2077 with ray tracing on.
If you’re in that market then chances are you want to play something like Cyberpunk 2077 or Flight Simulator completely maxed out. You’re not going to compromise on that just so you can get like 3% better max framerate in Counter-Strike or something.
Good.
Focus on the mid ranges, and lower end. That’s the majority of buyers. That’s where the competition is.
Leave the ultra high end all to Nvidia. It’s a segment AMD or Intel will never win.
> When’s the last time you owned an AMD GPU?

I have also switched around a lot and I'm currently very pleased with my 6700XT. For the price, AMD cards perform very well, but their best traits are just in different areas from Nvidia's. Right now Nvidia does much better in lighting and reflections, while AMD does much better in general rendering and textures.
I bounce around between AMD and Nvidia, and I just don’t see this inferiority. My previous 5700XT was a fantastic card for the money, and my 7900XT very often flat-out beats the more expensive 4070 Ti in rasterization.
People who don’t buy AMD, or owned one many generations ago, keep bringing up drivers, but AMD’s drivers have been great. Their Adrenalin application is way better than Nvidia's Control Panel.
AMD’s problem is they’re not selling them at a low enough price to take marketshare and mindshare away from Nvidia. Their cards have been great.
> Should AMD only make entry-level GPUs, I wonder what will happen to PlayStation. Will they build their own GPUs, go to Nvidia or Intel, or fuse lots of entry-level GPUs into one big one? Pretty interesting.

Sony already gets a semi-custom SoC from AMD, shopping from AMD's roadmap and adding items they care about (just for consoles, or for PC too — that depends on AMD) and/or redesigning some units (like the FPU in Zen 2, to lower power consumption and cut die size), as well as doing custom silicon work with AMD or other partners (see the I/O unit in the main SoC, the Tempest 3D Audio unit, and the custom SSD controller and NAND layout).
They literally are. Nvidia is sold out of AI hardware for the next 18 months.
> Intel will also focus on the mid-range, really.

Still a shame. Pathetic and weak if you ask me. Now I'm curious to find out what would be considered mid-range. 8800? 8700? 8800 XT? Oh yeah, the fun times are about to begin.
There's little to no return in focusing on the high end. It's for epeen wars.
> Curious to see how this will impact Nvidia’s strategy with Blackwell (50 series), especially when it comes to pricing.

"It's free real estate."
In one sense it kind of is. A lot of games are coming out, a lot of them are great games even.
On the other hand, performance of a ton of AAA titles is shit and GPU price/performance ratio is plain terrible.
But every other hardware component is a lot more reasonable these days. Even AM5 motherboards.
> This generation has barely started with 'next generation' graphics and game engines. PCs are going to need far more performance to brute-force their design inefficiencies versus consoles. And when more UE5 games come out you're going to need a beefy system to run them well. See Remnant 2 for example, and that doesn't even use Nanite or mesh shaders.

I am pretty confident people buy way more GPU than they need.
120 FPS for singleplayer? Why?
Raytracing on in most games? Why? Only a few make a difference.
Ultra settings? You don't actually notice the difference in 99% of cases.
xx80 tier GPUs for 1080p/1440p gaming? Seriously.
Not turning on FSR/DLSS when available? Craziness.
Granted, there have been shit ports that rely on brute force, which is a bad trend. I think the problem with PC gaming is FOMO, but you can have a great experience for $300-400 unless you absolutely need 4K, and even there I would not go higher than a 6950 XT. Raytracing is largely a FOMO gimmick until it becomes widely adopted and mainstream, where it actually makes a visible difference beyond puddle reflections (Cyberpunk 2077 and Dying Light 2 are notable instances where it is done well). I think we are at least two generations from where raytracing will actually be a big deal.
> This generation has barely started with 'next generation' graphics and game engines. PCs are going to need far more performance to brute-force their design inefficiencies versus consoles. And when more UE5 games come out you're going to need a beefy system to run them well. See Remnant 2 for example, and that doesn't even use Nanite or mesh shaders.

Is Remnant 2 the typical case for UE5? It seems like it was an unoptimized mess on a new engine. There are several engines that look much better and require far less.
> full APU-based discrete GPUs via PCIe

What drugs did you take before writing this?
> What drugs did you take before writing this?

The original CELL simulator devkits by Fixstars were that very solution. My hunch is that for AMD to continue winning government contracts, arrays of powerful, low-power SoCs interconnected with their Infinity Fabric will be their solution, so having a retail product helps with scale and cost savings.
People really have to stop buying into shady rumours so easily.
> And which games are available on PC but not playable on a PS5?

How about literally everything that isn't on PlayStation: Nintendo games, older Sega games, arcade games (both old and modern, if you know where to look), Xbox exclusives, etc.
> RDNA 4's top end

This had a CU count in the upper 200s, so we will see this in RDNA 5.
> Good timing, Keplar has weighed in.

The only interesting question is how many CUs N43 has, if N41 was going up to 200 CUs.
> The only interesting question is how many CUs N43 has, if N41 was going up to 200 CUs.

I don't see it going beyond 60 CUs honestly, although they are targeting the low to mid-range with RDNA 4, so who knows.
> The only interesting question is how many CUs N43 has, if N41 was going up to 200 CUs.

Probably not that many more. They will likely still be monolithic for these mid-range RDNA 4 cards.
> monolithic $2000+ 5090 confirmed

You'll have to have sawhorses to hold it up hanging out of the case.
> beyond 60 CU honestly

Yeah, it's looked like 9 SE / 270 CU for N41.
> The only interesting question is how many CUs N43 has, if N41 was going up to 200 CUs.

N43 is the monolithic, cheap one, so probably 40/32 CUs.