I'm still playing the waiting game to see what the next gen consoles will have before I buy a new card.
I wanted to buy an RTX 2060, but I don't know how future proof it will be with only 6GB of VRAM. By the way, how does this card perform with ray tracing in Metro Exodus?
What mobo do you have?
Change your CPU to a 3770(K) or 2600K; 4 more threads will give you a good upgrade in many games, and if you can OC, it's even better.
Buy an Nvidia GPU; a 1660 should work great. AMD drivers have high CPU overhead in DX11, and your 290 is killing that i5 you have.
This should work great until the NG consoles arrive; after that, who knows what CPU power will be needed to maintain 60fps in ports of PS5/X4 games.
I have not. Any other games like this I could give it a try on? Also, I'm not at stock; I'm at 4.2GHz.
Don't remember exactly, but it's a generic socket 1155 board. No overclocking possibilities.
Mainly I was wondering if I was CPU limited. I will probably do a new build after AMD refreshes their CPUs.
> RTX 2060 wouldn't be future proof even if it had 16GB of VRAM if you want ray tracing. You can max out Metro Exodus with it and get 1080p/60fps with ray tracing, but there isn't much headroom for future games. For ray tracing I'd say either wait for the next gen or pick up a 2080 Ti.

Yeah, the 2060 is very much the "it can do ray tracing" but "you probably shouldn't" card.
> Ubisoft open world games are heaviest on the CPU, and besides The Division they are DX11 only. I'm running mine at 4.4GHz with a 1070 (OC'd to 2100MHz on core).

See, this seems like too much of a generic assumption. Can you then explain why AMD cards crush Nvidia GPUs in Far Cry 5, which is DirectX 11? I mean, the Vega 64 performs at the level of a Titan X and near a 1080 Ti.
> Ray tracing would be just a bonus; most games don't support it anyway. I was thinking more about whether it will run future games reasonably well.

Probably not; the 2060 is already at the level in some games where sacrifices need to be made even at 1080p. You'd very much be buying into the mid-range for already-current titles and engine technology.
> Tests are usually done on the best CPUs on the market, and we are talking about an 8 year old CPU here. This is FC5 on a 2600k:
> And this on an OC'd 5960x:
> AMD cards are fine with fast CPUs, but with anything below Intel's 6 series (IDK about Broadwell) Nvidia cards will have better fps.

See, this is kind of disingenuous though, because you're trying to assess this with stock values from an 8 year old CPU, which basically no one who games on PC does. Even my 2600k at 4.2GHz isn't going to put me into some kind of wall, because it's well beyond the bottleneck threshold.
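A side note on the bottleneck argument above: whether a system is CPU- or GPU-limited comes down to whichever side takes longer per frame. A toy model makes the point (all the per-frame millisecond figures below are made up for illustration, not real benchmark numbers):

```python
# Toy model of the CPU-vs-GPU bottleneck debate above: the frame rate
# you actually see is capped by whichever side needs more time per frame.

def achievable_fps(cpu_frame_ms: float, gpu_frame_ms: float) -> float:
    """Frame rate is limited by the slower of the two per-frame costs."""
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# Hypothetical figures: a stock 2600k needing 12 ms of CPU work per frame
# vs the same chip OC'd needing 9 ms, paired with a GPU that renders a
# frame in 8 ms (125 fps if never CPU-bound).
stock = achievable_fps(cpu_frame_ms=12.0, gpu_frame_ms=8.0)  # CPU-bound
oced = achievable_fps(cpu_frame_ms=9.0, gpu_frame_ms=8.0)    # still CPU-bound

print(round(stock, 1))  # ~83.3 fps
print(round(oced, 1))   # ~111.1 fps
```

The takeaway either way: overclocking the CPU only helps up to the point where the GPU becomes the longer pole, which is why both sides of the argument can show benchmarks that support them.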
> Ray tracing would be just a bonus; most games don't support it anyway. I was thinking more about whether it will run future games reasonably well.

If you don't have big amounts of disposable income, it's better to buy a $300 card you can upgrade every 3 years instead of buying a $600+ card every six years.
> If AMD stepped up their game, yeah sure.

They can't step up their game when people go for nVidia even when AMD has the better product. Latest example: RX 570 vs GTX 1050 Ti. The RX 570 is at least $20 cheaper, at least 20% faster, and comes with two games. Yet Steam is littered with the 1050 Ti and no RX 570 to be seen (almost 10% for the 1050 Ti, and below half a percent for the RX 570)...
> i don't see amd pushing along technologies like s-sync

Ah yes. Another one of these oh-so-great technologies, simply because nVidia is doing it. There are many technologies now to reduce tearing. Another one is nice, but it really isn't that big of a deal.
How can one expect AMD to step up their game when no one is giving them income?
I guess Radeon Chill wasn't an innovation?
I'm not really impressed by the GTX 1660. You really are better off going with AMD cards, especially if you consider Vulkan and DX12 performance.
Follow this and you won't have trouble future proofing.
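The upgrade-cadence argument is worth a quick sanity check: on sticker price alone, $300 every 3 years and $600 every 6 years cost exactly the same per year, so the real advantage of the shorter cycle is resale value and a mid-cycle performance refresh. A small sketch (the prices and resale values here are made-up examples, not market data):

```python
# Sanity check on the "$300 every 3 years vs $600 every 6 years" advice.
# Pure sticker price per year is identical; resale on the shorter cycle
# is what tips the math. All figures are hypothetical.

def cost_per_year(price: float, years: int, resale: float = 0.0) -> float:
    """Net cost of ownership per year, selling the old card for `resale`."""
    return (price - resale) / years

highend = cost_per_year(price=600.0, years=6)            # 100.0 per year
midrange = cost_per_year(price=300.0, years=3)           # 100.0 per year

# A 3 year old card typically resells for a larger share of its value
# than a 6 year old one (assumed resale prices):
midrange_resale = cost_per_year(300.0, 3, resale=100.0)  # ~66.7 per year
highend_resale = cost_per_year(600.0, 6, resale=80.0)    # ~86.7 per year

print(highend, midrange, midrange_resale, highend_resale)
```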
> the thing is, s-sync isn't just another one, it's basically one step removed from gsync (as a software solution). it's a total game changer, especially for people who cannot afford gsync monitors or are still using perfectly capable 1080p 60Hz monitors

Why would anyone buy G-Sync monitors now with FreeSync around and nVidia supporting it? It's actually really hard to buy a non-potato-tier monitor that does not have at least a small FreeSync range, and it's definitely a superior solution to s-sync and other software solutions like Enhanced Sync or Fast Sync.
> It's an RTSS (?) thing where you banish the tearline to a less noticeable part of the screen, preferably in the display's non-visible redundancy. It's a bit of a ballache to get set up and varies between displays. I struggled with it on my 4K OLED, but that's because of the GPU usage I guess.

I am actually genuinely curious if one can achieve the same thing with Radeon Chill. Don't think so though.
> This is good advice, thank you, especially since I am looking for a card to play at 1440p/60fps at most. I'm gonna wait and see if Navi can deliver this at a better price than current NVidia cards.

Yeah, if AMD pulls it together it's going to help everybody by bringing much-needed competition, and it also helps that their midrange cards almost assuredly will have 8GB+.
Do you have a card to hold you over in the meantime?
As far as I'm aware, they're not.
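For the curious, the "banish the tearline" trick described above is ultimately timing arithmetic: if you know the panel's total and visible scanline counts and the current raster position, you can delay the buffer flip until the beam is inside the vertical blanking region, so the tear lands where nothing is drawn. A rough sketch of that arithmetic (the values are simulated; a real tool like RTSS polls the live scanline from the driver rather than assuming it):

```python
# Rough sketch of the scanline-sync idea: delay the present until the
# raster is inside the vertical blanking region, so any tear line lands
# where the panel isn't drawing. Simulated values only; a real
# implementation queries the current scanline from the driver.

def wait_before_present(current_line: int, total_lines: int,
                        visible_lines: int, refresh_hz: float) -> float:
    """Seconds to wait so the flip happens inside the blanking interval.

    Lines [0, visible_lines) are drawn on screen; lines
    [visible_lines, total_lines) are the vertical blanking region.
    """
    if current_line >= visible_lines:  # already in blanking: flip now
        return 0.0
    line_time = 1.0 / (refresh_hz * total_lines)  # seconds per scanline
    return (visible_lines - current_line) * line_time

# Standard 1080p60 timing has 1125 total lines, 1080 of them visible.
# If the beam is at line 540 (mid-screen), wait until it reaches 1080:
delay = wait_before_present(540, total_lines=1125, visible_lines=1080,
                            refresh_hz=60.0)
print(round(delay * 1000, 2), "ms")  # ~8.0 ms
```

This also hints at why it's "a bit of a ballache" per display: the total/visible line split varies with the monitor's timings, so the sweet spot has to be re-found on each setup.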
> It is important to note that the RX 570 was most likely not cheaper than the 1050 Ti at launch. The RX cards have suffered from fluctuating prices way above MSRP due to crypto, especially in 2017 and Q1-3 of 2018. The game bundles were also added to most of the RX cards later. So yes, the RX 570 would be a better deal than the 1050 Ti under the right conditions.

The RX 570 has been under $150 since September of last year. That's over 7 months now. There is no mass adoption like we see with nVidia cards, even though the card doesn't really have any true competition within its price range. How long did it take for everyone to have the GTX 970?
If you look across the various price points, AMD loses flat out. The 1660 and 1660 Ti have pretty much destroyed AMD's RX 5xx in performance. Vega 56 needs a MAAAAASSIVE price cut to compete against the RTX cards, and Vega 64 is almost dead in the water at the premium high end now.
AMD's main problem is mind share. At this point I really think they are beyond saving. They can bring a card performing like the RTX 2080 at $400 and people will find an excuse not to get it. Enthusiasts will get that card, but the masses will not. It has been clear for many years that the goal post shifts every time. When AMD has the performance, power consumption matters. When AMD has the better power consumption, performance matters more. When AMD has better power consumption and better performance, drivers matter. When AMD has the better drivers and software, that's no longer that important, but price is. And so on and so on.
Gamers have not rewarded AMD in any way for their products. Even after the whole 3.5GB issue with the GTX 970 came to light, people still kept buying it instead of the R9 390, which had 8GB and didn't really underperform. But ah yes, power consumption.
The one that made me cringe the most recently was that many people were bashing FreeSync for years, because it's oh so inferior to G-Sync. But when nVidia announced that they will support FreeSync monitors too, suddenly nVidia gets a bunch of praise for doing so. How people don't see the double standard is beyond me.
And reviews don't help either, especially recently. It's all tested at default settings. When was the last time someone tested a new AMD card comparing power efficiency mode vs normal mode, which is a simple toggle in their software? When was the last time someone tested a new card with the Chill feature enabled? Basically, every review out there tests the card based on its worst possible performance-per-watt ratio. They are happy to test DLSS and the like on nVidia though. The review system has basically become a way to support online e-peen battles, where people can brag about the card they got because it's benchmarked to be faster than another card. Whoopdeedoo. What good is that if you don't take all features into account?
People keep saying that AMD needs to step up their game and bring competition to the gaming space. They are right, but how can they do that with no income from the graphics division? Sure, their CPUs are doing well now. But they have debts to pay off, and of whatever profits they get from CPUs, only a marginal part can go into graphics. And the most important question is: why would they invest it into gaming graphics? Miners bought more cards in a year than gamers have in a very long time. Not to mention, when they did release respectable cards, like the RX 480, people flocked to the cheaper nVidia cards instead. People prefer to get a GTX 1060 3GB, which is technically not even a GTX 1060, rather than a 4GB RX 470/570/480/580. That, to me, is beyond comprehension.
When nVidia brings out overpriced cards, they are justified in doing so because features, or performance.
When AMD brings out equivalently performing & priced cards (Hello Radeon VII), they are overpriced because they failed to help lower nVidia prices. And oh yes, don't forget power consumption. We will forget though, if AMD ever becomes more efficient, just like that neat little fact is still swept under the rug when it comes to Ryzen vs Intel CPUs. Because it's all about performance.... Preferably, single threaded performance....
Yes. People want good AMD cards so they can buy cheaper nVidia. And obviously that's not going to work. The current mentality is not going to change the gaming space. Blaming AMD is easy. But maybe, just maybe, it's not only AMD that is to blame.
> Maybe, just maybe, you can blame them for not having as good marketing as Nvidia. I think the aggressive game Nvidia played all along has helped them establish themselves as the most "powerful" player, at least with the average consumer.

I never said AMD has no fault. They definitely do, and their marketing is indeed very weak. But when they have little to no fault, they are still punished. nVidia isn't. Not really. RTX is the first time in a long time where they actually felt something in their pockets.
> Yes, the RX 570 and most of AMD's offerings have had stable prices since late last year. My point is, it certainly didn't help their sales that the prices were way off MSRP at launch. The entire RX 500 series as well as the RX 480 was basically impossible to buy at launch, and anyone who needed a mid-range graphics card back then would have had no choice but to turn to the 1060.

That is all true. But if the reverse were true, people would wait for prices to drop to get nVidia. They wouldn't flock to AMD instead. So that's a contributing factor. Not exactly the real issue.
> @marquimvfs brings a very relevant point to this issue as well: AMD's presence in the GPU scene is barely noticeable by the casual and average gamer.

That is exactly my point with my wall of text. But it's not because of a lack of trying. Or do you think it is?
> I tried suggesting a Vega 56 to a friend who had issues with his 1060, and trust me, merely bringing up AMD was enough to kill the suggestion.

Why?
> But moving forward it is going to be interesting to see how much AMD is willing to gamble on the GPU market. I do really hope they keep battling Nvidia and provide better products, but I won't be surprised at all if they decide to "abandon ship", so to speak, and focus on Ryzen. Like you mentioned, it is tough to expect results when you don't see your efforts being rewarded.

Indeed... And now that Intel is going to enter the GPU space... Maybe that will bring in some competition in the long term while AMD decides to exit the gaming space. It will depend on how well Navi and Arcturus do.
> Because said friend of mine had really only ever seen the green logo flash across most games he's played and heard the term "GTX" mentioned every now and then. Mention AMD and most people would stop to think for a second about what AMD actually plies their trade in. It's crazy because AMD is HUUUUUUGE, but doesn't seem to elicit a response from most casuals.

Yeah... And that's a hard situation to turn around. AMD has been increasing the number of games they sponsor, but as of now the effect seems to be (near) zero. Obviously, as long as nVidia is making money they will not change what they are doing. The funny thing is that both Sony and Microsoft repeatedly choose AMD over nVidia for their consoles. Apple recently chose AMD too for many of their products, and lastly, Google also chose AMD for Stadia.
It makes more sense to go for the 2080 over the 2070. The gap isn't really that big between the two cards, and you are better off handing over the little extra for the extra performance. Granted, realistically you're better off just skipping a gen or waiting for Navi. I have a 2080, and ray tracing hurts my performance so much that I don't bother. Also, the 2080 barely holds up in most games at 1440p (the resolution I game at). Maybe I tied too much of my expectations to the price point rather than the actual performance. Granted, I got a great deal on my rig, so no actual complaints. I really cannot see a justification for a 2060... not at that price.
Another example: I have been playing The Division 2 (AMD logo thrown in my face every boot). Even the Vega 64 blows my GPU out of the water. It really feels like AMD has the potential to be so much more (I know this game was developed more with AMD in mind). The image quality is actually better in many games (even ones that perform worse). Then there is DLSS, which is a blurry nightmare and useless. A recent patch was released for BF5 that improved DLSS, so I tried it out. The sharpness is better up close, but it's still too blurry when shooting a target in the distance. DLSS is still useless.