
AMD Crimson ReLive Driver Leaked Slides

Helznicht

Member
These charts are all kinds of confusing. So is an RX 480 8GB a worthy upgrade from a GTX 970 from a performance perspective?
 
This looks great! Glad to see new recording software for AMD GPUs!

These charts are all kinds of confusing. So is an RX 480 8GB a worthy upgrade from a GTX 970 from a performance perspective?

No, not really, unless you consider 5-15% performance gains a worthy upgrade. In most cases it's within 10% of a GTX 970, with somewhat bigger gains in some DX12 games.

The GTX 970 also has ridiculous overclocking headroom; an overclocked card can sit on par with a stock GTX 980 or a little above it, which recoups the performance difference between the GTX 970 and the RX 480.

Here's a review from Digital Foundry: Nvidia GeForce GTX 1060 review

The RAM boost is great, but it's probably better to get a stronger card like a GTX 1070, or an AMD equivalent once they release their higher-end GPUs.
 

sirap

Member
This might be a good place to ask. I got a 1060 a couple of months ago but I'm considering getting a 480 since that supports freesync and my monitor does too. Would it be worth it?

I was in your position a few months ago. Freesync is pretty awesome especially on my 43" 4K monitor. Drivers have matured to the point where both cards have pretty similar performance so if you can get a 1:1 swap just do it.

Still, you might want to wait till the end of the year and see what Vega is (assuming they do announce it this month)
 

horkrux

Member
Bar chart magic!

[Slide: Radeon Software Crimson ReLive, NDA-only confidential deck, page 40]

Learning from the best /s

 

NeoRaider

Member
I guess my 280X is all but abandoned now. Sucks.

Yep. I have the same card and the same feeling. F*cking great.

Maybe they will do something about it later? I doubt it, though. Our cards are considered "old" now, and when you think about it, the 280 and 280X really are old, since they're rebranded 7970s.

I remember my 280X not supporting VSR when AMD released it, even though some less powerful GPUs did, and I felt bad about it. AMD did later add VSR support for the 280X, 1440p only, but that's better than nothing I guess.
 

jett

D-Member
Yeah I guess there are driver updates, although I'm suspicious of them to be honest. With my older card I swear it kept getting slower with each update. :p I don't think I've updated my drivers since Doom.

And I'm not too pissed, my 280X has served me well for almost three years, but I admit I didn't do my homework and didn't know it was a rebranded 7970 from 2011 when I bought it.
 

NeOak

Member
Yeah I guess there are driver updates, although I'm suspicious of them to be honest. With my older card I swear it kept getting slower with each update. :p I don't think I've updated my drivers since Doom.

And I'm not too pissed, my 280X has served me well for almost three years, but I admit I didn't do my homework and didn't know it was a rebranded 7970 from 2011 when I bought it.

AMD doesn't gimp the performance of older cards.
 

Easy_D

never left the stone age
I guess my 280X is all but abandoned now. Sucks.

Only for the Wattman stuff, apparently, not a huge deal unless you're into GPU oc'ing

Maybe they will do something about it later? I doubt it, though. Our cards are considered "old" now, and when you think about it, the 280 and 280X really are old, since they're rebranded 7970s.

I remember my 280X not supporting VSR when AMD released it, even though some less powerful GPUs did, and I felt bad about it. AMD did later add VSR support for the 280X, 1440p only, but that's better than nothing I guess.

Not like you want to push the 280X beyond 1440P in most cases anyway.
 

thelastword

Banned
They only gimp them when they're new.

It also helps that AMD used the same architecture for several generations and did a lot of re-badging.
Good strategy; at least they can improve things later on, and there's always an incentive to buy AMD cards with that approach. TBH, I just think they have a vested interest in improving their drivers and software, since that was always their weakest attribute, never their hardware. So FPS gains from better drivers are pretty much guaranteed.
 

jett

D-Member
Only for the Wattman stuff, apparently, not a huge deal unless you're into GPU oc'ing



Not like you want to push the 280X beyond 1440P in most cases anyway.

It actually performs quite nicely at 1440p and 1800p in older games using GeDoSaTo. I even tried 4K on Valkyria Chronicles but I couldn't hold 60fps there.
 

Zemm

Member
This might be a good place to ask. I got a 1060 a couple of months ago but I'm considering getting a 480 since that supports freesync and my monitor does too. Would it be worth it?

Dude, yes (assuming you can swap them). That's the combo I have and it's amazing. Screen tearing is my #1 pet peeve, but I also hate the input lag vsync brings, so this combo is literally perfect. Titanfall at max settings and 100+ fps with no tearing and no input lag is incredibly smooth. Dunno how I'm going to go back to some console games honestly (and some crap PC ports).
 

Kayant

Member
AMD bringing that heat! Really though, their answer to Shadowplay sounds great.

Radeon Technologies Group has been doing great things since it was formed.
 

Easy_D

never left the stone age
It actually performs quite nicely at 1440p and 1800p in older games using GeDoSaTo. I even tried 4K on Valkyria Chronicles but I couldn't hold 60fps there.

Oh really? Well shame on AMD for restricting it for no reason. I knew 1440p was no issue for 360 era ports, but 1800p as well huh?
 

NeOak

Member
Oh really? Well shame on AMD for restricting it for no reason. I knew 1440p was no issue for 360 era ports, but 1800p as well huh?

I'm sure the lack of hardware for it in the GCN 1.0 chip had nothing to do with it. Yeah.
 

Easy_D

never left the stone age
I'm sure the lack of hardware for it in the GCN 1.0 chip had nothing to do with it. Yeah.

If you can downsample with GeDoSaTo there's literally zero reason AMD couldn't have let you customize the downsampling resolutions instead of restricting it to 1440p. Not talking about Wattman here.
 

tuxfool

Banned
If you can downsample with GeDoSaTo there's literally zero reason AMD couldn't have let you customize the downsampling resolutions instead of restricting it to 1440p. Not talking about Wattman here.

The method AMD uses to implement downsampling is different. AMD does it in the hardware scaler, so it's pretty limited compared to GeDoSaTo.
 

Locuza

Member
Oh, I see, thanks for the clarification.
Your point still holds up.
Nothing forced AMD to solve VSR through the display engine; they could have done it with a software solution like GeDoSaTo or Nvidia's DSR.
They still should, because the display engine solution is too restrictive: it doesn't support a wide range of products, doesn't support a wide range of aspect ratios, and doesn't support different or more advanced downsampling algorithms.

Nvidia could move their butt too; they started with a bad Gaussian filter and are still using it.
So much potential left untapped.
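
For what it's worth, the principle behind a software downsampler is simple: render at a higher resolution, then filter blocks of pixels down to the output resolution. Here's a toy box-filter sketch in Python, purely to illustrate that idea; the image format and the downsample() helper are my own invention and have nothing to do with AMD's or Nvidia's actual filters.

Code:
# Toy box-filter downsample: average each factor x factor block of pixels.
# Purely illustrative of the principle behind software downsampling;
# real tools (GeDoSaTo, DSR) use more sophisticated filters.

def downsample(image, factor):
    """image: list of rows, each row a list of (r, g, b) tuples.
    Returns the image reduced by the integer factor using a box filter."""
    out_h = len(image) // factor
    out_w = len(image[0]) // factor
    result = []
    for oy in range(out_h):
        row = []
        for ox in range(out_w):
            # Average the factor x factor block of source pixels.
            r = g = b = 0
            for dy in range(factor):
                for dx in range(factor):
                    pr, pg, pb = image[oy * factor + dy][ox * factor + dx]
                    r += pr; g += pg; b += pb
            n = factor * factor
            row.append((r // n, g // n, b // n))
        result.append(row)
    return result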
 

Easy_D

never left the stone age
Yeah, no Wattman for the 280X was the entire point of my post actually, since I own one myself. The fine print for Chill only specifies GCN and GCN Polaris, so it should come to 1.0 cards as well.

If you look at the Wattman slide, the fine print lists the specific cards that get it, which implies that Chill is for everyone, which is cool :)
 

abracadaver

Member
Chill sounds really good.

I don't care about the power consumption but it reduces frametimes as well?

[Radeon Software Crimson ReLive slide]



Only in selected games though and not in DX12/Vulkan yet

[Radeon Software Crimson ReLive slide: Chill supported games list]


Is that CS:GO or CS 1.6?
 

riflen

Member
Chill sounds really good.

I don't care about the power consumption but it reduces frametimes as well?
..

I don't see how it can reduce the time taken for the GPU to render a frame, which is what we mean when we talk about frame times, unless they somehow reduce the scene complexity. Here they seem to be talking about time to display the frame. To me it looks like they dynamically alter the DirectX render queue length. This would also explain why support is on a per game basis, as messing with the queue length may not work well with all games.

Having a deeper queue keeps the GPU busier. Perhaps busier than it needs to be, if you're trying to keep power and heat down to a minimum. Another problem with a deep queue is latency. Say you're on a 60 Hz display with Vsync enabled and a 3 frame queue. You only need a new frame every 16.7ms. Let's say you start producing new frames every 12.5ms (80fps). The extra frames that don't need to be displayed yet are written to a queue and shown at the next 16.7ms interval. With a 3 frame queue, you're effectively seeing the past on your display, as far as the game engine is concerned. Game input from your controller is being processed every 12.5ms, but you're seeing the results of those inputs 3 frames later on, at 16.7ms intervals. This is what results in floaty, laggy control response sometimes.
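
To put rough numbers on that, here's my own back-of-the-envelope arithmetic using the figures above (nothing from AMD, just the example scenario):

Code:
# Back-of-the-envelope arithmetic for the example above (my own numbers, not AMD's).
refresh_interval_ms = 1000.0 / 60   # 60 Hz display -> a new frame needed every ~16.7 ms
render_interval_ms = 1000.0 / 80    # GPU producing frames at 80 fps -> one every 12.5 ms
queue_depth = 3                     # frames waiting in the render queue

# Each queued frame is shown one refresh later, so an input rendered now
# appears on screen roughly queue_depth refreshes after it was processed.
display_lag_ms = queue_depth * refresh_interval_ms
print(f"~{display_lag_ms:.0f} ms between processing an input and seeing it")  # ~50 ms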

So, if you're not moving your character, or looking at the floor, perhaps they reduce the queue to 1 frame instead of 3. This would reduce your GPU usage a little (it's told not to queue up frames), making these savings they're talking about.
The compromise is that, because frames are delivered "on demand", so to speak, you're vulnerable to an unexpected drop in performance. If you can't meet the 16.7ms interval, you haven't got a frame ready in your buffer to keep things smooth, so the last frame might be repeated. This will look like stutter/hitching/judder. A smaller queue will reduce the latency of your controls, which is what AMD could be showing here. Just my guesswork.

TL;DR - Chill could be the driver dynamically controlling the length of the DirectX render queue.

EDIT: Apparently there was no need for speculation, as AMD acquired this software recently: http://www.hialgo.com/TechnologyCHILL.html

So, it's a dynamic frame rate limiter.
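
If that's roughly what it does, a dynamic frame rate limiter is conceptually simple: lower the frame cap while input is idle and lift it as soon as the player acts. Here's a minimal sketch in Python of that general idea; the caps, the idle timeout, and the poll_input()/render_frame() callbacks are all hypothetical, and this is not AMD's or HiAlgo's actual code.

Code:
import time

# Minimal sketch of a dynamic frame rate limiter -- purely illustrative,
# not AMD's or HiAlgo's implementation. Caps and callbacks are hypothetical.

ACTIVE_CAP_FPS = 144.0   # cap while the player is actively providing input
IDLE_CAP_FPS = 40.0      # lower cap once input has gone quiet
IDLE_TIMEOUT_S = 0.5     # how long without input before "chilling"

def target_interval(seconds_since_input):
    """Pick a frame interval based on how recently the player gave input."""
    cap = ACTIVE_CAP_FPS if seconds_since_input < IDLE_TIMEOUT_S else IDLE_CAP_FPS
    return 1.0 / cap

def game_loop(poll_input, render_frame):
    """poll_input() returns True if there was input this frame; render_frame() draws one frame."""
    last_input_time = time.perf_counter()
    while True:
        frame_start = time.perf_counter()
        if poll_input():
            last_input_time = frame_start
        render_frame()
        # Sleep off the rest of the target interval so the GPU isn't
        # asked to queue up frames nobody needs yet.
        remaining = target_interval(frame_start - last_input_time) - (time.perf_counter() - frame_start)
        if remaining > 0:
            time.sleep(remaining)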
 

Easy_D

never left the stone age
Chill prevents underclocking? So I could theoretically play Knights of the Old Republic at something that isn't 30 FPS now? Because no matter what I've tried, I haven't had any luck getting my GPU to run at its full clocks in that game; it's all "Nah, gonna idle while you play this". It's annoying having to "OC" my GPU every time I want to play it.

Edit

If you can't meet the 16.7ms interval, you haven't got a frame ready in your buffer to keep things smooth, so the last frame might be repeated. This will look like stutter/hitching/judder
Ew. That makes zero sense next to the claim on the website: "CHILL is for you -- if you experience performance drops while playing." Repeated-frame stutters are nasty.
 

RootCause

Member
I was in your position a few months ago. Freesync is pretty awesome especially on my 43" 4K monitor. Drivers have matured to the point where both cards have pretty similar performance so if you can get a 1:1 swap just do it.

Still, you might want to wait till the end of the year and see what Vega is (assuming they do announce it this month)
Can I get a link to that monitor?
 

chaosblade

Unconfirmed Member
This sounds pretty great. A Shadowplay alternative and borderless freesync were two of my most wanted features. Just need an integer scaling option now.
 

riflen

Member
Ew. That makes zero sense next to the claim on the website: "CHILL is for you -- if you experience performance drops while playing." Repeated-frame stutters are nasty.

That's because it's marketing garbage. The performance drops they're talking about are specifically if your GPU is throttling due to heat.
 