
Poor AMD performance with DOOM.

tuxfool

Banned
So the game still runs like shit on 280X, at least a fellow on reddit with a 7870Ghz edition (which is pretty much the same card) has dips to 30 FPS on Medium settings, shadows on low. At one point he even dipped down to 24 :lol.

You mean the 7970Ghz.
 

Instro

Member
So the game still runs like shit on 280X, at least a fellow on reddit with a 7870Ghz edition (which is pretty much the same card) has dips to 30 FPS on Medium settings, shadows on low. At one point he even dipped down to 24 :lol.

God dammit, AMD fix your shit. There's no fucking reason a 280X should dip to 30 FPS constantly on Medium preset when the 960 can pull 30 minimum on the highest possible settings. It just ain't right.

I believe the 280x is the equivalent of the 7970ghz.
 
As a little overview I really like this twitter feed:
[embedded tweet screenshot]

Tiago Sousa is an ex-Crytek guy, their former "Lead Graphics R&D Engineer" apparently. He's id's new engine man, filling the big shoes of John Carmack. So far he's done a bang-up job retrofitting id Tech 5 into the id Tech 6 seen in DOOM: it runs great with excellent frametimes and a smooth framerate, has very nice graphical quality, and handles both indoor and outdoor areas very well. I've now played through most of Doom and I'm optimistic that id can continue their engine quality with this man at the helm.
 

garbagejuicer

Neo Member
290 @ 1130/1325 / FX-9590 here.

Just did a real quick-and-dirty test in the SP Foundry level. Went to a couple of spots on the map and wrote down the FPS. I realize this tells nowhere near the whole story, but figured someone else with a similar setup might find the results interesting.

1080p Ultra Preset, 8x TSSAA

16.5.2
Location A: 60
Location B: 113

16.5.2.1
Location A: 92
Location B: 131
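For what it's worth, the relative gains between those two drivers work out like this (a quick illustrative sketch; the FPS numbers are taken from the test above, and the `pct_gain` helper is just for this example):

```python
# Percent FPS gain per location between drivers 16.5.2 and 16.5.2.1,
# using the spot-check numbers from the post above.

def pct_gain(before: float, after: float) -> float:
    """Percent improvement going from 'before' FPS to 'after' FPS."""
    return (after - before) / before * 100

results = {
    "Location A": (60, 92),
    "Location B": (113, 131),
}

for loc, (old, new) in results.items():
    # Location A: ~53% faster, Location B: ~16% faster
    print(f"{loc}: {pct_gain(old, new):.0f}% faster")
```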
 

Shaneus

Member
Today's.



They tried a rewrite sometime around 2003-2004. I'm not sure exactly what they got out of it, but they lost at least a year in terms of OpenGL support in the process.



Yes.
Cool, thanks for letting me know! I tend to be a bit behind in some of these GPU development shenanigans.
 

Renekton

Member
Tiago Sousa is an ex-Crytek guy, their former "Lead Graphics R&D Engineer" apparently. He's id's new engine man, filling the big shoes of John Carmack. So far he's done a bang-up job retrofitting id Tech 5 into the id Tech 6 seen in DOOM: it runs great with excellent frametimes and a smooth framerate, has very nice graphical quality, and handles both indoor and outdoor areas very well. I've now played through most of Doom and I'm optimistic that id can continue their engine quality with this man at the helm.
So far he's added PBR, dynamic lighting, and uncapped FPS? I always thought id Tech 5 was allergic to those.
 

Bronion

Member
R9 290
i5-4460 @ 3.2 GHz

Still dropping to a rock-solid 30fps in the first room even on Low preset with V-sync on.

Could it be my CPU? It's not that bad, is it?
 

DoctorDonkey

Neo Member
280X here, with 16GB DDR4 and a 6700K. I did a test before updating to the latest beta drivers. The room where you start the game is the most taxing area I've seen, so I stood in the corner and stared at the table.

Before: 28-33 fps
After: 67-72 fps
 
As Durante has pointed out, AMD is over a year behind the release of the OpenGL 4.5 spec. AMD's customers are seeing poor OpenGL support for id's latest game. AMD's customers saw poor OpenGL support for id's previous game, and the one before that. Luckily Vulkan may break the cycle, because it's pretty clear AMD won't at this point.

It's amazing that in some people's world, Nvidia actually making an effort to provide good OpenGL support means that AMD gets to pass the buck for their poor support to an unrelated developer.

I don't think AMD is really interested in devoting a lot of time and resources to OpenGL anymore. I think from their point of view, as long as it is stable and gets passable performance in most applications, they will be quite happy with that outcome. They really don't seem to bend over backwards for the API unless there's a huge release like Doom that they have to deal with. It is a shame that their OpenGL performance has always been a little lacklustre; I use Linux on a regular basis and this is the main reason why I always go with Nvidia.

I would imagine that they are going to be putting more of their focus on Vulkan going forward.
 

Henrar

Member
It's amazing how far people will go to blame a company's poor support of a standard on politics.

What is the argument really, that AMD's OpenGL driver doesn't suck? Well, I've worked with various OpenGL implementations but don't take my word for it -- you don't need to.
Or is it that it's NVidia's fault or id's fault that AMD's OpenGL driver sucks?
Can you explain how the beta, which reported a newer version of OpenGL (4.5 vs. 4.3 in the final build), ran better on AMD hardware while the final release performs so poorly? I'm curious.
 
Can you explain how the beta, which reported a newer version of OpenGL (4.5 vs. 4.3 in the final build), ran better on AMD hardware while the final release performs so poorly? I'm curious.

Durante doesn't need to explain it, the devs already have. The Beta was on medium settings for all users and it's apparently the higher settings which make use of effects and such that are problematic for AMD hardware with OpenGL. See the tweets posted earlier ITT.
 
Can you explain how the beta, which reported a newer version of OpenGL (4.5 vs. 4.3 in the final build), ran better on AMD hardware while the final release performs so poorly? I'm curious.

The AMD driver dev on twitter stated in particular that the higher shadows were a major reason why AMD cards were crapping themselves at higher settings. The issues were very clearly in AMD's implementation of OpenGL and AMD was able to quickly fix at least some of the major performance issues post-launch of DooM. The real question you need to ask is why AMD didn't fix it sooner, as they obviously would have had the opportunity to get a look at the game pre-launch and see how the game performed beforehand. But AMD being AMD probably didn't give enough of a shit until all the bad PR hit about Ultra settings shitting the bed on Radeons.

I know everyone's quick to blame Nvidia anytime an AMD card performs poorly in a game and sometimes that blame is justifiable. Sometimes you just gotta deal with AMD just not being good at writing drivers, their DX11 drivers were always subpar in any game that was CPU limited and their OGL drivers have always, always been a shitshow.
 

LCGeek

formerly sane
The AMD driver dev on twitter stated in particular that the higher shadows were a major reason why AMD cards were crapping themselves at higher settings. The issues were very clearly in AMD's implementation of OpenGL and AMD was able to quickly fix at least some of the major performance issues post-launch of DooM. The real question you need to ask is why AMD didn't fix it sooner, as they obviously would have had the opportunity to get a look at the game pre-launch and see how the game performed beforehand. But AMD being AMD probably didn't give enough of a shit until all the bad PR hit about Ultra settings shitting the bed on Radeons.

I know everyone's quick to blame Nvidia anytime an AMD card performs poorly in a game and sometimes that blame is justifiable. Sometimes you just gotta deal with AMD just not being good at writing drivers, their DX11 drivers were always subpar in any game that was CPU limited and their OGL drivers have always, always been a shitshow.

Been using AMD/ATI products since the Rage. No, they weren't always shit; back in the 9700 Pro days the cards were fine for plenty of OGL-heavy games. What's sad about AMD/ATI's drivers is that they chose to become shit and ignore games/benchmarks.
 
Been using AMD/ATI products since the Rage. No, they weren't always shit; back in the 9700 Pro days the cards were fine for plenty of OGL-heavy games. What's sad about AMD/ATI's drivers is that they chose to become shit and ignore games/benchmarks.

I remember vividly back when Doom 3 came out and internet forums were having the exact same arguments that we're having right now. Back then, Radeons were pretty much stomping GeForce cards in performance... except in Doom 3. Looking back at the benchmarks now, the pattern even mimics the performance curve we see today: Radeons performed well on medium settings, then dropped off a cliff on the higher settings. I was a Radeon owner myself in those times and I remember spending A LOT of time tweaking Doom 3 to the limits to try to maximize the FPS in that game.
 

Easy_D

never left the stone age
I remember vividly back when Doom 3 came out and internet forums were having the exact same arguments that we're having right now. Back then, Radeons were pretty much stomping GeForce cards in performance... except in Doom 3. Looking back at the benchmarks now, the pattern even mimics the performance curve we see today: Radeons performed well on medium settings, then dropped off a cliff on the higher settings. I was a Radeon owner myself in those times and I remember spending A LOT of time tweaking Doom 3 to the limits to try to maximize the FPS in that game.

I barely bothered with Doom 3 until I got an 8800 GT. The one I had before that was a Radeon 9600 and that card was, well, not Doom 3 friendly :lol
 
So far he's added PBR, dynamic lighting, and uncapped FPS? I always thought id Tech 5 was allergic to those.

Well, I get the impression he's already rewritten large parts of the engine. Some of the stuff we saw in CryEngine has found its way into id Tech 6; that's probably not coincidental.
 

Henrar

Member
Durante doesn't need to explain it, the devs already have. The Beta was on medium settings for all users and it's apparently the higher settings which make use of effects and such that are problematic for AMD hardware with OpenGL. See the tweets posted earlier ITT.

I'm interested in his opinion as he's experienced with rendering and graphics APIs. Thanks for the answers, to both you and Beautiful Ninja.
 
I remember vividly back when Doom 3 came out and internet forums were having the exact same arguments that we're having right now. Back then, Radeons were pretty much stomping GeForce cards in performance... except in Doom 3. Looking back at the benchmarks now, the pattern even mimics the performance curve we see today: Radeons performed well on medium settings, then dropped off a cliff on the higher settings. I was a Radeon owner myself in those times and I remember spending A LOT of time tweaking Doom 3 to the limits to try to maximize the FPS in that game.

Yeah, I had a Radeon 9800 Pro and Doom 3 was the first game that made my GPU feel hopelessly outdated. It just wasn't good enough for Doom 3 on High settings and nowhere near good enough for Ultra.

I never bothered to check how it performed on Nvidia back in the day though, I just know it ran like SHIT on my GPU.

Meanwhile, games like BF1942, Half-Life 2 (on High settings; Ultra got about 30-45 FPS), NFS Most Wanted (an early Xbox 360 game), etc. all ran brilliantly. The 9800 Pro was a total beast in its time and for at least 1.5-2 years after release.
 

Novum

Neo Member
Well, I get the impression he's already rewritten large parts of the engine. Some of the stuff we saw in CryEngine has found its way into id Tech 6; that's probably not coincidental.
The credits list *seven* additional senior engine/rendering engineers and an additional lead programmer besides Tiago. That's not how any of this works. Nobody in this industry writes anything by himself anymore.
 

drotahorror

Member
I seem to have around the same FPS on low or high with a 7950 3GB, 8GB RAM, i5 2500: around 30-55 FPS, mostly in the middle of that range. It goes down when a monster pops up, and sat around 35-45 in the first big room of the first level.

Definitely time to upgrade, this isn't playable to me. Can't wait till it is though, it seemed fun.
 
The credits list *seven* additional senior engine/rendering engineers and an additional lead programmer besides Tiago. That's not how any of this works. Nobody in this industry writes anything by himself anymore.

More nonsense from Unknown Soldier as usual.

Has the Crimson hotfix resolved most of the issues for 290/X/390/X owners? Just saw a comparison bench on another site.
 

Adry9

Member
I seem to have around the same FPS on low or high with a 7950 3GB, 8GB RAM, i5 2500: around 30-55 FPS, mostly in the middle of that range. It goes down when a monster pops up, and sat around 35-45 in the first big room of the first level.

Definitely time to upgrade, this isn't playable to me. Can't wait till it is though, it seemed fun.
Thanks for posting your experience, I'll hold off from buying the game until it's fixed.
 

Easy_D

never left the stone age
280X here, with 16GB DDR4 and a 6700K. I did a test before updating to the latest beta drivers. The room where you start the game is the most taxing area I've seen, so I stood in the corner and stared at the table.

Before: 28-33 fps
After: 67-72 fps

That's, what, like a 120% performance increase? Hot diggity damn. What preset are you using?
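That ballpark figure checks out if you compare the midpoints of the two FPS ranges (a rough sanity-check sketch; the numbers come from the quoted post, and the `midpoint` helper is just for illustration):

```python
# Rough check of the "~120% performance increase" claim using the
# midpoints of the before/after FPS ranges quoted above.

def midpoint(lo: float, hi: float) -> float:
    """Midpoint of an FPS range."""
    return (lo + hi) / 2

before = midpoint(28, 33)   # 30.5 fps average before the beta driver
after = midpoint(67, 72)    # 69.5 fps average after

gain = (after - before) / before * 100
print(f"{gain:.0f}% increase")  # ~128%, so "like 120%" is in the right ballpark
```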
 

drotahorror

Member
Thanks for posting your experience, I'll hold off from buying the game until it's fixed.

Alright, I jumped the gun a bit. I got the hotfix driver, which I didn't think would do anything, but I guess it did. I have everything on medium except reflections and particles on low, and in the first area I get about 50-60 FPS in the big lab area, playable for sure (vsync on), and in the next level I'm getting 60 with no dips at all so far.

*Just turned vsync off and I'm getting 70-ish FPS in the new area and it feels fine to play (sometimes certain games running over my refresh rate feel really janky, so I have to use vsync). Doom feels good w/o vsync though.

7950 - latest hotfix driver
8gb ram
i5 2500
ssd
 

Adry9

Member
Alright, I jumped the gun a bit. I got the hotfix driver, which I didn't think would do anything, but I guess it did. I have everything on medium except reflections and particles on low, and in the first area I get about 50-60 FPS in the big lab area, playable for sure (vsync on), and in the next level I'm getting 60 with no dips at all so far.

*Just turned vsync off and I'm getting 70-ish FPS in the new area and it feels fine to play (sometimes certain games running over my refresh rate feel really janky, so I have to use vsync). Doom feels good w/o vsync though.

7950 - latest hotfix driver
8gb ram
i5 2500
ssd

OK, that's better. Medium/low settings though; what are the consoles running at, approximately?
 

DSix

Banned
So they only fixed the 390 issues and won't acknowledge that even on medium/low, the performance of a 270X is nowhere near the equivalent GTX 960.

The 390 and high settings are not the only things with huge issues.
 

DieH@rd

Banned
So they only fixed the 390 issues and won't acknowledge that even on medium/low, the performance of a 270X is nowhere near the equivalent GTX 960.

The 390 and high settings are not the only things with huge issues.

There are other reports of great performance increases with the latest beta driver. See post #412.
 

-edm

Neo Member
Playing on an R9 290 and a 4690K OC'd to 4.5, I'm getting nothing less than 75+ FPS with everything on ultra at 1080p with the newest drivers. The game ran great before as well, but it's even better now.
 

DSix

Banned
There are other reports of great performance increases with the latest beta driver. See post #412.

Alright, I installed the driver, and it's still just as bad. I still can't hit a reliable 60 FPS at 1080p no matter how low my settings go. This is unacceptable; a few weeks ago I was having a grand time running DS3 at a rock-solid 60 FPS, and DS3 looks ten times better than whatever I'm looking at with Doom on Low.

It's a fucking mess.
 

DieH@rd

Banned
Well, you have a 270X, which is near the lowest spec. It's no wonder you cannot hit 60 FPS on any visual setting.

And there's no point comparing this with other games. Every game is different.
 

DSix

Banned
Well, you have a 270X, which is near the lowest spec. It's no wonder you cannot hit 60 FPS on any visual setting.

And there's no point comparing this with other games. Every game is different.

Are you joking? It's equivalent to a GTX 960, and there's no reason I should be at half the framerate in a game that doesn't look that good to begin with.
 

DieH@rd

Banned
I am not joking. The 7870 is the minimum-spec Radeon for this game, and the 270X is only a bit better than that.

The specs have been out for a while now, and you could have expected 60 FPS to be out of reach. The 960 is much more powerful than the GeForce min spec for this particular game.
 
AMD has had issues with id games as far back as Rage. And as another poster points out, even Doom 3 ran poorly on AMD. Rage was absolutely fucked on AMD cards at launch. After that I abandoned ship for Nvidia. I just had too poor a user experience with AMD to keep buying their products.
 

DSix

Banned
I am not joking. The 7870 is the minimum-spec Radeon for this game, and the 270X is only a bit better than that.

The specs have been out for a while now, and you could have expected 60 FPS to be out of reach. The 960 is much more powerful than the GeForce min spec for this particular game.

The 270X is more comparable to a 7950, as it runs at a higher clock than a 7870. I have no issue playing everything at 60 FPS on medium; Doom is the only recent game that's unplayable. Stop trying to deflect that shit.
 

derFeef

Member
AMD has had issues with id games as far back as Rage. And as another poster points out, even Doom 3 ran poorly on AMD. Rage was absolutely fucked on AMD cards at launch. After that I abandoned ship for Nvidia. I just had too poor a user experience with AMD to keep buying their products.

Well I see a pattern there, lol.
While it's true id Tech is just bad with AMD (or OpenGL in general), there's no denying that id could have worked a bit more closely with AMD. Wolfenstein to this day is still not very good on my 390; it's ridiculous.
 