
Why is Starcraft 2 making my GPU run hot as hell?

Mr Swine

Banned
I know that GeForce GTX 580 cards with the Nvidia reference cooler don't run cool (I have a Zotac card that's downclocked). But why the hell is Starcraft 2/HotS up at 82-90°C when Vsync is not activated? When Vsync is activated it's at 70-75°C. I thought the game wasn't that taxing on the GPU?
 

Cade

Member
If you don't have v-sync activated, could it be that you're rendering a lot of unnecessary frames (framerate not capped)?
 

Jedi2016

Member
With Vsync off, the game will run at whatever fps it's capable of achieving, regardless of whether you can see it or not. I've had ten-year-old games run upward of 300-400fps, absolutely cooking my card. There's no reason at all not to have vsync on in situations like that. The only time you ever want it off is when the game is running lower than you want and you need to boost performance.

Even so, 70 is really damned hot for this game. My 680 barely heats up at all at full 1080p60, maybe up to around 50-55°C, except during cutscenes that push the graphics harder.
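(The effect described above is essentially a frame limiter at work: vsync caps rendering at the display's refresh rate, so the GPU idles between frames instead of running flat out. A minimal sketch of the idea in Python, with a hypothetical `render_frame` callback standing in for the real GPU work -- not how any actual game engine implements it:)

```python
import time

def run_capped(render_frame, fps_cap=60, frames=5):
    """Render `frames` frames, sleeping so we never exceed `fps_cap`."""
    budget = 1.0 / fps_cap              # time allotted per frame
    for _ in range(frames):
        start = time.monotonic()
        render_frame()                  # the actual rendering work
        elapsed = time.monotonic() - start
        if elapsed < budget:
            # Idle for the rest of the frame budget instead of immediately
            # starting the next frame -- this idle time is what keeps the
            # card from cooking at 300-400fps.
            time.sleep(budget - elapsed)
```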
 

Haunted

Member
I remember that bug during the beta. Could've sworn they fixed that in a patch, though.

Paradoxically, the less that was happening on screen, the hotter your card would get. Enabling vsync is definitely recommended.
 

RoyalFool

Banned
Enable vsync. The menus have no frame limiter, so you'll end up getting 200+ fps, and constantly flipping all those buffers through memory heats things up.
 

Ushae

Banned
Yeah, I think you may have an uncapped framerate, so try enabling vsync. Perhaps look into getting a better cooler too.
 

RPGCrazied

Member
The only problem I have is during the Utter Darkness mission: when the Protoss mothership comes into play along with hundreds of Zerg, my FPS tanks. I can't seem to get that mission to run any better.
 

Mr Swine

Banned
Jedi2016 said:
"With Vsync off, the game will run at whatever fps it's capable of achieving, regardless of whether you can see it or not. [...] Even so, 70 is really damned hot for this game."

Wow, I never knew that could happen when playing older games without Vsync :/ But your card is manufactured on a 28 nm process while mine is on 32 nm (I think), and yours is newer to boot :p

RoyalFool said:
"Enable vsync, in menus it has no limiter and so you'll end up getting 200+fps [...]"

Ok, good to know. Thanks!
 

Durante

Member
Jedi2016 said:
"With Vsync off, the game will run at whatever fps it's capable of achieving [...] There's no reason at all not to have vsync on in situations like that."
Actually, having the framerate uncapped has one minor advantage: It improves the responsiveness of the game. That's why in Starcraft 2 -- which already features very responsive controls -- the mouse cursor actually feels more responsive than in Windows. Take that games with shitty slow-as-molasses mouse handling (I'm looking at you Bethesda)!
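(The responsiveness gain is easy to quantify: input is typically sampled once per frame, so on average an event waits about half a frame time before the game even sees it. A back-of-the-envelope calculation -- the half-frame model is a simplification, and real pipelines add display and buffering latency on top:)

```python
def avg_input_delay_ms(fps):
    """Average wait before the next frame samples your input:
    half of one frame time, in milliseconds."""
    frame_time_ms = 1000.0 / fps
    return frame_time_ms / 2

# Vsynced at 60 fps vs uncapped at ~300 fps
print(avg_input_delay_ms(60))   # ~8.3 ms
print(avg_input_delay_ms(300))  # ~1.7 ms
```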
 
Durante said:
"[...] the mouse cursor actually feels more responsive than in Windows."

Really? Don't pretty much all modern games that have an on-screen pointer use a hardware cursor though?
 