
E-cores and HT are bad for gaming

Panajev2001a

GAF's Pleasant Genius
More like the Apple M line is proof that the x86 architecture is a dinosaur on its last legs that needs to be phased out as soon as possible in favor of RISC processors. Don't even bother, because with almost 4 chip makers now developing ARM processors for consumer motherboards, and with a Windows version in the works too, this Intel approach is about to become history.
You... you have not watched Apple's A17 and M3 Pro announcements, or Qualcomm/Nuvia's latest one either, right ;)?

Heterogeneous big.LITTLE designs are here to stay 😂… :/…
 
The industry is terrible and slow at adapting to new tech. News at eleven.
Crysis ran like shit on pretty much all average PCs because it was single threaded af.
Many games suffer from stuttering because synchronizing parallel work doesn't seem to go well.
E-cores are probably perfect for some minor loads that would otherwise need to be scheduled onto the proper cores, making the demanding tasks wait. Make e.g. explorer.exe, Spotify, Discord and your antivirus program run on them. But so far games are hardly multithreaded properly beyond 8 cores, since consoles set that number, and I think they use at least one core for system stuff anyway. If done really dumb, the game possibly directs a bigger task to an E-core, and that is of course not really fast. E-cores and energy-saving clock speeds are not necessarily made with gaming in mind, but for energy labels; still, if used cleverly they can add performance like anything with any compute power.
 

Puscifer

Member
I'm team 3D Cache honestly, it's such an improvement for games like Battlefield, WoW, FF14 and other games with huge simulation and player counts.

5800X3D and 7800X3D for the win
5800X3D owners stay winning. What a way to end a socket; it still competes against more expensive units.
 

GymWolf

Member
So in simple terms, what do I have to do to get better performance with my 13600K during 4K gaming?
 

SF Kosmo

Al Jazeera Special Reporter
More an issue of lazy console porting and/or not optimizing for new hardware. Games with good threading like Cyberpunk or Starfield benefit a great deal from the extra cores. It's not an inherently flawed design for gaming, it's just a matter of games not really being written and optimized in ways that maximize modern CPUs. And the implication of that is also that most games aren't very CPU limited on these chips in the first place.
 
The "E" cores is the reason i'm switching to AMD for the first time after 20 years.
For the first time in over 20 years I went AMD again. As I only browse, stream video content and play games, I went with a 7800X3D. The boot times can be annoying if you restart your PC a lot, but I personally don't, and you can turn on some settings in the UEFI to shorten the boot time if you prefer to completely shut down your PC every time.
 
Is this again something we see pop up at 1080p with a 4090 slapped in, where as soon as you up the resolution all the differences disappear and nearly all CPUs normalize to the same performance?

I'm on AMD but I feel like CPU-bound scenarios are mostly for wanking.
I play at 60fps so whatever.
 

Tams

Member
Why make this design in the first place? Why not just make fewer regular cores? Instead of, say, 6+4, why not go for just 8?

Why add this level of complexity for developers to solve?

To me this whole design exists only to make their CPUs look more impressive to the masses. Because this way they can make and sell CPUs with "more" cores. More is better, right? It's like the Pentium 4 CPUs and how they looked better on paper because the numbers were higher vs. the Athlons. The only difference is that back then it was all about the GHz, and now it's about the number of cores. This is classic Intel.

The idea is that the OS scheduler sorts it all out and that programs don't need to know which cores to use; they just get 'told' which ones they will get.

The problem is that Windows hasn't been good at this, so heterogeneous cores on Windows are still a bit dodgy.

That said, some chip designers are moving away from physically different cores and just making all the cores the same, only altering things like the clock speed. MediaTek are doing this with their Dimensity 9300, and I believe AMD are planning on doing it that way too.
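For what it's worth, the hint the scheduler works from is exposed to software too: Windows reports an EfficiencyClass per logical processor through the CPU-set API, and on hybrid Intel chips the P-cores report a higher class than the E-cores, as I understand it. A minimal sketch that just lists what the OS reports (untested, plain Win32, my own example rather than anything Intel ships):

Code:
// List every logical processor's CPU set ID, core index and efficiency class.
// On a hybrid chip (e.g. 12th-14th gen Intel) the P-cores report a higher
// EfficiencyClass value than the E-cores.
#define _WIN32_WINNT 0x0A00
#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    ULONG len = 0;
    // First call only asks how big the buffer needs to be.
    GetSystemCpuSetInformation(nullptr, 0, &len, GetCurrentProcess(), 0);
    std::vector<BYTE> buf(len);
    auto* info = reinterpret_cast<PSYSTEM_CPU_SET_INFORMATION>(buf.data());
    if (!GetSystemCpuSetInformation(info, len, &len, GetCurrentProcess(), 0))
        return 1;

    for (ULONG off = 0; off < len; ) {
        auto* e = reinterpret_cast<PSYSTEM_CPU_SET_INFORMATION>(buf.data() + off);
        if (e->Type == CpuSetInformation) {
            std::printf("CpuSet %lu  core %u  efficiency class %u\n",
                        e->CpuSet.Id,
                        (unsigned)e->CpuSet.CoreIndex,
                        (unsigned)e->CpuSet.EfficiencyClass);
        }
        off += e->Size;
    }
    return 0;
}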
 

IFireflyl

Gold Member
That video shows that on average (with these particular games) there's not much difference percentage-wise between E-cores on and off (which actually is not a great advertisement for technology that can double or triple the number of available CPU cores), but there are problem cases where you should turn off E-cores. And that's my point. I'm not playing 40 games at once, I'm playing one at a time. How do I know I'm not playing one of those games that have worse performance? If E-cores provide little or no additional performance, that's OK, but worse?

[benchmark screenshot]


BTW Starfield is (or was, perhaps it's been patched) also a game that runs better when you turn E-cores, and in particular HT, off, according to Digital Foundry. (video link)

[benchmark screenshot]

Why did you bother responding to me? My point was to blame developers for being too lazy to work with existing and changing technology. You provided examples of developers who aren't doing that, and you're pretending that this is the fault of Intel by blaming the technology. That's stupid. The games that aren't using the technology have, for the most part, small decreases in performance. The numbers in the video I posted show that more games benefit from having E-cores enabled. So E-cores are not the problem. Developers are.
 

DaGwaphics

Member
I don't think the results in the graphic are even unreasonable, I'd say almost an expected result. HT can be great for gaming if it gives you access to threads you need but otherwise don't have (run this test on a 4-core). If a game can only use 6 threads, then disabling all but 6 threads is going to net you the best results (always, on any chip) because you are generating less heat and using less energy, so the 6 active cores will boost higher as a result. Plus, you know that everything is getting assigned to a real core and not a virtual one.

I don't see how anyone figures this is an issue with the Intel design. Games just don't use as many threads as the top-end processors offer; complain to the developers about that.
 

SHA

Member
People need to get educated. What are you expecting from the silicon? Marketers ain't gonna do you a favor. We've been in this situation for a whole decade and nothing's changed. Jensen has been talking about relying on technology; well, unfortunately, that's his resolve and it's not applicable to everyone, every chip, every brand. We're basically still dealing with the main issue with the silicon and nothing's changed.
 

GymWolf

Member
Don't play shitty unoptimized Unreal Engine games. Man... it just hit me, Cyberpunk Orion is also being made on UE5... it took them 2 years to optimize an engine they were super familiar with... imagine what kind of an abomination this will turn into?
But RoboCop and Remnant 2 (and honestly even Lords of the Fallen) run pretty well to great on my end.

Some brute-forcing going on, but I'm never gonna return to a mid config anymore, only top class from now on, so these problems don't really touch me that much.
I had way worse problems with Starfield, which doesn't use Unreal at all.

And UE5 is gonna improve with time.

If fucking Teyon can make a great-looking game that also runs great, I'm sure more competent devs can do even better, at least on high-end configs.
 

Solidus_T

Member
AMD is supposedly using a similar architecture for their next CPU lineup. I just hope they don't scrap the X3D options that have been the best option for gamers.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Here are two more examples of games that run better when you turn E-cores/HT off.

[benchmark screenshot]


I think both games have had patches that fix the CPU scaling, but I'm not 100% sure.
 

Nvzman

Member
Why make this design in the first place? Why not just make fewer regular cores? Instead of, say, 6+4, why not go for just 8?

Why add this level of complexity for developers to solve?

To me this whole design exists only to make their CPUs look more impressive to the masses. Because this way they can make and sell CPUs with "more" cores. More is better, right? It's like the Pentium 4 CPUs and how they looked better on paper because the numbers were higher vs. the Athlons. The only difference is that back then it was all about the GHz, and now it's about the number of cores. This is classic Intel.
Except E-cores do make MASSIVE performance differences in properly multithreaded applications, even outperforming 16-core AMD 7950X CPUs in those situations. The issue is not that Intel is bullshitting with the E-cores, it's more that a lot of programs just aren't optimized very well for the architecture. Even some games like Cyberpunk actually perform best with E-cores because the engine is properly programmed to utilize them, and as a result it's obvious they aren't just a gimmick.
 

Alexios

Cores, shaders and BIOS oh my!
Sh/couldn't games/apps choose not to utilize E-cores/HT if they run better by only running on the main cores (other than base/boost clocks being different with them off, which is usually a small difference)? Or is there no way for them/the OS to identify the difference and not run on those threads?
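On Windows 10/11 there is at least an API for it: a game can query which CPU sets have the highest efficiency class and ask the scheduler to keep a given thread on those. A rough sketch of the idea (untested, my own example; I'm not claiming any particular engine does exactly this):

Code:
// Keep the calling thread on the highest-EfficiencyClass cores only
// (i.e. the P-cores on a hybrid chip). Returns false if the query fails.
#define _WIN32_WINNT 0x0A00
#include <windows.h>
#include <vector>

bool PinCurrentThreadToPCores() {
    ULONG len = 0;
    GetSystemCpuSetInformation(nullptr, 0, &len, GetCurrentProcess(), 0);
    std::vector<BYTE> buf(len);
    auto* info = reinterpret_cast<PSYSTEM_CPU_SET_INFORMATION>(buf.data());
    if (!GetSystemCpuSetInformation(info, len, &len, GetCurrentProcess(), 0))
        return false;

    // Find the highest efficiency class present on this machine.
    BYTE best = 0;
    for (ULONG off = 0; off < len; ) {
        auto* e = reinterpret_cast<PSYSTEM_CPU_SET_INFORMATION>(buf.data() + off);
        if (e->Type == CpuSetInformation && e->CpuSet.EfficiencyClass > best)
            best = e->CpuSet.EfficiencyClass;
        off += e->Size;
    }
    // Collect the CPU set IDs that belong to that class (the P-cores).
    std::vector<ULONG> pCoreIds;
    for (ULONG off = 0; off < len; ) {
        auto* e = reinterpret_cast<PSYSTEM_CPU_SET_INFORMATION>(buf.data() + off);
        if (e->Type == CpuSetInformation && e->CpuSet.EfficiencyClass == best)
            pCoreIds.push_back(e->CpuSet.Id);
        off += e->Size;
    }
    // A soft request: the scheduler keeps the thread on these sets unless forced off.
    return SetThreadSelectedCpuSets(GetCurrentThread(),
                                    pCoreIds.data(),
                                    (ULONG)pCoreIds.size()) != 0;
}

It's a soft request rather than a hard mask, and on an all-P-core chip every set reports the same class so it quietly becomes a no-op; whether a studio bothers is basically the developer-effort argument in this thread.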
 

nkarafo

Member
Except E-cores do make MASSIVE performance differences in properly multithreaded applications, even outperforming 16-core AMD 7950X CPUs in those situations. The issue is not that Intel is bullshitting with the E-cores, it's more that a lot of programs just aren't optimized very well for the architecture. Even some games like Cyberpunk actually perform best with E-cores because the engine is properly programmed to utilize them, and as a result it's obvious they aren't just a gimmick.
I'm not suggesting that E-cores are worse than NO cores. I'm not saying that a 6P-core CPU is better than a 6P+4E-core CPU. I'm saying that (IMO) having 1 regular P-core is better than having 2 E-cores instead. For instance, a 6P+4E CPU would be worse than an 8P CPU.

That "MASSIVE" performance difference you talk about would still be there if you had 2 extra P-cores instead of 4 extra E-cores. But it wouldn't need the extra effort from the developers or the OS that the added complexity requires.

Now I'm not a tech expert and I don't know all that for sure. But to me it does look a lot like Intel did this so they can sell CPUs with more cores.
 
I mean, this was something people told you to do with the Intel 12th gen series and onward upon release. It's dumb of course, but back when the 12th gen series launched there were compatibility issues with old games otherwise. I remember having issues with AC Valhalla until I disabled E-cores or something like that. I don't really notice any performance difference from it, which is the good news at least.
 
More like the Apple M line is proof that the x86 architecture is a dinosaur on its last legs that needs to be phased out as soon as possible in favor of RISC processors. Don't even bother, because with almost 4 chip makers now developing ARM processors for consumer motherboards, and with a Windows version in the works too, this Intel approach is about to become history.
Nah, x86 is fine. The difference between Apple and Intel/AMD is that Apple is making chips specially tailored for itself, so they can pick whatever traits they value for their products, as Apple does not sell its chips to anyone. Intel and AMD sell their chips to customers; neither company has a big PC business where they make their own Mac/iPhone-like products. Instead, companies like HP, Lenovo and Samsung, as well as data center companies, shop at Intel for the best CPUs for their needs.

That means Intel must make chips for the data center that excel at data center uses; they must make chips for enterprise use, for IoT, for consumer PCs, for gaming PCs, and the chips have to excel at all these tasks and scale up and down on a level Apple chips cannot.
 
Most PC gamers expect that buying the latest and greatest CPUs will automatically translate to games running with higher frame rates and fewer dropped frames. Look at any CPU benchmark test and you'll see that the CPUs with the most cores and highest clock speeds dominate the charts. Of course they do, so hip hip hooray for technological progress!

But do they really?

It's becoming clearer to me that Intel's most recent CPUs with additional efficiency cores are actually inefficient in many games. Maybe even most games, but we don't really know how many games are affected, because tech/games reviewers generally never do these kinds of performance tests. That means there's little information about this problem out there. It's a major issue IMO but goes completely unreported.

Alex Battaglia demonstrated in a number of recent videos that game engines like Unreal Engine 4/5 don't make good use of CPUs with many cores. In those cases a game's performance will not scale linearly with the increased number of cores but will get only marginal performance increases when turning on additional CPU cores. There are also games out there that perform better when you turn off features like hyperthreading and E-cores.
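
As a rough back-of-the-envelope illustration (my own numbers, not from those videos): if only half of a frame's CPU work can be spread across threads, Amdahl's law puts the speedup at 1 / ((1 - p) + p/N) with p = 0.5, so 8 threads give about 1.78x over a single thread and 16 threads only about 1.88x. Doubling the core count buys you roughly 6% in that case, which is exactly the kind of marginal scaling those videos show.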

Take Lords of the Fallen as an example of a game that has negative performance scaling. (video link) Here you can see an example of a game that runs better using just 6 cores of a 12900K CPU than in its default configuration with all cores, hyperthreading and E-cores turned on.

[benchmark screenshot]


To combat this embarrassing situation Intel has recently introduced APO (Application Optimization). This tool from Intel can make unoptimized games run better on Intel 14th gen CPUs. In the screenshot below you can see an example of what's possible. Rainbow Six Siege now runs at 724 fps instead of 610 fps. Impressive! But if you look at the chart more closely you'll see that you could already wring more frames out of that game than stock just by turning HT off (633 fps) or by turning E-cores off (651 fps). (video link)

[benchmark screenshot]


When Intel introduced their 12th gen CPUs there were people who feared performance degradations, but these fears were waved away. Intel's new Thread Director, working together with Windows 11's improved task scheduler, would monitor the system, move background tasks to E-cores and make sure that performance would be optimized. Well, that obviously didn't come to pass.

Even worse is that APO, Intel's solution to these performance problems, currently supports only two games and will run only on 14th gen Intel CPUs, even though 12th and 13th gen Intel CPUs use almost the same tech. It's clearly more important for Intel to sell more 14th gen CPUs than to support the old customers who bought into the E-core fantasy.
The problem is the devs. Well-optimized games like Cyberpunk take full advantage of E-cores and have amazingly good CPU optimization. Then there are the devs who don't put in the time to better optimize their games. Eventually things will get better, as E-cores are a very new concept, but for now it is what it is.
 
It just feels like the more CPU designers try to do, the more problems are created in the process. At some point they'll add a "game mode" switch you can toggle manually on/off that will cause even more problems, but nobody will actually know why.
 
I just finished installing a 14700K on a Gigabyte motherboard and there's a giant E-cores Enabled/Disabled button in the BIOS. If there's a game that really needs them disabled, the option is easily available.

Otherwise I will fully enjoy the ridiculous Premiere and After Effects performance... and maybe keep an eye on the power bill, but that's also easy to fix. I'm playing fucking Skyrim this winter so I might as well exclusively run it on two E-cores.

Btw the board was already updated with the June BIOS that supports the new CPUs.
 

alucard0712_rus

Gold Member
I just finished installing a 14700K on a Gigabyte motherboard and there's a giant E-cores Enabled/Disabled button in the BIOS. If there's a game that really needs them disabled, the option is easily available.

Otherwise I will fully enjoy the ridiculous Premiere and After Effects performance... and maybe keep an eye on the power bill, but that's also easy to fix. I'm playing fucking Skyrim this winter so I might as well exclusively run it on two E-cores.

Btw the board was already updated with the June BIOS that supports the new CPUs.
Same here!
Bought 12600K years ago and had zero problems. Will go with Intel for now.
 

lyan

Member
Except E cores do make MASSIVE performance differences in properly multithreaded applications, even outperforming 16 core AMD 7950x CPUs in those situations. The issue is not that Intel is bullshitting with the E-cores, its more that a lot of programs just simply aren't optimized super well for the architecture. Even some games like Cyberpunk actually perform best with E cores because the engine is properly programmed to utilize it and as a result its obvious they aren't just a gimmick.
When you can get a performance penalty by leaving a feature on, it's the hardware/OS developer not doing their job.
 
Here are two more examples of games that run better when you turn E-cores/HT off.

[benchmark screenshot]


I think both games have had patches that fix the CPU scaling, but I'm not 100% sure.

Yes, the problem is that the E-cores run slower than the P-cores, so if more cores/threads are available then the game may end up using E-cores instead of P-cores, hence the lower overall performance. I have an i5-13600KF, which has 6 P-cores and 8 E-cores. I have noticed that a lot of Unreal Engine 4 games tend to ignore the hyperthreaded cores, which remain at low usage, and just use the main P-cores, so that means 6 fast P-cores for games along with 8 slower E-cores. Given the age of UE4, I am not sure that it is even optimised properly for newer Intel CPUs. Maybe someone else can clarify this?

I am not sure if this core allocation thing is a developer issue or an issue with how Windows 11 allocates CPU cores. Do developers have control over which CPUs are used for games, or is that something that the operating system handles?
 

Nvzman

Member
When you can get a performance penalty by leaving a feature on, it's the hardware/OS developer not doing their job.
Older PC games have a tendency to run worse with multi-core CPUs; does that mean multi-core CPUs were a waste when they first emerged?
Or how Resizable BAR can also cause performance regressions in some titles, but massively boosts FPS in others?
This is a stupid argument. There are a lot of situations in the PC world where certain advanced features cause performance hiccups in software not designed for them, but software that does take them into account sees performance boosts.
 
I can't remember where I read it, but I believe the next major update to Unreal Engine 5 (5.4) will finally see improvements to renderer parallelisation, aka multi-threading. It's actually been kind of a joke just how bad UE is with CPU core scaling.
 
So in simple terms, what do I have to do to get better performance with my 13600K during 4K gaming?

Probably nothing.

I have a 10600K OC'd to 4.9GHz paired with a 4090, and at 4K most games don't see higher than 50% CPU utilization.

For example, Miles Morales was 60% utilization on high settings.

I have done numerous system tweaks in Windows to free up CPU resources, but you're totally fine.
 

winjer

Gold Member
Probably nothing.

I have a 10600K OC'd to 4.9GHz paired with a 4090, and at 4K most games don't see higher than 50% CPU utilization.

For example, Miles Morales was 60% utilization on high settings.

I have done numerous system tweaks in Windows to free up CPU resources, but you're totally fine.

That means nothing.
Check your GPU usage. If it's under 100% and the game is not frame-rate limited, then you are CPU bound.
 

lyan

Member
Older PC games have a tendency to run worse with multi-core CPUs; does that mean multi-core CPUs were a waste when they first emerged?
Or how Resizable BAR can also cause performance regressions in some titles, but massively boosts FPS in others?
This is a stupid argument. There are a lot of situations in the PC world where certain advanced features cause performance hiccups in software not designed for them, but software that does take them into account sees performance boosts.
Where have I said it's a waste? Something like APO should have been a given; they pushed a product that was not yet ready for certain use cases, simple as that.
 

Kenpachii

Member
I'm not suggesting that E-cores are worse than NO cores. I'm not saying that a 6P-core CPU is better than a 6P+4E-core CPU. I'm saying that (IMO) having 1 regular P-core is better than having 2 E-cores instead. For instance, a 6P+4E CPU would be worse than an 8P CPU.

That "MASSIVE" performance difference you talk about would still be there if you had 2 extra P-cores instead of 4 extra E-cores. But it wouldn't need the extra effort from the developers or the OS that the added complexity requires.

Now I'm not a tech expert and I don't know all that for sure. But to me it does look a lot like Intel did this so they can sell CPUs with more cores.

Actually I disagree.

E-cores get all the background shit from Windows assigned to them, including the operating system etc. You can straight up do whatever and the E-cores will carry it.

The P-cores, without the E-cores, will otherwise run all of that on the same cores. You will have to use outside tools to manually assign tasks to cores to get optimal performance. This is especially a thing when you use multiple monitors and game + do other shit on the side.

Now, do E-cores replace P-cores in performance? No, not if you use them for games entirely. But let's be honest here, what game actually uses all the cores beyond 12 threads anyway? Barely any game.

I got an 8 P-core CPU with 4 E-cores and frankly it works like a charm.
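Those outside tools are mostly a front end over a single Win32 call anyone can make. A rough sketch of a hypothetical pin.exe helper (untested, my own example; which mask bits correspond to E-cores depends entirely on your specific CPU, so the value you'd pass is up to you):

Code:
// pin.exe <pid> <hex affinity mask>
// Restricts an already-running process (e.g. Discord, OBS) to the logical
// processors set in the mask, so it stays off the cores the game is using.
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 3) {
        std::printf("usage: pin.exe <pid> <hex affinity mask>\n");
        return 1;
    }
    DWORD pid = static_cast<DWORD>(std::strtoul(argv[1], nullptr, 10));
    DWORD_PTR mask = static_cast<DWORD_PTR>(std::strtoull(argv[2], nullptr, 16));

    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION | PROCESS_QUERY_INFORMATION,
                              FALSE, pid);
    if (!proc) {
        std::printf("OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }
    // All threads of the target process are now limited to the cores in the mask.
    BOOL ok = SetProcessAffinityMask(proc, mask);
    if (!ok)
        std::printf("SetProcessAffinityMask failed: %lu\n", GetLastError());
    CloseHandle(proc);
    return ok ? 0 : 1;
}

The same thing can be done per launch with cmd's start /affinity switch, or interactively via Task Manager's Set affinity option.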
 

winjer

Gold Member
Actually I disagree.

E-cores get all the background shit from Windows assigned to them, including the operating system etc. You can straight up do whatever and the E-cores will carry it.

The P-cores, without the E-cores, will otherwise run all of that on the same cores. You will have to use outside tools to manually assign tasks to cores to get optimal performance. This is especially a thing when you use multiple monitors and game + do other shit on the side.

Now, do E-cores replace P-cores in performance? No, not if you use them for games entirely. But let's be honest here, what game actually uses all the cores beyond 12 threads anyway? Barely any game.

I got an 8 P-core CPU with 4 E-cores and frankly it works like a charm.

Not really. E-cores don't have the same instruction set support as the P-cores.
That's why we are getting stupid crap like AVX10/128, for example.

 

nkarafo

Member
E-cores get all the background shit from Windows assigned to them, including the operating system etc. You can straight up do whatever and the E-cores will carry it.

The P-cores, without the E-cores, will otherwise run all of that on the same cores. You will have to use outside tools to manually assign tasks to cores to get optimal performance. This is especially a thing when you use multiple monitors and game + do other shit on the side.

You assume the P-cores can't handle background tasks for some reason?

Instead of using 4 E-cores for background tasks (with a 6P+4E-core CPU), you could just use 2 P-cores (in an 8P-core CPU). In the end the game would still use 6 P-cores exclusively and you would get a similar end result, without the risk of the E-cores getting in the way of the game and without making the whole thing more complex than it needs to be. And if the 2 P-cores assigned to background tasks did get in the way, they probably wouldn't drag the game down as much as the E-cores do.

I still see no reason to "split" a P-core into 2 E-cores and add all this extra complexity, except to make your CPU look like it has more cores in it.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Starfield's latest beta version finally has better CPU scaling.

This (see screenshot below) shouldn't happen anymore.

[benchmark screenshot]


 