
Intel 10nm CPU line-up leaked

Agent_4Seven

Member
May 6, 2012
2,898
836
720
A 4790K can run 99% of all games at max settings at 60fps at both 1080p and 1440p, paired with a GTX 1080.
A GTX 1080 won't last long at max settings at 60fps at 1080p or 1440p, and a GTX 1080 Ti and everything above it will be bottlenecked by a 4790K, especially at 1440p (not to mention 4K), and soon after at 1080p as well. Of course, if you won't be playing really demanding, CPU-bound games that require more than 4 cores and 8 threads, you'll be fine, but even then you'll run into a bottleneck sooner rather than later.

Should be fine until next gen starts
Maybe even slightly longer. Mid 2021 at best.
 

Kenpachii

Member
Mar 23, 2018
1,658
1,168
435
I think the 1080 is going to last far longer than people think it will. If next-gen consoles are going to focus on 4K and 60 fps, that ~10 TFLOP AMD GPU isn't going to keep up with a 1080 sitting at 1080p resolutions.

Same here.
 

Barakov

Member
Sep 30, 2006
5,810
672
1,155
The 1080 reminds me a bit of the 8800GT. I remember the 8800GT punching above its weight after newer stuff had hit the market.
 

Tesseract

Crushed by Thanos
Dec 7, 2008
34,733
7,130
1,340
ya cannot say i'm thrilled about pushing frames above 60 when response times and pacing are so bad
 
Dec 14, 2008
32,567
457
1,055
The 1080 reminds me a bit of the 8800GT. I remember the 8800GT punching above its weight after newer stuff had hit the market.
The 1080 Ti is probably the best modern video card a person could have bought if they wanted unexpected longevity. Even today it stands toe to toe with the RTX 2080, the 2070 Super, and the 5700XT. The last and greatest champion of Nvidia's rasterization era is still holding up many high-end gaming rigs to this day.
 

Tesseract

Crushed by Thanos
Dec 7, 2008
34,733
7,130
1,340
The 1080 Ti is probably the best modern video card a person could have bought if they wanted unexpected longevity. Even today it stands toe to toe with the RTX 2080, the 2070 Super, and the 5700XT. The last and greatest champion of Nvidia's rasterization era is still holding up many high-end gaming rigs to this day.
agree, it's gonna go down in history as the greatest goddamn monster of the early 21st
 

Fictive

Member
Nov 10, 2013
466
109
385
I’ll be fine with a 9900K for a while. Maybe, MAYBE, in two years time I’ll see what’s going on.
 

petran79

Member
Sep 17, 2012
9,348
946
720
A GTX 1080 won't last long at max settings at 60fps at 1080p or 1440p, and a GTX 1080 Ti and everything above it will be bottlenecked by a 4790K, especially at 1440p (not to mention 4K), and soon after at 1080p as well. Of course, if you won't be playing really demanding, CPU-bound games that require more than 4 cores and 8 threads, you'll be fine, but even then you'll run into a bottleneck sooner rather than later.


Maybe even slightly longer. Mid 2021 at best.
When it comes to benchmarks at 1080p, the GTX 1080 equals or even outperforms higher-tier cards.
The issue is that games will jump to 4K and optimizations will focus on higher resolutions.
This is why upgrading will only be worth it if you want 4K or refresh rates higher than 60fps.
 

Agent_4Seven

Member
May 6, 2012
2,898
836
720
the GTX 1080 equals or even outperforms higher-tier cards.
That's nonsense right there. A 1080 simply can't be better than a 1080 Ti or a 2080 Ti at any resolution, no matter which game is being tested.

The issue is that games will jump to 4K and optimizations will focus on higher resolutions.
They'll not only jump to 4K in screen resolution, but also in asset quality: textures etc. will be 4K depending on the game / platform, and that will impact performance as well.

This is why upgrading will only be worth it if you want 4K or refresh rates higher than 60fps
The problem with older 4-core CPUs is that they'll be obsolete in a matter of a few more years, as soon as developers start building games that utilize 6+ cores and 12+ threads. When that happens, it won't matter whether you care about 4K 60 or not: your CPU simply won't be fast enough, because it will bottleneck your GPU no matter how fast the GPU is. The 1080 is more of a 1440p GPU anyway, one that struggles to maintain 60 FPS in modern demanding games at 1440p Very High / Ultra settings unless you reduce overall image quality.
 

Skyr

Member
Sep 4, 2013
1,551
1,455
630
A GTX 1080 won't last long at max settings at 60fps at 1080p or 1440p, and a GTX 1080 Ti and everything above it will be bottlenecked by a 4790K, especially at 1440p (not to mention 4K), and soon after at 1080p as well. Of course, if you won't be playing really demanding, CPU-bound games that require more than 4 cores and 8 threads, you'll be fine, but even then you'll run into a bottleneck sooner rather than later.


Maybe even slightly longer. Mid 2021 at best.
Isn’t that the other way around tho?
I was under the impression that the lower the res, the more likely you are to run into a CPU bottleneck. That's why game benchmarks for CPUs are done at 1080p or even 720p, to expose the largest gap.
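For what it's worth, that's easy to see with a toy frame-time model: each frame costs max(CPU time, GPU time), and only the GPU cost grows with pixel count. All the numbers below are made up purely for illustration:

```python
# Toy model: per-frame cost is max(CPU work, GPU work).
# GPU cost scales roughly with pixel count; CPU cost doesn't.
CPU_MS = 8.0          # hypothetical CPU frame time in ms, resolution-independent
GPU_MS_PER_MPIX = 4.0 # hypothetical GPU cost per megapixel, in ms

def fps(pixels):
    gpu_ms = GPU_MS_PER_MPIX * pixels / 1e6
    return 1000.0 / max(CPU_MS, gpu_ms)

for name, px in [("720p", 1280 * 720), ("1080p", 1920 * 1080), ("4K", 3840 * 2160)]:
    limiter = "CPU" if CPU_MS >= GPU_MS_PER_MPIX * px / 1e6 else "GPU"
    print(f"{name}: {fps(px):.0f} fps ({limiter}-bound)")
```

With these made-up costs, 720p comes out CPU-bound (~125 fps, capped by the CPU), while 4K is heavily GPU-bound (~30 fps) — which is exactly why CPU reviews benchmark at low resolutions: that's where CPU differences actually show.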
 
Jan 27, 2017
54
25
200
That number of pluses, dude....

Intel is caught in a perfect storm:

1) AMD with at least on par CPU architecture and chiplet concept
2) TSMC with superior node process
3) Spectre, Meltdown, ZombieLoad and co. just keep coming

And to add insult to injury, TSMC is promising a 5nm transition starting next year.
If you hold Intel stock, it's probably a good idea to start selling it. AMD making superior products for a year or two would go unnoticed by many, but AMD knocking Intel down a year later will be noticed.
I'm in the market for a new CPU and GPU, so I love the competition going on right now between the two companies. I might go with AMD because X570 boards support PCIe 4, but from a CPU perspective AMD is only just now matching Intel's CPU performance. I expect Intel will have a small lead over AMD once again in the fall when these launch. Yes, TSMC is on a smaller node than Intel, but their transistor density can't match Intel's, so they give up a lot of their node-size advantage.

I hope the competition remains strong in the CPU and GPU markets, it’ll only benefit us as consumers.
 

llien

Gold Member
Feb 1, 2017
5,396
2,504
680
TSMC is on a smaller node than Intel, but their transistor density can't match Intel's, so they give up a lot of their node-size advantage.
That is simply not true.
On top of it, Intel has nothing on its roadmap any time soon to counter even 7nm, while TSMC is moving on to 5nm as early as next year.

I hope the competition remains strong in the CPU and GPU markets, it’ll only benefit us as consumers.
#metoo. That being said, AMD must get to at least 30%+ market share in both the CPU and GPU markets to be able to develop new products without relying on geniuses, miracles, and adventurous moves to stay afloat.

...from a CPU perspective AMD is just now matching Intel CPU performance...
Had it been AMD's $500 chip beating Intel's $329 (with cooler) by 1-2% at 1440p with 2080Ti ($1300 or more?) on board, while consuming nearly twice the electricity, that "victory" would have been described using different words. ;)
 

Agent_4Seven

Member
May 6, 2012
2,898
836
720
I was under the impression that the lower the res, the more likely you are to run into a CPU bottleneck
The lower the resolution (1080p, for example), the more likely you'll be okay with a 4-core / 8-thread CPU for a longer period of time, provided developers actually optimize their games well for older CPUs. Take the 4790K: if you play exclusively at 1080p with a 1080/Ti, you'll be fine in at least 80 to 85% of games until you hit the "more cores and more threads" wall. To put it simply: you can't brute-force a better frame rate with a more powerful GPU if your CPU lacks the cores and threads a game requires for a 1080p 60 (for example) experience, but you can play at 720p / 900p to get a better frame rate, or lower some graphics settings.

Now, the higher the resolution, the more GPU-limited you'll be, so even a top-of-the-line CPU won't matter because the GPU isn't fast enough and you'd need to buy a faster one. That's why it's always better to buy a high-end CPU and, for the next 5-6+ years, upgrade only the GPU.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Mar 31, 2011
4,119
25
705
i7 10700K will beat the i9 9900K?

The 10-core/20-thread i9 sounds like what the 9900K should have been; the current i7 9700K is a joke.

We need a release date.
 

petran79

Member
Sep 17, 2012
9,348
946
720
That's nonsense right there. A 1080 simply can't be better than a 1080 Ti or a 2080 Ti at any resolution, no matter which game is being tested.
If the monitor is locked to 60Hz and 60fps, a 1060 and a 2080 won't make a difference. I mean, the difference between a 1080 and a 2080 is more pronounced at higher resolutions and higher settings. If you play at HD on low or medium settings, the differences are smaller. Also, that much VRAM is overkill for 1080p.
E.g. compare this:


HD low settings:
2080: 167.4 114.5
1080: 139.6 101

HD ultra:
2080: 114.9 98.2
1080: 87.1 64.8

At lower settings the fps difference is much smaller

While at 4K the 2080 almost doubles the 1080's score. Unless someone wants to game at 4K 60fps or has a good G-Sync or FreeSync monitor, a midrange GPU should suffice in terms of performance and price.
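Just to put numbers on that: dividing the average fps figures quoted above (whatever benchmark they came from), the 2080's lead over the 1080 grows as the settings go up:

```python
# Average fps figures as quoted above (source benchmark not named here).
results = {
    "HD low":   {"2080": 167.4, "1080": 139.6},
    "HD ultra": {"2080": 114.9, "1080": 87.1},
}

for preset, fps in results.items():
    lead_pct = (fps["2080"] / fps["1080"] - 1) * 100
    print(f"{preset}: 2080 leads by {lead_pct:.0f}%")
```

Roughly a 20% lead at low settings versus ~32% at ultra, which is consistent with the point about the gap shrinking at lower settings.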
 

SonGoku

Member
Aug 16, 2018
3,500
3,309
550
While at 4K the 2080 almost doubles the 1080's score. Unless someone wants to game at 4K 60fps or has a good G-Sync or FreeSync monitor, a midrange GPU should suffice in terms of performance and price.
Until next gen...
I think the only midrange/old GPU with any hope of remaining relevant next gen is the 1080 Ti
 

Agent_4Seven

Member
May 6, 2012
2,898
836
720
If the monitor is locked to 60Hz and 60fps, a 1060 and a 2080 won't make a difference.
It will make a difference in 0.1% low FPS, and the game will be more stable in terms of frame pacing too. The 1060 (if we're talking about the 6GB model) is basically a 980 with 2GB more VRAM, and it's a 4-year-old GPU.

I mean, the difference between a 1080 and a 2080 is more pronounced at higher resolutions and higher settings. At lower settings the fps difference is much smaller
Who even games at low settings with a 1080 or 1080 Ti at 1080p? :messenger_grinning_sweat:
 

Tesseract

Crushed by Thanos
Dec 7, 2008
34,733
7,130
1,340
It will make a difference in 0.1% low FPS, and the game will be more stable in terms of frame pacing too. The 1060 (if we're talking about the 6GB model) is basically a 980 with 2GB more VRAM, and it's a 4-year-old GPU.


Who even games at low settings with a 1080 or 1080 Ti at 1080p? :messenger_grinning_sweat:
shrug, i play apex with someone who just bought a 2080 ti / 99ish and they play at low, ja ja competitive and all that

why you wanna stare at turd graphics for 700 hours
 

LordOfChaos

Member
Mar 31, 2014
9,078
1,121
710
"Well, 10nm really didn't work out for us, so fuck it, let's just go to 7nm"
I hope you know what you're doing, Intel bros

We'll see. I'm actually a little bit optimistic? Intel alternates teams across fabs, so the burning train of fuckups from one doesn't necessarily kill the other, and if anything they've drawn resources into the 7nm team. They're also going back to a 2x density-scaling goal, where 10nm was overly aggressive at 2.7x; had that plan worked it absolutely would have shat on TSMC 7nm, but high risk / high reward often means you fall on your face too.