
Raja Koduri leaves Intel

Senua

Member
 

Spyxos

Gold Member
Does this mean Intel's short dedicated GPU journey is already over? It would be a pity, since three competitors are needed to keep prices halfway moderate.
 

LordOfChaos

Member
I actually hope Intel doesn't give up on dedicated gaming GPUs and keeps trying for several more architectures. It's ironic because the company is Intel, but the duopoly is going to end up with $2000 GPUs with 12GB of VRAM. The first one didn't light the world on fire, but that's fine; just keep trying and make it a long-term strategic goal.
 

supernova8

Banned
Arc itself doesn't even seem that bad at the right price. Sure, the drivers are all over the place, but the team seems to have made major improvements over the last two or three months.


Plus, while we cannot get the whole picture, it does appear that Intel has managed to sell a decent number of Arc GPUs even after the hopeless (delayed, confusing) marketing push, shit pricing at first, and shitty drivers at launch.

I get the impression that if they had launched Arc GPUs about 10 years earlier (before AMD started actually competing in the CPU space), they would still have had the financial capacity to take the hit of selling Arc GPUs cheap (even at or below cost) to gain traction and win mindshare. Instead we have a situation where Intel's GPUs are just making AMD look good.
 

SmokedMeat

Gamer™
I actually hope Intel doesn't give up on dedicated gaming GPUs and keeps trying for several more architectures. It's ironic because the company is Intel, but the duopoly is going to end up with $2000 GPUs with 12GB of VRAM. The first one didn't light the world on fire, but that's fine; just keep trying and make it a long-term strategic goal.

It looks like the Arc GPUs have been getting a lot better with driver updates, and the price is excellent.

I’m more excited to see what Intel brings in the future than what AMD and Nvidia do.
 

StereoVsn

Member
Arc is supposedly pretty decent now at the lower end. AMD's $200-250 offerings suck and Nvidia's don't exist. Not everyone has $400-$2000 to spend on a GPU.

And Intel has been improving their driver performance.
 

SHA

Member
LMAO, Raja ruins everything he touches. Lisa Su cut him loose, but Intel was desperate. Turns out desperation doesn't result in good things.
Engineer CEOs are actually the ones who ruin everything; no one should hire an engineer as a CEO.
 

Xyphie

Member
Arc is supposedly pretty decent now at the lower end. AMD's $200-250 offerings suck and Nvidia's don't exist. Not everyone has $400-$2000 to spend on a GPU.

And Intel has been improving their driver performance.

I think the A750 at $230 is very, very close to becoming an excellent value card. The hardware is clearly better than AMD's offerings in that price range (~6600 XT); the software just has to improve a little bit more.
 

winjer

Gold Member
Engineer CEOs are actually the ones who ruin everything; no one should hire an engineer as a CEO.

In tech companies it's the opposite. Intel, for example, had engineers as CEOs for decades.
Then they decided to hire non-engineers as CEOs. And when Brian Krzanich took over, that is when Intel's decline went into full gear, as he focused on immediate profits through stock manipulation and reduced research investment to single-digit percentages of revenue.
The damage he did to Intel is still felt to this day, so no wonder Pat Gelsinger pointed the finger at him.
 

SHA

Member
In tech companies it's the opposite. Intel, for example, had engineers as CEOs for decades.
Then they decided to hire non-engineers as CEOs. And when Brian Krzanich took over, that is when Intel's decline went into full gear, as he focused on immediate profits through stock manipulation and reduced research investment to single-digit percentages of revenue.
The damage he did to Intel is still felt to this day, so no wonder Pat Gelsinger pointed the finger at him.
Business guys do things the dirty way, science guys have a broader vision than both, and engineers are, at the moment, better at handling bad situations instantly.
 

winjer

Gold Member
Business guys do things the dirty way, science guys have a broader vision than both, and engineers are, at the moment, better at handling bad situations instantly.

It was engineers who made Intel the biggest and most advanced chipmaker in the world for several decades.
After a decade without an engineer as CEO, Intel lost its lead.
 

ToTTenTranz

Banned
But at Intel he had a big budget and several ex-engineers from AMD and Nvidia, and still he underperformed.

Up until last year there were only two companies in the whole world making high-performance discrete GPUs. The amount of IP and expertise in hardware and software needed to get into that market is such that all the other companies that tried were pushed out, and there were a lot of them two decades ago: 3dfx, Matrox, Imagination/PowerVR, VIA/S3, XGI, 3DLabs. And I'm only naming the ones that got to the point of launching products, because there were a bunch that cancelled their products and closed down during development.
Pretty much all of them struggled with Windows and Linux drivers that had to support millions of hardware combinations, as well as with scaling up / widening GPU resources, where interconnects and caches become more complex and important.

There's a bunch of companies making tiny embedded GPUs for mobile devices. There were only two making large discrete GPUs for PCs, and now there are three.

Intel's 1st-gen Arc was never going to win big on the first try, and whoever thought otherwise had pretty unrealistic expectations.




Doesn't explain why he sucked at Apple and Intel

AFAIK Raja didn't suck at Apple. He was pretty much responsible for Apple's SoCs getting beefy PowerVR iGPUs with adequate bandwidth, which gave the iPhone and iPad unparalleled UI fluidity and gaming performance until well after he left for AMD in 2013.
 

winjer

Gold Member
Up until last year there were only two companies in the whole world making high-performance discrete GPUs. The amount of IP and expertise in hardware and software needed to get into that market is such that all the other companies that tried were pushed out, and there were a lot of them two decades ago: 3dfx, Matrox, Imagination/PowerVR, VIA/S3, XGI, 3DLabs. And I'm only naming the ones that got to the point of launching products, because there were a bunch that cancelled their products and closed down during development.
Pretty much all of them struggled with Windows and Linux drivers that had to support millions of hardware combinations, as well as with scaling up / widening GPU resources, where interconnects and caches become more complex and important.

There's a bunch of companies making tiny embedded GPUs for mobile devices. There were only two making large discrete GPUs for PCs, and now there are three.

Intel's 1st-gen Arc was never going to win big on the first try, and whoever thought otherwise had pretty unrealistic expectations.

Intel is not new to GPUs. They have been making them since the 90s.
Even today, Intel is the company that ships the most GPUs, thanks to its integrated parts. Intel is not new to producing GPUs and drivers.
They may not have been competing at the mid and high end, but they have the majority of the entry level.
 

ToTTenTranz

Banned
Intel is not new to GPUs. They have been making them since the 90s.
Even today, Intel is the company that ships the most GPUs, thanks to its integrated parts. Intel is not new to producing GPUs and drivers.
They may not have been competing at the mid and high end, but they have the majority of the entry level.

Intel isn't new to GPUs, and neither are Imagination, Vivante, ARM, or Broadcom. All of them make successful small integrated GPUs, but none of them makes large / wide discrete GPUs that need to work across millions of combinations of operating systems, CPU architectures, motherboards, RAM, drivers, and quickly changing APIs and API features.

When it comes to small integrated GPUs that only have to work in one SoC with one CPU architecture and memory configuration, even a tiny ASIC company no one has heard of before can make it work.


No one really knows whether Raja actually failed, because only a handful of people knew his key performance indicators for 1st-gen Arc.
Does Arc underperform in performance per die area and performance per watt compared to the latest GPU architectures from AMD and Nvidia? Of course it does.
Was it realistic to expect otherwise and have 1st-gen Arc GPUs beat the accumulated know-how of two decades of thousands of engineers dedicated to high-performance GPUs and their drivers, not to mention a vast developer-relations network that gets game-specific performance optimizations in before launch? No, it was not.
 

Miyazaki’s Slave

Gold Member
I have a rig with an Arc A770 in it. Fantastic performance at 1440p max settings in most everything I play on it.

(PoE, Elden Ring, D4 beta, RE4 Remake) all run great. Great card IMO, especially given the price you can snag them at.
 