
Rumor: AMD to lay off 20-30% of staff due to lower than expected profits

McHuj

Member
It means absolutely nothing for the next gen systems.

Nintendo, Sony, and MS own the IP. At this point the CPUs and GPUs are in the final stages of development, so it's very unlikely this would impact those systems or stuff coming out in 2012/13.

If it affects anything, it will be standalone CPUs in 2013 and beyond. I think AMD would scale back from making CPUs and just focus on low cost/low power APUs and GPUs. This probably has a much bigger impact on PC gaming, as there may be only one supplier of high performance CPUs left: Intel.
 

scitek

Member
Isn't it exactly the opposite? If you want to take advantage of Nvidia cards with those SSSSADIOPSDAA and shit like that that AMD cards can't use, you need to install extra tools which aren't exactly user friendly for your regular Joe.


As far as AMD and regular users go, just install the drivers on the disc and you're good to go.

What he means by "worst drivers" is that AMD drivers often either break games that used to work just fine, or just don't perform like they should considering the hardware. For example, I bought Binary Domain and with my Radeon 6950, my framerate would drop to the 20s, and sometimes even single digits, when something exploded. My Nvidia card runs the game at a constant 60fps...at 3200x1800 resolution.
 
Again, AMD's biggest problem in the PC video card marketplace is Nvidia pushing its non-standard APIs on its partner developers. Nvidia leverages its money/influence on game developers (either sponsoring, or giving them hardware to work with), and in turn developers use exclusive features or non-standard APIs which AMD can't use... Latest perfect example? Borderlands 2...

http://www.youtube.com/watch?v=2rokcD0nh4I

And you can't say AMD can't match that... here's a video from 3 years ago of OpenCL (an open standard) using the Havok physics engine (middleware available to all software developers, and it works on both sets of hardware)

http://www.youtube.com/watch?v=xfrM973spw0

(edit) derped that, didn't realize Intel bought Havok... Anyways, there are other implementations of physics that can be vendor neutral.
 
Sucks because AMD was making great strides with the introduction of their Athlon 64 line. Then Intel came out with the Core series, then especially Core 2, and really hasn't looked back.
 
At this point, with contract work already knee-deep, why do you expect that anything at all will happen? Maybe this will affect other things, like their discrete video card marketing, but I don't expect that anything will happen to long-term money-makers, like their console part work.

Exactly right
 
Well, the internet tends to be doom and gloom, so who am I to say these layoffs aren't a good idea, or at least in positions that won't hurt them? I don't know that.

Besides garbage CPUs and a strong, ahem, brand preference for their competitor in GPUs, AMD's problem is probably pretty simple: the PC market is declining, the mobile market is growing, and AMD has nothing in mobile.

I don't think Nvidia is immune to these things either. TI recently de-emphasized their mobile chips because they noted manufacturers are increasingly going vertically integrated. This basically means Samsung and Apple, who control the majority of the market, build all their own mobile chips. There isn't a ton of room for others, including presumably Nvidia's Tegra. Another pressure is Intel IGPs from the low end; there's already no use for a GPU under $100, and that could get worse. They're both being squeezed from the bottom, AMD and Nvidia.

But so far at least Nvidia continues to do "ok", not exactly gangbusters, and their share price is like $12; AMD, not so much.
 

wsippel

Banned
AMD will never die. Intel can't afford to let it happen. In the worst case scenario, Intel would send them a truck full of money just to keep them afloat.
 

Kujo

Member
They've really dropped the ball to Nvidia in the GPU market, but how can they be doing so badly if their stuff is in all 3 future consoles? Do they not get paid until the systems are actually sold to the public?
They get paid in Snickers bars
 
For all the talk of AMD's Fusion and whatnot, how with ATI's GPUs they were going to have a big edge on everybody else, it doesn't seem to have lit the world on fire, obviously.

Sometimes I still resent the ATI acquisition.
 

Dilly

Banned
I'm just talking about standard use, not any of that weird stuff that most normal people wouldn't even notice, let alone bother trying to get working.

They have a reputation for requiring more knowledge of computers because of the problems they have and I was even told that by my friend (who still uses AMD today) when I was looking into building a PC.

I've read this so much about AMD, and I've personally never had a problem.

I owned nVidia until I bought my first ATi card which was an HD4xxx series and I just update my drivers regularly, that's all.
 

yon61

Member
That's pretty much it. The desktop market has always been AMD's bread and butter; now with that declining, along with the disaster known to most as Bulldozer, what little share they had of the desktop CPU market has shrunk even more thanks to people migrating to Intel. It's really desperate for them now.
 

1-D_FTW

Member
If they truly are gutting the Canadian engineering team (ATI), they really are beyond worthless at this point. Such a shame their top-level incompetence is dooming the only competition in the CPU and GPU markets.
 
Don't AMD cards still have the best price/performance ratio?

Yup.

And nobody cares. Gimme that Nvidia hotness mayne.

People should get it right: AMD GPUs are excellent products.

It's up in the air how long they can continue that, though, if they keep bleeding engineers. To me it's amazing they managed it this gen.
 

teh_pwn

"Saturated fat causes heart disease as much as Brawndo is what plants crave."
That would suck if AMD went under. Look at how Intel has virtually ignored desktop performance the last 3 years. If there were competition, we'd have 6-8 core CPUs without integrated graphics, so they could be cooled well enough to run at 5 GHz stock. But noooo, tablets.
 
What he means by "worst drivers" is that AMD drivers often either break games that used to work just fine, or just don't perform like they should considering the hardware. For example, I bought Binary Domain and with my Radeon 6950, my framerate would drop to the 20s, and sometimes even single digits, when something exploded. My Nvidia card runs the game at a constant 60fps...at 3200x1800 resolution.

Anecdotal evidence. As if no game ever had problems with Nvidia drivers.

For the record, I saw Binary Domain running at 60fps on a freaking Mobile Radeon 6850 card at 1080p.
 

Kujo

Member
Last time I bought an AMD CPU was 2005, but Intel has bested them since Core 2, which is a shame as I find AMD's heatsinks easier to put on. I really hoped some day they would be able to compete with Intel in the $200+ market again; I feel CPU performance hasn't improved as much as it should have.
 

Datschge

Member
AMD isn't dying. They are finally adapting to the market. The market that Intel and nVidia respectively control never really cared for AMD enough to make them a viable competitor (as in constantly being able to compete at the bleeding edge) without them making steady losses, even now that the choice is barely there anymore. So they're consequently pulling out.

Doing custom designs for hire/for licensing is actually the area that AMD does well in, and that's where they show intention to expand. Here the competition is mainly ARM Limited, a company with one fourth the revenue and one fifth the manpower of AMD but which, most importantly, is actually steadily profitable.
 
As someone who has had AMD for the last three GPUs I've owned (3850, 5850, 6950) I can say they're all great cards that work fine with 99% of games and were wonderful value for money.

Not sure why anyone thinks owning an AMD card is any more hassle than an Nvidia one, to be honest.

Until I switched to my 2500K, I had a Phenom X4, which was also a great CPU.
 

Derrick01

Banned
Anecdotal evidence. As if no game ever had problems with Nvidia drivers.

For the record, I saw Binary Domain running at 60fps on a freaking Mobile Radeon 6850 card at 1080p.

It's all anecdotal evidence. It's very rare that something is so bad that it breaks games for everyone.

But go back to last Fall when just about every major release had people in the OTs screaming about how it doesn't work right or work at all on their AMD cards.
 
Hard to believe what an epic fail Bulldozer was.

Piledriver actually looks decent in some situations. I've been perusing early leaked benches this morning, but in some ways it's still pretty disappointing. It can hang with an i7-2600K, anyway. In one benchmark it even beat an i7-3770K, at like $100 less.

Single-thread performance is up a disappointing 4% (8350 vs. 8150). That's basically IPC improvement, since the turbo clocks are the same as the 8150's. Multi-thread is up about 15%. That's probably mostly clock, since the base clock is up 400 MHz. Overall performance is about +10%, which is better than the +5% Intel managed with Ivy vs. Sandy.
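Those deltas are easy to sanity-check. A minimal sketch of the percentage math, using made-up placeholder scores (the 8150/8350 numbers below are hypothetical, not real benchmark results):

```python
# Percentage improvement of one benchmark score over another.
def gain(old: float, new: float) -> float:
    return (new - old) / old * 100

# Hypothetical placeholder scores, not real FX-8150/FX-8350 results.
single_old, single_new = 100.0, 104.0  # ~4% single-thread uplift
multi_old, multi_new = 100.0, 115.0    # ~15% multi-thread uplift

print(f"single-thread: +{gain(single_old, single_new):.0f}%")  # +4%
print(f"multi-thread:  +{gain(multi_old, multi_new):.0f}%")    # +15%
```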

What worries me is they're pushing the clocks so high. There's really nowhere to go above 4 GHz.

If games actually start to be heavily threaded, BD/PD might actually be pretty decent.

But they'll market them as unlocked, great overclockers for enthusiasts, etc., and they might actually be a sight better than BD and not total pushovers. Fudzilla has a piece about them going to 5 GHz on water, etc.
 
It's all anecdotal evidence. It's very rare that something is so bad that it breaks games for everyone.

But go back to last Fall when just about every major release had people in the OTs screaming about how it doesn't work right or work at all on their AMD cards.

I see a lot more complaining on all forums about problems with Nvidia cards.

It could partly be a function of a lot more people owning Nvidia cards, but still.
 

Iacobellis

Junior Member
I never had a problem with AMD or Nvidia chips, but I am sure AMD wouldn't be so deep in the red right now if Apple hadn't gone back to Nvidia for their MacBook line.
 

Shambles

Member
Industry rumblings, but on the internet these days they almost always end up with some truth.
A very strange situation is developing, since AMD graphics powers the Wii-U and the next gen consoles. An AMD CPU is rumored to power next gen consoles too!

AMD's current Southern Islands GPU offerings struggle to match Nvidia in efficiency at the high end (HD 79xx vs GTX 680/670) and lack the marketing muscle at the lower/mid end (HD 78xx/77xx vs GTX 660/650).


http://www.guru3d.com/news_story/am...er_results_released_and_they_are_alarmig.html




http://techreport.com/news/23725/multiple-sources-say-amd-plans-more-layoffs




http://semiaccurate.com/2012/10/12/amds-layoffs-target-engineering/

Avatar pretty much sums up the OP.


AMD is dying just like PC gaming is dying, right? You really shouldn't come to conclusions about things you have no understanding of. Laying off 10% of your workforce during a major restructuring is hardly unusual. If RIM is still hanging in there, AMD sure as hell isn't going anywhere.

And AMD GPUs not being able to compete?? What planet are you on? The only place where nVidia CAN compete is high-end GPUs, which make up very little of the bottom line. As soon as you go to mid-range and low-end GPUs, it's all AMD.

This thread is so misleading I'm inclined to believe it's a shill post from nVidia. We know that all the major players actively do this.

Edit: Original thread title before a Mod fixed it: "AMD is dying, what will happen to Wii-U, Durango, Orbis?"
 
Unlikely but if AMD goes under I hope they spin off the GPU division. It would be sad to see the graphics market go the way of the CPU market.
 

ghst

thanks for the laugh
it's pretty telling that all discussion in here is about the quality of their GPUs, since their graphics department is the only one making money. it's the only field where they are relevant enough to even warrant discussion.

bring back ATI.
 

1-D_FTW

Member
it's pretty telling that all discussion in here is about the quality of their GPUs, since their graphics department is the only one making money. it's the only field where they are relevant enough to even warrant discussion.

bring back ATI.

Which makes it all the more sad that Charlie "On the AMD payroll" D-Bag is reporting they're laying their scalpel to the Canadian engineers that make up the ATI remains.
 

yon61

Member
Avatar pretty much sums up the OP.


AMD is dying just like PC gaming is dying, right? You really shouldn't come to conclusions about things you have no understanding of. Laying off 10% of your workforce during a major restructuring is hardly unusual. If RIM is still hanging in there, AMD sure as hell isn't going anywhere.

And AMD GPUs not being able to compete?? What planet are you on? The only place where nVidia CAN compete is high-end GPUs, which make up very little of the bottom line. As soon as you go to mid-range and low-end GPUs, it's all AMD.

This thread is so misleading I'm inclined to believe it's a shill post from nVidia. We know that all the major players actively do this.

Edit: Original thread title before a Mod fixed it: "AMD is dying, what will happen to Wii-U, Durango, Orbis?"

Bad comparison. AMD is in a very unhealthy state, unlike PC gaming; surely you should know that. Intel is the chipmaker that's healthy here.

Ten percent is the lowest estimate, but twenty to thirty percent is the number that is being thrown around more. Charlie from SemiAccurate has an article on this that you should read.

AMD is not RIM and RIM is not AMD.
 
With all three consoles rumored to be powered by AMD cards, won't AMD have some sort of advantage on the PC, since all three consoles are going to see games optimized for AMD GPUs?
 

1-D_FTW

Member
With all three consoles rumored to be powered by AMD cards, won't AMD have some sort of advantage on the PC, since all three consoles are going to see games optimized for AMD GPUs?

One would certainly think so. You'd think the PS4/Xbox 3 ports would all be optimized to run best on their architecture. But that's another year away. And with the current talent flush they're engaged in, who even knows what they're going to be looking like by then.
 
Next gen consoles aren't out yet, so they won't be making any money from them yet. Also, they are not making and selling the GPUs, just receiving a royalty, so it's not going to be as lucrative as selling millions of discrete GPUs.
 
The 670 just murdered them. They have nothing that can compete.

Not really, they (and Nvidia) have been in that situation many times over the years and come back strong. The GPU cycle is too short for one fuckup to make a big difference.
 