
Intel Arc A770 officially announced for sale on October 12th at $329

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Is this Intel's top end card?

Very disappointing to match a mid-range card which was released more than a year ago.

This thing is DOA. Although I must say XeSS looks promising. Impressive uplift in Tomb Raider.
How is it DOA when they are targeting the most popular segment of the market?
That RTX 3060 - RTX 3070 tier of card is what sells the most.
They are selling it for 329 dollars, which is actually affordable for people, especially those who play at 1080p - 1440p.

The RTX 3060 is the most popular Ampere card, followed by the 3070, followed by the 3060 Ti.
AMD's most popular card is the 6600 XT....again in that xx60 - xx70 tier.

xx60s and xx70s dominate the market; if you are trying to sell numbers and get your tech used (XeSS), you need to go where the market is.
Making a 3080 or 4070 class card would be pointless cuz you'd only sell 12 of them.
Cuz who you gonna convince off the jump to give you ~800 dollars for a GPU when they've never seen your product do work?


Devs have no reason to support XeSS when no one is ever gonna actually use it.
With affordable cards in the segment people actually buy in, you at least have a chance
finally, they're coming out
too bad it costs as much for the A770 as it does for a 3060 Ti, I figured they'd undercut Nvidia on that one
You ain't gonna find a brand new 3060 Ti for 329 dollars.

Hell, the card's MSRP is 400 dollars and most of the ones available are closer to 500 than to 329 dollars...... To make matters worse (or better for Intel), with Nvidia saying fuck y'all broke cats, don't expect the 3060 Ti replacement anytime soon.
Their lower-tier AD chips are going into laptops right now.
 

Ev1L AuRoN

Member
$329 for 3060 Ti performance....
Intel is betting on AMD and Nvidia not lowering price/performance for the next generation of GPUs.
This is really bad news for gamers, as the 3 companies might be trying to scalp consumers as much as they can.
But then again, why buy an Intel GPU if we can get the same, but with better drivers and support, from Nvidia or even AMD?
Intel needs to launch products and have customers in order to improve. From the interview I saw on DF, they say that they will be pricing their high-end model by the worst-case results, meaning the card is actually more powerful but needs the drivers to mature. Even if the product doesn't make a lot of sense now, I'm thrilled they entered the GPU market. Who knows, 3 years from now we could have meaningful options from all of the big three.
 

PhoenixTank

Member
Still more advanced than Radeon. AMD should probably aim below this price.
Tom Hardy Bait GIF
 

Buggy Loop

Member
How is it DOA when they are targeting the most popular segment of the market?
That RTX 3060 - RTX 3070 tier of card is what sells the most.
They are selling it for 329 dollars, which is actually affordable for people, especially those who play at 1080p - 1440p.

The RTX 3060 is the most popular Ampere card, followed by the 3070, followed by the 3060 Ti.
AMD's most popular card is the 6600 XT....again in that xx60 - xx70 tier.

xx60s and xx70s dominate the market; if you are trying to sell numbers and get your tech used (XeSS), you need to go where the market is.
Making a 3080 or 4070 class card would be pointless cuz you'd only sell 12 of them.
Cuz who you gonna convince off the jump to give you ~800 dollars for a GPU when they've never seen your product do work?

But how could the Intel camp compare their e-peens if there's no high-end big Ds in their camp?

Schwartz.gif
 

thuGG_pl

Member
Is this Intel's top end card?

Very disappointing to match a mid-range card which was released more than a year ago.

This thing is DOA. Although I must say XeSS looks promising. Impressive uplift in Tomb Raider.

They need to start with something. And usually it's better to start with baby steps.
I'm keeping my fingers crossed for Intel, because we need more competition.
 

Silver Wattle

Gold Member
This is good, very cheap for a card with so much memory, great for people that love modding.
I haven't checked, but will they be offering an 8GB version with the same performance for cheaper? A sub-$300 8GB card could sell a lot of units.

This is assuming they can massively improve their drivers.
 

Crayon

Member
This is good, very cheap for a card with so much memory, great for people that love modding.
I haven't checked, but will they be offering an 8GB version with the same performance for cheaper? A sub-$300 8GB card could sell a lot of units.

This is assuming they can massively improve their drivers.

Feels like everyone is already side-eyeing 8GB cards. It will only look worse a year from now when the things are still for sale.
 

adamosmaki

Member
That's DOA when the established 6600 XT offers similar performance with much better drivers (if the A380 driver clusterf***k was anything to go by). Good thing Intel admitted they don't expect to make any profits anytime soon, but they said they are committed to the GPU race for the long run.
 

DeadFire87

Neo Member
On paper, its specs are like the low end of the high tier. Once drivers are worked out a bit, I'm pretty sure the performance should match that at some point if they get it right. It should match up to the 6800 or RTX 3080 spec-wise, but clearly its performance is slightly lower than the specs suggest. I'm thinking it's likely driver-related. Either way, it's a decent enough value for the average gamer.
 
I'm glad they're doing this but, from the perspective where Nvidia has a massive advantage on account of both the hardware and software features, and where AMD consistently has a much smaller market share despite putting out good products, I have to wonder where Intel could edge in. XeSS is significant as a feature, but it doesn't win against DLSS, and it isn't as widely adopted. As it is there's no selling point beyond enthusiasts wanting to try something novel. And this is a harsh market where even AMD has limited success despite their efforts.

They're going to need to invest heavily for a good few years ahead. Lots of people have raised doubts about Intel's willingness to do so. They'll certainly have difficulty justifying the expense to their shareholders. AMD didn't have so much difficulty because they purchased ATI, and Nvidia has been in the game for a long time, but Intel will struggle... I think they'll fail to justify the long-term strategy and end up dumping the GPU business after a generation or two.
 
Last edited:

Dr.D00p

Member
I'd like to try one just for the hell of it but I suspect they'll be nothing but paperweights on the second hand market in a year or two, unlike Nvidia & AMD cards which hold their value pretty well.
 
Shouldn't the entire A-line have launched already in summer? Is this the only card they are launching?

Something around a 3060 or a bit above is probably the area people would want most right now, at a relatively reasonable (though still high) price, but something competing with the 1650 would also be nice. Especially if it had proper XeSS, at the price point of a GTX card without DLSS.
 

GreatnessRD

Member
Shouldn't the entire A-line have launched already in summer? Is this the only card they are launching?

Something around a 3060 or a bit above is probably the area people would want most right now, at a relatively reasonable (though still high) price, but something competing with the 1650 would also be nice. Especially if it had proper XeSS, at the price point of a GTX card without DLSS.
They launched the A380 in early September for Desktop if I remember correctly. It was technically available in like March or something in China and their laptops, too. I was intrigued with the A770, but with their current drivers AND I believe Gamers Nexus said the performance takes a serious dip on AMD systems, I'd be crazy to pay $350 for that card. $250? Might've had a sale, Project Pat. Might've had a sale.
 

Mattyp

Gold Member
I just texted my son, who corrected me that it is indeed a 1080 Ti.
Thanks for correcting me. I thought maybe something was off. It's my memory. 😏

I'm still running a 1080 Ti while waiting for the 4090s to drop.

20 series, just shit.
30 series, better but still can't pull 4K144 properly.
40 series, please be the jump I desire.

That said, the 1080 Ti has held its own for so long now and will be a worthy upgrade for my server box.
 

winjer

Gold Member
330 USD meaning 450 euros, right?

Considering the drop in the Euro's value, it will be close to that, depending on the country, as each EU member has a different VAT percentage.
But it should be around 400-420 euros. For AIBs' custom cards, it will probably go to 450, or higher.
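As a rough back-of-the-envelope sketch of how that euro figure falls out, assuming a USD/EUR rate near parity and typical EU VAT rates (both the rate and the VAT values below are assumptions, not figures from the thread):

```python
# Rough EUR street-price estimate for a $329 USD MSRP card.
# Assumed inputs (hypothetical): USD->EUR rate near parity,
# VAT between 19% (e.g. Germany) and 23% (e.g. Portugal).
usd_msrp = 329
fx_usd_to_eur = 0.98  # assumed autumn-2022 rate, close to parity

for vat in (0.19, 0.21, 0.23):
    eur_price = usd_msrp * fx_usd_to_eur * (1 + vat)
    print(f"VAT {vat:.0%}: ~{eur_price:.0f} EUR")
# Roughly 384-397 EUR before any retailer/AIB markup, which is how
# you land in the 400-450 EUR range quoted above.
```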
 

01011001

Banned
this is a really decent card. IMO more attractive than AMD's cards in this range and also competitive against Nvidia.

so a decent showing for Intel's first try at a mid range card.


it now depends on RDNA3 whether it will remain a well-priced card.
in RT performance they are easily mopping the floor with RDNA2; let's see what AMD manages to do to improve that in RDNA3
 
Last edited:
They launched the A380 in early September for Desktop if I remember correctly. It was technically available in like March or something in China and their laptops, too. I was intrigued with the A770, but with their current drivers AND I believe Gamers Nexus said the performance takes a serious dip on AMD systems, I'd be crazy to pay $350 for that card. $250? Might've had a sale, Project Pat. Might've had a sale.
The A380 seems exactly like the card best suited for the casual mass market. Performance is around the RX 6400, and while RT barely produces good framerates, with that high-end feature turned on it seems much, much better than the 6400. And the 1650 offers no RT at all. So going for that niche might be even better than trying to go for a cheap 3060 competitor.
I can't find XeSS reviews for that card (the feature is not really out yet?), but I would assume that upscaling, rather than merely acceptable RT, might suit the card's hardware much better and show its value best. Meanwhile the A310, with the same 75W on paper but far fewer cores across the board, seems just weird.
 

PaintTinJr

Member
I'm glad they're doing this but, from the perspective where Nvidia has a massive advantage on account of both the hardware and software features, and where AMD consistently has a much smaller market share despite putting out good products, I have to wonder where Intel could edge in. XeSS is significant as a feature, but it doesn't win against DLSS, and it isn't as widely adopted. As it is there's no selling point beyond enthusiasts wanting to try something novel. And this is a harsh market where even AMD has limited success despite their efforts.

They're going to need to invest heavily for a good few years ahead. Lots of people have raised doubts about Intel's willingness to do so. They'll certainly have difficulty justifying the expense to their shareholders. AMD didn't have so much difficulty because they purchased ATI, and Nvidia has been in the game for a long time, but Intel will struggle... I think they'll fail to justify the long-term strategy and end up dumping the GPU business after a generation or two.
I think with the current price of energy and the desire to be greener, it is Nvidia who aren't getting the memo ATM, and Intel's mid-tier launch is about being in a product bracket (with AMD) that does away with needing more than the 75 watts provided by the full-sized PCIe bus.

In effect, needing additional PCIe power connections (Molex for AGP, originally) was always stealing tomorrow's performance for today's products, which I don't believe is going to be allowed for consumer GPUs for much longer, or be cost-effective for consumers with energy price trends.

My hunch is that between AMD and Intel they are going to create a joint standard for a GPU-accelerated motherboard socket - like the old maths co-processor of 80287/80387/80487 days - and either do secondary memory slots for VRAM on motherboards or build new chipset features to allow bespoke GDDR6X modules (for desktop) to satisfy both RAM and VRAM needs - which conceptually will just look like separate RAM and VRAM to programs.

If Intel and AMD standardise a new solution capped at 75 watts (maybe 150 watts) for a socketed GPU on the motherboard and start phasing out PCIe for desktop GPUs via their latest mobo chipsets, Nvidia's dominance will be weakened unless they change their strategy to compete for performance using fixed interconnects and the hard power cap that would come from such a socketed GPU solution.
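For reference, the power ceilings being discussed come straight from the PCIe spec: the x16 slot itself delivers up to 75 W, a 6-pin auxiliary connector adds 75 W, and an 8-pin adds 150 W. A minimal sketch of that board-power arithmetic:

```python
# PCIe board-power ceilings: the x16 slot itself supplies up to
# 75 W; each auxiliary power connector adds a fixed budget on top.
SLOT_W = 75
AUX_W = {"6-pin": 75, "8-pin": 150}

def max_board_power(connectors):
    """Rated power ceiling for a card with the given aux connectors."""
    return SLOT_W + sum(AUX_W[c] for c in connectors)

print(max_board_power([]))                  # 75 W, bus-powered only
print(max_board_power(["6-pin"]))           # 150 W
print(max_board_power(["8-pin", "8-pin"]))  # 375 W, typical 2x 8-pin card
```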
 
AMD and Intel they are going to create a joint standard for a GPU accelerated motherboard socket
I doubt they could exclude Nvidia with something like that. Even if both are the smaller players in the GPU market, that sounds a lot like a monopoly move by the two x86 players. If Nvidia, on the other hand, can also provide their GPUs for that socket, they would again be just as competitive as now.

I agree though that with ever-increasing power draw (CPUs too, but less so), some new standard for mainboards seems logical. But it's needed more for the high-end monsters than for the 75W low end. Better APUs might also be something people would appreciate. Currently the GPU parts are a joke, barely a balance between CPU power and minimal graphics capabilities, and the more powerful your CPU portion is, the less sense it makes to again have only a few cores of Vega, RDNA, whatever. Just make proper APU computers, like consoles already are.
 

PaintTinJr

Member
I doubt they could exclude Nvidia with something like that. Even if both are the smaller players in the GPU market, that sounds a lot like a monopoly move by the two x86 players. If Nvidia, on the other hand, can also provide their GPUs for that socket, they would again be just as competitive as now.

I agree though that with ever-increasing power draw (CPUs too, but less so), some new standard for mainboards seems logical. But it's needed more for the high-end monsters than for the 75W low end. Better APUs might also be something people would appreciate. Currently the GPU parts are a joke, barely a balance between CPU power and minimal graphics capabilities, and the more powerful your CPU portion is, the less sense it makes to again have only a few cores of Vega, RDNA, whatever. Just make proper APU computers, like consoles already are.
It would be a standard, but unless Nvidia were going to do motherboard chipsets again - for niche motherboards - I'm not sure what part of them being equally in charge of the standard - which anyone could use - would be helpful.
AFAIK the PCI Express standard is largely controlled by Intel - and a few others like IBM and HP - without Nvidia, and they aren't in any way disadvantaged in using the standard, and make lots of money with their PCIe GPUs, so it shouldn't be a conflict in that sense IMO.

The problem for Nvidia being competitive in a socketed GPU world is that their super/hyper-scalar design philosophy to hold the performance crown will be at odds with having to work within such a standard at predictably restricted power (and standardised GPU-to-VRAM interconnects, set chip size within reason, and new constraints for GPU cooling), say compared to the days of just going from bus-powered, to 1x 6-pin PCIe power, then up to 2x 6-pin connectors, and now 2x 8-pin. Intel and AMD would likely have a clock advantage at restricted power, and Nvidia would probably offer more processor units, but at restricted clocks to lower power draw.

/edit
The APU angle is largely what Intel and AMD have already been doing with integrated graphics in desktop CPUs, just that they don't call them APUs because the integrated graphics are so far adrift from a dedicated GPU - say, compared to the ones in the consoles - and ramping that up only really makes sense for laptops IMO. Going with a dedicated GPU socket on motherboards would certainly make it easier to sell pre-built systems, and then upsell standard socketed GPUs that would definitely work, giving the enthusiast crowd fewer reasons to avoid pre-built OEM systems.
 
Last edited:
Good luck with those drivers, Intel! You'll soon discover why Nvidia and AMD have invested literal decades into driver development, requiring vast amounts of financial and software engineering resources, just to make it so PC gamers can pew pew at high framerates with fancy graphics. You're also about to discover how ruthless and brutal the discrete GPU market is, which is why there were only 2 players left in the first place. Razor thin margins, massive hardware and software development costs, TSMC who literally doesn't care about you because 70% of their capacity is already spoken for by Apple, unruly OEM partners, completely unscrupulous retailers, a brutal boom and bust product cycle which was made even more unstable by crypto mining, and a customer base that quite frankly is ungrateful, unappreciative, and borderline psychotic.

Have fun with your new market, Intel! 😂
 
Last edited:
not sure i'll buy one but that is one nice looking GPU.

i have an Nvidia founder's card which is a mostly all metal design (RTX 2080).



i couldn't go back to buying those cards from the likes of asus, msi, etc. they are god damn awful looking and HUGE. here is the new RTX 4090.



AMD's cards look quite cool too.



i like the simple basic design of the Intel card. looks good!

 

PaintTinJr

Member
Good luck with those drivers, Intel! You'll soon discover why Nvidia and AMD have invested literal decades into driver development, requiring vast amounts of financial and software engineering resources, just to make it so PC gamers can pew pew at high framerates with fancy graphics. You're also about to discover how ruthless and brutal the discrete GPU market is, which is why there were only 2 players left in the first place. Razor thin margins, massive hardware and software development costs, TSMC who literally doesn't care about you because 70% of their capacity is already spoken for by Apple, unruly OEM partners, completely unscrupulous retailers, a brutal boom and bust product cycle which was made even more unstable by crypto mining, and a customer base that quite frankly is ungrateful, unappreciative, and borderline psychotic.

Have fun with your new market, Intel! 😂
I'm not saying you are wrong about the "razor thin margins", but I thought someone said in the EVGA (quitting the GPU market) thread that Nvidia makes 30% on hardware, and a graphic of their margins versus their AIB partners showed most of the margin going to Nvidia and small percentages to the partners.
 

RoboFu

One of the green rats
I have little faith in Raja Koduri. I'd like to see him prove me wrong though.
 

Larogue

Member
We need to support their efforts to combat the NVIDIA/AMD duopoly in the GPU market, with its outrageous prices and shady tactics (e.g. selling the RTX 4070 as the 4080 12GB for $900).
 
Last edited:
The problem for Nvidia being competitive in a socketed GPU world is that their super/hyper-scalar design philosophy to hold the performance crown will be at odds with having to work within such a standard at predictably restricted power
Nvidia was/is competitive with larger nodes. Very much like Intel was until Ryzen: easily better with a worse node (although Intel just numbers their process a bit differently) and less power draw since Core 2 Duo.
Artificially crippling any of their advantages with a limiting standard sounds like an insane idea, benefiting no one except those who would dictate this standard. I think only 15% or so on Steam have a 1650, i.e. the 75W bracket. Requiring all customers to lower their desires, enforcing 75W or a bit more as the only standard via a new socket, seems utterly pointless; PCIe already easily provides this, if people want it. Many just don't. The other end, though, with cables now even burning, is the one under construction.

APUs in laptops are great because lower-powered CPUs and basic GPUs are balanced. Desktop APUs are not. Here both Intel and AMD could easily hurt Nvidia: entirely remove the whole, then unnecessary, PCIe thing and make not only one chip but also cheaper mainboards, which would be fine for many people. Of course it's not ideal for separate upgrades, but for the average casual and mid-range gamer it's imho better. Flexible parts matter more for the high end. So: the exact opposite of creating a 75W socket that does pretty much the same as the already existing PCIe slot, without pretending to be green or condescendingly telling the customer what he actually wants.
 

PaintTinJr

Member
Nvidia was/is competitive with larger nodes. Very much like Intel was until Ryzen: easily better with a worse node (although Intel just numbers their process a bit differently) and less power draw since Core 2 Duo.
Artificially crippling any of their advantages with a limiting standard sounds like an insane idea, benefiting no one except those who would dictate this standard. I think only 15% or so on Steam have a 1650, i.e. the 75W bracket. Requiring all customers to lower their desires, enforcing 75W or a bit more as the only standard via a new socket, seems utterly pointless; PCIe already easily provides this, if people want it. Many just don't. The other end, though, with cables now even burning, is the one under construction.

APUs in laptops are great because lower-powered CPUs and basic GPUs are balanced. Desktop APUs are not. Here both Intel and AMD could easily hurt Nvidia: entirely remove the whole, then unnecessary, PCIe thing and make not only one chip but also cheaper mainboards, which would be fine for many people. Of course it's not ideal for separate upgrades, but for the average casual and mid-range gamer it's imho better. Flexible parts matter more for the high end. So: the exact opposite of creating a 75W socket that does pretty much the same as the already existing PCIe slot, without pretending to be green or condescendingly telling the customer what he actually wants.
I get what you are saying in regards to node labelling, but the superscalar design style of Nvidia (in general) doesn't lend itself well to lower power in today's GPU usage, because at similarly high clocks to AMD, the Nvidia GPUs draw much more power; they are effectively muscle-car designs.

I agree that my 75 watts was probably too conservative, which is why I put the 150-watt suggestion in brackets; I was deliberating whether budget mobos could still be made to handle a socket for that reliably. 150 watts is probably a better upper limit, in line with consoles' APUs, although with an extra 150-watt mobo draw for everything else in a PC (CPU, memory, etc.), reliability at the beginning might be iffy.

I still think the RTX xx70 (and above) gamer buyers really need to consider what is responsible at this point in time. Even if it wasn't 75 watts or 150 watts for the GPU socket, an overall TDP of 250 watts for both CPU + GPU sockets on a new board design wouldn't seem unreasonable IMO. People maybe just need to wait for Moore's law gains to get double the performance - at the same power draw - and consider that their wants for gaming - obviously not production GPUs - are unreasonable in the current climate - pun intended.

The reason I disagree with you on the APU solution is that we already have far better console silicon being optimised for in games, and high-end PC gamers don't want APUs, so it doesn't solve the need of the PC gamer wanting to upgrade their GPU on a yearly basis. A new socket that reins in the standard on power use to encourage efficient design (to win the performance crown) and makes VRAM upgradable in size and type, making GPUs usable for longer, should be an overall gain IMO.
 
A770 16GB, $349
A770 8GB, $329
A750 8GB, $289



Pretty sweet deals.

What's interesting is that the A770/A750 seem to have a considerable ray tracing performance advantage over the RTX 3060. And both the A series and the 3060 have AI-based upscaling using dedicated matrix units, unlike what AMD is doing. ATM I'd get the A770 over low-to-mid-end AMD parts.
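As a side note on the AI-upscaling point: XeSS, like DLSS, renders internally at a reduced resolution and reconstructs the output image. A minimal sketch of the render-resolution math, using the commonly cited per-axis scale factors (treat the mode names and factors as approximate, not something from this thread):

```python
# Internal render resolution for a given output resolution and
# upscaler quality mode. Scale factors are per-axis divisors; the
# values below are the commonly cited XeSS/DLSS-style ones
# (approximate assumptions, not taken from the thread).
MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
         "Balanced": 1.7, "Performance": 2.0}

def render_resolution(out_w, out_h, mode):
    s = MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in MODES:
    w, h = render_resolution(2560, 1440, mode)
    print(f"{mode:>13}: {w}x{h}")
# At 1440p output, Performance mode shades only a quarter of the
# output pixels (1280x720), which is where the big uplift comes from.
```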
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


That is actually a hell of a deal.
I hope they have sorted their drivers out.
Can't wait to see the reviews for this thing.

For a first attempt, taking on Nvidia's best seller is a good step.

I'm sure some people who are first-time builders and/or just need a decent gaming machine will be happy to get the cheaper A750s and A770s.
It gets the job done, and with some settings optimization you could probably have a good time at 1440p.

The ray tracing performance seems all over the place though.
Being able to power through Dying Light 2 and Metro Exodus that well would have been a good sign...but then in BFV, which only does reflections, it's getting pummeled?
Maybe the more taxing the effects, the better it performs vs the 3060?
*Looks at Cyberpunk 2077....nope, it just seems like the drivers pick and choose what they like and what they don't like.


Onwards to the B series Intel.
 
Yes, looks really good. Ray tracing is amazing for this bargain card. 16 GB is absolutely amazing for this low price; Nvidia's top cards (3080, 4080) have 16 GB or less. Would be really interesting to see how Intel improves their graphics cards, and maybe we will see them in the next PlayStation 6 or 7. Or in the next Nintendo consoles/handhelds. Interesting times ahead.
 
If Intel is willing to bundle a mid-range CPU and their GPU, a lot of people with a small budget might be interested in building their first gaming PC
I guess some OEMs will offer such Intel gaming PCs. Intel is, despite Ryzen existing, still the go-to brand for most PCs, and usually Nvidia is included if it is aimed at gamers, while Ryzen + RX is the unusual combo. If you as a PC supplier already have a deal with Intel for the CPU, adding those new GPUs should be relatively cheap, and Intel might offer some sweet deals if they really want to gain market share and this isn't just a limited test run.
Also, if their APUs get more competitive, this will get interesting. Right now Intel iGPUs are barely above office-ready, basically just bare-minimum display output, but AMD also never made a proper console-like APU with an adequate power budget for graphics processing. Maybe Intel finally makes an APU that is more powerful and still balanced between both units.
 

FireFly

Member
ray tracing better than nvidia cards? am i reading the charts right?
It's apparently better than the 3060, but it is also 47% bigger and has a 32% higher TDP, so it's not as impressive as it seems. But it looks like performance doesn't fall off as much as AMD's in the most demanding titles.
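A quick sketch of the normalization being done there, using the commonly reported die-size and TDP figures (ACM-G10 at roughly 406 mm² and 225 W versus GA106 at roughly 276 mm² and 170 W; treat these as approximate):

```python
# Normalizing the A770-vs-3060 comparison by die area and TDP.
# Figures are the commonly reported ones (approximate assumptions):
# ACM-G10 ~406 mm2 / 225 W, GA106 ~276 mm2 / 170 W.
a770 = {"area_mm2": 406, "tdp_w": 225}
rtx3060 = {"area_mm2": 276, "tdp_w": 170}

area_ratio = a770["area_mm2"] / rtx3060["area_mm2"]
tdp_ratio = a770["tdp_w"] / rtx3060["tdp_w"]
print(f"die area: {area_ratio - 1:+.0%}")  # ~+47%
print(f"TDP:      {tdp_ratio - 1:+.0%}")   # ~+32%
# So even where the A770 wins outright in RT, it is spending a lot
# more silicon and power to do it, which is the point above.
```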
 

Trogdor1123

Member
A770 16GB, $349
A770 8GB, $329
A750 8GB, $289



Pretty sweet deals.

What's interesting is that the A770/A750 seem to have a considerable ray tracing performance advantage over the RTX 3060. And both the A series and the 3060 have AI-based upscaling using dedicated matrix units, unlike what AMD is doing. ATM I'd get the A770 over low-to-mid-end AMD parts.

Kind of impressive. Waiting for real world numbers but very nice for a first try
 
Last edited:

Silver Wattle

Gold Member
Not looking as good as I had expected: the 16GB costs more and is a limited edition.
The 8GB is over 300, and the A750 is barely cheaper; it should be 250.
They have priced it just high enough that you're better off sticking with the incumbents.
 

MiguelItUp

Member
According to that link, the Arc A770 stacks up to my Sapphire Nitro+ RX 5700 XT (which is still a beast in non-RT stuff) in the following ways:

Consumes 55% less power;
Has 2x the memory;
Is 8% faster;
Supports HW ray tracing (the 5700 XT doesn't);
Costs 27% less than what I paid for my 5700 XT in 2019 (before the crypto madness, before the chip shortage)

Considering my 5700 XT still easily gets 100+ fps at 1440p with high/ultra settings on a lot of modern games, I'd say that's a pretty good deal.
Agreed. I definitely think for a first go this is pretty solid. Curious to see where they go from here, especially since Nvidia has been... shitting the bed in a variety of ways lately.
 