
ATI 58XX Preview - Media Stuff goes in here.

Mandoric

Banned
TrAcEr_x90 said:
I'm loving my new 5770. It was a big upgrade from my previous card that came with my HP desktop. Only problem is when I put it to 1920x1080, the toolbar is cut off at the edges, like it's extending too far off screen. So I have to put it down one resolution.

HDMI out overscans; you can use a DVI-HDMI adapter or open up CCC, go to Graphics -> Desktops and Displays, right-click the panel in the lower bar and Configure..., Scaling Options, adjust the slider.
 

MightyKAC

Member
RayStorm said:
So about DisplayPort to DVI adapters... I remember reading that some kind of special, rare and expensive one is needed. Yet I see such adapters for less than 10€. Would these actually work and provide me with a cheap and easy 3 screen-solution?

Sadly no.

The cheap adapters simply won't help you as far as the 3-monitor Eyefinity setup goes. And trust me, this is hard experience talking (i.e. I tried the cheap alternative and was met with disappointment). I'm not sure of all the technical details, but from what I understand the card can only generate its own timing for two outputs, so the third display has to be driven over DisplayPort; active adapters like this one supply that timing themselves, as well as needing an open USB port for power. So long story short, you're gonna need the expensive one for now...

Although, I've read somewhere that ATI was working with some manufacturer to produce a cheaper OEM converter. So if you're willing to wait a few months or so that may also be an option.
 

1-D_FTW

Member
Mandoric said:
HDMI out overscans; you can use a DVI-HDMI adapter or open up CCC, go to Graphics -> Desktops and Displays, right-click the panel in the lower bar and Configure..., Scaling Options, adjust the slider.

Does the scaler actually work? That's the issue I'm having with Nvidia at the moment. The scaler was completely broken on any driver past 182.5. You can resize to create a custom resolution, but you can't scale away overscan at 1080 or 720 like you used to be able to.
 
Mandoric said:
HDMI out overscans; you can use a DVI-HDMI adapter or open up CCC, go to Graphics -> Desktops and Displays, right-click the panel in the lower bar and Configure..., Scaling Options, adjust the slider.

Thanks!! I'll try this when I get home. Is it better to use the converter?
 
MightyKAC said:
Sadly no.

The cheap adapters simply won't help you as far as the 3-monitor Eyefinity setup goes. And trust me, this is hard experience talking (i.e. I tried the cheap alternative and was met with disappointment). I'm not sure of all the technical details, but from what I understand the card can only generate its own timing for two outputs, so the third display has to be driven over DisplayPort; active adapters like this one supply that timing themselves, as well as needing an open USB port for power. So long story short, you're gonna need the expensive one for now...

Although, I've read somewhere that ATI was working with some manufacturer to produce a cheaper OEM converter. So if you're willing to wait a few months or so that may also be an option.


All I want to do is have my desktop go across 3 screens. I have all 3 connected, but the third one won't turn on in the options, so I'm sure it's the DisplayPort thing. I'm not doing it for gaming, so do I need to get the really good DisplayPort converter?
 
1-D_FTW said:
Does the scaler actually work? That's the issue I'm having with Nvidia at the moment. The scaler was completely broken on any driver past 182.5. You can resize to create a custom resolution, but you can't scale away overscan at 1080 or 720 like you used to be able to.

It works for me and my 5850. It also worked on my 4870. I have it going through HDMI to an Onkyo receiver to a Panasonic plasma @ 1920x1080, no overscan whatsoever after going into the options described and moving the slider all the way to the right.
 

Mandoric

Banned
TrAcEr_x90 said:
Thanks!! I'll try this when I get home. Is it better to use the converter?

Nah, it's better to use HDMI because it supports audio out if you're into that. The driver presets just expect a TV on the other end of the connection and default to a mode that replicates what tube TVs did to video.
 

Archie

Second-rate Anihawk
So it seems like the 5850s will be plentiful around January or so? Maybe I will have a job then and can finally build my rig. ._.
 

1-D_FTW

Member
SuperEnemyCrab said:
It works for me and my 5850. It also worked on my 4870. I have it going through HDMI to an Onkyo receiver to a Panasonic plasma @ 1920x1080, no overscan whatsoever after going into the options described and moving the slider all the way to the right.

Cool. I think. My HDTV has about 3 percent natural overscan. So I'm talking about the video card scaling to eliminate cutoff. This is something my previous ATI card was broken on and why I became so happy to see Nvidia had a working solution. Seems like things have reversed themselves.
 

dionysus

Yaldog
http://finance.yahoo.com/news/AMD-Builds-on-Product-bw-2500130987.html?x=0&.v=1

Highlights.

2010:

“Leo” – The next-generation enthusiast-class desktop PC platform with the industry’s first six-core desktop CPU, expected to deliver the ultimate performance for immersive gaming with support for DirectX® 11 graphics and ATI Eyefinity™ Technology.

“Danube” – The next AMD mainstream notebook platform featuring the first AMD mobile quad-core processors, “Danube” is expected to offer seven or more hours of battery life;

“Nile” – The 3rd Generation AMD ultrathin notebook platform, designed to offer seven or more hours of battery life;

2011:

“Bulldozer” and “Bobcat” – Two new x86 cores targeting different usage models. “Bulldozer” will be a completely new, high performance architecture for the mainstream server, desktop and notebook PC markets that employs a new approach to multithreaded compute performance for achieving advanced efficiency and throughput. “Bulldozer” is designed to give AMD an exceptional CPU option for linking with GPUs in highly scalable, single-chip Accelerated Processing Unit (APU) configurations. “Bobcat” will target the low power, ultrathin PC markets with an extremely small, highly flexible, core that also is designed to be easily scaled up and combined with other IP in APU configurations.

“Llano” – Targeted at mainstream notebooks and desktops, this APU will be the first in a family of next-generation designs that combine the power of the CPU and GPU onto a single piece of silicon and engineered to deliver impressive visual computing experiences, outstanding performance with low power and long battery life. It is expected to be the industry’s first APU processor ahead of the first “Bulldozer” and “Bobcat” based APUs;

“Zambezi” – An enthusiast desktop processor with up to eight cores, featuring the first “Bulldozer” core, scheduled for release in 2011.

Capture.jpg
 

FoxSpirit

Junior Member
Most TVs have natural overscan. But most TVs also have an option for 1:1 pixel mapping which will make the picture crisp as... crisps.
 

Lime

Member
Man, I'm glad I have an AM3 motherboard. Nothing like future-proofing your PC. I wonder how expensive the Thuban and Zambezi CPUs will be.
 
Anand said:
In 2011 we get Bulldozer and it comes in the form of the Zambezi CPU (AMD’s codenames are such fun). You’ll see four and eight core versions of Zambezi. Both will support DDR3 and both will work in Socket-AM3.

Ah shit, that's stunning work AMD, really nice. Makes me wish I was on socket AM3!! Having excellent upgrade options through 2011 is very attractive; it kind of makes up for the performance gap between the i5s and Phenom IIs.

Picking up a cheap-as-chips Athlon II X3 or something is an excellent buy for the here and now. More than enough CPU for a good couple of years, then easy and cheap upgrades to a next-generation 8-core CPU down the road. Can't argue with that.
 
Archie said:
So it seems like the 5850s will be plentiful around January or so? Maybe I will have a job then and can finally build my rig. ._.

AMD execs have to be going catatonic that they can't get more of these 58XX cards to market, considering all the great reviews and Nvidia still months away from real competition in the DX11 arena.
 

MightyKAC

Member
TrAcEr_x90 said:
All I want to do is have my desktop go across 3 screens. I have all 3 connected, but the third one won't turn on in the options, so I'm sure it's the DisplayPort thing. I'm not doing it for gaming, so do I need to get the really good DisplayPort converter?

Sorry bro. It's exactly what I had to do.
 

SapientWolf

Trucker Sexologist
KilledByBill said:
AMD execs have to be going catatonic that they can't get more of these 58XX cards to market, considering all the great reviews and Nvidia still months away from real competition in the DX11 arena.
I heard they're selling hundreds of thousands of the things.
 

Brofist

Member
Even in Japan the 58XX cards are moving quick. The PC shop I frequent was sold out of 5870s and down to 1 5850. I almost bought it on impulse but changed my mind. Still looking for a replacement for my 8800 GTS though.
 
D

Deleted member 17706

Unconfirmed Member
I can't wait to get me an octo-core processor! I might just skip quad all together.
 

Zyzyxxz

Member
Zefah said:
I can't wait to get me an octo-core processor! I might just skip quad all together.

When you get your octo-core, quad cores will become standard and you will just be playing the waiting game for applications to be multi-threaded for your octo.
 

Sciz

Member
Zyzyxxz said:
When you get your octo-core, quad cores will become standard and you will just be playing the waiting game for applications to be multi-threaded for your octo.
Eventually applications are going to have to start being coded for arbitrary numbers of cores and the problem will go away. Probably not quite yet, though.
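To make that concrete, here's a minimal sketch (mine, not Sciz's, and assuming a C++11-style compiler) of coding for an arbitrary core count: ask the OS how many hardware threads exist and split the work across exactly that many, so the same binary makes full use of a dual-core or a hypothetical octo-core without recompiling.

[CODE]
// Minimal sketch: divide work across however many hardware threads the
// machine reports, instead of hard-coding 2 or 4 cores.
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    unsigned n = std::thread::hardware_concurrency();
    if (n == 0) n = 1;  // the call may return 0 if the count is unknown

    const long long total_items = 100000000;
    std::vector<long long> partial(n, 0);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < n; ++i) {
        workers.emplace_back([&partial, i, n, total_items] {
            // Each worker takes every n-th item, so the split adapts to n.
            for (long long x = i; x < total_items; x += n)
                partial[i] += x % 7;
        });
    }
    for (auto& t : workers) t.join();

    long long sum = std::accumulate(partial.begin(), partial.end(), 0LL);
    std::printf("used %u threads, result %lld\n", n, sum);
    return 0;
}
[/CODE]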
 
Some information from Xbitlabs on the first "Fusion" chips:

XbitLabs said:
the first Fusion processor from AMD will feature 4 x86 cores that resemble those of Propus processor (AMD Athlon II X4) as well as 6 SIMD engines (with 80 stream processors per engine) that resemble those of Evergreen graphics chip (ATI Radeon HD 5800), PC3-12800 (DDR3 1600MHz) memory controller, possibly, with some tweaks to better serve x86 and graphics engines.

That's some mighty serious grunt for an integrated GPU; every low-end AMD CPU packing console-crushing graphics performance is nice to know. This should really give PC gaming a shot in the arm, as the low end is set to take an enormous leap with this. The leap from Intel integrated graphics to a 480-stream-processor ATI DX11 part cannot be overstated.

I'd say a ballpark of half the 5750's performance should give a good idea of where it'll land. Obviously bandwidth is going to be seriously reduced, and clocks as well, but it'll be packing a lot more SPs than such a hypothetical GPU, and the tight integration with the CPU can only help performance. So something like 60fps @ 1280x1024 in COD4 and UT3 for AMD's lowest-end setup; that's pretty awesome if you ask me. It should really help speed up the adoption of GPGPU as well.

I could see a modified Llano being the basis for the Wii's successor. Hack away two of those CPU cores, use a 128-bit bus to attach 1GB of GDDR5 (doubling bandwidth over the stock config, which should be a huge win for gaming performance), fab it at 28nm and you've got a pretty awesome, super-low-cost, single-chip console ready to go. It should be a nice ~2x upgrade over a PS3/360 graphically, which should make it a little beast at 720p, yet R&D and production costs will be very low, and launching at $250 in 2012 whilst still profiting on hardware should absolutely be possible. It'll have more than enough grunt for a very comprehensive full software BC mode as well; heck, getting 90%+ of the Wii library to run at 720p should absolutely be possible on such a system.
 

NotWii

Banned
I just got a 5770 this week
Feels great knowing it only eats up 18W when I'm doing stuff in Windows instead of 105W
And only eats up 188W in game instead of 205W
 
Behold her 1-billion-transistor beauty. Basically the bottom 2/3 is dedicated to the GPU; AMD aren't shitting around here, the low end is set for the biggest single leap in GPU performance in history. It still packs a full Athlon II-alike (but with 1MB of L2 cache per core) quad-core CPU in there as well:

a26e301a.png



This is massive for PC gaming, make no mistake. Much bigger news than the launch of the 58xx series; we're talking about Crysis on a low-end CPU with no dedicated graphics card, and it's only 18 months away. Shit just got real. :D
 

Minsc

Gold Member
brain_stew said:
Behold her 1-billion-transistor beauty. Basically the bottom 2/3 is dedicated to the GPU; AMD aren't shitting around here, the low end is set for the biggest single leap in GPU performance in history. It still packs a full Athlon II-alike (but with 1MB of L2 cache per core) quad-core CPU in there as well:

[IMG]http://img.photobucket.com/albums/v455/lixianglover/a26e301a.png[/IMG]


This is massive for PC gaming, make no mistake. Much bigger news than the launch of the 58xx series; we're talking about Crysis on a low-end CPU with no dedicated graphics card, and it's only 18 months away. Shit just got real. :D

18 months is a pretty long time though, did you mean Phenom II? Not sure what an Athlon II is. 18 months puts us around Q2 2011 (without delays), hopefully it's still up to the task of being a ground breaking step forward for gaming. Larrabee has a new challenger though, which is good, because I'm not sure which one will make it out first.

What kind of performance is expected with it? I mean Crysis will run on mid-range laptops if you set the graphics to low. Is it going to top the 5800 series cards?
 

Acosta

Member
Good times indeed; having such a jump at the lowest end of the PC will definitely push PC gaming. I'm starting to think that in 2-3 years the PC gaming landscape will have changed dramatically (for the better).

I feel good about my decision to wait a few months for the upgrade I wanted to do before summer. A 5850 and i5 750 look like a killer combination and I'm on the hunt for one.

BTW brain_stew, I guess there is nothing like nHancer for ATI cards, right? I'm going to play on a 720p television and want the best IQ possible (that's why I want the extra push of a 5800 instead of going for the 5770).
 
Minsc said:
18 months is a pretty long time though, did you mean Phenom II? Not sure what an Athlon II is. 18 months puts us around Q2 2011 (without delays), hopefully it's still up to the task of being a ground breaking step forward for gaming. Larrabee has a new challenger though, which is good, because I'm not sure which one will make it out first.

What kind of performance is expected with it? I mean Crysis will run on mid-range laptops if you set the graphics to low. Is it going to top the 5800 series cards?

An Athlon II is a Phenom II without the L3 cache; this variant will ship with a boosted 1MB of L2 cache per core. It's a pretty capable quad-core processor, more than enough for any average home user.

The packaged GPU is a 5xxx-series GPU, a 480-stream-processor variant to be exact, but customised and packaged right on the CPU die. Of course this isn't going to challenge a 5870; this is AMD's low-end platform, and as such it stands as a huge upgrade in a market where Intel integrated graphics flat out won't even run modern games at all, never mind at decent settings.

The point is that AMD's lowest-end desktop and notebook chips will be shipping with enough grunt to possibly run Crysis at high settings/720p/30fps, or somewhere close to that, and that's a pretty big deal in my book when most current low-end solutions would be lucky to pass single digits at the game's lowest settings.

This isn't something that you or I will ever buy, but it is something that your aunty or cousin might, and whereas in the past their machine would probably have been locked out of any game released in the past 3 years, if they buy a Fusion chip basically any PC game will run at playable standards. If every rig on the market is packing almost half a teraflop of computing power, it stands to reason that exploiting GPGPU becomes a whole lot more viable.

The point is that the low end is set for a huge upgrade when it comes to graphics, and when the low end rises by a significant amount, we all benefit. That a single chip with no dedicated GPU in sight should run any multiplatform title better than the consoles do is pretty good news for PC gaming as far as I'm concerned; it really lowers the barriers to entry.
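For what it's worth, here's the back-of-the-envelope maths behind that "almost half a teraflop" figure (my numbers: 480 SPs, one multiply-add per SP per clock, and an assumed ~500MHz clock, since actual clocks haven't been announced):

[CODE]
// Rough peak-FLOPS estimate for the rumoured Llano GPU block.
// Assumptions: 480 stream processors, 1 MADD = 2 FLOPs per SP per clock,
// ~500MHz core clock (unannounced, so purely a guess).
#include <cstdio>

int main() {
    const double stream_processors = 480.0;
    const double flops_per_sp_per_clock = 2.0;  // one multiply-add
    const double clock_hz = 500e6;              // assumed
    double gflops = stream_processors * flops_per_sp_per_clock * clock_hz / 1e9;
    std::printf("~%.0f GFLOPS peak\n", gflops);  // ~480 GFLOPS, near half a teraflop
    return 0;
}
[/CODE]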


Acosta said:
Good times indeed; having such a jump at the lowest end of the PC will definitely push PC gaming. I'm starting to think that in 2-3 years the PC gaming landscape will have changed dramatically (for the better).

I feel good about my decision to wait a few months for the upgrade I wanted to do before summer. A 5850 and i5 750 look like a killer combination and I'm on the hunt for one.

BTW brain_stew, I guess there is nothing like nHancer for ATI cards, right? I'm going to play on a 720p television and want the best IQ possible (that's why I want the extra push of a 5800 instead of going for the 5770).

There's not, but ATI's drivers themselves offer rotated-grid supersampling on their new GPUs now. From what I gather they don't automatically alter the LOD bias, so you'll have to change that yourself, but it's done easily enough. PCGamesHardware.com have a bunch of screenshots up showing it off and it looks stunning; pretty sure they explain how to set the LOD bias as well.
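For reference, the usual rule of thumb (my addition, not an official ATI formula) is to pull the LOD bias back by -0.5 × log2(number of samples), so -1.0 for 4x SSAA and -1.5 for 8x:

[CODE]
// Rule-of-thumb LOD bias for supersampling: bias = -0.5 * log2(samples).
// This is the commonly quoted guideline, not anything from ATI's docs.
#include <cmath>
#include <cstdio>

float ssaa_lod_bias(int samples) {
    return -0.5f * std::log2(static_cast<float>(samples));
}

int main() {
    const int modes[] = {2, 4, 8};
    for (int s : modes)
        std::printf("%dx SSAA -> LOD bias %.1f\n", s, ssaa_lod_bias(s));
    // Prints -0.5, -1.0 and -1.5 respectively.
    return 0;
}
[/CODE]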
 

tenritsu

Banned
Minsc said:
18 months is a pretty long time though, did you mean Phenom II? Not sure what an Athlon II is. 18 months puts us around Q2 2011 (without delays), hopefully it's still up to the task of being a ground breaking step forward for gaming. Larrabee has a new challenger though, which is good, because I'm not sure which one will make it out first.

What kind of performance is expected with it? I mean Crysis will run on mid-range laptops if you set the graphics to low. Is it going to top the 5800 series cards?

I've got an ATI Mobility Radeon HD 4530 (LOOOOWW-end discrete) which has 80 stream processors and runs Crysis on medium at about 20-24 fps. I would expect this to do a LOT better. Maybe hit some high settings. At the very least you could expect it to do a lot better than the integrated HD 4200 or 3200. Based on pure speculation and no real research, of course.

This would be a gimongous leap for integrated graphics for sure. GeForce 6150, eat your fucking heart out! :lol
 

Acosta

Member
brain_stew said:
There's not, but ATI's drivers themselves offer rotated-grid supersampling on their new GPUs now. From what I gather they don't automatically alter the LOD bias, so you'll have to change that yourself, but it's done easily enough. PCGamesHardware.com have a bunch of screenshots up showing it off and it looks stunning; pretty sure they explain how to set the LOD bias as well.

Thanks man, will try it. I think I have found the page:

http://www.pcgameshardware.com/aid,...irst-DirectX-11-graphics-card/Reviews/?page=6

In case anyone else wants to check it out.
 
As for Bulldozer, it's a 4-core/8-thread and 8-core/16-thread design, but performance should scale with extra threads much better than standard Hyper-Threading, as the integer math parts of the chip are literally doubled; for many applications it will effectively be a 16-core design.

Each core has two integer "clusters" (essentially mini-cores) with 4 pipelines each and their own L1 cache; two of these clusters share a single FPU cluster and an L2 cache to make up a single core (the L2 cache may be shared between more than one core, but that's unclear from the diagram), and all 8 of these cores will then share an 8-12MB L3 cache. Multithreaded performance should be excellent, and it needs to be, as AMD were definitely lagging behind in this area.

There are still concerns about single-threaded performance and floating-point performance, but with single-threaded performance mattering less and less (and essentially hitting a dead end anyway) and much more floating-point work expected to be offloaded to the GPU, it makes sense in terms of AMD's overall strategy, especially considering Bulldozer itself will have an integrated GPU core in future revisions. Why waste resources on the CPU's FPU when you've got a great big multithreaded floating-point monster sitting on that die anyway?

A lot of its success definitely relies on the uptake of GPGPU being steep, and that feeds into the Llano "APU" dedicating so much die space to its graphics component, I guess. AMD's rationale seems to be that since even the absolute low end is going to have an OpenCL- and DirectCompute-compliant multi-hundred-gigaflop monster, developers won't be able to afford to ignore all that massive compute power for much longer.
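To make the "developers can finally count on it" point concrete, here's a rough sketch (mine, using the standard OpenCL 1.0 C API) of how an application would detect any OpenCL-capable GPU -- integrated or discrete -- and see how many compute units it exposes before deciding to offload work to it:

[CODE]
// Minimal sketch: enumerate OpenCL GPU devices and report their compute units.
// Assumes the OpenCL 1.0 headers and an ICD/runtime are installed.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    if (clGetPlatformIDs(8, platforms, &num_platforms) != CL_SUCCESS)
        return 1;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        // Ask each platform for GPU devices only.
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char name[256] = {0};
            cl_uint compute_units = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof(compute_units), &compute_units, NULL);
            // On a Llano-style APU this would be the on-die Radeon's SIMD engines.
            std::printf("GPU: %s (%u compute units)\n", name, compute_units);
        }
    }
    return 0;
}
[/CODE]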

It's really quite interesting how Intel, AMD and Nvidia are all approaching this from a different direction. Fermi looks to be a huge step forward for dedicated GPGPU hardware, and yet it could very well be AMD's Fusion chips that really cause the revolution to happen, as AMD are going to be providing developers with a userbase several dozen million strong in no time at all.



t7hers.jpg
 
tenritsu said:
I've got an ATI Mobility Radeon HD 4530 (LOOOOWW-end discrete) which has 80 stream processors and runs Crysis on medium at about 20-24 fps. I would expect this to do a LOT better. Maybe hit some high settings. At the very least you could expect it to do a lot better than the integrated HD 4200 or 3200. Based on pure speculation and no real research, of course.

This would be a gimongous leap for integrated graphics for sure. GeForce 6150, eat your fucking heart out! :lol

That's one aspect of this as well: AMD just wiped out a huge chunk of the low-end discrete market, and Nvidia can't be happy with that, especially since AMD are putting the hurt on at the high end as well. These chips are going to be refreshed every year with, at the very least, a newer and faster integrated GPU, so performance is going to continue to be at a very decent level as time goes by, something which was never the case with old integrated graphics solutions.
 

kinggroin

Banned
I hope to GOD this means good things for the PC gaming industry in general. More folks being able to play the latest and greatest with their $350 PC? :D
 
D

Deleted member 17706

Unconfirmed Member
That would be pretty amazing if they could get some relatively powerful integrated chips on the market. The main thing holding PC gaming back is that so many people have dogshit hardware and don't have a clue about how to install a video card.
 
Zefah said:
That would be pretty amazing if they could get some relatively powerful integrated chips on the market. The main thing holding PC gaming back is that so many people have dogshit hardware and don't have a clue about how to install a video card.

Well, we already know it's set to pack 480 stream processors. Sure, bandwidth will always be an issue with an integrated solution, but either way that's going to be no slouch.

I'd keep a keen eye on the RV8730 (likely to be called 5670/5650) set to hit in the new year, as this will probably be the chip it's based on. Performance won't be up to the same standards because of the shared bandwidth, but it'll be nice to get a ballpark figure.

You can also start to think of this as an extension of the Cell processor design: it's essentially a heterogeneous design with 4 general-purpose x86 cores (think PPE) and 6 SIMD engines (each packing 80 "SP"s) that you can think of as the SPEs. This design leans more towards having better general-purpose performance, but you've still got a CPU design that dedicates more die space to those SIMD cores than it does to the traditional CPU part. If GPGPU really takes off like AMD hopes it will, that's essentially how this CPU design will work in a lot of tasks.
 
Has AMD said how they are gonna address memory and memory bandwidth with these hybrid chips? I guess DDR3 running at 2000MHz wouldn't be too too much of a bottleneck, but it's not great for something that's "high end" when it comes to desktop memory.
 

Durante

Member
I really don't see these integrated solutions reaching mid-range performance any time soon -- regardless of whether they are integrated on the CPU or the northbridge. The bandwidth and power requirements are simply too high. The former could maybe be ameliorated by eDRAM or additional memory channels on the chip, but that would defeat the low-cost goal and still not help with power consumption.

Using chips like these for GPGPU is a lot more interesting (and still keeping a discrete GPU around for graphics). Sadly most/all devs won't be able to target such a configuration.
 

Minsc

Gold Member
brain_stew said:
An Athlon II is a Phenom II without the L3 cache; this variant will ship with a boosted 1MB of L2 cache per core. It's a pretty capable quad-core processor, more than enough for any average home user.

The packaged GPU is a 5xxx-series GPU, a 480-stream-processor variant to be exact, but customised and packaged right on the CPU die. Of course this isn't going to challenge a 5870; this is AMD's low-end platform, and as such it stands as a huge upgrade in a market where Intel integrated graphics flat out won't even run modern games at all, never mind at decent settings.

The point is that AMD's lowest-end desktop and notebook chips will be shipping with enough grunt to possibly run Crysis at high settings/720p/30fps, or somewhere close to that, and that's a pretty big deal in my book when most current low-end solutions would be lucky to pass single digits at the game's lowest settings.

This isn't something that you or I will ever buy, but it is something that your aunty or cousin might, and whereas in the past their machine would probably have been locked out of any game released in the past 3 years, if they buy a Fusion chip basically any PC game will run at playable standards. If every rig on the market is packing almost half a teraflop of computing power, it stands to reason that exploiting GPGPU becomes a whole lot more viable.

The point is that the low end is set for a huge upgrade when it comes to graphics, and when the low end rises by a significant amount, we all benefit. That a single chip with no dedicated GPU in sight should run any multiplatform title better than the consoles do is pretty good news for PC gaming as far as I'm concerned; it really lowers the barriers to entry.

Thanks for clearing that all up, makes sense now. It's something I'm just going to ignore for the next ~12 months though, until it's much closer to releasing; looking 18 months ahead at any tech area (eBook readers, OSs, GPUs, HDTVs, etc, etc) always sounds the same - wonderful and amazing.
 
TouchMyBox said:
Has AMD said how they are gonna address memory and memory bandwidth with these hybrid chips? I guess DDR3 running at 2000MHz wouldn't be too too much of a bottleneck, but it's not great for something that's "high end" when it comes to desktop memory.

It's DDR3 (for now); no comment on the bus width. Speculation over at Beyond3D suggests a 192-bit bus is possible, but I've no idea. Faster memory will definitely make a big impact on graphics performance, no doubt, though DDR3 is scaling up in speed quite rapidly (it's already at price parity with DDR2 and should be both cheaper and faster by 2011), and GDDR5 will be a viable alternative before long.
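Quick maths on why the bus width matters (my figures; the 192-bit option is just the Beyond3D speculation above): peak bandwidth is simply the effective transfer rate times the bus width in bytes, and for contrast a discrete 5750's 128-bit GDDR5 is quoted at roughly 73.6 GB/s.

[CODE]
// Peak memory bandwidth = effective transfer rate (MT/s) * bus width (bytes).
// Illustrative numbers only; a Radeon HD 5750's GDDR5 is ~73.6 GB/s for contrast.
#include <cstdio>

double bandwidth_gbs(double mega_transfers_per_s, int bus_bits) {
    return mega_transfers_per_s * (bus_bits / 8.0) / 1000.0;  // GB/s
}

int main() {
    std::printf("DDR3-1600, 128-bit: %.1f GB/s\n", bandwidth_gbs(1600, 128));  // 25.6
    std::printf("DDR3-2000, 128-bit: %.1f GB/s\n", bandwidth_gbs(2000, 128));  // 32.0
    std::printf("DDR3-1600, 192-bit: %.1f GB/s\n", bandwidth_gbs(1600, 192));  // 38.4
    return 0;
}
[/CODE]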
 
Durante said:
I really don't see these integrated solutions reaching mid-range performance any time soon -- regardless of whether they are integrated on the CPU or the northbridge. The bandwidth and power requirements are simply too high. The former could maybe be ameliorated by eDRAM or additional memory channels on the chip, but that would defeat the low-cost goal and still not help with power consumption.

Using chips like these for GPGPU is a lot more interesting (and still keeping a discrete GPU around for graphics). Sadly most/all devs won't be able to target such a configuration.

I don't think it's about reaching mid-range performance, really. It's about improving the low end, and providing an order-of-magnitude larger userbase for GPGPU apps (which shouldn't be as adversely affected by the lack of bandwidth), so that AMD's advantage in graphics is able to put the hurt on Intel in their core market. It also gives AMD a pretty nice USP as well: their CPUs play WoW at full detail, while Intel's require a separate card to get it running in any sort of playable fashion. That's quite marketable in my book.

These aren't targeted at gamers like you or me, but they should allow a lot more people, who were previously completely shut out, to play modern games, and they should make GPGPU application development much more viable -- two things which in the long term we both benefit from.

AMD are banking on GPGPU uptake because Bulldozer isn't going to be able to match up to Sandy Bridge in floating-point performance. That's why they're packing such a nice GPU on die, and why they're claiming they'll be packing teraflop-class hardware on die within a couple of years of launch. It's because they need the market that'll seriously benefit from GPGPU optimisations to be as large as possible, and volume is done in the low end, so it's the low end they need to target to make GPGPU support ubiquitous.
 

Durante

Member
Yeah, I'm just not sure yet if this really is that great a thing for PC gamers. Most "gamers'" PC games simply don't bother supporting integrated graphics these days. If those become even borderline viable with Fusion-like approaches, it will mean that the gap between the lowest- and highest-end configurations supported will actually widen further.
 