IGN Rumor: Xbox 3 GPU ~= AMD 6670, Wii U ~5x current gen, Xbox 3 ~6x, Dev Kits August

The one on the left is easily the best; it has clarity and smoothness. Yet if I had to choose between the middle and the right, I'd take it even with the blurring. You can fix the blurring with a better AA method and LOD tweaking; you can't fix a native-res image with no AA.

I pretty much agree with this, although I have no idea about the technical aspects. For me it's just a case of the jagged lines standing out more and being more noticeable to the eye.

You guys do know the 720p high-AA shot is probably already using more resources than 1080p with no AA? Almost twice the resources, so you could use a pretty fucking expensive filter to clean up the 1080p before you even reach the resources needed for the blurry 720p. And something better than SSAA doesn't exist.

Maybe I'm just crazy and confused and abusing simple arithmetic.
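
For anyone who wants to check that arithmetic, here's a rough back-of-the-envelope sketch. It assumes the "high AA" case is 4x supersampling (the actual method and sample count in these shots aren't known) and only counts shaded samples, so treat it as a ballpark rather than a real cost model.

# Pixel counts for the resolution/AA comparison above.
# Assumption: "high AA" = 4x supersampling (2x2 grid); real AA cost varies.
pixels_720p  = 1280 * 720                        # 921,600 pixels
pixels_1080p = 1920 * 1080                       # 2,073,600 pixels

ssaa_factor = 4                                  # assumed supersampling factor
samples_720p_4xssaa = pixels_720p * ssaa_factor  # 3,686,400 shaded samples

print(pixels_1080p / pixels_720p)                # ~2.25x: 1080p vs plain 720p
print(samples_720p_4xssaa / pixels_1080p)        # ~1.78x: "almost twice" 1080p no-AA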
 

LCGeek

formerly sane
You guys do know the 720p high-AA shot is probably already using more resources than 1080p with no AA? Almost twice the resources, so you could use a pretty fucking expensive filter to clean up the 1080p before you even reach the resources needed for the blurry 720p. And something better than SSAA doesn't exist.

Maybe I'm just crazy and confused and abusing simple arithmetic.

That had nothing to do with my point. I'm just saying, resources aside, which one is more pleasant to me. You can always make better AA methods, whether they exist yet or not. I grew up on 3dfx and shitty TNT cards; if devs or manufacturers want to improve it they can, not that such a method exists at the moment.

Also, for what these systems are using, AA isn't going to be that much of a killer for the system, be it 720p or 1080p maxed out. It's not like people running Eyefinity across three screens, going for something a console consumer will never see in the next decade.

The two main reasons the 720p shot looks blurry are that it's upscaled and that it's using AA, both of which can add blur or softness to an image. I would be interested in comparing non-upscaled 720p with AA to native 1080p with no AA.
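
To illustrate the upscaling half of that, here's a minimal sketch of why a non-integer scale like 1280 to 1920 (a 1.5x factor) softens things, assuming plain linear filtering; the actual scaler used by any console or TV is unknown.

def upscale_1d(row, out_len):
    # Linear interpolation of a 1D row of pixel values to out_len samples.
    in_len = len(row)
    out = []
    for i in range(out_len):
        src = i * (in_len - 1) / (out_len - 1)       # map output coord back to source
        lo = int(src)
        hi = min(lo + 1, in_len - 1)
        t = src - lo
        out.append((1 - t) * row[lo] + t * row[hi])  # blend of two neighbours
    return out

# A hard black/white edge at 720p width...
edge = [0.0] * 640 + [1.0] * 640              # 1280 samples
scaled = upscale_1d(edge, 1920)               # 1280 -> 1920 is a 1.5x factor
# ...picks up in-between grey values around the edge, i.e. softness.
print([round(v, 2) for v in scaled[955:965]])

With whole-number scaling you can at least duplicate pixels and keep edges hard; a 1.5x factor has to blend neighbours somewhere, which is part of why upscaled 720p reads as soft next to native 1080p.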
 
Fuck my eyes. I can't tell the difference between any of them.
1080p high AA has fewer sharp pixels than 1080p no AA at the top-left part of the brain. High AA is also more detailed than 720p AA at the bottom, where the ridges under the brain are blurry. My eyes suck pretty bad, but I got better at noticing these details over time, and the same goes for anyone else.
 

Proelite

Member
IGN's spec-leak track record is pretty much impeccable: people who have seen the devkits basically confirmed what IGN said a few months later.

It was a good, insightful post, but it sort of derailed here:
Stop repeating DigitalFoundry on this. The Zelda demo was a quick demo on faulty hardware. Nobody in their right mind would call a 320 SPU part even twice as fast as the Xbox 360 GPU. "DX11-level shaders" are also barely different from what is found in the 360 (there are usually just many more of them). It's pretty well established by several people with access to the devkit that the development hardware in the Wii U was a 640 VLIW5 SPU part instead.
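
For reference, here's the raw-ALU math behind that "not even twice as fast" point, using the usual theoretical-throughput formula (stream processors x 2 ops per clock x clock). Xenos's 240 ALUs at 500 MHz is the only documented figure here; the clocks for the hypothetical 320-SP part are assumptions, and raw GFLOPS never translate linearly into real game performance.

# Theoretical-throughput sketch for the "320 SPU vs Xbox 360" comparison.
# Formula: stream processors * 2 ops/clock (multiply-add) * clock in GHz.
# Clocks for the 320-SP part are assumptions for illustration only.

def gflops(sps, clock_ghz):
    return sps * 2 * clock_ghz

xenos = gflops(240, 0.5)              # Xbox 360 Xenos: ~240 GFLOPS
for clock in (0.5, 0.6, 0.7):         # assumed clocks for a 320-SP part
    part = gflops(320, clock)
    print(f"320 SPs @ {clock} GHz: ~{part:.0f} GFLOPS, {part / xenos:.1f}x Xenos")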

That suggests the Wii U GPU has a fair chance of being more powerful than the Next Xbox GPU, but considering that the chip is going to be part of an SoC, and shrunk down compared to its original process, I think Microsoft is going to clock it at an insane-looking clock speed. An HD6670-like chip is probably attractive to them because of the small die size.

It's a cold, hard inside source from a website that has been right on the specs of several consoles before, and recently too. It's naive to think Microsoft has any sort of obligation to deliver a larger increase in power.

There was nothing we saw on the Wii U that was an actual game. There couldn't have been, because the devkits were only just out, and were faulty and underperforming. Nobody could judge then, nobody can judge now. Later revisions of the Wii U devkit did have a more powerful GPU than the GPU mentioned for the Next Xbox here, but apparently that tells us nothing about the Xbox 360 multiplier we have to use either!

This is true.

What is also true is that a chip like the HD6670 is an incredibly bad starting point; it's a decent amount less powerful than the chip used in the Wii U devkits. That's not to say it's a starting point that doesn't make sense: as pointed out, this is the same component that has been used in AMD Fusion chips, which are an analog of a new Xbox SoC.
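
Same back-of-the-envelope math for that comparison: the desktop HD 6670 (480 SPs at 800 MHz) against the rumored 640-SP devkit part. The devkit clock is unknown, so this is only a sketch of how the comparison hinges on it; memory bandwidth and everything else are ignored.

# HD 6670 vs. the rumored 640-SP Wii U devkit GPU, same rough formula as above.
# Only the desktop HD 6670 spec is known; the devkit clocks are assumptions.

def gflops(sps, clock_ghz):
    return sps * 2 * clock_ghz

print(f"HD 6670 (480 SPs @ 0.8 GHz): ~{gflops(480, 0.8):.0f} GFLOPS")    # ~768
for clock in (0.5, 0.6, 0.7):                                            # assumed
    print(f"Devkit 640 SPs @ {clock} GHz: ~{gflops(640, clock):.0f} GFLOPS")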

I have no doubt that this is going to improve, and maybe dramatically so. But from this starting point, there is no way the next Xbox will be in a league of its own compared to the Wii U.

Suplex.gif
 