
Visual Downgrade In Next-Gen Tech Demos Going From PC To Consoles?

Oh, you again. At least this time you didn't actually say anything, which saved you from making a fool of yourself for the 4th time.

We're talking about the same hardware. Same hardware means same hardware. Not a similar card running with twice or 4 times the RAM.


I like how you're all trying to pass yourselves off as more knowledgeable than someone who has not only been in the industry, in this exact department, for over 25 years, but actually pioneered many aspects of it. And mainly on the PC front, of all places.

[Image: football players moving the goalposts]
I thought the argument was that the consoles are somehow twice as efficient.
People post historical evidence of the opposite but it doesn't count because fuck you, apparently.
 
I like how many here are trying to keep a discussion going by quoting some twitter posts of a "knowledgeable person" while not knowing anything about the subject themselves, especially when those twitter posts may have been taken completely out of context.
 
I thought the argument was that the consoles are somehow twice as efficient.
People post historical evidence of the opposite but it doesn't count because fuck you, apparently.
It's like you were born to not read posts and then jump on one to post nonsense. You deserve my tag. That's what, the 5th time?

I didn't see any historical (whatever that means) evidence being posted, and neither did I move any goal posts. The goal posts being moved came from people bringing up cards that came out 7 years later and/or with higher specs.
 
Not necessarily. Sony uses a heavily modified AMD GPU that differs from HD7000 desktop cards. Cerny's plan is to use the parts of the GPU that are underutilized during a single frame for GPGPU, without a penalty to render performance.

To achieve that goal, AMD equipped the PS4 GPU with 8 Asynchronous Compute Engines and 64 parallel Compute Queues (the HD7970 has 2 ACEs and 2 CQs). Programmers can fill the Compute Queues with compute kernels, which will wait for dependencies to trigger their execution. The PS4 can have 64 kernels waiting at the same time. When a kernel is triggered, the ACE will create the wavefronts for the CUs and the compute job will get done. Since the PS4's GPU has eight ACEs, it can work on eight compute tasks at the same time. An HD7970 can only work on two compute tasks at the same time. GPUs that don't have these ACEs at all have to use the Compute Pipeline in the Command Processor to do GPGPU. This creates a huge hit to rendering, since the Command Processor can only do either rendering or compute; a GPU with ACEs can do both at the same time.

So the plan is not to dedicate a fixed amount of GPU resources to GPGPU (like the 14+4 CU bullshit, for example), but to fill under-utilized parts of the GPU with compute kernels.
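
To illustrate the idea, here's a toy Python model of the queue/ACE scheduling described above (my own sketch of the concept, not actual PS4/GNM code; only the 8 ACE / 64 queue counts come from the post):

Code:
from dataclasses import dataclass, field

@dataclass
class ComputeKernel:
    name: str
    depends_on: set = field(default_factory=set)  # events that must fire first

class ToyGPU:
    """64 queues park waiting kernels; 8 ACEs dispatch any kernel whose
    dependencies have been signalled, alongside (not instead of) rendering."""
    def __init__(self, n_queues=64, n_aces=8):
        self.waiting = []
        self.n_queues = n_queues
        self.n_aces = n_aces
        self.signalled = set()

    def submit(self, kernel):
        if len(self.waiting) >= self.n_queues:
            raise RuntimeError("all compute queues are full")
        self.waiting.append(kernel)

    def signal(self, event):
        self.signalled.add(event)

    def tick(self):
        # One scheduling step: each ACE launches one ready kernel,
        # soaking up CU time the render pipeline isn't using.
        ready = [k for k in self.waiting if k.depends_on <= self.signalled]
        launched = ready[: self.n_aces]
        for k in launched:
            self.waiting.remove(k)
        return [k.name for k in launched]

gpu = ToyGPU()
gpu.submit(ComputeKernel("post_aa", {"gbuffer_done"}))
gpu.submit(ComputeKernel("physics"))
print(gpu.tick())   # ['physics'] -- no dependencies, runs immediately
gpu.signal("gbuffer_done")
print(gpu.tick())   # ['post_aa'] -- dependency satisfied, an ACE picks it up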
Any numbers for ACEs and CQs for Xbox One?
 
I thought the argument was that the consoles are somehow twice as efficient.
People post historical evidence of the opposite but it doesn't count because fuck you, apparently.

Well, Capcom also said that you can get better performance due to the optimizations of the console architecture.

http://www.neogaf.com/forum/showthread.php?t=639093

Capcom said:
The peak performance of the PS4 is theoretically lower than that of a high end PC, but due to the ease of development and the streamlined architecture there are areas in which it can be superior. The same can be said about the Xbox One, which has a similar architecture and potential.

The engine can use Tessellation and, in certain areas, Approximating Catmull-Clark Subdivision Surfaces, which is unfortunately too heavy to be used systematically for every element. It also uses Dynamic Level of Detail (DLOD) to avoid pop-in. Thanks to the PS4's high memory capacity it's possible to achieve both stable performance and the avoidance of pop-in with LOD.
 
That's pretty strange, I've never considered $100-150 as mid-range. I always thought that mid-range was $200-300.
What range something is in isn't defined by price.

As a company, if you have a range of products of differing quality or performance levels, then it'll be based upon those levels.
 
Oh, you again. At least this time you didn't actually say anything, which saved you from making a fool of yourself for the 4th time.

We're talking about the same hardware. Same hardware means same hardware. Not a similar card running with twice or 4 times the RAM.


I like how you're all trying to pass yourselves off as more knowledgeable than someone who has not only been in the industry, in this exact department, for over 25 years, but actually pioneered many aspects of it. And mainly on the PC front, of all places.

You can't get the same hardware on PC, it's impossible.

Also, that video proves that the game doesn't run at less than 15fps, and that's on a worse GPU, which means the 2x claim is false. You can't spin it around anymore, sorry.

And finally, quotes from a 4A engineer:
Oles Shishkovstov: No, you just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware.

Intel HD 4000 is about on par with current-gen consoles, so what's wrong with it being playable? Yes, memory bandwidth is the real issue, so don't expect it to run 1920x1200 at 30FPS, but something like 720p is playable.
So, he says that consoles can do 2x more than a similar PC, yet their late-gen, high-end game runs on the HD 4000, which they claim is in the same performance ballpark as the consoles, about as well as it does on current gen consoles.
It's the best example of a theory-vs-practice situation we have.

===
What range something is in isn't defined by price.

As a company, if you have a range of products of differing quality or performance levels, then it'll be based upon those levels.

I don't disagree with their nomenclature, it's just that in my mind mid-range was never in the $100-150 range.
 
wow! looking at some posts here, where... the ps4 gpu somehow has something more than a 7970 or a titan...

I think people are setting themselves up for some future disappointment here.

accept what the coming consoles are, and be happy about that. my 2c's

Yup, this thread has been a joke with people pushing that these consoles are more capable than they really are.

It's been said for a while now by the more informed people here: just because these next gen consoles don't come close to gaming PCs, that doesn't mean it's not a good jump from last gen. It seems like some people take this as a negative against their platform of choice for some reason.

Nothing compares to Naughty Dog in my eyes. The Last of Us was just out of this world. Dat lighting...

I love TLoU but it's not out of this world. PC games (such as Crysis 3 and Metro Last Light) outclass TLoU in tech.

Mind linking to one of these "debunks"?

I'd love to see how Gears runs on a 512MB, 7800gt pc.


But you'd know best, unlike Carmack, ofc, even if we ignore the most obvious anti-console agenda I've ever seen on GAF.

Carmack was speaking from his experience working with DX9, since IIRC RAGE wasn't a DX11 title. A lot has changed since the DX9 days.

With the next gen consoles now using x86 and DX11.1 hardware, PC versions turn out even better on average.
 
Actually he did provide an example: Intel's HD4000. It's a graphics chip with roughly the same amount of power as a 360 GPU, so by the 2x logic it should run games half as well as a 360. In real-world conditions it actually outperforms the 360. This cannot be disputed, it's a fact. There have been multiple benchmarks.

Carmack is a programming legend. A man of his abilities would certainly be able to squeeze every ounce of performance out of a given configuration if he was given enough time and resources, and therefore he might achieve that 2x figure. Unfortunately, the overwhelming majority of developers a) do not have the skills of John Carmack and b) have limited time and resources. This is why quoting Carmack is pointless.

You do have a point about the 512 MB of RAM, but that was true at a time when Windows would occupy more than 60-70% of that capacity. Things are way, way different now.

Oh please just stop it. It is a completely different chip, from a different company even, and with a much more modern feature set - it even supports Shader Model 5/DirectX 11. Not comparable at all. And a Radeon HD 7870 will look bad compared to the PS4, especially after a few years. And console APIs do have advantages over DirectX - they may not always give you "direct access to the metal", but they are definitely closer to the metal than DirectX.
 
You can't get the same hardware on PC, it's impossible.

Also, that video proves that the game doesn't run at less than 15fps, and that's on a worse GPU, which means the 2x claim is false. You can't spin it around anymore, sorry.

That's right, you can't, but you can do a hell of a lot better than what you tried to pass off.

And I want to know what CPU and, most importantly, what RAM was running that video. Although I'm pretty sure you already know the answer to that, which is why you want to make it about the GPU only.
 
Uhm, you guys do know Carmack was referring to specific rendering functions, right?

In terms of, say, draw calls, hell yes, you can do twice as many on equivalent console hardware as you can on PC (thanks to CPU and bus latency).

But it's not like a rendering engine is 100% draw calls. In a real world scenario and after optimization on the PC things even out more or less. Overall, the difference in actual performance in actual games is more like 10-30% on roughly equivalent hardware (obviously we can't compare exactly equal hardware since PC architecture differs).
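
To put toy numbers on that (illustrative figures of my own, not measurements; the model assumes draw-call submission and GPU work add up serially):

Code:
# Illustrative frame budget in milliseconds (assumed numbers).
gpu_work = 20.0        # shading, fill, post -- same on both platforms
pc_submit = 6.0        # CPU/driver cost of issuing the frame's draw calls on PC
console_submit = pc_submit / 2   # "twice the draw calls" => half the submission cost

pc_frame = gpu_work + pc_submit
console_frame = gpu_work + console_submit
print(f"PC: {pc_frame} ms, console: {console_frame} ms")
print(f"overall console advantage: {pc_frame / console_frame - 1:.0%}")
# -> ~13%: a 2x win on one stage shrinks to a much smaller whole-frame
#    difference, consistent with the 10-30% range above.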

I'm not sure what console gamers are trying to prove?
 
Gemüsepizza;75545783 said:
Oh please just stop it. It is a completely different chip, from a different company even, and with a much more modern feature set - it even supports Shader Model 5/DirectX 11. Not comparable at all. And a Radeon HD 7870 will look bad compared to the PS4, especially after a few years. And console APIs do have advantages over DirectX - they may not always give you "direct access to the metal", but they are definitely closer to the metal than DirectX.

Why will the PS4's Pitcairn look better in a few years than it does now? We're not talking about better SPU utilization here. You realize "optimization", as used in the console sense, really just means "smoke and mirrors".
 
Why will the PS4's Pitcairn look better in a few years than it does now? We're not talking about better SPU utilization here. You realize "optimization", as used in the console sense, really just means "smoke and mirrors".

well yes. console optimisation was always a combination of getting to know new/seldom-used/unknown hardware, combined with evolving engines.
this has changed now, because the new consoles use well-known tech and are extremely similar to standard computers.
 
That's right, you can't, but you can do a hell of a lot better than what you tried to pass off.

And I want to know what CPU and, most importantly, what RAM was running that video. Although I'm pretty sure you already know the answer to that, which is why you want to make it about the GPU only.

You made it about only the GPU in the first place.
Btw, the CPU doesn't matter in this case, there won't be a difference between a quad core and a dual core at this resolution, these settings and this framerate.
RAM also doesn't really matter when you are comparing raw performance.

--
Maybe on 360, but even then this seems pretty far-fetched. I'm gonna disagree.


Also, "more cycles" seems about as disengenious as it gets. Do you mean get more out of a cycle or?
Metro: LL, both Crysis games and BF4 utilize more tech in their games than any 1st party title [with the exception of GT 5/6, which are sitting on the same branch as the ones mentioned earlier]. Why? Because they have better algorithms for tech features. You can even make a summary of what every one of those games is doing and compare, it's not really hard.
 
Why will the PS4's Pitcairn look better in a few years than it does now? We're not talking about better SPU utilization here. You realize "optimization", as used in the console sense, really just means "smoke and mirrors".

It's a custom design with a new version of libGCM. Of course you will see better results in a few years than now, when devs have more experience.

Uhm, you guys do know Carmack was referring to specific rendering functions, right?

In terms of, say, draw calls, hell yes, you can do twice as many on equivalent console hardware as you can on PC (thanks to CPU and bus latency).

But it's not like a rendering engine is 100% draw calls. In a real world scenario and after optimization on the PC things even out more or less. Overall, the difference in actual performance in actual games is more like 10-30% on roughly equivalent hardware (obviously we can't compare exactly equal hardware since PC architecture differs).

I'm not sure what console gamers are trying to prove?

It's not the console gamers who think they have to prove something. It's a few PC gamers here who get overly defensive over their expensive machines and constantly want to belittle the hardware of nextgen consoles. Which I think is pretty good for 399€.
 
Gemüsepizza;75548423 said:
It's not the console gamers who think they have to prove something. It's a few PC gamers here who get overly defensive over their expensive machines and constantly want to belittle the hardware of nextgen consoles. Which I think is pretty good for 399€.

Is it belittling the console hardware, or splashing some reality into the conversation?

I've seen console gamers express the ridiculous notion that, through the magic of optimization (none of them actually understands what that means), the consoles will perform like i7/GTX 680 machines.

That's not the case. That will never be the case.

I do agree that for $400 what you get in the PS4 in terms of hardware is a great value. That doesn't invalidate the fact that a mid range gaming PC can do better. That's something you couldn't say a few months before the last gen of consoles came out. Seems significant to me.

As primarily a PC gamer, I would have preferred the consoles had a bit better GPU, and a MUCH more powerful CPU (right now we're looking at around 20% of the per-core performance of a high end CPU).

Like it or not, with the exception of a few PC exclusives, and an additional handful of multi-plats with devs willing to put in the effort, the baseline for graphics for the next 6+ years will be the consoles.

If that baseline is higher, that means it's higher for the PC as well, which is great since PC hardware improves over time. If that baseline is lower, well PC gamers can brag about 4K displays and multi-monitor setups, but only a handful of games will take full advantage of all that hardware.

I prefer the former situation even if it gets all the console gamers going back to the same old, same old of proclaiming PC gaming dead (only to be proved utterly wrong later on).
 
Gemüsepizza;75545783 said:
Oh please just stop it. It is a completely different chip, from a different company even, and with a much more modern feature set - it even supports Shader Model 5/DirectX 11. Not comparable at all.

Very well then, let's take an ancient GPU like the ATI Radeon X1950 Pro. It is rated for 248 Gflops, just 8 more than the Xbox 360 GPU. The X1950 Pro came out in October 2006, one year later than the Xbox 360. Surely an old card like that wouldn't be able to tackle modern games, right? Surely it would be at a disadvantage compared to the Xbox 360 GPU and its metallic optimizations. Let's watch:

Crysis 2

Far Cry 2

The Last Remnant

Modern Warfare 2

There are many more videos like those. So, what does that tell us?
 
Gemüsepizza;75548423 said:
It's a custom design with a new version of libGCM. Of course you will see better results in a few years than now, when devs have more experience.

I'm sure it's not as custom as Sony would like you to believe. The better results will come mostly from improved pipelines and art, not some magical secret sauce they discovered while optimizing for the GPU. These improvements could very well be carried over to the PC (or even Xbone) versions of these games.

Yes games will continue to look better, but that doesn't mean the hardware is more capable than what we see in today's PCs.

Gemüsepizza;75548423 said:
It's not the console gamers who think they have to prove something. It's a few PC gamers here who get overly defensive over their expensive machines and constantly want to belittle the hardware of nextgen consoles. Which I think is pretty good for 399€.

Actually I think it's a bunch of PS4 fans (how can you even be a fan of a system not released yet?) who are insistent that the system is more capable than it really is.

I'm excited for the system, plan to pick it up at launch, but I've never seen anything really wrong with what the PC gamers here have been saying about the next gen consoles. And I'm usually one of the first to point out bullshit if I see it.
 
What do people think of the idea that, because games were developed around the specs of the current consoles, it's actually helped boost PC gaming's popularity, because games looked and ran so well on PCs?
 
You made it about only the GPU in the first place.
Btw, the CPU doesn't matter in this case, there won't be a difference between a quad core and a dual core at this resolution, these settings and this framerate.
RAM also doesn't really matter when you are comparing raw performance.

--

Metro: LL, both Crysis games and BF4 utilize more tech in their games than any 1st party title [with the exception of GT 5/6, which are sitting on the same branch as the ones mentioned earlier]. Why? Because they have better algorithms for tech features. You can even make a summary of what every one of those games is doing and compare, it's not really hard.



Wait what? If you mean in the sense that maybe the likes of Crytek and 4A were able to tone down resolutions and jam-pack a laundry list of bullet point features onto aged hardware, then yeah. I personally didn't think any of those console ports were impressive.
 
What do people think of the idea that, because games were developed around the specs of the current consoles, it's actually helped boost PC gaming's popularity, because games looked and ran so well on PCs?

I think it's a solid theory. Consoles kept the PC barrier of entry so low that even a $60 graphics card could provide better-than-console performance. It may actually happen again. Since the new consoles are relatively low-powered, lots of gamers might find it more economical to keep their current PC, or upgrade just the graphics card, and get great performance instead of buying a $400 console.
 
Console defense force out in full force here. May need to call in reinforcements seeing as they don't have a leg to stand on.

There is, and always will be, a visual downgrade from PC to next-gen consoles. There's nothing wrong with that, it's a fact of life.
 
Wait what? If you mean in the sense that maybe the likes of Crytek and 4A were able to tone down resolutions and jam-pack a laundry list of bullet point features onto aged hardware, then yeah. I personally didn't think any of those console ports were impressive.

It doesn't matter, when the last post of yours that he was responding to was wrong to begin with.
 
Very well then, let's take an ancient GPU like the ATI Radeon X1950 Pro. It is rated for 248 Gflops, just 8 more than the Xbox 360 GPU. The X1950 Pro came out in October 2006, one year later than the Xbox 360. Surely an old card like that wouldn't be able to tackle modern games, right? Surely it would be at a disadvantage compared to the Xbox 360 GPU and its metallic optimizations. Let's watch:

Crysis 2

Far Cry 2

The Last Remnant

Modern Warfare 2

There are many more videos like those. So, what does that tell us?
Interesting. Currently available GPUs should be able to outperform the consoles on day 1, then.
 
Very well then, let's take an ancient GPU like the ATI Radeon X1950 Pro. It is rated for 248 Gflops, just 8 more than the Xbox 360 GPU. The X1950 Pro came out in October 2006, one year later than the Xbox 360. Surely an old card like that wouldn't be able to tackle modern games, right? Surely it would be at a disadvantage compared to the Xbox 360 GPU and its metallic optimizations. Let's watch:

Crysis 2

Far Cry 2

The Last Remnant

Modern Warfare 2

There are many more videos like those. So, what does that tell us?

Just looked at the first vid. That's 2GB RAM. And the CPU looks to be rated at 172GFLOPS. It's almost as if you can just buy a GPU and plug it directly into a monitor for gaming!
 
Very well then, let's take an ancient GPU like the ATI Radeon X1950 Pro. It is rated for 248 Gflops, just 8 more than the Xbox 360 GPU. The X1950 Pro came out in October 2006, one year later than the Xbox 360. Surely an old card like that wouldn't be able to tackle modern games, right? Surely it would be at a disadvantage compared to the Xbox 360 GPU and its metallic optimizations. Let's watch:

Crysis 2

Far Cry 2

The Last Remnant

Modern Warfare 2

There are many more videos like those. So, what does that tell us?
Thanks for gathering this list. It's nice to have some hard evidence when arguing that particular point.

Just looked at the first vid. That's 2GB RAM. And the CPU looks to be rated at 172GFLOPS. It's almost as if you can just buy a GPU and plug it directly into a monitor for gaming!
172 GFLOPS? Where are you getting that from? That's a ~30 GFLOP CPU.
 
So is this the part where you drive by and dip, or are you gonna tell me in what way it was wrong?

Oh hey, starting off by being passive-aggressive, real nice. You're just like that other poster a while back who made claims that I just dip out, classy. I'm sorry if I've explained some things countless times and haven't felt like repeating myself to people who don't care to discuss in the first place, or maybe I haven't seen any replies for hours and didn't think it was warranted to bring up older conversations.

As for your previous post, I'm talking about 1st party games being less optimized on the 360 and how KKRT actually has a point, assuming I understood his point correctly. He said most rendering features came from 3rd party titles, and that is true. Many of the effects that are featured in today's high end console titles are from games released by third party studios. I believe that's what he was talking about.
 
That is plain mathematical certainty.

Of course it is. I don't get why people are really arguing some facts. Console gamers (should) know they don't have the best hardware, but this doesn't mean games aren't fun on consoles. Perhaps some were also deceived by what Cerny said and believe in technical wonders, which would be ridiculous.
Buy a console, accept what you get and be happy with it.
 
Just looked at the first vid. That's 2GB RAM. And the CPU looks to be rated at 172GFLOPS. It's almost as if you can just buy a GPU and plug it directly into a monitor for gaming!

So now system RAM is important? Kinda funny how people are selective with some features. Probably if someone points out that his PC has 32GB RAM, some people will dismiss it because it's system memory and not VRAM.
 
Oh hey, starting off by being passive-aggressive, real nice. You're just like that other poster a while back who made claims that I just dip out, classy. I'm sorry if I've explained some things countless times and haven't felt like repeating myself to people who don't care to discuss in the first place, or maybe I haven't seen any replies for hours and didn't think it was warranted to bring up older conversations.

As for your previous post, I'm talking about 1st party games being less optimized on the 360 and how KKRT actually has a point, assuming I understood his point correctly. He said most rendering features came from 3rd party titles, and that is true. Many of the effects that are featured in today's high end console titles are from games released by third party studios. I believe that's what he was talking about.
He said that third party titles "have the most optimized features". That's not true at all.
 
Of course it is. I don't get why people are really arguing some facts. Console gamers (should) know they don't have the best hardware, but this doesn't mean games aren't fun on consoles. Perhaps some were also deceived by what Cerny said and believe in technical wonders, which would be ridiculous.
Buy a console, accept what you get and be happy with it.

Yeah, these two points are pretty funny to me. You could tell, even before either console was revealed, that some here would have issues with their upcoming console of choice not matching the gaming PCs of even today. The days when consoles had any real performance advantage over PCs are long gone, and people should just accept this.

I also wonder how many people here, especially those with Cerny avatars, knew about the man before the PS4, or are they just worshipping him because of his involvement with Sony and the PS4? I have all the respect for him and consider him one of the industry greats, but this opinion was formed long before he revealed the PS4.

It all just screams typical forum soldier mentality IMO.

Edit:

He said that third party titles "have the most optimized features". That's not true at all.

I think you're reading it wrong. Let's look at the original quote:

You can't compare 1st party studios' games to PC performance unfortunately, but actually the most optimized rendering features in current gen games were from 3rd party titles.

What he's saying is that the most optimized rendering features in current gen games originated from 3rd party titles. Meaning the features found in optimized games (HDR, OBMB, SSAO, etc.) originated from 3rd party titles, not 1st party ones.
 
Just wanna say it was nice reading W!CKED's posts, dude sounds like he actually knows what he's talking about even though I'd need a translator to figure it out. Juniors get too much shit.
 
Of course it is. I don't get why people are really arguing some facts. Console gamers (should) know they haven't the best hardware but this doesn't mean games aren't fun on consoles. Perhaps some are also decepted from what Cerny said and believe in technical wonders which would be ridiculous.
Buy a console, accept what you get and be happy with it.

This is exactly right. I own many consoles, and accept them for what they are every time I boot up. That isn't denigrating them at all, it's just the simple fact of the matter. I learn to accept their limitations and enjoy their games, simple as that.

Just wanna say it was nice reading W!CKED's posts, dude sounds like he actually knows what he's talking about even though I'd need a translator to figure it out. Juniors get too much shit.

Of course you'd say that.

He's just gotten a fact wrong (i.e. the 360 did not have a split memory pool, it was unified - a huge boon at the time of the console's release).
 
I've been lurking in this thread and feel the same way.

Of course you'd say that.

He's just gotten a fact wrong (i.e. the 360 did not have a split memory pool, it was unified - a huge boon at the time of the console's release). That error completely invalidates all of his previous posts that clearly showed a knowledge of graphics and tech above the other posters in the thread. Stop being a smelly fanboy, WeAreStarStuff.
 
The X1950 Pro in these videos has 512MB GDDR3 VRAM. ATi's Xenos only has 256MB GDDR3 VRAM. The X1950 Pro in these videos has 100% more memory bandwidth, 50% more ROPs and almost 75% more pixel fillrate than the Xbox 360's Xenos!

I can't believe that so many users here actually fell for that bullshit. Seriously... this is a really pathetic thread.

relax my friend. sounds like a ps4 vs xbone comparison :D

I bought a 360 and a ps3 day 1, and like 80% of my gaming money was spent on them for many years.
early the next year, I bought an amd-something cpu and a something-8800 gpu. I don't remember how much memory or what the details were, but it doesn't matter anyway.
games would play a little bit better there than on either console. that's a fact!
I still gamed most of my time on consoles though, and had fun, until the gap became large with newer pc tech and the consoles became stale. it's normal.

this time, the starting gap between consoles and pc is a little bigger. that's a choice the console manufacturers made.
instead of imagining some magic trick that will somehow breed new amounts of power after the 3rd year, it pays better for a gamer to be more realistic.
 
Of course you'd say that.

He's just gotten a fact wrong (i.e. the 360 did not have a split memory pool, it was unified - a huge boon at the time of the console's release). That error completely invalidates all of his previous posts that clearly showed a knowledge of graphics and tech above the other posters in the thread. Stop being a smelly fanboy, WeAreStarStuff.


Very funny!
 
I can't tell since Microsoft doesn't talk in detail about Xbox One hardware.



You're joking, right?

The rig in the first video uses an Intel Core 2 Duo at 3GHz and 2GB RAM. The rigs in the second and third videos use a Core i7 at 4GHz and 4GB RAM. The rig in the fourth video uses an AMD dual core at 2.8GHz and 2GB RAM.

The Xbox 360 uses a lousy in-order PowerPC with 256MB of system RAM.

The X1950 Pro in these videos has 512MB GDDR3 VRAM. ATi's Xenos only has 256MB GDDR3 VRAM. The X1950 Pro in these videos has 100% more memory bandwidth, 50% more ROPs and almost 75% more pixel fillrate than the Xbox 360's Xenos!

I can't believe that so many users here actually fell for that bullshit. Seriously... this is a really pathetic thread.

1. The PowerPC chip used in the 360 was no slouch when it came out. 6 threads at 3.2GHz was not too bad at all and is arguably comparable to the first gen Core 2 Duos. The director of Metro: Last Light once said that for specific functions it was able to outperform a Core i7 at 2.4GHz or so.


2. Xenos does not have just 256MB RAM available to it... it could easily use more than 256MB.

3. 22.4 vs. 44 GB/s in memory bandwidth is quite a good amount of difference, I will give you that... but do not forget the advantage Xenos has in using its eDRAM for framebuffer purposes. Xenos punches above its stated bandwidth because of this (see the rough numbers after this list).

4. The 1950 Pro has much less versatile hardware than the 360's Xenos... MemExport and the unified shading tech allow Xenos to use its capability much more fully than the more simply built 1950 Pro.

5. It is pretty hard to find any evidence of people even running lower end hardware than a 1950 Pro. The damn thing is ancient... and no one really would post cheaper derivative versions of it running modern software. I will look for some other shitty card in a video... one that is more "similar" according to your demanding specifications.
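
On the bandwidth point in 3, a back-of-the-envelope Python sketch (the 44 and 22.4 GB/s figures are the commonly cited specs; the 40% framebuffer share is purely an assumption for illustration):

Code:
# How much main-memory bandwidth is left for textures/geometry if the
# framebuffer (ROP) traffic is offloaded to eDRAM, as on the 360.
X1950_BW = 44.0       # GB/s, X1950 Pro GDDR3 on a 256-bit bus
XENOS_MAIN_BW = 22.4  # GB/s, 360's unified GDDR3

def usable_bw(main_bw, fb_fraction, fb_on_edram):
    # fb_fraction: share of a frame's bandwidth that is framebuffer traffic
    return main_bw if fb_on_edram else main_bw * (1.0 - fb_fraction)

fb = 0.40  # assumed, illustrative only
print(f"X1950 Pro, everything over main memory: {usable_bw(X1950_BW, fb, False):.1f} GB/s")
print(f"Xenos, framebuffer on eDRAM:            {usable_bw(XENOS_MAIN_BW, fb, True):.1f} GB/s")
# -> 26.4 vs 22.4 GB/s: the on-paper 2x gap shrinks a lot, which is the
#    "punches above its stated bandwidth" argument.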
 
Just wanna say it was nice reading W!CKED's posts, dude sounds like he actually knows what he's talking about even though I'd need a translator to figure it out. Juniors get too much shit.
From my lurking I see that he's one of the few people who actually argue with Durante. Durante has won all of their debates, of course (as far as I've seen), but an actual discussion usually results, and I read it all and feel smarter.
 
I can't tell since Microsoft doesn't talk in detail about Xbox One hardware.

You're joking, right?

The rig in the first video uses an Intel Core 2 Duo at 3GHz and 2GB RAM. The rigs in the second and third videos use a Core i7 at 4GHz and 4GB RAM. The rig in the fourth video uses an AMD dual core at 2.8GHz and 2GB RAM.

The Xbox 360 uses a lousy in-order PowerPC with 256MB of system RAM.

The X1950 Pro in these videos has 512MB GDDR3 VRAM. ATi's Xenos only has 256MB GDDR3 VRAM. The X1950 Pro in these videos has 100% more memory bandwidth, 50% more ROPs and almost 75% more pixel fillrate than the Xbox 360's Xenos!

I can't believe that so many users here actually fell for that bullshit. Seriously... this is a really pathetic thread.

He's not comparing the Xbox 360 with a PC with the same specs, he's comparing the Xbox 360 with a 2006 PC, which could have an X1950 Pro (a $200 GPU at the time) and 2GB of RAM. How's that not a valid comparison?
 
Console defense force out in full force here. May need to call in reinforcements seeing as they don't have a leg to stand on.

There is, and always will be, a visual downgrade from PC to next-gen consoles. There's nothing wrong with that, it's a fact of life.
There usually wasn't a large downgrade in visuals at next gen launch, other than IQ. I think the form factor is more limiting than cost at the moment. Modern high end GPUs are power hungry and the boxes just can't handle the heat.
 
There usually wasn't a large downgrade in visuals at next gen launch, other than IQ.

Yes, that's why PCs weren't really exploited. They got lazy ports. Of course they could've had better models or more stuff going on, but devs (or rather, their bosses) didn't want to put too much effort into the ports.
We will see a graphics bump because of the next-gen consoles, and gaming PCs will easily run these games too. In fact, they've been longing for it.
 
You're joking, right?

The rig in the first video uses an Intel Core 2 Duo at 3GHz and 2GB RAM. The rigs in the second and third videos use a Core i7 at 4GHz and 4GB RAM. The rig in the fourth video uses an AMD dual core at 2.8GHz and 2GB RAM.

The Xbox 360 uses a lousy in-order PowerPC with 256MB of system RAM.

The X1950 Pro in these videos has 512MB GDDR3 VRAM. ATi's Xenos only has 256MB GDDR3 VRAM. The X1950 Pro in these videos has 100% more memory bandwidth, 50% more ROPs and almost 75% more pixel fillrate than the Xbox 360's Xenos!

I can't believe that so many users here actually fell for that bullshit. Seriously... this is a really pathetic thread.

Isn't that a PC from 2006/2007? In 5 years this is gonna be funny: "The Radeon 7850 in these videos has 2GB GDDR5 VRAM. The PS4's Jaguar only..."

So it's invalid if I want to compare an i7 4930K + 32GB RAM + an Nvidia 6GB Titan or 780 with a PS4?
The never-ending cycle.
 
Yeah, but the differences between the GPUs (100% more RAM, 100% more bandwidth, 50% more ROPs, etc.) don't sound fair to me.

A GeForce Titan has 100% more RAM (based on KZSF), 65% more bandwidth and 50% more ROPs than the PS4's GPU.

The Titan is top of the line and costs $1000 today; the X1950 Pro was a mid range card in 2006 and cost $200 at the time (full retail price). It's not unfair.
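
For what it's worth, those percentages roughly check out against commonly cited specs; note the ~3GB PS4 figure is the poster's inference from KZSF, so treat it as an assumption:

Code:
# Commonly cited specs: GeForce Titan vs PS4 GPU. Illustrative check only.
titan = {"vram_gb": 6.0, "bandwidth_gbs": 288.4, "rops": 48}
ps4   = {"vram_gb": 3.0, "bandwidth_gbs": 176.0, "rops": 32}  # ~3GB per the KZSF claim

for key in titan:
    more = titan[key] / ps4[key] - 1
    print(f"{key}: Titan has {more:.0%} more")
# -> vram_gb: 100% more, bandwidth_gbs: 64% more (~ the 65% quoted),
#    rops: 50% more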
 