
Reality of console visuals surpassing PC visual fidelity

When it comes to modern PCs, this isn't so much about hardware as about software.

Software that actually uses the hardware, both in terms of efficiency and in features.

I'm pretty sure a current high-end PC is as powerful as, or more powerful than, the true next-gen consoles.

But once next-gen console games roll out, they will blow current PC games out of the water.
 

Serra

Member
Pixels and resolution are not the only things that matter; art style is more important. To me, some console games at 720p are visually more appealing than PC games. I have a 2560x1440 monitor running BF3, and it's awesome, but nothing that wows me.

So when you play that same 'console' game with its awesome art style (you assume no PC games can have awesome art), it will look even more awesome on PC.
 

Salsa

Member
Doesn't really matter, since they'll still run at 30fps.

But no, it's a pipe dream. Next-gen console reveals might look more impressive at launch, but that ends with the first port.
 

Row

Banned
Impossible these days; the next consoles need to be priced too low to allow for something equivalent to a high-end card like a 680M. PC tech just moves too fast: by the time Sony and MS release their next consoles, a whole new series of GPUs will be out.

Consoles will always offer better bang for your buck and convenience, though, and I imagine that's more important for most people.
 
Hopefully we can use those models without a huge performance penalty.

You can actually see how little extra performance such models cost if you load up the free SDK. They have some example models with character tessellation (an alien and a dragon). The performance difference is next to nothing.

 

mephixto

Banned
The point is that the Xbox 360 Core model launched for $299 back in 2005. I don't think any PC at that price point, or even at double the price, could match the quality of its games back then, so I expect the same for the new titles released with the new Xbox.

Star Wars 1313 and Watch Dogs (both next-gen games) were demoed on PCs; I don't think we can expect a difference at all.
 

dark10x

Digital Foundry pixel pusher
Wanna bet? When did this ever happen?
Wait, what? This is pretty much going to be the first gen where console hardware won't outclass the PC at launch. It ALMOST happened last time as well, but every other console launch featured software well beyond what you could get on a high-end PC.

When it took more than a year to get back to 60fps? On a high-end PC?
What are you even talking about?

I swear some of you only started paying attention to this stuff in the past few years.
 

SapientWolf

Trucker Sexologist
The consoles probably won't win in an A/B comparison with the best the PC has to offer, but they might be the price/performance kings for a year or two. A $400 box in 2013 that could max out Crysis 3 at 60fps would be impressive, whether it was a console or PC.
 

KKRT00

Member
You can actually see how little extra performance such models cost if you load up the free SDK. They have some example models with character tessellation (an alien and a dragon). The performance difference is next to nothing.

Yep, CE 3.5 has really good tessellation usage, and POM is cheap too. Both were used in the multiplayer alpha and didn't stress the PC that much.

Once next-gen games roll out, most PCs won't be able to keep 30fps either.

It'll take a year or more before you can go back to your comfy 1080p/60fps.

Not going to happen, because there won't be any tech boost like in past generations. Writing to the metal can only give you 20-25% more performance, and that won't be much compared to what higher-tier/newer-tech GPU models offer.

Will average spec requirements rise? Yes, but that's completely normal and expected, because nowadays we have $70-80 GPUs listed as recommended.
 
Once next-gen games roll out, most PCs won't be able to keep 30fps either.

Nope... I played BioShock on my P4 3.0 GHz + 6800 GT, roughly 360 specs (details med-low, around 30 FPS). That's hardware from roughly a year before the 360's release, with a huge difference in architecture. Unless there's a complete change in hardware architecture, the difference isn't going to be that huge...
 
It's next-gen as in DirectX 11 only. Crytek got tired of waiting.

PS3/360 are downports.

I'm pretty sure they've paid way too much attention to the PS3/360 for it to be a true next-gen title.

Like Gears of War was, or Metal Gear Solid 2 was.

A next-gen title has to lead on a next-gen console, co-developed on PC/next-gen console.

The PS3/360 can't really be part of the picture.
 

Salsa

Member
Wait, what? This is pretty much going to be the first gen where console hardware won't outclass the PC at launch. It ALMOST happened last time as well, but every other console launch featured software well beyond what you could get on a high-end PC.


What are you even talking about?

I swear some of you only started paying attention to this stuff in the past few years.

My memory might be fuzzy because it's been almost 7 years now (oh god, this gen), but if this is the reality moving forward, I'll eat my hat.

Since when have you been PC gaming? It happened this gen and it'll happen next.
I remember Tomb Raider: Legend and GRAW 1 running like shit on my mid- to high-end PC.

Since a long time ago, and I don't remember a time when I had to wait more than a year to recreate a console experience on a high-end PC.
 

Reiko

Banned
Nope... I played BioShock on my P4 3.0 GHz + 6800 GT, roughly 360 specs (details med-low, around 30 FPS). That's hardware from roughly a year before the 360's release, with a huge difference in architecture. Unless there's a complete change in hardware architecture, the difference isn't going to be that huge...

Depends how demanding the next-gen PC game is... especially at the highest settings.
 

dark10x

Digital Foundry pixel pusher
Nope... I played BioShock on my P4 3.0 GHz + 6800 GT, roughly 360 specs (details med-low, around 30 FPS). That's hardware from roughly a year before the 360's release, with a huge difference in architecture. Unless there's a complete change in hardware architecture, the difference isn't going to be that huge...
I did that as well, and the performance was fucking awful.

I even used a P4 3.4 GHz CPU along with the 6800 GT. I could barely reach 30 fps, and it certainly wasn't stable. It looked like shit too.
 

mephixto

Banned
I'm pretty sure they've paid way too much attention to the PS3/360 for it to be a true next-gen title.

Like Gears of War was.

Gears of War 1 was released on PC and looked a lot better than the 360 version. This gen was more about launch titles released only on consoles; if you put those titles on a 2005 PC, I bet they would look the same or slightly better.
 

King_Moc

Banned
Once next-gen games roll out, most PCs won't be able to keep 30fps either.

It'll take a year or more before you can go back to your comfy 1080p/60fps.

You think consoles are gonna have the equivalent of a GeForce 680? How big were you thinking these consoles were going to be? If that's their target, they're gonna have a hell of a lot of heat to get rid of, and they'll need a super-fast processor to avoid bottlenecking.

And that's not even the top graphics card you can get.

Consoles used to launch as fast as PCs, yes, but the exotic cooling solutions required for high-end PCs today pretty much end any chance of consoles matching them nowadays.
 

Serra

Member
The consoles probably won't win in an A/B comparison with the best the PC has to offer, but they might be the price/performance kings for a year or two. A $400 box in 2013 that could max out Crysis 3 at 60fps would be impressive, whether it was a console or PC.

Sure, consoles will always win the price/performance comparison, because MS/Sony get those parts so cheaply from manufacturers and still sell the consoles at a loss on the hardware.
 
Yes, b/c all PC games strive for hyper-realism. Not only is art style subjective, but games with great art styles are found on PC as well (with the added benefit of resolution and anti-aliasing enhancing a beautiful art style on PC). Art style is basically the comfy-couch excuse 2.0...

I never implied PC games don't have great art style. :) It's just that when these discussions pop up, the PC argument is: bigger, better, more of everything.
 

King_Moc

Banned
Gears of War 1 was released on PC and looked a lot better than the 360 version. This gen was more about launch titles released only on consoles; if you put those titles on a 2005 PC, I bet they would look the same or slightly better.

To be fair, in terms of detail and effects it was fairly identical. But my laptop (yes, laptop) could run it in full HD at launch.
 
Are Haswell/Maxwell expected to be the kind of jump where, even if Durango/Orbis look fantastic and on a level with today's PC games, that duo will be a major jump ahead? I only started tuning into PC gaming news a few months ago, so I don't know much about them.
 
I did that as well, and the performance was fucking awful.

I even used a P4 3.4 GHz CPU along with the 6800 GT. I could barely reach 30 fps, and it certainly wasn't stable. It looked like shit too.

As you know, prior to the 360, or before 2005:

Most new PC games ran sub-30fps, because they actually pushed the hardware.

It wasn't until the PC community moved on to the 360 and set that as the lowest common denominator that we got the modern pristine image quality/high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.
 

Grief.exe

Member
Every time.

You're a fool.

I'll link back to this when I'm running next-gen console ports on my SLI 560 Tis, cards that will be two generations old by the time the next-generation consoles come out.

Are Haswell/Maxwell expected to be the kind of jump where, even if Durango/Orbis look fantastic and on a level with today's PC games, that duo will be a major jump ahead? I only started tuning into PC gaming news a few months ago, so I don't know much about them.

Intel operates on a 'tick/tock' update cadence where the tick is a die shrink of the existing architecture and the tock is a significant redesign and performance boost.

Sandy Bridge was the tock and Ivy Bridge was the tick for this generation. The next generation of CPUs (Haswell) will be the tock, providing a good performance increase.

It is rumored that they will change this in the next few years, but for now it is holding true.
 

mephixto

Banned
As you know, prior to the 360, or before 2005:

Most new PC games ran sub-30fps, because they actually pushed the hardware.

It wasn't until the PC community moved on to the 360 and set that as the lowest common denominator that we got the modern pristine image quality/high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.

OK, I'm gonna let someone with better English than me destroy this argument.
 

Salsa

Member
It wasn't until the PC community moved on to the 360 and set that as the lowest common denominator that we got the modern pristine image quality/high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.

You can't make this 'newbie PC aficionados' argument about everyone. If you're talking to someone who comes from FPS games, for example, they've been playing above 60fps for a long-ass time now.
 

kinggroin

Banned
As you know, prior to the 360, or before 2005:

Most new PC games ran sub-30fps, because they actually pushed the hardware.

It wasn't until the PC community moved on to the 360 and set that as the lowest common denominator that we got the modern pristine image quality/high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.

Bullshit. The scalability of PC games alone means you can't just throw out generalized statements like that. We're not talking about fixed hardware and settings.
 

Tain

Member
Is anyone else disappointed that it has come to this?

I honestly pine for the days when new console hardware could amaze me. I love that feeling of seeing something truly new and incredible for the first time. I haven't been truly wowed by a game in quite a while and I miss it.

Even better, I wish it were still possible to go to an arcade and see some multi-thousand-dollar machine, running a game made specifically for it, blowing away everything on PCs and consoles.
 

patapuf

Member
As you know, prior to the 360, or before 2005:

Most new PC games ran sub-30fps, because they actually pushed the hardware.

It wasn't until the PC community moved on to the 360 and set that as the lowest common denominator that we got the modern pristine image quality/high framerates that newbie PC aficionados seem to think have been the status quo for 30 years.

We are talking about the aficionados, aren't we? The OP states that price doesn't matter. I doubt a $1500 rig won't be able to play next-gen titles at 1080p/60 FPS.
 
Now, there are occasional games where I use a locked 30 fps instead, but it's not simple. Most people will tell you to lock it in nVidia Inspector or to use Bandicam or some other such tool. Those don't work: they DO deliver 30 fps according to FRAPS, but they also introduce severe microstutter that ruins the consistency. With most games, however, combining the "half refresh" vsync option with an MSI Afterburner OSD limit of 30 produces good, stutter-free results. This is the ONLY combination that has ever worked properly for me. Unfortunately, it isn't compatible with all games. BethSoft games tend to end up with longer loading times or continue to stutter, for instance, and this happens with a few other titles as well. It's not a magic bullet, but it's the closest I've seen. Fortunately, I only need to use this in cases where I want to push visuals all the way out while keeping performance consistent. Crysis 2 in DX11, for instance, is great with this solution, as I can use 1440p plus the highest detail settings with extra AA while holding 30 fps.
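(If you're curious what a frame limiter is actually doing under the hood, here's a minimal sketch of the core idea in C++: pad every frame out to a fixed ~33.3 ms budget. This is not how RTSS/Afterburner or nVidia Inspector are implemented, and the two helper functions are made-up placeholders; it just illustrates the mechanism, and hints at why a cap on its own can microstutter: unless those 33.3 ms slots line up with the display refresh, which is what the half-refresh vsync option takes care of, frames still hit the screen at uneven points in the scanout.)

#include <chrono>
#include <thread>

// Hypothetical stand-ins for a game's per-frame work and presentation.
void update_and_render() { /* simulate, animate, draw the frame */ }
void present_frame()     { /* hand the finished frame to the swap chain */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(33333); // ~33.3 ms -> 30 fps

    auto next_slot = clock::now();
    for (int frame = 0; frame < 300; ++frame) {   // ~10 seconds' worth of frames
        update_and_render();                      // however long this takes...
        next_slot += frame_budget;                // ...the presentation schedule stays fixed
        std::this_thread::sleep_until(next_slot); // pad short frames out to the budget
        present_frame();
    }
}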

SLI is definitely better with the 600-series cards. That hardware frame metering seems to be working out pretty well so far on my 690; downsampling from 1440p and applying copious amounts of AA while still maintaining an average of 60 fps is truly a sight to behold. There is no way I can go back to regular 1080p on consoles, next gen or otherwise; downsampling restores details that would otherwise be lost even on a 1080p panel (reality captured on HD video is basically atomic-resolution reality downsampled to 1080p, which is why it looks so good). Antialiased, jaggy- and shimmer-free images (coupled with shaders and lighting) are what push real-time graphics towards pre-rendered territory. The next few years, as we move towards 4K, are when the cycle begins anew and hardware tries to catch up to the new standard. I am of the opinion that the more res, the better; a huge amount of the input we get from games is visual, so the higher the visual fidelity, the higher the potential immersion factor.
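(As a rough illustration of why downsampling cleans up the image: every output pixel ends up being the average of several rendered samples. Below is a toy sketch in C++ using an exact 2x ratio and a simple box filter; the grayscale Image type and the function are made up for the example, and real driver-side downsampling, especially at a non-integer ratio like 1440p to 1080p, uses better GPU-side filtering.)

#include <cstdint>
#include <vector>

// Toy grayscale image: one byte per pixel (made-up type for the example).
struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;
    uint8_t at(int x, int y) const { return pixels[static_cast<size_t>(y) * width + x]; }
};

// Box-filter a double-resolution render target down to the output resolution:
// each output pixel is the average of a 2x2 block of rendered samples,
// which is what kills jaggies and shimmer.
Image downsample_2x(const Image& hi) {
    Image lo;
    lo.width = hi.width / 2;
    lo.height = hi.height / 2;
    lo.pixels.resize(static_cast<size_t>(lo.width) * lo.height);
    for (int y = 0; y < lo.height; ++y) {
        for (int x = 0; x < lo.width; ++x) {
            int sum = hi.at(2 * x,     2 * y)     + hi.at(2 * x + 1, 2 * y)
                    + hi.at(2 * x,     2 * y + 1) + hi.at(2 * x + 1, 2 * y + 1);
            lo.pixels[static_cast<size_t>(y) * lo.width + x] = static_cast<uint8_t>(sum / 4);
        }
    }
    return lo;
}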

I don't miss the excitement of new hardware from consoles; that's been replaced by the excitement of constant new hardware in the PC space. It's just a lot more fun to spec out and build your own machine.
 

mephixto

Banned
No, it won't. There is not a single reason for it. There won't be anything revolutionary or really high-end like Cell or Xenon + Xenos with eDRAM in them.

Yep, both are basically PCs.

SLI is definitely better with the 600-series cards. That hardware frame metering seems to be working out pretty well so far on my 690; downsampling from 1440p and applying copious amounts of AA while still maintaining an average of 60 fps is truly a sight to behold. There is no way I can go back to regular 1080p on consoles, next gen or otherwise; downsampling restores details that would otherwise be lost even on a 1080p panel (reality captured on HD video is basically atomic-resolution reality downsampled to 1080p, which is why it looks so good). Antialiased, jaggy- and shimmer-free images (coupled with shaders and lighting) are what push real-time graphics towards pre-rendered territory. The next few years, as we move towards 4K, are when the cycle begins anew and hardware tries to catch up to the new standard. I am of the opinion that the more res, the better; a huge amount of the input we get from games is visual, so the higher the visual fidelity, the higher the potential immersion factor.

I don't miss the excitement of new hardware from consoles; that's been replaced by the excitement of constant new hardware in the PC space. It's just a lot more fun to spec out and build your own machine.

I have a 120Hz monitor and I don't like dropping below 100fps.
 

Grief.exe

Member
We are talking about the aficionados, aren't we? The OP states that price doesn't matter. I doubt a $1500 rig won't be able to play next-gen titles at 1080p/60 FPS.


EDIT: Damn. I let my rage overwhelm my ability to comprehend English. Ignore the rest of this post. I'll leave it here for lols. I bolded the important part of the quote.


You are woefully ignorant of relative performance and costs.

I can build an $800-$1000 rig now that will literally last through the next generation without needing an upgrade. That is most likely going to be over a decade.

Wait until early 2013 and you will get significantly better performance for the same price.

If you don't know what you are talking about, why post?
 

alphaNoid

Banned
The biggest factor to discuss is really cost. Do you want to spend $1200 on a fairly high-end gaming PC, or do you want to spend $399 on a console that can do almost what a $1200 PC can? I say almost because developers can write code straight to the metal on a console, since it's a fixed platform. PC requires writing for the lowest common denominator, and going through an abstraction layer of drivers and APIs, so you are never going to be able to squeeze out all the power of a PC, whereas you definitely can with a console.

The average consumer doesn't want to manage drivers, edit INI files, sit at a computer, or deal with random crashes, etc. The average consumer will always pick the cheaper, better-optimized option.

The average consumer will buy a console because it's arguably the better option. The enthusiast, on the other hand, would argue otherwise, with just cause. It's just that the enthusiast is the enthusiast, and their platform is never going to be the best-bang-for-the-buck option. PC gamers can argue until they're blue in the face, but it will continue to fall on deaf ears. People don't care that your $1200 PC can edge out more IQ... they really don't.

I'm a PC gamer as well.
 
Bullshit. The scalability of PC games alone means you can't just throw out generalized statements like that. We're not talking about fixed hardware and settings.

Yeah, like Crysis 1?

You ran that at 1080p/60fps when it was released?

That is how PC gaming used to be: scalability used to mean you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p/60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.
 
Well, one thing is for sure: PC is the better experience for the certain number of individuals who truly get to enjoy it. With a console, everyone gets to experience the same graphics, so nobody really feels left out, unless they enter a PC thread and see what certain people have invested in their gaming PCs to get those high-fidelity graphics.
 

Grief.exe

Member
Well, one thing is for sure: PC is the better experience for the certain number of individuals who truly get to enjoy it. With a console, everyone gets to experience the same graphics, so nobody really feels left out, unless they enter a PC thread and see what certain people have invested in their gaming PCs to get those high-fidelity graphics.

What is this? Care bear kindergarten hour?
 

kinggroin

Banned
Yeah, like Crysis 1?

You ran that at 1080p/60fps when it was released?

That is how PC gaming used to be: scalability used to mean you could run it at an OK framerate on high-end hardware.

Not like it is today, when an average PC can run the latest multiplatform games at 1080p/60fps, simply because the game was developed with the 360 in mind.

Once the base moves on to next-gen consoles, PC is going back to sub-30fps.

Crysis was released in 2005?
 