

None of the launch games are going to code to the metal. But I kind of agree. Knack doesn't look too amazing visually in my opinion, so it's disappointing it's only 30fps.
That is my point exactly. It doesn't look so hot, yet the performance appears to be abysmal. I would expect a game with the visuals of Knack to actually run 60fps without breaking a sweat. Ditto Watch Dogs.
 
For the specs? i5 2500k OC, 8GB RAM, MSI TFIII 6950 OC.

Again, I'm simply assuming with the power of the 'closed box', it should be easier to utilise power, especially if you 'code to the metal' (whatever that means..)

Yes I know this thread has gotten a huge amount of backlash, but I'm just honestly surprised. Perhaps my expectations were too high. The PS3 has some visually fantastic games despite being on such dated hardware. The PS4 is a substantially stronger piece of hardware. But it doesn't appear to be that way.

So you've seen all the PS4 titles? LOL
 
It is. Launch and cross-generational games don't do it justice; they never did for any system. Here's what games can look like:
The Order 1886
http://blog.us.playstation.com/2013/06/10/the-order-1886-brings-victorian-era-to-playstation-4/

"In-engine" cut-scene. And I think it's more the framerates than the graphics which are in question here.

I'm surprised that the games don't run more smoothly on both platforms already, given they're both using x86 tech etc. It's not like devs are having to learn an entirely new architecture and instruction set.

Perhaps the 8-core/low-clock speed Jaguar CPU is holding things back somewhat, where more needs to be offloaded on to the GPU?
 
That's mid end now?

And even with that you wouldn't be getting 60 at 1080p. More like 40-50.
Is the 6950 a low-end card? I thought it was a mid-range GPU.

Yes the performance was a little iffy on AC3, but it felt like 60FPS for the majority of the game. Again, it wasn't maxed out.
 
Are the specs of the PS4 overhyped?

How is it that Watch_Dogs looks and runs like garbage on the PS4, while AC3 runs at 1080p 60fps on my 2 year old 'mid range' PC?

Why does Driveclub look and run like crap?

Honestly, the only games which even look remotely good are Killzone and Infamous on the system. And the former has had FPS problems


Most obvious troll in the history of trolling!
 
On a side note, on paper the PS4 is not a beast. It only looks beastly because the Xbox One's specs are decidedly lower end. In all honesty, both of these consoles should have been more powerful. The 8GB of GDDR5 is the one standout, but it will probably take time to get the most out of, especially since games are presently accustomed to using much less RAM. On top of that, getting all those Cell-heavy Sony first-party games to be more GPU-heavy will take time.

I disagree. For $400 with a controller, mic, and an HDMI cable? To me that is a beast when you consider the price.
 
I think people are just disappointed most games are not 1080p/60. Though at least we are getting native 1080p with the PS4 and XB1. There was a time before the official reveals when we thought we would still be stuck with 720p.
 
Of course it isn't as strong as people think.

All of the coding to the metal comments will not stop it from being a mid range pc from 2010.
 
This short-lived era of cross-gen games is going to suck and I can't wait until we're past it. You could never convince me that the next-gen version isn't being held back in some way.
 
It's only fair as that's how the WiiU was judged.
Are you expecting the PS4 games to look and run worse than their 360 counterparts? the PS4 is factually the most powerful console and obviously way more powerful than the WiiU.
 
We were talking about the PS3 playing games at 1080p when it was released.

So no, we have absolutely -zero- evidence that a company would ever hype its product too far.

A lot of PS3 early launch titles did run at 1080p/60fps actually. Ridge Racer and Wipeout are some examples.
 
To be honest I'm simply surprised at the graphical downgrades we have seen with Watch_Dogs.

The game appears to look worse and worse every time it is shown. Did you guys see the Jimmy Fallon video? I didn't expect Watch_Dogs to look like a masterpiece, but it almost looked like a current-gen game in that demo!

First off, the first videos you saw were made for you to drool over and were not on realistic target hardware.

Plus, and this is probably a big one, one year ago when you saw the first videos of Watch Dogs, you were probably not as accustomed to next-gen graphics, so your mind tricks you now into thinking that the visual quality of the game deteriorated over the last months.

In reality, you are probably just more used to next-gen graphics, with all this stuff shown lately, so Watch Dogs just looks mediocre now compared to the real big guns we saw at E3 this year.

A direct comparison between the first shown video and recent footage would be interesting.
 
MotorStorm looked like crap too at E3 2006. It looked awesome at the EU launch. If they also turn around DriveClub like that for the launch, they might be the weirdest studio ever.

I read the other day (someone's post on here, I think), that Evolution develop 'in layers', or whatever the term would be, i.e. as opposed to the vertical slice many devs use to show their games off. Would be interesting if true, and also will be interesting to see how Drive Club shapes up at launch as opposed to a fairly muted E3.
 
Are the specs of the PS4 overhyped?

How is it that Watch_Dogs looks and runs like garbage on the PS4, while AC3 runs at 1080p 60fps on my 2 year old 'mid range' PC?

Why does Driveclub look and run like crap?

Honestly, the only games which even look remotely good are Killzone and Infamous on the system. And the former has had FPS problems

I have to agree with OP on this one, but his comparison is not very good. I have a feeling the Xbox One, while less powerful on paper, could perform better than the PS4 in games. Pretty much every game Microsoft announced was 60fps while Sony's studios are making 30fps games. Sure, Forza does not have a day/night cycle or realistic clouds and has prebaked lighting, but it damn well looks godly at that 60fps. Microsoft had my curiosity, but with the DRM change, they have my attention.
 
AC4 runs at 1080p 60fps on the PS4.

30 FPS according to Digital Foundry.

I assume that a Core i7 CPU by Intel is way faster than the AMD Jaguar APU, that was developed for notebooks, not desktop PCs.
An Nvidia graphics card for $200-300 should also be faster than the upcoming consoles.
 
I think you make a certain concession when you decide to game on a console: convenience over ultra-high visuals and fps (all the time).

pc will be better unless they mess up the port

plus it's too early to tell.

so yeah not as awesome as a PC
 
"In-engine" cut-scene.

According to the developers, the in-game graphics will look the same.

We also strived to create a seamless experience when it came to the game. The idea was to make sure that you never saw any visual discrepancies or breaks in continuity between gameplay and cinematic. Our game models and our cinematic models are one and the same, and everything is rendered real time in the engine as you play the game. The trailer we presented is a great example of that. What you saw is running in-engine, in-game with no gimmicks. These visuals are what you can expect of the final game when you play it.
 
"In-engine" cut-scene. And I think it's more the framerates than the graphics which are in question here.

I'm surprised that the games don't run more smoothly on both platforms already, given they're both using x86 tech etc. It's not like devs are having to learn an entirely new architecture and instruction set.

Perhaps the 8-core/low-clock speed Jaguar CPU is holding things back somewhat, where more needs to be offloaded on to the GPU?

Our game models and our cinematic models are one and the same, and everything is rendered real time in the engine as you play the game. The trailer we presented is a great example of that. What you saw is running in-engine, in-game with no gimmicks. These visuals are what you can expect of the final game when you play it.


Wipeout wasn't full 1920x1080 though, the resolution dynamically changed to maintain the frame rate.

And that is surely better than dropping frames. I wonder why more games didn't use it.
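For what it's worth, the control logic isn't complicated. Here's a rough sketch of how a dynamic-resolution heuristic like that could work — the frame budget, scale step, and thresholds are made-up illustrative numbers, not anything from Wipeout's actual code:

```python
# Hypothetical dynamic-resolution controller: shrink the horizontal
# render resolution when a frame runs over budget, grow it back when
# there is comfortable headroom. All values are illustrative only.
TARGET_MS = 16.7                  # frame budget for 60fps
MIN_WIDTH, MAX_WIDTH = 1280, 1920
STEP = 64                         # resize granularity in pixels

def next_width(current_width, last_frame_ms):
    """Pick the render width for the next frame."""
    if last_frame_ms > TARGET_MS and current_width > MIN_WIDTH:
        return current_width - STEP   # over budget: render fewer pixels
    if last_frame_ms < TARGET_MS * 0.85 and current_width < MAX_WIDTH:
        return current_width + STEP   # plenty of headroom: sharpen back up
    return current_width              # near budget: hold steady

# e.g. a 20 ms frame at 1920 wide drops the next frame to 1856 wide
print(next_width(1920, 20.0))
```

The upscaler then stretches the narrower frame to the display's 1080 lines, which is why you trade a bit of sharpness instead of dropped frames.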
 
A lot of PS3 early launch titles did run at 1080p/60fps actually. Ridge Racer and Wipeout are some examples.

A) Wipeout HD wasn't out at launch. I *think* it was summer 2008 when it was released, and B) do you have any other examples of early PS3 games than just those two?! While I don't really care about the number of pixels and visual quality as much as art style, I think Sony definitely stretched the truth in terms of the 1080p60fps dream in the run-up to the PS3 launch.
 
Will be kind of funny (and sad) when phones drive games at higher resolutions in a couple of years.

Not really. Because the phone isn't driving the game at the same fidelity.

If you wanted, you could go make a 4K/120fps game on PS4/Xbone. It wouldn't have the fidelity of a 1080p/30 game though.
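The raw pixel math backs that up. Back-of-the-envelope numbers only, ignoring everything but how many pixels get shaded per second:

```python
# Pixel-throughput comparison: 4K at 120fps pushes ~16x the pixels of
# 1080p at 30fps, so each pixel gets ~1/16 of the per-pixel budget
# (shading, lighting, effects) on the same hardware.
px_4k120 = 3840 * 2160 * 120    # ~995 million pixels/sec
px_1080p30 = 1920 * 1080 * 30   # ~62 million pixels/sec
ratio = px_4k120 / px_1080p30
print(ratio)                    # 16.0
```

Same silicon, sixteen times fewer cycles per pixel — that's the fidelity gap in one number.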
 
I have to agree with OP on this one, but his comparison is not very good. I have a feeling the Xbox One, while less powerful on paper, could perform better than the PS4 in games. Pretty much every game Microsoft announced was 60fps while Sony's studios are making 30fps games.

To be honest, I don't trust that those games were running on the actual hardware, no matter what MS said.

And all this seems to me like a turning point in my PS4 hype: while it will perform better than the One, is it still not as powerful as we expected?
 
I read the other day (someone's post on here, I think), that Evolution develop 'in layers', or whatever the term would be, i.e. as opposed to the vertical slice many devs use to show their games off. Would be interesting if true, and also will be interesting to see how Drive Club shapes up at launch as opposed to a fairly muted E3.

Sounds fun. Anyone have that link? I'd like to see more of it.
 
To be honest I'm simply surprised at the graphical downgrades we have seen with Watch_Dogs.

The game appears to look worse and worse everytime it is shown. Did you guys see the Jimmy Fallon video? I didn't expect Watch_Dogs to look like a masterpiece, but it almost looked like a current-gen game in that demo!

Well, to be fair, that's an Ubisoft game; that's their MO.
 
It will be 3 years old by the time the consoles are released, and it was never an enthusiast product. (That's not to say it isn't a good card, but it's far from "high-end" these days)

Does that mean my 6870 is low end now? Even though it can run most games on max, or close to max, settings?


Think we need to introduce new tiers.
 
"In-engine" cut-scene. And I think it's more the framerates than the graphics which are in question here.

I'm surprised that the games don't run more smoothly on both platforms already, given they're both using x86 tech etc. It's not like devs are having to learn an entirely new architecture and instruction set.

Perhaps the 8-core/low-clock speed Jaguar CPU is holding things back somewhat, where more needs to be offloaded on to the GPU?

They have said there won't be any difference in cutscenes and gameplay.
 