BlockheadBrown
Some didn't think Halo would be that big of a deal either. It doesn't matter what the game is called; what matters is marketing presence, and I don't see Sony ever getting near what Microsoft accomplishes with Halo.
I think Sony definitely needs an FPS "Halo killer" type of game near the launch of the PS4, but I don't think the Killzone franchise has the reputation to be that. It already has a certain stigma of "yeah, this is a solid series, but it isn't essential," whereas Halo is truly a sensation for gamers. If it were up to me, I would assign Guerrilla to a new FPS IP, give them plenty of resources, and hype the shit out of it upon release. I don't even like FPS games personally, but that's what I would do in Sony's shoes.
http://www.eurogamer.net/articles/digitalfoundry-vs-killzone-3?page=2
Don't know how 116ms compares to its peers among 30fps shooters. Still seems like a lot to me, but KZ3 certainly felt much more responsive than its predecessor.
Give me MAG2 at launch and Sony will have my $500.
Socom deserves another shot.
Cool - you made a post showing I'm right. For the record, the dude didn't reference shooters specifically.
I also don't care what genre it is - playability is important.
But you're not right. 116ms is very good for a 30fps game. At the time the game was made, 100ms was believed to be the theoretical minimum for a 30fps game, because only one or two games had accomplished it. It's pretty clear that they took the issue seriously, worked hard to improve latency, and managed to improve it a lot.
116ms is the same as the vast majority of 30fps games.
They need to give Socom another chance. I personally think they will remake Socom 2 as a PS4 launch title.
Which were also blasted for their video latency (peripheral lag), albeit they weren't Sony exclusives and therefore didn't get much heat for it.
The whole thing was completely overblown. KZ2 didn't have much more controller lag than other games tested back then, like GTA IV etc.
I'm just looking forward to less video latency, which means less feel of peripheral lag. Some TVs are nailing video latency at 8 and 16ms (not the printed "response time," which really measures pixel state change and is therefore misleading). Here's hoping we can get PC-like experiences where the next frame refresh is the only frame to wait for - unless you run deferred processes that compound rendering times (cough Frostbite 2 cough), then it's just shit.
AMD would get into shitloads of trouble I presume.
How's the latency on LED displays these days (I remember Samsung LCDs were among the weaker performers)? Our family bought a plasma at my request just so I could sidestep this issue.
Not necessarily.
Remember how IBM piggybacked off CELL for the XENON processor in the X360?
CELL was a three-company investment (Sony/Toshiba/IBM); it wasn't just meant for games. IBM probably owned that part of the design.
The situation here is that MS hired AMD to design for them, and MS will own the "rights" to it. From what little I understand.
Cell was meant for games, not just for general-purpose computing. UC3 and GOW3 are possible because of Cell, since RSX is weaker compared to Xenos.
Umm, no. Cell is a processor that's great for bandwidth but not latency. It's a powerful processor, but games need speed, not throughput. Sure, it can crunch a lot of numbers, but what's the point if it's difficult to get your result in real time compared to other solutions?
The thing is that KZ MP has to do something new. If it's like Halo or COD, then what's the point?
Speaking of launch titles and GT6: we all know Polyphony likes to take their time with the series. Could it be possible for them to release a GT6 Prologue when the PS4 launches, given how many copies GT5 Prologue sold?
116ms @ 30fps - the math does not work. For 60fps, sure, it does; for 30, no. Figure it out if you don't get it: it's 99ms or 132ms for 30fps, not 116.
A lot more games using a forward renderer, with no vsync and no triple buffering, hit 99ms - which is a majority.
Uncharted 1, for example, nailed the 3-frames-at-30fps rule, then jumped to 4 frames at 30fps for Uncharted 2 once vsync and an extra buffer were implemented.
Deferred rendering, AA, etc. all add those precious milliseconds to rendering a single frame.
Then we must consider framerate drops: frames carried over and displayed from the buffer to keep from tearing or dropping - holdovers - increase latency even further.
Since most games don't use deferred rendering, vsync, or double/triple buffering (as seen by countless torn frames, framerate drops, etc. in 90% of games), the majority hit the mark.
The average videogame runs at 30FPS, and appears to have an average lag in the region of 133ms.
Multiple measurements with Xbox 360 Need for Speed: Hot Pursuit confirm a five-frame delay - so 83ms in total.
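The frame-count arithmetic in these latency figures is easy to sanity-check: latency is just pipeline depth in frames times frame time. A minimal sketch (the helper name is mine, not from any of the posts):

```python
def frame_latency_ms(frames: int, fps: int) -> float:
    """Input-to-display latency when a game's pipeline is `frames` frames deep at `fps`."""
    return frames * 1000 / fps

# 3 frames at 30fps -> ~100ms (the "3@30" figure; 99ms if you call a frame 33ms)
print(round(frame_latency_ms(3, 30)))  # 100
# 4 frames at 30fps -> ~133ms, matching the 133ms average quoted above
print(round(frame_latency_ms(4, 30)))  # 133
# 5 display frames at 60Hz -> ~83ms, matching the Hot Pursuit measurement
print(round(frame_latency_ms(5, 60)))  # 83
```

Note that KZ3's measured 116ms sits between the 3-frame (~100ms) and 4-frame (~133ms) marks, which is why the thread disputes whether "the math works".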
Planetside 2 pre-installed on the PS4 would be a damn good move.
MS will own the rights to one specific design. They will not own the rights to the philosophy behind components and processors that accomplish a certain function in general.
It's why I think the whole hype surrounding the special chips (especially those claiming it will turn a 1.2TF chip into a 3TF chip) is largely horseshit.
In the case of Xenos, unified shaders were a known quantity and AMD was able to advance their technology so it was included first in the Xbox 360.
If these magical chips were so great, we'd see them in an AMD or nVidia roadmap for chips coming out in the future. But we don't. So that tells me these chips aren't magic, but just something tailored to Microsoft's goals for the project as a whole, rather than the next big step in graphics tech.
Haha, I like the "What?" when asked if he has an Orbis dev kit, followed by the smile. All but confirmation.
I'm actually not so sure. Not everyone likes shooters, and having one preinstalled might put the "shooter console" stigma onto the PS4, something I think the 360 has suffered from.
The thing is that KZ MP has to do something new. If it's like Halo or COD, then what's the point?
Prepare to be proven wrong.
And KZ2 did that.....
KZ3 tried to be a bit more COD which was disappointing.
I really miss KZ2 back in its prime. Man, the beta was soooo fuckin' good.
So explain why this isn't in any of the upcoming GPUs. If it were so great, you'd think AMD would want a competitive advantage against nVidia.
To be quite honest, the custom silicon is just that: CUSTOM. You have to understand, peak FLOPS are not the only determinant of performance. GAF has been fixated on that number since the beginning. I don't know about this "bitter" thing, but I do know it will have DSPs, large caches, and eDRAM.
The machine will be 1.28 TF with 8 GB DDR3/4 alongside eDRAM, like we've mentioned previously over and over and over again.
It won't be long till someone leaks it all.
It's custom, worked on by very smart AMD engineers, that also have an interest in progressing GPU technology forward.
It's not like AMD is just going to develop some groundbreaking new tech for Microsoft exclusively, knowing full well of its own potential internally when they pitched the idea to Microsoft.
Like I said before, the processor arrangement in Durango is probably custom-made for Microsoft's overall goals for the project, but it does nothing magical to increase performance efficiency per watt in a general sense for overall graphics output. It may be tailor-made for certain processing functions that Microsoft wants to emphasize, but in the end Microsoft is limited to the same power and heat constraints that Sony is for a closed box.
PS4 is 1.84 TF, with 4 GB of RAM at 192 GB/s, alongside 4 Steamroller cores. These are your next-gen systems. Derive what you can and will.
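For context on where a headline TF number like this comes from: peak single-precision FLOPS for a GPU is typically shader count × 2 ops per clock (fused multiply-add) × clock speed. A quick sketch - the 1152-shader/800 MHz decomposition is an illustrative assumption that happens to reproduce the 1.84 TF figure, not something stated in this thread:

```python
def peak_tflops(shaders: int, clock_mhz: int, ops_per_clock: int = 2) -> float:
    """Peak single-precision throughput: shaders x ops-per-clock (2 for FMA) x clock (MHz)."""
    return shaders * ops_per_clock * clock_mhz / 1_000_000

# 1152 shaders at 800 MHz reproduces the quoted "1.84 TF" figure
print(peak_tflops(1152, 800))  # 1.8432

# Raw peak-compute gap between the two quoted numbers (not real-world performance)
print(1.84 / 1.28)             # 1.4375, i.e. roughly a 1.44x gap on paper
```

As the earlier post notes, peak FLOPS is only one determinant of performance; this arithmetic says nothing about memory bandwidth, caches, or fixed-function hardware.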
You and canchruser seem to be the only posters making sense, but I have a question: these are closed, dedicated GAMING consoles. With 1.28-1.84 TF and the types of RAM they both house, what can we expect performance-wise if we were to guess/compare against one of today's PC games?
edit: let's add sawyer to this as well
I thought the PS4 was going with Jaguar cores, unless I'm remembering wrong.
So what's up with the dev complaints? And devs being happier with x3?