
[Eurogamer\DF] Orbis Unmasked: what to expect from the next-gen PlayStation.

sTeLioSco

Banned
The math involved in actual realistic lighting is way beyond what is currently used in many of this generation's games. Sure, some tricks like SSAO, HDR, etc. have been used, but that is nothing compared to well-executed ray tracing. There won't be ray tracing next gen, but the use of dynamic lighting will be something impressive compared to the often flat and lifeless lighting of the current gen. Not to mention simple things like shadow resolution increases.

thx
 
The math involved in actual realistic lighting is way beyond what is currently used in many of this generation's games. Sure, some tricks like SSAO, HDR, etc. have been used, but that is nothing compared to well-executed ray tracing. There won't be ray tracing next gen, but the use of dynamic lighting will be something impressive compared to the often flat and lifeless lighting of the current gen. Not to mention simple things like shadow resolution increases.

Andy, I know you're getting bored to death of this, but I'm throwing another question your way. All these rumors about imminent console reveals: any weight behind this stuff, or are we waiting till E3 like we always do (except with the 360)? I'm just trying to keep my mind as balanced as possible without resorting to drinking.
 
I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

I guess Sony went with GDDR5 because they didn't want to delay the system.

Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.
 

McHuj

Member
Not for VRAM on PCs. I have 156 GB/s here.

One of the problems with comparing console to PC is that first you need to move the data (like a texture) from system memory to VRAM over a PCI-E bus.

Typical PC: 4/8 GB DDR3 => main memory bus at 25.6 GB/s => PCI-E at 16 GB/s => 1-3 GB VRAM => 100-200 GB/s to the GPU.

There's a bunch of latency and bottlenecks to get through. If you're on a small memory GPU, textures and other data are probably being thrown out and reloaded all the time.

No one has seen what is actually possible from a system like the PS4, which has unified memory on a very fast bus to both the CPU and GPU.
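To put rough numbers on that pipeline, here's a back-of-envelope sketch in Python of how long a single 100 MB texture takes to move at the bandwidths quoted above. The 176 GB/s figure is the rumoured Orbis GDDR5 bandwidth, and everything here ignores protocol overhead and latency, so treat it purely as illustration.

```python
# Back-of-envelope only: bandwidths are the figures quoted above plus the
# rumoured 176 GB/s GDDR5 number; real transfers have extra overhead.
def transfer_ms(size_gb, bandwidth_gb_s):
    """Ideal time in milliseconds to move size_gb at bandwidth_gb_s."""
    return size_gb / bandwidth_gb_s * 1000.0

texture_gb = 0.1  # a 100 MB texture

# Discrete PC path: system RAM -> PCI-E -> VRAM, then VRAM -> GPU.
print(transfer_ms(texture_gb, 25.6))    # ~3.9 ms to read it out of DDR3
print(transfer_ms(texture_gb, 16.0))    # ~6.3 ms across PCI-E x16
print(transfer_ms(texture_gb, 176.0))   # ~0.6 ms once it's already in GDDR5

# Unified-memory console path: the GPU reads straight from the shared
# GDDR5 pool, so the PCI-E copy above simply never happens.
```

The point isn't the exact numbers; it's that the PCI-E hop dominates whenever data has to be re-uploaded, and a unified pool removes that hop entirely.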
 
Unless they're looking to blow more cash... I doubt it.

But I could be wrong.
I doubt it too. From the rumours going around it seems like MS had the advantage and then Sony pulled a fast one and evened things up. At this point things should be set in stone.

Some people seem to think MS are just going to take the advantage based on their cash flow, but things are never that simple. If it were, they would be far more dominant in the markets they've tried to enter.
 

Magpul

Banned
Judging by this, Sony has real potential to pull ahead in the area of first-party games, given the potential of Orbis. Here's hoping they invest heavily in it.
 
I dunno if I see myself buying a Durango if MS doesn't step up their first-party offerings. This gen I bought a 360 before a PS3, but in the long run I'm not sure I'm totally satisfied with my purchase. Halo and Gears are the only two exclusives to write home about, and I'm not a huge fan of Halo >_>

And paying for online play is a barrier to entry for me, especially if there's no built-in WiFi on the system again.
 

Reiko

Banned
I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

I guess Sony went with GDDR5 because they didn't want to delay the system.

Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.

Until final specs are revealed... Don't jump the gun.
 

tipoo

Banned
I have a Surface with 2 GB of RAM and I have yet to see my usage go above 1 GB with a whole slew of apps open and a bunch of browser tabs. 3 GB of RAM is overkill for the next box to reserve. As we can see from the 360, not much is necessary for the game-critical aspects of the OS.


But these new consoles will have to last until about 2020 the way things are going, perhaps longer, maybe 2023-ish. Maybe the OS will keep getting updates and Microsoft knows it will need more RAM eventually, so they tell developers they can only use the total minus 3 GB to be safe. Because if games were written to use more than that right now, later OS updates that eat into that 3 GB could break them.

Besides, wasn't there also a rumor that they whittled it down to 1.5 GB and are targeting 1 GB? Maybe the 3 GB was just an early development precaution.


I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

And the year after that, even greater hardware will be out. And the year after that, and the year after that. There's always something much better on the horizon; they have to launch some time. I'd rather not keep waiting. Late 2013 with the hardware I've been hearing about is good enough for me. Consoles can pull off some impressive stuff with the limited hardware they have.
 

Karak

Member
Weird request.

Can someone help me out and either PM me or post in this thread and answer these questions? I need them when I talk to my source tonight. Something that was said is starting to really make me think along unique lines for future plans.

1. Do any of the streaming game companies use a combination of local hardware and remote hardware in their solutions? Terrain generation, deferred lighting, or other calculations done remotely and then merged on the PC? Gaikai or OnLive? More importantly, did they plan to mix local and remote hardware in their streaming technology, and did they run into problems doing it?

2. What speed did Gaikai or OnLive indicate as their "best performance" internet speed?

3. What is the interface speed, or the delay, between a computer receiving the data over the internet and putting it on screen in either of those two services (or similar ones)? It's got to run around in the PC and shit before hitting the display. I am wondering if that is taken into account in the overall latency.

I am really sorry, but I am going to have a long drive and spotty internet. Something in the MS PowerPoint, a missing bit in the known Durango specs, and, much more importantly, something Corrinne Yu kept talking about in interviews is making me go in a direction I didn't assume before. Even if it's just explaining the weirdness.

Thanks to whoever does this! And if none of you do... I still love ya.
 
I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

I guess Sony went with GDDR5 because they didn't want to delay the system.

Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.

It won't matter a great deal, I don't think. Core 2 CPUs and GeForce 8800s had the current gen completely smoked a year or so after the consoles launched, and I feel the hardware has held up decently well (the performance perhaps not so much).

The market declines we have seen pretty much have them over a barrel to release this year, IMO; they can't afford to just keep fading out of people's minds.
 
I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

I guess Sony went with GDDR5 because they didn't want to delay the system.

Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.
what? they're dropping the voxel lighting from the engine? dayum... it looked so damn beautiful though. i sad
 
Weird request.

Can someone help me out and either PM me or post in this thread and answer these questions? I need them when I talk to my source tonight. Something that was said is starting to really make me think along unique lines for future plans.

1. Do any of the streaming game companies use a combination of local hardware and remote hardware in their solutions? Terrain generation, deferred lighting, or other calculations done remotely and then merged on the PC? Gaikai or OnLive? More importantly, did they plan to mix local and remote hardware in their streaming technology, and did they run into problems doing it?

2. What speed did Gaikai or OnLive indicate as their "best performance" internet speed?

3. What is the interface speed, or the delay, between a computer receiving the data over the internet and putting it on screen in either of those two services (or similar ones)? It's got to run around in the PC and shit before hitting the display. I am wondering if that is taken into account in the overall latency.

I am really sorry, but I am going to have a long drive and spotty internet. Something in the MS PowerPoint, a missing bit in the known Durango specs, and, much more importantly, something Corrinne Yu kept talking about in interviews is making me go in a direction I didn't assume before. Even if it's just explaining the weirdness.

Thanks to whoever does this! And if none of you do... I still love ya.

These are all far above my pay grade. Someone else needs to jump on this stat!
 
Darn. Looks like the thread shifted to doom and gloom. But anyway.

I cannot believe the Orbis is going to have 4 GB of GDDR5. Maybe I'm missing something, but that seems like a crazy advantage. Unless MS really has shifted its focus, I'm not sure how or why they wouldn't have something in the Durango to make up for it.

It's been speculated that 3.5 GB of that will be dedicated to games, with 512 MB reserved for the OS. But I really like the idea that when a game is paused the OS inflates to 1 GB, the same way the Vita OS functions.
 

Karak

Member
These are all far above my pay grade. Someone else needs to jump on this stat!

Yes, please do. This isn't a bullshit request or a random thing.
I mean, it may be... but it sure as fuck doesn't feel that way now, especially noticing just how closely MS is following certain patterns.

I am out.
 
Weird request.

Can someone help me out and either PM me or post in this thread and answer these questions? I need them when I talk to my source tonight. Something that was said is starting to really make me think along unique lines for future plans.

1. Do any of the streaming game companies use a combination of local hardware and remote hardware in their solutions? Terrain generation, deferred lighting, or other calculations done remotely and then merged on the PC? Gaikai or OnLive? More importantly, did they plan to mix local and remote hardware in their streaming technology, and did they run into problems doing it?

2. What speed did Gaikai or OnLive indicate as their "best performance" internet speed?

3. What is the interface speed, or the delay, between a computer receiving the data over the internet and putting it on screen in either of those two services (or similar ones)? It's got to run around in the PC and shit before hitting the display. I am wondering if that is taken into account in the overall latency.

I am really sorry, but I am going to have a long drive and spotty internet. Something in the MS PowerPoint, a missing bit in the known Durango specs, and, much more importantly, something Corrinne Yu kept talking about in interviews is making me go in a direction I didn't assume before. Even if it's just explaining the weirdness.

Thanks to whoever does this! And if none of you do... I still love ya.

I don't give a shit about no cloud gaming, man! Ask about the OS memory reserve, ask about the special sauce, ask about the CPU and GPU.

Honestly? Tell the guy you wanna know everything.
 

Reiko

Banned
These are all far above my pay grade. Someone else needs to jump on this stat!

[image]
 

Striek

Member
Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.

Where did they say 2.5 TFLOPS?
From E3:
Nvidia: What is the target platform for UE4? What kind of hardware are gamers going to need to run UE4-based games?

Tim Sweeney: Unreal Engine 4's next-generation renderer targets DirectX 11 GPUs and really starts to become interesting on hardware with 1+ TFLOPS of graphics performance, where it delivers some truly unprecedented capabilities. However, UE4 also includes a mainstream renderer targeting mass-market devices with a feature set that is appropriate there.


The TFLOPS argument is stupid anyway. I remember MS said the X360 was 1 TFLOPS and Sony fired back that the PS3 had 2 TFLOPS (1.8 of which was the GPU). Now it's a new generation and both successors have nearly the same figures bandied about them, heh.
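For anyone wanting to sanity-check the figures being thrown around: the theoretical number is just shader count x clock x 2 (one fused multiply-add counts as two FLOPs). A quick sketch is below; the shader counts and clocks are illustrative examples of the kinds of configurations being rumoured, not confirmed specs for either console.

```python
def peak_tflops(shaders, clock_ghz, flops_per_cycle=2):
    """Peak single-precision TFLOPS: ALUs * clock * 2 (one FMA = 2 FLOPs)."""
    return shaders * clock_ghz * flops_per_cycle / 1000.0

print(peak_tflops(1280, 0.85))  # ~2.18 TFLOPS, roughly a 7970M-class part
print(peak_tflops(1152, 0.80))  # ~1.84 TFLOPS
print(peak_tflops(768, 0.80))   # ~1.23 TFLOPS
```

Which is also why the old 1 vs 2 TFLOPS marketing numbers never told you much: peak arithmetic says nothing about how much of it a real renderer can actually keep busy.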
 
Weird request.

Can someone help me out and either PM me or post in this thread and answer these questions? I need them when I talk to my source tonight. Something that was said is starting to really make me think along unique lines for future plans.

1. Do any of the streaming game companies use a combination of local hardware and remote hardware in their solutions? Terrain generation, deferred lighting, or other calculations done remotely and then merged on the PC? Gaikai or OnLive? More importantly, did they plan to mix local and remote hardware in their streaming technology, and did they run into problems doing it?

2. What speed did Gaikai or OnLive indicate as their "best performance" internet speed?

3. What is the interface speed, or the delay, between a computer receiving the data over the internet and putting it on screen in either of those two services (or similar ones)? It's got to run around in the PC and shit before hitting the display. I am wondering if that is taken into account in the overall latency.

I am really sorry, but I am going to have a long drive and spotty internet. Something in the MS PowerPoint, a missing bit in the known Durango specs, and, much more importantly, something Corrinne Yu kept talking about in interviews is making me go in a direction I didn't assume before. Even if it's just explaining the weirdness.

Thanks to whoever does this! And if none of you do... I still love ya.

1) No, it would be incredibly difficult. The game would have to have been designed around such a thing. It would also necessitate a partial install, which is kind of counter to the idea of streaming games.

2) I can only say the minimum is 3 Mbps and the recommended is 5 Mbps+.

3) Gaikai captures input and renders at 60 fps while only sending every other frame to the client. This has the effect of mitigating latency to where it appears as if there's none at all. OnLive typically appeared to add another 50-100 ms of latency.
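To give a feel for where streamed latency actually goes, here is a toy budget; every figure below is an assumption picked for illustration, not a measured number from Gaikai or OnLive.

```python
# Hypothetical cloud-streaming latency budget (milliseconds).
# Every number is an assumption for illustration only.
budget_ms = {
    "controller input to server": 15.0,   # client capture + upstream network
    "server renders one frame":   16.7,   # one frame at 60 Hz
    "video encode":               10.0,
    "downstream network":         20.0,
    "client decode":               8.0,
    "display processing":         20.0,   # the 'runs around in the PC/TV' part
}

print("end-to-end: ~%.0f ms" % sum(budget_ms.values()))  # ~90 ms with these guesses
for stage, ms in budget_ms.items():
    print("  %-28s %5.1f ms" % (stage, ms))
```

Shaving any single stage only helps so much; the total is what you feel on the pad.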
 

Kagari

Crystal Bearer
I'm a bit disappointed in both consoles, but more so in Orbis. I think 2014 would have been the best year to release a console: HMC, 3D stacking, full HSA, Maxwell GPUs (which are supposed to be a big improvement) with Volcanic Islands as their rival. A console with that technology could comfortably last a decade.

I guess Sony went with GDDR5 because they didn't want to delay the system.

Another reason to be disappointed is that Unreal Engine 4 is dropping its voxel lighting system, which I believe was the engine's strongest point. This can only be because the systems aren't powerful enough. I guess when Epic said they needed 2.5 teraflops they weren't messing around, secret sauce be damned. I hope some breakthrough is introduced during the spec analysis at GDC.

No.
 
I can't see either of those retailing for less than 400 quid/dollars. It's going to be interesting to see what happens this gen with regard to market share and future third-party support.

Mind you, those of you waiting to see what these consoles are capable of at E3 should wait until they're released, or at least until there are videos showing actual gameplay. If Sony's 2006 E3 is anything to go by, I'm expecting this year's E3 to be full of bullshot FMVs and talk of theoretical rather than achievable performance, lol.
 

coldfoot

Banned
The one with the most RAM. Both could have great AF. So it's a wash.
Not necessarily true when you have streaming. Uncharted manages to have great textures despite the PS3's RAM. Besides, the GPU can only use ~1 GB of data per frame in the 360 vs ~3 GB per frame on the PS3. I think both of those limits are plenty.
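For what it's worth, a per-frame figure like that is just memory bandwidth divided by frame rate. Here's a quick sketch of that arithmetic using the bandwidth numbers being rumoured for the two next-gen machines; both figures are rumours and assumptions, not confirmed specs.

```python
def gb_per_frame(bandwidth_gb_s, fps):
    """Upper bound on data the GPU can touch in one frame: bandwidth / frame rate."""
    return bandwidth_gb_s / fps

# Rumoured figures only: ~68 GB/s DDR3 (Durango) vs ~176 GB/s GDDR5 (Orbis).
for name, bw in [("~68 GB/s", 68.0), ("~176 GB/s", 176.0)]:
    print(name, "->", round(gb_per_frame(bw, 60), 2), "GB @ 60 fps,",
          round(gb_per_frame(bw, 30), 2), "GB @ 30 fps")
```

It's an upper bound: real frames re-read the same buffers many times, so the usable budget for unique texture data is smaller still.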
 

McHuj

Member
Weird request.

Can someone help me out and either PM me or post in this thread and answer these questions? I need them when I talk to my source tonight. Something that was said is starting to really make me think along unique lines for future plans.

1. Do any of the streaming game companies use a combination of local hardware and remote hardware in their solutions? Terrain generation, deferred lighting, or other calculations done remotely and then merged on the PC? Gaikai or OnLive? More importantly, did they plan to mix local and remote hardware in their streaming technology, and did they run into problems doing it?

[...]

1. You can't really split the computation, since the data you're computing on (and, for example, exchanging between the CPU and GPU) is very, very high bandwidth (GB/s) and very low latency (nanoseconds).
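A rough comparison of what a frame's worth of CPU/GPU traffic looks like against what a home connection can deliver makes the mismatch obvious; every number below is an assumption chosen just for the comparison.

```python
# Illustrative only: all figures are assumptions for this comparison.
frame_ms     = 1000.0 / 30      # ~33 ms budget per frame at 30 fps

onbox_gb_s   = 100.0            # CPU<->GPU/memory traffic, order of 100 GB/s
onbox_rtt_ms = 0.001            # round trips measured in microseconds or less

net_gb_s     = 5e6 / 8 / 1e9    # a 5 Mbit/s connection ~ 0.000625 GB/s
net_rtt_ms   = 50.0             # a decent internet round trip

print("movable in one frame on-box:    %.2f GB" % (onbox_gb_s * frame_ms / 1000))
print("movable in one frame over net:  %.6f GB" % (net_gb_s * frame_ms / 1000))
print("round trips per frame: on-box ~%.0f, internet ~%.2f"
      % (frame_ms / onbox_rtt_ms, frame_ms / net_rtt_ms))
```

With these guesses you can move a few gigabytes per frame inside the box but only a few tens of kilobytes per frame over the wire, and you don't even get one network round trip per frame.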
 
Weird request.

1. I've heard of locally drawn head-up displays and such, but nothing as complicated as you suggest.
Not quite the same, but Square's CoreOnline is a browser plugin that downloads level data and then lets you play it rendered locally. I just tried it and was playing Tomb Raider: Underworld in less than a minute.
 

RaijinFY

Member
Cell was never meant to be used alone.
They were thinking about making a discrete GPU based on Cell (basically SPEs + ROPs), but it didn't perform as expected for graphics, and they had to couple Cell with a traditional GPU.
Generally speaking, relying on many different computation elements to obtain the best possible performance has always led to uneven results, and it's not a developer-friendly approach. Many just won't bother, or won't have the budget, time or knowledge to explore all the hardware peculiarities.
That is also why I don't think it's the case with Durango, because if I got the idea right, they're not adding stuff as co-processors to increase the raw power (which doesn't make a lot of sense; at that point they would just choose more powerful main hardware). It's more about specific solutions performing fixed functions, to help get the most out of what's inside while keeping costs low.

From what I have read on B3D, the "Cell GPU" babbling has always been bullshit. The GPU that was supposed to be inside the PS3 was a joint Sony-Toshiba effort, comparable to what they had done with the PS2. But it wasn't in any case a "Cell GPU".
 

mephixto

Banned
So the PS4 and NextXbox are gonna rock something similar to a 7970M? That's disappointing; that system can barely hold 55 fps in Crysis 2 on Ultra. The article video shows one of the least demanding sections of the game.
 
So the PS4 and NextXbox are gonna rock something similar to a 7970M? That's disappointing; that system can barely hold 55 fps in Crysis 2 on Ultra. The article video shows one of the least demanding sections of the game.

How does it handle 30 fps?

IMO, using 60 fps as a measuring stick on consoles will always be useless, since most developers will sacrifice frame rate to push better graphics out of the system.
 
So the PS4 and NextXbox are gonna rock something similar to a 7970M? That's disappointing; that system can barely hold 55 fps in Crysis 2 on Ultra. The article video shows one of the least demanding sections of the game.

Read the article, please. I wish we could somehow enforce the notion that to discuss a referenced topic, people are assumed to have read the reference.
 
So the PS4 and NextXbox are gonna rock something similar to a 7970M? That's disappointing; that system can barely hold 55 fps in Crysis 2 on Ultra. The article video shows one of the least demanding sections of the game.

The whole system will be more powerful because of more and faster RAM, an 8-core CPU, a dedicated compute unit, etc.
 
I know there's an improvement in hardware every year, but there's only a significant improvement every 4-5 years. For example, moving to Sandy Bridge CPUs was a huge improvement. Now Haswell might be the next step, since it brings power efficiency to a new level; the next 2-3 years will just be refinements of that, no big deal. 3D stacking is another huge improvement that would work well in consoles, especially with DDR4, which is also coming soon and will remain in the market for many years, just like previous RAM setups. These are technologies that would be very beneficial to consoles. I'm not talking about releasing in 2015 or 2016, because new shit will always hit the block. It's specifically 2014, because there are significant improvements there that are well worth the wait.

As for my source on UE4, check the Beyond3D forum, the last few pages of the next-gen thread. I believe a mod and another person spoke to the studio personally about it.

I know some are saying to wait for final specs, but how is it that for the past 1-2 years the rumours from different sources have been so consistent? This leads me to believe that we already know the bulk of what will reside in the next-gen consoles.
 