I wonder if we'll see an updated version of it at GDC.
Not sure, that would leave a little over a month for noticeable improvements.
I wonder if we'll see an updated version of it at GDC.
Assumptions with no data... To me, the Unreal Engine 4 demo wasn't properly optimized in time for the PlayStation Meeting, and on top of that it was running on prototype hardware.
Anyway, we would need direct footage from the PS4 and from whatever specced-out PC Epic rolls out to compare.
Didn't someone from Epic confirm that UE4 was only using about 27% of the PS4's power and that tech demo was thrown together in like two weeks?
Originally Posted by almighty
And?
The original spec PS4 had 4GB, so that would be the same as a PC with a 2GB GTX 680 and 2GB of system RAM.
I seriously, seriously doubt that this demo uses over 2GB of main RAM, especially as the demo might not even be 64-bit.
The GTX 680 is just a much, much faster GPU than what's inside the PS4, and that's what the problem is, not lack of memory.
Originally Posted by MikeR
Honestly, you have no clue what you are talking about. This demo was created and run on approximation hardware, not final dev/PS4 silicon. Also, the demo was created within a constrained timeframe for the showing. Just for sh**s and giggles, the demo only used 27-29% of the approximation hardware's resources, unoptimized. Before you ask, there is no link; I am the link.
I don't believe this to be true, for a few reasons:
- The upgrade from Xbox 360 to the Xbox 3 is around 990 GFLOPS, so a difference of 610 GFLOPS between two consoles in the same generation is pretty big.
I don't remember anything like that, but Jonathan Blow only had a couple of weeks' notice to get a trailer and speech together. You could extrapolate from that and have an idea of how much time Epic had for their tech demo.
Can you show the maths behind this?
If you have this right, essentially a 1,000 GFLOPS increase is considered a "generational leap" but a 600 GFLOPS increase is a "tiny, incremental difference".
I thought they removed the sparse voxel octree lighting solution, but they have a cheaper GI solution in place.
What does the Voxel version of GI allow that the "regular" GI doesn't allow?
What is Global Illumination and why is it so important to game realism?
Global Illumination refers to the calculation of light bouncing around a scene. GI is responsible for many of the subtle shading effects and ambience we see in real-world environments, as well as glossy and metallic reflections. Introducing real-time Global Illumination into Unreal Engine 4 is the biggest breakthrough in lighting since Unreal Engine 1 introduced real-time Direct Illumination in 1995.
Please give us an overview of how the algorithm works from generating the octree, to cone tracing, to the gathering pass.
The technique is known as SVOGI (Sparse Voxel Octree Global Illumination), and was developed by Andrew Scheidecker at Epic. UE4 maintains a real-time octree data structure encoding a multi-resolution record of all of the visible direct light emitters in the scene, which are represented as directionally-colored voxels. That octree is maintained by voxelizing any parts of the scene that change, and using a traditional set of Direct Lighting techniques, such as shadow buffers, to capture first-bounce lighting.
Performing a cone-trace through this octree data structure (given a starting point, direction, and angle) yields an approximation of the light incident along that path.
The trick is to make cone-tracing fast enough, via GPU acceleration, that we can do it once or more per-pixel in real-time. Performing six wide cone-traces per pixel (one for each cardinal direction) yields an approximation of second-bounce indirect lighting. Performing a narrower cone-trace in the direction of specular reflection enables metallic reflections, in which the entire scene is reflected off each glossy surface.
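The cone-tracing idea described above can be sketched in a few lines. This is a toy illustration, not Epic's implementation: `sample_voxels` is a hypothetical callback standing in for the octree lookup, and the step sizes, cone angles, and six-direction diffuse gather are simplifications of what the interview describes.

```python
import math

# Toy sketch of voxel cone tracing (not Epic's code): `sample_voxels` is a
# hypothetical callback standing in for a lookup into the sparse voxel
# octree, returning (light, opacity) at a given position and mip level.

CARDINALS = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def cone_trace(sample_voxels, origin, direction, half_angle, max_dist=16.0):
    """March along `direction`; as the cone widens with distance, sample
    coarser octree levels so one lookup covers the whole cross-section."""
    radiance, occlusion, dist = 0.0, 0.0, 1.0
    while dist < max_dist and occlusion < 1.0:
        radius = dist * math.tan(half_angle)           # cone footprint here
        level = max(0.0, math.log2(max(radius, 1.0)))  # coarser mip when wider
        pos = tuple(o + d * dist for o, d in zip(origin, direction))
        light, opacity = sample_voxels(pos, level)
        radiance += (1.0 - occlusion) * light          # front-to-back blending
        occlusion += (1.0 - occlusion) * opacity
        dist += max(radius, 0.5)                       # step grows with the cone
    return radiance

def indirect_diffuse(sample_voxels, point):
    """Six wide cones, one per cardinal direction, approximate
    second-bounce diffuse lighting as the interview describes."""
    return sum(cone_trace(sample_voxels, point, d, math.radians(30))
               for d in CARDINALS) / 6.0

# Demo against a hypothetical uniform field where every voxel glows faintly.
uniform = lambda pos, level: (0.5, 0.2)
print(indirect_diffuse(uniform, (0.0, 0.0, 0.0)))
```

A narrower cone in the reflection direction, as the next paragraph notes, would reuse the same `cone_trace` with a smaller `half_angle`.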
[Editor's Note: If the above sequence seems alien to you, it's because it is. Global Illumination requires a totally new lighting pipeline. In a traditional game, all indirect lighting (light that is bounced from a surface) is calculated in advance and stored in textures called lightmaps. Lightmaps give game levels a GI-look but since they are pre-computed, they only work on static objects.
In Unreal Engine 4, there are no pre-computed lightmaps. Instead, all lighting, direct and indirect, is computed in real-time for each frame. Rather than being stored in 2D textures, lighting values are stored in voxels. A voxel is a pixel in three dimensions: it has volume, hence the term "voxel."
The voxels are organized in a tree structure to make them efficient to locate. When a pixel is rendered, it effectively asks the voxel tree "which voxels are visible to me?" Based on this information it determines the amount of indirect light (Global Illumination) it receives.
The simple takeaway is this: UE4 completely eliminates pre-computed lighting. In its place, it uses voxels stored in a tree structure. This tree is updated per frame and all pixels use it to gather lighting information.]
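To make the "voxels in a tree" idea concrete, here is a minimal, illustrative sparse octree. It is not UE4's actual data structure; the class names, the fixed depth, and the single scalar radiance per voxel (instead of directional color) are simplifications for the sketch.

```python
# Minimal, illustrative sparse voxel octree. Each node covers a cube;
# children subdivide it into 8 octants, and only octants that actually hold
# light are allocated (hence "sparse"). A lookup walks one branch, so it
# costs O(depth) instead of scanning every voxel.

class OctreeNode:
    def __init__(self, center, half_size):
        self.center = center
        self.half_size = half_size
        self.children = {}   # octant index -> OctreeNode, allocated lazily
        self.radiance = 0.0  # simplified: one scalar, not directional color

    def octant(self, p):
        """Which of the 8 child octants contains point p (3-bit index)."""
        cx, cy, cz = self.center
        return (p[0] >= cx) | ((p[1] >= cy) << 1) | ((p[2] >= cz) << 2)

class SparseVoxelOctree:
    def __init__(self, half_size=32.0):
        self.root = OctreeNode((0.0, 0.0, 0.0), half_size)

    def insert(self, p, radiance, depth=5):
        """Voxelize one lit point: descend, creating nodes only as needed."""
        node = self.root
        for _ in range(depth):
            i = node.octant(p)
            if i not in node.children:
                h = node.half_size / 2
                cx, cy, cz = node.center
                node.children[i] = OctreeNode(
                    (cx + (h if i & 1 else -h),
                     cy + (h if i & 2 else -h),
                     cz + (h if i & 4 else -h)), h)
            node = node.children[i]
            node.radiance = max(node.radiance, radiance)

    def lookup(self, p, depth=5):
        """Answer a pixel's question: how much light is stored near p?"""
        node = self.root
        for _ in range(depth):
            i = node.octant(p)
            if i not in node.children:
                return 0.0   # empty space: nothing was ever stored here
            node = node.children[i]
        return node.radiance

svo = SparseVoxelOctree()
svo.insert((1.0, 2.0, 3.0), 0.8)          # voxelize one lit point
print(svo.lookup((1.0, 2.0, 3.0)))        # finds the stored light
print(svo.lookup((-10.0, -10.0, -10.0)))  # unlit region reads as dark
```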
PS4 GPU 1,840 GFLOPS - Xbox3 GPU 1,230 GFLOPS = 610 GFLOPS
Xbox 3 GPU 1,230 GFLOPS - Xbox360 GPU 240 GFLOPS = 990 GFLOPS
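As a sanity check on the numbers being thrown around (these are the thread's rumored figures, not official specs), the absolute gaps and the ratios tell different stories:

```python
# Rumored GPU throughput figures quoted in the thread (GFLOPS), not specs.
ps4, xbox3, xbox360 = 1840, 1230, 240

same_gen_gap = ps4 - xbox3       # 610 GFLOPS between next-gen rivals
cross_gen_gap = xbox3 - xbox360  # 990 GFLOPS across a generation

# Absolute gaps hide the ratios, which arguably matter more:
print(f"PS4 vs Xbox 3: {ps4 / xbox3:.2f}x")      # ~1.50x, same generation
print(f"Xbox 3 vs 360: {xbox3 / xbox360:.2f}x")  # ~5.1x, generational leap
```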
I feel like an idiot, but I'm guessing it's just that the voxel version calculates the bounces differently? Lol. =X
EDIT: Does anyone have a source for the demo not using the Voxel based stuff but still using GI?
So 50% more power in PS4 won't make any difference (1.23 -> 1.84), but a 50% increase from that to eg a 7950 (2.8) on PC will 'blow away' PS4?
Didn't someone from Epic confirm that UE4 was only using about 27% of the PS4's power and that tech demo was thrown together in like two weeks?
Anyone who knows anything will question someone who claims something uses "27%" of a console's power. What does that even mean? Prime95 uses 100% of a CPU's power; that doesn't mean anything.
Agreed, but it can make the difference between gaming at 1080p vs. 720p, or 60fps vs. 30fps, for mid-to-late-gen games, depending on whether Durango shows its limitations first.
Let me see if I'm right.
Fillrate = ROPs × clock
16 × 800MHz = 12.8 GPixel/s
32 × 800MHz = 25.6 GPixel/s
1920 * 1080 = 2073600 pixels per frame.
2,073,600 / 1,000,000,000 = 0.0020736 GPixel per frame.
For 60fps using only one render target of 1920 × 1080:
0.0020736 × 60 (fps) = 0.124416 GPixel/s
With 16 ROPs you can fill about 102 render targets of 1920×1080 at 60fps (12.8 / 0.124416 ≈ 102.9).
So yeah, 32 ROPs is probably overkill for 1080p gaming.
Edit: But 16 ROPs for Durango seems to fit well with 24fps 4K movies, and 32 ROPs would be useful for Sony if they are releasing 48fps 4K or 3D 4K TV sets.
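The arithmetic above can be re-run in a few lines. One correction from the original post: the divisor is 1,000,000,000 (pixels per GPixel), not 100,000,000. The ROP counts and 800MHz clock are the rumored figures, not confirmed specs.

```python
# Re-running the fillrate math from the post above, with rumored figures.

def fillrate_gpixels_per_s(rops, clock_mhz):
    """Peak pixel fillrate: ROPs x clock, one pixel per ROP per cycle."""
    return rops * clock_mhz * 1e6 / 1e9

durango = fillrate_gpixels_per_s(16, 800)   # 12.8 GPixel/s
ps4 = fillrate_gpixels_per_s(32, 800)       # 25.6 GPixel/s

pixels_1080p = 1920 * 1080                  # 2,073,600 pixels per frame
demand_1080p60 = pixels_1080p * 60 / 1e9    # ~0.1244 GPixel/s per target

print(durango / demand_1080p60)             # ~102.9 render targets' worth
print(ps4 / demand_1080p60)                 # ~205.8

# 4K video for comparison: even 48fps 4K needs only a small fraction.
demand_4k48 = 3840 * 2160 * 48 / 1e9        # ~0.398 GPixel/s
```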
So 50% more power in PS4 won't make any difference (1.23 -> 1.84), but a 50% increase from that to eg a 7950 (2.8) on PC will 'blow away' PS4?
PS4 GPU 1,840 GFLOPS - Xbox3 GPU 1,230 GFLOPS = 610 GFLOPS
Xbox 3 GPU 1,230 GFLOPS - Xbox360 GPU 240 GFLOPS = 990 GFLOPS
weren't the PS3 specs leaps and bounds better than the 360 on paper and we all know how that turned out? what makes the scenario different this time?
Nvidia gave a crappy GPU to the PS3, plus 256MB of RAM is weak.
What if Durango has 67% of PS4's pixel-processing power?

Ratio. Durango is ~500% of 360; PS4 (if the leaks are accurate) is ~150% of Durango. There's a big difference between those two things.
Split RAM. It was actually 512MB.
ROP count won't make a difference at the resolutions the next-gen consoles will be working at.
Just gonna quote an older post of mine.
So what will those extra ROPs do then?
They are obviously not just there for show, I'd wager, and we know they won't be used for 4K gaming either.
Having nearly double of them compared to Xbox 3 has to show a difference visually, right?
Better fillrate?
I don't know, but I heard something on B3D that there is no ROP count between 16 and 32. So you go from 16 ROPs to 32 ROPs. Maybe Sony is preparing the PS4 for HFR 4K movies.

This makes sense to me. It's also probably more effort than it would be worth to lower the ROP count from the base GPU.
Only half of that was video RAM. Sony put in the 256MB of system RAM themselves, not Nvidia.
Multi-screens, Remote play, VR goggles, etc.. I think.
*puts hype hat on*
So in theory the PS4 could easily support the Oculus Rift or an adaptation of it?
What are the chances that Sony has an Oculus Rift support agreement and is currently keeping it in the deepest pits of secrecy until E3?
*removes hype hat*
Originally Posted by Gemüsepizza
PS3 has 256MB GDDR3 RAM (like Xbox 360) and 256MB XDR RAM. You make it sound a little bit like one of them is slow.

I didn't say anything was slow. Just your goggles said so.
Even Durango can easily support Oculus Rift-type devices.
Do we know if voxel lighting is confirmed to be removed from Unreal Engine 4, and therefore won't be used for PS4 games?
Voxel lighting is the biggest addition to UE4, isn't it? Kinda lame if it's taken out. What else is UE4 bringing to the table?
There is no documentation that it's easy, either. Just the rumored architecture, which looks considerably more complex than the PS4's: tiling decompression units and multiple DMA move engines, all presumably to get around the external RAM bandwidth limitations. If those things are relatively automated, then great. But if not, and developers have to manually schedule when to move small chunks of memory around to get the most out of the eSRAM, then it could take a long time for that to be properly tapped, and many developers may never fully do so.

Sounds a lot like the PS3 and efficient SPE usage.
Well, it still has dynamic lighting and GI, which is a huge jump from UE3. I'm also hoping that the weird look of UE3's textures is gone.
I know what it does; I just wish people explained how it works without the mumbo jumbo. I mean, I can sift my way through some heavy programming-related documents, but I just could not follow that description you posted above, lol.
Weird how that works, huh?

It's all about the latencies, bruh.
& all while also ignoring the higher bandwidth memory.
Out of curiosity, do more ROPs allow higher FPS? I mean, they allow greater fill rate, which can affect resolution, but wouldn't they also help hold a constant 60fps?

ROP count won't make a difference at the resolutions the next-gen consoles will be working at. Just gonna quote an older post of mine.
I actually like Durango's architecture from the rumors we see; it sounds like a PS2 on steroids, somewhat. Very low latency eSRAM might make for some algorithms which will not work well on PCs and more PC-like architectures (like the GS's very fast bandwidth and ultra-low overhead for some operations which are expensive on just about any other GPU out there, thanks to its design), but the system will need more care from programmers than a system providing greater bandwidth with a single pool of memory. Both systems look fun to program for from the outside.

But there are much better solutions... The PS2 blew away the competition with that eDRAM: 48GB/s. The Xbox and GC couldn't match it; neither could the PS3 or 360. But you look at Durango (102GB/s from the eSRAM) and PS4 (172GB/s), and the PS4 is on top easily, without the need for eSRAM.
So the voxel lighting is removed, though? I'd be interested in reading the interview/article where this was said.

Yeah, I'm interested in this too.
Pertaining to ROPs, does the count affect the alpha blending (or is it alpha texturing?) capability of the GPU?