
Next-Gen PS5 & XSX |OT| Console tEch threaD


FeiRR

Banned
Okay, so it is going to be a revolution. No prisoners, no compromises. They're going all in.

 

ZywyPL

Banned
Do you really think the XSX can hold its CPU and GPU load at the same time? No, it won't.

So you're implying PS4 and XB1 also didn't hold their clocks? I mean, why would they, under heavy load? No way! Especially the Pro, which screams like a Jumbo Jet on takeoff; maybe it downclocks itself to just 2.5TF, hence can't reach anything above 1440-1620p? Makes sense!... Jesus Christ, I see the XSX coming out stronger is a really tough pill to swallow for some.

New controller looks pretty sweet.


This looks like a mix of XB controller shape with DS symmetrical analog sticks, a.k.a. the best possible scenario. Fricking fantastic, hype intensifies!
 

geordiemp

Member
I just heard from a journalist who writes for Windows Central that the controller is so heavy that after a prolonged session, play-testers are having trouble moving their fingers. Also, that light bar you see? It's dissipating a lot of heat and the controller can get really hot. Also, battery is worse than DS4. Sony are so worried about this that they are undergoing a significant revision of the controller. Watch this space.

Best post of the thread... you beat the Discord gang to it lol

You should write it as a Colbert poem... /s
 

SlimySnake

Flashless at the Golden Globes
I just heard from a journalist who writes for Windows Central that the controller is so heavy that after a prolonged session, play-testers are having trouble moving their fingers. Also, that light bar you see? It's dissipating a lot of heat and the controller can get really hot. Also, battery is worse than DS4. Sony are so worried about this that they are undergoing a significant revision of the controller. Watch this space.
LMAO.
 

Three Jackdaws

Unconfirmed Member
I would have liked the button symbols to be coloured but it still looks great.
 

SLoWMoTIoN

Unconfirmed Member
No. They're clear, so they look that way. If you check out the higher quality images on the blog, you can see that they are definitely raised.
So it is trying to be a PS Vita?

You know the Vita?


No?


:(
 

psorcerer

Banned
Can you elaborate on that? Do you mean bandwidth-wise or core utilization?
There's plenty that can be done to tax those cores

Utilizing CPU cores doesn't need a lot of bandwidth.
If the data doesn't fit into cache, the CPU becomes ineffective. In fact, it becomes a very narrow and slow GPU.
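Rough sketch of my own to illustrate (hypothetical example, not from any SDK): the arithmetic is identical in both functions, only the memory access pattern differs, and that alone decides whether the core computes or sits waiting on RAM.

#include <cstddef>
#include <vector>

// Contiguous data: the hardware prefetcher streams it through cache,
// so the core spends its time computing.
double sum_contiguous(const std::vector<double>& values) {
    double total = 0.0;
    for (double v : values) total += v;
    return total;
}

// The same sum reached through a scattered index order: nearly every access
// can miss cache, and the core mostly waits on memory instead.
double sum_scattered(const std::vector<double>& values,
                     const std::vector<std::size_t>& shuffled_order) {
    double total = 0.0;
    for (std::size_t i : shuffled_order) total += values[i];
    return total;
}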
 

psorcerer

Banned
I would expect CPU usage in next-gen games to be vastly different from current-gen games. Once the baseline is a 16-thread CPU with solid IPC and clocks, developers will find ways to keep it busy: smarter AI, more complex physics simulations, better animation systems or even advanced audio effects.

That's just words. Need numbers.

CPUs are not only used for “bad coding”; they are used for branchy code where parallelisation is not effective. CPU code can be as optimized as GPU code; otherwise there would be no need for SIMD/AVX in the first place.

There is no need for SSE/AVX; it was a temporary solution until GPUs got fully programmable compute.
If you have very branchy code, it's "bad code" automatically. It will not run fast on either the CPU or the GPU.
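To make the branchy part concrete, a small sketch of my own (hypothetical, just to illustrate): the data-dependent if() in the first loop stalls a CPU pipeline and forces divergent lanes on a GPU, while the second version can be turned into predicated/SIMD code by the compiler.

#include <vector>

// Branchy version: the unpredictable if() defeats auto-vectorization and
// mispredicts constantly when the data has no pattern.
float tally_branchy(const std::vector<float>& xs, float threshold) {
    float acc = 0.0f;
    for (float x : xs) {
        if (x > threshold)        // data-dependent branch
            acc += x * 2.0f;
        else
            acc -= x;
    }
    return acc;
}

// Branchless version: the select maps to predicated/SIMD-friendly code,
// so the same work runs well on a CPU's vector units or on a GPU.
float tally_branchless(const std::vector<float>& xs, float threshold) {
    float acc = 0.0f;
    for (float x : xs) {
        float hit = (x > threshold) ? 1.0f : 0.0f;
        acc += hit * (x * 2.0f) - (1.0f - hit) * x;
    }
    return acc;
}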
 
Locking the GPU into 10GB looks like a bad idea to me. You will need to copy memory from CPU to GPU all the time then. Loading from SSD? Now let's waste more bandwidth and latency copying it over.

The GPU can always make normal resource requests through the CPU. It just has direct-to-SSD access (like a framebuffer) as well as direct access to the 10GB of VRAM over the 320-bit bus. I myself wish it was wide across all 16GB of data, but MS said that the memory timings were off, so this was the best solution.
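For what it's worth, the split falls straight out of the bus width. Back-of-the-envelope, assuming the announced 14Gbps GDDR6:

320 bits x 14Gbps / 8 = 560GB/s for the 10GB "GPU optimal" pool
192 bits x 14Gbps / 8 = 336GB/s for the remaining 6GB

So the uneven chip sizes buy the full 560GB/s for the 10GB the GPU actually renders from, at the cost of a slower 6GB slice.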
 

psorcerer

Banned
The GPU can always make normal resource requests through the CPU. It just has direct-to-SSD access (like a framebuffer) as well as direct access to the 10GB of VRAM over the 320-bit bus. I myself wish it was wide across all 16GB of data, but MS said that the memory timings were off, so this was the best solution.

I'm not sure the GPU can make any resource requests in DX12U. AFAIR, the CPU feeds command buffer tokens to the GPU and binds all the resources.
It probably means that loading a new texture mip goes SSD->RAM->CPU->RAM->GPU. I.e. say goodbye to any GPU-originated work creation...
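Roughly what that round trip looks like in plain D3D12 today. A minimal sketch of my own, assuming the stock d3dx12.h helpers; UploadMipFromRam and its parameters are placeholder names, not anything from the XSX SDK:

#include <d3d12.h>
#include <wrl/client.h>
#include "d3dx12.h"   // CD3DX12_* helpers + UpdateSubresources

using Microsoft::WRL::ComPtr;

// The CPU stages the mip in an upload heap and records the copy itself;
// the GPU only executes what the CPU has already scheduled and bound.
void UploadMipFromRam(ID3D12Device* device, ID3D12GraphicsCommandList* cmdList,
                      ID3D12Resource* texture, UINT mipLevel,
                      D3D12_SUBRESOURCE_DATA* mipData, UINT64 uploadSize,
                      ComPtr<ID3D12Resource>& staging)  // keep alive until the copy finishes
{
    // CPU-visible staging buffer in an upload heap.
    CD3DX12_HEAP_PROPERTIES uploadHeap(D3D12_HEAP_TYPE_UPLOAD);
    CD3DX12_RESOURCE_DESC bufDesc = CD3DX12_RESOURCE_DESC::Buffer(uploadSize);
    device->CreateCommittedResource(&uploadHeap, D3D12_HEAP_FLAG_NONE, &bufDesc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ, nullptr,
                                    IID_PPV_ARGS(&staging));

    // SSD -> system RAM happened earlier (mipData points at it). This maps the
    // staging buffer, memcpys on the CPU, then records the GPU-side copy.
    UpdateSubresources(cmdList, texture, staging.Get(), 0, mipLevel, 1, mipData);
}

The GPU never originates any of this; it just executes the copy the CPU already staged and recorded, which is exactly the GPU-originated-work problem.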

MS said that the memory timings were off

Oh, I think I understand the problem.
XBSX doesn't have coherency support for the GPU.
They cannot load data in place while the GPU caches are full.
I think it will impact multiplatform games a lot.
Same situation as today: PS4 has coherency support but nobody uses it because the XOne doesn't...
Say goodbye to loading things mid-frame... :messenger_crying:

P.S. You know what, that's what happens on PC. You cannot avoid the CPU round trip.
That eases support of DX12U games on PC for MSFT...
 

Turk1993

GAFs #1 source for car graphic comparisons
I like it and it looks really comfortable, and did they just confirm a white console? :) PLZ Sony, I need my white console :messenger_grinning_sweat:
 

Evilms

Banned
  1. 270k - PS5 DualSense controller design reveal
  2. 214k - PlayStation : PS5 in holiday 2020
  3. 147k - Santa Monica : Kratos gif
  4. 129k - Ape Escape : 20th year anniversary
  5. 125k - PlayStation : PS5 details reveal announcement (Cerny talk)
  6. 105k - Naughty Dog : TLOU Part II reveal (PSX 2016)
  7. 103k - Neil Druckmann : TLOU Part II shooting finished
  8. 96k - PlayStation Japan : PS5 in holiday 2020
  9. 90k - PlayStation Japan : Valentine's Day 2019
  10. 87k - HBO : TLOU Series announcement


Sony/PlayStation is all-powerful!
 

geordiemp

Member
I'm not sure the GPU can make any resource requests in DX12U. AFAIR, the CPU feeds command buffer tokens to the GPU and binds all the resources.
It probably means that loading a new texture mip goes SSD->RAM->CPU->RAM->GPU. I.e. say goodbye to any GPU-originated work creation...



Oh, I think I understand the problem.
XBSX doesn't have coherency support for the GPU.
They cannot load data in place while the GPU caches are full.
I think it will impact multiplatform games a lot.
Same situation as today: PS4 has coherency support but nobody uses it because the XOne doesn't...
Say goodbye to loading things mid-frame... :messenger_crying:

What about the cache scrubber?
 
The way I read it was that the GPU has two operational request flows. The first one is as you wrote, and the second was SSD -> GPU cache, bypassing both the CPU and the PCIe bus. ALL game data, including primitives, textures, etc., can be stored in the SSD virtual memory pool and accessed at will by the GPU. I don't have enough insight as to whether the GPU copies to VRAM from that pool or directly into its caches.

I'm sure a future architectural reveal will go into this. Gamers Nexus has a breakdown of the process as seen in the Radeon Pro SSG implementation, but whether that implementation looks like the one employed in XVA is still unknown. They sound very similar, however.
 

psorcerer

Banned
What about the cache scrubber?

Only on PS5 AFAIK?


The way I read it was that the GPU has two operational request flows. The first one is as you wrote, and the second was SSD -> GPU cache, bypassing both the CPU and the PCIe bus. ALL game data, including primitives, textures, etc., can be stored in the SSD virtual memory pool and accessed at will by the GPU. I don't have enough insight as to whether the GPU copies to VRAM from that pool or directly into its caches.

Seems unrealistic to me. They announced DX12U, SFS and DirectStorage for PC, and you cannot do that on PC at all.
If the XSX uses exactly the same DX12U API as the PC (we don't know yet), the GPU cannot issue any work at all. The entire pipeline is managed by the CPU only.
 

Lunatic_Gamer

Gold Member
“Instead of finishing the PlayStation 4 version first, the developer is prioritizing the PlayStation 5 version right now. The PlayStation 4 version will be ported from the final PlayStation 5 version. Hence, whenever the game launches, it will be next-generation ready, a goal that TeamKill Media is taking really seriously.”

 