
Next-Gen PS5 & XSX |OT| Console tEch threaD


FranXico

Member
so do we think AMD's RT solution uses regular CUs that are basically reserved?

🤔 Mhh... that sounds to me like a way less sophisticated approach than what Nvidia is doing, though. Right?
It sounds reasonable, though. Were we expecting AMD to top Nvidia in ray tracing?
 

Dodkrake

Banned
Interesting take. The separation of regular CUs and RT CUs would link up nicely with another leak and would also help with BC. Is this something you've just inferred, or do you have actual knowledge? I.e., not "are these the PS5 specs" but "is this how we expect AMD to provide an RT solution"?

Thank you. I have no deep hardware knowledge, but from what I've read, AMD's implementation uses dedicated RT cores, freeing up the remaining CUs for actual non-RT work.
 

Dodkrake

Banned
Does that CU separation make sense? Isn't that a brute-force approach? It contradicts the rumors about a more elegant RT solution in the PS5.

IMO it would make sense, and it's elegant. Separate ray tracing units can take a lot of load off the main units. It's possible that the PS5 solution can also use the ray tracing units as regular CUs, but don't quote me, because I have no idea. I just connected the dots. Sorry mate.
 

FranXico

Member
IMO it would make sense, and it's elegant. Separate ray tracing units can take a lot of load off the main units. It's possible that the PS5 solution can also use the ray tracing units as regular CUs, but don't quote me, because I have no idea. I just connected the dots. Sorry mate.
I would hope that disabling RT would allow the "reserved" CUs to be used as regular ones.

But that being the case, both consoles would be using this approach.
 

Rudius

Member
I tremble thinking of the power this maniac is packing!
 

01011001

Banned
I would hope that disabling RT would allow the "reserved" CUs to be used as regular ones.

But that being the case, both consoles would be using this approach.

I already find the idea a bit odd but if they couldn't use them as regular CUs it would be a pretty awful use of hardware.

so yeah
 
so do we think AMD's RT solution uses regular CUs that are basically reserved?

🤔 Mhh... that sounds to me like a way less sophisticated approach than what Nvidia is doing, though. Right?
because that makes it seem like there actually is no dedicated RT hardware, and AMD is literally just reserving resources for RT instead of giving their GPUs additional hardware to offload the workload

If I remember right, AMD had a patent showing their RT could be built into their TMUs.
Still, it will be interesting to see if they did it that way for flexibility.
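For a sense of what "RT in the TMU" would actually accelerate: the fixed-function part of such a design is essentially the ray/box and ray/triangle intersection tests performed while walking a BVH, with the shader cores still driving the traversal. A minimal Python sketch of the standard slab-method ray/box test that kind of unit would implement in silicon (purely illustrative, not AMD's actual design):

```python
# Toy sketch of the ray/box "slab" test a fixed-function BVH
# intersection unit would evaluate per node. Illustrative only,
# not AMD's actual design; all names here are made up.

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """Return True if the ray hits the axis-aligned box.
    inv_dir is 1/direction per axis, precomputed once per ray."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        t1 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t2 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far

# One ray against one BVH node; the hardware would batch several
# of these tests per clock while the shader handles traversal.
origin = (0.0, 0.0, 0.0)
direction = (1.0, 1.0, 1.0)
inv_dir = tuple(1.0 / d for d in direction)
print(ray_aabb_hit(origin, inv_dir, (2.0, 2.0, 2.0), (3.0, 3.0, 3.0)))  # True
```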
 

Dodkrake

Banned
I would hope that disabling RT would allow the "reserved" CUs to be used as regular ones.

But that being the case, both consoles would be using this approach.

This is a good thread on it (hope I can quote this page): https://linustechtips.com/main/topic/1081807-amd-raytracing/?tab=comments#comment-12716213

NVIDIA is using RT cores and tensor cores to do RTX. While RT cores are somewhat focused on ray tracing, they are not exclusive to this operation. You can technically program them to do other things too, as far as I understand their tech, and they can change and adapt RT computation if changes are needed via drivers (someone correct me if I'm wrong). AMD is planning on using HW RT acceleration (based on the little leaks that we have got so far): a dedicated ray tracing computation GPU unit that only does ray tracing and nothing else. In a nutshell, that's really all most gamers crave anyway, and it may in the end be faster because it's such a focused feature at the hardware level. It might be a downside if DXR changes in the meantime and fixed function can't adapt to it in any way, but I guess that's the tax you pay for using fixed-function units in a GPU. You can imagine this like DirectX 7, which had fixed-function rendering, versus DirectX 8, which brought pixel shaders that were programmable and allowed more things to be done visually without having to add a dedicated pixel processing unit to the GPU. Something along those lines.
 
I see it the same way you do, except MS is doing it right. IMO the right way would be for Lockhart to play games just as well as the XSX, only at 1080p instead of 4K, at most with a few reduced graphics options. Then I see absolutely no danger to the other consoles.
Ultimately, it wouldn't be much of a difference compared to a PC (high specs vs. high-end specs).
This is exactly the scenario.

I mean look at this:
 

Dodkrake

Banned
I already find the idea a bit odd but if they couldn't use them as regular CUs it would be a pretty awful use of hardware.

so yeah

Not necessarily. You can use an HDD or SSD as virtual memory, but it's an awful use of hardware because it's not even remotely comparable to actual RAM. Sometimes dedicated hardware for designated tasks, if the APIs for programming it are good, is a much better use of real estate. I can still see some sort of hybrid solution, but hardware is NOT my area of expertise.
 

Dodkrake

Banned
wait, so this would mean that it's actually the other way around? that Nvidia could technically use their RT cores for other things but the way AMD is believed to do it wouldn't allow it?

or am I misunderstanding something?

I'm the one that misunderstood, not you!
 

ethomaz

Banned

Not all of them, though. This guy sounds triggered by the placeholder numbers on that other site.
Like when he said the PS5 devkits used Jaguar cores :messenger_tears_of_joy: :messenger_tears_of_joy:

BTW, he said BS to that site:

not to the placeholder that he knows is a placeholder, updated to RDNA2.
 

01011001

Banned
I'm the one that misunderstood, not you!

well no matter who misunderstands what,
this whole thing would open up a whole new can of worms.
lol

I hope Microsoft will go into detail in their GDC talks on how the RT hardware acceleration works on AMD GPUs
 

SlimySnake

Flashless at the Golden Globes
Lockhart should be fine as long as they don't completely gimp the CPU and SSD. How about people actually wait for MS to confirm the specs before casting doom and gloom spells?



It's a placeholder, don't read too much into it.

Besides, 60 CUs @ 2.0 GHz would mean a 15.3 TF console; even SlimySnake is not that optimistic :messenger_tears_of_joy:
i think sony might be going with an even bigger chip. they have always gone wide and slow, and i think they will do that again.

i also think the cooling solution is expensive because there are two of them. the v-shaped devkit could have two heatsinks, one on either side, cooling either two separate chips or one giant one. it doesn't HAVE to mean they are pushing high clocks.
 

ethomaz

Banned
how do you know they didn't? do you have a dev kit? did you see all the details for every one of them?

tell us oh all knowledgeable
Zen launched in early 2017.
The first devkits were using dedicated units... Zen + Vega fits, and it is totally believable if he said so.
Jaguar cores? :messenger_tears_of_joy: :messenger_tears_of_joy: :messenger_tears_of_joy:
 

ethomaz

Banned
so you know that, meaning you have insider knowledge?
is that what you're saying?
Are you going to start again, like with the GitHub leak that was proved false?
Defending bullshit claims that make no technological sense is really a hard job.

Chance of them being Jaguar cores from 2013 = 0%

You need to really want to make the PS5 look bad to claim something like that.
Way more than GitHub.
 

alex_m

Neo Member
That's revenue alone; I can't find any info about actual profits, but if we assume they make a 10% margin on that, we can estimate $1bn (Sony's margin on gaming and related services is closer to 16%). I think they have separated some of that stuff out recently, isn't that right?

Paul Thurrott has the following to say about Xbox:
Microsoft’s Xbox business continues to be a bottomless pit of worry, with no profits over the years and falling revenues in the past two years because we’re between major product releases. But the bigger issue, of course, is that the hardware side of Xbox has always underperformed. And while the subscription services part of that business that is its future—Game Pass, someday xCloud—is finally starting to see some growth, it’s still a tiny business overall.

see: https://www.petri.com/paul-thurrotts-short-takes-microsoft-earnings-special-edition-12
 
Here's the thing I don't see mentioned: if, as Odium claims, the system is 11.6 TF, they aren't going to reveal it as 11.6. They'll say 12 TF, same as the Xbox Series X.

And outside of DF delving really heavily into it to show that variance, no one will notice. And that's assuming that by final kits and clocks around May/June, they stick with 11.6 and don't clock up to 12 or 12.4 or whatever.
 

Dodkrake

Banned
If what was mentioned in the "coo" images is correct, this bird flies at 10.6 stock stable and can go up to 11.6 when pushed, but it will get sick. To me this means the base PS5 will be a minimum of 11.6 TF at 1750 MHz. A slight overclock to 1800 MHz would put it at a sharp 12.
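For reference, every TF figure in this back-and-forth comes from the same formula: FP32 FLOPS = CUs × 64 ALUs × 2 ops per clock (one FMA), i.e. CUs × 128 × clock. A quick sanity check of the numbers above in Python; the CU counts and clocks are the thread's rumors, not confirmed specs:

```python
# FP32 throughput: CUs * 64 ALUs * 2 FLOPs/clock (FMA) = CUs * 128 * clock.
# The CU counts and clocks below are rumors from this thread, not specs.
def teraflops(cus, clock_mhz):
    return cus * 128 * clock_mhz * 1e6 / 1e12

print(teraflops(52, 1750))  # ~11.65 TF (the "minimum of 11.6")
print(teraflops(52, 1800))  # ~11.98 TF (the "sharp 12")
print(teraflops(60, 2000))  # ~15.36 TF (the 15.3 TF hypothetical earlier)
```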
 

01011001

Banned
Are you going to start again, like with the GitHub leak that was proved false?
Defending bullshit claims that make no technological sense is really a hard job.

Chance of them being Jaguar cores from 2013 = 0%

so you are saying, with zero info and no insider knowledge, that it is 100% impossible that early PS5 devkits used Jaguar cores.

People who speak with certainty about things they have absolutely no knowledge of are the worst, and you're even worse, because you're judging someone else while being certain of something you literally cannot be certain about: you have absolutely no idea what goes on behind the scenes at Sony, or what they did during early development of their devkits.
 

Neo Blaster

Member
I strongly suspect that both will disappear relatively quickly. I see the same with PS4 and PS4 Pro.
One S/X definitely, if Lockhart is a thing. PS4 Pro maybe, because it won't be the premium experience anymore. But the PS4 is an extremely successful console; that, plus all the extended time Sony gives its consoles after a new gen arrives, will keep it on the market for quite a while.
 
If what was mentioned in the "coo" images is correct, this bird flies at 10.6 stock stable and can go up to 11.6 when pushed, but it will get sick. To me this means the base PS5 will be a minimum of 11.6 TF at 1750 MHz. A slight overclock to 1800 MHz would put it at a sharp 12.
did you see his analysis:

No. o'dium said the current devkit is at 11.6.

There are 52 coos (52 CUs).

On the second line, there are 4 groups of coos: a group of 1 coo, then 7, 4, and 3, giving 1743 MHz.

52 * 128 * 1743 MHz = 11.601 TFLOPS, just above 11.6, which can't be a coincidence.

16 chips of GDDR6 at 512 GB/s (hello, Oberon A bandwidth!)

OK, so they intend to overclock by 100-150 MHz, giving about 12.2-12.6 TFLOPS, but still a pigeon... still a devkit?

this one also seems reasonable
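The decode is easy to verify: reading the four groups of coos as the digits 1, 7, 4, 3 gives 1743 MHz, and 52 CUs at that clock lands just above 11.6 TF, with the suggested 100-150 MHz overclock falling in the quoted 12.2-12.6 range. A sketch of that arithmetic, assuming the 52-coo reading is right:

```python
# Decode per the post above: groups of 1, 7, 4 and 3 "coos" -> 1743 MHz.
groups = [1, 7, 4, 3]
clock_mhz = int("".join(str(g) for g in groups))  # 1743

cus = 52  # one coo per CU, per the reading above (a rumor, not a spec)
tf = cus * 128 * clock_mhz * 1e6 / 1e12
print(clock_mhz, round(tf, 3))  # 1743 11.601

# The suggested 100-150 MHz overclock range:
for oc in (100, 150):
    print(round(cus * 128 * (clock_mhz + oc) * 1e6 / 1e12, 2))  # 12.27, 12.6
```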
 

ethomaz

Banned
so you are saying, with zero info and no insider knowledge, that it is 100% impossible that early PS5 devkits used Jaguar cores.

People who speak with certainty about things they have absolutely no knowledge of are the worst, and you're even worse, because you're judging someone else while being certain of something you literally cannot be certain about: you have absolutely no idea what goes on behind the scenes at Sony, or what they did during early development of their devkits.
Some things are impossible.
Others are not.

This is one of them.

He spread bullshit and is now being laughed at for it... that's fair.

I can't help you if you believe any PS5 devkit used a 2013 CPU that isn't compatible with the final target specs instead of actual Zen parts that had been ready for some time.
 

TBiddy

Member
What is not possible is not possible lol

Xbox fans are really weird... they believe in made up claims lol

It's certainly possible to use a Jaguar CPU in a devkit. What makes you think otherwise? Do you have access to a devkit? What are you basing your "it's not possible" on?
 