
IGN rumour: PS4 to have '2 GPUs' - one APU based + one discrete

KageMaru

Member
So is it now agreed that:

1) The PS4 will have 3D wafer stacking
2) 3D stacked Memory
3) 2014 Full HSA with Fabric computing
4) Mix of wafers with different process and die nodes down to 20nm
5) Multiple changes in the OS to support efficiencies like CPUs prefetching for the GP GPU.

I am definitely not agreeing to anything here. You guys can daydream about these specs all you want, but I'd much rather not set myself up for disappointment.

lol who said 20x.

10x is what people are talking about.

I was referring to this:

OR nearly 20X speed increase (2 generations but later generations are not scaling linearly)
 

Globox_82

Banned
Is no one reading the posts?!

Yes we are, but it is mostly gibberish to me, so I give up after a few sentences.

Like this one:

"So is it now agreed that:

1) The PS4 will have 3D wafer stacking
2) 3D stacked Memory
3) 2014 Full HSA with Fabric computing
4) Mix of wafers with different process and die nodes down to 20nm "process-optimized"
5) Multiple changes in the OS to support efficiencies like CPUs prefetching for the GP GPU. "


I have no freaking clue what any of those mean, in terms of making games. That's why I earlier asked for a simpleton explanation. Do you like the potential direction? Will it deliver what most people expect, 8-10x more power? Etc. generic questions.
 

deadlast

Member
Yes we are, but it is mostly gibberish to me, so I give up after a few sentences.

Like this one:

"So is it now agreed that:

1) The PS4 will have 3D wafer stacking
2) 3D stacked Memory
3) 2014 Full HSA with Fabric computing
4) Mix of wafers with different process and die nodes down to 20nm "process-optimized"
5) Multiple changes in the OS to support efficiencies like CPUs prefetching for the GP GPU. "


I have no freaking clue what any of those mean, in terms of making games. That's why I earlier asked for a simpleton explanation...

I don't understand half of that either, and neither does 90+% of gamers. When it comes to any of the rumors, I take what Jeff has said and try to understand it the best I can.

Now for a really stupid question. What is the advantage of 3D over 2.5D mem stacking, from a straight performance standpoint? The best info I could find says that 3D is active and 2.5D is passive.
 
Yes we are, but it is mostly gibberish to me, so I give up after a few sentences.

Like this one:

"So is it now agreed that:

1) The PS4 will have 3D wafer stacking
2) 3D stacked Memory
3) 2014 Full HSA with Fabric computing
4) Mix of wafers with different process and die nodes down to 20nm "process-optimized"
5) Multiple changes in the OS to support efficiencies like CPUs prefetching for the GP GPU. "


I have no freaking clue what any of those mean, in terms of making games. That's why I earlier asked for a simpleton explanation. Do you like the potential direction? Will it deliver what most people expect, 8-10x more power? Etc. generic questions.
Yes I'll be happy with the PS4 and I expect 6-10X the power. See: http://www.neogaf.com/forum/showthread.php?p=37538289#post37538289 for what can fit into a 200w power budget. With games using HSA and more CPU, graphics should be even better.
 

i-Lo

Member
Thanks Jeff.

I don't think we'd be discussing Jaguar CPUs if the PS4 had a 200-250w power budget

It makes me wonder too. Why limit the wattage to under 200 when first few gen PS3s had a supply of 380W albeit using around 210W at its peak? While I understand that greater power means more heat generation (& lessened reliability) with what is being said about next gen design, it seems like more evolved tech can be crammed in with an upper limit of 250W.

Also, with MS's recent sales strategy of subscription, I am beginning to wonder if the PS4 can afford to cost just north of US $500 upfront without any adverse media reaction if it allowed for a similar model of say $299 initial deposit (about US $300 less than what PS3 cost in the beginning) and a 2 year contract of $15/month. This would enable PS4 to have healthy sales without the need to expedite price drops. This could mean better hardware, faster proliferation (compared to PS3) and more software (due to rapidly increasing install base).
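For what it's worth, the hypothetical subscription numbers above pencil out like this (all figures are the post's own illustrative assumptions, not real pricing):

```python
# Hypothetical plan: $299 deposit + $15/month on a 2-year contract,
# compared against a $500 upfront console price.
deposit, monthly, months = 299, 15, 24
total_paid = deposit + monthly * months
upfront = 500
print(total_paid)            # 659: total over the contract
print(total_paid - upfront)  # 159: premium vs. paying upfront
```

So the subscriber pays about $159 more over two years, which is how the console maker would recoup the upfront discount.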
 

Proelite

Member
Thanks Jeff.



It makes me wonder too. Why limit the wattage to under 200 when first few gen PS3s had a supply of 380W albeit using around 210W at its peak? While I understand that greater power means more heat generation (& lessened reliability) with what is being said about next gen design, it seems like more evolved tech can be crammed in with an upper limit of 250W.

Also, with MS's recent sales strategy of subscription, I am beginning to wonder if the PS4 can afford to cost just north of US $500 upfront without any adverse media reaction if it allowed for a similar model of say $299 initial deposit (about US $300 less than what PS3 cost in the beginning) and a 2 year contract of $15/month. This would enable PS4 to have healthy sales without the need to expedite price drops. This could mean better hardware, faster proliferation (compared to PS3) and more software (due to rapidly increasing install base).

I feel like it's a harder sell for Sony since they don't have the clout of Live to tie in with the monthly payments. How many Playstation Plus subscribers are there in total? If it's a double digit percentage of the total PSN userbase, they can compete with Live.
 

THE:MILKMAN

Member
Thanks Jeff.



It makes me wonder too. Why limit the wattage to under 200 when first few gen PS3s had a supply of 380W albeit using around 210W at its peak? While I understand that greater power means more heat generation (& lessened reliability) with what is being said about next gen design, it seems like more evolved tech can be crammed in with an upper limit of 250W.

Also, with MS's recent sales strategy of subscription, I am beginning to wonder if the PS4 can afford to cost just north of US $500 upfront without any adverse media reaction if it allowed for a similar model of say $299 initial deposit (about US $300 less than what PS3 cost in the beginning) and a 2 year contract of $15/month. This would enable PS4 to have healthy sales without the need to expedite price drops. This could mean better hardware, faster proliferation (compared to PS3) and more software (due to rapidly increasing install base).

To the bold. I think Sony and Nintendo tend to do this. PSUs are most efficient at 50-60% load.
 

StevieP

Banned
It makes me wonder too. Why limit the wattage to under 200 when first few gen PS3s had a supply of 380W albeit using around 210W at its peak? While I understand that greater power means more heat generation (& lessened reliability) with what is being said about next gen design, it seems like more evolved tech can be crammed in with an upper limit of 250W.

Like I said, I don't think we'd have anyone mentioning a Jaguar CPU if we were talking about a power budget that high. We'd be back to talking about Piledriver/Steamroller.

To the bold. I think Sony and Nintendo tend to do this. PSUs are most efficient at 50-60% load.

Well, cheap power supplies, which all the consoles use, of course. 60-70% load is the highest you want to take those. A better-produced power supply can go as high as 80% for nominal efficiency, but I don't think we should be looking at those in terms of consoles.
 

i-Lo

Member
I feel like it's a harder sell for Sony since they don't have the clout of Live to tie in with the monthly payments.

I am confused. Why can't they do the same through PSN?

To the bold. I think Sony and Nintendo tend to do this. PSUs are most efficient at 50-60% load.

Except, 360's PSU ran at over 80% of its capacity. So I think it's possible to safely provide a power supply of say 350W this time around while enabling the PS4 to peak at around the same wattage.
 

THE:MILKMAN

Member
Like I said, I don't think we'd have anyone mentioning a Jaguar CPU if we were talking about a power budget that high. We'd be back to talking about Piledriver/Steamroller.



Well, cheap power supplies, which all the consoles use, of course. 60-70% load is the highest you want to take those. A better-produced power supply can go as high as 80% for nominal efficiency, but I don't think we should be looking at those in terms of consoles.

Well I actually think the launch PS3 had a pretty good quality PSU. And the 360 had a 203W? power supply at launch and used ~190W at the wall!

Even the Wii has/had a 52W power brick.

I wouldn't risk running a PC that consumes say 390W at the wall on a 400W PSU......
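As a rough sanity check on the 50-60% sweet-spot claim, here are the load fractions implied by the wattages quoted in this exchange (treating wall draw as a crude proxy for PSU load, which ignores conversion losses):

```python
# Load fraction = measured draw / PSU rating, using the thread's figures.
def load_pct(draw_w, rating_w):
    return 100 * draw_w / rating_w

print(round(load_pct(210, 380)))  # launch PS3: ~55%, inside the sweet spot
print(round(load_pct(190, 203)))  # launch 360: ~94%, far above it
print(round(load_pct(390, 400)))  # the risky PC example: ~98%
```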
 

Proelite

Member
Yes I'll be happy with the PS4 and I expect 6-10X the power. See: http://www.neogaf.com/forum/showthread.php?p=37538289#post37538289 for what can fit into a 200w power budget. With games using HSA and more CPU, graphics should be even better.

People who want powerful console next generation should be very, very optimistic after reading that thread.

MS and Sony can build a much more efficient console in the same power envelope by using custom components and new architecture.

Coupled with the closed nature of a console, I wouldn't be surprised if good studios get 3-4 times the performance out of 200W that his rig did, meaning he would have to get a PC rated in the 7.5-10 teraflop range to be competitive.
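The arithmetic implied here, with the 2.5-teraflop rig figure being a back-calculation from the 7.5-10 range rather than a measured spec:

```python
# If a closed console extracts 3-4x the performance per watt of an open PC,
# a competitive PC would need rig_tflops * factor of raw throughput.
rig_tflops = 2.5                  # back-calculated, not a real benchmark
for factor in (3, 4):
    print(rig_tflops * factor)    # 7.5 and 10.0
```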
 

i-Lo

Member
People who want powerful console next generation should be very, very optimistic after reading that thread.

MS and Sony can build a much more efficient console in the same power envelope by using custom components and new architecture.

Coupled with the closed nature of a console, I wouldn't be surprised if good studios get 3-4 times the performance out of 200W that his rig did

The physical dimensions are highly similar to PS3 and as such I think it's a great demonstration of what is possible with near future tech. Imagine how it'd be if X51 itself was a closed system.

The only big differentiator is the price. The X51 can cost over $1000 whereas it is a taboo for a console to exceed $500. Of course, as aforementioned, a contract subscription model could alleviate this issue to a large degree (depends on the method of implementation). Still at around half the price, it'll be an interesting challenge to provide similar performance with customized architecture.

Lastly, with the rumours of Xbox 3 being developed in Texas, which might as well be dev kits, it'll be interesting to see how long Sony can delay launching PS4 after XB3 (provided it is supposed to be out after XB3).

EDIT: I also saw the following TDP-

1. With GTX 555- 172W

2. With AMD 7850- 203W

And the second one is still less than what the first gen PS3 drew under full load. I am getting this fizzing sensation, as Captain Slow would say.
 

Proelite

Member
The physical dimensions are highly similar to PS3 and as such I think it's a great demonstration of what is possible with near future tech. Imagine how it'd be if X51 itself was a closed system.

The only big differentiator is the price. The X51 can cost over $1000 whereas it is a taboo for a console to exceed $500. Of course, as aforementioned, a contract subscription model could alleviate this issue to a large degree (depends on the method of implementation). Still at around half the price, it'll be an interesting challenge to provide similar performance with customized architecture.

Lastly, with the rumours of Xbox 3 being developed in Texas, which might as well be dev kits, it'll be interesting to see how long Sony can delay launching PS4 after XB3 (provided it is supposed to be out after XB3).

Alienware, Intel, AMD, etc. all make a killing on the X51. The BOM for MS / Sony would be much less, in the area of 25%-33%.
 

UltimaKilo

Gold Member
From everything I've read, it seems that the next generation is going to be a smaller leap than was originally expected (hopefully there will still be 4GB of RAM). But with diminishing returns with graphics, why is that such a bad thing? Furthermore, why would Sony, who hasn't yet made a profit on PS3 and is currently under a restructuring in a very weak economy, not hold off on releasing a console until 2014? Seems ridiculous and too risky.
 

James Sawyer Ford

Gold Member
I don't believe we're anywhere close to the point of diminishing graphical returns.

Just compare any CG movie to games, there's a huge gap.

Having to live with a gimped system for what will probably be the next decade will be totally disappointing. Which is why if the rumors about Wii U are true, it's very hard for me to get excited about that system at all.
 
 
What do you consider multiple CPUs; 2 X86 + 1 GPGPU + GPU that may be GPGPU + DSP (some part of AMDs I/O) + FPGA?

Game developers do not have the final hardware, they have what's available now that has similar performance. GDDR5 is the min memory target that's available now and may not be the final memory seen in the PS4. Game developers don't need to know about the 10 year plan for the PS4 and the OS changes and new SDKs that are going to be developed. Just like the PS3 but most likely with fewer delays, the OS will evolve. It will parallel PC development this time as AMD and possibly Microsoft will be supporting it.

I'm reading a lot into this and it should be considered my opinion at this time.

When I think multiple CPUs, I think of it having at least 2 x86 or 2 PPC CPUs like we used to see before multi-core CPUs caught on, or what we still see in servers. That's why I was confused because I didn't see anything that said something along those lines.

And hey even if we have actual target numbers, we're still speculating on them in the end since things can always change.

As for the 6x-10x range you mention, 10x is supposedly what the PS4 will end up being so it will be interesting/fun to see the path they take to get there.
 

unomas

Banned
I'm excited for next gen but I'm not. I just want the graphical leap to be significant enough where it's noticeable day 1, and also so that it doesn't break my wallet. Come on Sony and MS, do some magic! I'm not including Wii U in next gen unless there is a lot more power under the hood than what it's been made out to be.
 
I don't believe we're anywhere close to the point of diminishing graphical returns.

Just compare any CG movie to games, there's a huge gap.

totally agree! in fact forget CG movies; turn off the computer or whatever it is you game on and step outside for a moment and look around. now go back to your gaming system.

can you not tell the difference? if the ultimate goal of Computer Graphics is to simulate reality, we have a loooooooooooong way to go!

once something like the holodeck from Star Trek is available you could make a cogent argument that Computer Graphics are "good enough." maybe we'll reach that stage in say 20 years? 30 years? 40 years? etc.
 

Boss Man

Member
How is a 2.5D stack structured? That doesn't make any sense to me...

Very interesting info btw. I could be seeing this entirely wrong, but from what I'm reading it seems like fabric computing is a pretty big deal and beyond the scope of gaming.
 
I'm excited for next gen but I'm not. I just want the graphical leap to be significant enough where it's noticeable day 1, and also so that it doesn't break my wallet. Come on Sony and MS, do some magic! I'm not including Wii U in next gen unless there is a lot more power under the hood than what it's been made out to be.

If you are saying that based on what I think you are, then I would have to ask your definition of "a lot more power".
 
How is a 2.5D stack structured? That doesn't make any sense to me...

2.5D, 3D and 3D wafer stacking all provide nearly the same benefits in reduced wire lengths.

1) 2.5D is a standard chip with the silicon vertically connected to another chip using an interposer composed of TSVs. Generally only one layer or package on top of the interposer. More than 10-year-old technology, seen in handhelds, and expensive.

2) 3D is same-type silicon, i.e. memory or FPGA or GPU, that can be stacked atop each other in multiple layers using TSVs. It also covers a silicon chip connected directly to another chip, with bumps and traces pre-manufactured into the two silicon blocks so it doesn't need an interposer. 3-4 years old; less expensive in volume than 2.5D.

3) 3D wafer stacking is silicon wafers designed to standards so they mate properly and have TSVs through them to allow multiple different types of chips to be stacked atop each other in multiple layers. 3D wafer stacking is the last phase and, with volume, has several aspects that allow chips to be less expensive. It's coming on-line in 2013.

Memory for instance can be manufactured as 3D stacks and connected to a CPU using an interposer. That would be considered 2.5D. You can have a combination of 2.5D and 3D stacking.

AMD’s future belongs to partitioning of functions among chips that are process-optimized for the function (CPU, Cache, DRAM, GPU, analog, SSD) and then assembled as 3D or 2.5D stacks.
AMD is part of AMD-IBM-Samsung and designing silicon wafers to STANDARDS so that they can be 3D wafer stacked. This is the most economical method with game console volumes.

Very interesting info btw. I could be seeing this entirely wrong, but from what I'm reading it seems like fabric computing is a pretty big deal and beyond the scope of gaming.
On the PS3, with multiple CPUs of different types in the Cell, game developers had to manage memory and ensure the CPUs didn't step on each other requesting memory access; the application developer had to do tasks that should have been transparent and part of the OS. With a fabric computing memory model, the OS handles more of what it should be doing, and CPUs can talk to each other and pass pointers to memory, handing off finished data from a job one type of CPU was better at to another type of CPU that is better at the next step.

A fabric computing memory model can bring efficiencies as well as make the programmer's job easier. Yes, it should be part of any new computer, game console or consumer electronics networked ecosystem. This is going to have a HUGE impact in the next 10 years! If it were in the PS3 there would have been much happier game developers and fewer complaints about the Cell. Processes have evolved to catch up with Sony-Toshiba-IBM's vision of Cell's heterogeneous computing and distributed computing model (HSA & Fabric Memory Model).

Cell, and Cell-like current GPUs (multiple simple CPU elements inside a package), would need a redesign to take advantage of fabric computing, but individual CPUs just need enough cache and access to fast memory, as well as OS routines that support fabric computing.
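The hand-off described above can be caricatured in a few lines. This is a toy analogy only, not real HSA or fabric APIs; the names `cpu_produce` and `gpu_consume` are invented for illustration:

```python
# One shared address space: the producer writes data once, then passes only
# a handle (a "pointer"); the consumer reads in place with no copy step.
shared_memory = {}

def cpu_produce(buf_id, data):
    shared_memory[buf_id] = data   # write once into the shared pool
    return buf_id                  # hand off just the handle

def gpu_consume(buf_id):
    # operate directly on the shared buffer instead of a private copy
    return [x * 2 for x in shared_memory[buf_id]]

handle = cpu_produce("frame0", [1, 2, 3])
print(gpu_consume(handle))         # [2, 4, 6]
```

The point of the sketch: the expensive step (copying the data between private memory pools) disappears, which is the efficiency the fabric memory model is claimed to buy.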



http://mandetech.com/2012/01/10/sony-masaaki-tsuruta-interview/ Sony CTO interview on Playstation tech
http://semiaccurate.com/2012/03/02/sony-playstation-4-will-be-an-x86-cpu-with-an-amd-gpu/ Article on PS4 leaks parallels Sony CTO interview but adds AMD Fusion
http://www.gsaglobal.org/events/2012/0416/docs/3D_Panel.pdf Game Console SOCs shown using 3D stacked DDR3 (faster, cheaper & more energy efficient than XDR2) and 3D ultra-wide I/O memory (30 times (1Tbit/sec) faster than XDR2)
AMD supplied PDF showing features in APU chipset with MANY future game console related must have features
http://eda360insider.wordpress.com/2011/12/14/3d-week-driven-by-economics-its-now-one-minute-to-3d/ AMD planning 5 years for 3D stacking but not mentioning it.
http://www.eetimes.com/electronics-news/4235499/AMD-s-Macri-talks-Heterogeneous-systems-architecture Video on Fusion by AMD
http://www.neogaf.com/forum/showpost.php?p=37491994&postcount=1430 AMD 2014 fusion GPUs and adding 3rd party CPUs to AMD SOC
http://www.neogaf.com/forum/showpost.php?p=37459458&postcount=1398 Power wall and the reasons for Heterogeneous computing
http://architects.dzone.com/articles/heterogeneous-computing Heterogeneous computing again

Then my conclusion. Cell vision is like HSA + Fabric computing & very attractive to Sony. Everything in the Sony CTO interview about future Playstation tech has been touched on in the above cites.

Charlie at SemiAccurate has all but confirmed it. The only question in my mind: is the 2014 design (Full HSA and Fabric computing), including GPU changes, going to be ready by PS4 launch?

Going beyond game consoles with Jaguar Fusion and Fabric computing.
 

-viper-

Banned
Games don't have to simulate reality. Reality looks boring.

I just want a game that looks like a visually breathtaking experience. Look at Crysis 2. All the fancy shaders, effects, motion blur, lighting. Real life looks boring as fuck in comparison.

Gran Turismo 5 simulates real life, and look at how everybody calls the game sterile.

Real life is sterile.
 

i-Lo

Member
Games don't have to simulate reality. Reality looks boring.

I just want a game that looks like a visually breathtaking experience. Look at Crysis 2. All the fancy shaders, effects, motion blur, lighting. Real life looks boring as fuck in comparison.

Gran Turismo 5 simulates real life, and look at how everybody calls the game sterile.

Real life is sterile.

Reality is the end of the road, the destination, final benchmark for graphical technology. Whether it is boring or not is subjective.

Objectively, according to Tim Sweeney of Epic, it'll take computational power around 2000 times what's available today to attain "perfect visual quality".
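To put that 2000x figure on a timeline: it works out to about 11 hardware doublings. The 2-year doubling period below is an assumption for illustration, not part of Sweeney's claim:

```python
import math

# How many doublings of compute does a 2000x increase require,
# and how long would that take at one doubling every ~2 years?
doublings = math.log2(2000)   # ~10.97 doublings
years = doublings * 2         # ~21.9 years at a 2-year cadence
print(round(doublings, 2))
print(round(years, 1))
```

That lands close to the 20-year ballpark, which is why estimates like "3-4 generations from now" follow naturally from the same number.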
 

Dash Kappei

Not actually that important
The goal of "reality" will be film-like quality, which isn't boring per se at all.
Think Tony Scott's movies' color correction, think sci-fi movies, think filter/grain/whatever effects for horror movies, and apply that to gaming.
Think Pixar's quality for cartoony games, think The Lost World in a game with dinos, an action game looking like The Matrix with buildings collapsing and quaking under your fists, a Kill Bill sword-combat sequence and so on. Think an arcadey racing game looking like Fast Five, a more serious one looking like Drive and a sim lifted straight from Top Gear's latest episode, or an artsy game where every level mashes up and lifts styles and colors from Monet or Degas, or a cel-shaded Spider-Man looking like Romita Jr's work coming to life.

Doesn't sound boring to me.
 

i-Lo

Member
So 3-4 generations from now?

Yea, perhaps 20 years (5 years/generation) down the road, with transistors made out of newer materials (silicene, graphene etc.) allied with perhaps newer designs (3D stacking, cube etc.) and newer ways to process information (quantum computing).

So basically, we are far from diminishing return at this point purely from a graphical standpoint.

Of course, ingenuity on the part of programmers and artists will go a long way to compensate to a great extent for the technical limitations until that time. Looking at GoW Ascension among many others, I am reminded of what is possible with clever design on a hardware now nearly 7 years old.
 
We are easily starting to hit diminishing returns. We may be far from "perfect" graphics, but the jumps are going to get less noticeable. After Crysis 1 there have been few games that wowed graphically. Look at Samaritan. If that is representative of next gen, it is definitely an improvement but not as big a leap as previous gens. Between smaller leaps in graphics and soaring dev costs, improved graphics shouldn't be the main draw of consoles in a gen or two.

Think of graphics in terms of a just noticeable difference. The higher or better something already is, the more it has to increase to see a difference.
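That intuition is essentially Weber's law. A minimal sketch, where the 10% Weber fraction is an arbitrary illustrative constant, not a measured value for graphics perception:

```python
# Weber's law: the smallest increase you can notice scales with the baseline.
def jnd_increase(baseline, weber_fraction=0.1):
    return baseline * weber_fraction

# The better things already look, the larger the absolute jump must be:
print(jnd_increase(10))    # low baseline -> a small jump is noticeable
print(jnd_increase(100))   # high baseline -> needs a 10x larger jump
```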
 

Proelite

Member
We are easily starting to hit diminishing returns. We may be far from "perfect" graphics, but the jumps are going to get less noticeable. After Crysis 1 there have been few games that wowed graphically. Look at Samaritan. If that is representative of next gen, it is definitely an improvement but not as big a leap as previous gens. Between smaller leaps in graphics and soaring dev costs, improved graphics shouldn't be the main draw of consoles in a gen or two.

Think of graphics in terms of a just noticeable difference. The higher or better something already is, the more it has to increase to see a difference.

Unreal Engine 4 runs on the same hardware as Samaritan so let's hold back these types of comments until we see that. Samaritan is using a 6 year old engine.

Even with the same console hardware this generation, the launch games for 360 and PS3 and latest games have looked significantly different.
 

i-Lo

Member
Unreal Engine 4 runs on the same hardware as Samaritan so let's hold back these types of comments until we see that. Samaritan is using a 6 year old engine.

Even with the same console hardware this generation, the launch games for 360 and PS3 and latest games have looked significantly different.

IIRC UE4's aim is to attain Samaritan like graphical fidelity with far greater efficiency than UE3. Apparently the new engine really tunes itself for parallel processing and its advantages.

I assume UE4 will make its public début at E3 next year.
 

Ashes

Banned
We are easily starting to hit diminishing returns. We may be far from "perfect" graphics, but the jumps are going to get less noticeable. After Crysis 1 there have been few games that wowed graphically. Look at Samaritan. If that is representative of next gen, it is definitely an improvement but not as big a leap as previous gens. Between smaller leaps in graphics and soaring dev costs, improved graphics shouldn't be the main draw of consoles in a gen or two.

Think of graphics in terms of a just noticeable difference. The higher or better something already is, the more it has to increase to see a difference.

I don't think it will be just a noticeable difference though.
 

i-Lo

Member
I don't think it will be just a noticeable difference though.

I can foresee the betterment of these:

- Texture
- Draw distance
- Character & world detail
- Lighting & shadows
- Tessellation
- Physics & collision detection

Basically, these are improvements over what's available right now. I don't think there will be anything revolutionary like it was with first gen HD consoles.
 

deadlast

Member
I can foresee the betterment of these:

- Texture
- Draw distance
- Character & world detail
- Lighting & shadows
- Tessellation
- Physics & collision detection

Basically, these are improvements over what's available right now. I don't think there will be anything revolutionary like it was with first gen HD consoles.

I agree with this. The tech will be great but not as great as what we saw from last gen to this gen.
 

nordique

Member
I can foresee the betterment of these:

- Texture
- Draw distance
- Character & world detail
- Lighting & shadows
- Tessellation
- Physics & collision detection

Basically, these are improvements over what's available right now. I don't think there will be anything revolutionary like it was with first gen HD consoles.

Yes, especially the bolded
 

Ashes

Banned
I can foresee the betterment of these:

- Texture
- Draw distance
- Character & world detail
- Lighting & shadows
- Tessellation
- Physics & collision detection

Basically, these are improvements over what's available right now. I don't think there will be anything revolutionary like it was with first gen HD consoles.

True, but moving from lots of sub-HD to lots of HD is big enough to be noticeable. At least judging from jumping to PC.
 

i-Lo

Member
The thing is, I see evolution of tech from now on instead of something brand new. First it was polygon count then with HD consoles it became a matter of shaders, real time lighting and shadows etc. The next big thing is expected to be real time ray tracing and that's more than one generation away if at all.

Anyway, I was wondering if the next gen system will be able to do real time down sampling, from say 1680x1050 to 1280x720. It would sharpen the image, aid AA and AF and because it's not exactly 1920x1080, it won't be as memory intensive.
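Rough framebuffer arithmetic for that idea (a single 32-bit color buffer only; real render targets also carry depth buffers and AA samples, so treat these as lower bounds):

```python
# Size of one 32-bit color buffer at each resolution, in MiB.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / (1024 * 1024)

print(round(framebuffer_mib(1680, 1050), 2))  # internal render target, ~6.73
print(round(framebuffer_mib(1920, 1080), 2))  # native 1080p, ~7.91
print(round(framebuffer_mib(1280, 720), 2))   # downsampled 720p output, ~3.52
```

So rendering at 1680x1050 does save roughly 15% of color-buffer memory versus native 1080p while still supersampling a 720p output, which is the trade-off the post is describing.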
 

BurntPork

Banned
Unreal Engine 4 runs on the same hardware as Samaritan so let's hold back these types of comments until we see that. Samaritan is using a 6 year old engine.

Even with the same console hardware this generation, the launch games for 360 and PS3 and latest games have looked significantly different.

Technically, UE4 doesn't require as much raw power as Samaritan simply due to the fact that it's an engine and is reliant on features to run, not horsepower.
 

Proelite

Member
From BGAssassin's post in the other threads, maybe the stuff I heard about PS4 having an already beastly APU + another discrete 7850/7870 level GPU isn't that far-fetched.
 
I can foresee the betterment of these:

- Texture
- Draw distance
- Character & world detail
- Lighting & shadows
- Tessellation
- Physics & collision detection

Basically, these are improvements over what's available right now. I don't think there will be anything revolutionary like it was with first gen HD consoles.

All of those things can be classified as evolutionary, but they can still make a game look a lot better. Very noticeable. Shading power in these next consoles will be a vast improvement over what's there in the HD twins.

Also, UE4 is aimed to be way more advanced than Samaritan.
 