
Next-Gen PS5 & XSX |OT| Console tEch threaD


MightyMax

Neo Member

Everywhere: Fan Theories About GTA Mastermind's New Game are 'Not Far Off', Says Developer​

"But there’s a lot of context and scale missing that will be really exciting for players to dive into and experience once we release."​


We recently reported on fan discoveries about Everywhere, the upcoming game from ex-Rockstar president Leslie Benzies and his new studio Build a Rocket Boy. The studio has now told IGN that some of the theories fans have been putting together are "not far off" what Build a Rocket Boy is envisioning for the final project – but may not represent the scale of the game it's aiming for.


In a statement to IGN, a Build a Rocket Boy spokesperson said that the company keeps a close eye on its communities, and has been impressed by what fans have pieced together using the slim number of clues available:

"We love how the community is piecing together all the information they can find to build an idea of what Everywhere might be about. Believe me when I say, we’re a regular reader of the Subreddits! Some of the things they found out, and you touched on in your article, are actually not far off from what we envision, but there’s a lot of context and scale missing that will be really exciting for players to dive into and experience once we release.

"We don’t like thinking in terms of open-world, or closed-world, but rather how we can build new worlds, and exciting, new ways of playing games."
[Image gallery: Everywhere, 4 art images]

The spokesperson didn't go into more detail about which elements were close, but our report included a number of details drawn from patents and hidden areas of the Everywhere website. The Everywhere fan community has been drawing lines between those ideas, putting together an image of a somewhat realistic online world, within which a second virtual world exists, where players can potentially create their own minigames.


“I think the first world will be something similar to GTA Online and the second world will [look] like Ready Player One," said Everywhere Discord admin Nestor, "where you can create worlds and games for the enjoyment of other players, and access them through doors or portals."

Other elements included potential payment systems, perhaps even for real-world objects, in-game spectation services, a battle royale potentially tied to music tracks, and even a companion app that ties together all of the above - which can seemingly already be accessed in part.

Build a Rocket Boy hasn't given any indication of when it might officially reveal more about the project. Previously, we've learned that the game will be "an immersive and large MMO experience", which recently switched from the Amazon Lumberyard engine to Unreal.

“In the near future, technology has brought humanity to the precipice of a world shifting change,” reads an official blurb. "There are those who want to use this technology to advantage only themselves, and those who want to use it to help all humankind. Will we look to the stars? Or stare only at our feet? Will we be inspired? Or live in fear?”

Build a Rocket Boy was founded by Leslie Benzies, the "unseen mastermind" of Grand Theft Auto. Benzies worked on almost every Rockstar game from Grand Theft Auto 3 in 2001 through to Grand Theft Auto 5 in 2013, and directed GTA Online. He left Rockstar amid some controversy in 2014, and announced Everywhere in 2016.
 

SSfox

Member
Did you test the power draw on these games? You say Xbox has "no next-gen games", but Gears 5 runs at 60-120 fps at PC Ultra settings with VRS and Global Illumination at up to 4K. It's very demanding on PC. It draws a lot of power, so it's a good benchmark of how the machines react under load.
Bruh, I'm talking about actual facts. Let's look at some examples:

You play Borderlands 2 or Knack on a PS4 and the console will be quiet; you play RDR2 or GOW on it and it won't be that quiet.

Blue Dragon and Lost Odyssey run quiet on an Xbox 360; on Gears 2 it turns into a Boeing in the best case (because it may run into the RRoD, and then RIP the console).

I'm not inventing stuff, I'm stating actual "FACTS". And it's not about trolling Xbox, because here I'm talking about both PS5 and Series; both still have to prove it (if I wanted to troll Xbox I'd do it on purpose to make it more fun). The proof will come when the huge, fully next-gen games start hitting the two consoles, and then we'll see. I'm actually optimistic about it: I think the results are going to be very good, maybe not perfect, but very good, and better than last gen at least, I feel. But we'll see, as usual.

I mean, yes, the consoles are quiet overall, and I hope more than anyone that it stays that way. Remastered old-gen games or cross-gen games running on a quiet console are very nice, but for me the absolute proof that these consoles are definitely quiet will come once the true big next-gen games start to hit, games like Suicide Squad, GTA6, the next Uncharted, Gears of War 6, etc.

I tried to be as clear as I could. I hope you understand me now.
 

LucidFlux

Member
Mate. Loading a level consists of many things, not just textures; some of them are CPU-limited, like logic, AI, physics and so on. Series X and PS5 can load a level in roughly the same time, PS5 being a bit faster, but the real bottleneck these SSDs were supposed to fix was textures and cold data like meshes and sound. Now if Series X can't even handle textures, that means the SSD isn't solving the problem.

Don't call me mate, friend! JK

Look, yes, loading data into RAM does in fact cover... well, every facet of the game, as you mentioned: geometric data, textures, logic, player character data, save state and so on. Agreed.

Saying the Series X can't handle loading textures, when we've seen plenty of last-gen games load textures just fine on even the base Xbox, points to the issue not being the SSD and RAM bandwidth (both relatively fast compared to last gen).

Having said that, are there examples like this where there is some sort of texture-loading issue? YES! I'm not denying that point lol. But to say that the Series X hardware isn't capable of loading basic texture data in cross-gen games is disingenuous. We've seen it done properly in even better-looking games on the Xbox One X, which is much slower.

I don't know why Series X is having these issues in some games, but compared to last gen, where we have examples of textures loading just fine on much, much slower HDDs and lower RAM bandwidth, well, I think that says it all. It literally can't be the fault of the hardware itself.



When Mark Cerny said "no loading screens" he didn't just mean instant level loading or 2-second loading screens; he meant streaming data so fast that if a player turns around, the old data that was resident in RAM is deleted and new data is loaded seamlessly, without pop-in. This means you can have scenes with more data than you could hold in RAM, meaning more detailed worlds, an advantage you've seen in Unreal 5: scenes with 100+ GB worth of billions of polygons rendered on just a 16 GB GPU. And Ratchet streaming whole worlds in a second. This is what's revolutionary about PS5 and why they call it the most revolutionary console since 3D.

You're preaching to the choir here brother. I'm a Cerny believer. Not sure why the need to even bring the PS5 into this discussion though.

I understand the paradigm shift this will bring. Demon's Souls is already packing its levels with more detail because it can load the next area as you walk down a hallway.
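To put the streaming idea in concrete terms, here's a rough sketch of what a view-driven streamer boils down to: evict what the player has turned away from, queue loads for what's coming into view, and stay inside a RAM budget. To be clear, this is just an illustration I made up; every name and number in it is invented, it's not Cerny's system or any real engine API.

```cpp
// A rough, made-up sketch of view-driven streaming: evict assets the player
// has turned away from, queue loads for what's coming into view, and stay
// inside a RAM budget. Names and numbers are invented for illustration only.
#include <cstddef>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };

struct Asset {
    std::string id;
    std::size_t bytes = 0;
    bool        resident = false;   // currently in RAM?
};

class Streamer {
public:
    explicit Streamer(std::size_t ramBudgetBytes) : budget_(ramBudgetBytes) {}

    // Called every frame with the player's position and facing direction.
    void Update(const Vec3& pos, const Vec3& forward, std::vector<Asset*>& world) {
        for (Asset* a : world) {
            const bool wanted = IsInStreamingVolume(*a, pos, forward);
            if (!wanted && a->resident) {
                Evict(*a);          // player turned away: free the RAM right now
            } else if (wanted && !a->resident) {
                QueueLoad(*a);      // about to be visible: fetch from the SSD
            }
        }
    }

private:
    // A real test would combine distance, view cone and priority; stubbed here.
    bool IsInStreamingVolume(const Asset&, const Vec3&, const Vec3&) const {
        return true;
    }
    void Evict(Asset& a) { a.resident = false; used_ -= a.bytes; }
    void QueueLoad(Asset& a) {
        if (used_ + a.bytes <= budget_) {   // only load if it fits the budget
            a.resident = true;              // real code: kick off an async read
            used_ += a.bytes;
        }
    }

    std::size_t budget_;
    std::size_t used_ = 0;
};
```

The point is simply that the faster the QueueLoad path completes, the tighter you can draw the streaming volume and the less low-detail padding you need to keep resident at any moment.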



The Xbox meanwhile can't even handle pop-in on cross-gen games.

It's having issues with some games, yes. Why it's having issues with those games when it doesn't with others is the question you should be asking. Small dev teams that can't optimize correctly?


This means it'll be left out when PS5 and PC games have more detail; you'll basically have to cut down detail on Xbox, which defeats the point of calling it next-gen. It's more like an Xbox One X upgrade.

I wouldn't say it will be left out. If you watched the UE5 demo, which I'm sure you have, you'll remember Epic will be able to scale this across platforms with varying levels of storage speed and GPU power.

I admit, how exactly UE5's Nanite scales is the real question. I think, as you and many others have already mentioned, that PS5 will simply be able to show off higher-resolution textures and more geometric complexity because of how much more data it can load in the same time frame as Xbox.

My guess is that, with data constantly streaming in, Xbox would continue to load and blend up to the higher-resolution asset over time. It makes sense in a slower-paced game: your character might spend minutes in the same room or area, so the game would have time to transition from the level of detail of the initial load to a more detailed mesh/texture as you approach an object. Textures already look incredible in recent titles; it's lighting (real-time GI) that will truly start to set this generation apart. And both consoles will handle that very well.
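For what it's worth, the "blend up to the higher-resolution asset over time" idea could look roughly like the sketch below: pick a target mip from camera distance, then fade the resident detail toward it at whatever rate the storage can feed. It's a made-up illustration, not UE5's or any console SDK's actual API.

```cpp
// A made-up sketch of progressive texture streaming: pick a target mip from
// camera distance, then fade the resident detail toward it as data arrives.
#include <algorithm>
#include <cmath>

struct StreamedTexture {
    int   topMip      = 0;      // 0 = full resolution
    int   coarsestMip = 8;      // lowest-detail mip that is always resident
    float residentMip = 8.0f;   // detail level we can actually sample right now
};

// Target mip gets finer as the camera gets closer to the surface.
int TargetMip(const StreamedTexture& t, float distanceToCamera) {
    const int mip = static_cast<int>(std::log2(std::max(distanceToCamera, 1.0f)));
    return std::clamp(mip, t.topMip, t.coarsestMip);
}

// Called per frame. A faster SSD means a larger mipsPerSecond, so the blend
// toward full detail finishes sooner (or is effectively instant).
void UpdateResidentMip(StreamedTexture& t, float distanceToCamera,
                       float dt, float mipsPerSecond) {
    const float target = static_cast<float>(TargetMip(t, distanceToCamera));
    if (t.residentMip > target)        // stream in finer mips over time
        t.residentMip = std::max(target, t.residentMip - mipsPerSecond * dt);
    else if (t.residentMip < target)   // camera moved away: drop detail at once
        t.residentMip = target;
}
```

On a faster SSD you'd just set a higher mipsPerSecond (or skip the blend entirely), which is essentially the difference being argued about here.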

I don't know, maybe I'm just being overly optimistic for Xbox here, but I don't think Xbox is going to fall nearly as far behind as you suspect when the true next-gen games start showing up. Will PS5 likely have slightly better-looking textures overall? Yeah, probably. Will Xbox look a generation behind because of this? No, not at all.
 

Derift

Member
Never saw that in the many hours I put into Gears 5. Weird.
it happened a few times for me... I don't think it's a widespread issue, and I couldn't find any posts about it
Are you on an Insider build by any chance? Never seen it like that, though I had some issues with other games, especially because the Alpha Skip-Ahead ring is a piece of shit. But I do like living on dat edge.



You should let us know in a separate thread, since you already have the `thing`
Nope, just your average gamer here lol. It happened a few times; had to restart it a few times to resolve, which was really weird
 

Doncabesa

Member
Fuck me. Series S is silent as in fucking dead quiet playing Division 2.
This game would be howling on my now deceased PS4 Pro with fans near max.
No fan noise on Series S even after gaming for more than 2 hours now.
Have to hand it to Xbox here. This thing is amazing.
My kids play Dead by Daylight and it gets so hot on the vent, but you never hear it. Their old PS4 and Xbox One fat were stupidly loud.
 
I didn't think we'd get clarity like that today.

Very clear.
Good to see MS support their platform. It never made sense to buy all those IPs and studios and then give the games to PlayStation. Sony would not buy studios to give Xbox games, so why should MS? Hopefully this effort will put to bed questions about MS' commitment to the video game space. I don't expect any Sega-like third-party moves from MS at this point. I'm interested to see what they do next.
 

GAF machine

Member


Isn't the PS5 Tempest Engine similar to the Cell Processor? 😜


It's similar to the SPU of a CELL...

[Image: the urban landscape ray-tracing demo]

CELL comprises two different types of cores (it's heterogeneous); the SPU is but one type, and it would take more than a dozen Tempest Engines (each equivalent to eight SPUs) to run that urban landscape demo -- which means a PS3 didn't produce that RT'd imagery by itself. It was actually assisted over a network by seven QS20 CELL-based server blades (14 CELL processors)...

[Image: IBM T. J. Watson Research Center]

One of the visualization architects responsible for the demo wrote on his blog:

"Even though the PS3’s RSX is inaccessible under Linux the smart little system will reach out across the network and leverage multiple IBM QS20 blades to render the complex model, in real-time, with software based ray-tracing. Using IBM’s scalable iRT rendering technology, the PS3 is able to decompose each frame into manageable work regions and dynamically distribute them to blades or other PS3s for rendering. These regions are then further decomposed into sub-regions by the blade’s Cell processors and dynamically dispatched to the heavy lifting SPEs for rendering and image compression. Finished encoded regions are then sent back to the PS3 for Cell accelerated decompression, compositing, and display." -- Barry Minor

Oh for sure. The CELL set the entire gaming industry up for the future it's currently pursuing.

Even the demand for all-encompassing middleware engines like Unreal exploded due in part to CELL's unconventional architecture.



Yeah, this thing is essentially the logical conclusion for all GPUs.

You see the trends with UE5 moving to software rasterizers for micro-polygons, (mostly) software RT, the move to a more and more fully programmable GPU rendering pipeline (with Mesh/Primitive Shaders and Task/Surface Shaders), etc... Most of the traditional fixed-function units of the GPU pipeline will be replaced by software running on general-purpose cores. So you'll end up with a chip that looks remarkably like the above massively parallel CPU data-streaming monster, but with select fixed-function graphics hardware for stuff like RT, ML, ROPs, TMUs, the command processor, etc...

Sounds sweet... Sweeney would be all over something like that. Years ago he voiced:

"It would be great to be able to write code for one massively multi-core device that does both general and graphics computation in the system. One programming language, one set of tools, one development environment - just one paradigm for the whole thing: Large scale multi-core computing." -- Tim Sweeney

I'd like to think that if a "CGPU" of Sweeney's (and your) description is the future of computing/rendering, then a CELL-based CGPU may be biding its time given that...

- CELL demonstrated it could trounce a top-of-the-line GPU at software-based RT despite being significantly disadvantaged in terms of transistors and flops (in line with Sweeney's circa '99 prediction for '06-7: "CPU's driving the rendering process"... "3D chips will likely be deemed a waste of silicon")
- the PlayStation Shader Language (PSSL) is based on the same ANSI C standard superheads from MIT used to mask CELL's complexity
- SPUs are programmable in C++ languages so SPU support of PSSL's C++ structs and members can be added with little or no hassle, which means the benefits of PSSL would likely extend to a massive many-core CELL-based CGPU designed to run shaders across a legion of SPUs and CUs using a single simple shader language
- entries [0015], [0016], [0017], [0052] of this patent and entries [0017], [0018], [0033] of this patent say that the described methods for backwards compatibility can be implemented on "new" processors in various ways
- the OpenPOWER Foundation (under governance of The Linux Foundation, of which Sony is a Gold member) has open-sourced IBM's customizable A2I processor core for SoC devices; it has a number of features (it can run in little-endian mode; it addresses x86 emulation) that make it a prime candidate to replace CELL's PowerPC-based PPE (which had instructions for translating little-endian data and addressed x86, PS1, PS2 and PSP emulation; A2I's LE mode would add PS4, PS5 and presumably PS6 emulation)
- FreeBSD (PS4's OS is based on 9.0, PS5's is presumed to be based on 12.0) supports PowerPC; A2I is a PowerPC core with little-endian (x86) and big-endian (PPC) support; an A2I/CELL-based PS7 could run a little-endian OS carried over from an x86-based PS6 or run an entirely new one written with little- or big-endian byte ordering
- FreeBSD now only supports LLVM's Clang compiler
- Clang/LLVM (compiler frontend/backend, SIE made the full switch to Clang x86 frontend during PS4 dev) support PowerPC (frontend), CELL SPU (backend courtesy of Aerospace corp.) and Radeon (backend)
- AMD's open-source Clang/LLVM-based compiler with support for PowerPC offloading to Radeon under Linux can serve as a reference for an SIE Clang/LLVM-based compiler that supports PPC (A2I)/CELL (SPU) offloading to Radeon under FreeBSD (Linux and FreeBSD were cut from the same cloth)

Interestingly, some folks at Pixar seem to think an architecture that melds CPU and GPU characteristics would be the preferred option for path-tracers, and anticipate such a device may appear by 2026:

"Several rendering teams have developed (or are in the process of developing) GPU-based path tracers. GPUs have immense computa- tional power, but have lagged behind CPUs in the available memory and also require a different algorithm execution style than CPUs." -- Pixar

"With the huge computational complexity of movies, it will be interesting to see which architecture wins. One possible outcome is that these archi- tectures will merge over the next decade, in which case this becomes a moot point." -- Pixar

IMO, the unfortunate thing about their future outlook is that the pros and cons of the "CPU vs. GPU" debate were rendered moot over a decade ago. I recall Kutaragi saying that he expected CELL to morph into a sort of integrated CGPU at some juncture and wanted Sony to make a business of selling them:

"Cell will evoke the appearance of graphics LSIs or southbridge chips, something almost like a PC, and that is the kind of business we want to start." -- Ken Kutaragi

Too bad SCE wasn't able to start up that business and become a successful vendor of CELL-based CGPUs back then. There's no telling how customer feedback from the likes of Pixar might've influenced future designs or what of those designs might've trickled down to PS consoles -- but all isn't lost. If SIE were to bring CELL out of "cryopreservation" for a CGPU, there are at least two coders (one of whom is an authority on CELL) who'd jump at the chance to help shape its feature set and topology...

1) Mike Acton (CELL aficionado, former Insomniac Games Engine Director, current Director of DOTS architecture at Unity; career timestamped) had a few things he wanted to see implemented that would bring CELL's capabilities closer to those of today's GPUs
2) Michael Kopietz wants to play with a scaled-up monster -- a 728 SPU "Monster!":




The 'Monsters Inc.' slide says SPEs can replace specialized hardware. I'm not much of a techie, but I presume SIE would skip on ML hardware so the algorithms could benefit from the theoretical higher clock frequency, wide SPU parallelism and massive internal bandwidth of a "CELL2". A BCPNN study showed that CELL was extremely performant vs. a top-tier x86 CPU from its era due mainly to the chip's internal bus (EIB) bandwidth.

I think a CELL2 with buses broad and speedy enough to shuttle a few TB/s of data around internally to hundreds of high-frequency SPEs, drastically accelerating ML (and other workloads in parallel), would be better than having dedicated ML hardware (and other dedicated hardware) constrained by GPU memory-interface bandwidth limits and taking up space that could otherwise go to more TMUs or ROPs on the integrated GPU.
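The intuition that bandwidth, not raw ALU, gates a lot of ML work can be put in rough roofline terms. The figures below are placeholders picked purely for illustration, not specs of any real or rumored chip:

```cpp
// Back-of-the-envelope roofline: attainable throughput is capped by either peak
// compute or (arithmetic intensity x bandwidth), whichever is lower. All figures
// below are placeholders for illustration, not specs of any real chip.
#include <algorithm>
#include <cstdio>

double AttainableTflops(double peakTflops, double bandwidthTBs, double flopsPerByte) {
    return std::min(peakTflops, flopsPerByte * bandwidthTBs);
}

int main() {
    const double flopsPerByte = 4.0;  // a fairly bandwidth-hungry layer, say

    // Dedicated ML block fed over a ~0.5 TB/s memory interface.
    std::printf("narrow bus: %.1f TFLOPS\n", AttainableTflops(40.0, 0.5, flopsPerByte));

    // Same compute fed over a hypothetical ~3 TB/s internal bus.
    std::printf("wide bus:   %.1f TFLOPS\n", AttainableTflops(40.0, 3.0, flopsPerByte));
    // Prints 2.0 vs 12.0: with the same 40 TFLOPS of peak compute, the wide
    // bus sustains 6x the throughput on this bandwidth-bound workload.
}
```

In other words, for bandwidth-hungry workloads the chip with the fat internal bus wins even when both have the same peak compute, which is the whole appeal of EIB-style designs.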

The re-emergence of CELL would not only give SIE an opportunity to further strengthen its collaborative ties with Epic and possibly establish a new one with Unity, but also allow Sony to pick up where it left off with ZEGO (a.k.a. the BCU-100, used to accelerate CG production workflows with Houdini Batch -- the Spider-Man/GT mash-up, for example) and become a seller of CGPUs to Pixar and other studios. I'm sure if SIE really wanted to, it could come up with something in collaboration with all interested parties to satisfy creatives of all stripes in the game and film industries -- from Polyphony to Pixar...

[Image: Pixar KPCNN denoising framework diagram]


The thought of a future PS console potentially running a first or third-party game engine that drives something like the above Pixar KPCNN in real-time intrigues me. My wild imagination envisions a many-core CELL2 juggling...

- geometry processing (Nanite on SPUs in UE5's case)
- physically accurate expressions of face, hair, gestures, movement, collision, destruction, fluid dynamics, etc.
- primary ray casts
- BVH traversal (video of scene shown on the page) for multi-bounce path-tracing
- SPU-based shading and pre/post-processing of diffuse and specular data (Lumen on SPUs in UE5's case)
- Monte Carlo random number generation for random sampling of rays
- filtering (denoising) the diffuse and specular ray data via two convolutional neural networks (CELL has a library for convolution functions, consumes neural networks of all types and is flexible in how it computes them)

while an integrated GPU from the previous gen (or a bare-spec, next-gen entry-level GPU) dedicates every single flop of compute it has to the trivial tasks of merging the diffuse/specular components of frames that were rendered and denoised by SPUs, then displaying the composited frame in native resolution (the GPU would only do these two tasks and provide for backwards compatibility; in BC mode Super-Resolution could be done by SPEs accessing the framebuffer to work their magic on pixels)
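For a sense of how small that final "merge" job really is, this is roughly what compositing separately denoised diffuse and specular back into a frame looks like in a typical path tracer. It's a generic sketch of the standard technique, not Pixar's or Sony's code:

```cpp
// Generic sketch of the final composite step in a path tracer that denoises
// diffuse and specular separately: modulate diffuse by albedo, add specular.
// It's a per-pixel, trivially parallel pass, which is why it's cheap for a GPU.
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };

Color operator*(const Color& a, const Color& b) { return {a.r * b.r, a.g * b.g, a.b * b.b}; }
Color operator+(const Color& a, const Color& b) { return {a.r + b.r, a.g + b.g, a.b + b.b}; }

std::vector<Color> Composite(const std::vector<Color>& albedo,
                             const std::vector<Color>& denoisedDiffuse,
                             const std::vector<Color>& denoisedSpecular) {
    std::vector<Color> out(albedo.size());
    for (std::size_t i = 0; i < albedo.size(); ++i)
        out[i] = albedo[i] * denoisedDiffuse[i] + denoisedSpecular[i];
    return out;
}
```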

Given that George Lucas and Steven Spielberg used to discuss the future of CG with Kutaragi, I think a hybrid rendering system of this sort would've been front of mind for a Kutaragi in the era of CGPUs. With SPUs acting as a second GPU compute resource for every compute-intensive rendering workload, the GPU would be free to fly at absurdly high framerates. Kaz wants fully path-traced visuals (timestamped) in native 4K at 240 fps for his GT series (for VR, I guess); I suspect a CGPU with 728 enhanced SPEs at 91.2 GFLOPS per SPE (3x the ALUs, clocked at 3.8 GHz) would give it to him. Maybe we'd even get that photorealistic army of screaming orcs Kutaragi was talking about too (timestamped).

Realistically though, the current regime has a good thing going with their "PC cycle incrementalism bolstered by PS3's post-mortem" approach to hardware; so they'll likely work with AMD to come up with a Larrabee-like x86/Radeon design (timestamped) influenced by aspects of CELL's EIB rings. So long as it's $399, can upscale to the native res of the day, run previous gen games at ~60 fps and show a marginal increase in character behavior/world simulation over what we have today, it'll be met with praise from the vast majority of gamers.

Personally, I'd love to see them move away from balancing price and performance on a dusty 56-year-old model born of Moore's law. I'd be elated if they balanced the two on one of CELL's models because, unlike the "modern" CPUs an AMD-sourced CGPU would descend from, CELL...

- doesn't waste half its die area on cache; more of it goes to ALUs instead to help achieve multiples of performance with greater power efficiency
- was intended to outpace the performance of Moore's law and designed to break free of the programming model the law gave rise to
- was designed to effectively challenge Gelsinger's law and Hofstee's corollary (two bosom buddies of Moore's law that use cache size, hardware branch prediction, etc. to limit traditional processor design performance gains to 1.4x (i.e. ~40%) despite having 2x the transistors, and drive power efficiency down by ~40% -- and they won't be going anywhere without fundamental changes in chip design); see the quick arithmetic below
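Those ~1.4x / ~40% figures follow from the usual rule of thumb (often quoted as Pollack's rule) that single-thread performance scales roughly with the square root of added complexity while power scales roughly with transistor count. A quick back-of-the-envelope, under that assumption only:

```latex
% Assumption (rule of thumb, not physics): perf ~ sqrt(transistors), power ~ transistors
\mathrm{perf} \propto \sqrt{T}
\quad\Rightarrow\quad
T \to 2T \;\text{gives}\; \sqrt{2} \approx 1.4\times \text{ the performance,}
\qquad
\frac{\mathrm{perf}}{\mathrm{power}} \propto \frac{\sqrt{2T}}{2T} = \frac{1}{\sqrt{2}} \approx 0.71
```

So, under this simple model, doubling the transistor budget buys roughly 40% more performance while perf-per-watt drops by roughly 30%, the same ballpark as the figures quoted above.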

All things being equal (node, transistor count, size, modern instruction sets, iGPU) it seems to me that a CELL/Radeon CGPU would be way more performant and much more power efficient than a Larrabee-like x86/Radeon CGPU for what I presume would be similar costs. That means a lot more bang for my buck and I'm all for that kind of "balance".

Hopefully the merits of CELL won't continue to go ignored.
 

Garani

Member
Yes, good thing I have Game Pass 🤣 For only 12 euros a month, that's about 140 euros a year, I have an unlimited supply of games... sounds good, right?

Normally I would buy at least 4 games a year at the full price of 60 euros.

Every euro counts in corona time.
Oh my! I feel like such a loser! For 50 bucks a year I have access to just 800 games on PS Now. And I can play them on my PS5 or stream them on PC.

Shit, I guess I am riding the wrong horse?
 

HeisenbergFX4

Gold Member
Hello my favorite people on my favorite forums in my favorite thread.

Heading out this weekend to the Bahamas to start a week-long fishing charter with some good friends and lots of alcohol.

Hope to come back with lots of fish and stories to tell, though I've already been told I can't share a few things until early to mid-April.

I shall return in a few weeks.

Cartoon Illustration GIF by Kraken Images
 
Yeah, well, clearly you didn't use enough alcohol.

bunk-the-wire.gif
 
I mean Sony has to counter MS with a one-two punch and I'm afraid that Returnal and Ratchet at 80 euros each is not enough.

The value of Game Pass is unbelievable. Even I'm starting to think about getting an XSX at some point. Plus MS is really doing something with backwards compatibility, and they have the Elite controller, which I'm envious of.
I guess Sony could give away a bunch of last gen games nobody wants to pay for outright.

Personally, I’m more interested in this gen (on either console). Unfortunately, Xbox has nothing to even counter on that front yet.
 