
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash


Ashes

Banned
Lol. Semantics.

OT: Who taught you that language was as logical as math? It's just not.
Therein: especially where field-specific language is concerned. :p
 

ekim

Member
There is no possible way Sony has ray tracing on the cards. The original Fermi GTX 480 was tracing at ONE frame per second.

You do ray tracing on multicore CPUs.

Snippet from Wikipedia:
On June 12, 2008 Intel demonstrated a special version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, using ray tracing for rendering, running in basic HD (720p) resolution. ETQW operated at 14-29 frames per second. The demonstration ran on a 16-core (4 socket, 4 core) Xeon Tigerton system running at 2.93 GHz.[14]
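A rough back-of-envelope sketch of why those numbers are so brutal; every figure below is an illustrative assumption, not a measurement from the demo:

Code:
# Rays needed per frame at basic HD, before any real shading work.
# Bounce and ray counts are assumed, illustrative numbers only.
width, height = 1280, 720      # 720p, as in Quake Wars: Ray Traced
primary_rays = width * height  # one ray per pixel
bounces = 2                    # assumed bounce depth
rays_per_bounce = 3            # assumed: one reflection + two shadow rays
total_rays = primary_rays * (1 + bounces * rays_per_bounce)
fps = 30
print(f"{total_rays:,} rays/frame -> {total_rays * fps:,} rays/s at {fps} fps")
# ~6.5M rays per frame, ~190M scene intersections per second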
 

Hammer24

Banned
What the fucking fuck? Who the hell are you to be acting all high and mighty with this smug tone? I know about Folding@home, I have a PS3 that I use almost daily, and I think you should STFU with your pompous comments, since all you do is take quotes out of context and spread misconceptions in this forum.

How many times have you been proven wrong? How many times have you been corrected on one of your assumptions? Pull your head out of Sony's ass for two seconds and realize I'm not falling for your bullshit.

I can't wait until next-gen specs are revealed, and I hope they are nothing like you keep assuming they'll be, so everyone can see what kind of joke you are.



That's Jeff for you. He'll spend hours looking through Sony patents, and anything he finds = FACT!, even though he doesn't understand that these companies file plenty of patents and many never see the light of day.


Interestingly, the exact same thing happened at the beginning of last gen, when user MikeB "educated" everyone about the awesome power of the mighty Cell.
When finally exposed, he resorted to posting pics of his sister (before he got banned in the end). I wonder if we'll get another awesome meltdown this time.
 

mrklaw

MrArseFace
Legitimate ray tracing software would require an order of magnitude more processing power than is feasible on next-gen hardware. Not even on next-gen 2.

I'll pay £10 to the Child's Play charity if ray tracing makes it into a console game next gen, digital or retail release.

With a probable 2013 release date, let's say 2013 + 7 years: until 14 June 2020.

The Last of Us already has real-time ray tracing from the main character's fists, and that's on PS3: http://www.neogaf.com/forum/showthread.php?t=477927
 

Margalis

Banned
The Last of Us already has real-time ray tracing from the main character's fists, and that's on PS3

Lawl. Every game that has physics does ray casting. Casting a couple of rays that do zero bounces and interact only with physics bounds is not at all similar to casting a ray for every texel that does multiple bounces and interacts with the exact geometry of every surface.
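A minimal sketch of that difference, with a hypothetical scene/character API (raycast, intersect_exact, shade and friends are made-up names, not any real engine's):

Code:
BLACK = (0.0, 0.0, 0.0)
BACKGROUND = (0.1, 0.1, 0.1)

def physics_raycast(character, scene):
    # Ray casting: a handful of rays per frame against coarse physics
    # bounds -- one query, zero bounces.
    return scene.raycast(origin=character.fist_pos,
                         direction=character.punch_dir)

def trace(ray, scene, depth=0, max_depth=3):
    # Ray tracing: runs once per pixel, recurses on every bounce, and
    # intersects exact surface geometry instead of collision bounds.
    if depth == max_depth:
        return BLACK
    hit = scene.intersect_exact(ray)
    if hit is None:
        return BACKGROUND
    color = hit.material.shade(hit, scene)   # shadow rays to each light
    bounce = trace(hit.reflected_ray(), scene, depth + 1, max_depth)
    return tuple(c + b for c, b in zip(color, bounce))

The physics path runs a few times per frame; the tracer runs millions of times per frame, which is why the two costs aren't comparable.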
 

mrklaw

MrArseFace
Ray-casting is often used for physics/collision problems, but it is a different kettle of fish from ray tracing for light simulation, from a performance-to-results POV.

I know, I figured something *that* tangential wouldn't be bitten on ;)

But the new Unreal Engine 4 is doing nice things along those lines with lighting.
 
I think this is relevant to this discussion...


Tom's Hardware is reviewing the new 'Piledriver' APUs launching later this year, which are the basis for the 'Steamroller' cores that might be in the PS4.

The conclusion is quite promising:

"By the time AMD’s third-generation APU, Kaveri, is ready (2013, the company says), we’ll be looking at x86 cores based on Steamroller, the Graphics Core Next architecture, and HSA enhancements that allow the GPU to access CPU memory. Look at the difference between the software infrastructure between Llano’s introduction on the desktop one year ago and today’s preview. If that was just the tip of the iceberg, I can only imagine what top-tier developers will be doing with our graphics processors in another year’s time."

Full article:

http://www.tomshardware.co.uk/a10-5800k-a8-5600k-a6-5400k,review-32463.html
 
mrklaw said:
Legitimate ray tracing software would require an order of magnitude more processing power than is feasible on next-gen hardware. Not even on next-gen 2.

I'll pay £10 to the Child's Play charity if ray tracing makes it into a console game next gen, digital or retail release.

With a probable 2013 release date, let's say 2013 + 7 years: until 14 June 2020.
The AMD libraries support bundled ray tracing for lighting. It's not "legitimate ray tracing" but a method of using only a few hundred rays to determine lighting, rather than the tens of thousands that would require an order of magnitude more power. In an AMD lecture on the subject it was admitted that true ray tracing would "bog down the processor". In any case, ray tracing is starting with bundled ray tracing this generation. Next generation may fully support ray tracing.
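Taking the post's own framing at face value (a few hundred rays versus tens of thousands; both numbers illustrative):

Code:
# Rough cost gap between "bundled" lighting and a legitimate trace.
bundled_rays = 300    # "a few hundred" rays for the lighting pass
full_rays = 30_000    # "tens of thousands" for a full trace
print(f"bundled pass: {bundled_rays:,} rays")
print(f"full trace:   {full_rays:,} rays "
      f"({full_rays // bundled_rays}x the work)")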


The PS3 Blu-ray player supports dual-layer discs, and the PS4 is supposed to support quad-layer discs. Most modern Blu-ray players can support quad-layer, which might include the Slim drives. Reading multi-layer Blu-ray discs requires 2 things: 1) enough blue-laser energy to get reflections from all layers, creating a S/N ratio that can be read, and 2) a DSP (including software/firmware) that can read each time-separated reflection.

http://en.wikipedia.org/wiki/Blu-ray_Disc said:
In January 2007, Hitachi showcased a 100 GB Blu-ray Disc, consisting of four layers containing 25 GB each.[160] Unlike TDK and Panasonic's 100 GB discs, they claim this disc is readable on standard Blu-ray Disc drives that are currently in circulation, and it is believed that a firmware update is the only requirement to make it readable to current players and drives.[161]
This probably is not supportable on early PS3 drives... who knows. Quad-layer Blu-ray players are necessary to support 4K Blu-ray, coming after January 2013 when H.265 is published.

Hammer24 said:
Interestingly, the exact same thing happened at the beginning of last gen, when user MikeB "educated" everyone about the awesome power of the mighty Cell.

When finally exposed, he resorted to posting pics of his sister (before he got banned in the end). I wonder if we'll get another awesome meltdown this time.
No meltdown from me. And again, the 2011 Cell vision does not require a Cell processor; it's hardware-agnostic! Cell or no Cell in the PS4 is not the point.
 

magawolaz

Member
lol. awesome.

EDIT: Snipped!
lmao
 

Hammer24

Banned
No meltdown from me. And again, the 2011 Cell vision does not require a Cell processor; it's hardware-agnostic!

Let's see if I can predict the future:
1. You'll keep on posting walls of text with no info relevant to the next console gen whatsoever, until the specs for Durango and Orbis get announced.
2. When the specs are out, we'll have a Durango that is more powerful than the Orbis. You will continue to "educate" the dirty masses on why this just seems so, as the silicon in the Orbis is so much more advanced that it'll only be a question of time until devs unleash the full power of the mighty APU to surpass what's possible on the Durango.
3. The first 3rd-party multiplats will arrive, and look and run significantly better on Durango. You'll tell everyone that devs are just too lazy, as the Orbis is capable of so much more.
4. Time goes on, the situation won't change, and people will call you out. And then the meltdown will come.
We've seen all this before.
 
Hammer24 said:
Let's see if I can predict the future:
1. You'll keep on posting walls of text with no info relevant to the next console gen whatsoever, until the specs for Durango and Orbis get announced.
2. When the specs are out, we'll have a Durango that is more powerful than the Orbis. You will continue to "educate" the dirty masses on why this just seems so, as the silicon in the Orbis is so much more advanced that it'll only be a question of time until devs unleash the full power of the mighty APU to surpass what's possible on the Durango.
3. The first 3rd-party multiplats will arrive, and look and run significantly better on Durango. You'll tell everyone that devs are just too lazy, as the Orbis is capable of so much more.
4. Time goes on, the situation won't change, and people will call you out. And then the meltdown will come.
We've seen all this before.

If the PS4 really will use an x86 multicore CPU, it will be the other way around for the first time: the PS4 would be easier to program for than the Xbox 720. We know next to nothing about what kind of CPU the next Xbox will use, but if the Xbox 720 uses a PowerPC-derived CPU, we have two possible scenarios. In the worst case it could be an in-order multicore CPU like Xenon, meaning total crap to program for in comparison with the x86 PS4 CPU; in the best case it could be an out-of-order PowerPC CPU, meaning that it won't be a bitch to program for, but it surely won't be easier to program for than a traditional x86 CPU.
 
Hammer24 said:
Let's see if I can predict the future:
1. You'll keep on posting walls of text with no info relevant to the next console gen whatsoever, until the specs for Durango and Orbis get announced.
2. When the specs are out, we'll have a Durango that is more powerful than the Orbis. You will continue to "educate" the dirty masses on why this just seems so, as the silicon in the Orbis is so much more advanced that it'll only be a question of time until devs unleash the full power of the mighty APU to surpass what's possible on the Durango.
3. The first 3rd-party multiplats will arrive, and look and run significantly better on Durango. You'll tell everyone that devs are just too lazy, as the Orbis is capable of so much more.
4. Time goes on, the situation won't change, and people will call you out. And then the meltdown will come.
We've seen all this before.
LOL, maybe, but I secretly believe that Durango and PS4 will have the same SOC. I haven't mentioned it before as there is very little support for this.

Now, how can I be claiming the PS4 is more powerful than the Durango when I believe they will have essentially the same core (memory size may be different)?

I even mentioned a wild speculation where there might be 2 of the 1PPU4SPU building blocks mentioned in another Sony patent in the PS4, and also in the Durango, for backward compatibility for both (the Xbox 360 used 3 PPC cores, but one was dedicated to OS and DSP duties, which could be emulated with other hardware). Economy of scale and building-block design would make it practical, especially if they were used in a slimmer Slim PS3 SOC which could also emulate an Xbox 360. So both Microsoft and Sony would cooperate on a SOC that could emulate a PS3 and an Xbox 360.

After 2013, TSVs make SOCs economically practical, and building-block designs having a large economy of scale is what supports this.

Again, wild speculation, but the underlying logic is based on the Sony 2010 patent and the AMD "process optimized building block" SOCs.

As to 1) above, I have not gotten into specs, as they will change and would require speculation even wilder than I am comfortable doing. The number of GPU elements in developer platforms may be reduced because of efficiencies in the 2014 GPU design. It appears that there will be 8 Jaguar CPU cores in the PS4, and probably in the Durango.

The rumors of the Durango using the Oban chip (the Japanese name for a large oblong blank coin that is written on), already in production at IBM foundries in Dec 2011 (rumors say limited production for developers), are too soon... More likely it is a SOC for some other game console, like a new Xbox 360 refresh, and possibly for a slimmer Slim PS3 to be announced for the 2012 season.

microsoft-sony.com sony-microsoft.com
Microsoft has purchased the domain names microsoft-sony.com and sony-microsoft.com. Microsoft registered the former yesterday, and the latter two days ago, according to a WHOIS search on DomainTools (one, two, via WinRumors).

Both domains currently redirect to a Bing search results page. The purchase of the domains has naturally resulted in lots of speculation.
Most of the speculation for this has timed out and is unlikely, leaving something like the above as a possibility.
 

Hammer24

Banned
Again, wild speculation ...

See, that's the problem.
Instead of muddying the water with (mostly) baseless speculation, why don't you simply concentrate on what we know about the devkits out in the wild? While they are by no means identical to the finalized hardware, they'll give a pretty good picture of where things are heading.
 

itsgreen

Member
If the PS4 really will use an x86 multicore CPU, it will be the other way around for the first time: the PS4 would be easier to program for than the Xbox 720. We know next to nothing about what kind of CPU the next Xbox will use, but if the Xbox 720 uses a PowerPC-derived CPU, we have two possible scenarios. In the worst case it could be an in-order multicore CPU like Xenon, meaning total crap to program for in comparison with the x86 PS4 CPU; in the best case it could be an out-of-order PowerPC CPU, meaning that it won't be a bitch to program for, but it surely won't be easier to program for than a traditional x86 CPU.

I am no programmer, but I don't think it will be a significant difference. The big problem was that Cell had asymmetrical cores: the SPUs had to have different (kinds of) instructions than the main PPU. The big advantage the 360 had was that it had symmetrical cores, three identical cores. I don't believe x86 or PPC will make a significant difference (considering that PPC development is a common thing in the console field).
 
LOL, maybe, but I secretly believe that Durango and PS4 will have the same SOC. I haven't mentioned it before as there is very little support for this.

Now, how can I be claiming the PS4 is more powerful than the Durango when I believe they will have essentially the same core (memory size may be different)?

I even mentioned a wild speculation where there might be 2 of the 1PPU4SPU building blocks mentioned in another Sony patent in the PS4, and also in the Durango, for backward compatibility for both (the Xbox 360 used 3 PPC cores, but one was dedicated to OS and DSP duties, which could be emulated with other hardware). Economy of scale and building-block design would make it practical, especially if they were used in a slimmer Slim PS3 SOC which could also emulate an Xbox 360. So both Microsoft and Sony would cooperate on a SOC that could emulate a PS3 and an Xbox 360.

After 2013, TSVs make SOCs economically practical, and building-block designs having a large economy of scale is what supports this.

Again, wild speculation, but the underlying logic is based on the Sony 2010 patent and the AMD "process optimized building block" SOCs.

Actually, they seem to have totally different cores. We can't really talk about things like the SOC if we don't know what kind of CPU the next Xbox will use. A single Steamroller core would take as much space on silicon as 4 simple in-order cores of a hypothetical Xenon-like Xbox 720 CPU; therefore, until we know what kind of CPU design the Xbox 720 will implement, we can't really guess about the chip design.
 
I am no programmer, but I don't think it will be a significant difference. The big problem was that Cell had asymmetrical cores: the SPUs had to have different (kinds of) instructions than the main PPU. The big advantage the 360 had was that it had symmetrical cores, three identical cores. I don't believe x86 or PPC will make a significant difference (considering that PPC development is a common thing in the console field).

Being an in-order PowerPC CPU versus an out-of-order PowerPC CPU would make a huge difference.
 
Actually, they seem to have totally different cores. We can't really talk about things like the SOC if we don't know what kind of CPU the next Xbox will use. A single Steamroller core would take as much space on silicon as 4 simple in-order cores of a hypothetical Xenon-like Xbox 720 CPU; therefore, until we know what kind of CPU design the Xbox 720 will implement, we can't really guess about the chip design.
Yup... we have less information about the Durango. The assumption of the Durango being a PPC came from the Oban being made in the IBM foundry. If the Oban is not a Durango SOC, then the Durango might not be using a PPC CPU.

Too many rumors, and every rumor has been shown to have misunderstandings, but every rumor has some element of truth... you just have to pick out what makes sense; very difficult to do.

If the Durango is using an Oban SOC then the GPU will not be HSA (too early) and will likely be at 32nm, in which case the PS4, waiting on HSA and 28nm, would be cheaper and more powerful with added features... that does not make sense! Building-block design (both IBM and AMD are doing this) allows for easy redesign using a GPU with a smaller die size, but supporting full HSA requires a different cache and memory controller in addition to a full HSA GPU. This to my mind is not an easy redesign. Oban cannot be the SOC for a Durango!

Durango can use a PPC or x86 and be full HSA, as HSA supports different processors like ARM. HSA cannot support the PS3 Cell but can support the 1PPU4SPU building blocks in the Sony 2010 patent.

Edit: Another rumor has Microsoft only providing Xbox 360 backward compatibility for 3 years. If this is true, it kinda rules out PPC in a Durango, as it implies additional hardware that adds to cost but is removed after 3 years. Further, you can't just attach 2 PPC CPUs to a memory bus; they need the kind of support the Sony 2010 patent described for the 1PPU4SPU (looks like HSA) building block.

Hammer24 said:
See, that's the problem (wild speculation).
Instead of muddying the water with (mostly) baseless speculation, why don't you simply concentrate on what we know about the devkits out in the wild? While they are by no means identical to the finalized hardware, they'll give a pretty good picture of where things are heading.
Where is the fun in that? <grin> The devkit is probably known for the PS4 but, interestingly enough, not for the Durango... riddle me that. And what more can we speculate on from devkit leaks?

OK, how about onQ123's post about an ARM Cortex-A5 in the PS4 SOC for security (DRM, Blu-ray, IPTV), as ALL AMD HSA Fusion designs from 2013 on will have an ARM processor included for security... sorta needed if an HSAIL virtual machine allows an ISA standard for software that works across all platforms that support it (HSA Foundation members), or if any HSA platform allows other platforms to use it to off-load processor duties (handheld to PC, or possibly PS4). This requires a little thought to understand. Using OpenCL and the HSAIL libraries, applications can be written to be platform-independent. This allows a handheld like the Vita to offload duties, GPU duties for instance, to a PC, allowing higher AI or more detail in the game. This is big!

And Hammer24, it's not just me coming up with this... I've included cites from others, and the AMD HSA Foundation articles confirm speculation that is 3 months old, based on the April AMD stockholder release. Maybe you guys are reacting to the tone of my posts, as it does change when multiple logic threads converge and speculation is confirmed. My assumption is that posts are read and understood for what they mean. If you don't read my posts, or at least the highlighted parts, then you can't understand what I am actually saying. I include cites so that others can understand what I am using as a base for speculation.

StevieP is saying there won't be stacked memory or Cell in the PS4. I am saying there will be stacked memory, and that Cell will not be in the PS4 in the form seen in the PS3, but might be in the PS4 in the form seen in the Sony 2010 patent (the 1PPU4SPU building-block module). There are multiple people saying stacked memory will be in the PS4, not just me, and I've cited them: Micron is providing custom memory for the next-generation game consoles, and Micron also makes the memory wafers for the HMC. As to 1PPU4SPU building blocks in the PS4, that is all me, and it has no support other than that it seems too practical (BC and math).
 

RoboPlato

I'd be in the dick
So quick question: when people are saying things like "X teraflops won't be enough," etc., what exactly are they aiming for? I was just looking through the high-res PC screen thread and someone posted some absolutely jaw-dropping screens of the new BF3 DLC. If we could get quality like that at 1080p/30fps next gen I would be incredibly happy. Same with Crysis 2 on Ultra.

Personally, I hope they have the specs to make 1080p an attainable target. Even if 1080p isn't in all games, which it won't be, at least the games that don't make it will likely be greater than 720p.
 

TronLight

Everybody is Mikkelsexual
The AMD libraries support bundled ray tracing for lighting. It's not "legitimate ray tracing" but a method of using only a few hundred rays to determine lighting, rather than the tens of thousands that would require an order of magnitude more power. In an AMD lecture on the subject it was admitted that true ray tracing would "bog down the processor". In any case, ray tracing is starting with bundled ray tracing this generation. Next generation may fully support ray tracing.

Aren't the real-time reflections in Crysis 2 on PC with DX11 achieved with a limited form of ray tracing?
 
This thread is weird, y'all. That said, that Tom's Hardware review of the new AMD CPUs seems to suggest that one benefit of the dual-GPU config is that together, they can push 1080p.

I'm not a tech person, though. Could the combo integrated/7970M that people are talking about push that Star Wars demo at 1080p? Sony do love checking off bullet points, and I bet they'd love to say Orbis does full HD gaming.
 

RoboPlato

I'd be in the dick
This thread is weird, y'all. That said, that Tom's Hardware review of the new AMD CPUs seems to suggest that one benefit of the dual-GPU config is that together, they can push 1080p.

I'm not a tech person, though. Could the combo integrated/7970M that people are talking about push that Star Wars demo at 1080p? Sony do love checking off bullet points, and I bet they'd love to say Orbis does full HD gaming.

That's what I'm thinking too, although they did try to say the same for this gen.

The 7970M alone could probably run 1313, though probably not with all of the effects. We don't know exactly what it was running on, so it's hard to tell. We also don't know for sure if the GPU/APU combo is happening or if it is just for the dev kits. Personally, I think/hope it is.
 
So quick question: when people are saying things like "X teraflops won't be enough," etc., what exactly are they aiming for? I was just looking through the high-res PC screen thread and someone posted some absolutely jaw-dropping screens of the new BF3 DLC. If we could get quality like that at 1080p/30fps next gen I would be incredibly happy. Same with Crysis 2 on Ultra.

Personally, I hope they have the specs to make 1080p an attainable target. Even if 1080p isn't in all games, which it won't be, at least the games that don't make it will likely be greater than 720p.

OK, I am not a programmer, but I will try to explain this concept.

In gaming, memory performance (mainly latency), floating-point performance, and integer performance are the key factors for the performance of a CPU.

For example, floating-point operations (flops) are important to push the geometry, while integers are important for AI.

RISC CPUs like the Cell have very high floating-point performance but lower integer performance, while traditional x86 CPUs have high integer performance but lower floating-point performance.

For GPUs, the most important factor is floating-point performance, since it is a specialized processor.

When people are saying things like "X teraflops won't be enough," etc., they are probably coming from the observation that, for example, Epic said that to run UE4 with the best settings you need a system that generates X flops.
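For what it's worth, this is how a theoretical peak-flops figure is usually derived; every number below is an assumed placeholder, not a leaked spec:

Code:
# Theoretical peak = units x clock x FLOPs per unit per cycle.
cpu_cores, cpu_ghz, flops_per_cycle = 8, 1.6, 8     # assumed figures
cpu_gflops = cpu_cores * cpu_ghz * flops_per_cycle  # ~102 GFLOPS
shader_units, gpu_ghz = 1280, 0.8                   # assumed figures
gpu_gflops = shader_units * gpu_ghz * 2             # 2 = fused multiply-add
print(f"CPU peak: {cpu_gflops:.0f} GFLOPS, "
      f"GPU peak: {gpu_gflops / 1000:.1f} TFLOPS")
# Real games never hit these peaks; they're a ceiling, not a promise.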
 

UrbanRats

Member
That's what I'm thinking too, although they did try to say the same for this gen.

The 7970M alone could probably run 1313, though probably not with all of the effects. We don't know exactly what it was running on, so it's hard to tell. We also don't know for sure if the GPU/APU combo is happening or if it is just for the dev kits. Personally, I think/hope it is.

We know it was running on very high-end PC hardware, so supposedly something like a 680.
Though I guess it's still far from the optimization phase.
 
OK, I am not a programmer, but I will try to explain this concept.

In gaming, memory performance (mainly latency), floating-point performance, and integer performance are the key factors for the performance of a CPU.

For example, floating-point operations (flops) are important to push the geometry, while integers are important for AI.

RISC CPUs like the Cell have very high floating-point performance but lower integer performance, while traditional x86 CPUs have high integer performance but lower floating-point performance.

For GPUs, the most important factor is floating-point performance, since it is a specialized processor.

When people are saying things like "X teraflops won't be enough," etc., they are probably coming from the observation that, for example, Epic said that to run UE4 with the best settings you need a system that generates X flops.

See it like a currency (resources) devs can use to buy (implement) their features. X amount of teraflops means every dev has an X-teraflop budget they can't overspend without getting performance problems (lower fps, tearing, etc.).
 

RoboPlato

I'd be in the dick
OK, I am not a programmer, but I will try to explain this concept.

In gaming, memory performance (mainly latency), floating-point performance, and integer performance are the key factors for the performance of a CPU.

For example, floating-point operations (flops) are important to push the geometry, while integers are important for AI.

RISC CPUs like the Cell have very high floating-point performance but lower integer performance, while traditional x86 CPUs have high integer performance but lower floating-point performance.

For GPUs, the most important factor is floating-point performance, since it is a specialized processor.

When people are saying things like "X teraflops won't be enough," etc., they are probably coming from the observation that, for example, Epic said that to run UE4 with the best settings you need a system that generates X flops.
I know what teraflops mean, but I was more asking about what kind of visuals they're expecting from next gen, since what I'm expecting seems to be attainable at 1080p without breaking the bank.
 
That's a pretty awkward/incorrect analogy. TFLOPS can mean raw shader core throughput when talking about the GPU, or it can mean raw floating point performance for vectorized SIMD on a CPU. "AI" code is no more integer than floating point, if anything it's more the latter since AI is often dependent on sphere/ray queries, which are all floating point based. If you were going to characterize AI in any way it's usually that it is *branchy* with *poor memory access patterns*. This is where out-of-order CPUs like modern x86 shine, as opposed to the massive L2 miss penalty on contemporary Power architectures, bad pipeline flush penalty, etc.

When Epic was discussing TFLOPS they were specifically talking about the shader core throughput necessary to run the voxel cone tracing technique that they're advancing for UE4's rendering.
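A toy illustration of that branchy-vs-throughput distinction (the agent/world API is hypothetical, not from any real engine):

Code:
def ai_think(agent, world):
    # Branchy, data-dependent control flow with scattered memory access:
    # hard to vectorize, and brutal on in-order cores with long pipelines.
    if agent.can_see(world.player):
        if agent.health < 0.25:
            return agent.flee(world.nearest_cover(agent.pos))
        return agent.attack(world.player)
    return agent.patrol(agent.route.next_waypoint())

def scale_positions(xs, ys, s):
    # FLOP-heavy streaming loop over contiguous arrays: exactly the shape
    # of work SIMD units and GPU shader cores are built for.
    for i in range(len(xs)):
        xs[i] *= s
        ys[i] *= s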
 

KageMaru

Member
If the PS4 really will use an x86 multicore CPU, it will be the other way around for the first time: the PS4 would be easier to program for than the Xbox 720. We know next to nothing about what kind of CPU the next Xbox will use, but if the Xbox 720 uses a PowerPC-derived CPU, we have two possible scenarios. In the worst case it could be an in-order multicore CPU like Xenon, meaning total crap to program for in comparison with the x86 PS4 CPU; in the best case it could be an out-of-order PowerPC CPU, meaning that it won't be a bitch to program for, but it surely won't be easier to program for than a traditional x86 CPU.

Aspects such as compilers, debugging tools, performance analyzers, and other tools matter just as much as architecture, if not more. If the tools allow developers to get the job done, regardless of the architecture, that will make a difference in the end. Though I agree that x86 would be preferred by most developers, since it's well documented and people have plenty of experience with it.

I don't think MS is going to use anything but an OoO core since they originally wanted this in the 360 but never got it.

Basically whichever of the two is easier to develop for and receives more dev support will see the better return. Question is, out of the three, which do you think will pull ahead?

So quick question: when people are saying things like "X teraflops won't be enough," etc., what exactly are they aiming for? I was just looking through the high-res PC screen thread and someone posted some absolutely jaw-dropping screens of the new BF3 DLC. If we could get quality like that at 1080p/30fps next gen I would be incredibly happy. Same with Crysis 2 on Ultra.

Personally, I hope they have the specs to make 1080p an attainable target. Even if 1080p isn't in all games, which it won't be, at least the games that don't make it will likely be greater than 720p.

Having specs that could attain 1080p is already possible; the PS360 can render games at 1080p. They'll just look like last-gen games or worse.

1080p may see more support next gen, but expect developers to go back to 720p in order to have more time per pixel. No matter how good games look, resolution will always be one of the first things to be sacrificed for more performance.
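The "time per pixel" trade-off in rough numbers (illustrative arithmetic only):

Code:
# Per-pixel GPU time budget at 30 fps for 720p vs. 1080p.
frame_ms = 1000 / 30
for name, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    pixels = w * h
    ns_per_pixel = frame_ms * 1e6 / pixels
    print(f"{name}: {pixels:>9,} px, {ns_per_pixel:5.1f} ns per pixel")
# 1080p has 2.25x the pixels, so each pixel gets 2.25x less GPU time.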
 
Hammer24 said:
Let's see if I can predict the future:
1. You'll keep on posting walls of text with no info relevant to the next console gen whatsoever, until the specs for Durango and Orbis get announced.
2. When the specs are out, we'll have a Durango that is more powerful than the Orbis. You will continue to "educate" the dirty masses on why this just seems so, as the silicon in the Orbis is so much more advanced that it'll only be a question of time until devs unleash the full power of the mighty APU to surpass what's possible on the Durango.
3. The first 3rd-party multiplats will arrive, and look and run significantly better on Durango. You'll tell everyone that devs are just too lazy, as the Orbis is capable of so much more.
4. Time goes on, the situation won't change, and people will call you out. And then the meltdown will come.
We've seen all this before.

Go troll somewhere else.
 

Karak

Member
If the PS4 really will use an x86 multicore CPU, it will be the other way around for the first time: the PS4 would be easier to program for than the Xbox 720. We know next to nothing about what kind of CPU the next Xbox will use, but if the Xbox 720 uses a PowerPC-derived CPU, we have two possible scenarios. In the worst case it could be an in-order multicore CPU like Xenon, meaning total crap to program for in comparison with the x86 PS4 CPU; in the best case it could be an out-of-order PowerPC CPU, meaning that it won't be a bitch to program for, but it surely won't be easier to program for than a traditional x86 CPU.

Since everyone seems to ignore that IGN usually sucks at rumors, we can bring another of their stories into this.

IGN did their developer poll, and the overwhelming majority indicated that the Durango was easier, or would be easier, to work on than the PS4. Take that as you will.

http://www.ign.com/articles/2012/06/01/the-next-generation-according-to-game-developers

1080p may see more support next gen, but expect developers to go back to 720p in order to have more time per pixel. No matter how good games look, resolution will always be one of the first things to be sacrificed for more performance.


I am probably in the minority on this, but I am OK with that. Resolution is one thing, but for particular games the effects that make the game world look interesting and react interestingly are more important than the resolution.
 
Since everyone seems to ignore that IGN usually sucks at rumors, we can bring another of their stories into this.

IGN did their developer poll, and the overwhelming majority indicated that the Durango was easier, or would be easier, to work on than the PS4. Take that as you will.

http://www.ign.com/articles/2012/06/01/the-next-generation-according-to-game-developers




I am probably in the minority on this, but I am OK with that. Resolution is one thing, but for particular games the effects that make the game world look interesting and react interestingly are more important than the resolution.

Not that I'm doubting its veracity, but I wonder who they asked? The sample size (35) is pretty small.
 

Karak

Member
Not that I'm doubting its veracity, but I wonder who they asked? The sample size (35) is pretty small.

Large enough to get a grouping, and more than just MS's internal ones, hahahaha.
Personally I doubt everything IGN posts, but others seem to think that IGN is a basin of insider knowledge. So I figured I might as well throw it in here.
 

StevieP

Banned
So quick question: when people are saying things like "X teraflops won't be enough," etc., what exactly are they aiming for? I was just looking through the high-res PC screen thread and someone posted some absolutely jaw-dropping screens of the new BF3 DLC. If we could get quality like that at 1080p/30fps next gen I would be incredibly happy. Same with Crysis 2 on Ultra.

Personally, I hope they have the specs to make 1080p an attainable target. Even if 1080p isn't in all games, which it won't be, at least the games that don't make it will likely be greater than 720p.

As KageMaru said, I still see 720p being more common, especially when you're constrained by TDP and cost on a console (where you are less so on a PC). Even when you take into account APIs and OSes, BF3 on Ultra at 1080p running at any kind of respectable framerate (especially in multiplayer) requires a lot of hardware grunt that... well, you're not going to see in a console for a long while.

...at 1080p? Sony do love checking off bullet points, and I bet they'd love to say Orbis does full HD gaming.

Sony used that card with the PS3.

Not that I'm doubting its veracity, but I wonder who they asked? The sample size (35) is pretty small.

35 members of 343 Industries? lol
MS is really good at documentation though, in comparison to Sony and Nintendo - even with more standardized hardware in this upcoming round. That, and the gobs of memory.

Karak said:
IGN is a basin of insider knowledge. So I figured I might as well throw it in here.

IGN has real, legitimate sources. It seems their understanding of technology, or whatever "second-hand information" they are told, is sometimes lacking or intentionally obfuscated, though.
 

Karak

Member
MS is really good at documentation though, in comparison to Sony and Nintendo - even with more standardized hardware in this upcoming round. That, and the gobs of memory.

IGN has real, legitimate sources. It seems their understanding of technology, or whatever "second-hand information" they are told, is sometimes lacking or intentionally obfuscated, though.


I am assuming the same thing in regard to the devs' responses. Though I was surprised at the large majority. Then again... IGN.

Well, that is an interesting concept, to be sure, about their reasons for all the mistakes. And many times their sources have been actual NeoGAF threads. Good for them for following us, but bad on them for not sourcing :)
 

coldfoot

Banned
I just hope they realize the cost savings by omitting DACs and make the video output HDMI-only with a 720p minimum. HDMI + a combined optical/stereo output jack should be the only 2 A/V ports on the back of the device. I'm not expecting 1080p standard, but I surely hope there is a mandatory 720p minimum enforced this time; it will also make video scaling much simpler.
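On the scaling point, the arithmetic behind a 720p floor (the sub-720p source size below is a hypothetical example):

Code:
# Scale factors needed to reach a 1080p output.
target_w, target_h = 1920, 1080
for src_w, src_h in [(1280, 720), (1024, 600)]:
    print(f"{src_w}x{src_h} -> x{target_w / src_w:.3f}, "
          f"y{target_h / src_h:.3f}")
# 1280x720 scales by a clean, uniform 1.5; a sub-720p buffer needs
# uneven, non-uniform ratios, which complicates the scaler.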
 

Karak

Member
I just hope they realize the cost savings by omitting DACs and make the video output HDMI-only with a 720p minimum. HDMI + a combined optical/stereo output jack should be the only 2 A/V ports on the back of the device. I'm not expecting 1080p standard, but I surely hope there is a mandatory 720p minimum enforced this time; it will also make video scaling much simpler.

I would be really surprised to see the lack of composite outs.
 

Globox_82

Banned
Latest PC GPUs '24x More Powerful Than Xbox 360 GPU' - Nvidia Engineer

Tom Hopkins

Next-gen consoles have a lot of catching up to do to the latest PC graphics cards, according to an Nvidia engineer.

Published on Jun 14, 2012

The latest PC graphics cards are already "24 times more powerful than the one in the Xbox 360," according to a new report.

An exploration of constantly-improving current-gen console graphics by New Scientist features the comparison by Nvidia's principal engineer Simon Green.

"One day, the whole idea of owning a separate piece of hardware to play games on might be completely redundant," Green suggests.

Reasons graphics keep improving on the same old tech include greater artistic investment and the continued development of new graphical tricks, according to the article, which concludes that "even the next generation of consoles is unlikely to match" the high-end PCs that recently demonstrated Epic's Unreal Engine 4.

The Unreal Engine 4 demo was reportedly shown running on a single Nvidia GeForce GTX 680 card, though the latest and most powerful Nvidia card is the GTX 690, essentially two GTX 680s stuck together. It costs £900.

It's unclear which card Green was referring to, but we'd like to think Xbox Next and PS4 will approach the graphical power of the Unreal Engine 4 demo when they arrive in the next couple of years.
http://www.nowgamer.com/news/143050...werful_than_xbox_360_gpu_nvidia_engineer.html
 
Oh my god! Sony is copying the Wii U by releasing a controller with a screen, only it's the size of a small TV. Take that, Nintendo!


It's a patent, so it must be real, right? ...Right?
Jesus H.... your constant trolling of jeff is becoming really old. Sure, you might not agree with what he writes, but at least he's putting some research into his posts and he provides cites.

There's no need for the constant bashing. At least jeff's trying to contribute something interesting and worthy of discussion, unlike your constant shitting on him.

He steal your bike as a kid or something?
 

KageMaru

Member
Jesus H.... your constant trolling of jeff is becoming really old. Sure, you might not agree with what he writes, but at least he's putting some research into his posts and he provides cites.

There's no need for the constant bashing. At least jeff's trying to contribute something interesting and worthy of discussion, unlike your constant shitting on him.

He steal your bike as a kid or something?

Fine fine, I'll back off. Didn't realize it was that bothersome to people, my apologies.
 
Aspects such as compilers, debugging tools, performance analyzers, and other tools matter just as much as architecture, if not more. If the tools allow developers to get the job done, regardless of the architecture, that will make a difference in the end. Though I agree that x86 would be preferred by most developers, since it's well documented and people have plenty of experience with it.

I don't think MS is going to use anything but an OoO core since they originally wanted this in the 360 but never got it.

Basically whichever of the two is easier to develop for and receives more dev support will see the better return. Question is, out of the three, which do you think will pull ahead?

This is not an easy question; one would think that a customized OoO PowerPC CPU would be a very powerful solution, but I would not underestimate the AMD solution either.
 

Karak

Member
Fine fine, I'll back off. Didn't realize it was that bothersome to people, my apologies.

Don't stress it. People come in late and don't read. They see your response and think it's unwarranted. Their problem, not yours.

But you could stop, and instead of that you could explain something:

Does a system need more RAM if it has dual GPUs?
 