
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash


onQ123

Member
I agree. One of the big unreported stories of this gen is the fact that in 2008, with KZ2 and Uncharted 2, the PS3 was way ahead. But today, everybody else has caught up.

You could really tell with KZ3: GAF exploded with every piece of KZ2 news and the internet was laden down with GIFs. For KZ3 the hype was 1/10 as much, and it could easily be boiled down to the fact that it was not a major graphical enhancement but basically looked like KZ2, which by that time was not leaps and bounds ahead of other shooters.


[image: Beyond E3 media]
 
Oh, that? I was referring to how the current advantages and disadvantages seem like they're going to be reversed between those two manufacturers for the PS4 and Durango.

You mean info points to a better GPU in the PS4 this time around, and a better CPU in the next 360? What do the up-to-date rumors say about Durango's CPU? I didn't think there was any rumor info on it like we have with the PS4 and its Steamroller-based CPU.
 
I like the realistic look that Heavy Rain brings. Even though it isn't technically as good as some other games, the world still ends up having that old-school feel, and at the end of the day I just like it better.

Beyond seems to continue that.
 

StevieP

Banned
You mean info points to a better GPU in the PS4 this time around, and a better CPU in the next 360? What do the up-to-date rumors say about Durango's CPU? I didn't think there was any rumor info on it like we have with the PS4 and its Steamroller-based CPU.

Note: grapevine only. Grain of salt etc

8 cores at a lower speed (i.e. similar to Atom/Bobcat/Jaguar/ARM-class stuff). Said to be x86.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
I agree. One of the big unreported stories of this gen is the fact that in 2008, with KZ2 and Uncharted 2, the PS3 was way ahead. But today, everybody else has caught up.

You could really tell with KZ3: GAF exploded with every piece of KZ2 news and the internet was laden down with GIFs. For KZ3 the hype was 1/10 as much, and it could easily be boiled down to the fact that it was not a major graphical enhancement but basically looked like KZ2, which by that time was not leaps and bounds ahead of other shooters.

KZ2 is still better looking than 95% of console shooters. Of course a sequel two years later on the same engine does not have the same impact. It did add MLAA, 3D, split-screen, and better frame rates. We are still waiting for an HD Halo game; fourth one's the charm?
 
In my opinion, the PS3 has begun to stagnate with regard to graphical fidelity; Killzone 2 and Uncharted 2 look as good as anything I've seen released on the PS3, whereas in games on the 360 I continue to notice vast improvements.

I agree. One of the big unreported stories of this gen is the fact that in 2008, with KZ2 and Uncharted 2, the PS3 was way ahead. But today, everybody else has caught up.

You could really tell with KZ3: GAF exploded with every piece of KZ2 news and the internet was laden down with GIFs. For KZ3 the hype was 1/10 as much, and it could easily be boiled down to the fact that it was not a major graphical enhancement but basically looked like KZ2, which by that time was not leaps and bounds ahead of other shooters.

KZ3 was still a big technical improvement over KZ2. The Digital Foundry tech article goes over the graphical improvements compared to KZ2 in detail. There's also a very good interview with one of the dev team's programmers on how they got much more out of Cell.

You would also have to be BLIND not to see the improvement in graphics between UC2 and UC3. A lot of people may disagree on whether UC3 was a better game than UC2, but no one disputes that the graphics in UC3 were much improved. Nothing has been stagnant with regard to graphical fidelity and PS3 games. Just look at what they did with BF3, and at upcoming PS3 exclusives like The Last of Us, GoW: A, and Beyond: Two Souls.
 

patsu

Member
Been there, done that:


DERP!

http://www.pcgameshardware.com/aid,...on-of-graphics-cards-compared/Reviews/?page=2

Peak power consumption from a full system (GTX280, AMD X2, 1GB RAM, etc...): 274W.

Peak.

Ah, the minimum 550W is the power supply rating. And since your article is from 2008, you should use the PS3 Slim for reference, which uses about 97W max.

The Xbox 360 never overheated. RROD was due to engineering a motherboard to pre-2006 standards and then manufacturing it lead-free under the 2006 RoHS rules. And even then, it had lower power consumption than the PS3. Google it.

Xbox 360 did overheat. As for blaming on RoHS, here's what can be said:
http://en.wikipedia.org/wiki/Xbox_360_technical_problems

Most failures are the result of the use of the wrong type of lead-free solder.[7][8] Microsoft used lead-free solder as a result of a European Union mandate,[9] which critics say was imposed despite doubts about its environmental benefits. However, the whole industry switched to lead-free solder without it causing the kinds of problems Microsoft has seen.[10] There has been legal action attempting to hold Microsoft responsible for the Xbox 360's failure rate and to provide compensation for those affected.[11]

Regarding power consumption, the PS3 Slim now uses 72W (120GB) to 83W (320GB) playing FFXIII according to the wiki. The 360 (Valhalla) uses about 80W playing games according to here

However this is system wide usage (Blu-ray vs DVD, HDD size difference, etc.). It will vary depending on the application used.

When talking about CPU power dissipation, they typically normalize the benchmark (e.g., frequency/FLOP per watt).

Why do you think I didn't read a LOT about Cell?

*shrug*

It was released in almost 2007 and it's manufactured on then-current lithography.
It wasn't designed in early 2000. The project STARTED in early 2000. Come on, guy. Give me a break.

It was a 5 year project. Technical investigation and initial design concepts were explored, investigated and revised continuously over those years.

Still makes no sense. Let's talk about black boxes. Inside one, just a Cell. In the other, a CPU+GPU+motherboard+RAM+HDD+power supply. The first one will not work at all.

Yes, your example makes no sense. Why would you want to compare these two things?

The GPU architecture is optimized for embarrassingly parallel problems. It doesn't matter what other components are used. A GPU is used mainly for running graphics and physics jobs, but not for running Java or a web browser.

DMA is outdated as hell. Think about it: a fully cache-coherent Cell with a shared pool for all the SPEs. That sounds powerful; DMA and local store sound... outdated.

Pipelining and DMA are both ancient concepts and remain in active use today. DMA is critical in I/O. What you probably meant is cache management (cache coherency). Yes, Cell lacks cache management. A modern implementation will have to revisit this topic to enable more flexible LocalStore usage.

Uncharted is a single program compiled into a single executable with exclusive access to all PS3 resources. A single instance aware only of the XMB.

Doesn't mean anything. You can say the same for other programs, "Halo is a single program compiled in a single executable with exclusive access to all 360 resources. A single instance aware of just Xbox environment."

You were wrong claiming that you can't have multiple people working on the same Cell project.

FWIW, a Cell program consists of multiple programs because the SPU and PPU load different binaries. It's just packed into one bundle for easy management. There is also the underlying hypervisor like 360's kernel.


Can it run Java and JavaScript? How about the DLNA network protocol? Can it read from a disk directly?

A GPU is not as flexible as a CPU after all. Not unless you have CPU-like cores inside. The nextgen consoles may combine both though.

You don't like to play this.

How many Cells does it have?

Their marketing says it's based on the BlueGene architecture. So perhaps they retained some Cell design philosophy inside? It's PowerPC. Definitely not GPGPU though. The GPGPU architecture is still too specialized compared to a CPU, *if* they can get the CPU performance high enough. It would be interesting to see how vendors combine the two.

A fully clocked 8800GT has the very same power consumption as a 7900GTX. Did your PS3 burn your house down?

Nope, why would it burn my house when the 8800GT is in your house? Not unless you can time travel back to 2005 and put your fully clocked 8800GT inside a PS3. ^_^

In *2005*, before Cell launched, 8800GT was not ready for small boxes yet. It would be too hot. Now 7-8 years later, of course you can put a shrunken version in small boxes today.

360 will run bad code better than PS3 too.

This may be true. Then again, the reverse is also true: if the code is optimized for Cell, it will fly like no other.

OK, so Cell is not more powerful than Xenon either. Period.

Show me a Xenon that can decode 3D Blu-ray and run Java at the same time.
 

Reiko

Banned
KZ2 is still better looking than 95% of console shooters. Of course a sequel two years later on the same engine does not have the same impact. It did add MLAA, 3D, split-screen, and better frame rates. We are still waiting for an HD Halo game; fourth one's the charm?

I know you're in the loop... but Halo 4 is 720p native with FXAA.

Anyway, this isn't the topic for this.
 
Note: grapevine only. Grain of salt etc

8 cores at a lower speed (i.e. similar to Atom/Bobcat/Jaguar/ARM-class stuff). Said to be x86.

So twice as many cores as the rumored PS4 one, but if the PS4 sticks with Steamroller CPU cores it would be able to do more per core? Do they both have two threads per core?
 

jaypah

Member
Can we stop this shit, please? This is a thread about the PS4, not about which games look better on the 360.

Or how PS3 games beat those games. Shit's stupid. I guess the rumors have dried up. I want to hear about Sony's next camera device.
 

StevieP

Banned
So twice as many cores as the rumored PS4 one, but if the PS4 sticks with Steamroller CPU cores it would be able to do more per core? Do they both have two threads per core?

Well, you can compare Bobcat to Bulldozer currently if you want an idea. There are probably plenty of those comparisons on the net. Of note, there may be some cores on the Xbox reserved for background tasks and/or Kinect processing.

I want someone to explain to me how 2GB of GDDR5 is better than 4GB of weaker RAM.

Seems like at the end of the day the 4GB would win out somehow.

It's 8GB of RAM that you're comparing it to, though it's not "weaker" per se. It's certainly slower and with less bandwidth, however.

Compare a GeForce 650M with DDR3 to the models with GDDR5, for example.
 
Something needs to leak.

Supposedly they were getting new dev kits around E3. I figure there's a few different possibilities why nothing's leaked about it yet.

A. New dev kits didn't end up releasing around E3 and were delayed
B. There were no major changes in this revision
C. They were just able to keep a better lid on things this time around / no one's cared to leak anything
D. A combination of B and C
E. None of the above (then explain why)

what do you guys think?
 

Reiko

Banned
Supposedly they were getting new dev kits around E3. I figure there's a few different possibilities why nothing's leaked about it yet.

A. New dev kits didn't end up releasing around E3 and were delayed
B. There were no major changes in this revision
C. They were just able to keep a better lid on things this time around / no one's cared to leak anything
D. A combination of B and C
E. None of the above (then explain why)

what do you guys think?

C: But I believe something will change.
 

Rolf NB

Member
Folding@Home PPD is not a measure of work done. PS3 PPD was deliberately and arbitrarily nerfed at some point, because the F@H project thought that was cool.
You mean info points to a better GPU in the PS4 this time around, and a better CPU in the next 360? What do the up-to-date rumors say about Durango's CPU? I didn't think there was any rumor info on it like we have with the PS4 and its Steamroller-based CPU.
Nothing points to anything. People just like talking out of their asses.
 
I want someone to explain to me how 2GB of GDDR5 is better than 4GB of weaker RAM.

Seems like at the end of the day the 4GB would win out somehow.

No, at least not when it comes to games and their visuals. To render a game you have a certain amount of time for each frame in order to maintain a steady framerate. Producing the frame takes a certain amount of bandwidth, and of course the previous/ongoing operations to create/modify/enhance the frame take memory and bandwidth too.

If you fail to complete your rendering in time (too much data for the given bandwidth), you have to scale down or your framerate drops. So you could have plenty of memory, but without the proper speed it is useless for your graphics task, because by the time you have filled X GB your frame should already be done. For non-GPU tasks more memory has an advantage in my eyes, and I strongly believe that if the 720 has so much more memory, it is for purposes other than graphics. Sorry for the rough explanation; I guess my English is not good enough to really explain in depth how it works, and hopefully some pro will read it and correct my (translation) mistakes.
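The frame-budget idea above can be put in rough numbers with a short sketch. The 176 GB/s and 60 fps figures below are purely illustrative of a GDDR5-class setup, not rumored console specs:

```python
# Each frame gets a fixed slice of time and, with it, a fixed slice of
# total memory bandwidth. RAM beyond what can be touched in that slice
# doesn't help the frame itself.
def frame_budget(bandwidth_gb_s, fps):
    frame_time_ms = 1000.0 / fps         # time available per frame
    gb_per_frame = bandwidth_gb_s / fps  # data movable per frame
    return frame_time_ms, gb_per_frame

ms, gb = frame_budget(176.0, 60)
print(round(ms, 2), round(gb, 2))  # 16.67 ms, ~2.93 GB per frame
```

In other words, extra capacity helps non-graphics tasks, but per-frame rendering is bounded by bandwidth, which is the poster's point.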

My ideal setup would be 4GB of GDDR5 in the PS4. I would take XDR2 as well, but I don't want to end up with a console without RAM and instead a voucher for it in 2013/14...
 
Ah, the minimum 550W is the power supply rating. And since your article is from 2008, you should use the PS3 Slim for reference, which uses about 97W max.

You STATED that the GTX280 needs a 550W PSU alone as a bare minimum:

the GeForce 280GTX GPU alone required a 550W power supply *at least*.

That's *NOT* true. I showed you reviews that demonstrate a PEAK total PC usage of 275 watts. And the PS3 Slim released in September 2009, so the fat model applies here.

Let's do some math about efficiency:

GTX280 at folding: 6530/275W = 23.75 (not even counting CPU scores, just the GPU).
PS3 at folding: 900/135W for the non-slim fat model = 6.67. (RSX can't run folding.)

A GTX280-based PC is 3.5 times more efficient at this job than a Cell PS3.
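For what it's worth, the division itself checks out; here is a quick sketch using the poster's own figures (the PPD and wattage numbers are theirs, not independently verified, and as noted elsewhere in the thread, F@H PPD is a shaky unit of work):

```python
# Folding@Home points-per-day (PPD) per watt, using the figures quoted above.
gtx280_ppd, gtx280_watts = 6530, 275  # PPD vs whole-PC peak draw
ps3_ppd, ps3_watts = 900, 135         # PPD vs fat-model PS3 draw

gtx280_eff = gtx280_ppd / gtx280_watts  # ~23.75 PPD/W
ps3_eff = ps3_ppd / ps3_watts           # ~6.67 PPD/W
print(round(gtx280_eff / ps3_eff, 2))   # ~3.56x in the GTX280 PC's favor
```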

Enough of this Cell godlike shit.


Xbox 360 did overheat. As for blaming on RoHS, here's what can be said:
http://en.wikipedia.org/wiki/Xbox_360_technical_problems

The Xbox 360 does NOT overheat. It has nothing to do with that at all. RROD was due to poor-quality solder. Quality lead-free solder is more expensive. MS was greedy enough to use low-quality solder, just as they used low-cost capacitors. The fat 360 is way better engineered than the fat PS3. The fat PS3 is way better manufactured than the fat 360.

Regarding power consumption, the PS3 Slim now uses 72W (120GB) to 83W (320GB) playing FFXIII according to the wiki. The 360 (Valhalla) uses about 80W playing games according to here

However this is system wide usage (Blu-ray vs DVD, HDD size difference, etc.). It will vary depending on the application used.

This is the most hilarious thing I have read in weeks. You are just unable to read a mere table. That 11W difference depending on hard drive size alone was just too funny to be believable, since we are not talking about fat models or different chips. A 2.5" hard drive, such as the PS3's, peaks at 2.5W. But you actually believe there is an 11-watt difference between hard drives based on their capacity. Can't stop laughing.

I won't even get into using data from different sources for power consumption.

When talking about CPU power dissipation, they typically normalize the benchmark (e.g., frequency/FLOP per watt).

I noticed a while ago that you have no idea at all. I will teach you something. TDP is not power consumption, and it has nothing to do with benchmarks. TDP is the amount of dissipated heat a cooling system has to deal with in a worst-case scenario. More on this later.




Yes, yes, my poor Cell evangelist.



It was a 5 year project. Technical investigation and initial design concepts were explored, investigated and revised continuously over those years.

So you agree Cell is not a chip designed in 2000. Hope you never say that again, then.


Yes, your example makes no sense. Why would you want to compare these two things?

Because they are systems. A PC with a GPU+CPU is a system. A PS3 with Cell+RSX is a system. Cell alone does nothing, like any given CPU.

The GPU architecture is optimized for embarrassingly parallel problems. It doesn't matter what other components are used. A GPU is used mainly for running graphics and physics jobs, but not for running Java or a web browser.

You know what? A GPU can render a browser. You know what? Cell can't feed an image to your TV, Cell doesn't have a DAC to feed audio to your speakers, and Cell can't hold data because it has no room for it. Not only that, my cat can't bark. Just more nonsense about which device does which duty in a system.


Pipelining and DMA are both ancient concepts and remain in active use today. DMA is critical in I/O. What you probably meant is cache management (cache coherency). Yes, Cell lacks cache management. A modern implementation will have to revisit this topic to enable more flexible LocalStore usage.

You use DMA for external devices like hard drives, not inside a high-performance processor. End of story. Any modern multicore CPU needs a shared pool of cache. The more, the better.


Doesn't mean anything. You can say the same for other programs, "Halo is a single program compiled in a single executable with exclusive access to all 360 resources. A single instance aware of just Xbox environment."

My point is that Cell, as a single-core CPU, will lose tons of performance running more than one program. Any multicore processor will do better.

You were wrong claiming that you can't have multiple people working on the same Cell project.

[image: Jackie Chan reaction meme]


FWIW, a Cell program consists of multiple programs because the SPU and PPU load different binaries. It's just packed into one bundle for easy management. There is also the underlying hypervisor like 360's kernel.

All of them PROFILED to work together and with working knowledge of the available resources. It's not like opening YouTube in Chrome and playing some MP3s at the same time.



Can it run Java and JavaScript? How about the DLNA network protocol? Can it read from a disk directly?

My 10 MHz phone can run JavaScript. My 300 MHz MIPS router CPU can run a DLNA media server. There is absolutely no point to this. Can you hook a speaker up to Cell so it can play some MP3s? Even better: can your almighty Cell read from a disk directly, without a southbridge? Genius.


Their marketing says it's based on the BlueGene architecture. So perhaps they retained some Cell design philosophy inside? It's PowerPC. Definitely not GPGPU though. The GPGPU architecture is still too specialized compared to a CPU, *if* they can get the CPU performance high enough. It would be interesting to see how vendors combine the two.

Cell is based on PowerPC, not the other way around. The future is the SoC. CPUs tend toward GPUs. GPUs tend toward CPUs. Cell is just a weak CPU with a massive die budget dedicated to FP.


Nope, why would it burn my house when the 8800GT is in your house? Not unless you can time travel back to 2005 and put your fully clocked 8800GT inside a PS3. ^_^

In *2005*, before Cell launched, 8800GT was not ready for small boxes yet. It would be too hot. Now 7-8 years later, of course you can put a shrunken version in small boxes today.

1. The PS3 has a dumbed-down 7900GTX.
2. The 8800GT and 7900GTX have the same TDP and similar power consumption.
3. ???
4. Profit!

G80 was released in 2006. The PS3 was released in late 2006. Sony decided to invest more in the CPU than the GPU; that's why they released the PS3 with a G72 derivative instead of a G80. Budget. Just as Microsoft threw in 256MB of RAM in exchange for an HDD.


This may be true. Then again, the reverse is also true: if the code is optimized for Cell, it will fly like no other.

Yup, and Xenon will run Xenon-optimized code like no other too. Another nonsense.


Show me a Xenon that can decode 3D Blu-ray and run Java at the same time.

Show me a BD drive you can hook up to an Xbox 360. In the meantime, you can play with the HD DVD unit. Any software-level codec will run on the PPE. Xenon has one PPE just like Cell's, plus two more with double the cache. Guess which one will run PPC code better?


I'm done with this childish talk. I'm done with guys posting a screenshot to 'prove' a CPU is better than another. I'm done with people calling others fanboys without having any working knowledge of microarchitectures. You want an in-depth Xbox 360 vs PS3 topic? Open a new one. You want a new Cell religion? Open a church.

I only wanted to say why I don't want a Cell, a Steamroller, or a shitty 8-core ARM CPU in the next-gen consoles. I wanted to say that a Southern Islands GPU is not an HD 7970 like those PCIe cards, just a derived chip. I've said it already, and I have nothing more to say.

Fafalada said:
You do realize that such hardware never existed in a console right?

The Xbox 360 at its release date was.

systemfehler said:
I would take XDR2 as well, but I don't want to end up with a console without RAM and instead a voucher for it in 2013/14...

Please NOT! Don't touch that shitty memory the patent troll Rambus made out of thin air.

About the GDDR5 vs DDR3 debate, you are forgetting to talk about the bus width. GDDR5 on a narrow bus will be cheaper, but will perform like DDR3 on a wider bus. If it's speed vs capacity, DDR3 on a 256-bit bus is more than enough for GPU work at 1080p, so 4GB of DDR3 over 2GB of GDDR5, no doubt.
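The bus-width trade-off described here is straightforward to quantify: peak bandwidth is the per-pin data rate times the bus width. The data rates below are illustrative of parts from that era (5.5 Gbps GDDR5, DDR3-2133), not console specs:

```python
def peak_bandwidth_gb_s(gbps_per_pin, bus_bits):
    # per-pin rate (Gbit/s) x bus width (bits) / 8 bits per byte
    return gbps_per_pin * bus_bits / 8

print(peak_bandwidth_gb_s(5.5, 256))    # GDDR5 on a 256-bit bus: 176.0 GB/s
print(peak_bandwidth_gb_s(5.5, 128))    # GDDR5 on a 128-bit bus:  88.0 GB/s
print(peak_bandwidth_gb_s(2.133, 256))  # DDR3-2133 on 256-bit:   ~68.3 GB/s
```

So GDDR5 on a narrow bus really does land in the same ballpark as DDR3 on a wide one, which is the cost/performance trade being argued.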
 
I hope you realize you repeated sections of your post. It was like scrolling through a whole page of posts just to get to the bottom of yours.

I don't understand why you continue this argument about Cell. Multiple people have told you Cell >>> Xenon. Both the 360 and PS3 have their strengths and weaknesses: the 360's is its GPU, while the PS3's is its CPU. This has been known for years now. I don't even understand how this is still debatable. This argument is from 2006.

As for GDDR5 vs DDR3: obviously 4GB of GDDR5 would be accompanied by a 256-bit bus as well. The question is whether DDR3 with 6-8GB of RAM and a 256-bit bus would be better than 4GB of GDDR5 on a 256-bit bus. Essentially, a lot more RAM vs a lot more bandwidth. That's the question.
 

Ashes

Banned
I on the other hand salute you jnr. A true trooper. May you survive the crazy throes of jnrhood into full membership.
 
I agree. One of the big unreported stories of this gen is the fact that in 2008, with KZ2 and Uncharted 2, the PS3 was way ahead. But today, everybody else has caught up.

You could really tell with KZ3: GAF exploded with every piece of KZ2 news and the internet was laden down with GIFs. For KZ3 the hype was 1/10 as much, and it could easily be boiled down to the fact that it was not a major graphical enhancement but basically looked like KZ2, which by that time was not leaps and bounds ahead of other shooters.

No other console game has topped GoW3, KZ2 or UC2 other than their sequels.
 
Regarding the memory choices: as a consumer I don't really care about patents and proprietary technology as long as it benefits me through price, speed, or quality. As an engineer my viewpoint is very different, but I won't go into that now. Sony worked with Rambus and XDR before. IF (and that is the problem) they now have an agreement for XDR2, which is "faster" (probably not cheaper) than GDDR5 and will be available in bulk, I don't see why Sony shouldn't go that route.

With 4GB of GDDR5, Sony hopefully won't cut costs with a smaller bus, because their engineers would simply waste the GDDR5 advantage. A lot of synthetic benchmarks really profit from more (graphics) memory bandwidth, but not so much from a size increase.

I am not expert enough (or maybe a bit lazy too *g*) to calculate the maximum bandwidth for DDR3, GDDR5, and XDR2, but I guess there is a reason why AMD uses GDDR5 for their top-tier GPUs while, as an example, my GeForce FX880m only has DDR3...

Wishful thinking:

4GB of XDR2 in the PS4, 50% faster than GDDR5 at the same power consumption. To minimize heat, Sony could adjust the clock speed/voltage of the XDR2 to reach GDDR5 speed with less power.
 
Wow... another Cell argument. I picked my way through it; lots of good points on both sides. I'd like to point out that the 1PPU4SPU patent is a redesign to use unified memory and properly support a coherent cache. Patsu pointed out that the Cell design is more secure, but dr. apocalipsis pointed out that UMA is the future; security will be an issue with unified memory and a shared L3 cache, thus an ARM A5 is needed as a TrustZone processor.

dr. apocalipsis also provided two links that I think everyone should examine:

1) Cell can decode 48 MPEG-2 streams at the same time.

2) This video of a 2010 Toshiba TV that uses the above. Listen for the plans for the TV to serve (stream multiple video channels to) multiple video platforms in the home (smart TV, tablets, phones).

Now read the Sony and Microsoft plans for the PS4 and Xbox 720; it sounds like exactly the same server model.

Sony was counting on AllVid being adopted in the US, which is a six-tuner head end with decryption to serve CE platforms in the home via the network and DTCP-IP encryption. Also in this is a move to the H.264 and H.265 codecs, which would allow 7 and 14 SD channels respectively per 6 MHz RF band/channel/tuner on cable (potentially 14 x 6 tuners = 84 channels selectable/viewable; watch the Toshiba TV demo again with this in mind). Microsoft has a tuner or tuners of its own in the Xbox 720 and just needs the cable company to allow access (provide encryption keys); Sony may be planning the same for the PS4.

The PS3 appears to have been designed to serve as a home media server (with an AllVid front end), hence Cell and the Gigabit network port, which is not needed unless it's assumed multiple HD video streams are going to be consuming bandwidth.

A GPGPU can decode HD video streams, but it can't easily switch streams on the fly, and while doing so it can't run games. So the Xbox 720 PowerPoint points to a design that does not use the GPU for media streaming/serving functions. It could use multiple dedicated hardware codecs, multiple AMD video decode modules, or perhaps two 1PPU4SPU modules; certainly Sony should do so for the PS4.

And we are back to the timing: the Toshiba TV demo (serving multiple channels to multiple TVs) is from early 2010, the Microsoft PowerPoint is from 9/2010, and the Sony 1PPU4SPU patent is from 9/2010 but was published in Dec 2010. Considering the lead time from design to implementation, it should show up soon.

In this diagram from the Xbox 720 PowerPoint, look at the blue box in the upper right: a video accelerator with 2 HD decoders, 1 HD encoder, DSP, and XMA. Look also at the entire right side in blue, as both the PS4 and Xbox 720 will have similar I/O hardware. Start with that as a given! It's necessary to support accessories and multiple monitors/4K monitors.

[image: Xbox 720 PowerPoint slide 9]


Also look at the SLC cache, which can provide a system cache and fast-boot non-volatile RAM. SLC (single-level cell) flash doesn't suffer from read-write wear as much as typical flash memory. If the 16GB coming in both the PS3 4000 and PS4 is SLC, then it could be in all PS3 4000 models and would allow much faster load times and low-power standby.
 
Man, the Cell argument took a turn for the worse after I logged off. This Dr guy is kind of clueless, huh?
Yes and no. He has good points but doesn't understand that the Cell design was the most cost-efficient given the hardware at the time. It is not the most efficient NOW; the AMD HSA design is, and Cell can't function in an HSA platform, but the 1PPU4SPUs can.

Toshiba and Sony have plans for Cell and the SPUs that revolve around media. Is this no longer true?
 

iceatcs

Junior Member
Wow, who cares about the Xbox? No one is buying them. Cell? No more, I assume, so why all the hating?
And for graphics you shouldn't be here; PC is the only way to go.


Anyway, hopefully the PS4 will have some exotic hardware, because I always get a PC and PlayStation combo. I want to see how different the two worlds are.
 
So you are assuming the 720 will have more RAM because they are integrating more stuff on the software side, and with Kinect and whatever, etc.

I know that is speculation

Excuse me for my ignorance, but I am trying to see in what scenario more RAM is better, and at what speeds. Obviously there are many situations with the CPU and GPU that would make things better or worse, but put it this way for those of us who are less versed:

PS4/720

4GB / 8GB
2GB / 4GB
4GB / 6GB


And add in the different memory types and why each is better or worse than the others. I know that is a lot of work, but I have noticed this topic of RAM comes up A LOT.

I think if someone can do a comparison of what is speculated, with each type of RAM, and give the pros and cons of each, surely us ignorant people will get it, lol.
 

needs more blurry ground textures

Anyway, this just reminds me of Heavy Rain: not really a traditional game judged by traditional game graphics standards. I hate cutscenes and non-gameplay angles as examples of game graphics. Show me an FPS like this and we'll talk.
 
Yes and no. He has good points but doesn't understand that the Cell design was the most cost-efficient given the hardware at the time. It is not the most efficient NOW; the AMD HSA design is, and Cell can't function in an HSA platform, but the 1PPU4SPUs can.

Toshiba and Sony have plans for Cell and the SPUs that revolve around media. Is this no longer true?

He has SOME good points. I lost him when he compared the GTX 280 to Cell. Even more so after he tried to discredit every advantage Cell has over other architectures.
 
He has SOME good points. I lost him when he compared the GTX 280 to Cell. Even more so after he tried to discredit every advantage Cell has over other architectures.

Microsoft probably hopes for nothing more than a Cell-based CPU in the PS4... which is why it won't happen. For starters, you'd have to speed it up by adding SPUs, let's say 32 of them.

That would be a programmer's nightmare. Tiny little processors with little cache. If they thought the PS3 was bad...
 
Microsoft probably hopes for nothing more than a Cell-based CPU in the PS4... which is why it won't happen. For starters, you'd have to speed it up by adding SPUs, let's say 32 of them.

That would be a programmer's nightmare. Tiny little processors with little cache. If they thought the PS3 was bad...

IIRC (I'm no tech expert), the cache each SPE had was adequate. It was the memory interface between the XDR and the Cell that was the bottleneck. If an extra-wide memory I/O were used today, that would resolve the problem (correct me if I'm wrong, Jeff).
 
Microsoft probably hopes for nothing more than a Cell-based CPU in the PS4... which is why it won't happen. For starters, you'd have to speed it up by adding SPUs, let's say 32 of them.

That would be a programmer's nightmare. Tiny little processors with little cache. If they thought the PS3 was bad...
Is the CPU even relevant when it comes to games only? On my notebook, Dead Island at 720p doesn't really care that my i7 is locked at 930MHz. So why wouldn't a Cell 1.5 with a (this time really) powerful GPU suffice for games and a media hub/OS?
 

KageMaru

Member
This went from a technology rumor thread to a stupid one in a matter of minutes.

Amazing how console warriors can turn a thread into shit, isn't it? =p

I know you're in the loop... but Halo 4 is 720p native with FXAA.

Anyway, this isn't the topic for this.

AgentP is one of the biggest trolls on here; give him no attention and he'll go away, just like a child.

The Xbox 360 does NOT overheat. It has nothing to do with that at all. RROD was due to poor-quality solder. Quality lead-free solder is more expensive. MS was greedy

It was a combination of things. Most RROD cases happened because the motherboard warped due to the heat and the X-clamps being fastened in a poor position, which pulled on the mobo. That warping caused issues with the heat sink, where it couldn't stay properly connected to the chip itself.

Not sure the lead-free solder was the core issue, since the PS3 had to use it as well, no?

No other console game has topped GoW3, KZ2 or UC2 other than their sequels.

So you know the buffer layout and technical make up of every game released this gen?

edit:

Is the CPU even relevant when it comes to games only? On my notebook, Dead Island at 720p doesn't really care that my i7 is locked at 930MHz. So why would a Cell 1.5 with a (this time really) powerful GPU not suffice for games and a media hub/OS?

Looking at performance on a PC isn't really an accurate way to determine how things would turn out in a console. Different games are programmed to take advantage of different types of CPUs. Some scale well with the number of cores while others do not, because of the number of configurations out there and developers working around a lowest common denominator.

As for the Cell 1.5, no matter how well it could work next gen, it will still take longer to produce results on that chip than on an x86 core, where developers have far more experience. Not to mention a DX11 GPU minimizes the need for a specialized chip like Cell. With costs likely rising even more next gen, developers need to get results, and fast, so an efficient chip will produce better results in the end.
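The scaling point can be made concrete with Amdahl's law: if only part of a frame's workload parallelizes, piling on cores gives diminishing returns. A minimal sketch (the 70% parallel fraction is an assumed illustrative figure, not from any console spec):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume 70% of a game's frame work parallelizes (illustrative figure):
for cores in (1, 2, 4, 8):
    print(cores, "cores ->", round(amdahl_speedup(0.7, cores), 2), "x")
```

Going from 4 to 8 cores here only lifts the bound from about 2.1x to 2.6x, which is why per-core speed and how well a given game's code scales both still matter.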
 
IIRC (I'm no tech expert), the cache each SPE had was adequate. It was the memory interface between the XDR and the Cell that was the bottleneck. If an extra-wide memory I/O were used today, that would resolve the problem (correct me if I'm wrong, Jeff).
Yup, and memory bandwidth for Fusion HSA SoCs is also going to be an issue, but there are now economical connection methods and custom memory.
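The bandwidth point boils down to bus width times effective transfer rate. A quick sanity check against PS3's known XDR figure (25.6 GB/s):

```python
def peak_bandwidth_gb_s(bus_width_bits, effective_mhz):
    """Peak bandwidth = bytes per transfer * transfers per second."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

# PS3's XDR: 64-bit bus at 3.2 GHz effective data rate
print(peak_bandwidth_gb_s(64, 3200))  # -> 25.6 (GB/s)
```

Widening the bus or stacking memory closer to the chip raises this number directly, which is what the "economical connection methods and custom memory" remark is getting at.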
 
Looking at performance on a PC isn't really an accurate way to determine how things would turn out in a console. Different games are programmed to take advantage of different types of CPUs. Some scale well with the number of cores while others do not, because of the number of configurations out there and developers working around a lowest common denominator.

As for the Cell 1.5, no matter how well it could work next gen, it will still take longer to produce results on that chip than on an x86 core, where developers have far more experience. Not to mention a DX11 GPU minimizes the need for a specialized chip like Cell. With costs likely rising even more next gen, developers need to get results, and fast, so an efficient chip will produce better results in the end.
Well, if ease of development is so important, combined with the tendency of the CPU to become less of a focus, why does it matter whether the APU has Steamroller, Piledriver or Bulldozer? Shouldn't any of them with a dedicated GPU be a big step up from Cell? Isn't the CPU the least of the problems if Sony gets a Sea Islands GPU but only a Brazos or Kabini APU? My point being, why care about Cell and friends so much?
 

drkohler

Banned
Is the CPU even relevant when it comes to games only? On my notebook, Dead Island at 720p doesn't really care that my i7 is locked at 930MHz. So why would a Cell 1.5 with a (this time really) powerful GPU not suffice for games and a media hub/OS?
Agreed. I think an improved 28nm Cell with either 3 PPUs/8 SPUs or 2 PPUs/12 SPUs (add some more memory if the die size remains <200mm²) would have done the job easily. As it seems, AMD was able to pull all three parties into their camp, and Sony even went for the APU thing (this could bite them in the a** in the end, given the turds AMD currently markets).
 

KageMaru

Member
Well, if ease of development is so important, combined with the tendency of the CPU to become less of a focus, why does it matter whether the APU has Steamroller, Piledriver or Bulldozer? Shouldn't any of them with a dedicated GPU be a big step up from Cell? Isn't the CPU the least of the problems if Sony gets a Sea Islands GPU but only a Brazos or Kabini APU? My point being, why care about Cell and friends so much?

The performance of the CPU still matters; you don't want games to be CPU-bound.
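With the CPU and GPU working in parallel on successive frames, the slower of the two sets the frame rate, which is what being CPU-bound means in practice. A toy illustration with hypothetical frame times:

```python
def frame_rate(cpu_ms, gpu_ms):
    """The slower of the two pipelines caps the achievable frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: a fast GPU is wasted if the CPU can't keep up.
print(round(frame_rate(cpu_ms=14.0, gpu_ms=10.0)))  # CPU-bound: 71 fps
print(round(frame_rate(cpu_ms=8.0, gpu_ms=10.0)))   # GPU-bound: 100 fps
```

In the first case, upgrading the GPU changes nothing; only a faster CPU (or less CPU work per frame) raises the frame rate.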
 