
DF/Eurogamer: First Xbox 3 Devkit leaks, 8 Core Intel CPU, nvidia GPU, 8-12GB RAM

Marco1

Member
They're called NDAs man. Sure someone could be making shit up, but the ones that aren't sure as hell aren't going to "prove" it online and lose their job just so you can get hype.

That said, I put more cred in some people than others. BG is one of those folks.

I agree.
We only have to look at MS at the moment to see what they're planning.
Kinect and the current dashboard aren't going anywhere, as they're too much like the Surface and Windows Phone 8 dash.
They are going to go after the living room next gen; I have no doubts about that. They are chasing what Apple and Sony have wanted for years.
Remember, whoever gets the living room first wins, but you won't need huge specs, and a noisy, power-hungry, huge box isn't going to get you there.
But a box that works seamlessly with Surface, your desktop and your phone will.
 
It's important that at least 4 GB of RAM is dedicated to games. But the GPU is equally important, and Durango rumors unfortunately point toward a weak solution. I hope MS will step up their game and not drop the ball with the GPU, because a weak one would be a really bad bottleneck for a gaming machine.
 
The uninformed posters are always what make these threads so fun.


These are the details I feel are near rock solid about the next Xbox.

- launching fall of 2013.

- it will be fully backwards compatible with Xbox 360 software.

- it will have at least 4GB of RAM dedicated to games. Another 2-4GB for the OS and/or other apps that can run concurrent with games.
 

big_erk

Member
I think being stuck with five year old technology/performance is a pretty big string.

While you may want to argue that the performance of the WiiU puts it behind the curve, I don't think the argument of five year old technology holds water.

1. The CPU was still being developed by IBM until very recently. If they were just going to use five year old tech, why waste the dev time.

2. The final GPU was still being worked on until recently. Once again, if they were just going to use five year old tech, why waste the time.

3. While you may not care for the display controller, the tech that is housed in it is very innovative and being used in ways that are new.

I think Nintendo is clearly trying to go for more bang for the buck, and I don't have a problem with that. They seem to be using their R&D to increase efficiencies rather than just raw power. That does not equate to old tech. It's just a different way to approach new tech. Is it the right approach? We won't know that until closer to launch.
 
Really, this early in the game it doesn't matter. MS has a target performance wise that they are aiming for. They put together kits that approximate what kind of performance you could expect. At this stage (talking about months ago...that hardware is from what...February?) it really doesn't matter as long as it falls within that performance target.

But that's what you seem to be missing. Like I said comparable components don't always produce the same performance. In a closed environment the devs don't want "surprises". They want to have a decent idea of what they are working on from the get go.

Oh yeah because your "source" that you won't even attempt to name is just infinitely more trust-worthy.

I hope you realize how asinine that sounds. If not think about it a little more.

The difference is that this dae guy can actually prove he has HW. He did this for the 360 alphas also and he still apparently has a job.

What part of "he guessed" are you not seeing? It would be different if the article said he was also told it had an Intel/nVidia combo or he opened up the kit himself. Early on we were hearing they were going with an IBM/AMD combo again and then we began hearing they switched to AMD to provide the CPU. Now one person guesses, which is clear in the article that's the case, that it's an Intel/nVidia combo which goes against everything that has been talked about up to this point and that's all of sudden the correct one? Come on now.
 

Durante

Member
If someone has access to a dev kit it should be trivial to determine the CPU it's using. Just run CPUID.
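For illustration, here's a minimal sketch of that check (assuming an x86 kit and a GCC/Clang toolchain with <cpuid.h>; a PowerPC-based kit obviously wouldn't expose CPUID at all):

```c
/* Minimal sketch: print the CPU vendor string via CPUID leaf 0.
   Assumes an x86/x86-64 target and GCC/Clang's <cpuid.h>. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;  /* CPUID not available */

    /* The vendor string is packed into EBX, EDX, ECX, in that order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);

    printf("CPU vendor: %s\n", vendor);  /* e.g. "GenuineIntel" or "AuthenticAMD" */
    return 0;
}
```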

But that's what you seem to be missing. Like I said comparable components don't always produce the same performance. In a closed environment the devs don't want "surprises". They want to have a decent idea of what they are working on from the get go.
Yeah, what he's missing is the well-known lack of performance portability with languages such as OpenCL and DirectCompute. While they do allow you to treat all manner of compute devices the same way on a superficial level, if you want to get good performance out of GPUs or accelerators you'll still need to perform hardware-specific optimization.
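As a concrete illustration of why, here's a minimal host-side sketch (assuming an OpenCL SDK and at least one GPU device are installed) of the per-device limits that drive that hardware-specific tuning; the same kernel source runs anywhere, but work-group shape and local-memory blocking have to be re-chosen around numbers like these:

```c
/* Minimal sketch: query the device limits that OpenCL tuning revolves around.
   Assumes an OpenCL SDK providing <CL/cl.h> and at least one GPU device. */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    cl_uint compute_units;
    size_t max_wg_size;
    cl_ulong local_mem;

    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS ||
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
        return 1;  /* no OpenCL platform/GPU found */

    clGetDeviceInfo(device, CL_DEVICE_MAX_COMPUTE_UNITS, sizeof(compute_units), &compute_units, NULL);
    clGetDeviceInfo(device, CL_DEVICE_MAX_WORK_GROUP_SIZE, sizeof(max_wg_size), &max_wg_size, NULL);
    clGetDeviceInfo(device, CL_DEVICE_LOCAL_MEM_SIZE, sizeof(local_mem), &local_mem, NULL);

    /* The kernel compiles everywhere, but these numbers differ per vendor and
       architecture, which is where the per-hardware re-tuning comes from. */
    printf("compute units: %u, max work-group size: %zu, local memory: %llu bytes\n",
           compute_units, max_wg_size, (unsigned long long)local_mem);
    return 0;
}
```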

3. While you may not care for the display controller, the tech that is housed in it is very innovative and being used in ways that are new.
What part of the technology in the controller is innovative? The resistive touch, the TN panel or the gyro? I guess the only claim that could plausibly be made is low-latency wireless streaming, but that's more of a software thing and also not really new.
 

Triple U

Banned
bgassassin said:
I hope you realize how asinine that sounds. If not think about it a little more.
Well lets get some info on this "source".

bgassassin said:
What part of "he guessed" are you not seeing? It would be different if the article said he was also told it had an Intel/nVidia combo or he opened up the kit himself. Early on we were hearing they were going with an IBM/AMD combo again and then we began hearing they switched to AMD to provide the CPU. Now one person guesses, which is clear in the article that's the case, that it's an Intel/nVidia combo which goes against everything that has been talked about up to this point and that's all of sudden the correct one? Come on now.
There's guessing like what you and steviep do and then there's guessing based on what he has in front of him. Very distinct difference. What we heard from whatever supposed unverified source is irrelevant. There is absolutely no proof that you can provide other than the vague "my sources said blah blah" schtick. He could very well have opened the machine up. Whatever his reasoning may be, it seems a notch more convincing than anything you have brought to the table.

Specialguy even said that the guy told him he was a developer.
 
This thread is so damn depressing. All signs point to a console that is weaker graphically than the state of the art consumer PC at the time of release. I love consoles and I love PC's, and I have both so I ain't even mad, but is this pretty much the consensus? (The console line isn't static because of the whole "devs tapping into the hidden power of the console etc. etc." phenomenon)
 

Clockwork

Member
Yeah, what he's missing is the well-known lack of performance portability with languages such as OpenCL and DirectCompute. While they do allow you to treat all manner of compute devices the same way on a superficial level, if you want to get good performance out of GPUs or accelerators you'll still need to perform hardware-specific optimization.

Do you really think that is something that needs to be worried about at this stage in development? I don't think anyone's code at this point would even be close to a "ready to ship" stage.

Are we all of the understanding that this isn't even a final dev kit?
 

hodgy100

Member
It's usually a bad sign when you have to lead a post like that.

Simply put, no one on GAF has any clue how large or power hungry the next Xbox is. Any claims they're making in this regard they're pulling out of their ass.

The first Xbox was a monster in size. Hell, many cable boxes in the US are absolutely massive. There's no basis to say the next Xbox won't be a fair bit larger than the 360 if that's what MS needs to do to get the performance their developers and partners are requesting.
It's also a bad sign when a returning point focuses on the first sentence rather than the content.

I don't believe anyone is claiming what they know will be in next generation consoles. But that doesn't mean we can't make educated guesses.

You keep harping on about the size of the original Xbox. The PS3 is larger than the original Xbox! And new high-end hardware is hotter and more power-hungry than the PS3!

And like I said in my last post (which you didn't seem to read), the next-gen consoles are going to need some serious thermal design to pull off the amount of heat they are going to have to shift. What the 360 has isn't gonna cut it. And I doubt a cooling system as extravagant as the PS3's will be enough.
 

StevieP

Banned
And for what reason am I supposed to believe GLOFO is producing the machine?

There are a few.

What part of the technology in the controller is innovative? The resistive touch, the TN panel or the gyro? I guess the only claim that could plausibly be made is low-latency wireless streaming, but that's more of a software thing and also not really new.

Without devolving this into yet another Wii U thread, Nintendo making an "innovative" controller doesn't mean using the newest parts. Accelerometers in the Wiimote weren't exactly new at all, because that tech is... well, it's ancient. Magnetometers (compass? lol), gyros, LCDs - none of these things are 'new' (capacitive touch tech is actually older than resistive, btw). What they do is take existing technology and package it together in a way that's generally not been done before to provide a new experience. With that said, NFC technology does err on the 'new' side of things - I say that as someone whose work has been experimenting with NFC for about a year now.

Well lets get some info on this "source".


There's guessing like what you and steviep do and then there's guessing based on what he has in front of him. Very distinct difference. What we heard from whatever supposed unverified source is irrelevant. There is absolutely no proof that you can provide other than the vague "my sources said blah blah" schtick.

Specialguy even said that the guy told him he was a developer.

You seriously want bg to out his source for your benefit? lol c'mon dude
If you don't believe me (lol my tag) or bg or whoever else, that's fine. You're welcome to have that opinion. But don't ask people to out who gave them information. Just take it with a grain of salt if you're of that opinion.

Go read some of this guy's Twitter and realize that despite the fact that there are legitimate facets to him doing 360 work, he is also a troll. That doesn't make him any more or less credible than any other person on the internet. He may have a Durango kit, hell, it may even contain an Intel CPU and Nvidia GPU (lol), but I also think you need to keep in mind that this is the internet. Digital Foundry even spoke of the 'cautionary' aspect of the whole thing, and there is a very good reason for that. I personally think they shouldn't have even posted the article, given what specialguy has found out about the guy through private conversations.

This thread is so damn depressing. All signs point to a console that is weaker graphically than the state of the art consumer PC at the time of release. I love consoles and I love PC's, and I have both so I ain't even mad, but is this pretty much the consensus? (The console line isn't static because of the whole "devs tapping into the hidden power of the console etc. etc." phenomenon)

Move the red line higher in the first photo and the blue line lower in the second photo.
 

big_erk

Member
What part of the technology in the controller is innovative? The resistive touch, the TN panel or the gyro? I guess the only claim that could plausibly be made is low-latency wireless streaming, but that's more of a software thing and also not really new.

It's not so much the individual pieces of tech involved as how they are being used as a whole. And yes, the low latency streaming is very cool.
 
Yeah, what he's missing is the well-known lack of performance portability with languages such as OpenCL and DirectCompute. While they do allow you to treat all manner of compute devices the same way on a superficial level, if you want to get good performance out of GPUs or accelerators you'll still need to perform hardware-specific optimization.

Thanks for saying it in more detail than I did.

Well lets get some info on this "source".

Nope. You're still missing how asinine that is.


There's guessing like what you and steviep do and then there's guessing based on what he has in front of him. Very distinct difference. What we heard from whatever supposed unverified source is irrelevant. There is absolutely no proof that you can provide other than the vague "my sources said blah blah" schtick.

Specialguy even said that the guy told him he was a developer.

LOL. Yeah what I'm told is just me guessing. But I agree it is a very distinct difference because I'm not guessing.

Heck look at the following article.


Feldstein’s group has also locked up a spot in the next generation XBox and PlayStation, which have yet to be unveiled; as well as Nintendo’s next-generation Wii U, which was first unveiled last year. The work with Sony is known inside AMD as ‘Project Thebes.’

Guess that's also irrelevant. To me it seems that you want to believe what you want. I won't bother debating against that kind of thinking as it would be just better to agree to disagree.
 

Durante

Member
I actually completely forgot about the NFC in the Upad. I hope they'll find a good use for it other than selling merchandise.

Do you really think that is something that needs to be worried about at this stage in development? I don't think anyone's code at this point would even be close to a "ready to ship" stage.
Well, yes I do. It's not like optimizing for a recent CPU, where you can maybe carve out 20% or 30% by low-level optimizations. On entirely different GPU architectures, we could be talking about a factor of 10 difference. In the worst case, it could even be that a specific algorithm works well on one architecture and is almost impossible to implement effectively on another.
 

mrklaw

MrArseFace
If someone has access to a dev kit it should be trivial to determine the CPU it's using. Just run CPUID.

Yeah, what he's missing is the well-known lack of performance portability with languages such as OpenCL and DirectCompute. While they do allow you to treat all manner of compute devices the same way on a superficial level, if you want to get good performance out of GPUs or accelerators you'll still need to perform hardware-specific optimization.

What part of the technology in the controller is innovative? The resistive touch, the TN panel or the gyro? I guess the only claim that could plausibly be made is low-latency wireless streaming, but that's more of a software thing and also not really new.

How much low level optimisation do you expect right now though? We're 18 months from release and these will be very early kits. I'd expect them to be more about illustrating likely power levels than being relevant for fine tuning anything.
 

Clockwork

Member
How much low level optimisation do you expect right now though? We're 18 months from release and these will be very early kits. I'd expect them to be more about illustrating likely power levels than being relevant for fine tuning anything.

That's what I've been saying but I guess I am wrong.
 

onQ123

Member
Heck look at the following article.

Guess that's also irrelevant. To me it seems that you want to believe what you want. I won't bother debating against that kind of thinking as it would be just better to agree to disagree.

That's the same name that sweetvar26 gave



"Alright guys, got some information for you guys.

The PS4 AMD project is called Thebe. Previously it was based on Themesto- and Callisto-based chips, but that has now been revised. They moved on to a chip called Jaguar, replacing Steamroller, and to a TSMC 28nm solution from the 32nm that Steamroller is on.

The whole thing is basically an APU solution; they made the changes considering the 10-year product life cycle and to keep the initial product costs at a minimum.

As of now it is called the Thebe Jaguar project, or TH-J."

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265
 

Zen

Banned
Move the red line higher in the first photo and the blue line lower in the second photo.

In terms of practical application, those two charts are pretty accurate. We could see console refreshes and (likely) streaming abate the discrepancy with time.
 

omonimo

Banned
That's the same name that sweetvar26 gave



"Alright guys, got some information for you guys.

The PS4 AMD project is called Thebe. Previously it was based on Themesto- and Callisto-based chips, but that has now been revised. They moved on to a chip called Jaguar, replacing Steamroller, and to a TSMC 28nm solution from the 32nm that Steamroller is on.

The whole thing is basically an APU solution; they made the changes considering the 10-year product life cycle and to keep the initial product costs at a minimum.

As of now it is called the Thebe Jaguar project, or TH-J."

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265
The same 10 years as the PS3? Great. We're talking about new hardware between 2013-14 and that 10-year life cycle bullsh!t keeps coming up... the life cycle of a console is defined by the consumer, not by the company's plan...
 

Durante

Member
How much low level optimisation do you expect right now though? We're 18 months from release and these will be very early kits. I'd expect them to be more about illustrating likely power levels than being relevant for fine tuning anything.
This would make sense for CPU code. But as I explained above, when you're writing (nontrivial) GPU computing code you have to be aware of the architecture. I guess over a year is enough to (re-)write a lot of it on more final hardware, and it would be possible to simply not do any heavy-duty GPU computing on launch titles, but it's a much bigger deal than switching CPU vendor or increasing RAM.

It would just seem pretty nonsensical to me to equip a devkit with an entirely different GPU than what you put in the final system. It's not like AMD and NV don't have a vaguely performance-equivalent product at every level.
 
Not sure why people are so adamant about this "dev kits have double the RAM" thing. The only reason dev kits have had double the RAM in the past is because the amount of RAM has been tiny to begin with. Debugging programs don't magically bloat in size every few years; you would need at most around 2GB now. Having an extra 4-6GB of RAM would be massive overkill.
 
Originally Posted by jbug617:
One of the big console guys from AMD just left for Nvidia recently.
http://www.forbes.com/sites/briancau...md-for-nvidia/

Feldstein’s group has also locked up a spot in the next generation XBox and PlayStation, which have yet to be unveiled; as well as Nintendo’s next-generation Wii U, which was first unveiled last year. The work with Sony is known inside AMD as ‘Project Thebes

That's the same name that sweetvar26 gave

I haven't been following the thread, but my roommate works with AMD/ATI. I know that he worked on something related to the Wii U last year, then the PS4, and now he has just moved on to something related to the next Xbox.

From what I've talked to him about or heard, though they are doing projects on the PS4 as well as the next Xbox, supposedly the Xbox project is higher priority compared to the PS4, and they are developing something unique for it.

As for the PS4, he says that the CPU is made by AMD as well. Something that was developed a couple of years ago, optimized, and is being used for the PS4.

Let me know if you guys have any questions and I can pass them on to him, see if he can answer stuff.

I am not sure if I can post this stuff here without him getting into trouble for it; if that's the case, I shall take down the post.

"Alright guys, got some information for you guys.

The PS4 AMD project is called Thebe. Previously it was based on Themesto- and Callisto-based chips, but that has now been revised. They moved on to a chip called Jaguar, replacing Steamroller, and to a TSMC 28nm solution from the 32nm that Steamroller is on.

The whole thing is basically an APU solution; they made the changes considering the 10-year product life cycle and to keep the initial product costs at a minimum.

As of now it is called the Thebe Jaguar project, or TH-J."

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265

Good catch bgassassin & onQ123, this confirms several points:

1) PS4 will have 8 Jaguar cores and AMD GPU
2) Xbox 720 is AMD X86 and AMD GPU

This thread is now proved mostly false: there is no Nvidia GPU, it's an AMD GPU, and the picture of Microsoft Visual Studio with the init for VMX proves it's x86.
 
Well lets get some info on this "source".


There's guessing like what you and steviep do and then there's guessing based on what he has in front of him. Very distinct difference. What we heard from whatever supposed unverified source is irrelevant. There is absolutely no proof that you can provide other than the vague "my sources said blah blah" schtick. He could very well have opened the machine up. Whatever his reasoning may be, it seems a notch more convincing than anything you have brought to the table.

Specialguy even said that the guy told him he was a developer.

I'm with you on everything you've said in this thread. I've always had my doubts about these bgassassin and StevieP sources.
 

StevieP

Banned
I'm with you on everything you've said in this thread. I've always had my doubts about these bgassassin and StevieP sources.

Big surprise.

It's very well possible that what I've been told has been a complete fabrication and multiple people are fucking with me (and bg, despite one of his sources being a verified developer). It's also quite possible "DAE" is fucking with you, too.

You don't, however, out people for shits and giggles. With the slight possibility that these people aren't fucking with you, there comes the possibility that they are traced and lose their jobs just because other people on the internet want clarification. Just keep that in mind when you make posts like these.
 
I'm surprised no one is plucking at the way the article is written.

I mean, sure, Durango's alpha dev kit is a regular PC:



That means the only piece in there indicative of a conservative performance target should be the GPU; just like the Power Mac G5s used as X360 dev kits a few years back:



It was a stock Power Mac dual 2.0GHz with 512 MB of system memory and an ATI X800 (R420) graphics card; 512 MB was the configuration because that stock model shipped with that amount. In short, that's a big bunch of nothing. That thing is just a PC.


But the article goes further into what-the-fuck-are-you-speculating land, here:

DaE also says that Microsoft is targeting an eight-core CPU for the final retail hardware - if true, this must surely be based around Atom architecture to fit inside the thermal envelope. The hardware configuration seems difficult to believe as it is so divorced from the technological make-up of the current Xbox 360, and we could find no corroborative sources to establish the Intel/NVIDIA hook-up, let alone the eight-core CPU.

Atom!? What are these guys smoking? It makes no sense; it seems like the guy was throwing mud at the wall but couldn't really interpret what he was saying. First off, there's no off-the-shelf octo-core Atom configuration, and most of all, it wouldn't make any sense to do; so chances are this ain't it. If it's true and octo-core, it's a damn Xeon workstation and the dude just went leftfield on it. Probably not indicative in any way of what the final hardware will be; more like a stock off-the-shelf workstation.


Atom makes absolutely no freaking sense for a console; we're talking about an architecture based on the Pentium 1, with reduced performance and limited to in-order execution. It's roughly half as efficient per clock as an ancient Pentium M, and it takes a beating from everything else on the market too.

The top-range Atom Cedarview gives out 5333 DMIPS @ 2.13 GHz; that's a little over double the performance of the original Celeron 733 in the original Xbox, and in fact less powerful in integer than the X360 CPU on a per-core basis: 5333 DMIPS versus 6400 DMIPS per core (and the X360 CPU itself couldn't even manage to triple the original Xbox's integer performance per core despite having more than 4 times the clock rate; that's not because Pentium III performance was mighty good at that point, and it certainly isn't now, meaning this CPU is as subpar as it gets; adding more of them doesn't make them a good choice).

Multiplying 5333 by 8 cores gives it almost thrice the complexity (only one core short of tripling the X360's configuration) while only managing to roughly double, and surpass by a very small margin, the X360's aggregate offering. This means that even current-gen ports can be a pain in the behind if they go with Atom, as it can't even match the per-core performance the X360/PS3 had; surpassing it with more complexity (needing 8 cores) reminds me of the Saturn or PS3, and it's just dumb.
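Running those numbers as a quick sketch (using the rough DMIPS figures quoted above; illustrative, not official):

```c
/* Quick sketch of the aggregate-throughput arithmetic, using the rough
   DMIPS figures quoted in this post (illustrative, not official numbers). */
#include <stdio.h>

int main(void) {
    const double atom_per_core  = 5333.0;  /* Atom Cedarview @ 2.13 GHz, per core */
    const double xenon_per_core = 6400.0;  /* Xbox 360 CPU, per core */
    const int    atom_cores  = 8;
    const int    xenon_cores = 3;

    double atom_total  = atom_per_core * atom_cores;    /* ~42,664 DMIPS */
    double xenon_total = xenon_per_core * xenon_cores;  /* ~19,200 DMIPS */

    printf("per-core ratio (Atom/Xenon): %.2f\n", atom_per_core / xenon_per_core);   /* ~0.83 */
    printf("core-count ratio:            %.2f\n", (double)atom_cores / xenon_cores); /* ~2.67 */
    printf("aggregate ratio:             %.2f\n", atom_total / xenon_total);         /* ~2.22 */
    return 0;
}
```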

And we're talking about 2.5 DMIPS/MHz in Atom Cedarview/Valleyview (the latest models); current ARM Cortex-A9s match that very same 2.5 DMIPS/MHz while wasting less energy per clock, so if they were changing architecture from PPC to something else, they'd be stupid to go with x86 if they were to use Atoms. Also, since ARMs are licensed to various manufacturers and produced in higher quantity, chances are it would also be cheaper. I'm not saying ARM is likely, though; I'm saying someone who suggests Atoms is out of his mind and threw a speculation article into nowhere.

And I'm not even touching the floating-point performance of that turd; that was the only thing current-gen CPUs had that could be considered good, and the Atom doesn't.


The rest... The GPU doesn't really matter, but being the odd hardware out this gen can't help them. If the other two are using ATI, then the know-how about what ATI GPUs usually do right, their lesser-known features, and their optimization quirks/penalties will be somewhat shared between them (and thus games will be optimized for ATI from the ground up), making the one with Nvidia the most radically different. Can't see it being too much of a factor, but it does matter somewhat.

As for the memory configuration, sure, it's a devkit (actually, it's a stock PC workstation serving as a preliminary development kit, but going along with the crappy article), but going over 4 GB on the real thing would be irresponsible and put them in a bad place when price drops become a reality amongst competitors; even 4 GB kinda does.

GDDR3/DDR3 chips exist in 512 MB configurations, so 8 GB would require 16 chips; 4 GB would require 8 chips. As for GDDR5, that only exists in 256 MB configurations, so we're talking 32 chips for 8 GB and 16 for 4 Gigabytes, even hybrid configurations would still mean a lot of chips when considering X360 only had 4 memory chips.
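A quick sketch of that chip-count arithmetic, using the per-chip densities quoted above:

```c
/* Quick sketch of the chip-count arithmetic, using the densities quoted in
   this post (512 MB per DDR3/GDDR3 chip, 256 MB per GDDR5 chip). */
#include <stdio.h>

static int chips_needed(int total_mb, int chip_mb) {
    return total_mb / chip_mb;
}

int main(void) {
    printf("8 GB DDR3/GDDR3: %d chips\n", chips_needed(8192, 512)); /* 16 */
    printf("4 GB DDR3/GDDR3: %d chips\n", chips_needed(4096, 512)); /*  8 */
    printf("8 GB GDDR5:      %d chips\n", chips_needed(8192, 256)); /* 32 */
    printf("4 GB GDDR5:      %d chips\n", chips_needed(4096, 256)); /* 16 */
    return 0;
}
```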

4 GB of memory this gen will be a homerun, suggesting 8-12 GB in the title of a spec leak of a future console is kind of a tabloid thing to do.


Very bad article, expected more from Digital Foundry.
 

Waaghals

Member
This thread is so damn depressing. All signs point to a console that is weaker graphically than the state of the art consumer PC at the time of release. I love consoles and I love PC's, and I have both so I ain't even mad, but is this pretty much the consensus? (The console line isn't static because of the whole "devs tapping into the hidden power of the console etc. etc." phenomenon)

Things have been going this way for some time.

Back in the early days of 3D it could take PCs a couple of years to fully catch up with consoles because most home computers didn't come with proper 3D accelerators.

3DFX had the Voodoo cards that could compete, but although these provided excellent gaming performance they weren't commonplace. Consoles, by comparison, had dedicated gaming hardware as standard.

By the time the X360 launched, both ATI and NVIDIA had been in the gaming GPU space for quite some time. GPUs dedicated to gaming use could be found even in laptops by then. It only took a few months before PC gamers could buy a GPU that was as modern as, and FASTER than, the one in the 360.


Since then GPU's have grown bigger, much faster and much hotter.
Here's a comparison:

Geforce 7900 (2006)

Geforce 560 midrange card (2011)

Modern GPUs consume more power by themselves than a whole traditional console. Even if cost was no object it wouldn't be feasible to put a high end card in a small closed box.
 
An Nvidia GPU is shocking to me. Regardless, there's no way Microsoft is going to kill BC for 360 titles, especially because of XBLA games. It would be a huge mistake and make people hesitant to purchase DD titles on the new system. There's no way they would be that foolish.

I'm in the beta for the new Xbox dashboard, and one thing I noticed is that indie games now show up under the 'demos' tab, as do XBLA titles and the demos for retail games. Everyone is aware of what Apple has done with iOS, and I predict the next Xbox is going to be much more 'open' to anyone that wants to develop for it. I suspect we may even see the different categories such as 'XBLA' and 'indie' disappear altogether.
 

StevieP

Banned
lostinblue - the final console will have 8gb of ram, however it won't be of the faster/higher bandwidth variety most expect in consoles (i.e. the gddrs of the world) but a more consumer-PC type of memory.


Edit: in regards to your "Atom" speculation, while it's true it's not a very beefy core at all, there may be some point of reference there in regards to why that was speculated.

Steamroller was in the original target specs for the PS4. There have been some recent rumours of a "Jaguar" switch, and Jaguar is essentially AMD's answer to Atom (and Bobcat's successor) and very much meant to compete with the likes of Atom/ARM cores/IBM Bluegene type stuff, etc.

The thing that sets Jaguar apart is that AMD is, as jeff_rigby has mentioned, adding stuff that would make it more relevant to gaming such as VSX.
 
Why does a devkit have twice the RAM of retail? I don't understand.

But anyway, 4GB might be enough for next-gen consoles; I mean, that is still quite a hell of a lot more than the 512MB the developers have been struggling with for so many years. I was afraid MS and Sony might only go for 2GB of RAM... so 4-6GB is a relief to me, if this turns out to be true.

You need twice the memory for storing debug symbols and running the debugger.

@lostinblue
The X360 CPU was in-order, and so was Cell. In fact, the design being an in-order CPU makes a lot of sense.

Also, there's power draw to worry about. Four Atom cores running at ~2 GHz use about 8 watts. On the other hand, a Core i7 at ~2 GHz uses about 55 W. If they're doing 8 Atom cores, it's for power draw and heat dissipation.
 

Hoo-doo

Banned
lostinblue - the final console will have 8gb of ram, however it won't be of the faster/higher bandwidth variety most expect in consoles (i.e. the gddrs of the world) but a more consumer-PC type of memory.

Since when did you get your own sources? I thought you were mostly just a tech-savvy guy parroting what the other inside sources were saying? Where does the 8GB come from?
 
Kinect was popular with non-gamers. That backs Chittagong's hunch up - why pack it in when year 1-2 is core gamers anyway? If you can't deliver core games that benefit from kinect, why bundle it and increase costs (or reduce available budget for the core machine)?

Plus there is every possibility that mainstream gamers will be disillusioned with kinect - do we have any kinect software sales figures that show whether it's continuing to succeed?


The only reason they might bundle is to emphasise the windows 8/media box element, which is probably enough of a reason to bundle it

I would assume they want the non-gamer group to be early adopters as well due to all of the possible multimedia additions. The gaming enthusiast would be pretty much content with the console just playing games, with an online component to support them. Though on the enthusiast side, I wouldn't be shocked to see MS try to shift from headsets to using Kinect's voice option. Headsets could still be used, but they wouldn't be pushed by MS, to cut costs on their end.

That's the same name that sweetvar26 gave



"Alright guys, got some information for you guys.

The PS4 AMD project is called Thebe. Previously it was based on Themesto- and Callisto-based chips, but that has now been revised. They moved on to a chip called Jaguar, replacing Steamroller, and to a TSMC 28nm solution from the 32nm that Steamroller is on.

The whole thing is basically an APU solution; they made the changes considering the 10-year product life cycle and to keep the initial product costs at a minimum.

As of now it is called the Thebe Jaguar project, or TH-J."

http://www.neogaf.com/forum/showpost.php?p=37364710&postcount=1265

Forgot about him saying that, though I did refer to him earlier about MS using AMD.

I'm with you on everything you've said in this thread. I've always had my doubts about these bgassassin and StevieP sources.

Yeah, despite the fact that they gave me accurate PS4 info, and the Xbox 3 spec info I mentioned in this thread before DaE is virtually the same. The only difference is who is providing the chips. He guessed Intel and nVidia. Everything else has pointed to AMD, including the Forbes article a few posts up, unless you want to ignore that too.
 

AndyH

Neo Member
I've posted in threads to do with Durango a few times, and I'd like to repeat what I've heard down the grapevine once more. I have been told that the target specs are an x86 CPU with 8 low speed cores, a high end AMD GPU, and 8GB of RAM. 2 of these cores, and 3GB of RAM are dedicated to the OS.

I don't know who StevieP and BGs sources are but they are saying similar things to mine. I was told these specs months before anyone even dreamed that they would be this high and was ridiculed when I posted them (by StevieP and BG), so when even they have changed their tune you know it is likely to be true.

Nobody will out their source because we don't want anyone to get fired over this shit.
 

Durante

Member
lostinblue:
I already mentioned that I'm really skeptical about the Atom part of the article. Just throwing it out there based on flimsy reasoning is bad enough, but doing so with a "must surely" is just hilarious.

But I can't agree on the memory part. If they're going with DDR3/4 I can easily see 8GB happening. And you're just flat-out wrong on 360: the launch system had 8 memory chips, not 4.

You need twice the memory for storing debug symbols and running the debugger.
Maybe your executable will be twice (or even 5 times) as large. But that doesn't mean you need twice as much memory, since the vast majority of that is used up by assets, which are the same size regardless of build.
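A quick sketch with made-up but plausible numbers shows why the debug overhead barely moves the total (the figures are purely illustrative, not from any real devkit):

```c
/* Quick sketch of the memory-budget argument: assets dominate, so even a much
   larger debug executable barely changes the footprint. Numbers are made up. */
#include <stdio.h>

int main(void) {
    const double assets_mb      = 3500.0;                /* textures, geometry, audio: same in every build */
    const double release_exe_mb = 50.0;                  /* hypothetical stripped release executable */
    const double debug_exe_mb   = 5.0 * release_exe_mb;  /* even a 5x-larger debug build */

    double release_total = assets_mb + release_exe_mb;
    double debug_total   = assets_mb + debug_exe_mb;

    printf("release footprint: %.0f MB\n", release_total);  /* 3550 MB */
    printf("debug footprint:   %.0f MB\n", debug_total);    /* 3750 MB */
    printf("debug overhead:    %.1f%%\n",
           100.0 * (debug_total - release_total) / release_total);  /* ~5.6% */
    return 0;
}
```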
 
If these rumours are to be true (some of them) then Microsoft will allocate more RAM to the OS alone than Sony will have in the machine altogether. Crikey.
 
Maybe your executable will be twice (or even 5 times) as large. But that doesn't mean you need twice as much memory, since the vast majority of that is used up by assets, which are the same size regardless of build.

Uh, the debugger must keep the full program in memory while debugging. The debugger needs to be able to quickly view assets.
 
If this is to be the last round of consoles as we know it as many believe, surely putting out a mammoth box now is a good idea? Sony had issues with RAM with the PS3. Surely they won't fall into another RAM trap.

It's not the last round of consoles. If anything, this gen taught us that making long generations does not work cause software sales can and will collapse. It also becomes increasingly difficult to launch new IPs since the console audience will already have stratified to a given set of games.
 

StevieP

Banned
If this is to be the last round of consoles as we know it as many believe, surely putting out a mammoth box now is a good idea? Sony had issues with RAM with the PS3. Surely they won't fall into another RAM trap.

The reason why "2GB of GDDR5 in a unified architecture" with the provision that "we may have 4GB if densities increase in time" was mentioned in target specs is because it's using a faster type of memory that only comes in lower densities, unlike Microsoft's console.

There have been rumours that Sony's internal development teams are pushing Sony to bite the bullet, density increase or not, but those are the rumours. The console is launching late 2013, so whatever is available to them reasonably will be where they go if I were to guess. lostinblue is correct in that adding a lot of complexity to your motherboards impacts future cost reduction greatly and after this generation Sony (and Microsoft, and Nintendo) are very sensitive to that. Die shrinks will be much less of a factor than it was in previous generations as well, so initial price matters.

In regards to it being the 'last' console generation, don't forget about streaming. I am also of the belief that a 5-year generation will be more likely than an 8-year generation this time around if streaming isn't yet viable.

Hoo-doo said:
Since when did you get your own sources? I thought you were mostly just a tech-savvy guy parroting what the other inside sources were saying? Where does the 8GB come from?

Please continue to laugh at my tag and have yourself a wonderful day.
 

chaosblade

Unconfirmed Member
As for the memory configuration, sure, it's a devkit (actually, it's a stock PC workstation serving as a preliminary development kit, but going along with the crappy article), but going over 4 GB on the real thing would be irresponsible and put them in a bad place when price drops become a reality amongst competitors; even 4 GB kinda does.

GDDR3/DDR3 chips exist in 512 MB configurations, so 8 GB would require 16 chips; 4 GB would require 8 chips. As for GDDR5, that only exists in 256 MB configurations, so we're talking 32 chips for 8 GB and 16 for 4 Gigabytes, even hybrid configurations would still mean a lot of chips when considering X360 only had 4 memory chips.

4 GB of memory this gen will be a homerun, suggesting 8-12 GB in the title of a spec leak of a future console is kind of a tabloid thing to do.

This whole post is full of awesome, but this part in particular is worth quoting and emphasizing to the people expecting oceans of RAM.


Going with a ton of slow RAM would be an interesting choice, but I'd imagine it might put the console in a bit of a predicament with multiplatform development.
 
If this is to be the last round of consoles as we know it as many believe, surely putting out a mammoth box now is a good idea? Sony had issues with RAM with the PS3. Surely they won't fall into another RAM trap.

What do you mean by, "last round of consoles?"

It's not the last round of consoles. If anything, this gen taught us that making long generations does not work cause software sales can and will collapse. It also becomes increasingly difficult to launch new IPs since the console audience will already have stratified to a given set of games.

I agree. This generation has gone on far too long, nearly 7 years (way, way past the usual 5-6 year generation mark); it's dried up and getting very stagnant. It's time for new consoles to arrive, especially when consumers and publishers/developers themselves want newer hardware.
 

RoboPlato

I'd be in the dick
If these rumours are to be true (some of them) then Microsoft will allocate more RAM to the OS alone than Sony will have in the machine altogether. Crikey.
If Sony winds up waiting until 2014 and uses GDDR5 as rumored then there won't be much of a difference at all. A lot of rumors hint at devs not being satisfied with 2GB, so hopefully Sony is listening. It's in their best interest to make a powerful, dev-friendly system this time around.

EDIT: Beaten by StevieP
 
It's not the last round of consoles. If anything, this gen taught us that making long generations does not work cause software sales can and will collapse. It also becomes increasingly difficult to launch new IPs since the console audience will already have stratified to a given set of games.

I'm expecting these consoles to turn into streaming boxes after a while. Then cheaper variants without disc drives to launch solely for streaming.
 
Stevie and BG been upsetting folks in here? Don't listen to them, the nextbox and PS4 WILL be more powerful than a current day $3000 gaming PC and even do things they cannot, and at a fraction of the cost besides. Yeah. Anyone with an ounce of common sense might call bullshit on what I just said, but there's a simple explanation; MS and Sony are secretly developing their new machines alongside Jesus as acting chief architect, and he can turn water in to wine so imagine what he can do with a solder iron.
 

Hoo-doo

Banned
Please continue to laugh at my tag and have yourself a wonderful day.

I have seen you post for months in the WUS-threads. Not once have I seen you lay down actual inside scoops or inside knowledge.

Oh wait, I have seen it once, it blew up in your face and earned you the tag in question.
What changed?
 
Stevie and BG been upsetting folks in here? Don't listen to them, the nextbox and PS4 WILL be more powerful than a current day $3000 gaming PC and even do things they cannot, and at a fraction of the cost besides. Yeah. Anyone with an ounce of common sense might call bullshit on what I just said, but there's a simple explanation; MS and Sony are secretly developing their new machines alongside Jesus as acting chief architect, and he can turn water in to wine so imagine what he can do with a solder iron.
You won't last long here junior posting like that.
 

Durante

Member
Uh, the debugger must keep the full program in memory while debugging. The debugger needs to be able to quickly view assets.
I have no idea what you are talking about. Are you trying to claim that assets are stored twice when debugging?

This whole post is full of awesome, but this part in particular is worth quoting and emphasizing to the people expecting oceans of RAM.
Yeah, let's quote and emphasize the part that's wrong.
 