
VGLeaks: First look at Durango XDK (always connected, Kinect required, must install)

I am pretty skeptical about this. The state of internet infrastructure in the US will make this lose ground to the PS4. Of course, I take this article with a grain of salt.
 
The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.
Wtf are you talking about?! A 7970? What? How? Since when?
 

ekim

Member
The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.

That rather implies that MS targeted the gaming performance (not the raw power) of a 7970. I don't think that games will be downgraded.
 
The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.

That seems unbelievable. Developers will be targeting a set specification, so even if some internal specs are changed, games would still have been developed with a set spec in mind.

What you're suggesting simply isn't believable.
 
Hmm, look what I found

http://arstechnica.com/gaming/2013/...4s-hardware-power-controller-features-at-gdc/

Speaking of memory, Norden hyped up the 8GB of GDDR5 RAM in the system as the type of memory that's currently usually found only on high-end graphics cards. Calling the RAM "expensive" and "exotic," Norden stressed that you "can't buy this [RAM] for 50 bucks... that's why high-end graphics cards cost as much as they do."

Anyway, the article may deserve a new thread? It's a GDC presentation writeup.
 
But (ignoring rumored reservations, which can be changed later anyway)...

0% more CPU
0% more setup engines
0% more RAM quantity...

And the bandwidth numbers might be misleading, since Durango has ESRAM to work around that. It's expensive and it's there for a reason. The PS3 also has 2x the main memory bandwidth of the 360 this gen; it doesn't matter.

Get out of here Stalker.

True, I suppose. But again: setup engines the same, triangle throughput the same, etc.

Go look at 7770 (Durango) vs 7850 (Orbis) benches. Same game, same settings might be, say, 23 fps vs 35, or something like that. Big, but somehow not ginormous.
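To put rough numbers on "big, but not ginormous", here's a back-of-envelope sketch in Python (the fps figures are the hypothetical ones above; the TFLOPS figures assume stock desktop clocks, not whatever the consoles actually ship with):

    # Hypothetical benchmark gap vs paper-spec gap, 7770 vs 7850.
    # The fps numbers are the made-up example above, not real benches.
    fps_7770, fps_7850 = 23.0, 35.0

    # TFLOPS = shader ALUs * 2 ops/clock (FMA) * clock in GHz / 1000
    tflops_7770 = 640 * 2 * 1.000 / 1000    # ~1.28 TFLOPS
    tflops_7850 = 1024 * 2 * 0.860 / 1000   # ~1.76 TFLOPS

    print(f"fps ratio:    {fps_7850 / fps_7770:.2f}x")        # ~1.52x
    print(f"TFLOPS ratio: {tflops_7850 / tflops_7770:.2f}x")  # ~1.38x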



Yes you can, more or less, since the PS3 can and does texture from XDR. The PS3 GPU gets fed with ~2x the main memory bandwidth the 360 GPU does.

But the 360 has 10MB of EDRAM... which turns out to be pretty important...

Like I said, it's not there for nothing, it's expensive. Acting like it doesn't exist is silly.

Really, just go away. You keep doing this shit. You say stuff, and it doesn't make any fucking sense. And the eDRAM in the 360 was important because it provided much higher bandwidth than the PS3 can provide (and no, the PS3 can't provide "double" that of the 360, it doesn't fucking work like that.)
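For reference, a quick tally of the commonly cited bandwidth figures behind this argument (approximate paper specs, so treat them as ballpark):

    # Commonly cited current-gen bandwidth figures (GB/s), approximate.
    ps3_xdr, ps3_gddr3 = 25.6, 22.4   # RSX can texture from both pools
    x360_gddr3 = 22.4                 # 360's unified main memory
    x360_edram_internal = 256.0       # effective on-die EDRAM bandwidth

    print("PS3 GPU-visible main bandwidth:", ps3_xdr + ps3_gddr3)  # 48.0
    print("360 main memory bandwidth:     ", x360_gddr3)           # 22.4
    # The "~2x" claim compares the two lines above; the 360's EDRAM
    # sits outside that comparison, which is the point of contention.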

It's much easier for one company to paint a prettier picture when the other one hasn't announced anything. If the roles were reversed, Microsoft would have done the same.

I agree, 100%. That's why I think MS is crazy for not saying anything.
 

onQ123

Member
The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.

So what's in the beta kits? And if devs don't have the beta kits with the final specs yet, where were the rumors coming from that the Xbox 3 & PS4 games would only be slightly different?
 

jaosobno

Member
That seems unbelievable. Developers will be targeting a set specification, so even if some internal specs are changed, games would still have been developed with a set spec in mind.

What you're suggesting simply isn't believable.

Well his info so far has been almost 100% correct. There is no reason to doubt him.

If what thuway says is true, we are looking at 1.2 (7770) vs 3.8 TFLOPS (7970). That's a 2.6 TFLOPS downgrade right there. Such a move will undoubtedly cause quite a noticeable graphical downgrade (unless developers knew that the final GPU would be much weaker and never pushed the Radeon 7970 all the way).
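For what it's worth, those TFLOPS figures follow straight from the shader counts and stock clocks; a quick sanity check:

    # TFLOPS = shader ALUs * 2 ops/clock (FMA) * clock in GHz / 1000
    tflops_7770 = 640 * 2 * 1.000 / 1000    # ~1.28
    tflops_7970 = 2048 * 2 * 0.925 / 1000   # ~3.79
    print(f"gap: {tflops_7970 - tflops_7770:.2f} TFLOPS")  # ~2.51
    # The "2.6 TFLOPS" above is the rounded 3.8 - 1.2 version of this gap.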

Microsoft could very well allow developers to show much better-looking games simply to start "Durango is better than PS4" post-E3 talk and generate (albeit fake) hype. This would of course backfire immediately after Durango's release, when alpha vs final build comparisons start. Regardless of the backfiring, initial (E3) impressions would likely stick in many gamers' heads, and that alone could be enough for many to consider Durango equal to or even better than the PS4.

Another explanation is that perhaps Microsoft once pursued a more powerful machine (initial rumors all pointed to Durango being more powerful than the PS4) and eventually gave up, deciding that price & media center capabilities > gaming power.
 

KageMaru

Member
What I mean is the power of these consoles will lead to lots of situations where games, because they can look even better on super high-end PC hardware, may more regularly be demoed on PC than ever before, because devs will feel they can achieve more or less a similar look, with changes more appropriate for each console, without people noticing. I really believe people are underestimating just how difficult it will be to nitpick a really talented developer's game visually this gen. A lot of power has been provided for them to do things for real that they had to fake before, and now they have an even greater ability to fake ever more impressive things. Art and game design are going to be hugely important this gen for further distinguishing your game from all the other amazing titles that will surely be out there.

Oops, sorry, double post.

Great posts, and agreed that it will be hard for many people to tell the differences between these versions. IIRC a dev here pointed out how people believed the GC was as powerful as the Xbox, even though the Xbox GPU was 3x faster than the GPU in the GC. If people couldn't see the difference in power with that performance gap, they may not see it with a gap that's much smaller.
 

flattie

Member
Well his info so far has been almost 100% correct. There is no reason to doubt him.

If what thuway says is true, we are looking at 1.2 (7770) vs 3.8 TFLOPS (7970). That's a 2.6 TFLOPS downgrade right there. Such a move will undoubtedly cause quite a noticeable graphical downgrade (unless developers knew that the final GPU would be much weaker and never pushed the Radeon 7970 all the way).

Microsoft could very well allow developers to show much better-looking games simply to start "Durango is better than PS4" post-E3 talk and generate (albeit fake) hype. This would of course backfire immediately after Durango's release, when alpha vs final build comparisons start. Regardless of the backfiring, initial (E3) impressions would likely stick in many gamers' heads, and that alone could be enough for many to consider Durango equal to or even better than the PS4.
Another explanation is that perhaps Microsoft once pursued a more powerful machine (initial rumors all pointed to Durango being more powerful than the PS4) and eventually gave up, deciding that price & media center capabilities > gaming power.

That strategy didn't work out too well for Sony and the PS3... Would MS really risk the ridicule from potential early adopters?
 

ekim

Member
Well his info so far has been almost 100% correct. There is no reason to doubt him.

If what thuway says is true, we are looking at 1.2 (7770) vs 3.8 TFLOPS (7970). That's a 2.6 TFLOPS downgrade right there. Such a move will undoubtedly cause quite a noticeable graphical downgrade (unless developers knew that the final GPU would be much weaker and never pushed the Radeon 7970 all the way).

Microsoft could very well allow developers to show much better-looking games simply to start "Durango is better than PS4" post-E3 talk and generate (albeit fake) hype. This would of course backfire immediately after Durango's release, when alpha vs final build comparisons start. Regardless of the backfiring, initial (E3) impressions would likely stick in many gamers' heads, and that alone could be enough for many to consider Durango equal to or even better than the PS4.

Another explanation is that perhaps Microsoft once pursued a more powerful machine (initial rumors all pointed to Durango being more powerful than the PS4) and eventually gave up, deciding that price & media center capabilities > gaming power.

The real problem is that you can't compare graphical output by the FLOP count of the GPU used. Just see it this way: a 1.2 TFLOPS GPU in a closed environment >>> 1.2 TFLOPS in a PC.
 

jaosobno

Member
That strategy didn't work out too well for Sony and the PS3... Would MS really risk the ridicule from potential early adopters?

Sure, there would be ridicule, but maybe Microsoft believes that it's worth the risk. Besides, their behavior regarding next gen is bizarre at best. They are dead silent about Durango, allowing Sony to spread their wings and generate hype without even attempting to interfere. Articles about how the "PS4 is great, easy to work with and powerful" are a normal everyday thing, and how does Microsoft react to that? With dead silence.

Considering the above, execs at Microsoft could very well think it's a smart move.

The real problem is that you can't compare graphical output by the FLOP count of the GPU used. Just see it this way: a 1.2 TFLOPS GPU in a closed environment >>> 1.2 TFLOPS in a PC.

I am very much aware of that. I was comparing the 7770 (rumored Durango GPU equivalent) against the 7970 in a devkit (also a closed environment). It's true that there is probably much more overhead in a devkit than in a final console (a devkit being a development environment), but I very much doubt that the overhead is so great that it would require 2.6 extra TFLOPS.
 

ekim

Member
Sure, there would be ridicule, but maybe Microsoft believes that it's worth the risk. Besides, their behavior regarding next gen is bizarre at best. They are dead silent about Durango, allowing Sony to spread their wings and generate hype without even attempting to interfere. Articles about how the "PS4 is great, easy to work with and powerful" are a normal everyday thing, and how does Microsoft react to that? With dead silence.

Considering the above, execs at Microsoft could very well think it's a smart move.



I am very much aware of that. I was comparing the 7770 (rumored Durango GPU equivalent) against the 7970 in a devkit (also a closed environment). It's true that there is probably much more overhead in a devkit than in a final console (a devkit being a development environment), but I very much doubt that the overhead is so great that it would require 2.6 extra TFLOPS.

Well, the non-beta devkits lack the move engines, the ESRAM pool, and several other Durango GPU features, which is compensated for by brute-forcing things on the 7970 (if true), thus minimizing the practical power gap between the two. So I don't think there will be a downgrade as suggested by thuway. Devs just have to make use of the distinct beta GPU features once they get their hands on the beta devkits.
 

McHuj

Member
When the alpha kits went out, MS had a pretty good idea of what the APU was going to be capable of. They wouldn't have designed a chip and then given out a devkit at the same time that wasn't going to be representative of (or close to) the final hardware.

The devkit could have a 7970, I don't think that's far fetched at all. We just don't know the reason why.

Outside of the obvious stuff (like the number of CUs, TMUs, and ROPs), are the 77xx, 78xx, and 79xx series at all different? I know double-precision flops on the 79xx are 1/4 the single-precision flops, and on the 78xx they are 1/16. I doubt that would matter. Are the cache sizes different?
The 7970 comes with a 384-bit bus; perhaps they need a board with a bigger bus and more bandwidth to emulate the eSRAM and main memory pools.
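As a rough check on that idea, a card's paper bandwidth follows from bus width and effective memory data rate (stock 7970 figures assumed):

    # bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
    bus_bits, gbps_per_pin = 384, 5.5   # stock 7970: 384-bit, 5.5 Gbps GDDR5
    print(bus_bits / 8 * gbps_per_pin)  # 264.0 GB/s, well above the ~170
    # GB/s combined figure rumored for Durango's DDR3 + eSRAM pools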

The alpha kit would not only be running the emulated Xbox hardware but also some level of debugger in the same system, so it's not inconceivable that you'd need much greater hardware to simulate that.
 

jaosobno

Member
Well, the non-beta devkits lack the move engines, the ESRAM pool, and several other Durango GPU features, which is compensated for by brute-forcing things on the 7970 (if true), thus minimizing the practical power gap between the two. So I don't think there will be a downgrade as suggested by thuway. Devs just have to make use of the distinct beta GPU features once they get their hands on the beta devkits.

Ah, I wasn't aware of this. If that's the case, then you are probably right, they could be brute-forcing these features on the 7970.

But I still do wonder: isn't 2.6 extra TFLOPS a bit excessive for emulating these features? We are talking about a surplus of power greater than the entire FLOPS count of the final Durango.
 

McHuj

Member
Ah, I wasn't aware of this. If that's the case, then you are probably right, they could be brute-forcing these features on the 7970.

But I still do wonder: isn't 2.6 extra TFLOPS a bit excessive for emulating these features? We are talking about a surplus of power greater than the entire FLOPS count of the final Durango.

I think the bigger issue with emulating something like eSRAM is available bandwidth and not necessarily the flops count.

In theory the Durango GPU can read from both system memory and eSRAM at a combined ~170 GB/sec. I don't know when the alpha kits went out, but I think at the time only the 79xx series had bandwidth greater than that. So I think only those boards could come close to emulating the memory of the Durango.
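That ~170 GB/sec is just the sum of the two rumored pools; a quick decomposition (rumored, unconfirmed numbers):

    # Rumored Durango memory pools (GB/s)
    ddr3_main = 68.0   # 256-bit DDR3-2133: 256/8 * 2.133 = ~68 GB/s
    esram = 102.0      # rumored eSRAM bandwidth
    print(ddr3_main + esram)  # 170.0 GB/s combined peak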

We also don't know if it was a stock 7970 or one with disabled units. It wouldn't surprise me if it had disabled CUs and ROPs, but full bandwidth capability.
 
Ah, I wasn't aware of this. If that's the case, then you are probably right, they could be brute-forcing these features on the 7970.

But I still do wonder: isn't 2.6 extra TFLOPS a bit excessive for emulating these features? We are talking about a surplus of power greater than the entire FLOPS count of the final Durango.

Keep in mind PCs have higher API overheads and things like PCIe bandwidth limitations.

Small things add up here and there. Final silicon won't be nearly as powerful, but since the chips are all directly on the PCB, it'll be much more efficient. No interfaces for it to deal with.
 
Keep in mind PCs have higher API overheads and things like PCIe bandwidth limitations.

Small things add up here and there. Final silicon won't be nearly as powerful, but since the chips are all directly on the PCB, it'll be much more efficient. No interfaces for it to deal with.

Do you also believe that there will be a drop in graphics once the devs are given the final kits, or will they be about the same?
 

ekim

Member
Ah, I wasn't aware of this. If that's the case, then you are probably right, they could be brute-forcing these features on the 7970.

But I still do wonder: isn't 2.6 extra TFLOPS a bit excessive for emulating these features? We are talking about a surplus of power greater than the entire FLOPS count of the final Durango.

Well I don't know the exact math behind it, but I guess MS knows what they are doing.
 
Do you also believe that there will be a drop in graphics once the devs are given the final kits, or will they be about the same?

Well, they aren't going to know the constraints of the system until they use it. Why not shoot higher and then scale back later? It's easier to get the big work done and then drop down.

IIRC early PS3 and 360 work did the same thing.
 

thuway

Member
BTW, I am predicting a downgrade based on what developers have at the moment. Most third parties don't have beta devkits yet and are using uber-powerful alpha kits with 7970s. You can make a damn gorgeous-looking demo using those kits. The next devkits will usher in the optimization stage, a task that will last until release day.

Unlike MS, Sony worked in reverse. They had hardware that was underpowered and slowly made the kits better as time went on. We are now at a point where the specs are official, and anyone with half a brain can approximate the results of what you can get.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
The current devkits have 7970s inside them. If games were designed using that part, then when moving to beta devkits (which, btw, most third parties don't have yet, and are quite pissed about) they will receive a downgrade of some form.

I find it hard to believe that beta kits which were released last year are not in the hands of anyone working on Durango games 3+ months later.
 

ekim

Member
BTW, I am predicting a downgrade based on what developers have at the moment. Most third parties don't have beta devkits yet and are using uber-powerful alpha kits with 7970s. You can make a damn gorgeous-looking demo using those kits. The next devkits will usher in the optimization stage, a task that will last until release day.

IMHO devs will get the optimization done rather fast, and games will run and look just like they did on the alpha kits.
 

KageMaru

Member
But the GPU in the alpha kits is significantly more powerful.

None of this makes a difference if MS has properly communicated what to expect from the final system.

The hardware is there for the devs to have something work on and become familiar with the tools. The final config of the system shouldn't be a surprise to any dev that's been receiving dev kits this whole time.
 

JJD

Member
Why would MS still have not released the beta dev kits to third parties? Did they have any serious problems at the foundries when making the chips?

As far as we know MS opted for the Durango project quite some time ago, and the specs didn't change that much. This is strange.

They should be way ahead of this.

Are the PS4 beta dev kits out in the open already?
 
Why would MS still have not released the beta dev kits to third parties?

As far as we know MS opted for the Durango project quite some time ago, and the specs didn't change that much. This is strange.

They should be way ahead of this.

Are the PS4 beta dev kits out in the open already?

Well, some devs have said that they don't have an MS kit at all. I'm sure some of the larger devs may have them by now. It isn't completely unbelievable that the dev kits may not have reached everyone yet.
 

SpaceHobo

Banned
So the whole "MS is going to show high end PC footage then hope no one notices the downgrade at launch" thing is being forwarded again ?

A lot of the stuff written is starting to reek. Why the desperation for MS to fail? It's also a little embarrassing that some are desperate to believe any so-called insider as long as the news is negative.
 

jaosobno

Member
Why would MS still have not released the beta dev kits to third parties? Did they have any serious problems at the foundries when making the chips?

I doubt it's a chip production issue. You don't need nearly as many chips for devkits as you need for mass console production. Even if they are facing low yields, they should have been able to produce beta kits by now.

Must be something else...

So the whole "MS is going to show high end PC footage then hope no one notices the downgrade at launch" thing is being forwarded again ?

A lot of the stuff written is starting to reek. Why the desperation for MS to fail? It's embarrassing, the desperation to believe any so-called insider as long as the news is negative.

There is no desperation for them to fail; we are simply trying to analyze the situation. And you can hardly call thuway a "so-called insider" (I presume you are referring to him) since he, along with Bruce Lee Roy, provided the most accurate info regarding next gen.
 
But the GPU in the alpha kits is significantly more powerful.

If you ask me, either the 7970 rumor is bullshit or the 1.2 TFLOPS beta GPU is bullshit.
That's a 2.6 TFLOPS difference; if you ask me, that is fucking overkill.

That would mean the beta GPU must operate at 100% efficiency and the alpha at 32% efficiency. Which is bullshit. You could probably simulate everything you needed with a 7870.
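That 32% is just the ratio of the two rumored FLOPS counts, i.e. the utilization the alpha kit would have to be running at if games ported over with zero visual change:

    # Implied utilization if identical output came from both GPUs
    beta_tflops, alpha_tflops = 1.2, 3.8
    print(f"{beta_tflops / alpha_tflops:.0%}")  # 32% -- about a third of
    # the alpha kit's paper throughput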

Something doesn't add up.

Or they changed direction and vision when they went from the alpha to the beta design.
At the alpha design stage, the Xbox division was still run with gamers in mind, maybe pre-Kinect craze?
And when the beta was designed, post-Kinect craze, leadership was flushed and the MBAs took over.
They wanted more of a service box instead of a gamer box.

Or the documentation is missing information Microsoft only disclosed in person to the tech leads, as a way to combat leaks from documents.

Or maybe I just want a stronger box than the rumors are pointing at.
/Tin foil hat ramblings of a disappointed gamer.
 

tinfoilhatman

all of my posts are my avatar
So the whole "MS is going to show high end PC footage then hope no one notices the downgrade at launch" thing is being forwarded again ?

A lot of the stuff written is starting to reek , why the desperation for MS to fail , it's embarrassing the desperation to believe any so called insider as long as the news is negative.

Welcome to NeoGAF... if anything, MS has been far, far more honest about what they've shown than Sony over the years.
 
None of this makes a difference if MS has properly communicated what to expect from the final system.

The hardware is there for the devs to have something work on and become familiar with the tools. The final config of the system shouldn't be a surprise to any dev that's been receiving dev kits this whole time.

They'll have a general idea, but they still don't know the full constraints until they use it. What's hard to get? I'm not saying it'll go from PC super duper settings to shit, I'm saying it won't look the same.
 
Lol, talk about moving goal posts. You don't have that info.

Of course I have that info. And so do you if you're on the Internet. You realize that each corporation publicly states their earnings each and every year, right? And LOL at "moving goalposts". Success in business isn't measured by volume. It's measured by profit.
 
A slight part of me wants the system to be excellent just to see the meltdowns; that goes for the ones posting negative garbage about the PS4 too.

I get the "trolling for kicks" but for me it becomes boring after the 48th time of "Kinect 2 Box RROD" or people hoping PS4 is $599 again just so it fails.
 

KageMaru

Member
They'll have a general idea, but they still don't know the full constraints until they use it. What's hard to get? I'm not saying it'll go from PC super duper settings to shit, I'm saying it won't look the same.

They have more than a general idea. They should know the target number of cores, clock rates, custom features, the purpose of these features, the amount of memory, the memory type, theoretical bandwidth, etc.

Below is an example of documentation that MS sent out to devs in June of 2004 for the 360, about a year and a half before the system launched. If you read it, you'll see how close the target specs are to the final hardware, regardless of how early this document was sent out...

Basic Hardware Specifications
Xenon is powered by a 3.5+ GHz IBM PowerPC processor and a 500+ MHz ATI graphics processor. Xenon has 256+ MB of unified memory. Xenon runs a custom operating system based on Microsoft® Windows NT®, similar to the Xbox operating system. The graphics interface is a superset of Microsoft® Direct3D® version 9.0.

CPU
The Xenon CPU is a custom processor based on PowerPC technology. The CPU includes three independent processors (cores) on a single die. Each core runs at 3.5+ GHz. The Xenon CPU can issue two instructions per clock cycle per core. At peak performance, Xenon can issue 21 billion instructions per second.
The Xenon CPU was designed by IBM in close consultation with the Xbox team, leading to a number of revolutionary additions, including a dot product instruction for extremely fast vector math and custom security features built directly into the silicon to prevent piracy and hacking.
Each core has two symmetric hardware threads (SMT), for a total of six hardware threads available to games. Not only does the Xenon CPU include the standard set of PowerPC integer and floating-point registers (one set per hardware thread), the Xenon CPU also includes 128 vector (VMX) registers per hardware thread. This astounding number of registers can drastically improve the speed of common mathematical operations.
Each of the three cores includes a 32-KB L1 instruction cache and a 32-KB L1 data cache. The three cores share a 1-MB L2 cache. The L2 cache can be locked down in segments to improve performance. The L2 cache also has the very unusual feature of being directly readable from the GPU, which allows the GPU to consume geometry and texture data from L2 and main memory simultaneously.
Xenon CPU instructions are exposed to games through compiler intrinsics, allowing developers to access the power of the chip using C language notation.
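(A quick check of the document's own arithmetic, using the stated numbers:)

    # 3 cores * 2 instructions/clock * 3.5 GHz = 21 billion instructions/sec
    cores, issue_width, clock_ghz = 3, 2, 3.5
    print(cores * issue_width * clock_ghz, "billion instructions/sec")  # 21.0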

GPU
The Xenon GPU is a custom 500+ MHz graphics processor from ATI. The shader core has 48 Arithmetic Logic Units (ALUs) that can execute 64 simultaneous threads on groups of 64 vertices or pixels. ALUs are automatically and dynamically assigned to either pixel or vertex processing depending on load. The ALUs can each perform one vector and one scalar operation per clock cycle, for a total of 96 shader operations per clock cycle. Texture loads can be done in parallel to ALU operations. At peak performance, the GPU can issue 48 billion shader operations per second.
The GPU has a peak pixel fill rate of 4+ gigapixels/sec (16 gigasamples/sec with 4× antialiasing). The peak vertex rate is 500+ million vertices/sec. The peak triangle rate is 500+ million triangles/sec. The interesting point about all of these values is that they're not just theoretical: they are attainable with nontrivial shaders.
Xenon is designed for high-definition output. Included directly on the GPU die is 10+ MB of fast embedded dynamic RAM (EDRAM). A 720p frame buffer fits very nicely here. Larger frame buffers are also possible because of hardware-accelerated partitioning and predicated rendering that has little cost other than additional vertex processing. Along with the extremely fast EDRAM, the GPU also includes hardware instructions for alpha blending, z-test, and antialiasing.
The Xenon graphics architecture is a unique design that implements a superset of Direct3D version 9.0. It includes a number of important extensions, including additional compressed texture formats and a flexible tessellation engine. Xenon not only supports high-level shading language (HLSL) model 3.0 for vertex and pixel shaders but also includes advanced shader features well beyond model 3.0. For instance, shaders use 32-bit IEEE floating-point math throughout. Vertex shaders can fetch from textures, and pixel shaders can fetch from vertex streams. Xenon shaders also have the unique ability to directly access main memory, allowing techniques that have never before been possible.
As with Xbox, Xenon will support precompiled push buffers (command buffers in Xenon terminology), but to a much greater extent than the Xbox console does. The Xbox team is exposing and documenting the command buffer format so that games are able to harness the GPU much more effectively.
In addition to an extremely powerful GPU, Xenon also includes a very high-quality resize filter. This filter allows consumers to choose whatever output mode they desire. Xenon automatically scales the game's output buffer to the consumer-chosen resolution.
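(The peak shader-op figure likewise follows from the stated ALU count and clock:)

    # 48 ALUs * (1 vector + 1 scalar) ops/clock * 0.5 GHz
    alus, ops_per_clock, clock_ghz = 48, 2, 0.5
    print(alus * ops_per_clock * clock_ghz, "billion shader ops/sec")  # 48.0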

Memory and Bandwidth
Xenon has 256+ MB of unified memory, equally accessible to both the GPU and CPU. The main memory controller resides on the GPU (the same as in the Xbox architecture). It has 22.4+ GB/sec aggregate bandwidth to RAM, distributed between reads and writes. Aggregate means that the bandwidth may be used for all reading or all writing or any combination of the two. Translated into game performance, the GPU can consume a 512×512×32-bpp texture in only 47 microseconds.
The front side bus (FSB) bandwidth peak is 10.8 GB/sec for reads and 10.8 GB/sec for writes, over 20 times faster than for Xbox. Note that the 22.4+ GB/sec main memory bandwidth is shared between the CPU and GPU. If, for example, the CPU is using 2 GB/sec for reading and 1 GB/sec for writing on the FSB, the GPU has 19.4+ GB/sec available for accessing RAM.
Eight pixels (where each pixel is color plus z = 8 bytes) can be sent to the EDRAM every GPU clock cycle, for an EDRAM write bandwidth of 32 GB/sec. Each of these pixels can be expanded through multisampling to 4 samples, for up to 32 multisampled pixel samples per clock cycle. With alpha blending, z-test, and z-write enabled, this is equivalent to having 256 GB/sec of effective bandwidth! The important thing is that frame buffer bandwidth will never slow down the Xenon GPU.
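(Both the 47-microsecond texture figure and the 32 GB/sec EDRAM write figure check out against the stated numbers:)

    # 512x512x32-bpp texture over the 22.4 GB/s aggregate bus
    texture_bytes = 512 * 512 * 4                       # 1 MiB
    print(f"{texture_bytes / 22.4e9 * 1e6:.1f} us")     # ~46.8 microseconds

    # EDRAM write: 8 pixels/clock * 8 bytes/pixel * 500 MHz
    print(f"{8 * 8 * 500e6 / 1e9:.1f} GB/s")            # 32.0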

Audio
The Xenon CPU is a superb processor for audio, particularly with its massive mathematical horsepower and vector register set. The Xenon CPU can process and encode hundreds of audio channels with sophisticated per-voice and global effects, all while using a fraction of the power of a single CPU core.
The Xenon system south bridge also contains a key hardware component for audio: XMA decompression. XMA is the native Xenon compressed audio format, based on the WMA Pro architecture. XMA provides sound quality higher than ADPCM at even better compression ratios, typically 6:1 to 12:1. The south bridge contains a full silicon implementation of the XMA decompression algorithm, including support for multichannel XMA sources. XMA is processed by the south bridge into standard PCM format in RAM. All other sound processing (sample rate conversion, filtering, effects, mixing, and multispeaker encoding) happens on the Xenon CPU.
The lowest-level Xenon audio software layer is XAudio, a new API designed for optimal digital signal processing. The Xbox Audio Creation Tool (XACT) API from Xbox is also supported, along with new features such as conditional events, improved parameter control, and a more flexible 3D audio model.

Input/Output
As with Xbox, Xenon is designed to be a multiplayer console. It has built-in networking support including an Ethernet 10/100-BaseT port. It supports up to four controllers. From an audio/video standpoint, Xenon will support all the same formats as Xbox, including multiple high-definition formats up through 1080i, plus VGA output.
In order to provide greater flexibility and support a wider variety of attached devices, the Xenon console includes standard USB 2.0 ports. This feature allows the console to potentially host storage devices, cameras, microphones, and other devices.

Storage
The Xenon console is designed around a larger world view of storage than Xbox was. Games will have access to a variety of storage devices, including connected devices (memory units, USB storage) and remote devices (networked PCs, Xbox Live). At the time of this writing, the decision to include a built-in hard disk in every Xenon console has not been made. If a hard disk is not included in every console, it will certainly be available as an integrated add-on component.
Xenon supports up to two attached memory units (MUs). MUs are connected directly to the console, not to controllers as on Xbox. The initial size of the MUs is 64 MB, although larger MUs may be available in the future. MU throughput is expected to be around 8 MB/sec for reads and 1 MB/sec for writes.
The Xenon game disc drive is a 12× DVD, with an expected outer edge throughput of 16+ MB/sec. Latency is expected to be in the neighborhood of 100 ms. The media format will be similar to Xbox, with approximately 6 GB of usable space on the disk. As on Xbox, media will be stored on a single side in two 3 GB layers.

Industrial Design
The Xenon industrial design process is well under way, but the final look of the box has not been determined. The Xenon console will be smaller than the Xbox console.
The standard Xenon controller will have a look and feel similar to the Xbox controller. The primary changes are the removal of the Black and White buttons and the addition of shoulder buttons. The triggers, thumbsticks, D-pad, and primary buttons are essentially unchanged. The controller will support vibration.

They aren't going to change for Durango and suddenly leave developers in the dark. I agree it won't look the same, but that's a given and another thing that devs would also already know.
 

TRios Zen

Member
If you ask me, either the 7970 rumor is bullshit or the 1.2 TFLOPS beta GPU is bullshit.
That's a 2.6 TFLOPS difference; if you ask me, that is fucking overkill.

That would mean the beta GPU must operate at 100% efficiency and the alpha at 32% efficiency. Which is bullshit. You could probably simulate everything you needed with a 7870.

Something doesn't add up.

Or they changed direction and vision when they went from the alpha to the beta design.
At the alpha design stage, the Xbox division was still run with gamers in mind, maybe pre-Kinect craze?
And when the beta was designed, post-Kinect craze, leadership was flushed and the MBAs took over.
They wanted more of a service box instead of a gamer box.

Or the documentation is missing information Microsoft only disclosed in person to the tech leads, as a way to combat leaks from documents.

Or maybe I just want a stronger box than the rumors are pointing at.
/Tin foil hat ramblings of a disappointed gamer.

Honestly, I'm not that big of a tech head, so admittedly a lot of this conversation goes over my head.

But you are right - something isn't adding up here.

Specifically, I've heard devs/websites mention that the two systems are close in power; however, the leaked specs don't show any real "closeness" in power, they show a significant advantage for the PS4. You hear estimations that MS is going in a different direction with this system, yet they use a significantly overpowered card (7970) in the dev kits (compared, again, to our leaked specs). Why would they do that if they don't expect their performance to be anywhere near that?

I'm not expecting a sudden reveal where the 720 is SUPER-ULTRA powered... it just seems like we are missing a piece of the puzzle, given the information we currently have.

Unfortunately, until the official reveal we won't know for sure, and MS seems perfectly content in their silence right now.
 