
MS Dances Around 1080p Support: IGN Interview

Bad_Boy

time to take my meds
Y2Kevbug11 said:
Well that's my point. VGA is not HDMI. So how is it possible?

Must be a loophole then? Or is the AACS (is this even applicable to HD-DVD?) clear about just not allowing component?
Some sort of loophole, I'm guessing, for VGA, as Onix said.
 

aaaaa0

Member
Y2Kevbug11 said:
I don't understand how you can get 1080p HD-DVD movies out of the 360. I thought it was against AACS regulations for 1080p signal to be sent through anything other than HDMI?

There is a loophole for analog VGA output in AACS. You can use any resolution that a computer was widely capable of displaying as of June 1st, 2004, as long as the movie in question does not have either the Digital Only Token or the Image Constraint Token asserted. Neither token is scheduled to be enabled by movie studios until 2011.

In 2004, many widely available video cards and monitors were capable of resolutions much higher than 1920x1080, so VGA is allowed to do 1920x1080 progressive in analog.

See the AACS Interim Adopter License Agreement, page 92, table "AACS Authorized Analog Outputs".
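
(To illustrate the rule aaaaa0 is describing, here is a rough Python sketch of how a player might gate analog output on those two tokens. The function and constant names are made up for illustration; the 540-line cap is what the ICT imposes on analog outputs, but none of this is lifted from the AACS agreement text.)

# Rough sketch of the analog-output rule described above; names are illustrative.
ICT_MAX_LINES = 540   # Image Constraint Token caps analog output at roughly 960x540

def allowed_analog_lines(digital_only_token, image_constraint_token, requested_lines=1080):
    """How many vertical lines a VGA/component output may carry for a given disc."""
    if digital_only_token:
        return 0                                    # Digital Only Token: no analog output at all
    if image_constraint_token:
        return min(requested_lines, ICT_MAX_LINES)  # ICT: constrained analog picture
    return requested_lines                          # neither token asserted: full 1080p over VGA

# Until studios actually enable the tokens, this is the case that applies:
print(allowed_analog_lines(False, False))   # -> 1080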
 
Our platform is flexible enough to allow support of a digital connection in the future should we choose to do so.
What exactly do they mean by "platform"? Does that mean that RSX is capable, but that it would require a new hardware revision? Or that the existing connector is capable of being reconfigured for digital output via software?
 

bill0527

Member
inpHilltr8r said:
What exactly do they mean by "platform"? Does that mean that RSX is capable, but that it would require a new hardware revision? Or that the existing connector is capable of being reconfigured for digital output via software?

This is what I've been asking in the other thread.

IGN should have followed up with this question, but like the good journalists they are... they simply didn't. They should have pressed and gotten an answer as to whether that flexibility applies to current 360 owners, or whether 'flexibility' means a new hardware revision.

It's doubtful they would give a straight answer, as it would be a negative signal if they were to say that it's absolutely not possible on the current version of the 360 and would require a hardware revision. But I would have felt better if the question had at least been asked.
 

Raistlin

Post Count: 9999
aaaaa0 said:
There is a loophole for analog VGA output in AACS. You can use any resolution that a computer was widely capable of displaying as of June 1st, 2004, as long as the movie in question does not have either the Digital Only Token or the Image Constraint Token asserted. Neither token is scheduled to be enabled by movie studios until 2011.

In 2004, many widely available video cards and monitors were capable of resolutions much higher than 1920x1080, so VGA is allowed to do 1920x1080 progressive in analog.

See the AACS Interim Adopter License Agreement, page 92, table "AACS Authorized Analog Outputs".

Yep ... that's what I looked up previously.

It's listed as Table A1 in this document, and in one of the others ... Appendix A. The tables go over resolutions for all outputs, details on ICT and DOT, and refer to other sections for details on the Sunset requirements.

Which BTW ... I believe the dates for the Sunset requirements may be newly defined - I'm not sure the dropdead dates were in there all that long ago. One should note ... they are dropdead dates ... they (unfortunately) may be enacted prior to that date.
 

Raistlin

Post Count: 9999
inpHilltr8r said:
What exactly do they mean by "platform"? Does that mean that RSX is capable, but that it would require a new hardware revision? Or that the existing connector is capable of being reconfigured for digital output via software?

It's PR.

They need a new motherboard ... potentially a bus redesign (is it protected currently?) ... and a new scaler.
 

aaaaa0

Member
Onix said:
One should note ... they are dropdead dates ... they (unfortunately) may be enacted prior to that date.

Indeed. However, there is currently a gentlemen's agreement that neither token will be enabled until the dropdead date, for obvious reasons (like lack of penetration of HDMI capable sets and HDCP interop problems).
 
Mojovonio said:
Man, when I decide to go buy a new TV, I'm going to be sooooo ****ing confused.


I was you about two weeks ago. Lots of info out there so don't worry about it. When the time comes, you'll know what type of TV best fits what you're looking for.
 

aaaaa0

Member
Onix said:
potentially a bus redesign (is it protected currently?)

From what I've heard, all the buses on the 360 are already encrypted. It's one of the things making it so damn hard to create a modchip to run arbitrary code.
 

Raistlin

Post Count: 9999
aaaaa0 said:
Indeed. However, there is currently a gentlemen's agreement that neither token will be enabled until the dropdead date, for obvious reasons (like lack of penetration of HDMI capable sets and HDCP interop problems).

I agree ... I'm just saying it is a possibility.



For example, let's say the HD movie war ends quickly ... coupled with higher-than-expected adoption of HDMI-equipped HDTVs. At that point, studios may not have much to fear.



I don't expect it to happen though (even under those circumstances). The PR would likely be pretty bad. At worst ... a few studios may test the waters on specific titles.

I can see Disney doing it :/
 

Raistlin

Post Count: 9999
aaaaa0 said:
From what I've heard, all the buses on the 360 are already encrypted. It's one of the things making it so damn hard to create a modchip to run arbitrary code.

Well ... that's one thing they've got going for it :D

In that case though, they simply need to modify the motherboard to add a port ... modify the case to fit it ... and get a new scaler (if what I understand about their WebTV-based chip is correct).
 

Klocker

Member
OP article has been updated BTW:

IGN : Does the Xbox 360 have the internal bandwidth between CPUs and graphics processors necessary to move a full 1080p image? There's a big difference between 1080i and the 3GB/s of 1080p.

Microsoft: *updated* Yes, the Xbox 360 has the necessary internal bandwidth between CPUs and graphics processors to move a full 1080p image.


...***Updated 9/28, 2:15 PM PST***

IGN: Can the X360 send out a digital signal now, or ever?

Microsoft: Xbox 360 currently doesn't include a digital out connection for video. Our platform is flexible enough to allow support of a digital connection in the future should we choose to do so. When the Xbox 360 was being developed HDMI was nascent and with our current connections we support what the overwhelming majority of consumers have available to them. It's important to note that the market penetration of 1080p displays is in the single digits. Regardless, for those early adopters who have displays and projectors that support 1080p over VGA and component we have a solution and it is a free upgrade for them. We are watching the market closely and will continue to evaluate our solution in the face of consumer demand, but have no announcements regarding additional cables or connections.
 
[Image: X360BlockDiaAnnotated1.gif - annotated Xbox 360 block diagram]

This is the chip that prevents HDMI implementation of any kind currently in the X360. AFAIK, there is no way that any add-on device can circumvent this roadblock to HDMI output for the X360.

Perhaps a future mobo rev of the X360 will have a revised chip that can output in digital form, but until then I don't see how HDMI output can be done without breaking DRM.


Onix said:
In that case though, they simply need to modify the motherboard to add a port ... modify the case to fit it ... and get a new scaler (if what I understand about their WebTV-based chip is correct).

The chip was designed by the ex-WebTV team, not based on the WebTV per se.
 

Raistlin

Post Count: 9999
Klocker said:
OP article has been updated BTW:


Microsoft: *updated* Yes, the Xbox 360 has the necessary internal bandwidth between CPUs and graphics processors to move a full 1080p image.


Regarding games though ... I do wonder if the 360 design may not really be as conducive to 1080p games as the PS3?



1) EDRAM - Considering they would have to do multiple tiles to render the image, one has to wonder how much of a processing strain that would be. 1080p is already processor intensive to begin with … multiple tiling added to that may not make for a speedy engine.

2) Cell – Since Cell is better suited to ‘lending a hand’ to the GPU (for example, Warhawk has Cell render the clouds and water among other things), I think it can be argued that PS3 has a little bit more headroom since they can offload some of the graphics work from RSX.



This is not to say there will be no 360 1080p games ... just that it may be a bit more difficult.
 

arne

Member
Onix said:
Regarding games though ... I do wonder if the 360 design may not really be as conducive to 1080p games as the PS3?

This is not to say there will be no 360 1080p games ... just that it may be a bit more difficult.

I'm really wondering here because I am not well versed in many technicalities.

How difficult is the jump from 1080i to 1080p for graphics processing?
I'm just thinking of one example, DOA4, which was rendered natively at 1080i; what would it take for a game like that to make the jump?
 
Onix said:
Regarding games though ... I do wonder if the 360 design may not really be as conducive to 1080p games as the PS3?

1) EDRAM - Considering they would have to do multiple tiles to render the image, one has to wonder how much of a processing strain that would be. 1080p is already processor intensive to begin with … multiple tiling added to that may not make for a speedy engine.

2) Cell – Since Cell is better suited to ‘lending a hand’ to the GPU (for example, Warhawk has Cell render the clouds and water among other things), I think it can be argued that PS3 has a little bit more headroom since they can offload some of the graphics work from RSX.


This is not to say there will be no 360 1080p games ... just that it may be a bit more difficult.

EDRAM is an additional roadblock for 1080p gaming, but all it means is that the tile count has to be double that of the 720p version of the engine (since 1080p has slightly over 2x the space requirement in the EDRAM as 720p), if they keep everything else the same. So if an engine doing 720p with HDR and no AA fits under one tile, then a 1080p version of that same engine will require 2 tiles.

Now AFAIK, the math works out such that a 720p game with HDR and 2x MSAA takes up around 2 tiles. If you make that 1080p, they can give up the 2x MSAA and still be able to fit into 2 tiles, I think. As long as the analog video out chip's downscaling from 1080p to 720p is similar in effect to 2x MSAA, this should be a fine solution for such situations.

Also, my biggest concern for 1080p60 games from X360 was the possible roadblock from the 500MB/s bus from Xenos to the scaler, but my rudimentary math sez 1920x1080x32x60 fits just under the 500MB limit @ 497,664,000 bytes.

Of course, my info above is second hand and from old memory so could be a bit off (and is just talking about the EDRAM factor, and not the fillrate and the shader ops limit).
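
(A back-of-the-envelope version of that tile math, assuming the commonly cited 10MB of EDRAM and 4 bytes of color plus 4 bytes of Z per sample; those byte counts are my assumptions rather than Shogmaster's exact figures.)

# Rough EDRAM tile estimate for Xenos, assuming 10MB of EDRAM and
# 8 bytes per sample (4 color + 4 depth); the numbers are illustrative.
EDRAM_BYTES = 10 * 1024 * 1024

def tiles_needed(width, height, msaa=1, bytes_per_sample=8):
    framebuffer = width * height * msaa * bytes_per_sample
    return -(-framebuffer // EDRAM_BYTES)   # ceiling division

print(tiles_needed(1280, 720))           # 720p, no AA   -> 1 tile
print(tiles_needed(1280, 720, msaa=2))   # 720p, 2x MSAA -> 2 tiles
print(tiles_needed(1920, 1080))          # 1080p, no AA  -> 2 tiles

# And the scanout figure quoted above:
print(1920 * 1080 * 4 * 60)              # 497,664,000 bytes/s at 32bpp, 60Hz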
 

Raistlin

Post Count: 9999
arne said:
I'm really wondering here because I am not well versed in many technicalities.

How difficult is the jump from 1080i to 1080p for graphics processing?
I'm just thinking of one example, DOA4, which was rendered natively at 1080i; what would it take for a game like that to make the jump?

I'm not sure exactly either. Obviously more BW is needed ... plus they would need to tile more?

In general, I think most developers would prefer to avoid interlaced rendering if at all possible. If you can't guarantee a really stable framerate, all sorts of nastiness can happen.
 

Raistlin

Post Count: 9999
Shogmaster said:
EDRAM is an additional roadblock for 1080p gaming, but all it means is that the tile count has to be double that of the 720p version of the engine (since 1080p has slightly over 2x the space requirement in the EDRAM as 720p), if they keep everything else the same. So if an engine doing 720p with HDR and no AA fits under one tile, then a 1080p version of that same engine will require 2 tiles.

Now AFAIK, the math works out such that a 720p game with HDR and 2x MSAA takes up around 2 tiles. If you make that 1080p, they can give up the 2x MSAA and still be able to fit into 2 tiles, I think. As long as the analog video out chip's downscaling from 1080p to 720p is similar in effect to 2x MSAA, this should be a fine solution for such situations.

Also, my biggest concern for 1080p60 games from X360 was the possible roadblock from the 500MB/s bus from Xenos to the scaler, but my rudimentary math sez 1920x1080x32x60 fits just under the 500MB limit @ 497,664,000 bytes.

Of course, my info above is second hand and from old memory so could be a bit off.

Good stuff.

I'm just wondering if many devs would want to add in the extra tiling, etc? Obviously that slows things down versus a similar PS3 engine.

I would think the last thing MS would want is for a 1080p game to finally come out ... and have it not look so hot compared to similar PS3 games.
 
Shogmaster said:
This is the chip that prevents HDMI implementation of any kind currently in the X360. AFAIK, there is no way that any add-on device can circumvent this roadblock to HDMI output for the X360.
I think one thing we should have probably realised in the last couple of days is that the chip isn't simply a hardwired DAC-based video out chip. I'm guessing the chip can be programmed and loaded with new software/logic to support new output modes. Also, I'm suspecting that to support HDMI they will probably load new software into this chip so it can use the analog pin-outs as a digital data path, and probably send a digital frame buffer with some form of proprietary encryption so that an external dongle/connector can decrypt it and then re-encrypt it into an HDCP-compliant HDMI signal.

It's probably a very clever approach having a fully programmable video out chip, and I would assume Microsoft probably did it that way so they could adapt their video out options when standards evolve/change.

Honestly, that's the only way I can see how they could do it.
 

Klocker

Member
Moonwalker said:
I think one thing we should have probably realised in the last couple of days is that the chip isn't simply a hardwired DAC-based video out chip. I'm guessing the chip can be programmed and loaded with new software/logic to support new output modes. Also, I'm suspecting that to support HDMI they will probably load new software into this chip so it can use the analog pin-outs as a digital data path, and probably send a digital frame buffer with some form of proprietary encryption so that an external dongle/connector can decrypt it and then re-encrypt it into an HDCP-compliant HDMI signal.

It's probably a very clever approach having a fully programmable video out chip, and I would assume Microsoft probably did it that way so they could adapt their video out options when standards evolve/change.

Honestly, that's the only way I can see how they could do it.


makes sense
 
Onix said:
Good stuff.

I'm just wondering if many devs would want to add in the extra tiling, etc? Obviously that slows things down versus a similar PS3 engine.

I would think the last thing MS would want is for a 1080p game to finally come out ... and have it not look so hot compared to similar PS3 games.

It depends on the number of tiles I think. I think there are many 2-tile games out or in development right now that run fast and smooth, but I'm not sure about 3 tiles. Obviously more tiles = slower performance, so I don't think anyone would want to go beyond 3 tiles.

But the thing is, I think 1080p with HDR can be done easily with 2 tiles if you don't implement MSAA, so I'm not too worried if a dev wants to pursue that route.


Moonwalker said:
I think one thing we should have probably realised in the last couple of days is that the chip isn't simply a hardwired DAC-based video out chip. I'm guessing the chip can be programmed and loaded with new software/logic to support new output modes. Also, I'm suspecting that to support HDMI they will probably load new software into this chip so it can use the analog pin-outs as a digital data path, and probably send a digital frame buffer with some form of proprietary encryption so that an external dongle/connector can decrypt it and then re-encrypt it into an HDCP-compliant HDMI signal.

It's probably a very clever approach having a fully programmable video out chip, and I would assume Microsoft probably did it that way so they could adapt their video out options when standards evolve/change.

Honestly, that's the only way I can see how they could do it.

Wow. That's pretty cool and clever. That would be cool if it is indeed possible. Not that it really matters in the big picture though. HDMI is not at all a requirement for HD console success IMO.
 
Also, my biggest concern for 1080p60 games from X360 was the possible roadblock from the 500MB/s bus from Xenos to the scaler, but my rudimentary math sez 1920x1080x32x60 fits just under the 500MB limit @ 497,664,000 bytes.
Wouldn't it be 24-bit color, not 32-bit? Most displays are 24-bit only (8-bit RGB), so that would bring the bandwidth down to roughly 360MB/s, which means there is enough. Obviously the bus was provisioned with enough bandwidth to satisfy 1080p. Don't expect deep color though. ;)
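
(For what it's worth, here are the raw numbers for both cases against the 500MB/s figure quoted above; whether the link to the scaler actually carries 32 or 24 bits per pixel is an assumption on my part.)

# Scanout bandwidth at 1080p60 for 32-bit vs 24-bit pixels.
def scanout_bytes_per_sec(bits_per_pixel):
    return 1920 * 1080 * (bits_per_pixel // 8) * 60

print(scanout_bytes_per_sec(32))   # 497,664,000 bytes/s (~475 MiB/s)
print(scanout_bytes_per_sec(24))   # 373,248,000 bytes/s (~356 MiB/s)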
 
Moonwalker said:
Wouldn't it be 24-bit color, not 32-bit? Most displays are 24-bit only (8-bit RGB), so that would bring the bandwidth down to roughly 360MB/s, which means there is enough. Obviously the bus was provisioned with enough bandwidth to satisfy 1080p. Don't expect deep color though. ;)

It's either for RGB plus alpha, or for the HDR modes that the X360 has.

Deep color is just another term for HDR, ain't it?
 

Raistlin

Post Count: 9999
Moonwalker said:
I think one thing we should have probably realised in the last couple of days is that the chip isn't simply a hardwired DAC-based video out chip. I'm guessing the chip can be programmed and loaded with new software/logic to support new output modes. Also, I'm suspecting that to support HDMI they will probably load new software into this chip so it can use the analog pin-outs as a digital data path, and probably send a digital frame buffer with some form of proprietary encryption so that an external dongle/connector can decrypt it and then re-encrypt it into an HDCP-compliant HDMI signal.

It's probably a very clever approach having a fully programmable video out chip, and I would assume Microsoft probably did it that way so they could adapt their video out options when standards evolve/change.

Honestly, that's the only way I can see how they could do it.

That would be one costly dongle :D

I'd be quite surprised if the scaler chip was capable of doing something like that over its pin-out ... but who knows. This solution would add some lag though, since you'd be going GPU -> Scaler (encrypts the signal ... and potentially does scaling) -> dongle (decrypts ... then encrypts with HDCP).



The one REALLY serious flaw in this design however is power. An HDMI card actually contains (at least) 2 processors. There is the HDMI transmitter processor ... then there is an HDCP encrypter.


That would be awesome ... the 360 + HD-DVD + HDMI dongle would have THREE power supplies :lol
 
Did an MS executive state that 1080p is impossible this next gen?

How come they turned around so fast while 720p games are struggling to run at 30fps?
 
acousticvan said:
Did an MS executive state that 1080p is impossible this next gen?

How come they turned around so fast while 720p games are struggling to run at 30fps?

There are plenty of games running at 60. The bottleneck to 60 in most cases is games pushing a lot of shaders. The same goes for PS3 in that realm as well.
 

Raistlin

Post Count: 9999
Shogmaster said:
It depends on the number of tiles I think. I think there are many 2-tile games out or in development right now that run fast and smooth, but I'm not sure about 3 tiles. Obviously more tiles = slower performance, so I don't think anyone would want to go beyond 3 tiles.

But the thing is, I think 1080p with HDR can be done easily with 2 tiles if you don't implement MSAA, so I'm not too worried if a dev wants to pursue that route.

Beyond the potential extra tiling, aren't 1080p games still more processor intensive? I'm not super familiar with the inner workings of GPUs ... but not all processing is done in the EDRAM, is it?


Wow. That's pretty cool and clever. That would be cool if it is indeed possible. Not that it really matters in the big picture though. HDMI is not at all a requirement for HD console success IMO.

See my above post.

Time for a bigger Powerstrip!!! :D

[Image: pst10d.jpg - power strip]
 
Onix said:
Beyond the potential extra tiling, aren't 1080p games still more processor intensive? I'm not super familiar with the inner workings of GPUs ... but not all processing is done in the EDRAM, is it?

As I said in the post, all my rambling was talking just about the EDRAM factor. The other roadblocks to 1080p gaming are the same ones faced by PS3: bandwidth between the buses, fillrate, and shader limits.

The bandwidth between the chips and buses is in the same neighborhood for both consoles, so if PS3 can do 1080p, X360 can too. Fillrate is the same too now since RSX has been downgraded to 500MHz. Shader ops are also in the same ballpark even though the two GPUs take totally different routes to do their work (unified vs traditional).

So if the EDRAM/tile situation is OK for 1080p in Xenos, 1080p development should be about as difficult for 360 as PS3 IMO.


See my above post.

Time for a bigger Powerstrip!!! :D

The cost can be passed on to those who actually care about this HDMI BS. Most won't give a shit, nor even know the issue exists! ;)
 

Raistlin

Post Count: 9999
Shogmaster said:
The cost can be passed on to those who actually care about this HDMI BS. Most won't give a shit, nor even know the issue exists! ;)


No no ... I don't care about the cost.

Read my last sentence (Post # 176)!!!



THREE POWER SUPPLIES :lol
 

Raistlin

Post Count: 9999
Shogmaster said:
OH TEH NOES!!! HOW WILL WE LIVE WITH SUCH MONSTROCITY!!!!

I guess.

Not the end of the world ... but seriously, that's Genesis+SegaCD+32x territory right there.

I just can’t imagine plugging in 3 different power supplies for one frigging console … one with a huge brick, one with a small brick … and one coming off of your video cable.

:lol
 
Onix said:
Not the end of the world ... but seriously, that's Genesis+SegaCD+32x territory right there.

I just can’t imagine plugging in 3 different power supplies for one frigging console … one with a huge brick, one with a small brick … and one coming off of your video cable.

:lol

Keep in mind that unlike the Genesis+SegaCD+32x monstrosity, HD-DVD and HDMI are not requirements for X360 gaming. They're just extras for us crazy people (although I'm stopping at the HD-DVD add-on. The HDMI craziness is for someone else!).
 
3 different power supplies?

If the video is flexible enough to pass digital data over the pins, I wouldn't be surprised if the connector could supply a power/volt line to an external box.

Clearly MS have hardware now that can do it. Last time I checked, though, HDMI 1.3 chips were still very pricey (one of the reasons why Sony didn't want to put one in the tard pack originally), so I would assume they're waiting for the price of chips to become more affordable before releasing a cable.

A different video connector is comparable to the 32X/SegaCD? Riiiight.....
 

Raistlin

Post Count: 9999
Shogmaster said:
Keep in mind that unlike the Genesis+SegaCD+32x monstrosity, HD-DVD and HDMI are not requirements for X360 gaming. They're just extras for us crazy people (although I'm stopping at the HD-DVD add-on. The HDMI craziness is for someone else!).

I know ... they are add ons.


My point being that it is so retarded, I can't imagine it selling very well ... which is why even if it were possible (which I don't believe) ... I can't imagine it would ever see the light of day.
 

Raistlin

Post Count: 9999
Moonwalker said:
3 different power supplies?

If the video is flexible enough to pass digital data over the pins, I wouldn't be surprised if the connector could supply a power/volt line to an external box.

Clearly MS have hardware now that can do it. Last time I checked, though, HDMI 1.3 chips were still very pricey (one of the reasons why Sony didn't want to put one in the tard pack originally), so I would assume they're waiting for the price of chips to become more affordable before releasing a cable.

This isn't a USB mouse.

That would be some serious power you'd be putting through there.


A different video connector is comparable to the 32X/SegaCD? Riiiight.....

Let's see ...

360 + large power brick + HD-DVD + small power brick + USB connector cable + HDMI dongle box + 3rd power cord?

Yeah ... I think that's in the same neighborhood.
 

bill0527

Member
Shogmaster said:
The thread was doing so well with its second chance too... :lol

I'm going to continue to believe that it can be done until I see concrete proof telling me that the current version of the Xbox 360 console is not HDMI-capable without a hardware revision.

Would it be costly?

Probably.

And that's why we continue to see statements from Microsoft that they won't be bringing it until the market is ready for it. I don't remember who posted it last night, maybe it was Shog or someone else, but 1080p TVs have something like 1% penetration among all HDTVs in North America and 2% among all HDTVs in Japan. It probably does not make financial sense for them to engineer this cable right now for the 3 people that own 1080p TVs and Xbox 360 systems.
 

HokieJoe

Member
JB1981 said:
But it certainly won't take 1080p that long. Like I said before, 720p was and is a stop-gap resolution. By the time both of these consoles hit their stride 1080p will be everywhere.

It depends on what you mean by "everywhere". The problem is there won't be sufficient market forces in play to move 1080p TVs unless the prices are right.

For network TV, cable, and satellite, 1080p broadcasts will be virtually nonexistent. They don't have the plant capacity, and even if they did, they'd just use it to cram more channels into their respective line-ups, like they do now. Considering that most people use their TV to watch broadcast content, I find it unlikely that this will be a prime mover for 1080p.

HD movies will be delivered at 1080p via disc, but it will take a while for Blu-ray and HD-DVD player prices to hit a price point that pushes mass-market acceptance. That price certainly isn't $500-$1,000. It's more likely in the $199-$299 range and lower. Thus, 1080p HD movies won't be a real mover until players are attractive to the mass market. It could take upwards of two years for player prices to hit mass-market acceptance.

If history tells us anything, it’s that HDTV adoption has been plodding. Most of the reason for this is a lack of broadcast content, and they’ve simply been unaffordable for the majority of people. The best arrow in the 1080p quiver is how well CE manufacturers are able to manage pricing for 1080p sets- at least until analog broadcasts are shut down in 2009.

In the short term, I don’t think that price will be low enough to move 1080p sets sufficiently. When one considers all of the people who haven’t jumped in at 720p/1080i thus far, I don’t see how they’ll suddenly want to buy into higher priced 1080p sets while *more affordable* 720p sets are sitting on store shelves. Without mass market pricing, or affordable content for two+ years, I just can’t see how 1080p will become the standard anytime soon.
 

Raistlin

Post Count: 9999
HokieJoe said:
For network TV, cable, and satellite, 1080p broadcasts will be virtually nonexistent.

I don't want to really get into this again ... but I must ...



1080i/60 broadcasts, which make up the majority of HD TV content ... can only be resolved by a 1080p TV. '1080i' TVs don't exist. Those CRTs simply cannot resolve that resolution.

Furthermore, 1080p video at 30fps or less (in other words, video and movie content) can be interlaced to 1080i60 ... and a 1080p display will de-interlace it, displaying the original progressive content.
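
(A tiny sketch of what that means in practice: a progressive frame split into two interlaced fields can be woven back into the identical frame by the display, assuming the source really was progressive film/video content.)

# Split a progressive frame into its two interlaced fields, then weave them back.
def to_fields(frame):                  # frame: list of scanlines
    return frame[0::2], frame[1::2]    # even field, odd field

def weave(even_field, odd_field):
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame += [even_line, odd_line]
    return frame

frame_1080p = ["line %d" % i for i in range(1080)]
even, odd = to_fields(frame_1080p)
assert weave(even, odd) == frame_1080p   # original progressive frame recovered exactly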
 
Shogmaster said:
Amir said 1080p through Component for games and downloaded movies too.




edit: I WANT EVERYONE TO LEARN THIS GODDAMN CHART ONCE AND FOR ALL!

[Image: NGvideoConOptions-1.gif - next-gen video connection options chart]

And what prevents the PS3 from using a VGA output, apart from many people's wishes?
 