
WiiU technical discussion (serious discussions welcome)

mrklaw

MrArseFace
Has anyone figured out yet whether it really is HDMI version 1.4?

Check the cables, maybe something's written on them.

The 2011 Iwata interview from the Mercury News is a dead link on Wikipedia.

EDIT: Aha, now I remember. Iwata said the tech will be capable of stereoscopic 3D but Nintendo will not focus on it, so that means it definitely has HDMI 1.4.

Well, that makes me confused again: "High Speed HDMI 1.3 cables can support all HDMI 1.4 features except for the HDMI Ethernet Channel"

1.3 is fine for stereo 3d

I think 1.4 adds Ethernet and maybe audio return channel, but even that might be 1.3
 

Stewox

Banned
1.3 is fine for stereo 3d

I think 1.4 adds Ethernet and maybe audio return channel, but even that might be 1.3

It doesn't have support for 120Hz, which is pretty much essential for a 60FPS game; the framerate gets halved once you enable 3D, so it would be 30FPS with 1.3.
 

Thraktor

Member
It doesn't have support for 120Hz, which is pretty much essential for a 60FPS game; the framerate gets halved once you enable 3D, so it would be 30FPS with 1.3.

I would be very surprised if we see more than one or two games this coming gen that run at 120Hz in 3D. It's simply not worth the effort of trying to squeeze 120fps out of a console (even PS4/XBox3) when only a very small proportion of players will even use the 3D mode. Any games which do have 3D options will be 60Hz in 2D and the same in 3D (i.e. 30Hz for each eye).
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It doesn't have support for 120Hz, which is pretty much essential for a 60FPS game; the framerate gets halved once you enable 3D, so it would be 30FPS with 1.3.

My 3D projector is HDMI 1.3 and supports 120Hz just fine.

Also, from a console, the images are normally frame packed, so they only need 60Hz.
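
To put rough numbers on that (the structure figures below are what I believe the HDMI 1.4a frame packing format specifies, so treat them as approximate):

#include <cstdio>

int main() {
    // Sketch of 720p60 frame packing: left eye + active space + right eye
    // stacked into one transmitted frame, sent at the normal 60Hz rate.
    const int width        = 1280;
    const int lines_eye    = 720;
    const int active_space = 30;                                    // gap between the two eye images
    const int packed_lines = lines_eye + active_space + lines_eye;  // 1470 active lines
    const int refresh_hz   = 60;                                    // the link still runs at 60 frames/s

    std::printf("packed frame: %dx%d at %dHz\n", width, packed_lines, refresh_hz);
    std::printf("each eye gets %d full-resolution updates per second\n", refresh_hz);
    return 0;
}

So the source only ever outputs 60 (double-height) frames per second; no 120Hz link is involved.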
 

mrklaw

MrArseFace
It doesn't have support for 120Hz, which is pretty much essential for a 60FPS game; the framerate gets halved once you enable 3D, so it would be 30FPS with 1.3.

It supports 720p/60 3D via frame packing, so you still get 60fps per eye.

No TVs currently support a 120Hz input natively.
 

Oblivion

Fetishing muscular manly men in skintight hosery
The NSMBU team is far from one of Nintendo's B Teams. I would give you NSMB2, but not the main NSMB games.

Well, the NSMB team does have level designers from the A teams, but I was mainly referring to the graphics dudes.

Also, NSMB: DS.
 

Kai Dracon

Writing a dinosaur space opera symphony
Personally, I suspect the main reason why all of Nintendo's initial first party software sticks to 720p - hardware theorycraft aside - is that Nintendo is that cautious. These are their first games with HD assets, so they didn't order the teams to push their limits immediately and possibly get in over their heads, resulting in further delays.

Plus NSMB U has a lot of 2D, hand-drawn art elements and they may have felt the time/labor tradeoff to create them at 1080p wasn't worth it.
 

guek

Banned
Personally, I suspect the main reason why all of Nintendo's initial first party software sticks to 720p - hardware theorycraft aside - is that Nintendo is that cautious. These are their first games with HD assets, so they didn't order the teams to push their limits immediately and possibly get in over their heads, resulting in further delays.

Plus NSMB U has a lot of 2D, hand-drawn art elements and they may have felt the time/labor tradeoff to create them at 1080p wasn't worth it.

There's probably a lot of truth to this. But it's also certainly in line with nintendo's past if they're also literally just thinking "720p is good enough"
 

Appleman

Member
There's probably a lot of truth to this. But it's also certainly in line with nintendo's past if they're also literally just thinking "720p is good enough"

Honestly, if all of their games have the IQ of NSMBU, 720p IS good enough for me. Thankfully I'm not playing on a monitor though *shudder*
 

MDX

Member
There's probably a lot of truth to this. But it's also certainly in line with nintendo's past if they're also literally just thinking "720p is good enough"


Well I would like to see the numbers of HD Ready TVs vs FULL HD TVs.
Could very well be that Nintendo is taking into consideration that most people world-wide have HD Ready TVs.
 

bobeth

Member
Well I would like to see the numbers of HD Ready TVs vs FULL HD TVs.
Could very well be that Nintendo is taking into consideration that most people world-wide have HD Ready TVs.

That's nonsense. 1080p requires rendering more than twice as many pixels as 720p; no need to look any further.
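
The arithmetic, for anyone who wants it spelled out:

#include <cstdio>

int main() {
    const long p720  = 1280L * 720;   //   921,600 pixels
    const long p1080 = 1920L * 1080;  // 2,073,600 pixels
    std::printf("1080p / 720p = %.2fx the pixels\n", (double)p1080 / (double)p720); // 2.25x
    return 0;
}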
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The two most convincing Wii U games from a technological standpoint both use custom in-house engines. The teams didn't have to wait for middleware providers or be afraid to mess with something they don't really understand. And in both cases, the teams are small, so the ports were done by their main engine guys, who had the opportunity to focus exclusively on this one platform. It's probably really that simple.

To make that clear, I'm not saying "lazy developers" are to blame for sub-par or shoddy ports. The workflow and the motivation at Frozenbyte or Shin'en simply isn't comparable to that of three guys at Treyarch or a few guys at WB Games Montreal hacking something together in time for launch because they were told to. Frozenbyte and Shin'en brought their A team, because they have no other teams. They have access to everything, can change any random asset and every line of code as required, and they know their tech down to the last detail. And last, but not least: they want to deliver, because their reputation and ultimately their own money is at stake.
Oh, I absolutely agree with your point. I was merely discussing the BW perspective of Trine2 since that's what I believed Thraktor's inquiry was about. But business-wise, you're absolutely correct.
 

elostyle

Never forget! I'm Dumb!
There's probably a lot of truth to this. But it's also certainly in line with nintendo's past if they're also literally just thinking "720p is good enough"
Honestly, for games with a look like NSMBU, there is not that much of a quality difference between 720p with 4xMSAA and 1080p. Different for Rayman, though.
 

Thraktor

Member
The two most convincing Wii U games from a technological standpoint both use custom in-house engines. The teams didn't have to wait for middleware providers or be afraid to mess with something they don't really understand. And in both cases, the teams are small, so the ports were done by their main engine guys, who had the opportunity to focus exclusively on this one platform. It's probably really that simple.

To make that clear, I'm not saying "lazy developers" are to blame for sub-par or shoddy ports. The workflow and the motivation at Frozenbyte or Shin'en simply isn't comparable to that of three guys at Treyarch or a few guys at WB Games Montreal hacking something together in time for launch because they were told to. Frozenbyte and Shin'en brought their A team, because they have no other teams. They have access to everything, can change any random asset and every line of code as required, and they know their tech down to the last detail. And last, but not least: they want to deliver, because their reputation and ultimately their own money is at stake.

I agree with you in the sense of who is actually working on the ports and the time they've had (and incidentally Joel from Frozenbyte in the Trine 2 thread seems to as well), but I do think that the differences between the games can still provide clues about the Wii U's architecture. For example, the fact that Trine 2 uses (BW-heavy) deferred shading and a fair amount of transparencies on top while performing better than PS360 versions indicates that the eDRAM bandwidth probably isn't a bottleneck in this regard. Conversely, Black Ops 2 uses forward shading (afaik) and (even in a quick'n'cheap port) shouldn't consume the same sort of bandwidth. Hence, we're looking for something other than eDRAM bandwidth that would affect transparencies, which either the shading technique (forward vs. deferred) is dependent on, or which would take time or expertise to work around (which Frozenbyte and Shinen had, but not most other port teams).
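
As a very rough illustration of the kind of bandwidth a deferred G-buffer implies (the layout and pass count below are assumptions for the sake of the example, not Trine 2's actual renderer):

#include <cstdio>

int main() {
    // Assumed G-buffer: four RGBA8 colour targets plus a 32-bit depth buffer,
    // written once in the geometry pass and read once in the lighting pass.
    // Real renderers add overdraw, transparencies, blending and AA on top of this floor.
    const double pixels       = 1280.0 * 720.0;
    const double bytes_per_px = 4 * 4 + 4;   // 20 bytes per pixel
    const double fps          = 30.0;
    const double write_gbps   = pixels * bytes_per_px * fps / 1e9;  // geometry pass writes
    const double read_gbps    = pixels * bytes_per_px * fps / 1e9;  // lighting pass reads
    std::printf("G-buffer traffic floor: ~%.2f GB/s\n", write_gbps + read_gbps); // ~1.1 GB/s
    return 0;
}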
 

Argyle

Member
I agree with you in the sense of who is actually working on the ports and the time they've had (and incidentally Joel from Frozenbyte in the Trine 2 thread seems to as well), but I do think that the differences between the games can still provide clues about the Wii U's architecture. For example, the fact that Trine 2 uses (BW-heavy) deferred shading and a fair amount of transparencies on top while performing better than PS360 versions indicates that the eDRAM bandwidth probably isn't a bottleneck in this regard. Conversely, Black Ops 2 uses forward shading (afaik) and (even in a quick'n'cheap port) shouldn't consume the same sort of bandwidth. Hence, we're looking for something other than eDRAM bandwidth that would affect transparencies, which either the shading technique (forward vs. deferred) is dependent on, or which would take time or expertise to work around (which Frozenbyte and Shinen had, but not most other port teams).

Black Ops 1 reportedly used deferred shading, was it confirmed that they switched back to a forward renderer for Black Ops 2?
 

Thraktor

Member
Black Ops 1 reportedly used deferred shading, was it confirmed that they switched back to a forward renderer for Black Ops 2?

It's entirely possible (nay likely) that I'm wrong on that one, I just had it in my mind that all the CoDs used forward renderers for some reason.
 

ozfunghi

Member
I don't want to start a cross-thread thing, but if you look at this and know that it's being made by just two guys, working on DX10 tech... one has to wonder. Most people assume WiiU games will start looking "somewhat" better in the coming years but won't make as big a leap as from first-gen 360 games to games like Halo 4, just because the hardware is not capable of much more and new techniques won't make that big an impact anymore. But if you look at what those guys are pulling off, is it really that unlikely that studios like Retro would be able to squeeze stuff like this out of the WiiU in a couple of years?
 

Durante

Member
The "DirectX level" (or equivalent) isn't nearly as important for the final result as the raw performance of the device, at least since GPUs became fully programmable. Reset is being built on PCs with "current-gen graphics cards". Depending on the exact model (and the exact Wii U GPU specs which are still not known) that could mean anything from a 5x to a 12x performance gap (more if you count dual-GPU cards, but I doubt they'd be targeting those).
 

ozfunghi

Member
The "DirectX level" (or equivalent) isn't nearly as important for the final result as the raw performance of the device, at least since GPUs became fully programmable. Reset is being built on PCs with "current-gen graphics cards". Depending on the exact model (and the exact Wii U GPU specs which are still not known) that could mean anything from a 5x to a 12x performance gap (more if you count dual-GPU cards, but I doubt they'd be targeting those).

But either way, why are they working on (or restricting themselves to) DX10 tech, if the raw performance of the card that would be needed would be that of a DX11 card anyway? If they want people with DX10 cards to be able to play this, then what are we looking at? What is the best DX10 card available? And are they targeting just a few DX10 cards, or also sub-top DX10 cards? That would mean we are looking at cards a few years old even, and how far behind can the WiiU GPU be, compared to those? I currently still own an HD 4890; I assume that would be one of the more performant cards of the DX10 generation.

It seems silly to me for them to go to the trouble of getting it working on DX10 hardware, robbing themselves of DX11 features, if only a few people with high-end DX10 cards will be able to play it.

Also, completely unrelated to this story, just to draw a comparison, I wanted to ask for opinions on how much graphics will be able to evolve on the software side of things. These are two guys. Two. Not targeting 2000 dollars worth of GPU power.
 

Thraktor

Member
But either way, why are they working on (or restricting themselves to) DX10 tech, if the raw performance of the card that would be needed would be that of a DX11 card anyway? If they want people with DX10 cards to be able to play this, then what are we looking at? What is the best DX10 card available? And are they targeting just a few DX10 cards, or also sub-top DX10 cards? That would mean we are looking at cards a few years old even, and how far behind can the WiiU GPU be, compared to those? I currently still own an HD 4890; I assume that would be one of the more performant cards of the DX10 generation.

It seems silly to me for them to go to the trouble of getting it working on DX10 hardware, robbing themselves of DX11 features, if only a few people with high-end DX10 cards will be able to play it.

Also, completely unrelated to this story, just to draw a comparison, I wanted to ask for opinions on how much graphics will be able to evolve on the software side of things. These are two guys. Two. Not targeting 2000 dollars worth of GPU power.

The GTX680 is the most powerful DX10 card. It's also the most powerful DX11 card, the most powerful DX9 card, etc., etc. My guess is that they only use DX10 simply because they don't consider the extra functionality of DX11 necessary for what they're trying to do. As Durante says, though, the power of the hardware is of much more importance than the version number of the API they're using.
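
For what it's worth, on PC "targeting DX10" usually just means asking the runtime for the 10_0 feature level, which any DX11 card will happily provide. A minimal sketch (Windows/Direct3D 11, error handling omitted):

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Ask the DX11 runtime for a device capped at the DX10 (10_0) feature set.
    // A DX11-class card such as a GTX680 exposes this level just fine.
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_10_0 };
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got;

    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   wanted, 1, D3D11_SDK_VERSION,
                                   &device, &got, &context);
    if (SUCCEEDED(hr)) {           // the game now runs a DX10 feature set on DX11 hardware
        context->Release();
        device->Release();
    }
    return 0;
}

The point being that a title can say "DX10 tech" and still be developed and run on the fastest DX11 cards around.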
 

Argyle

Member
Black Ops 1 was still forward rendering, indicated here.

Ha, looks like you are right.

I was trying to figure out where I got that idea in my head, I think someone who shoulda known better told me and I didn't engage the bullshit filter. I remember my first thought was "and it still runs at 60 on console? damn!" when I heard that...
 

ozfunghi

Member
The GTX680 is the most powerful DX10 card. It's also the most powerful DX11 card, the most powerful DX9 card, etc., etc. My guess is that they only use DX10 simply because they don't consider the extra functionality of DX11 necessary for what they're trying to do. As Durante says, though, the power of the hardware is of much more importance than the version number of the API they're using.

OK, but would they literally mention working on/with/for DX10 tech if they are in fact working on top-of-the-line currently available hardware and just not using the DX11-specific features? They even mention that it is not set in stone that this is what the final build will require. That to me doesn't sound like he's talking about a GTX680 whose DX11 features are just not 'currently' being put to use.
 

disap.ed

Member
I guess with the start of the next-gen consoles developers will finally start to leave DX9 behind. DX10 doesn't make that big a difference versus DX11 feature-wise AFAIK, but DX10 never got its foot in the door, so I guess DX11 will be the way to go a year or two from now.
 

Stewox

Banned
My 3D projector is HDMI 1.3 and supports 120Hz just fine.

Also, from a console, the images are normally frame packed, so they only need 60Hz.


What you're seeing is probably the TV's "120Hz" internal refresh rate. Do you believe the input is the same?



The 120Hz is 100% TV based. It will apply the feature to any form of content being fed into the TV, by any cable type. 120Hz is a refresh rate, or how often the image on the screen is "drawn" (scanned). The TV doesn't care what or how you feed it an image; it's always going to scan at that rate.

There aren't any external devices that feed your TV with anything higher than 60 Hz. The 120 Hz and 240 Hz figures that you see are all internally generated in the TV.

And the web is quite confused about this, so I won't go through the research because it feels unnecessary; I just get the best stuff and don't worry about it. Why would I care about 1.3 if 1.4 has those features supported? It's not like 1.4 cables are hard to find or cost ten times more.

Basically you're pretty stupid to use a 1.3 cable anyway, and you know that, but I still don't quite believe it. Maybe it is 120FPS because you use a lower resolution, but you know, there's confusion on the web and I'm not going to filter what's right and wrong because I don't have time for this.

http://www.hdmi.org/manufacturer/hdmi_1_4/3d.aspx


EDIT:

Looks like it's not real 120FPS: http://www.overclock.net/t/1092386/why-cant-hdmi-support-real-120hz/10

DisplayPort 1.2 can support about 17.28 Gbit/s of video data, which is enough for 2560x1600 @ 120Hz.

HDMI 1.3 has something like 8 Gbit/s, and 1.4 is about the same, but that's not confirmed; some say 1.3 and 1.4 are the same except for audio return and Ethernet.
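
Back-of-envelope data rates, counting active pixels only at 24bpp and ignoring blanking (the link figures are the commonly quoted ones: roughly 8.16 Gbit/s of video data for HDMI 1.3/1.4 and about 17.28 Gbit/s for DisplayPort 1.2):

#include <cstdio>

int main() {
    const double bpp     = 24.0;                                 // bits per pixel
    const double r_wqxga = 2560 * 1600 * 120.0 * bpp / 1e9;      // ~11.8 Gbit/s -> beyond HDMI, needs DP 1.2
    const double r_fp3d  = 1920 * 1080 * 120.0 * bpp / 1e9;      // ~6.0 Gbit/s  (1080p, 60 frames per eye)
    std::printf("2560x1600@120: %.1f Gbit/s\n", r_wqxga);
    std::printf("1080p frame packed, 60/eye: %.1f Gbit/s\n", r_fp3d);
    return 0;
}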

Basically, as far as the WiiU goes, if I see it acting weird I'll replace the HDMI cable; other than that I don't really care about HDMI, since DisplayPort is what I'm interested in for PC monitors. The TV & video market has simply been a pile of shit to me, and it looks like it still is.


EDIT: a better find here

1.4 supports 3D with "the frame packing 3D format at either 720p50 and 1080p24 or 720p60 and 1080p24, side-by-side horizontal at either 1080i50 or 1080i60, and top-and-bottom at either 720p50 and 1080p24 or 720p60 and 1080p24", but no 120Hz.

I thought 1.4 could support 120Hz 3D; it looks like HDMI totally doesn't. But I don't say stuff unless I'm sure, and I double-check and fix my posts.

And no offense, but I've seen you come into tech discussions and state things factually, as if they were 100% certain, and they always happened to be wrong. So are you trolling, or what? Maybe you should stay away from tech, because you're wasting my time here.

http://www.avsforum.com/t/977601/120-hz-hdmi-cables

Bottom line is, your 120Hz is just HDTV PR bullshit. It's fake 120Hz. It's not real; they just do a lot of image manipulation techniques to make it look like it, half-baked. I don't use 3D on my UE40D6750 and it still has those high-end features such as Motion Plus, which are sometimes marketed as "240Hz" or "400Hz"; it doesn't magically improve the signal or what's coming out of the broadcast/device.
 

Koren

Member
Bottom line is, your 120Hz is just HDTV PR bullshit. It's fake 120Hz. It's not real; they just do a lot of image manipulation techniques to make it look like it, half-baked. I don't use 3D on my UE40D6750 and it still has those high-end features such as Motion Plus, which are sometimes marketed as "240Hz" or "400Hz"; it doesn't magically improve the signal or what's coming out of the broadcast/device.
I think there's a lot of confusion because "120Hz" itself is misleading.

There is, at least:
- the number of images (which can pack several frames) sent each second over the cable
- the number of frames sent each second over the cable
- the number of real frames displayed by the set each second
- the number of real frames received by each eye each second
- the number of different (interpolated) frames displayed by the set each second
- the number of different (interpolated) frames received by each eye each second
- the number of frames (refreshed) displayed by the set each second
- the number of frames (refreshed) received by each eye each second

All those numbers can be different, and when you're talking about "120Hz" for a good 3D, I think you'll only want 60 images per second for each eye, which can mean:
- only 60 images per second on the display if it's passive (120 if active)
- only 60 images per second in the cable if both frames are packed

As far as I remember, HDMI can offer 1080p@60 3D frame packing, which means 120 HD frames per second (60 for each eye), which is clearly sufficient.

Even if you want 120Hz per eye on active 3D to avoid flickering, there's no real reason to send 120 images over the cable; you can simply display each twice (think Left_1 Right_1 Left_1 Right_1 Left_2 Right_2 Left_2 Right_2...).


That being said, we have trouble getting 60fps in games. We have trouble getting 1080p. So a video game source at 1080p@60 in 3D is quite a stretch... And as far as I know, there are no movies that use this either.
 

wsippel

Banned
I agree with you in the sense of who is actually working on the ports and the time they've had (and incidentally Joel from Frozenbyte in the Trine 2 thread seems to as well), but I do think that the differences between the games can still provide clues about the Wii U's architecture. For example, the fact that Trine 2 uses (BW-heavy) deferred shading and a fair amount of transparencies on top while performing better than PS360 versions indicates that the eDRAM bandwidth probably isn't a bottleneck in this regard. Conversely, Black Ops 2 uses forward shading (afaik) and (even in a quick'n'cheap port) shouldn't consume the same sort of bandwidth. Hence, we're looking for something other than eDRAM bandwidth that would affect transparencies, which either the shading technique (forward vs. deferred) is dependent on, or which would take time or expertise to work around (which Frozenbyte and Shinen had, but not most other port teams).
It's possible that UE3 and the CoD engine simply don't use the memory "correctly" and store everything in MEM2. As easy as the Wii U reportedly is to develop for, the memory layout is pretty weird and unusual.
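
Purely as a sketch of the idea (placeholder names, sizes and allocator, not the actual Wii U SDK): an engine that treats the small fast pool as the first-choice home for render targets behaves very differently from one that dumps everything into the big pool.

#include <cstdio>

// Hypothetical two-pool layout loosely modelled on the reported Wii U split:
// a small fast pool ("MEM1", eDRAM) and a large slower pool ("MEM2", DDR3).
struct Pool {
    const char*   name;
    unsigned long capacity;
    unsigned long used;
};

void* alloc_from(Pool& p, unsigned long bytes) {
    if (p.used + bytes > p.capacity) return nullptr;
    p.used += bytes;
    return &p;   // stand-in for a real address; enough for the sketch
}

// Put bandwidth-hungry render targets in MEM1 first, fall back to MEM2.
void* alloc_render_target(Pool& mem1, Pool& mem2, unsigned long bytes) {
    if (void* ptr = alloc_from(mem1, bytes)) return ptr;
    std::printf("%lu bytes spill from %s to %s\n", bytes, mem1.name, mem2.name);
    return alloc_from(mem2, bytes);
}

int main() {
    Pool mem1 = { "MEM1", 32ul * 1024 * 1024, 0 };    // 32MB eDRAM (reported figure)
    Pool mem2 = { "MEM2", 1024ul * 1024 * 1024, 0 };  // 1GB DDR3 for games (reported figure)
    alloc_render_target(mem1, mem2, 1280ul * 720 * 4); // a 720p RGBA8 colour buffer
    return 0;
}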
 

mrklaw

MrArseFace
I think there's a lot of confusion because "120Hz" itself is misleading.

There is, at least:
- the number of images (which can pack several frames) sent each second over the cable
- the number of frames sent each second over the cable
- the number of real frames displayed by the set each second
- the number of real frames received by each eye each second
- the number of different (interpolated) frames displayed by the set each second
- the number of different (interpolated) frames received by each eye each second
- the number of frames (refreshed) displayed by the set each second
- the number of frames (refreshed) received by each eye each second

All those numbers can be different, and when you're talking about "120Hz" for a good 3D, I think you'll only want 60 images per second for each eye, which can mean:
- only 60 images per second on the display if it's passive (120 if active)
- only 60 images per second in the cable if both frames are packed

As far as I remember, HDMI can offer 1080p@60 3D frame packing, which means 120 HD frames per second (60 for each eye), which is clearly sufficient.

Even if you want 120Hz per eye on active 3D to avoid flickering, there's no real reason to send 120 images over the cable; you can simply display each twice (think Left_1 Right_1 Left_1 Right_1 Left_2 Right_2 Left_2 Right_2...).


That being said, we have trouble getting 60fps in games. We have trouble getting 1080p. So a video game source at 1080p@60 in 3D is quite a stretch... And as far as I know, there are no movies that use this either.

It's good to clarify how many different places you can measure these things.

However, I think 3D tops out at 1080p/24/3D with frame packing (possibly 30...). Only 720p can do 60 per eye frame packed.

You're getting 120 *images* per second being sent, but the console is only updating them 60 times a second. It's just sending them in pairs for the L/R eyes.
 

beje

Banned
It's possible that UE3 and the CoD engine simply don't use the memory "correctly" and store everything in MEM2. As easy as the Wii U reportedly is to develop for, the memory layout is pretty weird and unusual.

It's pretty likely. I wouldn't expect anything to start completely taking advantage of the console until we see Bayonetta 2 in action, unless any other exclusive game pops out of nowhere Nintendo-style.
 

Koren

Member
However, I think 3D tops out at 1080p/24/3D with frame packing (possibly 30...). Only 720p can do 60 per eye frame packed.
I'm a bit lost in all these HDMI specifications. That being said, the 3D part of HDMI 1.4a is available to anyone, and among the secondary 3D formats there is mention of 1080p@60 in the frame packing format (there's even a 1920x1080p@120, but in side-by-side and top-and-bottom only). So it's not an impossibility per se...

That being said, I don't think there's a TV set that can process this kind of data. It's not mandatory, and the only formats that should always be supported are, if I read correctly, 1080p@24 and 720p@60 in frame packing, which are precisely the two formats you suggest (and 1080p@24 top-and-bottom, 720p@60 top-and-bottom, 1080i side-by-side).

I was under the impression that some recent video cards were able to do 1080p@60 in 3D frame packing over HDMI. I'm not a specialist in video cards at all, so I may have understood it wrong, though.
 

Durante

Member
It's possible that UE3 and the CoD engine simply don't use the memory "correctly" and store everything in MEM2. As easy as the Wii U reportedly is to develop for, the memory layout is pretty weird and unusual.
No, I don't think that's likely. They run far too well for doing everything in the main memory pool. And I don't think that after 6 years of 360 having an eDRAM pool is such an alien/weird concept to developers (yes, I know the specifics differ).

It seems silly to me for them to go to the trouble of getting it working on DX10 hardware, robbing themselves of DX11 features, if only a few people with high-end DX10 cards will be able to play it.
It may seem silly to you, but it's perfectly possible that they simply haven't needed any DX11-specific features yet.
 

lherre

Accurate
Just for your info, UE3 has not been developed/updated for consoles since July (it's still in development for PC and phones, but I don't think there will be much more). The July build is the latest UE3 build for consoles. So the Wii U support (and that of the other current consoles) is what's in that build; it won't be improved, at least officially by Epic. You can always try to update/integrate the changes/improvements yourself.
 

Thraktor

Member
Just for your info, UE3 has not been developed/updated for consoles since July (it's still in development for PC and phones, but I don't think there will be much more). The July build is the latest UE3 build for consoles. So the Wii U support (and that of the other current consoles) is what's in that build; it won't be improved, at least officially by Epic. You can always try to update/integrate the changes/improvements yourself.

Hmm, that's quite interesting. Does that include Orbis and Durango versions? I ask as I imagine quite a few developers may want to use UE3 for cross-generation development, and it would seem strange to finalize the Orbis/Durango builds before final hardware's available. Or are they expecting devs to handle the Orbis/Durango ports themselves?
 

wsippel

Banned
No, I don't think that's likely. They run far too well for doing everything in the main memory pool. And I don't think that after 6 years of 360 having an eDRAM pool is such an alien/weird concept to developers (yes, I know the specifics differ).
The engines probably store something in MEM1. Even on Wii, some things had to be in MEM1, other things had to be in MEM2. xFB was strictly MEM2 for example, if I remember correctly.
 

lherre

Accurate
Hmm, that's quite interesting. Does that include Orbis and Durango versions? I ask as I imagine quite a few developers may want to use UE3 for cross-generation development, and it would seem strange to finalize the Orbis/Durango builds before final hardware's available. Or are they expecting devs to handle the Orbis/Durango ports themselves?

There is no official UE3 support for these platforms, so...
 
I'm a bit lost in all these HDMI specifications. That being said, the 3D part of HDMI 1.4a is available to anyone, and among the secondary 3D formats there is mention of 1080p@60 in the frame packing format (there's even a 1920x1080p@120, but in side-by-side and top-and-bottom only). So it's not an impossibility per se...

That being said, I don't think there's a TV set that can process this kind of data. It's not mandatory, and the only formats that should always be supported are, if I read correctly, 1080p@24 and 720p@60 in frame packing, which are precisely the two formats you suggest (and 1080p@24 top-and-bottom, 720p@60 top-and-bottom, 1080i side-by-side).

I was under the impression that some recent video cards were able to do 1080p@60 in 3D frame packing over HDMI. I'm not a specialist in video cards at all, so I may have understood it wrong, though.

This is all pretty much on the nose. Recent graphics cards on PC DO support frame-packed 1080p60, but I too am unaware of any TVs or projectors that can accept that signal yet.

Furthermore, lower-specced HDMI devices have been able to pull off frame-packed 3D by taking bandwidth away from other things. That's how the PS3 and 360 do it, despite not being HDMI 1.4a devices.

1080p60 SBS works though, but obviously you lose half the horizontal resolution, and at least on PC it seems to have more overhead than frame-packed 720p60. Still, it often looks better (being a linear scale). Some 360 3D games back in the pre-frame-packing days would recommend you play them in 1080p SBS, as it saved a bit of resolution vs 720p SBS; despite being sub-HD even in 2D mode, Black Ops was one such game.
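
A quick comparison of per-eye active resolution for the common 3D transport formats (approximate, active pixels only):

#include <cstdio>

int main() {
    // Frame packing keeps full resolution per eye; side-by-side halves the width;
    // top-and-bottom halves the height.
    struct Fmt { const char* name; int w, h; };
    const Fmt fmts[] = {
        { "720p frame packed   ", 1280, 720  },
        { "1080p frame packed  ", 1920, 1080 },
        { "1080p side-by-side  ",  960, 1080 },
        { "1080p top-and-bottom", 1920, 540  },
        { "720p side-by-side   ",  640, 720  },
    };
    for (const Fmt& f : fmts)
        std::printf("%s per eye: %4dx%4d = %7d px\n", f.name, f.w, f.h, f.w * f.h);
    return 0;
}

Which is why 1080p SBS (about a million pixels per eye) beats 720p SBS (under half a million), even though both halve the horizontal resolution.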
 

Thraktor

Member
There is no official UE3 support for these platforms, so...

OK, thanks. It's a little surprising as I would have expected developers to use it for cross-generation games, but I suppose if developers had been asking for it, Epic would have provided.
 
OK, thanks. It's a little surprising as I would have expected developers to use it for cross-generation games, but I suppose if developers had been asking for it, Epic would have provided.

Epic execs have actually given interviews earlier this year where they gave that same expectation, so I'm surprised too. Did UE4 become their cross-generational engine in response to developer feedback?
 

ozfunghi

Member
None yet, but Iwata confirmed that the hardware's capable of it.

About that... what 3D does Assassin's Creed 3 provide exactly? It has side-by-side, top/bottom or color 3D. How is that supposed to work? I have the game but don't have a 3D TV... You can have color 3D on the GamePad, and my TV splits into two screens for the other options... as you can see, I'm not really into 3D display technology, lol. I assume the color thing is for use with some red/green glasses? I can't imagine that works brilliantly.
 

Earendil

Member
About that... what 3D does Assassin's Creed 3 provide exactly? It has side-by-side, top/bottom or color 3D. How is that supposed to work? I have the game but don't have a 3D TV... You can have color 3D on the GamePad, and my TV splits into two screens for the other options... as you can see, I'm not really into 3D display technology, lol. I assume the color thing is for use with some red/green glasses? I can't imagine that works brilliantly.

Sounds like a migraine waiting to happen.
 