Foghorn Leghorn
Unconfirmed Member
CBOAT attacking Albert, CBOAT attacking this guy... it really is wild at Xbox headquarters right now!
I've only seen the last couple, where's the Albert one?
Wait, they're still releasing native 1080p media? :/
So it's a PC build then presumably?
Or alternatively cutscenes are pre-rendered at higher IQ using in-game assets?
I wouldn't judge a game's quality based on some pre-rendered trailer. Go try to find an XB1 kiosk and play the game for yourself. You need to see the game running on actual XB1 hardware.
Not nearly as much, probably. It's a conservative design based on a very common and well known architecture, exceptionally well documented and with solid and robust tools available. The more unconventional a platform is, the bigger the room for growth. It's by far the least challenging system out of the three.
Xbox One isn't unconventional, it's just poorly designed.
Well, I believe the poster who wrote the post. I also looked at the same lines he did and came to the same conclusion: it was native 1080p. If Microsoft have said specifically that this is from an Xbox One, then it won't be a PC build; that would be illegal to claim, and very silly to get into trouble over for something so minor. I personally think native 1080p cutscenes will be on the XB1 for Ryse.
I'd prefer to take the word of people in the know rather than someone on a forum doing some maths
They have very little actual performance to show for their transistor budget, so it's poorly designed.
Could be pre-recorded video cutscenes like Uncharted used in the past.
.... are you even trying to understand?
They have very little actual performance to show for their transistor budget, so it's poorly designed.
But I'll humor your attempt at deflecting the logic:
The cost is decided by the size of the die.
A 6.1-billion-transistor die on the same process node comes from the same wafers that would be used for the 5-billion-transistor Xbone APU die.
Yields would be somewhat lower, since a bigger die means a higher chance of a fault landing on each die cut from the wafer.
A 20-percent-bigger die for the GPU would cost maybe 40 percent more, depending on yields (as the die gets bigger, yields drop roughly exponentially - see the rough sketch after this post - which is why the giant 5-billion-transistor die with embedded eSRAM is a SHIT idea for a low-end APU like the one the Xbox One uses).
The small 1.2-billion-transistor CPU chip would have much better yields and cost 20 percent or less of the big Xbone chip.
Add some small costs for the more complex PCB needed to connect the two.
So they have this huge APU that has very little performance and costs a lot to make...
Again: poorly designed.
I'm sure this isn't what MS or AMD engineers wanted, but if the suits told them to make it work without VRAM, then this is what they had to do...
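Purely for illustration, here's a minimal sketch of the classic Poisson die-yield model that the "yields drop exponentially" point above rests on. The defect density and die areas are assumed, made-up numbers, not actual AMD or foundry figures.

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Fraction of defect-free dies under the simple Poisson model: Y = exp(-A * D0)."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

D0 = 0.4                                                    # assumed defects per cm^2, illustrative only
dies = {"small CPU-sized die": 1.0, "big Xbone-class APU": 3.6}   # areas in cm^2, rough guesses
for name, area in dies.items():
    y = poisson_yield(area, D0)
    # Cost per *good* die scales like area / yield, so it grows faster than the area itself.
    print(f"{name}: area={area:.1f} cm^2, yield={y:.0%}, relative cost per good die={area / y:.2f}")
```

With these made-up numbers the big die yields roughly a quarter of its chips defect-free while the small one yields about two thirds, which is the whole basis of the "bigger die costs disproportionately more" argument.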
The memory architecture is unconventional, as is its use of lots of dedicated silicon.
There's nothing unconventional about the Xbone, especially for console developers. PS2 and PS3 - now those were unconventional.
Because no scaling tech has magic that produces non-existing information, not even that mega hyper scaler inside the Xbone. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same - blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the output is upscaled with 33% non-existing information, ergo no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p), ergo image quality suffers from the process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready display; the IQ on a full HD display is barely "OK".
Why were you talking about TV scalers then? The Xbox outputs at 1080p; the Xbox does the scaling. My point remains: only very few people can tell Ryse is not 1080p, the game looks jaw-droppingly good and has a very high IQ. Simple.
Because no scaling tech has magic that produces non-existing information. What's not there can't be forged. This upscaling shit has to end, because all it does is interpolate, and the result is always the same - blur. Your very high IQ would hold on a native 900p display, because that is the rendered resolution; the output is upscaled with 33% non-existing information, ergo no 1:1 pixel mapping on a 1080p device (only a 900p image stretched to fit 1080p), ergo image quality suffers from the process. Not so simple anymore. Some current-gen games have great IQ too, like Gears, God of War 3, or The Last of Us... at least on an HD Ready display; the IQ on a full HD display is barely "OK".
And I see that Ryse promotional material is straight from PC... oh my.
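To make the interpolation point concrete: here's a minimal sketch, assuming a plain bilinear filter (the Xbox One's actual scaler is certainly fancier), showing that upscaling a 1600x900 buffer to 1920x1080 only blends samples that were already rendered.

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Upscale a 2-D (grayscale) image with plain bilinear interpolation."""
    in_h, in_w = img.shape
    # Map every output pixel back onto source coordinates.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.clip(y0 + 1, 0, in_h - 1), np.clip(x0 + 1, 0, in_w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]        # blend weights
    top    = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy                    # every output value is a mix of existing ones

rendered  = np.random.rand(900, 1600)                      # stand-in for a native 900p frame
displayed = bilinear_upscale(rendered, 1080, 1920)
print(displayed.shape)                                     # (1080, 1920): more pixels, no new detail
```

Every value in the 1080p output is a weighted average of neighbouring 900p samples, which is exactly why scaling can't recover detail that was never rendered.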
Given that they had to design an architecture around DDR3, I think that it is a very good design. Microsoft's engineers are not stupid. If anything deserves criticism, it's the management that made the requirements (Win8/media multitasking, Kinect, ...) and apparently underestimated the schedule.
Given that they had to design an architecture around DDR3, I think that it is a very good design. Microsoft's engineers are not stupid. If anything deserves criticism, it's the management that made the requirements (Win8/media multitasking, Kinect, ...) and apparently underestimated the schedule.
I think it's the whole package. It's all balanced, as they say. There isn't a piece that's designed for 1080p.
I'm sure it's a lot more complicated than saying this module is X better at only an X% price increase. Besides, 40% is still a pretty huge cost increase.
It isn't.
The IQ on the Ryse trailer is very high, that's what I was referring to. If that's footage that's rendered, upscaled and output all from the Xbox One, then we have nothing to worry about. If it turns out that those scenes were pre-rendered on PC and then output from the Xbox One, then we have an issue.
Did I miss something?
Which I find odd, as I had a similar-power GPU before and it coped with 1080p years ago on my PC, though it used GDDR rather than this DDR/eSRAM solution.
The GPU is 1080p-capable; the memory set-up seemingly isn't (not easily anyway, though Forza shows it can be done).
The memory architecture is unconventional, as is its use of lots of dedicated silicon.
You're on fire tonight.
The IQ on the Ryse trailer is very high, that's what I was referring to. If that's footage that's rendered, upscaled and output all from the Xbox One, then we have nothing to worry about. If it turns out that those scenes were pre-rendered on PC and then output from the Xbox One, then we have an issue.
PS3 had that split RAM problem and lack of a scaler.
I'm sure the Xbone could run many games from years ago at 1080p as well. A similar GPU (Radeon 7770) sits in my system and it certainly won't run Battlefield 4 at 1080p. It's not a sensible comparison.
900p isn't 33% shy of 1080p, is it?
So, this thread taught me a lot of things.
1.) Although consoles are a technical medium, tech doesn't matter. Games do, however, and although consoles are a technical medium, the perceived quality of a game is not at all related to its technical presentation. At all.
2.) Resolution doesn't matter, because the difference between native and non-native resolutions is only visible to 0.0001% of "people".
3.) "People" will buy anything anytime, so quality standards are not necessary, because even for us enthusiasts everything is fine as long as "people" will buy. And they will.
4.) 720p in 2018 will still look great.
5.) We don't even need to start thinking about 60FPS as a standard for the coming years. It doesn't matter anyway, because see 1.)
6.) Dead Rising 3, Killer Instinct 720p, Forza 4 HD and Lair would receive just as much hype and would sell just as well if they were on 360, because it's the games. "People" would be fine staying with 360 anyway, because they don't notice technical differences.
7.) If "people" are happy, (XBOX-)GAF is happy.
Wow, thanks. That's great.
I haven't really been swept up in the controversy surrounding the Xbone and the CoD: Ghosts/Titanfall-are-rendered-at-720p negative buzz, because I figured hey, these are launch games; we'll see future titles upped to 1080p as developers become more attuned to the hardware.
But you're telling me that 720-900p will be the standard for Xbone games throughout the entire gen?! Woooooooow. That's not good at all.
Only when spreading FUD. Upscaling is no magic, and it's not something good either. It's only necessary to fit an image onto a display with a higher resolution. The tech has reached its limit and the result is still the same: worse image quality, because the picture has to be "stretched". There is nothing good about it, and it shouldn't be necessary for games to reach 1080p in 2013...
Fair enough. But you act like the Xbone scaler is some wizardry shit that could make up a 30% resolution difference without any loss in quality.
... Again, please read; I don't have many more ways to keep rephrasing the same thing.
The chip that is 5x more powerful doesn't cost anywhere near 5x as much to make.
Performance per dollar of production cost for the Xbox One APU is very, very poor.
It should be way more cost-effective.
The underlined is what I've been rephrasing 3x for you now; I give up.
I think it's the whole package. It's all balanced, as they say. There isn't a piece that's designed for 1080p.
Good thing the PS4's 50% power advantage has started to show this early... I mean, 900p vs. 1080p is already a ~44% pixel difference... 720p vs. 1080p is 2.25x the pixels... and that's not factoring in other differences in textures and framerates.
1920x1080 is 2,073,600 pixels.
1600x900 is 1,440,000 pixels.
The difference is 633,600 pixels, which is roughly 30.6% of the 1080p figure (put the other way, 1080p has about 44% more pixels than 900p).
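The same arithmetic spelled out, for anyone who wants to check the percentages being thrown around:

```python
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

p1080, p900, p720 = pixels["1080p"], pixels["900p"], pixels["720p"]
print(pixels)                              # {'1080p': 2073600, '900p': 1440000, '720p': 921600}
print((p1080 - p900) / p1080)              # ~0.306 -> 900p renders ~30.6% fewer pixels than 1080p
print((p1080 - p900) / p900)               # ~0.44  -> 1080p has ~44% more pixels than 900p
print(p1080 / p720)                        # 2.25   -> 1080p is 2.25x the pixels of 720p
```

So both figures in the thread are "right", depending on which direction the percentage is taken.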
I didn't compare it to PS2 or PS3, I compared it to PS4. And there's a bigger learning curve compared to that system.
Are you saying there is going to be a huge learning curve for devs on Xbox One, as there was for Cell and the Emotion Engine? I don't see that at all. Improved tools will certainly help, but the memory setup is hardly foreign to developers.
It's my fault, I only skimmed through your posts; I apologise for that. I still think your comparison is a bit naive, as you'd need to make a more direct comparison for such a specific piece of hardware; not denying that you may have a point, though.
Would MS have even been better off with discrete CPU and GPU, and split memory pools, say 3GB main RAM and 5GB GDDR5?
I think they just went for an APU to try and get some longer-term cost reductions (although with process shrinks slowing down, I'm not sure how much of a saving they expect over 5 years). I think both MS and Sony might have been surprised by how expensive the current consoles still are to produce.
They could have got better performance for a similar silicon budget at launch with a discrete GPU, or they could have gone with eDRAM on a daughter die like the 360, but those again don't scale down as easily as an APU.
Shortsighted? Easy to say in hindsight, I guess.
The thing on the right isn't an XBO, it's a considered, and discarded, PS4 architecture.
OP reminded me of that. Hopefully devs will get a handle on the new Xbox soon. It took a long while with the PS3 and in a lot of cases 3rd party support never fully recovered. At least the X1 doesn't seem as confusing as the PS3 was to program for.
Precisely why Xbone's design is so silly. 720p on Xbone vs. 1080p on PS4 is massively underutilising the Xbone, but I suspect it'll be commonplace because of eSRAM.
If MS were sensible they'd set up first party studios and an equivalent of the ICE team (perhaps at 343i), because it would at least minimise the gap for first party games. But looking at stuff like Forza and Fable for Xbone, they're apparently happy for their teams to carry on half-arsing things.
If Xbone engineers were real gamers, they would have immediately refused to design the X1 as it is now. MS should've paired GDDR5 with the APU or opted for a more conventional PowerPC + powerful GPU approach... but they cared about shoving in media apps and advertising more than anything else.
The thing on the right isn't an XBO, it's a considered, and discarded, PS4 architecture.
The thing I'm wondering is: if the considered PS4 design used 1000 GB/s of embedded RAM, why is the Xbox One's eSRAM only around 200 GB/s, or whatever it is?