
Xbox One | Ryse confirmed running native at 900p

Status
Not open for further replies.
That's because you're talking rubbish.

1.3 TF is similar power to a GeForce GTX 560, according to Wikipedia. That's good enough for 1080p graphics, and that card isn't in a closed-box environment surrounded by tons of custom chips, nor does it have eSRAM.

The bolded part is a bottleneck, not an advantage; usually high-end cards have 1GB of GDDR5.

EDIT: Beaten by mere seconds :)
 
I interned with them back in 2009-2010, and also with their research group. So I don't really have insight into their product development cycle, apart from the general vibe I got while working at the research labs: MS researchers come up with great ideas, but somewhere between the innovation end and the actual product it just gets lost ... and the PR spin makes it worse ... that's just the vibe I got, and like I said, I was an intern years back ... I'm in grad school for a PhD now and am currently researching Kinect applications in non-gaming areas: surveillance, medical applications, etc ...

Essentially it's good tech, just not leveraged well ... and IMO, if you're going to put something that expensive and nice into your box, that's what you should be focused on leveraging, not your underpowered GPU, which you had to compromise on because you decided Kinect was the way to go ... that's just my opinion.

Thanks for the response. It must be tough on their engineers. These folks don't normally face the public. Just quietly work on creating amazing new tech. They must shake their heads when they see the PR spin coming from MS marketing.
 
What the hell guys, I haven't slept for over a day and you're somewhat confusing me. Wish my GTA V would hurry up and get here lol.

So 1080p is 125% more pixels than 720p...never thought mrklaw would get a maths question wrong lol.

I was in a rush. Caught out by semantics, bleurgh :P
 
Yes, but consoles and PCs are very different markets ... as I explained, console exclusives are a major selling point. What exclusives do PCs have? I went into more detail on this in my earlier post.
They're not different enough for them to be left out of the next gen conversation.

Dick waving is dick waving, regardless of who's doing it and which platform they're doing it for. If people want to make others feel like their platform of choice is technologically inferior, then they can't complain when others do the same to them.
 
shit, can I borrow your PS4 and Knack for a single day?
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.
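To see why a 40 fps target looks bad on a 60 Hz display: each game frame lasts 1.5 refresh intervals, so with vsync frames alternate between staying on screen for one and two refreshes. A minimal sketch of that snapping (simple vsync model assumed, no triple buffering):

```python
import math

# A 40 fps frame lasts 1.5 refresh intervals on a 60 Hz display. With
# vsync, a new frame can only appear on a refresh boundary, so ideal
# presentation times get snapped up to the next refresh.
REFRESH_HZ, FPS = 60, 40

# Refresh index at which frame n first appears on screen.
shown = [math.ceil(n * REFRESH_HZ / FPS) for n in range(9)]

# Refreshes each frame stays visible: the 2, 1, 2, 1 cadence is judder.
durations = [b - a for a, b in zip(shown, shown[1:])]
print(durations)  # [2, 1, 2, 1, 2, 1, 2, 1]
```

At a locked 30 fps every frame is shown for exactly two refreshes, which is why an even 30 can look smoother than an uneven 40.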
 
and without hardcore gamers at the start of the gen, they don't get the word of mouth and momentum to even start getting that mainstream interested. It is a very high risk strategy and I think they were relying on Sony not being as strong as they are.

Agree on this. They overestimated their crossover with hardcore gamers ... and as some have pointed out, they are trying to fight battles on two fronts: casuals (NFL, TV, Kinect) and hardcore (specs, exclusives). It's very hard to win both ...
 
That's only partly true. Kinect took financial resources away, not silicon resources. The real reason they chose a smaller GPU is that they needed silicon space to accommodate the 32MB of eSRAM: about 1.6 billion transistors for the eSRAM alone in a 5 billion transistor chip. It takes up a decent chunk of physical space on the APU, so they couldn't choose a larger GPU. The PS4 doesn't have this issue, so they can dedicate more space to GPU cores.

Yeah, I'm not exactly best friends with the Microsoft silicon architects nor the suits who called the shots, nor did I ever imply anything like that. It's only my opinion, as everybody has his/her own.
But the HDMI-in also took some of that space and design budget; don't overlook that. Something was the first domino piece that fell on the others ;)
In general, their focus on their greedy plan, with Kinect as a forced medium to collect ad responses, is what ruined the machine IMO.

I could be wrong. It's all how it adds up for me.
 
That's because you're talking rubbish.

1.3 TF is similar power to a GeForce GTX 560, according to Wikipedia. That's good enough for 1080p graphics, and that card isn't in a closed-box environment surrounded by tons of custom chips, nor does it have eSRAM.

The GTX 560 has a much higher fillrate and main memory bandwidth and more ROPs, though.

Edit: beaten. This thread is too damn fast for me.
 
I bloomin' played The Witcher 2 on my gaming laptop at 1080p back in 2011. The Xbone is probably at least twice as powerful as my laptop was. 1080p should be ubiquitous, because it hasn't been a big deal on PCs for years (and on machines less powerful than what we are getting) ...
 
PS4 is underpowered compared to PCs
Xbox is underpowered compared to PS4 and PCs

The only way to keep them competitive for us enthusiasts is to continuously invest in high-quality first-party exclusives for the next 5 years. If they do that, then the power levels are irrelevant - we'll buy them for the games.
 
Well, PS3 and 360 were both able to play some games @1080p/60fps

Your statement is quite inaccurate, my friend, and that's coming from the biggest Sony fan you can find on GAF.

I am talking about full AAA games, not minigames and PSN games. The only games that were 1080p/60fps were Ridge Racer and Wipeout HD, and only RR was a proper retail game. Wipeout was released at retail later.

GT5 had 1080p, but it was 1280x1080. I think this will be the case with Forza too. Stretching the horizontal resolution + good AA is better than a much lower native resolution.

Honestly, anyone with a display under 40 inches probably won't notice the difference, which I assume is quite a few people ... Those of us with 60+ inch screens will be able to tell instantly. Hopefully the upscaler is as good as they say.

I have a 40" TV and the difference is quite noticeable. Thing is, it isn't that big a difference. With a good AA method you can practically play games without aliasing at 900p and the picture will still be sharp. At 1080p you don't need that much AA if you play on a 40" screen. I usually add 2x or 4x AA and the game is clean of aliasing.
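Whether 900p vs 1080p is visible at a given screen size depends on viewing distance too, not the diagonal alone. A rough sketch of the pixels-per-degree math (assumes a 16:9 panel, an 8 ft couch distance, and the common rule of thumb that around 60 pixels per degree approaches the limit of normal acuity):

```python
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_inches):
    """Horizontal pixels per degree of visual angle for a 16:9 display."""
    width = diag_inches * 16 / math.hypot(16, 9)   # screen width in inches
    angle = 2 * math.degrees(math.atan(width / 2 / distance_inches))
    return horiz_pixels / angle

# A 1600-wide (900p) image on 40" vs 60" screens from ~8 ft (96 inches).
for diag in (40, 60):
    print(diag, round(pixels_per_degree(diag, 1600, 96), 1))
# ~78 ppd on the 40": above the ~60 ppd acuity rule of thumb, hard to spot.
# ~52 ppd on the 60": below it, so the softness is easier to see.
```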
 
Yeah, IQ matters much more. Most people won't notice many high-precision effects, for example higher-quality DoF than the competition. And why do you need high-precision effects when your IQ will scale everything down anyway?
And for a game that focuses on close combat and high-fidelity assets, IQ is the priority, because with crappy IQ all fidelity gains are wasted.

Another way to look at it: you have this team of engineers and designers who've been working on this game for years and are willingly trading IQ for effects. If the benefits of IQ over graphical effects were so stark and obvious, don't you think they would go that route? Fact is, claiming that IQ is obviously the way to go no matter what is essentially calling the whole team a bunch of morons. If, on the other hand, you want to argue for IQ as a matter of personal taste, well, that's a different matter.
 
This raises an important question...

Was the 1080p footage released a few days ago 900p upscaled, or was it running at native 1080p on a PC?
I already posted my analysis earlier in the thread. It was running at what looks like 1800x900 upscaled.

However, that doesn't mean it wasn't running on a PC. The AA is shockingly good, and given the downgrade in resolution I'm now less sure it'll always be that way.
 
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

DAMN, you talked with Cerny himself!!!

If the lead architect of the PS4 told you himself not to worry about framerates, you need to listen :)
 
900p | 1600 x 900 | 1,440,000 pixels - 56% more pixels than 720p
1080p | 1920 x 1080 | 2,073,600 pixels - 225% more pixels than 720p, 44% more pixels than 900p


Maths. Everybody has one.


If 1600x900 is 56% more pixels than 720p

1080p can't be 225% more pixels than 720p.


Sorry.
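The arithmetic behind the correction, for anyone following along:

```python
# Pixel counts for the three resolutions in the table above (16:9 assumed).
resolutions = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

def pct_more(a, b):
    """Percentage by which resolution a has more pixels than resolution b."""
    return round((pixels[a] / pixels[b] - 1) * 100)

print(pct_more("900p", "720p"))    # 56  -> matches the table
print(pct_more("1080p", "720p"))   # 125 -> not 225
print(pct_more("1080p", "900p"))   # 44
```

1080p has 2.25x the pixels of 720p, which is 125% more; "225% of" and "225% more" got conflated.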
 
But it does have GDDR5 RAM at 176 GB/s bandwidth, which is a whole lot more than the Xbox One's DDR3 at 68 GB/s. But honestly, you are right. These are just launch release pangs. This always happens. Don't know how Sony have avoided it, but all these games will be topped very quickly later in the launch window or thereafter.

I'd like to see some analysis of how much bandwidth you really need for 1080p if you're not required to dump graphics data across a PCI bus.
It could be that 68 GB/s is enough.
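A back-of-the-envelope version of that analysis (the per-pixel "touches" multiplier is a made-up illustrative number, not a measured figure; real games vary hugely):

```python
# Bandwidth to write one 32-bit 1080p frame buffer, 60 times per second.
W, H, BYTES_PER_PIXEL, FPS = 1920, 1080, 4, 60

gb_per_s = W * H * BYTES_PER_PIXEL * FPS / 1e9
print(gb_per_s)  # ~0.5 GB/s for a single write pass

# Real renderers touch memory many times per output pixel (overdraw,
# G-buffer passes, texturing, post-processing). The multiplier below is
# purely illustrative.
touches = 50
print(gb_per_s * touches)  # ~25 GB/s under that assumption
```

The raw frame buffer is cheap; it's the assumed number of memory touches per pixel that decides whether 68 GB/s is comfortable or tight.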

The bolded part is a bottleneck, not an advantage.

Keep feeding your FUD, Sony console warrior.
 
This raises an important question...

Was the 1080p footage released a few days ago 900p upscaled, or was it running at native 1080p on a PC?

You can't tell that from a 10 Mbps video.
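To put numbers on that: at 10 Mbps a 1080p30 stream gets well under a bit per pixel, so the codec throws away exactly the fine detail you'd need to judge native resolution (1080p30 footage assumed):

```python
# Bits the encoder can spend per pixel in a 10 Mbps, 1080p30 stream.
bitrate = 10e6                 # bits per second
w, h, fps = 1920, 1080, 30

bpp = bitrate / (w * h * fps)
print(round(bpp, 2))           # ~0.16 bits per pixel vs 24 bpp uncompressed,
                               # i.e. roughly 150:1 compression
```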

===

Btw, Digital Foundry made a comparison a while ago with Crysis 3:
http://www.eurogamer.net/articles/digitalfoundry-can-xbox-one-multi-platform-games-compete-with-ps4

[Digital Foundry comparison screenshots: three pairs of 1080p vs 900p captures]
 
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

OT, but how did Knack look in person, visually speaking? Looks decent enough from vids to me, with a clean, attractive look.
 
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

I'm sure they're targeting 40 so that they can lock in 30 and not have to worry about framerate dips in high-density action.
 
PS4 is underpowered compared to PCs
Xbox is underpowered compared to PS4 and PCs

The only way to keep them competitive for us enthusiasts is to continuously invest in high-quality first-party exclusives for the next 5 years. If they do that, then the power levels are irrelevant - we'll buy them for the games.

This would have been true in the PS2 era, when publishers released exclusive games for consoles. Now most games are multiplats, and getting the better-looking version of a multiplat will be a solid incentive for people. Especially if the difference is much bigger than it was between PS3/X360.

One thing is true: MS needs a steady stream of good exclusives now.
 
With a good AA method you can practically play games without aliasing at 900p and the picture will still be sharp. At 1080p you don't need that much AA if you play on a 40" screen. I usually add 2x or 4x AA and the game is clean of aliasing.

No amount of AA will make up for the lower texture resolution...

No? I've watched the Ryse demo on my TV and I'm content. I already know what it's going to look like when it gets here in November.

Looking good to you and being able to tell the difference between 900p and 1080p on a 60-inch (1080p, I assume) display are two different things.
 
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

No it doesn't. GOW3 and Ascension look and play fantastic, and they have variable frame rates, with Ascension running around 45fps on average.
 
At least with Crysis 2 and Crysis 3 they had an excuse. There is no excuse here, just wanting to go all out with CryENGINE features.

Knowing Crytek, they've probably even put in SMAA 4x instead of SMAA T2x ... They probably think that SMAA 4x at 900p is better than SMAA T2x at 1080p - wrong!

I could understand that in multiplatform development, but with exclusives you can budget games as you want.


I have many doubts that Crytek can reach SMAA 4x in Ryse.
 
I'd like to see some analysis of how much bandwidth you really need for 1080p if you're not required to dump graphics data across a PCI bus.
It could be that 68 GB/s is enough.

Highly doubt that, but it would really depend on the game itself. I'd imagine, for example, that Ryse is more bandwidth-hungry than something like COD.
 


What the f...????
Why would you do that?
Use 1080p/30fps as the key target and build your game around it.
900p on a next gen console is horrible/a joke!
 
I'd like to see some analysis of how much bandwidth you really need for 1080p if you're not required to dump graphics data across a PCI bus.
It could be that 68 GB/s is enough.

There is no way to do such an analysis ... because resolution is just one part of the whole picture ...

Of course the Xbone can do 1080p with 68 GB/s ... the 360 and the PS3 could do 1080p ... the Wii U can do 1080p ...

It's all about the trade-offs you want to make in other areas ... there is no magic number of bandwidth, ROPs, CUs, or clock speed that means you can hit 1080p ... it's all about your design and vision ...
 
That's because you're talking rubbish.

1.3 TF is similar power to a GeForce GTX 560, according to Wikipedia. That's good enough for 1080p graphics, and that card isn't in a closed-box environment surrounded by tons of custom chips, nor does it have eSRAM.
Nvidia Flops and AMD Flops are not equal. Generally, Nvidia Flops > AMD Flops.
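The paper numbers behind that comparison (theoretical throughput = shader count × clock × 2 ops per FMA, using the cards' published specs; how efficiently each architecture uses those flops is a separate question):

```python
# Theoretical single-precision throughput: shaders * clock * 2 ops (FMA).
def tflops(shaders, clock_mhz):
    return shaders * clock_mhz * 1e6 * 2 / 1e12

print(round(tflops(336, 1620), 2))  # GTX 560 (hot-clocked shaders): ~1.09
print(round(tflops(768, 853), 2))   # Xbox One GPU: ~1.31
# Similar paper flops, but Fermi and GCN deliver them very differently.
```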
 
They're kind of right, though. Outside of Sony-developed linear, story-style games, which graphically win the generation, the 360 pretty much wins on the other stuff, especially open-world games like Skyrim, Saints Row, etc.
Infamous 2 was open world and did some very impressive stuff with lighting and character detail/animation. You also have NFS: Most Wanted, an open world racer, performing better on PS3.

Overall you do see tradeoffs on the PS3 because multiplatform developers spend most of their time trying to get something that works well on the 360 to run on a console with a very different memory setup and rendering pipeline. Sometimes this leads to a PS3 version that isn't optimal. Crysis for example didn't go tiled deferred on the SPUs and rendered pretty much everything on RSX. There are performance wins on the PS3 that show in select titles because the developers made considerable effort to push the SPUs.
 
It makes perfect sense. If you get 40fps with some dips down to 30 in high-density action, then lock it to 30 overall. It's better than targeting 30, locking 30, and then still getting framerate dips that drop you below 30.
Okay, that's more intelligible. I was thinking "targeting 40fps" as meaning "that's what we'll release at!"
 
The problem with that example is that two different devs were involved. Not only that, but Bungie was never known as a technical powerhouse, whereas 343i was specifically formed as a no-holds-barred superstudio.

PGR3 to PGR4 is a better example. But if Crytek--you know, those guys almost everyone agrees are the single most technically skilled studio in existence?--are having problems, that bodes very ill for less gifted devs in the same timeframe.

Crytek, and any team of talent with a decent enough learning curve on new hardware (that obviously isn't packing the most raw power), can't always pull miracles when crunching for launch. Also, make no mistake, Crytek could easily have made this game 1080p if they wanted, but they made the decision to make it prettier, with more visual bells and whistles. They said it themselves: they wanted it to look even better, and I guess that's what they did. I constantly questioned whether this game was running on a high-end PC, because I thought it looked way too good to be native 1080p on the system at launch. I said numerous times that I thought there was some PC funny business going on, but I guess they simply lowered the resolution and made it prettier.

Crytek are incredibly skilled, for sure, but let's not forget that their 360 and PS3 efforts weren't more impressive than 343's with Halo 4, Naughty Dog's with Uncharted 2 and 3, or even Sony Santa Monica's with God of War 3. It should never be a surprise that devs possibly don't get the most out of a system at launch. We've seen this way too many times before. GAF is supposed to be a community that is more informed than your average gamer, but we seem to be making some pretty hasty and bold assumptions about a system that we (1) knew was harder to develop for, and (2) is bound to experience issues at launch like all consoles before it. Devs need time to come to grips with the system, and Microsoft will improve the tools and drivers to aid development. I know this won't matter to some people who are foaming at the mouth over this news, but I guess it's not that important since the game still looks amazing, so carry on ;)
 
I have many doubts that Crytek can reach SMAA 4x in Ryse.

I had many doubts that they could reach anything higher than Crysis 3's Medium settings, but they managed to put many Very High settings in there.
I also doubt that SMAA 4x is in there, but the latest video was so clean of jaggies, so I dunno, maybe they are stupid enough to compromise image clarity for image stability.

---
Yeah, but the GTX 560 is a 32-ROP GPU and has a pixel fill rate of 22.7 GPix/s. That's 65% more pixel fill rate than the Xbox One has. And don't forget the bandwidth.

From my understanding, 16 ROPs don't really limit you at 1080p without MSAA. 32 ROPs were mostly added to GPUs for multi-monitor rendering, resolutions around 4K, and higher bandwidth.
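For reference, the nominal fill-rate math behind those figures (the Xbox One number assumes 16 ROPs at the 853 MHz GPU clock; the GTX 560 number is the one quoted above):

```python
# Nominal pixel fill rate = ROPs * core clock (pixels written per second).
def gpix_per_s(rops, clock_mhz):
    return rops * clock_mhz * 1e6 / 1e9

xbox_one = gpix_per_s(16, 853)   # ~13.6 GPix/s
gtx_560 = 22.7                   # GPix/s, figure quoted in the post above
print(round(xbox_one, 1))
print(round(gtx_560 / xbox_one - 1, 2))  # ~0.66 -> the "65% more" claim
```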
 
I've played a good 15 minutes of the game (still in development, of course), but Cerny himself told me that they are targeting 40 fps (which looks dreadful on a 60 Hz display). They should either go for a locked 30 or reduce details and resolution in order to hit 60, I think. His comment suggests they are on the wrong track, however.

Stop with this shit. It is absolute horseshit.

A frame here and there is unnoticeable to humans.

GOW3 had the best IQ on consoles and was 40+ fps variable.
MGS4: 30-60fps variable.

and so on.
 
I had many doubts that they could reach anything higher than Crysis 3's Medium settings, but they managed to put many Very High settings in there.
I also doubt that SMAA 4x is in there, but the latest video was so clean of jaggies, so I dunno, maybe they are stupid enough to compromise image clarity for image stability.

---


From my understanding, 16 ROPs don't really limit you at 1080p. 32 ROPs were mostly added to GPUs for multi-monitor rendering and resolutions around 4K.

Wait, 16 ROPs don't really limit resolution, but 32 ROPs were added to support higher resolutions like 4K?
 
Also, make no mistake, Crytek could have easily made this game 1080p if they wanted, but they made the decision to make it prettier with more visual bells and whistles.
Yeah, I realized that was true and corrected myself later.

I said numerous times that I thought there was some pc funny business going on, but I guess they simply lowered the resolution and made it prettier.
The unfortunate truth is that we now know they lowered the resolution, but we can't actually be sure there's no PC funny business going on. By far the most shocking thing about the footage is the AA level. With the downgrade revelation, I'm no longer as sure that's a locked feature.
 