
Do you think next generation will finally be jaggy free?

subversus said:
I hope for MLAA implementation and 1280 as a standard. Fuck 1920. Eats a lot of resources, adds nothing to the experience.

Might as well just revert to 480p with that logic. Smh.


dwebo said:
Yep. I don't even notice them too badly in Wii games. /hugs plasma

Sitting miles away from the screen and your TV blurring the image to fuck isn't a solution to aliasing. The Wii's raw output is supposed to be aliased to all hell; if it isn't, then your TV isn't representing it properly, so it's not something I would shout about.
 
DeBurgo said:
5 years after they're gone there will be a post on neogaf lamenting the loss of jaggies in games.

We already had that post. No, I'm not joking either.


Hazaro said:
Quest for 1080p visuals at 30FPS!
Jaggies!
Quest for 1440p visuals at 30FPS!
Jaggies!
Quest for 1920p visuals at 30FPS!
Jaggies?

Soon we will pack enough power per pixel that we won't need AA anymore.
Truly, developers are wise beyond their years.

The iPhone 4 has a pixel density above 300ppi and yet it still has aliasing issues. Unless everyone is about to buy 5000p displays sometime soon, AA will continue to be necessary.


eso76 said:
considering we already have a decent amount of 720p 4xaa titles,

We do? How many retail console games with actual "high end" visuals and 720p/4xmsaa have released within the last couple of years? There's New Vegas (only on 360 mind) that I know of, not much else besides that. We've seen an awful lot more subHD titles recently than we have games with both a full 720p framebuffer and 4xmsaa. The situation has actually got significantly worse as this generation has progressed, not better.
 

subversus

I've done nothing with my life except eat and fap
Pikelet said:
:lol

I am guessing you don't play PC games?
1920x1080 is a massive improvement over 1280x720. Not just in terms of looks, but also in the extra amount of things that can be thrown on screen.

I play PC games in 1360*768. My TV can handle 1920*1080 at 30 hz only and I don't see any massive improvement. In fact it's hard for me to find a difference if you show me 1920 and 1280 screenshots side by side if they were resized to the same size of course. I can see a difference between 2560 and 1280 though.
 

pestul

Member
subversus said:
I play PC games in 1360*768. My TV can handle 1920*1080 at 30 hz only and I don't see any massive improvement. In fact it's hard for me to find a difference if you show me 1920 and 1280 screenshots side by side if they were resized to the same size of course. I can see a difference between 2560 and 1280 though.
Dude, you need some new eyes.
 

AndyD

aka andydumi
subversus said:
I play PC games in 1360*768. My TV can handle 1920*1080 at 30 hz only and I don't see any massive improvement. In fact it's hard for me to find a difference if you show me 1920 and 1280 screenshots side by side if they were resized to the same size of course. I can see a difference between 2560 and 1280 though.

Most often people resize down, losing the higher res advantage. But I agree with the above, you may need new eyes if you can't tell 720 vs 1080p.
 

eso76

Member
brain_stew said:
We do? How many retail console games with actual "high end" visuals and 720p/4xmsaa have released within the last couple of years? There's New Vegas (only on 360 mind) that I know of, not much else besides that.

according to beyond 3d:

X360

Afro Samurai = 1280x720 (4xAA)
Beowulf = 1280x720 (4xAA)
Blur = 1280x720 (4xAA)
Cars Mater-National = 1280x720 (4xAA)
Club, The = 1280x720 (4xAA)
Chronicles of Riddick: Dark Athena = 1280x720 (4xAA)
Darkness, The = 1280x720 (4xAA)
Dirt = 1280x720 (4xAA)
Dirt 2 = 1280x720 (4xAA)
Fallout 3 = 1280x720 (4xAA, certain edges)
Fantastic Four: Rise of the Silver Surfer = 1280x720 (4xAA)
Fight Night Round 4 = 1280x720 (4xAA)
Fifa Street 3 = 1920x1080 (4xAA)
HAWX = 1280x720 (4xAA)
HAWX2 = 1280x720 (4xAA)
Legend of Spyro: Dawn of the Dragon, The = 1280x720 (4xAA)
Lost Planet = 1280x720 (up to 4xAA dependent on framerate)
NBA Live '08 = 1280x720 (4xAA)
NBA Street Home court (demo) = 1920x1080 (4xAA)
Need For Speed: Pro Street = 1280x720 (4xAA)
Need For Speed: Shift = 1280x720 (4xAA)
Need For Speed: Undercover = 1280x720 (4xAA)
NFL Tour = 1280x720 (4xAA, 60fps)
NHL '09 = 1280x720 (4xAA)
Race Driver: GRID = 1280x720 (4xAA)
Race Pro = 1280x720 (4xAA)
Resident Evil 5 = 1280x720 (dynamic 0-4xAA)
SCORE International Baja 1000 = 1280x720 (4xAA)
SEGA: Rally Revo = 1280x720 (4xAA, alpha to coverage)
Soldier of Fortune: Payback = 1280x720 (4xAA ?)
Spiderman: Web of Shadows = 1280x720 (4xAA)
Super Stars V8: Next Challenge = 1280x720 (4xAA)
Virtua Tennis 2009 = 1280x720 (4xAA)

Ps3

Full Auto 2 (demo) = 1920x1080 (4x AA)
Gran Turismo 5: Prologue = 1080p mode is 1280x1080 (2xAA) in-game while the garage/pit/showrooms are 1920x1080 with no AA. 720p mode is 1280x720 (4xAA)
Heavenly Sword = 1280x720 (4xAA)
NHL '09 = 1280x720 (4xAA)
SEGA: Rally Revo = 1280x720 (4xAA, alpha blend)


It's not a huge number, but I'd say the vast majority of games have at least 2xAA, and there are a few 4xAA titles too.
Last gen we mostly had 480i games with no AA, and few people knew (or cared) that a lot of games ran at lower than 480 resolution. We got maybe two 720p titles (Wreckless 2, and Dragon's Lair 3D had a 1080 mode, not sure).
Assuming the gap between this gen and next gen is comparable to the one between last gen and this gen (and it should be more pronounced, considering the time between them is going to be twice as long), I do think 1080p with 4xAA is the minimum standard we can expect. The vast majority of titles next gen will meet or exceed that, and I expect that to look almost jaggy free, while certainly not as clean as, say, Halo's photo mode.
 
subversus said:
I play PC games in 1360*768. My TV can handle 1920*1080 at 30 hz only and I don't see any massive improvement. In fact it's hard for me to find a difference if you show me 1920 and 1280 screenshots side by side if they were resized to the same size of course. I can see a difference between 2560 and 1280 though.

You don't see any massive improvement because your monitor is unable to display the extra pixels and you're adding interlacing artefacts. Do the comparison on a display that actually has 1080 lines of vertical resolution.
 

dwebo

Member
brain_stew said:
Sitting miles away from the screen and your TV blurring the image to fuck isn't a solution to aliasing. The Wii's raw output is supposed to be aliased to all hell; if it isn't, then your TV isn't representing it properly, so it's not something I would shout about.
I would hardly say I was shouting. As for TV settings, I just pulled recommended numbers for my model off some AV forum, which included bumping sharpness down a few notches IIRC. Besides "LOL tune it properly," I don't see why lowering sharpness isn't a legitimate option if you're really bothered by jaggies. For a console game, what else are you going to do about it?
 
dwebo said:
I would hardly say I was shouting. As for TV settings, I just pulled recommended numbers for my model off some AV forum, which included bumping sharpness down a few notches IIRC. Besides "LOL tune it properly," I don't see why lowering sharpness isn't a legitimate option if you're really bothered by jaggies. For a console game, what else are you going to do about it?

Nothing. Playing your games through a PC emulator is the only possible solution. Adding destructive blur is not.
 

Raistlin

Post Count: 9999
No, but I do believe some level of AA will be pretty close to standard (i.e. not having it will be the outlier).

So at worst, IQ will certainly be better than this gen on the average.
 

Red

Member
Raistlin said:
No, but I do believe some level of AA will be pretty close to standard (i.e. not having it will be the outlier).

So at worst, IQ will certainly be better than this gen on the average.
If the current trend follows, there will be impressive IQ in early titles and those derived from old engines, and increasingly poor IQ will become common as developers try to push more effects.
 

subversus

I've done nothing with my life except eat and fap
brain_stew said:
You don't see any massive improvement because your monitor is unable to display the extra pixels and you're adding interlacing artefacts. Do the comparison on a display that actually has 1080 lines of vertical resolution.


Well, maybe that's the case. They look different on my TV, but not THAT different.
 
Oblivion said:
I have no idea how graphics work, but isn't it also dependent on your T.V.?

If you're running at your native res, no. The inherent problem is that the game renders into a buffer which is a grid of squares (at native res, each square corresponds to one pixel on your display). Now, take this square grid and try to draw a perfectly straight line at 45 degrees. Bam, jaggies. Go into MS Paint, take the line tool and you can see it for yourself clear as day.

MS really wanted to get rid of jaggies this gen, but thanks to the rise of techniques such as deferred rendering (which uses multiple frame buffers and is perfect for handling large amounts of dynamic lighting), developers have trouble fitting traditional AA into the 10MB of eDRAM MS provided for doing "free" AA. I think the next-gen consoles will be better prepared for these rendering techniques, but even still, don't expect miracles.

It's a giant game of "where to put your resources". You could invest some of your processing power in a jaggy-free image, but that same time could instead go towards pushing out more polygons or better animation systems. It's a choice the developer has to make about the best use of their resources.
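To put rough numbers on that 10MB limit, here's a back-of-envelope sketch in Python (the 32-bit colour and 32-bit depth per sample are assumed formats for illustration, not figures from this thread):

# Why 720p with 4xMSAA doesn't fit the 360's 10MB of eDRAM in one pass.
def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=4 + 4):
    """Size of a multisampled colour + depth render target, in megabytes."""
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

EDRAM_MB = 10

for samples in (1, 2, 4):
    size = framebuffer_mb(1280, 720, samples)
    tiles = -(-size // EDRAM_MB)  # ceiling division: tiling passes needed
    print(f"720p, {samples}x MSAA: {size:5.1f} MB -> {int(tiles)} eDRAM tile(s)")

# Prints roughly 7.0 MB / 1 tile, 14.1 MB / 2 tiles, 28.1 MB / 3 tiles. Every
# extra tiling pass makes the "free" MSAA less free, and a deferred renderer
# juggling several fat buffers has it even worse.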
 

subversus

I've done nothing with my life except eat and fap
Stallion Free said:
So you admit your TV is subpar, why should we take your opinion on this matter seriously?

I don't know, you tell me :lol It's just an opinion, I can change it any minute if I really see the difference. Also I'm intrigued now, must obtain a 1920*1080 display to test it properly.
 

Corky

Nine out of ten orphans can't tell the difference.
Current-gen ambitions aren't fully realized even after 5 years:

720p
30fps
4xAA

Expecting next-gen to aim for... :

1080p
60fps
XxAA


...and fail miserably
 
Kodiak said:
god damn brain stew - is there a tech related thread on gaf that you haven't made your bitch?
Obviously not. It's his thing. He loves tech-related threads and speculating based on facts and reasoning. Hell, I visit tech threads just to read his posts half the time.

Edit: I don't get why so many people expect most games to have 0 AA next gen. I mean looking at what MLAA can do now and the performance cost, I would expect it to become an industry standard for pretty much every game out there in the next few years.
 

DonMigs85

Member
I noticed Xbox 1 games looked cleaner and less prone to dot crawl even when just using composite cables on a regular old CRT compared to PS2 or Wii/GCN. Is this due to the 5-line flicker filter or something?
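For what it's worth, a flicker filter is just a small vertical blend across neighbouring scanlines applied before the interlaced signal goes out, and softening stair-step edges is a side effect of that. A minimal sketch of the idea in Python (the 5-tap weights are made up for illustration, not the actual coefficients of any console's hardware):

# Each output scanline becomes a weighted blend of itself and its neighbours,
# which tames interlace flicker and incidentally blurs jaggies a little.
FLICKER_TAPS = [1, 2, 4, 2, 1]  # centre line dominates; normalised by the sum

def flicker_filter(lines):
    """lines: list of scanlines (lists of 0-255 luma values); returns a filtered copy."""
    total = sum(FLICKER_TAPS)
    half = len(FLICKER_TAPS) // 2
    out = []
    for y in range(len(lines)):
        row = []
        for x in range(len(lines[y])):
            acc = 0
            for k, weight in enumerate(FLICKER_TAPS):
                yy = min(max(y + k - half, 0), len(lines) - 1)  # clamp at top/bottom
                acc += weight * lines[yy][x]
            row.append(acc // total)
        out.append(row)
    return out

# A hard horizontal edge (worst case for interlace flicker) gets spread out:
image = [[255] * 8 for _ in range(4)] + [[0] * 8 for _ in range(4)]
print([row[0] for row in flicker_filter(image)])  # [255, 255, 229, 178, 76, 25, 0, 0]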
 

Corky

Nine out of ten orphans can't tell the difference.
Lostconfused said:
Obviously not. It's his thing. He loves tech-related threads and speculating based on facts and reasoning. Hell, I visit tech threads just to read his posts half the time.

I'm not sure if you are being facetious, but I actually do the bolded part xD
 

RedSwirl

Junior Member
The people who want no jaggies and 60fps standard so badly should really just stick to PC gaming.

Details like that are clearly not the biggest priority to a large chunk of console developers. I don't even think 60fps really benefits most games outside of shooters, racing games, music games, and fighting games. If it's not twitch-based I don't even care as long as the framerate is stable.
 

Emitan

Member
Corky said:
I'm not sure if you are being facetious, but I actually do the bolded part xD
Those kinds of posts are the reasons I come here. I was in heaven when everyone was speculating about the 3DS.
 

Dead Man

Member
All I want is the option to vsync in all games, console/PC/whatever. I will take 20fps with jaggies and no tearing over 60fps with tearing. I hate tearing. It makes my brain hurt.
 

Scrow

Still Tagged Accordingly
zoukka said:
You see more shit with bigger resolution.
you see more detail with higher resolution. it doesn't necessarily change your fov or draw distance. it's just more pixels.
 

zoukka

Member
Scrow said:
you see more detail with higher resolution. it doesn't necessarily change your fov or draw distance. it's just more pixels.

Yes it doesn't have anything to do with draw distance technically, but in practice, higher rez enables you to see small shit (that's far away from the camera) better.
 

Kittonwy

Banned
Zombie James said:
It seems like there's this growing movement away from traditional forms of anti-aliasing to more customized, less resource-intensive methods (MLAA, DLAA). These algorithms are only going to get better, and the hardware they're going to be running on will be that much better as well. Do you think next generation will finally be the one where high quality anti-aliasing is cheap enough (performance-wise) for every game to use it?

Maybe next generation, when we have AA techniques that are adequate, people will stop obsessing over image quality, and we'll stop posting pictures of shit ugly-looking games running with 4xMSAA at some crazy resolution and calling them good-looking, because in fact they're not.
Indifferent2.gif
 

Dead Man

Member
Kittonwy said:
Maybe next generation, when we have AA techniques that are adequate, people will stop obsessing over image quality, and we'll stop posting pictures of shit ugly-looking games running with 4xMSAA at some crazy resolution and calling them good-looking, because in fact they're not.
Indifferent2.gif
Every now and then you make a sensible post. This is one of them.
 
Dead Man said:
Every now and then you make a sensible post. This is one of them.

Eh, I have a feeling he's probably trolling PC gaming with that post, regardless of how sound his argument might appear on the surface.
 

Lazy8s

The ghost of Dreamcast past
Real AA, full-precision color blending, and high-accuracy floating point 3D depth are affordable when a tile, rather than a whole frame, is the intermediate buffer for rendering.
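Rough numbers behind that argument, sketched in Python (the 32x32 tile size and the 32-bit colour/depth per-sample format are assumptions for illustration, not any particular GPU's figures): a fully multisampled tile fits comfortably on-chip, while the equivalent whole frame would not.

BYTES_PER_SAMPLE = 4 + 4   # 32-bit colour + 32-bit depth per sample
SAMPLES = 4                # 4x multisampling

def buffer_kb(width, height):
    """Storage for a 4x multisampled colour + depth buffer, in kilobytes."""
    return width * height * SAMPLES * BYTES_PER_SAMPLE / 1024

print(f"32x32 tile, 4x samples:  {buffer_kb(32, 32):8.0f} KB")    # 32 KB, easily on-chip
print(f"1280x720 frame, 4x:      {buffer_kb(1280, 720):8.0f} KB") # ~28,800 KB in VRAM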
 

vg260

Member
I've given up hope for 60fps ever becoming standard. All I want next gen is 1080p as the minimum standard. Shouldn't be hard based on current PC video card performance.
 

V_Arnold

Member
Personally: I do not care.
I can't even count the awesome-looking games this generation.
And yet, for something that I remember fondly, there is a counter-opinion because it is not 720p (Halo 3), because it has "the j4gg13s!" (dunno... Bayonetta? TEKKEN 6?), or because it does not have constant 30fps all the time!!

This generation, for me, has proven that it can deliver totally awesome-looking games, with flaws that can or cannot be tolerated. I tolerate the hell out of them, but I do not think next gen will "solve" the "issue", because as long as the next gen's power has a limit, developers will keep cutting corners.

BTW, the only game where aliasing gave me a less than optimal experience was Naruto Ultimate Ninja Storm 2's adventure screens on Xbox 360. Horrible, because I could not even see the characters properly.
 
It would be nice, but the hardware cost of a console with that much power would be high, making for an expensive launch. More expensive than the last couple, even.
 

Raistlin

Post Count: 9999
Crunched said:
If the current trend follows, there will be impressive IQ in early titles and those derived from old engines, and increasingly poor IQ will become common as developers try to push more effects.

It depends which current trend you're following.

PS3 IQ has continued to improve throughout the generation.
 
Crunched said:
If the current trend follows, there will be impressive IQ in early titles and those derived from old engines, and increasingly poor IQ will become common as developers try to push more effects.

Techniques like MLAA are practically free on modern PC GPUs (less than a millisecond of rendertime at 1080p, half that at 720p), and considering the Wii 2/PS4/Xbox 3 will all be using GPUs derived from contemporary PC GPUs, I just think it'd make no sense to ditch it. It's not going to claw back any significant amount of performance, so what's to gain by ditching it? I expect techniques like MLAA to be pretty much a minimum standard next generation, but hopefully they're not used in isolation.
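As a quick sanity check on "practically free", here's the quoted per-frame cost expressed as a share of the GPU frame budget (just arithmetic on the figures above, assuming a 60fps target at 720p and a 30fps target at 1080p for the sake of the example):

def budget_share(aa_ms, target_fps):
    """MLAA cost as a percentage of the per-frame GPU time budget."""
    frame_ms = 1000.0 / target_fps
    return 100.0 * aa_ms / frame_ms

print(f"0.5ms MLAA, 720p @ 60fps:  {budget_share(0.5, 60):.1f}% of the frame")  # ~3.0%
print(f"1.0ms MLAA, 1080p @ 30fps: {budget_share(1.0, 30):.1f}% of the frame")  # ~3.0%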
 
Lostconfused said:
Obviously not. It's his thing. He loves tech-related threads and speculating based on facts and reasoning. Hell, I visit tech threads just to read his posts half the time.

Edit: I don't get why so many people expect most games to have 0 AA next gen. I mean looking at what MLAA can do now and the performance cost, I would expect it to become an industry standard for pretty much every game out there in the next few years.

On these next generation consoles, using better quality MLAA than featured in God of War will cost you less than 1 frame per second in a game that's targeted at 30fps.* 1 whole frame. Of course its use will be incredibly widespread and yes, it should indeed be the minimum standard.


*Current unoptimised PC implementations take less than half a millisecond of GPU rendertime at 720p on a 3 year old GPU. Conservatively put that at 1ms for a 1080p frame on next generation consoles (to ensure we can use a higher quality algorithm) and consider that a standard 30fps game has roughly 33ms of GPU rendertime per frame.
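Checking that footnote's arithmetic with the assumed 1ms MLAA pass at 1080p:

TARGET_FPS = 30
MLAA_MS = 1.0                        # assumed per-frame cost of the MLAA pass

frame_ms = 1000.0 / TARGET_FPS       # ~33.3ms of GPU time per frame at 30fps
fps_with_mlaa = 1000.0 / (frame_ms + MLAA_MS)

print(f"Without MLAA: {TARGET_FPS:.1f} fps")       # 30.0 fps
print(f"With MLAA:    {fps_with_mlaa:.1f} fps")    # ~29.1 fps, i.e. under 1 fps lost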




RedSwirl said:
The people who want no jaggies and 60fps standard so badly should really just stick to PC gaming.

Details like that are clearly not the biggest priority to a large chunk of console developers. I don't even think 60fps really benefits most games outside of shooters, racing games, music games, and fighting games. If it's not twitch-based I don't even care as long as the framerate is stable.

You don't think 60fps benefits games outside of 4 of the most popular genres of games around? I don't think you're making a strong argument here.
 

DonMigs85

Member
Even in 720p mode I'd have to say Wipeout HD is the cleanest, smoothest game I've seen on PS3 save for the minor screen tear at the top. The Ratchet games look smooth too.
Even Burnout Paradise looks quite smooth, and that's only with 2x MSAA. Jaggies didn't bug me too much in RE5 on PS3 either, but the 360 version is definitely cleaner-looking.
Supposedly GT5 Prologue uses 4x MSAA in 720p mode but the cars still look pretty aliased at times. I guess it's highly selective.
 

Kittonwy

Banned
jim-jam bongs said:
Eh, I have a feeling he's probably trolling PC gaming with that post, regardless of how sound his argument might appear on the surface.

I actually find PC games like Crysis, StarCraft 2, Witcher 2 and Two Worlds 2 to be pretty damn good-looking; I just don't find an ugly-looking game (poor lighting, poor character models/environments, etc.) good-looking just because it's running with full anti-aliasing at some outrageous resolution. Lipstick on a pig, etc, etc. My point is that hopefully developers will have good anti-aliasing in their games so people don't get hung up on jaggies anymore.
Indifferent2.gif
 