
Are you in or out on iterative consoles?

I'm not sure if you really took the time to understand what I was saying the first time then, because you have basically reiterated what I was trying to get at. It seems like it hit you as you were typing it out.

Lol, I actually haven't changed my stance at all. There was nothing in that paragraph that I hadn't said or hinted at in previous posts.

But, following that and to the rest of your reply (and some subsequent replies), I just want to point this out... This whole thread is predicated on the idea that Sony are currently working on a 4.5 console to be released in the next 18 months, within the PS4's first 3-4 year time frame. I've just argued a model that I believe would work if that turns out to be true.

My belief is that, done right, it would smooth the transition from one generation to the next with smaller, more regular speed bumps instead of larger, more sudden changes.

It would be better for FC, which is a given if both machines, PS4->PS5 or PS4.5->PS5, are based on the same architecture, because the older machine will be closer in power to the newer one, so for cross-gen releases the power divide isn't so pronounced.

As for a new SKU only being for scaled graphics, that's only a short-term view. As the next console hits and the last falls away, each iteration will have its turn as lead platform.

Really though, I'm just happy with whatever Sony give me. The only way I'm moving away from Consoles and Playstation in particular is if Sony do something incredibly stupid.
 
I've just argued a model that I believe would work if that turns out to be true.

My belief is that, done right, it would smooth the transition from one generation to the next with smaller, more regular speed bumps instead of larger, more sudden changes.
I can respect that. I don't agree, but I respect your belief in it. My stance has also remained consistent - I guess we draw from the same context but come to different conclusions.
 
I would not upgrade my console every couple of years. I would upgrade every 5 years, or whenever there is a major change that leaves the older console obsolete. I'm all for backwards compatibility of the software, though; hopefully we will have that moving forward.

LB
 
I don't really see a point in releasing a second SKU if it's just for graphics scaling.

I don't really see a point in releasing a new model of smartphone every year or less, and yet...

Another fundamental misunderstanding about this conversation is the idea that iterative consoles are only to get/"force" people to upgrade in shorter time frames. Let's set aside that no one ever forces you at gunpoint to be all "#shutupandtakemymoney" just because the nerd equivalent of Malibu Stacy getting a new hat happened. Actually, the opposite is a key factor here as well: some people want to upgrade LESS often, and iterative consoles give those people more options to loop themselves back into consoles.

The number of consoles that Sony and Microsoft sold near the tail end of the longest generation in history, without really being THAT much cheaper than in the middle of the generation, seems to indicate that a number of customers don't want/need to update their consoles every 5-6 years and were willing to stretch out the time they take updating to something new. PS3/360 were enough of a technological jump that, even when they were long in the tooth, they were still a good investment. But as we know very well by now, the times of bleeding-edge hardware on consoles are over.

Let's say hypothetically I bought a 360 in 2008. I got a good 4 years out of it before new consoles hit the scene, but I want to get more out of it before I upgrade. Right now, the business model doesn't accommodate that very well.
And while it theoretically could accommodate that while retaining the current 5-6 year cycle, cross-gen development becomes a more difficult proposition when you're discussing a difference in hardware capability that's THAT wide. It needs that in-between step to make it viable, much like mid-tier PC hardware keeps the investment in multiple spec targets viable in the PC gaming market.

So it's 2016, and I'm finally in the market for a new hardware box. Xbox One and PS4 are already horribly outmoded technologically. So my choices are to buy the aging hardware that will be outmoded in 2-3 years, wait 2-3 years with nothing new to play, or subject myself to PC gaming, something I have avoided for years.
Again, the current business model gives two less-than-ideal options, plus an option console makers don't want you to consider because they run the risk of losing a return customer to the console market, a risk they can't take. Giving people something new in that timeframe keeps them invested in console purchases when their purchasing choices don't align with the mandated industry schedule of every 5-6 years.

Lastly, the reason people buy consoles isn't because they want something that lasts. The malaise that sets in at the end of a generation, and the eagerness to trade up, tell a different story.
No, what they look for is a clearly defined upgrade path. PC hardware changes with a new GPU series once a year at the latest, so it's never clear when the real "game-changer" happens; the enthusiast press for PC gaming doesn't exactly help matters on this. Consoles make that clear to consumers right now. And that's something that cutting the iteration cycle to every 3 years won't take away.
 
Another fundamental misunderstanding about this conversation is the idea that iterative consoles are only to get/"force" people to upgrade in shorter time frames.
This has been covered earlier. It isn't about actually being forced to upgrade, but the value proposition, which depends on the consumer and their experiences and expectations of the product they're about to invest in.

And while it theoretically could accommodate that while retaining the current 5-6 year cycle, cross-gen development becomes a more difficult proposition when you're discussing a difference in hardware capability that's THAT wide. It needs that in-between step to make it viable, much like mid-tier PC hardware keeps the investment in multiple spec targets viable in the PC gaming market.
Cross-gen games that currently exist prove otherwise, and graphics scaling would account for a lot, given it's the same architecture. What drives cross-gen games is the install base of the older console far outnumbering the new console's base during the transition period, not simply the graphical difference.

So it's 2016, and I'm finally in the market for a new hardware box. Xbox One and PS4 are already horribly outmoded technologically. So my choices are to buy the aging hardware that will be outmoded in 2-3 years, wait 2-3 years with nothing new to play, or subject myself to PC gaming, something I have avoided for years.
Again, the current business model gives two less-than-ideal options, plus an option console makers don't want you to consider because they run the risk of losing a return customer to the console market, a risk they can't take. Giving people something new in that timeframe keeps them invested in console purchases when their purchasing choices don't align with the mandated industry schedule of every 5-6 years.
Maybe you should give it a try. What keeps people invested in consoles is (surprise!) good games and a good experience, primarily. Hardware is only one factor, and it is not THE factor.

Lastly, the reason people buy consoles isn't because they want something that lasts. The malaise that sets in at the end of a generation, and the eagerness to trade up, tell a different story.
No, what they look for is a clearly defined upgrade path. PC hardware changes with a new GPU series once a year at the latest, so it's never clear when the real "game-changer" happens; the enthusiast press for PC gaming doesn't exactly help matters on this. Consoles make that clear to consumers right now. And that's something that cutting the iteration cycle to every 3 years won't take away.
Oh please. Don't claim to speak for PC gamers when you admit you actively avoid it. And don't claim to speak for all console gamers either. Numerous people in just this thread would take offense at that. People buy into consoles for reasons that are technological, economic, and personal, and if this thread and the reaction across a myriad of other sites that have reported the rumor are anything to go by, people have their own ideas about what a console is or means to them. Nobody has the right to dictate this.
 
I don't see that. It seems to me that tech is moving fairly slow, compared to the insane jumps in graphical power we used to see. PS1 to N64 to Dreamcast, for example, in just a few years.

Hell, the time between N64 and Dreamcast was almost exactly what PS4 to PS4.5 would be if they released it right now. They could put a GTX980 in there and it still wouldn't be that kind of leap.

Same thing on the PC side back then. Those first generations of graphics cards were nuts.

I can't see a tech reason for this model, not at all. Technology moves along, of course, but I'd say past consoles had it far worse on that front. All I see is the phone/tablet market's success inspiring greed.
Agreed. I wouldn't say tech is moving slowly, but in terms of perceivable visual advancements to the average person on the street, it certainly appears slower. The progress from the 90s to the mid 2000s was insane by comparison.

I don't see the argument at all that tech is advancing faster than ever, so we need constant hardware updates.
 
Really though, I'm just happy with whatever Sony give me. The only way I'm moving away from Consoles and Playstation in particular is if Sony do something incredibly stupid.

Honestly, this seems to be telling of your argument. A good deal of people don't feel that strongly towards a particular manufacturer and are more platform-agnostic, which is why you're seeing people say they would bail and go to PC if Sony or Microsoft started coming out with new consoles every 2-3 years.

Personally, a 2-3 year cycle would feel like being in perpetual cross-gen limbo, where games are announced in one cycle and set to come out in the next.
 
Would probably just put that money towards upgrading my PC.

Iterative console + closed ecosystem just seems like a match made in hell to me.
 
Would probably just put that money towards upgrading my PC.

Iterative console + closed ecosystem just seems like a match made in hell to me.


I don't see what problems it could bring that would make it a match made in hell.

If you have an iPhone 6 and upgrade on the 6 > 7 path, the 6S coming out in the middle doesn't impact you one bit.
 
I don't see what problems it could bring that would make it a match made in hell.

If you have an iPhone 6 and upgrade on the 6 > 7 path, the 6S coming out in the middle doesn't impact you one bit.

Well, I am also commenting on it as someone who works in the industry and already wants to bang their head against the wall thanks to console cert etc.
I will literally curse Sony's name if they go through with this. lol
 
I don't see what problems it could bring that would make it a match made in hell.

If you have an iPhone 6 and upgrade on the 6 > 7 path, the 6S coming out in the middle doesn't impact you one bit.

Consoles aren't phones though, some games take 5-6 years to make. As a developer that may mean you're working on 2-3 different generations as well as 3-5 different platforms...fuck that
 
Thinking about this again, I guess most people would not have problems, at least here. For the people who buy multiple consoles and get every hardware revision (especially with handhelds), this would be great.

I suppose I have to accept that despite consoles being trapped in time and keeping the same cycle for years, post smartphone boom things have to change. At the rate hardware evolves, I've never seen a console generation feel as outdated as this one.
 
Consoles aren't phones though, some games take 5-6 years to make. As a developer that may mean you're working on 2-3 different generations as well as 3-5 different platforms...fuck that

Then the industry may have to change to cater to it. Some larger-scale games warrant it, sure, but a lot of games that take that long (Duke Nukem Forever, The Last Guardian, etc.) likely won't make their cost back, and we are already seeing large studios closing due to not being profitable. There's a huge amount of wasted cost in development due to poor planning, rewriting from scratch, changing engines, etc. Those days will have to be gone, and the right people will need to be put in charge of projects to make sure things like that do not happen.

Don't get me wrong, there are some smaller studios who have long development times simply because they are small and likely already do everything as efficiently as possible. This cycle change would hurt them the most; there would need to be solid hardware roadmaps for them to work towards to make it possible to continue working that way. That's up to Sony/MS to provide the correct tools and information if they wanted this to work.
 
Well, I am also commenting on it as someone who works in the industry and already wants to bang their head against the wall thanks to console cert etc.
I will literally curse Sony's name if they go through with this. lol

Yeah, that's something that bothered me about those announcements. How hard will it be for devs to code the games (not even talking about coding to the metal)?
It can also be deceptive for the consumer, since a game can look very good, but not on his console iteration.

Can it lead to games with horrible framerates on the non-upgraded iteration?

I think it's way more problematic, and it kills a great advantage that consoles have over PC... optimisation.

I'm strongly against it, whether Sony or MS proposes it.

I can deal with it on my PC because that's how the platform works: always tweaking it to be better.

But I won't buy a PS4.5; for me that's contradictory to the definition of a console.
 
Then the industry may have to change to cater to it. Some larger-scale games warrant it, sure, but a lot of games that take that long (Duke Nukem Forever, The Last Guardian, etc.) likely won't make their cost back, and we are already seeing large studios closing due to not being profitable. There's a huge amount of wasted cost in development due to poor planning, rewriting from scratch, changing engines, etc. Those days will have to be gone, and the right people will need to be put in charge of projects to make sure things like that do not happen.

Don't get me wrong, there are some smaller studios who have long development times simply because they are small and likely already do everything as efficiently as possible. This cycle change would hurt them the most; there would need to be solid hardware roadmaps for them to work towards to make it possible to continue working that way. That's up to Sony/MS to provide the correct tools and information if they wanted this to work.

I'm not just talking delayed or extended game development, I'm talking about larger open world games made by big studios mostly.
 
Yeah, that's something that bothered me about those announcements. How hard will it be for devs to code the games (not even talking about coding to the metal)?
It can also be deceptive for the consumer, since a game can look very good, but not on his console iteration.

Can it lead to games with horrible framerates on the non-upgraded iteration?

I think it's way more problematic, and it kills a great advantage that consoles have over PC... optimisation.

I'm strongly against it, whether Sony or MS proposes it.

I can deal with it on my PC because that's how the platform works: always tweaking it to be better.

But I won't buy a PS4.5; for me that's contradictory to the definition of a console.

But consoles have no real definition, only what's been set as a precedent so far.

Consoles at launch have traditionally been a step up on PCs. Even at the 360 launch, no driving game on PC looked anywhere near as good as PGR3. It looked incredible at the time. That didn't last very long, though.

The moment this gen came out it was already behind what PC could achieve, and it has struggled to keep up; that gap will only grow. This is to help consoles as a platform stay relevant: a cheaper model could be available to keep them accessible to the masses, and a + model for enthusiasts.

If Microsoft plan to do this and tie it into UWP, then it makes PC developers' lives much easier.


Release one code base across Windows and Xbox
Lock low specs to Xbox One
Lock High specs to Xbox One +
PC release with scalable Low > ultra settings

That's three separate markets you have access to: average gamers and families with the base model, console enthusiasts with the + model, and PC gamers. All on one development path, one product, without being limited to one section of the market, while keeping development costs low by not having to outsource ports, etc.

Granted, no one wants UWP, and it might end up being dropped because Microsoft can't commit to anything ever, but it's a solid vision if they could pull it off.
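To illustrate the "one code base, locked presets per tier" idea above, here's a minimal sketch. Everything in it is hypothetical (the platform names, the preset values, the function name); it just shows the shape of the approach: console tiers get a fixed, benchmarked preset, while PC keeps user-tunable sliders.

```python
# Hypothetical sketch of "one code base, locked presets per tier".
# Platform names and settings values are illustrative only.

LOCKED_PRESETS = {
    "xbox_one":      {"resolution": (1600, 900),  "shadows": "medium", "aa": "fxaa"},
    "xbox_one_plus": {"resolution": (1920, 1080), "shadows": "high",   "aa": "msaa_4x"},
}

def graphics_settings(platform, user_overrides=None):
    """Return the settings the renderer should use.

    Console tiers get a fixed, pre-benchmarked preset; PC starts from
    a default and applies whatever the user sets in the options menu.
    """
    if platform in LOCKED_PRESETS:
        # Console tier: settings are locked, user overrides are ignored.
        return dict(LOCKED_PRESETS[platform])
    # PC: scalable low -> ultra, user-tunable.
    settings = {"resolution": (1920, 1080), "shadows": "high", "aa": "msaa_4x"}
    settings.update(user_overrides or {})
    return settings
```

The point is that the rendering code only ever sees one settings dictionary; whether it came from a locked console preset or a PC options menu is invisible to it, which is what lets one build target cover all three markets.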
 
If this happened, and I played consoles exclusively, I would honestly start considering just grabbing a PC.

It would be basically only about graphics and performance at that point anyway, right? A PC would do it better, with better control, features, and upgradability. It is a weird idea; I am not sure how or why they would do it. At least for Sony. For MS it makes a bit of sense given Win10.
 
Honestly this seems to be telling of your argument. A good deal of people don't feel that strongly towards a particular manufacturer

Possibly; I can't say I feel that strongly towards Sony, more towards console gaming in general. Under the current business model, if MS offered a more compelling product next gen I'd consider switching. I'd prefer platform security, though. Like everyone else, I dread binning my PS3 because most of the stuff on there can't carry across; I'd love to see an end to that mentality.

Consoles aren't phones though, some games take 5-6 years to make. As a developer that may mean you're working on 2-3 different generations as well as 3-5 different platforms...fuck that

You're right, they are not, and an iteration model doesn't make them phones either. There's no reason games can't take 5-6 years to develop. In fact it should be easier (see my earlier Last Guardian example), since dev tools should be more consistent between generations. And I agree: 3+ different generations, 3-5 different platforms, fuck that. If they do go down this route I hope they are sensible enough to stick to 1 platform with 2 generations at a time.

I'm not just talking delayed or extended game development, I'm talking about larger open world games made by big studios mostly.

Done right, I really can't see how they'd be affected. In fact I think it would be easier for devs in terms of timescales and targeting a platform as a whole.

I can't believe it's beyond the wit of man to solve any associated problems, and I can't see publishers wanting to stop producing games in an obviously popular and profitable genre. Don't worry - Ubisoft and Square Enix will find a way.

Actually scrap that - I'm suddenly worried Square Enix have already found a way and I don't like their answer...
 
You're right, they are not, and an iteration model doesn't make them phones either. There's no reason games can't take 5-6 years to develop. In fact it should be easier (see my earlier Last Guardian example), since dev tools should be more consistent between generations. And I agree: 3+ different generations, 3-5 different platforms, fuck that. If they do go down this route I hope they are sensible enough to stick to 1 platform with 2 generations at a time.

I meant if a developer was looking to make a game for multiple platforms, i.e.:
Xbox One + OnePlus
PS4 + PS4.5 (with PS5 only a couple of years after)
NX + NXPlus
PC

Optimising and QA for all of that would be murder, surely?

Don't worry - Ubisoft and Square Enix will find a way.

Lol, they are not who I meant. More like Rocksteady and Bethesda, but I guess the point is the same.
 
I meant if a developer was looking to make a game for multiple platforms, i.e.:
Xbox One + OnePlus
PS4 + PS4.5 (with PS5 only a couple of years after)
NX + NXPlus
PC

Optimising and QA for all of that would be murder, surely?

I don't think so. You're still targeting the same four platforms: PS4, XB1, NX, and PC. The fact that on all four platforms options exist for higher fidelity on more powerful hardware is almost beside the point; they've been doing that on PC for decades.


As for QA, I've seen Square Enix's and Ubisoft's answer to that one anyway, so I doubt it would faze them either.
 
I don't think so. You're still targeting the same four platforms: PS4, XB1, NX, and PC. The fact that on all four platforms options exist for higher fidelity on more powerful hardware is almost beside the point; they've been doing that on PC for decades.


As for QA, I've seen Square Enix's and Ubisoft's answer to that one anyway, so I doubt it would faze them either.


This... Your platform total doesn't change; just within each ecosystem there are performance baselines. It's little different from a PC developer offering multiple visual options, except that they would benchmark each hardware revision and lock in the settings best suited to that iteration.
 
This... Your platform total doesn't change; just within each ecosystem there are performance baselines. It's little different from a PC developer offering multiple visual options, except that they would benchmark each hardware revision and lock in the settings best suited to that iteration.

Right, but a number of companies struggle with PC QA as it is.

To me this also undermines developers' ability to code to the metal, as mentioned earlier by someone else. I accept I may be wrong here, as I really know little about development.
 
Right, but a number of companies struggle with PC QA as it is.

To me this also undermines developers' ability to code to the metal, as mentioned earlier by someone else. I accept I may be wrong here, as I really know little about development.

You may be absolutely right about not getting the most out of the hardware. It's one of the reasons consoles can achieve more on lower-spec hardware than the equivalent spec in a PC would be able to. It's certainly one of the drawbacks for MS/Sony to consider if this ever happens.
 
No way. The console exclusives have been incredibly weak so far. Most of my favorites have been remasters from prior consoles. I'm fully intending to spend 300-400 on a graphics card next year, not on an iterative console.

Boojakashaaa!!! My booiii.....yes, that's exactly my stance on the topic right now.
GIVE ME A FUCKIN REASON to buy a new console every 2 years. Let's talk games first, then hardware specs. So far this gen, imo it is not really worth owning an individual platform.
Man, I've bought a WiiU (!!!) for Bayonetta 2 and Smash and the only god-tier exclusive game on my day one PS4 is Bloodborne. (Xbox users please hold your tits, no, Xbone exclusives don't interest me...been playing those on my 360 for years).
I've built a new PC recently, and I will be more sceptical and reserved before buying my next console.
 
To the people complaining that the newest hardware isn't getting properly utilized because of cross-gen software, which itself is a form of forwards compatibility, I don't think we should ever expect that for the most high-end hardware. Things were different in the 90's when consoles were customized boxes that could achieve unique things and you had bleeding-edge PC hardware that could also achieve unique things. Maybe games supporting that were possible because the industry overall was smaller back then.

When my iPhone 6 Plus was the latest thing I personally didn't mind that there was only like one game fully optimized specifically for it. I was just happy everything else was running smoothly. I'd feel the same if I invested in a 980Ti or something.

That's the crux. People use history and experience to temper their expectations about a new entry in a product line.

This brings up possibly the scariest part of the whole conversation: there isn't much left that only a console can do.
 
Holy fucking shit.

Imagine spending 12 or 13 years with the PS3's hardware.

I would be fine with this.

For a long time I've thought that graphics are probably "good enough" as they were back then. It's not like fucking Star Fox or Resident Evil, where you have to imagine what this abstract shape is meant to represent.

Characters have individual fingers, individual toes, clothing and hair physics. It's not perfect, but it never will be perfect. The processing power required for a perfect simulation is so exponentially above where we are that it may as well be impossible.

If the PS4 was around for 12, 15, 20 years, I would not have a single problem with that.
 
I would be fine with this.

For a long time I've thought that graphics are probably "good enough" as they were back then. It's not like fucking Star Fox or Resident Evil, where you have to imagine what this abstract shape is meant to represent.

Characters have individual fingers, individual toes, clothing and hair physics. It's not perfect, but it never will be perfect. The processing power required for a perfect simulation is so exponentially above where we are that it may as well be impossible.

If the PS4 was around for 12, 15, 20 years, I would not have a single problem with that.

Same really; graphically I'm well past the point of "good enough", it's all diminishing returns now. I only really see the argument for more power when it's for stuff like seamless movement between areas without loading. In other words, if it's a gameplay-related limitation, and one not achievable by knocking down the fidelity.

I do care somewhat about performance, though. It concerns me if PS4 framerates take a dive because devs target the higher model with yet more expensive new effects that I wouldn't even notice, leaving the vanilla PS4 not quite coping (but coping just enough to pass cert). I'd need to see console graphical sliders to feel more comfortable with this change; I don't trust devs to pick the right balance for multiple iterations of hardware.
 
Right, but a number of companies struggle with PC QA as it is.

You have good companies, and you have bad companies. Some struggle already (Ubisoft love to bury NPCs in rooftops for some reason!?!), while others set standards. This won't change.

To me this also undermines developers' ability to code to the metal, as mentioned earlier by someone else. I accept I may be wrong here, as I really know little about development.

Not really. I doubt games are really written at that low a level any more; it's too complicated. Engines are optimised at that level, but it's the same hardware, same architecture. It's not like PC, where you have any number of combinations of Intel, NVidia, AMD, system RAM and graphics RAM. There is just the APU and unified memory.

For a long time I've thought that graphics are probably "good enough" as they were back then.

As others have pointed out, there's more to faster hardware than prettier visuals: NPC count, AI quality, physics, world size, and so on.

My argument has always been: what would you prefer? Sticking with the same stale old hardware for 5-8(+) years each generation, or, once iterations are established, a smaller speed boost every 3-5 years (to the new lowest spec)?

I'd need to see console gaming graphical sliders to feel more comfortable with this change, I don't trust devs to pick the right balance for multiple iterations of hardware.
Without wanting to insult my fellow man, or suggest the masses are less capable than they actually are, I'd counter with: look at the target audience for consoles. There are reasons a large proportion of the 150-200m console gamers don't game on PC. One of them is trying to understand the difference between FXAA, MSAA, CSAA, etc. I'd prefer the dev just set what works for the game and the experience they want to promote.
 
As others have pointed out, there's more to faster hardware than prettier visuals: NPC count, AI quality, physics, world size, and so on.

My argument has always been: what would you prefer? Sticking with the same stale old hardware for 5-8(+) years each generation, or, once iterations are established, a smaller speed boost every 3-5 years (to the new lowest spec)?

World size: We're quickly reaching a very real limitation of manpower versus world size. By which I mean, we can have games with massive, massive worlds (FUEL, The Crew, etc.) but have a very hard time filling them with anything interesting to look at or do. Hardware is not the limitation here.

NPC Count: Games like Dead Rising can have hundreds, if not thousands of active NPCs on screen at the same time. Those characters are dumb as rocks, but if you drill down to something like GTA5, we're still talking significant numbers of highly varied pedestrians with some degree of intelligence.

AI Quality: In most games, largely unnecessary. Believe me, I used to be extremely interested in artificial intelligence technology and, like, the science of deathmatch bots and stuff, so I get the thought that smart AI is "important." But John Carmack was right: You don't need intelligence so much as you need the illusion of intelligence. It's easy to waste processing power on a simulation that could be just as easily faked.

Physics: You've got me here. I still pine for Red Faction: Guerrilla's destructibility in an environment like Saints Row. Crackdown 3 supposedly does it, but it needs an internet connection for "cloud physics", which seems ridiculous to me.

But, by and large, I do still think things are "good enough."
 