Coen said:What if Project Cafe doesn't have a TV-out? What if the controller is its only output?
Will. Never. Happen.
Mr_Brit said: They were also running it on a triple SLI system and said that they could get it down to a single GPU with optimisation. That tells me plenty about how optimised it was. It's also a tech demo, which means they probably went crazy with the effects to make them as prominent as possible; a retail game would be more balanced.

If the question is whether the 720/PS4 will be able to produce something that approximates that demo in the same way that, say, KZ2 approximates its early target renders, then there is a good chance. They may even be able to produce something that 'looks' better, because I'm sure a lot of that demo is brute-forced and a few cheats might produce superior effects.
poppabk said: If the question is whether the 720/PS4 will be able to produce something that approximates that demo in the same way that, say, KZ2 approximates its early target renders, then there is a good chance. They may even be able to produce something that 'looks' better, because I'm sure a lot of that demo is brute-forced and a few cheats might produce superior effects.

If the question is whether they will match the power required to reproduce that demo, the answer is probably no.

That's exactly what I meant. Thanks.
ReyVGM said: So now it's "don't believe Kotaku's rumors" and "IGN's are more believable"? A few thousand pages back it was "IGN sucks, don't believe their rumors."
poppabk said: If the question is whether the 720/PS4 will be able to produce something that approximates that demo in the same way that, say, KZ2 approximates its early target renders, then there is a good chance. They may even be able to produce something that 'looks' better, because I'm sure a lot of that demo is brute-forced and a few cheats might produce superior effects.

If the question is whether they will match the power required to reproduce that demo, the answer is probably no.

You really can't see a 20nm $400-450 console matching a GTX 580? That is with 2.25 die shrinks, three years, and a new architecture. You'd be crazy to think there isn't a very good chance when you factor that in.
Mr_Brit said: There's an interview with the lead developer where he admits to not pushing the 360 version as far as possible. I also know exactly how hard Metro 2033 pushes PCs, and what does that have to do with anything I said? If it weren't for the millions of dollars Nvidia spent on the PC version, we would have gotten another subpar console port. There's a reason all the most impressive console ports (Metro 2033, Batman AA) are Nvidia-sponsored: devs otherwise wouldn't put in any effort.

The majority of the games that are much more impressive on PC come from Europe (usually Eastern Europe) - Metro 2033, Mafia II, Batman AA - as do a lot of the PC-only devs/games: The Witcher, Crysis, STALKER, etc. I think it has more to do with the market, talent, and resources available in that part of the world than with injections of Nvidia cash.
stephentotilo said: I wasn't guessing.
poppabk said: The majority of the games that are much more impressive on PC come from Europe (usually Eastern Europe) - Metro 2033, Mafia II, Batman AA - as do a lot of the PC-only devs/games: The Witcher, Crysis, STALKER, etc. I think it has more to do with the market, talent, and resources available in that part of the world than with injections of Nvidia cash.

I can guarantee you Metro 2033, Mafia II, and Batman AA would all have been average at best if it weren't for Nvidia's money. You just have to read developers' thoughts about PC gaming to know how much effort they would put into PC ports without outside influences.
Mr_Brit said: You really can't see a 20nm $400-450 console matching a GTX 580? That is with 2.25 die shrinks, three years, and a new architecture. You'd be crazy to think there isn't a very good chance when you factor that in.

No recent architecture change made GPUs more efficient; all the companies did was add more features. Also, a single die shrink only brings a few percent. Again, according to IBM, going from 45nm to IBM's state-of-the-art 28nm HKMG gate-first technology gives 40% more performance at 20% less power consumption, and that's a marketing (read: best-case) scenario. Considering the delays switching to 28nm, I don't even think Sony's and Microsoft's next-gen systems will be on 20nm anyway. None of this is about money, anyway. It's about heat.
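For reference, those quoted IBM numbers work out to a fairly modest efficiency gain. A quick sanity check, using only the figures cited above (nothing here is specific to any actual console silicon):

```python
# IBM's quoted 45nm -> 28nm HKMG figures: +40% performance at -20% power.
# These are vendor best-case numbers, as noted above.
perf = 1.40   # relative performance after the shrink
power = 0.80  # relative power after the shrink

perf_per_watt = perf / power
print(round(perf_per_watt, 2))  # 1.75 -> a 1.75x perf-per-watt gain, best case
```

So even by the vendor's own marketing math, one process generation buys well under a 2x efficiency improvement, which is the point being made about heat.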
wsippel said: No recent architecture change made GPUs more efficient; all the companies did was add more features. Also, a single die shrink only brings a few percent. According to IBM, going from 45nm to 28nm HKMG gate-first technology gives 40% more performance at 20% less power consumption, and that's a best-case marketing scenario. Considering the delays switching to 28nm, I don't even think Sony's and Microsoft's next-gen systems will be on 20nm anyway. None of this is about money, anyway. It's about heat.

Just because 28nm was delayed doesn't mean 20nm will be; if anything, the delay will bring it forward, since they'll have been working on 20nm simultaneously and will be quite far along in bringing it to market. If Sony and MS are really launching in late 2013 or 2014, there's a better chance of them using 20nm than 28nm. I'd even guess the Wii 2 has a good chance of launching at 28nm.
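The "die shrinks" both sides keep invoking can be sanity-checked with back-of-the-envelope geometry. This is a sketch under ideal-scaling assumptions (real processes deliver far less, as the IBM figures above show); the node sizes are just the ones named in the thread:

```python
import math

def area_scale(old_nm: float, new_nm: float) -> float:
    """Ideal area of the same chip after a shrink, as a fraction of the old area."""
    return (new_nm / old_nm) ** 2

# The GTX 580 is a 40nm part. Ideal area of the same design at smaller nodes:
print(round(area_scale(40, 28), 2))  # 0.49 -> roughly half the die area at 28nm
print(round(area_scale(40, 20), 2))  # 0.25 -> a quarter of the die area at 20nm

# A "full node" shrink is ~0.7x linear (~0.5x area), so 40nm -> 20nm is
# about two full nodes, which is where "2+ die shrinks" claims come from.
full_nodes = math.log(40 / 20) / math.log(1 / 0.7)
print(round(full_nodes, 2))  # 1.94
```

The geometry only bounds transistor budget, not clocks or heat, so it supports "a good chance" rather than a guarantee.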
BMF said: Hard drives are 15x-20x as cost-effective as flash. There will be flash, but it won't be the same level of mass storage that's available in hard drives.
Plinko said: Will. Never. Happen.

I'd like to believe you, but I can't see a supposedly technically OK but not spectacular machine rendering both a 1080p image for the TV and a 720p image for the controller. Somewhere, something's gotta give. I suppose it's possible the controller feed would be severely limited while the main image is being sent to a TV, but it wouldn't surprise me if Nintendo decided to cut out the TV entirely.
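For what it's worth, the raw pixel cost of driving both screens is easy to put a number on. This is a rough sketch only: it counts pixels per frame, ignores everything that actually dominates rendering cost, and assumes the rumoured 1080p TV plus 720p controller split:

```python
tv = 1920 * 1080         # 1080p image for the television
controller = 1280 * 720  # 720p image for the controller screen

total = tv + controller
print(total)                 # 2995200 pixels per frame
print(round(total / tv, 2))  # 1.44 -> ~44% more pixels than 1080p alone
```

By pixel count alone the second screen is closer to "1.5x the work" than "2x the work", though fill rate is rarely the whole story.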
Mr_Brit said: Just because 28nm was delayed doesn't mean 20nm will be; if anything, they'll have been working on 20nm simultaneously, so they'll be quite far along in bringing it to market. If Sony and MS are really launching in late 2013 or 2014, there's a better chance of them using 20nm than 28nm. I'd even guess the Wii 2 has a good chance of launching at 28nm.

Well, we'll see. Still, you completely ignored the rest of my post: even if Sony and Microsoft manage to cut the TDP in half by 2014, which is highly unlikely, it's still too much.
Plinko said: Also keep in mind that Totilo is certain it will have 8 GB flash storage but has heard "mixed things about whether Nintendo will cap their machine's graphical resolution at 1080i or 1080p."

I completely buy the hard drive--that makes perfect sense for Nintendo. The 1080i, though, still doesn't--it technically doesn't make sense, and the other rumors are pretty adamant that 1080p is set and that stereoscopic 3D is possible.
StevieP said: It makes sense if the console does not have HDMI/digital out and has *only* component instead. Which would be a shame.

Plinko said: Is HDMI that much more expensive?

StevieP said: Well, unlike component, there is a licensing fee associated with it. Nintendo hates licensing fees.
Plinko said: Is HDMI that much more expensive?

I'm pretty sure the licensing fee is 5 cents per device. Nintendo would have to take penny-pinching to a whole new level if they chose to omit HDMI for that reason.
Shin Johnpv said: But it doesn't, since, again, component video has no problem outputting 1080p.
Plinko said: There's no way they release a home console without the option to use the TV. Now, the option to NOT use the TV and use the controller instead, that makes sense--it's actually what I've envisioned from the beginning. But there will always be an option to use the television--it would be idiotic otherwise.

Yes, you've mentioned that before, but I'm not hearing any arguments for why it would be, as you put it, idiotic.
Fourth Storm said: Well, how reliable is your source? Are you willing to put your reputation on the line for it? Forgive me if I'm a bit skeptical, but there's a hell of a lot of nonsense going around these days.

What sort of experimental understanding of journalism are you working under? I reported the stories under my byline with no named sources. That's me putting my reputation on the line.
StevieP said: Most TVs, as far as I know, cap component at 1080i. Really, having only component output is the only way I can see the 1080i rumour making any sense. That, or having *only* controller screens as the video output and no TV out.
TwinIonEngines said: I can see them using a laptop drive. It's less cost-effective than a full-size drive, but still far better than flash or solid state. They've gotten really rather robust, and they should satisfy Nintendo's requirements for reliability and shock/drop resistance.

I still stand by my expectation: 4-8GB of built-in flash plus an optional hard drive. It's the best of both worlds.
Coen said: Yes, you've mentioned that before, but I'm not hearing any arguments why it would be - as you put it - idiotic. Again, I'm not saying it won't have a TV-out; I'm looking for an explanation for the 1080i rumors.
Vagabundo said: Hey Stephen, do you know if pointer controls still work well? That's all I'm really interested in; it's a deal-breaker for me.
BMF said: Any word on a hard drive slot?
stephentotilo said: You're asking if Wii Remotes will work with the new console? Yes. I and others have already reported that. You'd be mistaken to imagine multiple people huddled around a TV, each with a Cafe screen-controller in their hand. I don't even know if the new console can stream to more than one controller (I've never asked). Imagine one person with the screen controller and another with a Wii Remote. Or multiple people with Wii Remotes and none with screen controllers. Mix and match.
stephentotilo said: I have no idea. Sorry.

Thanks.
Neuromancer said: Stephen Totilo is very credible, but I hope Kotaku's wrong about this 8GB of storage story. That kind of sucks.

Installs shouldn't be necessary, and 8GB is plenty for patches and DLC. If you need more, add an SD card. I think it's an acceptable solution; I just hope it's fast.
ShockingAlberto said: Nintendo will never allow open USB ports again. Not after the Wii.
BMF said: I still stand by my expectation: 4-8GB of built-in flash plus an optional hard drive. It's the best of both worlds.
1-D_FTW said: I just did a Google search, and it seems component cables can technically do 1080p; it's the RCA plugs at the end that are the weak link. You need BNC plugs to do the transfer without degradation.

Seriously, though, if Nintendo is half-assing the machine so badly they won't even put in HDMI, I'll stop following Stream after E3, and Nintendo will be dead to me. That's my line in the sand. They cross it, I'm done with them.
mj1108 said: Nintendo won't use traditional platter hard drives. If they include any kind of storage, it will be solid state, because traditional hard drives have too many moving parts and are slower.

I'm talking about an optional hard drive. It doesn't need to come with one, just the ability to put one in.
Plinko said: Again (as the French site was reporting), if a dev is saying "They're doing it right this time," I don't at all buy the 1080i idea. I can't imagine any dev in their right mind would say that.

Unless it's Ubisoft.