
Gaming Claims That Are Hilarious in Retrospect

speculawyer said:
That picture is not quite right . . . they didn't lie about 1 of the HDMI ports.

Someone already posted this, but the photo on the right is a shot from the 20GB PS3 before Sony wised up and added HDMI.

From Edge in 2006:
http://www.edge-online.com/news/sony-cuts-20gb-ps3-japan-price-adds-hdmi

In addition, yet another shot was fired in the next generation high-def battle, as Kutaragi confirmed that the 20GB PS3 actually would be equipped with a High-Definition Multimedia Interface (HDMI) port. Previously, Sony had indicated that the cheaper PS3 wouldn’t have the feature.
 
Tatsumaki Senpuukyaku! said:
Why do you need two HDMI ports and three ethernet ports anyway? I'm sure I'm missing something obvious here. The only thing that comes to mind is dual monitors which seems really useless for console gaming.
Two HDMI ports would have been awesome for many games. It was an unrealistically expensive idea (all the memory, the encoders, etc.), but at least great things could have been done with it. Imagine local multiplayer games where each player gets their own full screen, or games that double your available resolution by spanning two monitors. Awesome possibilities.

Of course imagine if they launched at $799. :lol


The Nintendo DS was not a flop now, was it?
 
mr stroke said:
1105277838.jpg

You gotta admit that it was the most appropriately named console. :lol
 
speculawyer said:
Two HDMI ports would have been awesome for many games. It was an unrealistically expensive idea (all the memory, the encoders, etc.), but at least great things could have been done with it. Imagine local multiplayer games where each player gets their own full screen, or games that double your available resolution by spanning two monitors. Awesome possibilities.

Of course imagine if they launched at $799. :lol

All PS3 dev kits still have 2 HDMI ports... but one is covered up.
 
I remember back in the old 3dfx vs Nvidia days (I worked for 3dfx at the time), Nvidia floated out the Pixar comparison:

Achieving Pixar-level animation in real-time has been an industry dream for years. With twice the performance of the GeForce 256 and per-pixel shading technology, the GeForce2 GTS is a major step toward achieving that goal.

Someone from Pixar took it to 'em:

These guys just have no idea what goes into `Pixar-level animation.' (That's not quite fair, their engineers do, they come and visit all the time. But their managers and marketing monkeys haven't a clue, or possibly just think that you don't.)

`Pixar-level animation' runs about 8 hundred thousand times slower than real-time on our renderfarm cpus. (I'm guessing. There's about 1000 cpus in the renderfarm and I guess we could produce all the frames in TS2 in about 50 days of renderfarm time. That comes to 1.2 million cpu hours for a 1.5 hour movie. That lags real time by a factor of 800,000.)

Do you really believe that their toy is a million times faster than one of the cpus on our Ultra Sparc servers?


Story here: http://siliconinvestor.advfn.com/readmsg.aspx?msgid=13781290

Oh how we laughed at 3dfx. And then we went bankrupt. :(
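For what it's worth, the arithmetic in the Pixar quote checks out. A quick sanity check, using nothing but the poster's own rough estimates:

```python
# Sanity check of the render-farm numbers quoted above.
# All figures are the Pixar poster's own rough estimates, not official specs.
cpus = 1000                   # CPUs in the render farm
days = 50                     # estimated wall-clock days to render all of TS2
movie_hours = 1.5             # running time of the film

cpu_hours = cpus * days * 24  # total CPU time: 1,200,000 CPU hours
slowdown = cpu_hours / movie_hours
print(slowdown)               # 800000.0 -- i.e. ~800,000x slower than real time
```

So even taking the estimates at face value, "Pixar-level animation in real time" was off by roughly six orders of magnitude per CPU.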
 
A recent one is Fallout 3 having over 200 endings. They touted this until launch, and surprise! There are like four endings, and they were all terrible. I think I read that there are technically 32, but that's totally stretching it: they're counting the slight variations in the ending screens depending on which sidequests you did. Also, the endings were just black-and-white screenshots with a cheesy voiceover.

Let's not forget this gem; it was mentioned, but you really need the picture:
daikatana_john_romero_make_you_his_bitch.jpg


Also pretty much anything Yamauchi said.
 
Slacker said:
I remember back in the old 3dfx vs Nvidia days (I worked for 3dfx at the time), Nvidia floated out the Pixar comparison:

That really made my day back then. Especially that they called the GeForce a "toy". That must've really steamed a lot of guys who refuse to see gaming that way, especially PC gaming.
 
speculawyer said:
That picture is not quite right . . . they didn't lie about 1 of the HDMI ports.

Edit: And WTF were they thinking with multiple Ethernet ports? Turn your PS3 into a firewall? And why 3?!?!

The original PS3 that was unveiled at E3 2005, complete with boomerang controller, was going to be a router as well as a next-gen console, a video communication device, a 32:9 display device, and a bunch of other stuff too. This image compares the PS3 shown and detailed at E3 2005 with the 20GB PS3 shown at E3 2006, which was missing the HDMI port. Sony saw sense after everyone complained and put it back into the 20GB for launch, however the extra cost of putting HDMI in that unit led Sony to stop making it very quickly, and it was never released in Europe. It also led to Sony looking for other cost-cutting measures to get the price down as much as possible, so bye-bye backwards compatibility and USB ports.

I am absolutely positive that Ken Kutaragi wanted to make the $1000 games console. Then Sony realised what he was doing, told him he was crazy, and started ripping out features to get it down to $600. That's what Kutaragi meant when he said it was too cheap: the PS3 isn't what he envisioned.

Gametrailers still has the re-cap vid of that press conference, always good for a laugh... and a cry at the same time.

http://www.gametrailers.com/player/6258.html?type=flv
 
alistairw said:
But dude. They just hit another milestone!
You are saying their "gaming claims" are hilarious even without being able to see them in retrospect? :P

Anyway, picked something from a random thread on the Duke Nukem Forever forums:
Scott Miller said:
Currently it's being developed by a solid team of five people. A new top-talented artist is joining the team in two weeks, and as soon as Shadow Warrior is done, two more mappers will join, as well as George Broussard. Things are rolling along fine
 
The disturbing, and continuing, number of Wii screenshots released at resolutions waaay past what the system can render, chock full of AA and AF, all looking a million times clearer, crisper, and more detailed than the final game does.

Few things piss me off more this generation than when developers do that, and it really makes me question the quality of their game. DON'T release screenshots that look better than the actual game, especially when they're obviously bullshots thanks to the stupidly high resolution. It just makes me think the game is ugly as sin and you're too afraid to show it.
 
speculawyer said:
That picture is not quite right . . . they didn't lie about 1 of the HDMI ports.

Edit: And WTF were they thinking with multiple Ethernet ports? Turn your PS3 into a firewall? And why 3?!?!
Something about making your PS3 a router; I think that was only said once.
 
I don't know why people attribute that "Toy Story graphics on PS2" claim to Sony. When did that rumor start?

But this one is real:

These guys are achieving the level of visual detail that you really did get in Toy Story, and this is a real game, this is the way the game really plays.

- Seamus Blackley, while demonstrating Malice for the upcoming Xbox.
 
I loved Seamus Blackley.

My favorite "gaming evangelist" ever. His interviews about the industry as a whole were always worth listening to.
 
"You can communicate to a new cybercity. This will be the ideal home server. Did you see the movie 'The Matrix'? Same interface. Same concept. Starting from next year, you can jack into 'The Matrix'!"

- Ken Kutaragi (Newsweek, 02/00)
 
Slacker said:
I remember back in the old 3dfx vs Nvidia days (I worked for 3dfx at the time), Nvidia floated out the Pixar comparison:

Achieving Pixar-level animation in real-time has been an industry dream for years. With twice the performance of the GeForce 256 and per-pixel shading technology, the GeForce2 GTS is a major step toward achieving that goal.

Someone from Pixar took it to 'em:

These guys just have no idea what goes into `Pixar-level animation.' (That's not quite fair, their engineers do, they come and visit all the time. But their managers and marketing monkeys haven't a clue, or possibly just think that you don't.)

`Pixar-level animation' runs about 8 hundred thousand times slower than real-time on our renderfarm cpus. (I'm guessing. There's about 1000 cpus in the renderfarm and I guess we could produce all the frames in TS2 in about 50 days of renderfarm time. That comes to 1.2 million cpu hours for a 1.5 hour movie. That lags real time by a factor of 800,000.)

Do you really believe that their toy is a million times faster than one of the cpus on our Ultra Sparc servers?


Story here: http://siliconinvestor.advfn.com/readmsg.aspx?msgid=13781290

Oh how we laughed at 3dfx. And then we went bankrupt. :(
Ah, the toy part really sold it... But shit, I can see people talking that kind of shit being annoying as hell.
 
I remember OPM printing an interview with George Lucas in which he said the PS2 was doing in real time what he had just accomplished in Episode I.

Though considering how bad Episode I CG looks these days... maybe that makes sense. PS3 is up to a similar (better) level.
 
Coolio McAwesome said:
"You can communicate to a new cybercity. This will be the ideal home server. Did you see the movie 'The Matrix'? Same interface. Same concept. Starting from next year, you can jack into 'The Matrix'!"

- Ken Kutaragi (Newsweek, 02/00)
"Gamers' attention spans at reading threads in their entirety will rival their attention spans at playing Final Fantasy." - Lee Saito
 
Kapsama said:
1. Killzone 2 CGI and Motorstorm CGI. Neither game looks as good as those trailers.

Does the real game look as good as the infamous E3 CGI? Not quite, but it's not like it missed by a country mile. Killzone 2's graphics delivered.

Kapsama said:
3. Microsofts Xbox 1 spec sheet which went from
- Pentium 3 CPU
- Geforce 3 GPU
- 300 Million Polygons

to

- Celeron CPU
- Geforce 2.5 GPU
- 150 Million Polygons

The Xbox GPU was actually a bit more capable than a GeForce 3: same pixel/texture unit layout, with twice as many vertex shaders. GeForce 3.5?
 
Kapsama said:
As far as the PS2 tech demos go, they were accurate.
The Final Fantasy 8 intro was rendered in real time. No one ever claimed that all PS2 games would look like that.
The old guy's head was rendered in real time. No one ever claimed that all characters in a game would have heads that detailed.
You're quite right, but there's a reasonable question about whether there was an intent to mislead.

On a similar note, who remembers the Rebirth demo for the GameCube? It could probably just about be justified as a demonstration of seamless transition between real-time graphics and FMV, but it's still very questionable in intent.


We've covered a lot of talking up of things that didn't meet expectations, but what about cases of people talking down something that ended up a phenomenal success? I remember some about the Wii, but I think the real star of the show there has to be the DS.


And my own contribution to the thread:

Me said:
Look, there have been dozens of companies trying to break into the current console market; Konix, Atari, NEC - and none has really managed to make any headway in breaking the Sega/Nintendo stranglehold. The PlayStation's just going to join that list.

I'll get a Saturn instead.
 
Naked Snake said:
N64 can do real-time ray tracing. Turok developer confirmed it in a GameFan interview :lol
I do wonder about the story behind that. I think that may have been based in truth, albeit what turned out to be a bastardised version of it.

If you don't faff with lighting, a sphere - naturally - looks the same from any angle. I'm fairly sure that Mario 64 uses pre-rendered spheres for a number of characters (most notably the Bob-ombs). Those look ray-traced - because they are - without having been calculated in real time - which they aren't. I wonder if the Turok guy was referring to those when talking about real-time ray tracing.

This raises a further question, then: was the Turok guy bullshitting - on his own part, or just repeating Nintendo's claims - or just dim enough not to realise that they weren't rendered in real time?
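The trick itself is easy to sketch. A minimal, illustrative version (names and numbers are mine, not actual Mario 64 or N64 code): shade the sphere once, offline, then at runtime just draw the resulting sprite.

```python
import math

def prerender_sphere(size, light=(0.0, 0.0, 1.0)):
    """Offline step: Lambert-shade a sphere into a 2D intensity grid."""
    img = [[0.0] * size for _ in range(size)]
    for y in range(size):
        for x in range(size):
            # Map the pixel to [-1, 1] across the sphere's silhouette
            nx = 2 * x / (size - 1) - 1
            ny = 2 * y / (size - 1) - 1
            r2 = nx * nx + ny * ny
            if r2 <= 1.0:
                nz = math.sqrt(1.0 - r2)  # z of the surface normal
                # Diffuse (Lambert) term; clamp back-facing light to zero
                img[y][x] = max(0.0, nx * light[0] + ny * light[1] + nz * light[2])
    return img

# "Runtime" step: no lighting or geometry math at all -- just draw the sprite.
# Because a lit sphere looks identical from every viewpoint, one image works
# for any camera angle, which is why an offline render can pass for real-time.
sprite = prerender_sphere(64)
```

The offline pass could just as well be a genuine ray tracer; the point is that the runtime cost is a single sprite blit, however the image was produced.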
 
brain_stew said:
Every game 720p HD at a minimum as well; oh, how that dream quickly vanished. You could probably count on one hand the number of 360 titles that are native 720p with 4xMSAA, and no, Capcom's "variable" AA doesn't count.

Microsoft were right to set that as the baseline for IQ, just a shame they didn't design a console that was suited to delivering it.
That's actually not true.

The vast majority of 360 games ARE 1280x720 (it's simply that a few exceptions are big titles) while many of them use 4x AA (though 2x AA is much more common).
 
1. Killzone 2 CGI and Motorstorm CGI. Neither game looks as good as those trailers. Hell, not even Crysis looks anywhere near as good as the Killzone 2 CGI.
Honestly, with the Killzone 2 CGI, have you taken a recent look at it? Yes, it's true that neither the final game nor Crysis is quite as detailed, but I think both of those games look superior in their final forms. The KZ2 CGI has a plastic-y look to everything, along with some bland lighting. The actual game is more impressive-looking in motion, despite the fact that it obviously isn't as detailed.

I think both Killzone 2 and Motorstorm captured the type of action presented in those trailers very well. In fact, I'd say those trailers were actually partially responsible for the fantastic results those teams achieved in the end.
 
In their 200th issue, Edge had a whole bunch of quotes from throughout the years. Great stuff.

"No matter how great Saturn is, or Playstation is, or Ultra 64 is, we will outsell them by an enormous amount with 32X" - Tom Kalinske, Sega US chief.
 
Demigod Mac said:
I don't know why people attribute that "Toy Story graphics on PS2" claim to Sony. When did that rumor start?

http://www.prnewswire.com/cgi-bin/stories.pl?ACCT=104&STORY=/www/story/02-27-2000/0001150833&EDATE

has the text of a Newsweek cover story.


Levy explains that the secret is the Emotion Engine, a fast, high-powered chip set that is fine-tuned to generate the polygons that are the building blocks of 3-D graphics. PlayStation 1 could handle 360,000 polygons per second. Version 2 can handle 20 million, a jump from "South Park" to "Toy Story."

Also from that article:
The new Sony PlayStation 2, which goes on sale March 4 in Japan, is not only a quantum leap in game technology, but because of its Internet compatibility, it's Sony's bid to compete with AOL-Time Warner and Microsoft on the Web.

There was much more to the pre-release hype about PS2's internet capabilities (downloadable music store!) than just the Matrix quote, and we all know how that ended up.
 
Dead Man Typing said:
Sony saw sense after everyone complained and put it back into the 20GB for launch, however the extra cost of putting HDMI in that unit led Sony to stop making it very quickly, and it was never released in Europe.

The 20GB only had a short life in the US. In Japan, the 20GB had a very healthy life, selling for over a year, and was only discontinued to make room for the 40GB.

Acosta said:
My PS3 is 100% BC, so it's not exactly a lie. The statement was true when it was made.

If we are going by quotes that were true when they were made, then J. Allard's HDD quote is true too, since at the time he made it there wasn't any game released for the 360 that required the HDD to play.

MS_LIE.jpg
 
I remember reading in early PSone hype (1994-ish) that it would use a blue laser, which would let the CDs store MUCH more data. Early Blu-ray info confusion?
 
EatChildren said:
The disturbing, and continuing, number of Wii screenshots released at resolutions waaay past what the system can render, chock full of AA and AF, all looking a million times clearer, crisper, and more detailed than the final game does.

Few things piss me off more this generation than when developers do that, and it really makes me question the quality of their game. DON'T release screenshots that look better than the actual game, especially when they're obviously bullshots thanks to the stupidly high resolution. It just makes me think the game is ugly as sin and you're too afraid to show it.
I agree, but we are in the minority. (And it's not just Wii, though it's the most egregious case by far)
 
Sapiens said:
Mine does. And anyone that really wants BC can hunt one down.

Acosta said:
My PS3 is 100% BC, so it's not exactly a lie. The statement was true when it was made.


When the PS3 introduces lag between controller inputs and the visuals onscreen, that isn't 100% BC, no matter what model you own.
 