And when he ports it to iOS, it'll have lower quality graphics + sound. But I guess he should cripple the PS4 version to match the iOS version, so it'll be "optimized."
I endorse your graduation to full-sized member.
Gemüsepizza;62305421 said:
Why do people call this "unoptimized"? If everything performs fine, why should he optimize anything? You only optimize things when software is not performing the way you want.
if blow ever sees this thread, i hope he takes the whitta quote to heart.
Embarrassing how on the mark he and Pachter were, lol.
"Every time you have a certain knowledge of a certain subject and you see how much ignorance there is out there, when you see people who have no knowledge of either, outside of their own opinion, I often have to hold my tongue and remind myself how little their opinion is worth, just because it's not coming from a position of someone with knowledge."
The Witness was announced for PC and iOS too. Unless Blow thinks the next iPad is gonna have over 5GB of RAM he should probably consider optimizing the game.
What a crock of bs. It's easy for any app to use 5GB, lol.
He probably should stay away from technical talks.
Yeah, but we've known this since the beginning of the computer age. What's the news Mr. Blow is talking about? One could say the same if we had 1TB of RAM. So what exactly is his point, what is it that we didn't already know since the dawn of the computer age?
We need a special thread dedicated to us uninformed, ignorant gamers, because these responses make us look embarrassing.
If it means never seeing a loading screen in the game then it's fine.
It's just weird considering the backflips devs had to do last gen to squeeze every byte of memory out of their systems.
So instead of sub-30 fps due to cramming in too many effects for the frame buffer, it will be sub-30 fps due to bloat.
Next-gen.
This makes me wonder what the initial boot-up time ends up being, seeing as this seems to be an open-world game.
Because CODING TO THE METAL
When people used that phrase, I always imagined them etching assembly into the computer case. So stupid, though I guess CODING TO THE MAGNETIC MEDIA doesn't have the same ring.
Because CODING TO THE METAL
this thread is great. I love the idea that pre-loading the entirety of the game's assets into the fastest storage medium the device has is somehow going to make the game perform worse than if those assets were streamed in during play from optical or HDD. thanks for the giggles guys.
And if they don't have to worry as much about that now, why are people beefing?
I was worried that having so much RAM would result in devs just writing lazy, bloated, unoptimised code, rather than actually making good use of the extra power.
If he's just loading the whole game into RAM (or as much as possible), that might have been helpful to have explained in the first place to put it in proper context.
Jonathan Blow just shot himself in the foot.
I don't know how much the rest of you know about video game programming (I'm an expert), but optimization and wasteful streaming are huge parts of it. It's not like it is in Bethesda where you can become successful by being utterly incompetent at managing memory. If you use more than 5GB of RAM, you bring shame to yourself, and the only way to get rid of that shame is repentance.
What this means is the video game public, after hearing about this, is not going to want to purchase The Witness for either system, nor will they purchase any of Blow's future games. This is HUGE. You can laugh all you want, but Blow has alienated an entire market with this move.
Blow, publicly apologize and redesign your whole game to use less RAM or you can kiss your business goodbye.
It's reaching to think unoptimized code = poor performance? That is what we've essentially been taught to think all this time.
But whatever. I'll leave you "real GAF devs" to continue talking about how not making your game more efficient with resources, where possible, is a good thing.
Right now I'm planning a game for the iPad/PC. Most of my work is designing the game, but a good portion of it right now is figuring out how I can load the game into the iPad's 512MB of memory. I have to figure out how to load the game while the player scrolls across a map, how to load and unload assets in the background, and potentially limit the number of images the player can see at any given time.
If it was just PC, I wouldn't really need to plan this at all. I could go straight into game making and not have to worry about memory juggling because most computers have 1GB to spare for games.
If tomorrow you told me I could design a game for 8GB of memory, I would be able to throw out at least a month's worth of planning and testing for memory crashes, loading/unloading assets, and so on. It would be absolutely awesome.
Anyone screaming "bah, lazy devs not optimizing their code" just doesn't get it. I could fit eight people into a Mini Cooper if I absolutely had to, stacking and squeezing them on the floors and seats, but if they could ride in a passenger van instead, why the hell would I still try to fit them into a Mini Cooper-sized amount of space inside the van? I could just pile them in, drive my van, and be so much more comfortable.
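The memory juggling described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not this poster's actual engine): an LRU asset cache that evicts the least-recently-used assets once a fixed memory budget is exceeded, which is the kind of bookkeeping an 8GB budget would let you skip.

```python
# Minimal sketch of budgeted asset caching with LRU eviction.
# All names and sizes are made up for illustration.
from collections import OrderedDict

class AssetCache:
    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.assets = OrderedDict()  # name -> (data, size), in LRU order

    def load(self, name, loader):
        """Return a cached asset, loading (and evicting) as needed."""
        if name in self.assets:
            self.assets.move_to_end(name)        # mark as recently used
            return self.assets[name][0]
        data, size = loader(name)
        # Evict least-recently-used assets until the new one fits.
        while self.used + size > self.budget and self.assets:
            _, (_, old_size) = self.assets.popitem(last=False)
            self.used -= old_size
        self.assets[name] = (data, size)
        self.used += size
        return data

# Toy usage: a 512-"byte" budget standing in for the iPad's 512MB.
cache = AssetCache(512)
fake_loader = lambda name: (f"<{name}>", 200)    # every asset "costs" 200
cache.load("map_tile_a", fake_loader)
cache.load("map_tile_b", fake_loader)
cache.load("map_tile_c", fake_loader)            # evicts map_tile_a
print("map_tile_a" in cache.assets)              # False
```

A real engine would do the loads asynchronously on a background thread and prefetch tiles in the direction the player is scrolling, but the budget-and-evict loop is the core of it.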
This is an amazingly dumb post.
I don't know why this is so funny.
It may be, but it does make me wonder how flexible and scalable that engine is.
Yeah, on rereading I think I misinterpreted to whom "the programmer" referred, apologies missile.
You should have read what he wrote again. Sorry to single you out, but iirc he is a coder himself. He isn't saying that Blow is lazy or that what he is doing is unwarranted, rather that it's not really news given the various ways developers are looking to exploit the headroom, with or without using ancillary systems (like streaming, and then to what degree).
Personally, it's about the game and it doesn't matter how much of the available resources are being used. They are there for the devs to exploit.
Based on what we'd seen of the game so far, it didn't really look like the visual style required that kind of complexity, that's all.
Well, what is wrong with assuming the game has very high-resolution textures? Blow's point about texture memory footprint isn't bullshit. You're not only dealing with diffuse textures for all the objects in the scene, but bump maps and render targets as well. I'm not going to claim to know the render setup for The Witness, but it really isn't hard or undesirable to make use of 5GB of space.
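To put rough numbers on that point (purely illustrative sizes and counts, nothing to do with The Witness's actual assets): an uncompressed RGBA8 texture costs width × height × 4 bytes, a full mip chain adds roughly another third, and a scene full of diffuse plus normal maps multiplies that fast.

```python
# Back-of-envelope texture memory math with assumed, illustrative sizes.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Bytes for one uncompressed texture; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return base + base // 3 if mipmaps else base

# One 4096x4096 RGBA8 texture: 64 MiB base, ~85 MiB with mipmaps.
one_4k = texture_bytes(4096, 4096)

# A hypothetical scene: 200 objects, each with a 2048x2048 diffuse map
# and a matching normal (bump) map -- well past 5 GiB uncompressed.
scene = 200 * 2 * texture_bytes(2048, 2048)

print(f"{one_4k / 2**20:.1f} MiB per 4K texture")
print(f"{scene / 2**30:.1f} GiB for the scene")
```

Real engines use compressed formats (DXT/PVRTC and the like) that cut these numbers by 4-8x, which is exactly why the iOS build would need a different budget than a console that can afford to keep things uncompressed in RAM.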
The engine isn't even what Blow is talking about; I assume that is going to compress the textures for mobile platforms.
No worries.
Sorry to single you out, but you were just the most recent of a ludicrous batch of posts. Blow is about as low-level as an indie dev gets, and has been for a decade or so. Here's a sample bibliography; saying that he doesn't have a grasp of these issues is just weird, and shows a serious misunderstanding of his background.
Based on what we'd seen of the game so far, it didn't really look like the visual style required that kind of complexity, that's all.
Thank you, I was reading the thread waiting for an actual software developer to say something like this.
If tomorrow you told me I could design a game for 8GB of memory, I would be able to throw out at least a month's worth of planning and testing for memory crashes, loading/unloading assets, and so on. It would be absolutely awesome.
Dat >5GB worth of data:
It's a beautiful game.
On the lower end, at least. Requiring 4GB opens the game up to a significantly larger number of users than 5GB would, since most people buy RAM in 2GB/4GB denominations.
Absolutely gorgeous. People saying this looks like a last-gen game are out of their minds.
That'll be one mighty compression. That said, the art so far has shown mostly solid colours. I'm more keen to know about the lighting and how that would change for iOS devices.
My post(s) were more addressed at people over here.
To be fair, Mr. Blow was just casually posting on Twitter, not making a huge deal about anything. Look in the mirror to see who's making a big deal about 5GB, because it's not him, it's US.
Lightmaps are used on iOS, so lower precision in creating them, I guess.