Where is 2GB GDDR5 and 4GB DDR3 coming from? Shouldn't it be 3.5GB and 5GB, or is this related to something else?
Hypothetical question, I guess, i.e. 4 vs 8.
Where is 2GB GDDR5 and 4GB DDR3 coming from? Shouldn't it be 3.5GB and 5GB, or is this related to something else?
Going for DDR3 allows them to include more RAM.
It often is. But XDR can fortunately be accessed by the GPU; it's just slower than accessing the VRAM.

Slightly, and it's for Cell. Isn't the bandwidth and memory amount responsible for why many multiplat games have better framerates or better AA or whatever than they do on PS3?
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.

It will be a big leap for those who have been exclusively console gaming for the last 5-7 years. For those who have had significant exposure to PC gaming, it's a different story.
I know going for DDR3 means more RAM, but it automatically gives them a bottleneck that forces them to think of ways to work around it. Wouldn't it be easier to go with 4 GB of GDDR5 instead of 8 GB of DDR3 if both configs sort of balance each other out?
Wait for the games. Next gen will raise the bar even for PCs.

That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.
How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?
The thing that you're missing (and a lot of people don't get this) is that you can't look to a "comparable laptop" to get a grasp on how the tech will be used in consoles.
Sony was actually working on the Move concept as early as 2004 (maybe earlier too): http://www.youtube.com/watch?v=JbSzmRt7HhQ. I do think that Sony decided to push further with the idea because of the Wii's success, but they already had the basic Move concept before the Wii was released.

-Trophies
-In-game XMB
-Background downloading
-Motion controls (move)
-Triggers on the DS3
-In-game music
That's off the top of my head, and those are the big ones. I'll write another list with some other ones as I think of them.
That's not including things that the PS3 just outright cannot do because of hardware, i.e. cross-game chat. It also has beacons and the ability to launch your games from your current game, plus more shit I really don't feel like outlining.
The 360 didn't run The Witcher 2 well, man; it's like two different gens when compared to the PC.
That's hyperbole; the PC version is definitely a level above, but nowhere near a generational gap.
That's the thing. I don't even think it is a big leap from the PS3 to that laptop and mobile GPU combo. Certainly it's not nearly a big enough leap for me.
How would the comparable laptop in that article run the Witcher 2, which the 360 ran well?
What does impress me are the tech demos for Luminous, Unreal Engine 4, and another next-gen engine whose name escapes me. The target render for Cyberpunk sparks my imagination about the graphical and artistic leap in visuals possible by mid-generation as well.
Slightly, and it's for Cell. Isn't the bandwidth and memory amount responsible for why many multiplat games have better framerates or better AA or whatever than they do on PS3?
Somehow, I think something got lost in translation, and what they meant by "target" was the overall feel and look of the game.
No way next gen systems can do that.
The 360 didn't run The Witcher 2 well, man; it's like two different gens when compared to the PC.
Eideka makes a great point. The PC is unquestionably a level ahead of the consoles but suggesting a generational gap is hyperbole.
Unless you can prove it (and I know you can't, just like I know there is no way to disprove it), this bit of tirade should be dropped. It's getting annoying now.
Personally, I can easily see them doing all those things at 720p/30fps, to say the least. But that's just a gut feeling.
Alpha blending is something that can be done in the EDRAM, which makes it nearly free as far as bandwidth goes, since the GPU has very fast access to the EDRAM. Textures and game assets are generally accessed from main memory (since they are too large to fit into the EDRAM, which is only about 10 MB). To put it simply, the 360 gets free bandwidth (or rather, does not consume main-memory bandwidth) for specific effects, which the PS3 does not get, although XDR has a slightly higher bandwidth than the 360's main memory.
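To put very rough numbers on it (the resolution, overdraw, and frame rate below are made-up illustrative values, not figures from any actual game), here's what blending alone can cost in main-memory traffic when there's no EDRAM to absorb it:

```python
# Back-of-the-envelope estimate of alpha-blending bandwidth.
# All numbers below are illustrative assumptions, not measured figures.

WIDTH, HEIGHT = 1280, 720   # assumed render resolution
BYTES_PER_PIXEL = 4         # 32-bit colour buffer
OVERDRAW = 4                # assumed layers of transparent overdraw
FPS = 30                    # assumed frame rate

# Each blended layer reads the destination pixel and writes the result back.
bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL * OVERDRAW * 2
gb_per_second = bytes_per_frame * FPS / 1e9

print(f"Blending traffic: ~{gb_per_second:.1f} GB/s")
```

With those assumptions it already works out to close to 1 GB/s, and heavy particle scenes push overdraw much higher; that's the traffic the 360 keeps inside the EDRAM instead of paying for it out of main memory.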
If MS puts in full Win 8, or Win 8 RT, could they offer access to all Metro apps?
They'd need to deal with controller fragmentation - it's not touch, and you don't have a mouse and keyboard, so they'd have to add controller support somehow.
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.
I'm rather happy with my 5V, completely silent Raspberry Pi.
It's good enough for me
Still, 3 GB for OS doesn't sound realistic.
Full Windows 8 doesn't need half that.
I can maybe understand that MS wants a system that can run several apps in the background while gaming, but I don't need them, and they will probably affect gaming performance anyway (unless not only part of the RAM is reserved for the OS, but CPU cores etc. too).
Do you honestly believe that most of these responses would even exist if the XB3 had the same amount of RAM available for game development (3.5 vs 5)? Because people have yet to see proof of what makes GDDR5 a better choice than DDR3, the contention is based on "amount" alone, pushing the "type" into irrelevancy. It's more about people hoping that the PS4 doesn't get the short end of the stick when it comes to multiplat titles, like it has this gen.
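For anyone still wondering where the 3.5 and 5 figures come from, this is just the arithmetic with the rumoured totals and OS reservations being thrown around in this thread (none of it is confirmed, and the PS4's 0.5 GB reservation is only implied by the 4 minus 3.5 split):

```python
# Game-visible RAM = total RAM minus rumoured OS reservation.
# All figures are thread rumours, not confirmed specs.

rumoured = {
    "PS4 (4 GB GDDR5)":    {"total_gb": 4, "os_gb": 0.5},
    "Durango (8 GB DDR3)": {"total_gb": 8, "os_gb": 3.0},
}

for console, mem in rumoured.items():
    game_gb = mem["total_gb"] - mem["os_gb"]
    print(f"{console}: {game_gb:.1f} GB available to games")
# -> 3.5 GB vs 5.0 GB, which is where the numbers in this argument come from.
```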
Underwhelming specs? 4 GB of RAM isn't really something I think is future-proof; open-world games are going to have problems a la PS3, no?
Durango is using 8 GB minus the OS (which just can't be 3 GB; that's just too much).
And after reading that the PS4 version of Planetside 2 would need some adjustments and removal of some effects...bleh.
Can somebody put my concerns to rest?
I wanted someone to properly answer my question about how well the laptop in that article could run The Witcher 2, not some lie about the 360 version (of a game that no PC of similar power to the 360 could run at all).
My point is that if that's how powerful the PS4 is going to be, then it's not good enough at all, imo.
Flip the question around. What makes more RAM better? Why the assumption that it is better? If you can't feed the GPU quickly enough, and the GPU can't process as much anyway, then all you have is a big cache.
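To make the trade-off concrete, a minimal sketch; the 256-bit bus width and the 5500 / 2133 MT/s data rates are assumptions picked for illustration, not leaked specs for either box:

```python
# Peak theoretical bandwidth = bus width (in bytes) x transfers per second.
# Bus widths and data rates below are illustrative assumptions only.

def bandwidth_gbs(bus_width_bits: int, data_rate_mtps: int) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_mtps * 1e6 / 1e9

gddr5 = bandwidth_gbs(bus_width_bits=256, data_rate_mtps=5500)
ddr3 = bandwidth_gbs(bus_width_bits=256, data_rate_mtps=2133)

print(f"4 GB GDDR5 pool: ~{gddr5:.0f} GB/s")  # ~176 GB/s
print(f"8 GB DDR3 pool:  ~{ddr3:.0f} GB/s")   # ~68 GB/s
```

Twice the capacity but roughly a third of the raw bandwidth, which is exactly the bottleneck that would force the kind of EDRAM-style workaround discussed above.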
I'd like feedback from the developers on here as to how a typical streaming engine might break down the memory - e.g. how much for the immediate view, how much for caching your surroundings, how much buffer for streaming from the HDD, etc.
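To frame the question, here is a purely hypothetical split (the bucket names and sizes are guesses for discussion, not numbers from any real engine or devkit), fitted to the rumoured 3.5 GB game pool:

```python
# Hypothetical memory budget for an open-world streaming engine.
# Every bucket and size here is a guess for discussion purposes.

budget_mb = {
    "resident (code, UI, core assets)":              512,
    "immediate view (on-screen textures/geometry)": 1024,
    "surrounding cache (streamed-in neighbours)":   1024,
    "HDD / optical streaming buffers":               256,
    "render targets and GPU scratch":                512,
    "audio, animation, misc allocations":            256,
}

total_mb = sum(budget_mb.values())
for bucket, mb in budget_mb.items():
    print(f"{bucket:48s} {mb:5d} MB")
print(f"{'total':48s} {total_mb:5d} MB (~{total_mb / 1024:.1f} GB)")
```

A real engine would obviously carve this up differently, but even a rough split shows how quickly a few gigabytes disappear in an open-world game.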
Did I miss something? I haven't visited GAF in ~24 hours, but this seems to be a 50 page thread about... nothing new at all.
Not enough tflops.
Can there be tech like Gaikai where a user could download an "app" on the PS4 or Xbox or whatever that has, say, 1 GB of generic textures and sounds, and then play streamed games where most or part of the graphics load is handled locally?
Forgive my ignorance, but from the gathered leaks, which system is the most powerful?
Over time the OS usage of RAM shrinks, doesn't it?
The PS3's XMB used a lot of RAM initially, but the footprint got smaller with new firmware updates.
Being the software experts that MS are, isn't it possible for them to free up RAM as the gen progresses, thus making more than 5 GB available to developers?
Same with Sony.
Methinks pretty much even systems.
I'm not sure this is what you're getting at, probably not, but Gaikai was showing a thing where you could start playing a game before it finished downloading. It would download the bits you need to get started then download the rest in the background while you play, so you can start playing faster.
http://www.youtube.com/watch?feature=player_embedded&v=SRyx8dFooV0
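A toy sketch of that idea, purely illustrative (the chunk names, sizes, and the pretend transfer speed are all made up):

```python
# Toy model of "play before the download finishes": fetch only the chunks
# needed to boot, then stream the rest in the background while playing.
import threading
import time

# (name, size in MB, needed before the game can start) - all made-up values.
chunks = [
    ("boot + first level", 300, True),
    ("level 2", 700, False),
    ("remaining levels", 2000, False),
]

def download(name, size_mb):
    time.sleep(size_mb / 1000.0)  # pretend transfer at 1000 MB/s for the demo
    print(f"downloaded {name} ({size_mb} MB)")

# 1. Download only what is required to start playing.
for name, size_mb, required in chunks:
    if required:
        download(name, size_mb)
print("game is playable now")

# 2. Stream everything else in the background while the player plays.
background = threading.Thread(
    target=lambda: [download(n, s) for n, s, req in chunks if not req]
)
background.start()
print("playing while the rest downloads...")
background.join()
```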
Don't the 8 cores take care of that concern?
Whoa, I have not seen that before. So I guess this is one potential answer to the lag faced in streaming games, right? Very interesting.