Why does it keep being labeled as "rushing" the console if it releases next year? It could be ready for next year and MS has just been watching the competition and the market.
For me, I just want a noticeable upgrade in graphics: 1080p and NO jaggies! It seems like every generation starting with the PS2, the elimination of jaggies is touted, but it never materializes.
That is what the eDRAM will be for. They will probably put in enough to do MRT so they can do deferred rendering at 1080p.
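As a rough sketch of what that eDRAM would need to hold, here's the math for a deferred-rendering G-buffer at 1080p (the four-target layout below is just a common illustrative example, not any console's confirmed spec):

```python
# Illustrative G-buffer footprint for deferred rendering at 1080p.
# The render-target layout is an assumption, not a known console spec.
WIDTH, HEIGHT = 1920, 1080

# Four MRT color targets at 4 bytes/pixel (e.g. albedo, normals,
# material params, light accumulation) plus a 32-bit depth/stencil buffer.
BYTES_PER_PIXEL = 4
NUM_TARGETS = 4 + 1  # 4 color MRTs + depth/stencil

total_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL * NUM_TARGETS
print(f"G-buffer: {total_bytes / 2**20:.1f} MiB")  # ~39.6 MiB
```

So even a fat G-buffer at 1080p fits in roughly 40 MB, which gives a sense of the eDRAM budget a deferred renderer would want.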
If MS is going to use a split memory pool, is 1GB GDDR5 even enough with 2GB DDR3?
Battlefield 3 on PC is a current game, and it already uses about 1.3 GB of video memory at 1080p with everything set to max. Future games will only go up.
If we have a choice between 2GB DDR3 + 1GB GDDR5 or 2GB GDDR5 + large eDRAM, I'd take the latter. I'd surely expect the Nextbox to at least be able to run modern games maxed out at 1080p. How can the first option do so when it doesn't even have enough video memory to handle all effects at 1080p? With a bunch of eDRAM and fast memory, the second option should hopefully make this possible. Someone correct me if I'm wrong with this line of thinking.
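For a sense of why the memory type matters as much as the amount, here's a back-of-the-envelope peak-bandwidth comparison. The clocks and bus widths below are illustrative assumptions, not rumored specs:

```python
# Theoretical peak bandwidth = effective data rate (MT/s) * bus width (bytes).
# All figures are illustrative assumptions for the comparison.

def bandwidth_gb_s(effective_mt_s, bus_width_bits):
    """Peak bandwidth in GB/s for a given data rate and bus width."""
    return effective_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

ddr3 = bandwidth_gb_s(1600, 128)   # DDR3-1600 on a 128-bit bus
gddr5 = bandwidth_gb_s(4000, 128)  # 4 Gbps GDDR5 on the same 128-bit bus

print(f"DDR3:  {ddr3:.1f} GB/s")   # 25.6 GB/s
print(f"GDDR5: {gddr5:.1f} GB/s")  # 64.0 GB/s
```

Same bus width, same capacity, but the GDDR5 pool moves 2.5x the data, which is the whole appeal of the second configuration.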
Official Recommended System Requirements
OS – Windows XP / Vista
Processor – Intel Core 2 Duo @ 2.2 GHz or AMD Athlon 64 X2 4400+
Memory – 2.0 GB RAM
GPU – NVIDIA GeForce 8800 GTS/640 or similar
The above "Official Recommended System Requirements" will not run the game on Very High. To run the game at 30+ frames per second on Very High, the following is recommended.
OS - Windows Vista
Processor - A dual core processor 2.6 GHz or above
Memory - 3.0 GB RAM
GPU - NVIDIA GeForce GTX 280 or a Radeon HD 4870 X2
That was Crysis on PC, near the beginning of this console generation.
PS3 & Xbox 360 specs are nowhere close to these, but they survived five years after Crysis with a quarter of the RAM recommended to run it.
Again, PC RAM =/= console RAM. The RAM in a PC is used for the OS and other things, whereas the RAM in a console is used just for games.
6 core and dual GPU in 2012. Make it happen Microsoft. Day 1.
Graphics have hit the wall on the current gen. Played Uncharted 3 a few weeks ago and it felt like playing Uncharted 2, which was a big wow over Uncharted 1.
How is it even remotely possible that MS can build a console for an affordable price with a GPU as large as the 6990 in 2012? We are deluding ourselves. I'm sure someone like Brainstew will set this shit straight.
> ...won't be any next-gen, not anytime soon at least - it'd need a HDMI standards update.

Except that's not true - and I'm pretty sure we've discussed this before. 1080p60 FP is already supported for single link under the HDMI 1.4 optional formats specification. The 3.4 Gbit/s per-channel max bandwidth of HDMI 1.3/1.4 has always been enough to do it.
> I'm not sure, but Blu-ray movies can be played in full 1080p 3D. They sequentially show one full frame for the right eye and then one for the left eye. The video that is played is essentially 1080p@48fps, and HDMI can transfer that much data without a problem.

That's just it though ... BD3D is 1080p24 FP. A lot less bandwidth than 1080p60 FP. Current TVs don't support that.
It's going to be DX12 and beyond. It'll have GPU features that might or might not make it to AMD 8000/9000 series.
Given that the PS3 and 360 have near-identical RAM (I say near because the 360 has 10 MB of eDRAM), it makes me wonder whether the designers of the PS4 will go with the same amount, and if they go with more, whether MS will follow suit.
are people losing faith on the CES rumor already?
> But you get 1080p/24/3D. So each eye only gets 24fps. I don't even think 1080p/30/3D is an approved spec, which should be doable as it's just 1080p/60 levels of bandwidth.

1080p30 FP is also part of the optional spec IIRC.
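Putting numbers on the bandwidth argument: a quick pixel-clock sketch using the standard CEA-861 timings (frame packing doubles the vertical total; treat the exact figures as an illustration rather than a full spec reading):

```python
# Pixel-clock math for HDMI frame-packed (FP) 3D using standard CEA-861
# timings. Frame packing stacks both eyes vertically, doubling the
# vertical total per transmitted frame.
HDMI_14_MAX_TMDS_MHZ = 340  # TMDS clock ceiling for HDMI 1.3/1.4

def fp_pixel_clock_mhz(h_total, v_total, fps):
    """Pixel clock (MHz) for a frame-packed signal (vertical total doubles)."""
    return h_total * (v_total * 2) * fps / 1e6

fp_1080p24 = fp_pixel_clock_mhz(2750, 1125, 24)  # Blu-ray 3D: 148.5 MHz
fp_1080p60 = fp_pixel_clock_mhz(2200, 1125, 60)  # 297.0 MHz

print(fp_1080p24, fp_1080p60)
print(fp_1080p60 <= HDMI_14_MAX_TMDS_MHZ)  # 1080p60 FP fits under 340 MHz
```

So 1080p24 FP runs at the same 148.5 MHz pixel clock as plain 1080p60, and even 1080p60 FP at 297 MHz sits under the HDMI 1.3/1.4 ceiling, which is the point being made above.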
The PS3 and 360 have near-equivalent RAM in amount, but the split pool on the PS3 makes it effectively less ... or at least less manageable.
It would be rather pointless to go beyond 2 GB; even 3 GB seems like it could be too much.
Remember, we are in a shader-driven world.
A 4x increase in RAM, where the following aren't likely to increase 4 times: sound and music data, program data and variables, the operating system, character and world meshes.
Some of those will see minor increases, but I highly doubt you'll see any of them more than double... This leaves likely more than 4x the texture data.
People keep thinking in terms of basic game design, but it's the shaders and effects that change everything, NOT the amount of textures or poly detail.
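The budget argument above can be sketched with toy numbers. Every figure here is made up purely for illustration; the point is only how a 4x total skews toward textures when other categories merely double:

```python
# Hypothetical current-gen memory budget in MB -- illustrative numbers only.
current = {"sound": 40, "program/variables": 60, "OS": 32,
           "meshes": 80, "textures": 300}

total_now = sum(current.values())   # 512 MB
total_next = total_now * 4          # a 4x RAM bump -> 2048 MB

# Assume the non-texture categories merely double, as argued above.
doubled = {k: v * 2 for k, v in current.items() if k != "textures"}
textures_next = total_next - sum(doubled.values())

print(f"texture budget: {current['textures']} -> {textures_next} MB "
      f"({textures_next / current['textures']:.1f}x)")
```

With these made-up numbers the texture budget grows over 5x, i.e. more than the headline 4x RAM increase.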
> If you're using GDDR5 (particularly if it's clocked high) ... what's the advantage of using eDRAM on the GPU? For example, Hynix Semiconductor has a design for memory that offers as much bandwidth as 360's eDRAM. Why pay for the same speed twice?

4 GB of high-quality RAM is probably the best trade-off we can get. If you want to go really bonkers, 2 GB of GDDR5/XDR2 + 2 GB DDR3 + 150 MB of eDRAM. That will yield too much win.
> Unless people have no common sense (quite possible) ... it actually should be helpful.

No. Not even close to the same thing. Comparing smartphones and game consoles isn't helpful at all.
6-core and 2GB is a pretty modest upgrade over 360. 2 GPUs on the other hand... very interesting.
I wonder if they'll be able to sell it for $300.
> Stop being reasonable! Quad GPUs! 200MBs of eDRAM each! Five different banks of eDRAM on the CPU! Three banks of different main memory types! Digital distribution only!

If you're using GDDR5 (particularly if it's clocked high) ... what's the advantage of using eDRAM on the GPU? For example, Hynix Semiconductor has a design for memory that offers as much bandwidth as 360's eDRAM. Why pay for the same speed twice?
Now let's say they go with slower video RAM since they have eDRAM - why on earth would they use 150MB? To what end? That's way more than needed for 1080p with several effects buffers. I can't imagine the costs.
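Rough math backs that up. Summing a deliberately generous list of 1080p render targets (the buffer list is illustrative, not any console's actual configuration) still lands well short of 150 MB:

```python
# Sum a generous set of 1080p render targets and compare against 150 MB
# of eDRAM. The buffer list is illustrative, not a real console config.
W, H = 1920, 1080

buffers_bytes_per_pixel = {
    "back buffer": 4,
    "front buffer": 4,
    "depth/stencil": 4,
    "G-buffer MRTs (x4)": 16,
    "HDR lighting (FP16)": 8,
    "post-process scratch (x2)": 8,
}

total_mb = W * H * sum(buffers_bytes_per_pixel.values()) / 2**20
print(f"{total_mb:.0f} MB")  # ~87 MB, well under 150 MB, even without tiling
```

And in practice the front/back buffers usually live in main RAM anyway, so the eDRAM-resident working set would be smaller still.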
Another point of contention: why can't MS or Sony pack in the latest tech and sell at a calculated, deliberate loss per console, with production plans in place that yield an earlier break-even followed by profits? It's the same thing they did with this generation of consoles, except this time the turnaround period could be calculated to be much shorter.
Stop being reasonable! Quad GPUs! 200MBs of eDRAM each! Five different banks of eDRAM on the CPU! Three banks of different main memory types! Digital distribution only! Final destination!
> Show me where I can get a $100 5870.

Heh ... was about to post this.
> A 6990 is 700 dollars alone. Next year in 2012 it will be 500 dollars if we're lucky. Beyond that, the CPU + RAM + HDD, etc. will make the system cost a fuck ton. You're looking at an on-average 800 dollar system in 2012 with an insane amount of heat and power issues.

But that's retail prices; the cost to make the card is way lower.
These two corporations masterfully manipulate the media, always ready to report on mindless rumors or wild speculation in order to get hits; the hardcore gaming community gets thrilled and starts treating rumors as facts, and as a result no one talks about the Wii U, which is coming really soon, since they think something bigger is waiting just around the corner. Late 2013, early 2014.
> Stop being reasonable! Quad GPUs! 200MBs of eDRAM each! Five different banks of eDRAM on the CPU! Three banks of different main memory types! Digital and HVD distribution!

Fixed for even more awesome.
MS probably can get a 5870 for less than $100 for a console part today.
Unless people have no common sense (quite possible) ... it actually should be helpful.
People have no problem replacing smartphones every 2 years and, for an increasing population, tablets every 3 or so years (if not sooner for some). Given the life cycle of a console, the amortized cost is far lower.
If people get sticker shock due to up-front costs - they're fucking idiots with no concept of budgeting.
> No.

Bingo!
I certainly don't mind paying a premium price for a premium product; the higher the spec and the more features, the better.
The real reason most people were unhappy with the PS3 launch price was not the feature set or specs but rather the lack of polished, must-have titles.
Especially considering where the 360 was at the time in terms of polished titles that looked and felt next-gen, and at a cheaper price.
> Any system releasing above $350 right now isn't going to sell well at all.

It won't sell well regardless of the price due to supply constraints. They should price it so that it maintains complete inventory depletion for the first few months and paves the way for lots of subsequent price drops.
> I doubt it. Price reductions come from yield improvements and die shrinks (though the latter can reset the former for a bit).

> MS probably can get a 5870 for less than $100 for a console part today.
Even so, that would eat up so much of the budget, and it doesn't matter because a 5870 would not be suitable for a console.
I doubt it. Price reductions come from yield improvements and die shrinks (though the latter can reset the former for a bit).
The problem here is the 5870 is a 2.15 billion transistor part with extra hardware for busing, etc., and dual banks of RAM (redundant in a console).
That's an absolutely HUGE card. I highly doubt a die shrink is enough to get it down to that price.
Ah, I getcha ... though I disagree with the $350 price.

You're basically making my point.
People are spending their money on smartphones and tablets at more frequent rates than they did 5-6 years ago.
That means less money for other things in a bad economy. Any system releasing above $350 right now isn't going to sell well at all.
Depends how you define it. If you're talking about 5870 level of performance I wouldn't expect the console's GPU to be far off.
Anandtech estimated the 360's GPU at 90nm to be about 182mm^2.
If MS launches a new xbox with the same sized GPU, at 40nm they could fit around 1 billion transistors in the same die area. At 28nm, it would be around 2 billion transistors. The 5870 was about 2.1 billion transistors, so it's close.
Given appropriate optimizations for a console and a 28nm process, I think they could hit 5870 performance with a console GPU. The GPU manufacturing costs would be probably under $50 as well.
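The die-shrink estimate above can be sketched directly, assuming transistor density scales with the inverse square of the feature size (an idealization) and Xenos's commonly cited ~232M-transistor parent die:

```python
# Scale a Xenos-class die area across process nodes, assuming transistor
# density grows with the inverse square of feature size (rough idealization).
XENOS_TRANSISTORS = 232e6  # Xenos parent die, ~232M transistors at 90 nm

def transistors_at_node(node_nm, base_nm=90, base_count=XENOS_TRANSISTORS):
    """Transistor count in the same die area at a smaller node (ideal scaling)."""
    return base_count * (base_nm / node_nm) ** 2

print(f"40 nm: {transistors_at_node(40) / 1e9:.2f}B")  # ~1.17B
print(f"28 nm: {transistors_at_node(28) / 1e9:.2f}B")  # ~2.40B
```

That lines up with the post's figures: roughly 1 billion transistors at 40 nm and roughly 2 billion at 28 nm in a 360-GPU-sized die, within reach of the 5870's 2.1 billion.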
The AMD GPU is 262mm^2 with the daughter eDRAM die.
We're talking 3 billion transistors for a 28 nm part - above 6970 performance levels.
> Hmm. The 6870 is a $200 part. I got confused there. Screw AMD and their messed-up system.

My bad too ... in my previous post my brain made me think the 5970 was being referenced (the dual GPU), since that would make more sense in the analogy.
> It's not that simple though. We're still looking at more than twice the power consumption.

If it's twice the power consumption of Xenos, then it's doable in a console, since Xenos was around 50-60 watts.