There is no possible way Sony has ray tracing on the cards. The original Fermi GTX 480 was tracing at ONE frame per second.
On June 12, 2008 Intel demonstrated a special version of Enemy Territory: Quake Wars, titled Quake Wars: Ray Traced, using ray tracing for rendering, running in basic HD (720p) resolution. ETQW operated at 14-29 frames per second. The demonstration ran on a 16-core (4 socket, 4 core) Xeon Tigerton system running at 2.93 GHz.[14]
lol, Yoshida. Not as good as the Wario64 troll but still great.
Dual-layer and multi-layer are different things.
What the fucking fuck? Who the hell are you to be acting all high and mighty with this smug tone? I know about Folding@home, I have a PS3 that I use almost daily, and I think you should STFU with your pompous comments, since all you do is take quotes out of context and spread misconceptions in this forum.
How many times have you been proven wrong? How many times have you been corrected on one of your assumptions? Pull your head out of Sony's ass for two seconds and realize I'm not falling for your bullshit.
I can't wait until next gen specs are revealed and I hope they are nothing like you keep assuming they'll be so everyone can see what kind of joke you are.
That's Jeff for you. He'll spend hours looking through Sony patents, and anything he finds = FACT!, even though he doesn't understand that these companies file plenty of patents and many never see the light of day.
Laser requires a different reading algorithm. They can't update it via firmware.
What's the difference?
Legitimate ray tracing software would require an order of magnitude more processing power than is feasible on next-gen hardware. Not even on next gen 2.
I'll pay £10 to the Child's Play charity if ray tracing makes it into a console game next gen, digital or retail release.
With a probable 2013 release date, let's say 2013 + 7 years: the offer stands until 14 June 2020.
The Last of Us already has real-time ray tracing from the main character's fists, and that's on PS3: http://www.neogaf.com/forum/showthread.php?t=477927
Ray-casting is often used for physics/collision problems, but it's a different kettle of fish from ray tracing for light simulation from a performance-to-results POV.
But the new Unreal Engine 4 is doing nice things along those lines with lighting.
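To put rough numbers on that gap (all figures below are invented purely for illustration), compare a frame's worth of physics ray casts against a frame's worth of per-pixel lighting rays:

Code:
#include <cstdio>

int main() {
    // Hypothetical per-frame workloads; only the order of magnitude matters.
    long long physicsRays   = 200;               // collision / line-of-sight queries
    long long width = 1280, height = 720;        // 720p framebuffer
    long long samplesPerPix = 16;                // modest Monte Carlo sample count
    long long lightingRays  = width * height * samplesPerPix;

    printf("physics ray casts per frame: %lld\n", physicsRays);
    printf("lighting rays per frame:     %lld\n", lightingRays);   // 14,745,600
    printf("scale gap:                  ~%lldx\n", lightingRays / physicsRays);
    return 0;
}

Same intersection math in both cases, but roughly five orders of magnitude more of it for light simulation.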
The AMD libraries support bundled ray tracing for lighting. It's not "legitimate ray tracing" but a method that uses only a few hundred rays to determine lighting, rather than the tens of thousands that would require an order of magnitude more power. In an AMD lecture on the subject it was admitted that true ray tracing would "bog down the processor". In any case, ray tracing is starting with bundled ray tracing this generation. The next generation may fully support it.
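For flavor, here's a minimal sketch of that budgeted idea (not AMD's actual library code, just an illustration): estimate area-light visibility at a shading point with N jittered shadow rays, where N is the budget knob. The scene (one occluder sphere), light placement, and budgets are all made up.

Code:
#include <cmath>
#include <cstdio>
#include <random>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Toy occlusion test: a single hard-coded sphere stands in for the scene.
bool shadowRayBlocked(Vec3 origin, Vec3 dir) {          // dir must be normalized
    Vec3 c = {0.0f, 2.0f, 0.0f};                        // occluder centre
    float r = 0.5f;
    Vec3 oc = sub(origin, c);
    float b = dot(oc, dir);
    float det = b*b - (dot(oc, oc) - r*r);
    return det > 0.0f && (-b - std::sqrt(det)) > 0.0f;  // hit in front of origin
}

// Fire 'budget' jittered shadow rays toward a light above the point and
// return the fraction that get through, i.e. a soft-shadow estimate.
float softShadow(Vec3 p, int budget, std::mt19937& rng) {
    std::uniform_real_distribution<float> u(-1.0f, 1.0f);
    int visible = 0;
    for (int i = 0; i < budget; ++i) {
        Vec3 d = {u(rng) * 0.3f, 1.0f, u(rng) * 0.3f};  // jitter toward area light
        float len = std::sqrt(dot(d, d));
        d = {d.x / len, d.y / len, d.z / len};
        if (!shadowRayBlocked(p, d)) ++visible;
    }
    return float(visible) / float(budget);
}

int main() {
    std::mt19937 rng(42);
    Vec3 p = {0.0f, 0.0f, 0.0f};
    printf("budgeted  (256 rays):   %.3f\n", softShadow(p, 256, rng));
    printf("reference (20000 rays): %.3f\n", softShadow(p, 20000, rng));
    return 0;
}

Both calls converge to the same answer; the budgeted one is just noisier, which is exactly the trade the lecture was describing.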
http://en.wikipedia.org/wiki/Blu-ray_Disc said:
In January 2007, Hitachi showcased a 100 GB Blu-ray Disc, consisting of four layers containing 25 GB each.[160] Unlike TDK and Panasonic's 100 GB discs, they claim this disc is readable on standard Blu-ray Disc drives that are currently in circulation, and it is believed that a firmware update is the only requirement to make it readable to current players and drives.[161]
This probably is not supportable on early PS3 drives... who knows. Quad-layer Blu-ray players are necessary to support 4K Blu-ray, coming after January 2013 when H.265 is published.
Hammer24 said:
Interestingly, the exact same thing happened at the beginning of last gen, when user MikeB "educated" everyone about the awesome power that be the mighty Cell.
When finally exposed, he resorted to posting pics of his sister (before he got banned in the end). I wonder if we'll get another awesome meltdown this time.
No meltdown from me, and again: the 2011 Cell vision does not require a Cell processor, it's hardware agnostic! Cell or no Cell in the PS4 is not the point.
lmao. Awesome.
EDIT: Snipped!
lmao
Let's see if I can predict the future:
1. You'll keep on posting walls of text with no info relevant to the next console gen whatsoever, until the specs for Durango and Orbis get announced.
2. When the specs are out, we'll have a Durango that is more powerful than the Orbis. You will continue to "educate" the dirty masses on why this only seems so, as the silicon in the Orbis is so much more advanced that it'll only be a question of time until devs unleash the full power of the mighty APU to surpass what's possible on the Durango.
3. The first 3rd-party multiplats will arrive, and look and run significantly better on Durango. You'll tell everyone that devs are just too lazy, as the Orbis is capable of so much more.
4. Time goes on, the situation won't change, and people will call you out. And then the meltdown will come.
We've seen all this before.
LOL, maybe, but I secretly believe that Durango and PS4 will have the same SOC. I haven't mentioned it before as there is very little support for this.
Most of the speculation for this has timed out and is unlikely, leaving something like the above as a possibility:
Microsoft has purchased the domain names microsoft-sony.com and sony-microsoft.com. Microsoft registered the former yesterday, and the latter two days ago, according to a WHOIS search on DomainTools (one, two, via WinRumors).
Both domains currently redirect to a Bing search results page. The purchase of the domains has naturally resulted in lots of speculation.
Again, wild speculation ...
If the PS4 really will use an x86 multicore CPU, it will be the other way around for the first time: the PS4 would be easier to program for than the XBOX720. We know next to nothing about what kind of CPU the next Xbox will use, but if the XBOX720 uses a PowerPC-derived CPU, we have two possible scenarios. In the worst case it could be an in-order multicore CPU like Xenon, meaning total crap to program for in comparison with the x86 PS4 CPU; in the best case it could be an out-of-order PowerPC CPU, meaning it won't be a bitch to program for, but it surely won't be easier than a traditional x86 CPU.
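To make the in-order pain concrete, here's a toy example (purely illustrative, not from any real codebase): an in-order core like Xenon executes strictly in program order, so a dependent chain stalls everything behind it, and the programmer or compiler has to schedule around it by hand. An out-of-order x86 core does that reordering in hardware.

Code:
#include <cstdio>

// Naive dot product: every iteration serializes on the single 'sum' chain,
// so on an in-order core the pipeline sits idle waiting for each add.
float dotNaive(const float* a, const float* b, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i)
        sum += a[i] * b[i];
    return sum;
}

// Hand-scheduled version: four independent accumulators give the in-order
// pipeline work to do while earlier loads and adds are still in flight.
// An OoO core largely finds this parallelism by itself.
float dotScheduled(const float* a, const float* b, int n) {
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i + 0] * b[i + 0];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; ++i) s0 += a[i] * b[i];   // leftover elements
    return (s0 + s1) + (s2 + s3);
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {1, 1, 1, 1, 1, 1, 1, 1};
    printf("%f %f\n", dotNaive(a, b, 8), dotScheduled(a, b, 8));  // both 36
    return 0;
}

Multiply that kind of hand-holding across a whole codebase and you get the "total crap to program for" complaint.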
Now, how can I be claiming the PS4 is more powerful than the Durango when I believe they will have essentially the same core (memory size may be different)?
I even mentioned a wild speculation where there might be two of the 1PPU+4SPU building blocks mentioned in another Sony patent in the PS4, and also in the Durango, giving backward compatibility for both (the Xbox 360 used 3 PPC cores, but one was dedicated to OS and DSP duties, which could be emulated with other hardware). Economy of scale and building-block design would make it practical, especially if they were also used in a slimmer slim-PS3 SOC that could emulate an Xbox 360 as well. So both Microsoft and Sony would cooperate on a SOC that could emulate both a PS3 and an Xbox 360.
After 2013, TSVs make SOCs economically practical, and building-block designs having a large economy of scale is what supports this.
Again, wild speculation, but the underlying logic is based on the Sony 2010 patent and the AMD "process optimized building block" SOCs.
I am no programmer, but I don't think it will make a significant difference. The big problem was that Cell had asymmetrical cores: the SPUs had to use a different (kind of) instruction set than the main PPU. The big advantage the 360 had was symmetrical cores, 3 exactly identical cores. I don't believe x86 vs PPC will make a significant difference (considering that PPC development is a common thing in the console field).
Actually, they seem to have totally different cores. We can't really talk about things like the SOC if we don't know what kind of CPU the next Xbox will use. A single Steamroller core would take as much space on silicon as four simple in-order cores of a hypothetical Xenon-like Xbox720 CPU; therefore, until we know what kind of CPU design the Xbox720 will implement, we can't really guess about the chip design.
Yup... we have less information about the Durango. The assumption that the Durango uses a PPC came from the Oban being made in the IBM forge. If the Oban is not a Durango SOC, then Durango might not be using a PPC CPU.
Hammer24 said:
See, that's the problem (wild speculation).
Instead of muddying the water with (mostly) baseless speculation, why don't you simply concentrate on what we know about the devkits out in the wild? While they are by no means identical to the finalized hardware, they'll give a pretty good picture of where things are heading.
Where is the fun in that? <grin> The devkit is probably known for the PS4 but, interestingly enough, not for the Durango... riddle me that. And what more can we speculate on from devkit leaks?
This thread is weird, y'all. That said, that Tom's Hardware review of the new AMD CPUs seems to suggest that one benefit of the dual-GPU config is that, together, they can push 1080p.
I'm not a tech person, though. Could the combo integrated/7970M that ppl are talking about push that Star Wars demo at 1080p? Sony do love checking off bullet points, and I bet they'd love to say Orbis does full HD gaming.
So, quick question: when people are saying things like "X teraflops won't be enough", what exactly are they aiming for? I was just looking through the high-res PC screenshot thread and someone posted some absolutely jaw-dropping screens of the new BF3 DLC. If we could get quality like that at 1080p/30fps next gen I would be incredibly happy. Same with Crysis 2 on Ultra.
Personally, I hope they have the specs to make 1080p an attainable target. Even if 1080p isn't in all games, which it won't be, at least the games that don't make it will likely be greater than 720p.
That's what I'm thinking too, although they did try to say the same for this gen.
The 7970M alone could probably run 1313, though probably not with all of the effects. We don't know exactly what it was running on, so it's hard to tell. We also don't know for sure if the GPU/APU combo is happening or if it's just for the dev kits. Personally, I think/hope it is.
Ok, I am not a programmer but I will try to explain this concept.
In gaming, memory performance (mainly latency), floating-point performance, and integer performance are the key factors for the performance of a CPU.
For example, floating-point operations (FLOPs) are important for pushing geometry, while integers are important for AI.
RISC CPUs like the Cell have very high floating-point performance but lower integer performance, while traditional x86 CPUs have high integer performance but lower floating-point performance.
For GPUs, the most important factor is floating-point performance, since it is a specialized processor.
When people are saying things like "X teraflops won't be enough, etc." they are probably coming from the observation that, for example, Epic said that to run UE4 at its best settings you need a system that delivers "X FLOPS".
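For what it's worth, the "X FLOPS" numbers people throw around are theoretical peaks, and the arithmetic behind them is trivial: units x clock x FLOPs per cycle. The figures below are hypothetical; the GPU line just happens to work out to a 7970M-class part (1280 ALUs at 850 MHz), since that's the rumored chip.

Code:
#include <cstdio>

int main() {
    // Theoretical peak = units * clock(Hz) * FLOPs per unit per cycle.
    double cpuCores      = 6;        // hypothetical CPU
    double cpuClockHz    = 3.2e9;
    double cpuFlopsCycle = 8;        // e.g. 4-wide SIMD multiply-add
    double cpuPeak = cpuCores * cpuClockHz * cpuFlopsCycle;

    double gpuAlus       = 1280;     // 7970M-class shader count
    double gpuClockHz    = 0.85e9;
    double gpuFlopsCycle = 2;        // multiply-add per ALU per cycle
    double gpuPeak = gpuAlus * gpuClockHz * gpuFlopsCycle;

    printf("CPU peak: %.1f GFLOPS\n", cpuPeak / 1e9);   // 153.6
    printf("GPU peak: %.1f GFLOPS\n", gpuPeak / 1e9);   // 2176.0 (~2.2 TFLOPS)
    return 0;
}

Peak numbers assume every unit does useful math every cycle, which real games never achieve, so treat them as an upper bound for comparison only.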
I know what teraflops means, but I was asking more about what kind of visuals people are expecting from next gen, since what I'm expecting seems to be attainable at 1080p without breaking the bank.
1080p may see more support next gen, but expect developers to go back to 720p in order to have more time per pixel. No matter how good games look, resolution will always be one of the first things to be sacrificed for more performance.
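The arithmetic behind "more time per pixel" is simple enough to write down (30 fps assumed purely for illustration):

Code:
#include <cstdio>

int main() {
    long long p1080 = 1920LL * 1080;   // 2,073,600 pixels
    long long p720  = 1280LL * 720;    //   921,600 pixels
    printf("1080p vs 720p pixel count: %.2fx\n", (double)p1080 / p720);  // 2.25x

    // The frame budget at 30 fps is ~33.3 ms regardless of resolution,
    // so dropping to 720p hands each pixel 2.25x the GPU time.
    double frameNs = 1e9 / 30.0;
    printf("time per pixel at 1080p: %.1f ns\n", frameNs / p1080);  // ~16.1
    printf("time per pixel at 720p:  %.1f ns\n", frameNs / p720);   // ~36.2
    return 0;
}

That 2.25x is exactly the headroom developers spend on effects instead of resolution.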
Since everyone seems to ignore that IGN usually sucks at rumors, we can bring another of their stories into this.
IGN did their developer poll and the overwhelming majority indicated that the Durango was, or would be, easier to work on than the PS4. Take that as you will.
http://www.ign.com/articles/2012/06/01/the-next-generation-according-to-game-developers
I am probably in the minority on this, but I am OK with that. Resolution is one thing, but for particular games the effects that make the game world look interesting and react interestingly are more important than the resolution.
Not that I am doubting its veracity, but I wonder who they asked? The sample size (35) is pretty small.
Karak said:
IGN is a basin of insider knowledge. So I figured I might as well throw it in here.
MS is really good at documentation though, in comparison to Sony and Nintendo - even with more standardized hardware in this upcoming round. That, and the gobs of memory.
IGN has real, legitimate sources. It seems their understanding of technology, or whatever "second-hand information" they are told, is sometimes lacking or intentionally obfuscated, though.
I just hope they realize the cost savings of omitting DACs and make the video output HDMI-only with a 720p minimum. HDMI plus a combined optical/stereo output jack should be the only two A/V ports on the back of the device. I'm not expecting 1080p as standard, but I surely hope there is a mandatory 720p minimum enforced this time; it will also make video scaling much simpler.
Oh my god! Sony is copying the Wii U by releasing a controller with a screen, only it's the size of a small TV. Take that, Nintendo!
It's a patent, so it must be real, right?.........right?
Jesus H.... your constant trolling of jeff is getting really old. Sure, you might not agree with what he writes, but at least he's putting some research into his posts and providing cites.
There's no need for the constant bashing. At least jeff's trying to contribute something interesting and worthy of discussion, unlike your constant shitting on him.
Did he steal your bike as a kid or something?
Aspects such as compilers, debugging tools, performance analyzers, and other tools matter just as much as the architecture, if not more. If the tools let developers get the job done, regardless of the architecture, that will make the difference in the end. Though I agree that x86 would be preferred by most developers, since it's well documented and people have plenty of experience with it.
I don't think MS is going to use anything but an OoO core since they originally wanted this in the 360 but never got it.
Basically whichever of the two is easier to develop for and receives more dev support will see the better return. Question is, out of the three, which do you think will pull ahead?
Fine fine, I'll back off. Didn't realize it was that bothersome to people, my apologies.