
Wii U Thread - Now in HD!


NBtoaster

Member
For me, it's the realistic lighting and shadows seen in ZombiU, which also shows dynamic shadows. Another is the ease with which DoF has been used in things like P-100 and Pikmin.

The leaked dev kit specs show a DX11 feature set in black and white, and have gone some way to being verified as credible on here. So unless those specs are wrong or have since been scaled back, a DX10 feature set is just not the case.

What leaked specs show DX11 features?
 

The_Lump

Banned
right. i don't think the features are going to preclude full fat UE4, i think the performance is going to preclude it. sm4 vs sm5 isn't really going to be a factor.

There was never any hope of full fat UE4 though. That'll be a push even for PS4/720, let alone Wii U. I don't think that's even in question atm.

Just not sure why "dx10 / dx11 features" gets chucked about as a yardstick for measuring power - they're unrelated topics.
 

nordique

Member
Not necessarily. I think all the weak-CPU talk for the Wii U comes from the vector unit. It comes down to how many vector instructions you can issue per clock and how wide the vector unit is. If you have enough registers to hide all the latencies, it's basically Clock × Ops/Cycle.

The article mentioned that the design house that doesn't have issues with the CPU is one that is very GPU heavy. My guess is that they're not utilizing the 360's VMX or the PS3's SPEs very much on those consoles either.

My speculation is that while the Wii U CPU is very modern and efficient in running code, it lacks a powerful/fast vector unit, and that will hinder performance of very compute-heavy tasks in ports. That stuff could probably be run on the GPU, since it's a more modern GPU, but I doubt many devs will do that.
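To put the quoted Clock × Ops/Cycle point in concrete terms, here's a minimal back-of-the-envelope sketch. All the figures are illustrative assumptions (a 3.2GHz in-order core with 4-wide VMX-style SIMD versus a hypothetical lower-clocked out-of-order core with 2-wide paired singles), not confirmed specs for any of these chips:

```python
# Peak vector throughput = clock * vector instructions issued per cycle
#   * SIMD lanes * ops per lane (2 for a fused multiply-add).
# All figures below are illustrative assumptions, not confirmed specs.

def peak_vector_gflops(clock_ghz, issue_per_cycle, lanes, ops_per_lane=2):
    return clock_ghz * issue_per_cycle * lanes * ops_per_lane

# A 3.2GHz in-order core with 4-wide (128-bit single-precision) SIMD:
print(peak_vector_gflops(3.2, 1, 4))  # 25.6 GFLOPS per core
# A hypothetical 1.2GHz out-of-order core with 2-wide paired singles:
print(peak_vector_gflops(1.2, 1, 2))  # 4.8 GFLOPS per core
```

The point of the sketch: clock and SIMD width multiply together, so a core that wins on efficiency per clock can still fall far behind on peak vector throughput.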

Yes, this is a very good point, and I don't disagree; but what I meant to imply was that the article never discusses or even acknowledges the innate difference between the Wii U's CPU structure and the 360/PS3 setup. It paints the CPU as poorer than it likely is.

Regardless of the intricacies, there are fundamental differences between in-order and out-of-order architectures, and all these engines that have been optimized for 5-6 years to run on in-order CPUs are simply being ported over to a new system within a rushed time frame.

There is no mention of the DSP and how that could relate to CPU efficiency, when we know sound takes up a lot of CPU power (the one developer did say they made 15% efficiency progress; to me this may indicate DSP offload),

and no discussion of the GPGPU-like CPU-to-GPU offload capabilities at all.

It just seemed like the article was too definite about the CPU. My reasoning for that is that the reporter doesn't understand what they are talking about too well; they're just passing on the information.
 

nordique

Member
i've said before that i'd be happy with Nintendo's first party Wii-level graphics in HD (based on what i've seen out of Dolphin on my PC). i'd like more, but that standard is enough for me, so i'm very happy with what we're getting, even though clearly it isn't going to be putting out graphics that remotely compare with my PC or indeed the consoles that follow it.

so long as the IQ isn't garbage, as it was on the Wii, i think it's going to be a great little piece of hardware. however, that doesn't make me optimistic about what third parties are going to do with it. hardly any third party Wii games look as good as the higher end third party GameCube games did. IQ is a big part of why Wii games look shit (solved here, hopefully), but team talent is the other part (compare Nintendo's games in HD on Dolphin with, well, basically everyone else's)... and is that going to be solved? architecture helps somewhat, but not entirely i don't think.

It depends on what your expectations are for IQ.

Personally, I am of the opinion that 32MB of eDRAM is more than enough to clean up IQ,

but that also depends on a multitude of factors, most importantly whether developers attempt to implement things like AA or not.

I would expect most retail releases in 720p though, rather than 1080p, but one never knows. It all goes back to developers.
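As a rough sanity check on why 32MB of eDRAM looks sufficient at 720p but tight at 1080p, here's a quick render-target size calculation. The formats and MSAA levels are illustrative assumptions; the actual Wii U render-target layout isn't public:

```python
# Render-target footprint = width * height * bytes per pixel * MSAA samples.
# Formats are illustrative; the real Wii U buffer layout isn't public.
def target_mb(w, h, bpp=4, msaa=1):
    return w * h * bpp * msaa / (1024 * 1024)

for label, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    for msaa in (1, 2, 4):
        color = target_mb(w, h, 4, msaa)  # RGBA8 colour buffer
        depth = target_mb(w, h, 4, msaa)  # 24-bit depth + 8-bit stencil
        print(f"{label} {msaa}x MSAA: {color + depth:5.1f} MB of 32 MB eDRAM")
# 720p with 4x MSAA comes to ~28MB (fits); 1080p with 4x MSAA needs ~63MB (doesn't).
```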
 

big_erk

Member
Definitely an option, but they're quite expensive iirc?

You can get a 128GB SSD for just under $100.00, which is cheaper than a 128GB SD card. I just happened to get my hands on a 256GB unit and set it aside for this purpose. You can always go with a mechanical hard drive and save a lot of money, but moving parts and all.
 

nikatapi

Member
It was clear from the beginning that Nintendo would not go with a super powerful console, but a moderate improvement over the current generation consoles.

The fact that there are some pretty good looking games at launch (Assassin's Creed, Trine 2) which are probably rushed ports is encouraging, and I think as developers get to understand the system we will get some very nice graphics.

Of course everyone's concern is whether the machine will be capable of getting ports (downgraded, of course) from the next Sony/MS consoles. If the power gap is very high, it might be difficult in the future, especially as developers learn to take advantage of that power to the point that it affects gameplay, not only visual improvements.

The only certain thing is that we will get some very innovative gaming and non-gaming uses of the WiiU gamepad, and probably Nintendo's first party games will look amazing.
 
I think you can still be positive about the Wii U and talk specs. You just need to have realistic expectations about the system. Within that, the tech can still be discussed without need for kleenex boxes nearby lol

People who get upset about specs are those with insane expectations, and we've known for over a year now (we've been talking about all sorts of stuff since E3 2011) that the system was going to be a moderate boost over the PS360.

It is a little disappointing to see Nintendo go for the lower end of their RAM target, 1GB rather than the upper end at 1.5GB (which is what we've been expecting, expectations in check and within a reasonable range et al).

Assuming that is the final retail hardware's usable RAM.

I still find it quite alarming that in threads like these, one linked article = confirmed info.

People take it as fact and run with it... no one will care that we jumped to conclusions, but these threads become a chore to read.

Specs are no fun to talk about when people start bringing up PS4/X720 future UE4 games and pointing at what the Wii U can't do. These stories get written to be linked and talked about. People jump and call them facts.

At this point we need to read hands-on previews of the games; we should all know what level of hardware Nintendo has been telling us they have planned.

September 13 hype! I want games.
 

nordique

Member
What leaked specs show DX11 features?

I think the compute shader support was the big one, which, unless I am mistaken, is more of a Shader Model 5.0 / DX11 type feature than DX10 / SM 4.x.

I remember when bg first posted those specs, that was the first thing that jumped out at me after I read through everything

some of the more super techy folks were surprised too
 

Van Owen

Banned
Disappointed about the amount of RAM and that it's apparently pretty slow.

I think the weak CPU is really going to hold back ports from 720 and ps4.
 

nordique

Member
It was clear from the beginning that Nintendo would not go with a super powerful console, but a moderate improvement over the current generation consoles.

The fact that there are some pretty good looking games at launch (Assassin's Creed, Trine 2) which are probably rushed ports is encouraging, and I think as developers get to understand the system we will get some very nice graphics.

Of course everyone's concern is whether the machine will be capable of getting ports (downgraded, of course) from the next Sony/MS consoles. If the power gap is very high, it might be difficult in the future, especially as developers learn to take advantage of that power to the point that it affects gameplay, not only visual improvements.

The only certain thing is that we will get some very innovative gaming and non-gaming uses of the WiiU gamepad, and probably Nintendo's first party games will look amazing.

Nice post

and yeah, the big thing is whether or not the Wii U could get next gen downports

I think it's less about power and more about feature set. For instance, mobile phones are much, much less powerful than the PS3 and Xbox 360 (don't let the "1GB RAM" and all that phone spec sheet crap fool you) but they can still get Unreal Engine 3 games.

Likewise, if the Wii U has a comparable feature set to the PS4 and 720, it won't matter that the raw tech is closer to the PS3/360, only that the Wii U would be capable of running the engine (something the Wii wasn't able to do).

I should mention that, regardless of what all these developer leaks say, Iwata did address these concerns at previous investor meetings, and according to him the Wii U will be powerful enough to get ports from future, more powerful systems. To me, that is all that needs to be said about the potential; it means Iwata was sure it could receive ports not just from current gen but from the next gen systems too, which is why we have forward-thinking features such as a "GPGPU"-focused architecture (akin to what the other next gen systems seem to have).

What it will boil down to, at the end of the day, is whether publishers will want to port and whether developers will do a good job with the porting.
 

The_Lump

Banned
Yes, this is a very good point, and I don't disagree; but what I meant to imply was that the article never discusses or even acknowledges the innate difference between the Wii U's CPU structure and the 360/PS3 setup. It paints the CPU as poorer than it likely is.


...


It just seemed like the article was too definite about the CPU. My reasoning for that is that the reporter doesn't understand what they are talking about too well; they're just passing on the information.


I agree. The quotes he uses about the CPU could just as easily have been taken in a positive manner. The guy says "We are not limited by it, but some games might suffer from it". That same guy says their game is GPU heavy.

That could be taken to mean many things, not necessarily negative. Like you say, OoOE is huge. A lower clocked OoOE CPU with two-way SMT, for example, would trash the 360 CPU in most ways. But stick a game optimised for the 360 on it and you will soon be 'limited'.

The article was quite quick to take the quote as a bad thing and didn't elaborate on other possibilities of what it could mean. Other than that, good article.


(fyi, I'm not saying the CPU will or won't be a problem/weak sauce, just saying we don't know and the article's author doesn't either)
 

IdeaMan

My source is my ass!
Article confirms 1 GB of usable RAM for games.

Yep (1, 2, and more)

This is in part why I was never specific about the memory amount: devs were working with 1GB for their games, but hadn't seen a retail unit with the final quantity of RAM; they were just told of it.

And there is all this story about the room reserved for the "OS"/software layer, elaborated in many posts like here.

Nintendo surely has a better idea of the OS size now, the files that constitute it. But let's say it takes 50MB. It's very likely, considering there was apparently a huge space dedicated to those functions this year, that it will still need additional room for the Wii U services, background applications, multi-tasking, etc.

So there are two reasons why the system can't have only 1GB of total RAM:

1) Studios already built their games with this amount. Imagine Nintendo saying "hey, our OS is 70MB and we need roughly 150MB of memory available while someone plays, so your projects will run with 850MB in the end". It would be hell, especially for titles that are at roughly 30fps and would therefore see their performance decrease (see the quick budget sketch at the end of this post).

Besides:

A: I doubt they found a way to reduce the footprint of all the non-gaming stuff, considering what kind of features (social, internet, multi-tasking, apps, voice chat, camera, etc.) the Wii U is capable of, to a point where those would only require, let's say, 0.1GB on top of that 1GB. But it's a possibility.

B: I also question the convenience and cheapness of the hypothetical choice to integrate a 0.1 to 0.4GB memory chip for the "OS", or design a 1.1GB unified memory pool for games + "OS", rather than simply putting in 1.5GB.

So to summarize: in the worst case scenario, the Wii U has that 1GB granted to developers plus an additional X amount for the "software layer".

2) Like I said, some third parties were notified at one point of the planned final amount of memory in the retail unit. And it was clearly more than 1GB, more than 1.1GB. But that was a few months ago, long before the OS was completed (if that's the case now), long before the presentation of Miiverse and the possible introduction of the dashboard and all this operating system layer to developers. Those developers didn't see a working commercialized system with its total RAM. So we'll need to patiently wait for the platform's release to finally see what Nintendo decided on that matter (a weird 1.1GB total quantity, or a 1.5GB-2GB one).

TLDR: Wii U will rock, it's a solid and balanced system!
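Here's that budget sketch as plain arithmetic. Every figure is one of the hypotheticals from the post above (70MB OS core, 150MB background reserve), not a confirmed Wii U number:

```python
# The scenario from the post above, as plain arithmetic.
# OS figures are the post's hypotheticals, not confirmed numbers.
OS_CORE_MB = 70        # assumed core OS footprint
BACKGROUND_MB = 150    # assumed reserve for services / multi-tasking

for total_mb in (1024, 1536, 2048):  # 1GB / 1.5GB / 2GB retail units
    for_games = total_mb - OS_CORE_MB - BACKGROUND_MB
    verdict = "meets" if for_games >= 1024 else "breaks"
    print(f"{total_mb} MB total -> {for_games} MB for games ({verdict} the 1GB dev target)")
```

With 1GB total, the OS reserve would eat into the 1GB devs already built against (804MB left); 1.5GB or 2GB total keeps the promised 1GB intact, which is the crux of the argument.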
 

nordique

Member
Disappointed about the amount of RAM and that it's apparently pretty slow.

I think the weak CPU is really going to hold back ports from 720 and ps4.

first, I agree regarding the RAM

second, don't jump to conclusions about the CPU just yet. I still think some developers are being disingenuous regarding both it and how effectively they utilize it, and all these reporters do is pass on the information.

even the tech guy he talked with doesn't have first-hand information about the CPU.

It's a different architecture, and it's a new system. We must keep that in mind.

That said, it does seem it's on the slow side - but that might not be just because of a low clock. It might also be due to developers not understanding how it works yet.

go figure. new system and all.
 

Stulaw

Member
People do realise that Shader Model 5.0 is just Microsoft's versioning in HLSL, and that OpenGL has the same features but exposes them in GLSL 4.3, right?
 

Kinky.

...that's only cos Starfox vs Metroid technically isn't an existing IP.

Yeah, yeah. I'm not going to let you have that. :p

Yes, you can. Now do it!

*crosses arms and pokes out bottom lip*

No!





The two things I was looking for.

you definitely should. there's a nice network of gaffers and you can drop more hints. i'll follow.

twitter is pretty amazing once you get into it. really feels like a community.

I just can't get into that and Bookface.

No it's not. Devs would trade speed for volume any day.

At most, high speed RAM will only give you a couple of frames per second extra, whereas running out of RAM will collapse your frame rate.

If Nintendo asked the devs... 4GB of exotic or 8GB of standard? 9 out of 10 would take the latter.

Would a game dev (multiplat or otherwise) really prefer 8GB of DDR3 to 4GB of GDDR5?

Depends on the person.

http://forum.beyond3d.com/showthread.php?t=62108

So that means the vgleaks CPU info is wrong/fake?

Maybe Nintendo did it on purpose, or the guy writing the text just mixed up the terms...

It means it's correct (I say that based on personal info corroborating other things). As I said, IBM said Power. That means it can be PowerPC.


... 100%?

Ok, 200%.


SD Cards are the last

Got any info on access/seek speeds and transfer rates?

I'm working under the assumption that the SD card slot is going to be faster than the USB 2.0 port. Future SD card speeds are supposed to range from 100-300 MB/s.
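For a rough sense of what that difference would mean in practice, here's a small transfer-time comparison. The USB 2.0 figure is an assumption (sustained rates in practice are usually around half its 60 MB/s theoretical ceiling); the SD figures are the projected range mentioned above:

```python
# Time to move a hypothetical 500MB chunk of game data at various speeds.
# The USB 2.0 figure is an assumption (~half its 60 MB/s theoretical peak);
# the SD figures are the projected range mentioned above.
ASSET_MB = 500

speeds_mb_per_s = {
    "USB 2.0 (realistic sustained)": 30,
    "SD card at 100 MB/s": 100,
    "SD card at 300 MB/s": 300,
}
for name, speed in speeds_mb_per_s.items():
    print(f"{name}: {ASSET_MB / speed:.1f} s")
```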


Nice read. The writer makes some unusual conclusions though.

b) 1-1.5GB seems plausible. bg made mention of how it's more like 1.5-2GB, but that could be total system RAM; usable RAM might only be 1-1.5GB

Yeah, I'm talking about total. Usable I don't know, but we did learn that 1GB was set aside in the kits. I've said before I would be surprised to see a large amount blocked off in the final, but I don't recall saying it's impossible.
 
Disappointed about the amount of RAM and that it's apparently pretty slow.
Didn't see the Digital Foundry sidebar at first, so I was wondering where this comment came from. Have there been several reports of this, as they state?
This is in part why I was never specific about the memory amount
I thought you were the source of the 2GB figure...?

I don't think anybody (except for that Emily Rogers lady) thought there'd be 1GB of total RAM - but usable RAM for games falls roughly in line with the SDK doc leak?
So wait - if there's 1GB total RAM and the OS uses like 200MB, there's 800MB left for the devs?
Nah, the article states there's 1GB available for games.
 

Van Owen

Banned
first, I agree regarding the RAM

second, don't jump to conclusions about the CPU just yet. I still think some developers are being disingenuous regarding both it and how effectively they utilize it, and all these reporters do is pass on the information.

even the tech guy he talked with doesn't have first-hand information about the CPU.

It's a different architecture, and it's a new system. We must keep that in mind.

That said, it does seem it's on the slow side - but that might not be just because of a low clock. It might also be due to developers not understanding how it works yet.

go figure. new system and all.

When they say the CPU is slow, I don't think they're just referring to the clock. If it was super efficient at low speeds they wouldn't be complaining.
 

IdeaMan

My source is my ass!
So wait - if there's 1GB total RAM and the OS uses like 200MB, there's 800MB left for the devs?

If, in the end, the retail unit has 2GB of RAM and the software layer requires, let's say, 300MB, then yes, they won't waste 700MB.

But there are a lot of IFs here.

The only sure thing? Devs made their games with at least 1GB of RAM.
 
Disappointed about the amount of RAM and that it's apparently pretty slow.

I think the weak CPU is really going to hold back ports from 720 and ps4.

I wouldn't make that conclusion yet based on what we're learning about those two consoles.

I thought you were the source of the 2GB figure...?

One of them. The other person I talked with, after I mentioned 2GB publicly as being legit, said (and I'm paraphrasing) "2GB is definitely on the table". That's why I haven't stated that amount conclusively.

So wait - if there's 1GB total RAM and the OS uses like 200MB, there's 800MB left for the devs?

I'm not so sure the OS will be that small.

TLDR: Wii U will rock, it's a solid and balanced system!

The real main point.
 

nordique

Member
Didn't see the Digital Foundry sidebar at first, so I was wondering where this comment came from. Have there been several reports of this, as they state?

Do you mean the 32MB eDRAM or the 1GB usable RAM for games, as per that article?

I don't think anybody (except for that Emily Rogers lady) thought there'd be 1GB of total RAM - but usable RAM for games falls roughly in line with the SDK doc leak bg's latest assassination payout?

I can't speak for others, but it was in line with what expectations have been, though on the low end

speaking of which, bg had to assassinate many Bothans to bring us that information

(ps. thanks to Ideaman for clearing that up... so it might still be in flux? I thought Wii U units would be well into production by now, which would indicate the final RAM would be set aside already... 1 whole GB for the OS is crazy, but I accept it)
 
Chû Totoro said:
The fact is that I don't know a lot about all this stuff. I have very basic knowledge, but I still want to know some basic things like the size of the RAM, the GPU, etc.
And 1GB of RAM seems a little low, so I'd rather wait for final specs.

1GB of usable system memory isn't low at all; it's amazingly great for a game console. Don't compare this to PC memory, because PCs are running Windows - Windows 7 uses 4GB of RAM all by itself (if you don't have 4GB of RAM, it is constantly accessing "virtual memory" on your hard drive or flash drive to get what it needs).
 

tkscz

Member
The meat.


CPU: The Wii U's IBM-made CPU is made up of three PowerPC cores. We've been unable to ascertain the clock speed of the CPU (more on this later), but we know out-of-order execution is supported.
RAM in the final retail unit: 1GB of RAM is available to games.
GPU: The Wii U's graphics processing unit is a custom AMD 7 series GPU. Clock speed and pipelines were not disclosed, but we do know it supports DirectX 10 and Shader Model 4 type features. We also know that eDRAM is embedded in the GPU custom chip in a similar way to the Wii.
"It's comparable to the current generation and a bit more powerful than that."

This part is making it hard to believe, for me anyway. When did it switch to a 7000 series GPU? And if so, why did it stop at DX10? For that matter, why say DX10 in the first place? A Nintendo console legally can't support DirectX or Shader Model. Something isn't stirring the Kool-Aid.
 

Aostia

El Capitan Todd
It seems that Sniper Ghost Warrior 2 could be the CryEngine 3 game in development for the Wii U.


About the RAM: they say that 1GB is for games, so I think that the total (including the part used for the OS and so on) could be bigger than 1GB.
 

nikatapi

Member
If, in the end, the retail unit has 2GB of RAM and the software layer requires, let's say, 300MB, then yes, they won't waste 700MB.

But there are a lot of IFs here.

The only sure thing? Devs made their games with at least 1GB of RAM.

You've repeatedly said "at least", so I'm wondering if there was any suggestion by Nintendo to the developers that the amount of memory for gaming purposes would change to a higher number, but not until they figured out the exact amount they needed for the OS.

Otherwise, they would be developing with 1GB as a definite target from the beginning.
 

nordique

Member
When they say the CPU is slow, I don't think they're just referring to the clock. If it was super efficient at low speeds they wouldn't be complaining.

Not necessarily. If they just "cut and paste" (and I use the term loosely) their PS3/360 code, then they will run into problems due to innate architecture differences between those systems and the Wii U.

Keep in mind, efficiency has much to do with what the code was made efficient for in the first place.
 

Jacobi

Banned
This part is making it hard to believe, for me anyway. When did it switch to a 7000 series GPU? And if so, why did it stop at DX10? For that matter, why say DX10 in the first place? A Nintendo console legally can't support DirectX or Shader Model. Something isn't stirring the Kool-Aid.

I suspected it was some sort of Radeon 7650 variant, but that chip has DirectX 11 features in its feature list, so...
 
It doesn't say anything about a 7000-series GPU...
Do you mean the 32MB eDRAM or the 1GB usable RAM for games, as per that article?
The 1GB, i.e. this: "There have been several reports that the 1GB of available RAM for developers is rather slow, so it may well be the case that the much faster eDRAM is used for much more than just the framebuffer."

Would the eDRAM make up for the "rather slow" RAM (if this is even true)?
One of them. The other person I talked with, after I mentioned 2GB publicly as being legit, said (and I'm paraphrasing) "2GB is definitely on the table". That's why I haven't stated that amount conclusively.
Sounds sort of like they're weighing up whether they need another 512MB for the added cost, or whether they can get away with 1.5GB...?
Devs made their games with at least 1GB of RAM.
Would it be more accurate to say that devs made their games within the confines of 1GB of RAM, with the view that more may or may not be unlocked or added with the final build/OS?
 

IdeaMan

My source is my ass!
You've repeatedly said "at least", so I'm wondering if there was any suggestion by Nintendo to the developers that the amount of memory for gaming purposes would change to a higher number, but not until they figured out the exact amount they needed for the OS.

Otherwise, they would be developing with 1GB as a definite target from the beginning.

I doubt it's 1GB "from the beginning"; the spec sheet leaked by vgleaks says that the amount of memory available for devs "will settle between 1GB and 1.5GB".

So the likely scenario is that devs got 1GB as a "safe/sure thing" for their games, and they will see later if Nintendo gives them more RAM according to the space the Wii U functions take.
 
So is anyone seriously still believing that the Wii U will be supported beyond the PS4/720 release? 3rd party support is pathetic enough as it is, and we're hearing the same see-through excuses again as we did in 2006.

At this point it's pretty much a sure bet that the Wii U will again be the one console that is left out of the 3rd party plan.

At least 2013 should have some support; after that I'll just get another console for 3rd party support, or stick to PC.

Hopefully at least Nintendo will release a steady stream of games this time, unlike the spotty release calendar that plagued the Wii.
 

Aostia

El Capitan Todd
What would be awesome is if your TV constantly stayed in third person view and the tablet served as your sniper scope.

well, the game is in first person, so probably it'll be like ZombiU: on the TV you have the normal first person view, while you have the scope zoom on the GamePad.
I've tried it in ZombiU and obviously it's a gimmick (you have to hold the GamePad up in front of the TV) but cool. Obviously imho. But I mean: for a normal control scheme I already have other consoles, so I hope that, especially for a game based around the scope, they'll at least offer the chance to play it this way.
 

TunaLover

Member
1GB for games and 500MB for the OS seems about right, and was reported by lherre.
I personally was expecting 1.5GB for games and 500MB for the OS, oh well...
 

nordique

Member
The 1GB, i.e. this: "There have been several reports that the 1GB of available RAM for developers is rather slow, so it may well be the case that the much faster eDRAM is used for much more than just the framebuffer."

I would say this: first, that is that person's subjective opinion. It won't hold true in all cases.

second, it will boil down to developers, but yes, it's possible the eDRAM could be used for more than just the framebuffer
 

tkscz

Member
Hey IdeaMan, can you help me with my question? What's with that weird explanation of the GPU? They say it's a custom 7000 series GPU, which would mean it supports features from OpenGL 4.X. So why did they say DX10, or even DX at all?
 
So is anyone seriously still believing that the Wii U will be supported beyond the PS4/720 release? 3rd party support is pathetic enough as it is, and we're hearing the same see-through excuses again as we did in 2006.

At this point it's pretty much a sure bet that the Wii U will again be the one console that is left out of the 3rd party plan.

At least 2013 should have some support; after that I'll just get another console for 3rd party support, or stick to PC.

Hopefully at least Nintendo will release a steady stream of games this time, unlike the spotty release calendar that plagued the Wii.

If there is money to be made, yes.

Hey IdeaMan, can you help me with my question? What's with that weird explanation of the GPU? They say it's a custom 7000 series GPU, which would mean it supports features from OpenGL 4.X. So why did they say DX10, or even DX at all?

They said AMD 7 series, so like others said, they probably mean the 700 series.
 

IdeaMan

My source is my ass!
Hey IdeaMan, can you help me with my question? What's with that weird explanation of the GPU? They say it's a custom 7000 series GPU, which would mean it supports features from OpenGL 4.X. So why did they say DX10, or even DX at all?

Maybe they were referring to the R700 chips. And all this DX talk is weird; it's a way to easily convey what set of features the GPU can support, but it won't actually use DX, as a lot of people have already explained.
 

TunaLover

Member
Also, it would be interesting to know what kind of RAM it is, as it makes a world of difference, and to know about the cache that helps take load off the RAM.
 

mrklaw

MrArseFace
So is anyone seriously still believing that the Wii U will be supported beyond the PS4/720 release? 3rd party support is pathetic enough as it is, and we're hearing the same see-through excuses again as we did in 2006.

At this point it's pretty much a sure bet that the Wii U will again be the one console that is left out of the 3rd party plan.

At least 2013 should have some support; after that I'll just get another console for 3rd party support, or stick to PC.

Hopefully at least Nintendo will release a steady stream of games this time, unlike the spotty release calendar that plagued the Wii.

yes. The 360 and PS3 will still get games for at least a couple of years after the PS4/720 come out, and the Wii U is in good shape to at least be included in the development of those games. So you'll have two tiers, and depending on the performance of the Wii U (both technically and in the marketplace), publishers will either look for a down-port from PS4/720, or a side/slight up-port from PS3/360.
 

Van Owen

Banned
Well, the article does seem to confirm that the system is a lot closer to current gen than it is to PS4 and 720, which a lot of people in the speculation thread did not think was the case.
 
If there is money to be made, yes.

There was money to be made with the Wii as well, and we all saw that many 3rd parties willingly excluded the Wii market from their main portfolio, using the Wii as a garbage dump to finance their main business.

I for one now expect the same strategy again, with some occasional stand-out efforts every 2 years or so.
 

nordique

Member
Hey IdeaMan, can you help me with my question? What's with that weird explanation of the GPU? They say it's a custom 7000 series GPU, which would mean it supports features from OpenGL 4.X. So why did they say DX10, or even DX at all?

Not directed at me,

but honestly I think it's two things:

1) he's a reporter, and reporters don't always understand what they talk about

2) reporters also have to translate what sources tell them in a way that people understand

he did use the terminology "DX type" and not flat-out "DX"

I think he was trying to give a "layman" perspective on what it could do, but didn't understand what he was reporting at the same time.
 