Something I've always wondered about, but never confirmed (and probably won't until I get my hands on this thing), is the way this interface could one-up every other touch screen interface.
Traditional touch screens (capacitive or otherwise), as far as I know, rely on you touching the on-screen buttons (duh). But you have to be accurate. What I'm thinking (or maybe just hoping) is that Apple really thought about the software behind that. The keys look really small - I wouldn't be surprised if they had algorithms that combine spellcheck, the raw hardware data, and other software features to accurately gauge which key you meant to press, instead of judging it by the hardware reading alone like traditional interfaces do.
Say you're trying to spell 'the': you actually hit 'y', then 'h', then something the hardware can't tell is 'w' or 'e' - but everything tells the software that you tried to spell 'the', so it changes it to that on the fly. Like predictive spelling COMBINED with smart hardware.
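Just to make the idea concrete, here's a toy sketch of how that kind of correction could work. This is pure speculation, not how Apple actually does it: every name here (the tiny `DICTIONARY`, the partial `NEIGHBOURS` map, the `correct` function) is made up for illustration. Each touch yields a set of candidate keys (the key you hit plus its neighbours on the layout), and the software picks whatever dictionary word is consistent with the whole sequence.

```python
# Hypothetical sketch of dictionary-assisted touch correction.
# A real keyboard would use a huge dictionary, word frequencies,
# and actual touch coordinates; this just shows the principle.

# Tiny dictionary of known words.
DICTIONARY = {"the", "thy", "toe", "tie"}

# Partial map of neighbouring keys on a QWERTY layout.
NEIGHBOURS = {
    "y": "tuh",
    "h": "gjyn",
    "w": "qes",
    "e": "wrd",
    "t": "ryg",
}

def candidates(key):
    """Keys the user might have meant when they hit `key`."""
    return key + NEIGHBOURS.get(key, "")

def correct(typed):
    """Return dictionary words consistent with the typed key sequence."""
    matches = []
    for word in DICTIONARY:
        if len(word) != len(typed):
            continue
        # Every letter of the word must be a plausible hit for
        # the corresponding key the user actually touched.
        if all(w in candidates(k) for w, k in zip(word, typed)):
            matches.append(word)
    return matches

# User aimed at 't','h','e' but actually hit 'y','h','w':
print(correct("yhw"))  # → ['the']
```

So even though the hardware reported 'y', 'h', 'w', the only word that fits those touches is 'the', and the software can swap it in on the fly.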
That kind of thing would make it the next step in touch screen interfaces, where it becomes seamless to the user, and they don't have to be as accurate.
Of course, it could just be a standard touch interface, in which case it could be a little annoying. For the record, I like the idea behind the iPod, but in its actual use I don't like it so much - it's too sensitive, and I end up touching the wheel when I go to hit the centre button. Maybe I just have fat thumbs.