
The iPhone 7 will be the most powerful gaming system any Nintendo game has run on

PFD

Member
I found this:

[Image: iphone-7-2014.jpg]


Source: Geeks are Sexy

How is this relevant to the discussion?
 

DangerMan

Banned
Yeah, that's always the problem. Companies aren't going to invest as much in games they charge 5-10 dollars for as in 60 dollar new games.

Plus controls: not many people have something like this, and touchscreen-only input limits the scope of the game.

[Image: gamevicemainimage-800x434.jpg]

^basically the NX?

Valve's NX?
 

LordOfChaos

Member
Two of the cores are performance cores and the other two are power-saving cores, with only two cores visible at a time. It's not clear whether the scheduler does cluster switching or IKS, but it is clearly not Heterogeneous Multi-Processing (HMP).

Interestingly, it's not big.LITTLE at all. But I'm not sure what difference he means by shared caches, unless it's a shared L1, because big.LITTLE already had shared caches and coherency. But reportedly it switches faster than big.LITTLE.

Since there were so many broken big.LITTLE implementations, it makes sense that Apple waited and made their own. Can't wait for someone to dig into why their version is better and what the little cores consist of.
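For anyone who hasn't followed the terminology: in cluster switching the OS only ever sees one cluster, and the entire workload migrates between the little pair and the big pair when load crosses a threshold, whereas HMP exposes every core at once. A toy sketch of the cluster-switching idea (purely illustrative; the thresholds and logic are made up, not Apple's or ARM's actual scheduler):

Code:
#include <stdio.h>

/* Toy model of cluster switching: the OS sees exactly one two-core
   cluster at a time, and crossing a load threshold migrates all work
   to the other cluster. Thresholds are invented for illustration. */
typedef enum { LITTLE_CLUSTER, BIG_CLUSTER } cluster_t;

static cluster_t pick_cluster(cluster_t current, double load)
{
    const double up   = 0.80;  /* switch up when this busy   */
    const double down = 0.30;  /* switch down when this idle */

    if (current == LITTLE_CLUSTER && load > up)
        return BIG_CLUSTER;     /* migrate to the fast cores           */
    if (current == BIG_CLUSTER && load < down)
        return LITTLE_CLUSTER;  /* drop back to the power-saving cores */
    return current;             /* hysteresis: otherwise stay put      */
}

int main(void)
{
    cluster_t c = LITTLE_CLUSTER;
    const double loads[] = { 0.10, 0.90, 0.50, 0.20 };
    for (int i = 0; i < 4; i++) {
        c = pick_cluster(c, loads[i]);
        printf("load %.2f -> %s cluster\n", loads[i],
               c == BIG_CLUSTER ? "big" : "LITTLE");
    }
    return 0;
}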

https://twitter.com/PatrickMoorhead/status/773591957234851840

https://twitter.com/PatrickMoorhead/status/773591146681380864

Isn't the Wii U GPU at over 300 GFLOPs?


Nah, 160 shaders, 176 GFLOPS.
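(For reference, that figure is just the standard shader math. Assuming the commonly cited numbers for the Wii U's GPU, 160 ALUs at 550MHz doing 2 FLOPs per ALU per cycle, a quick sanity check:)

Code:
#include <stdio.h>

int main(void)
{
    /* Commonly cited Wii U "Latte" GPU figures (assumptions):
       160 shader ALUs, 550MHz clock, 2 FLOPs per ALU per cycle (FMA). */
    const double alus            = 160.0;
    const double clock_ghz       = 0.550;
    const double flops_per_cycle = 2.0;

    /* 160 * 2 * 0.55 = 176 GFLOPS */
    printf("%.0f GFLOPS\n", alus * flops_per_cycle * clock_ghz);
    return 0;
}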
 

DangerMan

Banned
176. Different architecture anyway, so a direct comparison is pointless.

I thought it could run at double that without over-volting or any risk of CPU/GPU damage

EDIT: Might require safe overclocking but on a home console that doesn't hurt battery life, only power bill
 
Does it really matter when the game actually looks worse than New Super Mario Bros. U?

And couldn't we say the same thing about Miitomo and Pokémon Go, if you count it as a Nintendo property?
 

LordOfChaos

Member
I thought it could run at double that without over-volting or any risk of CPU/GPU damage

EDIT: Might require safe overclocking but on a home console that doesn't hurt battery life, only power bill

Where'd you get that? It's built on a 45nm process and runs at 550MHz. Running at 1100MHz in the Wii U's form factor on 45nm...

[Image: slide011.jpg]


Sounds like a silly fan theory someone spread on the internet, like the future-overclock theory that came from the PSU having some over-provisioning.
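The power math alone kills it: dynamic power scales roughly with C * V^2 * f, so doubling the clock at least doubles power, and a 45nm chip would almost certainly need a voltage bump on top of that. A toy calculation (the wattage and voltage numbers are illustrative guesses, not measured Wii U figures):

Code:
#include <stdio.h>

int main(void)
{
    /* Rough dynamic-power model: P ~ C * V^2 * f.
       All numbers are illustrative, not measured Wii U values. */
    const double base_power_w = 33.0;  /* ballpark Wii U wall draw      */
    const double freq_scale   = 2.0;   /* 550MHz -> 1100MHz             */
    const double volt_scale   = 1.2;   /* plausible bump for a 2x clock */

    /* 33 * 2 * 1.2^2 = ~95W, way past the console's cooling and PSU */
    printf("%.0f W\n", base_power_w * freq_scale * volt_scale * volt_scale);
    return 0;
}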
 
If you ask me, Nintendo should give up making hardware in general and just make games for the more powerful hardware.
This thread sucks.
 

elohel

Member
Not always true, as they bloat the OS revisions, which bogs down the speed to some degree.
A lot of iOS features go wasted.

Talking about new hardware, the question posed (unless I misunderstood) is why the power of the new processors is appealing.

What do you mean OS revisions and bloat?

Edit: btw this thread is basically an excuse to dump on iPhones lol. Is anyone actually looking at the posts? Commenting on losing a small device, wires, and Android features? What does that have to do with hardware, Nintendo, and the benefits?
 

ethomaz

Banned
Can't wait for this meme to die. For pretty much anything other than GPU workloads, the Jaguar will destroy last gen CPUs.
Jaguar is a last gen mobile CPU.

It won't destroy anything lol

It is weaker than Intel Pentium M released in 2003... It is weaker than Intel Atom.

Edit - Fixed for accuracy.
 
The title of the thread is already fanboy nonsense.

I'll bite.

Come back when your Nexus 6 has acceptable read/write performance.

[Image: 78201.png]

[Image: 78202.png]


Just to make it clear, I'm not an Apple fanboy - I've owned and used phones that are primary devices across webOS (HP Pre 3, Palm Pre 2), Windows Phone 7 (HTC Radar), Windows Phone 8 (Nokia Lumia 820, Lumia 620), Windows Phone 8.1 (HTC 8X), BlackBerry 10 (BlackBerry Z10), Android 4.4 (Motorola Moto X) and iOS (iPhone 5s) over the last 5 years, so I think I have a fair say when discussing software and hardware across devices these days.
 

LordOfChaos

Member
Jaguar is a last gen mobile CPU.

It won't destroy anything lol

It is weaker than Intel Pentium M released in 2003.

Not modern PC processors for sure. Not even old ass PC processors. But last gen console CPUs, which guy 1 was comparing it to? Huge pipeline, huge pipeline flush penalty, nearly no branch prediction, no prefetchers, last gen console CPUs?

The Cell could go places if you put a whole lot of manual work in, and on paper, sure, more Gflops than 6.5 Jaguar cores. But people hugely understate the scale-up in software complexity it brought to make up for all of the above. Take this code for a breadth-first search over a graph:

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
 
/* ... */
 
/* the graph */
vertex_t * G;
 
/* number of vertices in the graph */
unsigned card_V;
 
/* root vertex (where the visit starts) */
unsigned root;
 
void parse_input( int argc, char** argv );
void graph_load( void );
 
int main(int argc, char ** argv)
{
  unsigned *Q, *Q_next, *marked;
  unsigned  Q_size=0, Q_next_size=0;
  unsigned  level = 0;
 
  parse_input(argc, argv);
  graph_load();
 
  Q      = 
          (unsigned *) calloc(card_V, sizeof(unsigned));
  Q_next = 
          (unsigned *) calloc(card_V, sizeof(unsigned));
  marked = 
          (unsigned *) calloc(card_V, sizeof(unsigned));
 
  Q[0] = root;
  Q_size = 1;
  marked[root] = TRUE; /* mark the root so it is never re-enqueued */
  while (Q_size != 0)
    {
      /* scanning all vertices in queue Q */
      unsigned Q_index;
      for ( Q_index=0; Q_index<Q_size; Q_index++ )
      {
        const unsigned vertex = Q[Q_index];
        const unsigned length = G[vertex].length;
        /* scanning each neighbor of each vertex */
        unsigned i;
        for ( i=0; i<length; i++ )
          {
            const unsigned neighbor =
              G[vertex].neighbors[i];
            if ( !marked[neighbor] )
              {
                /* mark the neighbor */
                marked[neighbor]      = TRUE;
                /* enqueue it to Q_next */
                Q_next[Q_next_size++] = neighbor;
              }
          }
      }
      level++;
      unsigned * swap_tmp;
      swap_tmp    = Q;
      Q           = Q_next;
      Q_next      = swap_tmp;
      Q_size      = Q_next_size;
      Q_next_size = 0;
    }
  return 0;
}

60 lines of source code for any general-purpose processor.

1200 lines of code for Cell. Twelve fricking hundred.
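Most of that blow-up is bookkeeping: an SPE can only touch its 256KB local store, so every data structure has to be streamed in and out by hand over DMA, usually double-buffered so transfers hide behind compute. A minimal sketch of that pattern (this assumes the Cell SDK's spu_mfcio.h intrinsics; CHUNK and process_chunk are illustrative stand-ins, not from the paper's code):

Code:
#include <spu_mfcio.h>

#define CHUNK 4096  /* bytes per DMA transfer; illustrative size */

extern void process_chunk(unsigned char *p, unsigned len); /* stand-in kernel */

/* Double-buffered streaming: fetch chunk i+1 from main memory
   while processing chunk i, hiding DMA latency behind compute. */
void stream_process(unsigned long long ea, unsigned n_chunks)
{
    static unsigned char buf[2][CHUNK] __attribute__((aligned(128)));
    unsigned cur = 0;

    mfc_get(buf[cur], ea, CHUNK, cur, 0, 0);       /* prime first buffer  */
    for (unsigned i = 0; i < n_chunks; i++) {
        unsigned nxt = cur ^ 1;
        if (i + 1 < n_chunks)                      /* prefetch next chunk */
            mfc_get(buf[nxt], ea + (i + 1) * CHUNK, CHUNK, nxt, 0, 0);

        mfc_write_tag_mask(1 << cur);              /* wait for current DMA */
        mfc_read_tag_status_all();

        process_chunk(buf[cur], CHUNK);
        cur = nxt;
    }
}

And that's before partitioning the graph across SPEs, SIMD-izing the inner loop, and coordinating the queues between cores.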



I agree with guy 1 here: Jaguars are far preferable to last gen. The Cell would only edge them out in corner cases, for a lot more work, and in cases where GPGPU is far better now anyway.
 

nkarafo

Member
I'll bite.

Come back when your Nexus 6 has acceptable read/write performance.
This isn't a thread about iPhone vs other Mobile tech.

This is supposed to be a thread about iPhone vs consoles, Nintendo ones in particular.

There are still no relevant benchmarks in this thread.
 
The closest we'll get to 1080p gaming from Nintendo (on the Plus line).

I'm excited to see where they head with the mobile market – and how they deal with Android too.
 

mario_O

Member
It's coming to Android too. And the Galaxy S7, on paper, is clearly more powerful. Maybe the OnePlus 3 too. Son, I don't think so, OP. :p
 
This isn't a thread about iPhone vs other Mobile tech.

This is supposed to be a thread about iPhone vs consoles, Nintendo ones in particular.

There are still no relevant benchmarks in this thread.

Yeah, I was just cherry-picking like he was, and chose a completely random factor (though one that actually affects the real-world behaviour of the device in a positive way...).
 
But it's not advanced enough to have beaten home consoles in performance and the graphical quality of its games.

Apple - and most mobile SoCs - have serious memory speed and bandwidth issues versus conventional consoles, and apps have pretty strict size limits. So while you can do pretty amazing things, the market doesn't lean in that direction, because there is way more money in games that are small enough to download over LTE and cheap enough that they don't cost $25 up front.

Mobile stands to see the biggest gaming-related GPU gains when it can start taking advantage of HBM (or when Apple allows for 10GB+ games...).
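To put rough peak numbers on the bandwidth gap (the phone config below is the commonly reported one rather than an official spec, so treat it as an assumption): peak bandwidth is just bus width times transfer rate.

Code:
#include <stdio.h>

/* Peak bandwidth in GB/s = bus width in bytes * transfer rate in GT/s */
static double peak_gbps(double bus_bytes, double gt_per_s)
{
    return bus_bytes * gt_per_s;
}

int main(void)
{
    /* Commonly reported configurations (assumptions, not official specs) */
    printf("iPhone 7 (64-bit LPDDR4-3200):  %5.1f GB/s\n",
           peak_gbps(8.0, 3.2));   /*  25.6 GB/s */
    printf("PS4 (256-bit GDDR5 @ 5.5 GT/s): %5.1f GB/s\n",
           peak_gbps(32.0, 5.5));  /* 176.0 GB/s */
    return 0;
}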
 

Moreche

Member
I gave up on mobile games about a year ago when I sold my iPhone 6S, Apple TV 4, and iPad Pro. I went back to Windows and Android and have yet to play one game on my phone; I also went back to console gaming.
These Apple mobile devices may have the most advanced mobile processors ever, but gaming on them doesn't mean shit when it kills the battery dead in no time.
Personally, I fell for the whole "Apple TV and mobile will kill consoles" idea, but I now realise it won't happen until battery technology at least catches up, and that's not going to happen either, because they keep making them thinner.
But I do agree that I'm glad Sony and Nintendo are releasing games on mobile, because there's a lot less risk and more reward on mobile.
 

marmoka

Banned
I'll bite.

Come back when your Nexus 6 has acceptable read/write performance.

[Image: 78201.png]

[Image: 78202.png]


Just to make it clear, I'm not an Apple fanboy - I've owned and used phones that are primary devices across webOS (HP Pre 3, Palm Pre 2), Windows Phone 7 (HTC Radar), Windows Phone 8 (Nokia Lumia 820, Lumia 620), Windows Phone 8.1 (HTC 8X), BlackBerry 10 (BlackBerry Z10), Android 4.4 (Motorola Moto X) and iOS (iPhone 5s) over the last 5 years, so I think I have a fair say when discussing software and hardware across devices these days.

As I mentioned in my first post, I just found that image. And I'm not a Nexus fanboy; I don't even own a Nexus. If that image is wrong, complain to the website, not me.
 