
Apple Unleashed | Oct 18, 2021

Panajev2001a

GAF's Pleasant Genius
I've been reading some of the freshly published docs on this chip, and the M1 Max is specced for a maximum of 18 TFLOPS at a 55-watt maximum draw.

For console comparison, the PS5 is 10.5 TFLOPS at around 340 watts (but that's for the whole system), so both the M1 Pro and M1 Max surpass the PS5 in terms of raw graphics performance.

Now let's look at the desktop-grade RTX 3080 card, which can perform 29 TFLOPS at 320 watts.

As a thought exercise, let's say Apple could scale this chip to draw 320 watts for the desktop Pro systems, which would equate to just over 100 TFLOPS. Terrifying. I know chips can't just be "scaled up" and have to be redesigned around thermal loads and power draws, but I'm pretty confident that whatever Apple does SoC-wise with their pro lineup will be considerably more powerful than the best dedicated PC GPU on the market. Sucks that they don't embrace Vulkan and instead lock that power behind Metal, leaving MoltenVK as the workaround.
18 TFLOPS for the GPU at 55 Watts? Mmmh… like PS3 was 2 TFLOPS for the GPU?
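(Whatever the real figure turns out to be, the projection in the quoted post is just linear power-to-performance extrapolation. A minimal sketch of that arithmetic, assuming perfectly linear scaling, which real silicon never delivers:)

```python
# Naive linear extrapolation using the quoted post's own numbers.
# Clocks, voltage, and memory bandwidth all scale non-linearly, so
# treat the result as a theoretical ceiling, not a prediction.
tflops_at_55w = 18.0   # the quoted (and disputed) M1 Max figure
target_watts = 320.0   # an RTX 3080-class desktop power budget

scaled = tflops_at_55w * (target_watts / 55.0)
print(f"{scaled:.1f} TFLOPS")  # ~104.7, i.e. "just over 100 TFLOPS"
```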
 

Papacheeks

Banned
Yeah, I don't agree with this.

"Way more for the money" ... there is no equivalent to what Apple is doing on the PC side. How do you replicate what they just announced? Yeah there are PC specific workflows that you can't do on Mac if you need Nvidia cards, sure. Have you seen how many hi-res monitors you can drive with these laptops? Where are you going to get the CPU performance this laptop has on the PC (at all) and do so with its thermal envelope?

It doesn't exist.

I personally think, as a result, your definition of "does more" is tenuous at best. These laptops are new benchmarks across the entire computing world.

If these M processors were made to work in other OS environments, that would be correct. But they are literally engineered for Apple's Unix-based OSes (macOS/iOS) and their applications. Non-Apple software is still up in the air, since these chips are still relatively new, and non-Apple/Adobe encoding software still isn't tested the way its PC counterparts are. So no, according to these leaked benchmarks from literally a couple of weeks ago, the numbers they throw at you are specific numbers for specific tasks, which is what the M1/M1X chips were made for.

Intel/AMD make CPUs that are varied in purpose and use case, which is why in these leaked charts the M1X stacks up well in Cinebench and video encode benchmarks within Apple's own environment.

But if you compare them to things outside of Apple, they still don't dominate the playing field like you state.

Single core tests:

[benchmark chart: single-core scores]

Multicore tests:

[benchmark chart: multi-core scores]



And that's against an AMD APU, NOT any of their flagship Ryzen or Threadripper parts.

If you configure a Lenovo or MSI workstation with a combo of the latest Intel Xeon or Ryzen 9 and an Nvidia or Radeon render card, the performance differs depending on the application. AND since the M1/X chips are only on Apple's platform, it's hard to say how they stack up in more Windows/business-based applications.
 
Last edited:

Represent.

Represent(ative) of bad opinions
I'm about to buy one...

Not sure if I should get the 14" with the M1 Max Chip or the 16" with the M1 Pro chip

32GB in both, both are the same price.

Larger screen or better chip.

Gonna be used mainly for Photoshop and Lightroom.

Someone help me decide
 

Topher

Gold Member
I'm about to buy one...

Not sure if I should get the 14" with the M1 Max Chip or the 16" with the M1 Pro chip

32GB in both, both are the same price.

Larger screen or better chip.

Gonna be used mainly for Photoshop and Lightroom.

Someone help me decide

Are you working with just the MacBook and no external monitor? Without an external monitor I think I'd rather have the 16". I use an external monitor as my primary display so I tend to go with smaller laptop screen sizes and boost up the chips and RAM.

Not sure if that helps.
 

Certinty

Member
For around £1300 I would have traded my M1 Pro in and happily put in the extra £600 or so, but for £1900? No chance. Looks amazing though.
 

Durask

Member
So will Mac become a viable gaming system with these chips?

I mean "viable" as in "variety of games reasonably close to what I can now get on Steam".
 

Topher

Gold Member
So will Mac become a viable gaming system with these chips?

I mean "viable" as in "variety of games reasonably close to what I can now get on Steam".

No, primarily because very few games are ported to Mac and even fewer to M1. The M1 Pro and M1 Max are aimed at the professional market. I don't think this is going to sway game developers towards Mac even a little, especially at these prices.
 
Last edited:

Dream-Knife

Banned
I've been reading some of the freshly published docs on this chip, and the M1 Max is specced for a maximum of 18 TFLOPS at a 55-watt maximum draw.

For console comparison, the PS5 is 10.5 TFLOPS at around 340 watts (but that's for the whole system), so both the M1 Pro and M1 Max surpass the PS5 in terms of raw graphics performance.

Now let's look at the desktop-grade RTX 3080 card, which can perform 29 TFLOPS at 320 watts.

As a thought exercise, let's say Apple could scale this chip to draw 320 watts for the desktop Pro systems, which would equate to just over 100 TFLOPS. Terrifying. I know chips can't just be "scaled up" and have to be redesigned around thermal loads and power draws, but I'm pretty confident that whatever Apple does SoC-wise with their pro lineup will be considerably more powerful than the best dedicated PC GPU on the market. Sucks that they don't embrace Vulkan and instead lock that power behind Metal, leaving MoltenVK as the workaround.
You can't really compare TFLOPS across architectures. A 3070 Ti is 21 TFLOPS, yet it's outperformed (by ~2%) by the 16-TFLOPS RX 6800.
Where did they get that number from?
 

Futurematic

Member
Odd choices
-HDMI 2.0 instead of 2.1 (Apple TV has 2.1)
-Notch which is fine with me but why no Face ID?
-worse camera than the iPad Pro's, with no ultrawide lens and no Center Stage feature
-baseline 14” doesn’t have a charging brick good enough to support fast charging (20 bucks to upgrade to the fast-charge brick, 50% in 30 minutes, used on all the other 14” models)

All seems like nickel-and-dime bullshit on a laptop this expensive, although who knows what odd technical problems they might have run into. Maybe they screwed up the HDMI interface and couldn't do 2.1, maybe Face ID sucks at the distance you normally use a laptop at. All probably solved next version, 2023 lol.

Other than that though, fantastic and even better than I expected. Love the full-size (rare if ever for Apple notebooks) instead of half-height function keys. Top performance, as noted above, isn't better than the competition, unless you count power use, which is less than half to achieve the same result lol

That said, I don't need this on my lap unless work funds it, so I'll wait for the 2022 MacBook M2, hopefully in orange and with a nearly-as-good screen.
 
The only things that remain on my wishlist for a perfect MacBook, hardware-wise:

1.) Face ID
2.) No notch. I guess under-screen cameras will need to improve drastically first.
3.) Touch screen with Apple Pencil support.
4.) HDMI 2.1
5.) Center Stage on the webcam would be nice.

Software-wise, I really want better video game support and commitment to the industry from Apple.

All that being said, I’m more of a tablet / phone guy atm for mobile devices. I prefer the iMac, Mac Pro, and Mac mini form factors if I were considering an upgrade. An iMac with the features listed above, with an easel mode for Apple Pencil support, would be a day one buy. If they made an external display like that, I’d get the Mac Pro or mini with the display depending on cost. iPad mini and iPhone for the road, Apple TV / PS5 for the living room. Mac / my custom built PC for desktop gaming.
 
Last edited:

Dr Bass

Member
If these M processors were made to work in other OS environments, that would be correct. But they are literally engineered for Apple's Unix-based OSes (macOS/iOS) and their applications. Non-Apple software is still up in the air, since these chips are still relatively new, and non-Apple/Adobe encoding software still isn't tested the way its PC counterparts are. So no, according to these leaked benchmarks from literally a couple of weeks ago, the numbers they throw at you are specific numbers for specific tasks, which is what the M1/M1X chips were made for.

Intel/AMD make CPUs that are varied in purpose and use case, which is why in these leaked charts the M1X stacks up well in Cinebench and video encode benchmarks within Apple's own environment.

But if you compare them to things outside of Apple, they still don't dominate the playing field like you state.

Single core tests:

[benchmark chart: single-core scores]

Multicore tests:

[benchmark chart: multi-core scores]



And that's against an AMD APU, NOT any of their flagship Ryzen or Threadripper parts.

If you configure a Lenovo or MSI workstation with a combo of the latest Intel Xeon or Ryzen 9 and an Nvidia or Radeon render card, the performance differs depending on the application. AND since the M1/X chips are only on Apple's platform, it's hard to say how they stack up in more Windows/business-based applications.

Considering an "M1X" was not announced, and instead we got M1 Pro and Max (stupid naming, but whatever), let's see what the top end actually benchmarks at when the devices are released.

Also, you're comparing Apple's new MOBILE CHIP to Intel's flagship desktop chips. You do understand that part, right? And even if Apple doesn't improve on these tests, Apple is hanging with Intel's flagship at a lower power draw in a thin laptop.

The desktop stuff isn't even released yet. So yeah... Apple is curb-stomping everyone right now. I don't even wish this were the case; I think Apple has turned incredibly arrogant and greedy, but reality is reality. Denying it doesn't do anyone any favors.
 
Last edited:
Apple is just trolling their user base now. A notch on a laptop screen. Now I really have seen everything.

I'm happy with my 13" M1 MBP. The screen has no notch, and honestly I've come to like the Touch Bar. I'm not feeling any compelling urge to upgrade anytime soon.
 

Panajev2001a

GAF's Pleasant Genius
You can't really compare TFLOPS across architectures. A 3070 Ti is 21 TFLOPS, yet it's outperformed (by ~2%) by the 16-TFLOPS RX 6800.

Where did they get that number from?
They started counting equivalent performance from fixed-function units instead of only from the shader ALUs (which would have been in the 200 GFLOPS-ish range; I forget the exact number though).
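(For context: the figure people actually compare is peak programmable-shader throughput, usually computed as ALU count × 2 ops per clock (a fused multiply-add) × clock speed. A quick sketch using commonly cited public shader counts and boost clocks, so treat the inputs as approximate:)

```python
def theoretical_tflops(alus: int, clock_ghz: float, ops_per_clock: int = 2) -> float:
    """Peak FP32 throughput: ALUs x ops per clock (2 for an FMA) x clock."""
    return alus * ops_per_clock * clock_ghz / 1000.0

# Commonly cited shader configurations (approximate boost clocks):
print(theoretical_tflops(8704, 1.71))   # RTX 3080 -> ~29.8 TFLOPS
print(theoretical_tflops(3840, 2.105))  # RX 6800  -> ~16.2 TFLOPS
```

By that metric the PS3's RSX lands in the low hundreds of GFLOPS; the "2 TFLOPS" figure only appears once you count fixed-function hardware as well.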
 

UnNamed

Banned
These numbers, 8 cores, 10 cores, 32 cores, are meaningless the way Apple promotes them, with "X times more powerful" claims that never change. Even benchmarks are pretty useless, something people should have learned back in the '90s, but for some reason they are still used to throw numbers around.

The only metric that matters is real applications, meaning actual programs and games. I will wait for real tests.

Edit: the M1 Max has a 32-core, 10.4 TF GPU. On par with a 3060 Max-Q, am I right? Not that powerful after all.
I really don't understand why Apple still puts power-efficient CPUs in the Pro line. For long-term battery life you already have the MacBook Air and the base Pro, so I think it's useless to have a Pro device that isn't pushed to its limit because it has to last 10 hours.
 
Last edited:

Papacheeks

Banned
Considering an "M1X" was not announced, and instead we got M1 Pro and Max (stupid naming, but whatever), let's see what the top end actually benchmarks at when the devices are released.

Also, you're comparing Apple's new MOBILE CHIP to Intel's flagship desktop chips. You do understand that part, right? And even if Apple doesn't improve on these tests, Apple is hanging with Intel's flagship at a lower power draw in a thin laptop.

The desktop stuff isn't even released yet. So yeah... Apple is curb-stomping everyone right now. I don't even wish this were the case; I think Apple has turned incredibly arrogant and greedy, but reality is reality. Denying it doesn't do anyone any favors.

My point regardless still stands. They can only curb-stomp in apps within their own ecosystem. Outside of that there are shit tons of uses for CPUs/GPUs.

And for 6th-gen laptop Ryzen, AMD announced they are doubling core counts across the board.
So again, these new chips may be the king in Final Cut, audio production, and Adobe apps. But those are all production-specific, and that doesn't scale across other programs like DaVinci Resolve, ZBrush, Maya, CAD, etc.

There is a much larger spectrum of uses for CPUs/GPUs, and there are literally no real-world benchmarks yet.
Also, these chips are the same ones going into the iMac, if I'm not mistaken?

EDIT:

Marques Brownlee says what I was saying: when it comes to specific workflows like Final Cut, these chips are a huge deal. But in outside applications it's case-by-case, depending on whether the app is designed or optimized to take advantage of the chips.



He literally talks about these chips compared to Intel Mac chips, as in a Mac-only environment. Which was my entire point, because your blanket statement was just plain wrong when we're talking about all available CPUs in the laptop space outside of macOS.
 
Last edited:

Deku Tree

Member
Weird that they didn't include HDMI 2.1, but if I'm going to use an external display in my office or at home then I'd plug it into the TB port. I guess the HDMI port is for plugging into a TV or a projector at work, in a hotel, or something. I wonder if the lack of HDMI 2.1 had to do with thermals. Maybe you can get HDMI 2.1 with a TB dongle? IDK.
 

Azurro

Banned
I was waiting for the reveal, but after those prices (2250 euro for the 14"), I'm just going to get the Air M1 with 16 GB. Does anyone know of a good USB-C hub that outputs 4K@60? Every dongle I'm seeing seems to be limited to 4K@30.
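(The 4K@30 ceiling is usually just bandwidth: cheap hubs put an HDMI 1.4 port behind the USB-C connector. A rough sketch of the raw pixel payload involved, ignoring blanking overhead, with the commonly cited HDMI data rates as reference points:)

```python
# Raw 8-bit RGB payload for a given display mode, ignoring blanking intervals.
def gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(f"4K@30: {gbps(3840, 2160, 30):.1f} Gbps")  # ~6.0  -> fits HDMI 1.4 (~8.2 Gbps data rate)
print(f"4K@60: {gbps(3840, 2160, 60):.1f} Gbps")  # ~11.9 -> needs HDMI 2.0 (~14.4 Gbps data rate)
```

So for 4K@60 you want a hub that explicitly advertises an HDMI 2.0 port (or uses DisplayPort alt mode).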
 

jaysius

Banned
It wouldn't go with Apple's focus on privacy and not selling your user data. Which is exactly the reason many people choose Apple and pay a premium vs. Android.
All that user data they could sell... even more money left on the table.

I couldn't care less if anyone knows my listening habits; Spotify is already selling that data.
 

Hari Seldon

Member
I haven't purchased a Mac in a long time now that I daily drive Linux. But I kind of wanted one just for GarageBand. Can you dual-boot ARM Linux on these?
 
Last edited:

Bitmap Frogs

Mr. Community
I haven't purchased a Mac in a long time now that I daily drive Linux. But I kind of wanted one just for GarageBand. Can you dual-boot ARM Linux on these?

I daily drive Linux but honestly I find macs very nice looking.

If only Apple released their drivers and allowed Boot Camp-style Linux installation…
 

LordOfChaos

Member
Anyone 'member how AMD was on the HSA (this new magic unified memory) train so long ago with Kaveri (or earlier?), but didn't have the clout to really gain adoption?


I 'member. Wonder where we would be if it did and became a PC standard years ago.
 

sendit

Member
Oh no Face ID?
I did not read the info in detail and just assumed the reason for the notch was Face ID…

No Face ID is lame
Yep. Pointless notch if Face ID isn't there. The next revision will probably include Face ID (next year).
 