New Apple M1 CPU

Shared memory was how things were done on many personal computers (Amigas, STs) back in the day. It places a premium on memory management, which seems to be much more of a bugaboo today than it was 35 years ago. Even after the widespread application of hardware-based memory management, RAM is still being stomped on an all-too-frequent basis. I suspect this has a lot to do with compilers trying to take responsibility for things that used to be the programmer's responsibility.
 
The CDC mainframes we had at Purdue in the 1970s had everything in memory, and peripheral processors tapped into the memory pool to move data out to printers, tapes, and disks. The OS ran on a PP as well, basically managing a list of running programs and telling the CPUs where in memory their instructions were located.
 
Memory leaks. Throws me back to QEMM. I was the go-to guy for that. Loved it. Hated its demise. But of course, things got better. Remember the 80286 and protected mode (one way in, no way back out)?
 
The CDC mainframes we had at Purdue in the 1970s had everything in memory, and peripheral processors tapped into the memory pool to move data out to printers, tapes, and disks. The OS ran on a PP as well, basically managing a list of running programs and telling the CPUs where in memory their instructions were located.
We had a 7300 and it was pretty much one job at a time (submitted as card decks). The printer could push out a case of paper in under a minute. Shutting the printer down on a runaway job was perhaps the most exercise the attendants ever got.

60-bit words were something that took some serious getting used to, having come from Z80, 6502 and 6809 microcomputers.
 
I was all ready to order the MacBook Air with 16 GB for my new daily driver when it struck me: what about Remote Access for work? Not only is it a different processor, but there are the macOS Big Sur restrictions on kernel extensions. Unless I could convince Corporate IT to update their VPN just for little ol' me, I'd have to keep my old MacBook Pro for remote access instead of trading it in, requiring more deliberation about my purchase.

I’ve notified the powers that be that until we can test the new VPN under Big Sur, any Mac users need to hold off upgrading to Big Sur or purchasing a new M1 MacBook for remote access.

Edit: it looks like VPN support is available for Big Sur as well as Apple Silicon. Now I just have to request our Corporate IT download the latest version so I can test it out.
 
Not M1-related, but I updated my late-2016 MacBook Pro to Big Sur 11.0.1 Sunday. My Corporate IT did me a solid by putting the new VPN Client for macOS up on the concentrator and my Macs updated automatically when I connected them.

Since it's all done with System Extensions instead of kexts, I did need to go into the Security Settings and approve the new way of doing things. No problem, and it worked fine when I used it to check on a few things after the security updates to the servers Saturday night.
 
I am a Mac developer (since 1992). The new M1 Macs require macOS Big Sur. Unfortunately, Big Sur breaks a lot of apps!

It breaks my app too (Natural Scene Designer, www.naturalgfx.com). I had to put an incompatibility notice on the front page of my website.

In Big Sur, Apple changed a lot of things, including the way programs draw to the screen. This is the part where I am struggling. macOS used to allow programs to draw to the screen at will. Now they can draw only when Apple says it's OK to draw. This requires a major rewrite of my code.

For example, when you dynamically resize something using dotted lines, you can no longer go into a loop and redraw the dotted lines at will. Instead, you have to set some internal program variables indicating that the next time you are allowed by Apple to draw, you draw the dotted lines instead of what you normally draw.
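The state-and-callback pattern described above can be sketched in a few lines of Swift. This is a minimal, platform-neutral illustration with invented names (`View`, `userIsResizing`, and `needsDisplay` are hypothetical stand-ins for an AppKit `NSView` and `setNeedsDisplay`), not the actual AppKit API:

```swift
// Sketch of the "draw only when asked" model. The app records what
// should be drawn and raises a redraw flag; the framework decides
// when the draw callback actually runs.
class View {
    var showResizeOutline = false   // state set during a dynamic resize
    var needsDisplay = false        // redraw request flag

    // Old style: loop and draw the dotted outline at will.
    // New style: only update state and ask the system to redraw.
    func userIsResizing() {
        showResizeOutline = true
        needsDisplay = true         // roughly setNeedsDisplay in AppKit
    }

    // Invoked by the framework on its own schedule, not by the app.
    func draw() -> String {
        needsDisplay = false
        return showResizeOutline ? "dotted outline" : "normal content"
    }
}

let v = View()
v.userIsResizing()
print(v.draw())   // prints "dotted outline"
```

The key shift is that `draw()` is called by the framework, so the loop that used to redraw dotted lines directly now only flips state and waits for the callback.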

The previous macOS release (Catalina) also broke my program. Apple deleted support for the file system that my program CD uses (HFS). I had to re-master the program CD and send new CD-ROMs to all my customers at great expense.

Apple has also put developers on notice that they have 'deprecated' hundreds of other functions. These functions will be deleted in a future release. My program currently uses many of these functions and I will need to find suitable replacement functions.
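For context on what "deprecated" means in practice: on Apple platforms, deprecation is expressed through availability annotations, and calls to a deprecated function still compile, but with a warning, until the function is removed outright. A small sketch of the mechanism, with made-up function names:

```swift
// Invented example of how deprecation is marked in Swift.
// legacyRenderScene() and renderScene() are hypothetical names.
@available(macOS, deprecated: 11.0, message: "Use renderScene() instead")
func legacyRenderScene() -> String {
    // Forward to the replacement so old call sites keep working.
    return renderScene()
}

func renderScene() -> String {
    return "rendered"
}

// Compiles with a deprecation warning today; once the function is
// deleted in a future release, this call breaks at compile time.
print(legacyRenderScene())   // prints "rendered"
```

One common migration tactic is to wrap the old call like this, then replace call sites one by one before the deprecated function disappears.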

I have already had to rewrite my software (Natural Scene Designer and its predecessors) several times: when they switched from Motorola to PowerPC, Mac OS 9 to Mac OS X, PowerPC to Intel, 32-bit to 64-bit, and now Intel to Apple M1 chips.

In contrast, the Windows version of my app only had to be rewritten once (from 32 to 64 bit). Windows does not remove old functions (at least none that I use).

So ... if you are thinking about upgrading to an M1 Mac or macOS Big Sur, be sure to check with the developers of the apps you use to see if they will work!
 
Are you using Xcode or Swift? Apple seems pretty intent on having all development for their platforms use these tools.

If you wrote code that followed all the rules, and then the rules were torn up and you were told to follow new rules, I can see why the third time that happened you’d wonder why you were jumping through hoops to support a minor platform.

On the other hand, I see a company like Serif, which has both Windows and iOS versions of its applications in addition to macOS, and it is supporting M1 native code at launch.
 
Here’s a half-hour video looking at the M1 and explaining some common misconceptions:
 
He cursed "backstory" at the beginning, and yet a significant majority of his presentation was devoted to previous hardware.

Percentages of increase DO NOT tell anyone what they can expect in terms of real-world performance.

He talked a lot about modeling application demands but one has to wonder how well Apple understands the types of applications that personal computer users want to use (versus those of Mac users).

Is 16GB of RAM enough? Sure, the M1 is the low-end SoC for lower-power notebooks, but how expensive are successive generations likely to be when they have to upgrade the entire SoC to come up with a higher-performance, greater-RAM model?

It seems obvious that much of the performance is coming from an enormous cache rather than raw power. I reason that's why the multi-core performance drops more into line with existing CPU designs.

An article on Seeking Alpha notes that the compute-intensive x86 applications running on Rosetta 2 will be up to 30% slower (yes, another percentage relative to the previous generation).

Hopefully we'll start seeing some practical benchmarks. There's a lot of good there but I'm wondering how much of the hype is hand-picked.
 
My take based on some limited time with a Mac Mini with M1 at work:

It is perfectly usable with both native and emulated apps. Native apps are very snappy to start compared to emulated apps, but there is nothing to complain about in the latter case. Once apps are loaded, there is no lagginess either way. Things "just work" in the typical Mac fashion. The only exception is that we couldn't get any native, open-source machine learning tools to work. Given they were all alpha releases, this is not surprising.

If I had to replace my wife's MacBook Air, I'd get her an AS-based one to replace it.

The rumor mill has Apple releasing a 12-core SoC for the MacBook Pro 16 in 2021 with larger memory support and a better GPU.

This model makes a lot of sense for Apple, but I don't really know if it does for anyone else. I would really like to see more ARM-based (or RISC-V) computers out there. If I were Intel, I'd be developing my own RISC-V product for general purpose use, just to hedge my bets against NVIDIA/ARM.
 
The rumor mill has Apple releasing a 12-core SoC for the MacBook Pro 16 in 2021 with larger memory support and a better GPU.
I'm not so concerned about the notebooks, but flexibility in the desktops would seem to be a real problem. Then again, I may be underestimating what people try with their notebook computers.
 
I'm not so concerned about the notebooks, but flexibility in the desktops would seem to be a real problem. Then again, I may be underestimating what people try with their notebook computers.

Agreed. Given the power advantage of the AS design, I would expect they could come out with a much more capable Mac Mini/iMac/Mac Pro than this initial offering. However, the lack of RAM and internal storage upgrades is a real problem for a variety of Mac users. Better get the top of the line when you order to make sure you have everything you could possibly need!
 
The whole 8 GB vs. 16 GB question is one reason I haven't pulled the trigger yet. But most of the reports so far seem to indicate that for what I plan on doing with my new MacBook Air, 8 GB would be fine and I should save my $200.

The battery life improvement is the primary draw for a new Air M1, not performance. I see there are people trying to run the Folding at Home (FAH) distributed computing application and they're running into problems. I would load the control program, not the actual compute client, but IMHO the FAH developers are not going to code an Apple Silicon client for the CPUs or GPUs anytime soon. They have enough problems keeping the back-end servers running without also supporting a new, unproven architecture.
 
Are you using Xcode or Swift? Apple seems pretty intent on having all development for their platforms use these tools.

If you wrote code that followed all the rules, and then the rules were torn up and you were told to follow new rules, I can see why the third time that happened you’d wonder why you were jumping through hoops to support a minor platform.

On the other hand, I see a company like Serif, which has both Windows and iOS versions of its applications in addition to macOS, and it is supporting M1 native code at launch.
I'm using the Xcode development tool and my code is written in Objective-C. Swift is a newer Apple programming language that I haven't bothered to look at yet.

I have a 'Developer Transition Kit' from Apple that cost me $499 to 'rent' for a year. It's an Apple Silicon Mac (built around the A12Z rather than the M1) that looks like the Mac Mini but has different internals. After a year I have to send it back to Apple.

I still don't have my software working on macOS Big Sur. The changes I need to make to get my app to run on Big Sur are complex and my head hurts just thinking about it.

I prefer the Intel based Macs because you can run both MacOS and Windows on them.

If I absolutely had to have an Apple Silicon Mac I would wait a very long time until there are more programs that run natively on it (not using Rosetta 2).
 
Swift is a newer Apple programming language that I haven't bothered to look at yet.
Swift has been the recommended development language for a few years now for all Apple platforms and it is free under the Apache 2.0 license. It is probably worth a look-see.
 
The whole 8 GB vs. 16 GB question is one reason I haven't pulled the trigger yet. But most of the reports so far seem to indicate that for what I plan on doing with my new MacBook Air, 8 GB would be fine and I should save my $200.

The battery life improvement is the primary draw for a new Air M1, not performance. I see there are people trying to run the Folding at Home (FAH) distributed computing application and they're running into problems. I would load the control program, not the actual compute client, but IMHO the FAH developers are not going to code an Apple Silicon client for the CPUs or GPUs anytime soon. They have enough problems keeping the back-end servers running without also supporting a new, unproven architecture.
Given how the memory footprint of browsers has grown over the years, I would personally go with the 16GB model. At the moment, FF is using 2.5GB of RAM on my mid-2015 MBP 15 with just 4 tabs open. Safari is probably better, but Chrome is definitely worse.
 