Standing before the Apple Silicon era.
This year is its beginning. There's no shortage of writing and reviews about Apple's first Macs (and iPad!) using their minuscule and powerful new architecture, but my own reaction is to stand back and take a breath.
It's quite a transition. The visual design and internal engineering of the Mac have always been its cornerstones, but its processors have always come from elsewhere. Every Apple nerd knows Steve described the move to Intel processors in 2005 as the logical step in their quest for more computing bang for the electrical buck. No one seems to mention that Apple "could have" moved to their own processors then, but didn't.
I enquote "could have" because perhaps they couldn't have, lacking the company expertise. (Could they have if Steve Wozniak had stayed as close to the company as Steve Jobs did?) That expertise appeared quietly with the iPhone, and less quietly with the first iPad's A4. Johnny Srouji seemed to emerge as the needed Steve Wozniak, ultimately thrust forward as the explainer of Apple Silicon's technical philosophy to the public. The handheld devices achieved their world-class heights on a steady diet of chips, high on integration of processors, graphics cores, and other custom processors – and practically free of carbohydrates – and at long last, the Mac will inherit it all.
I just read a forum post of a Mac newcomer's delights and gripes. "Apple should have made the RAM upgradable on these," they said, framing it like an oversight Apple would find themselves blushing over, apologizing for, and rescinding. Non-upgradable memory in a leading desktop computer? Admittedly, it sounds unthinkable to long-time computer users. Of course, this was no mistake: the choice was deliberate (though executives refrained on this occasion from invoking the word "courage"), and the visionaries have decided that the efficiency bestowed by their fully-integrated chip systems outweighs that ability.
While I'd love to understand more about low-level computer functionality, the nature of the M1's "unified memory" is something I vaguely understand from my days experimenting with programming for dedicated graphics processors. When dealing with hundreds or thousands of shapes and colours, the naive approach copies every relevant data point from main memory to the card's discrete "graphics memory" every frame. It's possible, though, to copy the reusable data to that graphics memory just once, then submit further instructions on how to reuse it. A significant optimization, one that can be critical to improving frame rates or rendering more detail. And that's great, but it's another concept to comprehend and master when what you want to draw is so simple.
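Roughly, in Metal terms, that pattern looks something like this. (My own tinkering was with older graphics APIs, so treat this as a loose sketch rather than anyone's production code; the triangle data and buffer names are made up for illustration.)

```swift
import Metal

// Sketch of the "copy once, reuse every frame" pattern on a discrete GPU.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!

// Vertex data living in main (CPU-side) memory.
let vertices: [Float] = [0, 1, -1, -1, 1, -1]
let byteCount = vertices.count * MemoryLayout<Float>.stride

// A staging buffer the CPU can write into...
let staging = device.makeBuffer(bytes: vertices, length: byteCount,
                                options: .storageModeShared)!
// ...and a private buffer that lives in the card's own memory.
let gpuOnly = device.makeBuffer(length: byteCount,
                                options: .storageModePrivate)!

// One explicit copy across the bus, done up front.
let commands = queue.makeCommandBuffer()!
let blit = commands.makeBlitCommandEncoder()!
blit.copy(from: staging, sourceOffset: 0,
          to: gpuOnly, destinationOffset: 0, size: byteCount)
blit.endEncoding()
commands.commit()

// From here on, every frame's draw calls just reference `gpuOnly`;
// the vertex data never has to cross the bus again.
```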
The M1 contains only one pool of memory, shared and within reach of both its CPU cores and graphics cores. There's no separate "graphics memory" at all – and so no copying at all. That's one hint at the advantages of making this memory non-upgradable. It feels radical, but also seems simple enough to make that aforementioned programming routine feel arcane. And that's just one example of the kind of conceptual change to the nature of computers this first chip heralds.
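On a unified-memory machine, the same sketch collapses to almost nothing: one buffer, visible to the CPU and the GPU alike. (Again, a loose sketch with made-up data, not a definitive recipe.)

```swift
import Metal

// On Apple silicon there's a single pool of memory, so one shared
// buffer serves both the CPU and the GPU. No staging buffer, no blit.
let device = MTLCreateSystemDefaultDevice()!
assert(device.hasUnifiedMemory)

let vertices: [Float] = [0, 1, -1, -1, 1, -1]
let byteCount = vertices.count * MemoryLayout<Float>.stride

// One buffer, written by the CPU, read directly by the GPU.
let shared = device.makeBuffer(bytes: vertices, length: byteCount,
                               options: .storageModeShared)!

// The CPU can even tweak the same bytes between frames; the GPU sees
// the update without any copy step.
shared.contents().assumingMemoryBound(to: Float.self)[0] = 0.5
```

Whether the real frameworks take exactly this path in every case is beyond my pay grade, but the shape of the change is the point: the same bytes, within reach of everything.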
That's the kind of thing that makes me step back and take a breath. I recall this feeling from childhood, when the Amiga, with its groundbreaking graphics and audio capabilities, seemed the computer of its day. I watched the adults around me get excited, resonating with humour and friendliness, recognizing the implications: possibilities to explore, sense to make, uncharted territory whose compass direction was clear. A humbling reminder that "being a computer user" was an activity absent from prior human history, that they were alive near its dawn, equipped to appreciate it together like a sunrise. We're doubly fortunate to live decades into that history, with what would have been called "supercomputers" in our pockets, but this year calls to mind that dear old mood.
For all Apple's growth, mastery and influence, it feels like they've finally grown into something primary that they hadn't owned before. The transition has gone well, people love the computers, and it's just the barest start. I'm assuming other companies' leaders have been blinking with some bewilderment, asking each other whether they should be doing fully-integrated desktop-class processors of their own. How would they? How could they?
That's all for now. Nothing terribly insightful to say, but I wanted to take a moment to nod at it. I'll have plenty of time to enjoy it all.