But seriously: after leaving coffee and regularly embracing tea, I have been consuming a lot of homemade scones lately.
Despite scones' long and privileged position at the tail of my social media bios, I've only just begun baking them. For years, a nearby store sold excellent "American" scones (wedge-shaped with fruit or icing, the kind some Americans seem to think is part of the definition of a paper cup of coffee with a paper heat guard around the middle, especially when served in a separate paper bag). But the store has suddenly stopped carrying them, leaving strong demand across the demographic of people who are me.
So strong was this demand that it proved the first force capable of ushering me into any baking routine. I have acquired a rudimentary feel for the behaviour of flour in various ratios to liquid, and a developing sense for the effects of sugar, butter, eggs, or baking powder. This is a change which, prior to day 14000, even I might have thought out of my own character.
Here’s the argument:
1. It is the Age Of Scones.
3. Therefore, I have to eat a lot of scones.
The fact that the addition of some sort of second premise meant to strengthen this argument would in fact weaken it is known as the Age Of Scones paradox.
I've known I would write about the cultural tragedy of the loss of Star Trek after 2005, and the literary value now locked in the classic series. But there's so much to say, and the emotional component has been so vast for me, that I've mulled and brooded for over a year.
Fortunately, I had also thought to check for others who felt this way, and have since balmed myself with the empathizing sharings of many like-minded cherishers. I can now instantly think of several I'd trust to convey my approximate sentiments, but I'll share this one by YouTube user "Nitpicking Nerd", who earns his pseudonym with elegance and an endless supply of supporting examples.
While udanite is undoubtedly a precious material, and “chauffeur” seems a luxury classification of anything, I’ve still yet to see the great chauffeur udanite the late-night hosts keep telling their audiences they've got.
How did I miss this news from last week? Too many departures to unvisited areas of our solar system is a good problem to have.
This is the first mention I've noted of the "Trojan asteroids," which orbit the sun at about the distance of Jupiter. Since I already bore a mental image of the solar system's original formation from gravity acting on dust and debris, this was an intuitive addition. Lucy's eddying course appears typically complex for a probe, using Earth's gravity to help it reach two clusters of these rocks.
As usual, plenty of materials for the public, which I'll be perusing in time:
… A long time ago, I was very close to Steve Jobs and Bill Gates. They both were happy to be bazillionaires, they both were megalomaniacs in certain ways, they could be very unpleasant, but they had some principles. They had a red line. In my encounters with Mark Zuckerberg, I've never been able to discover any principles.
Good to hear from the chap now and then since he retired. Walt is routinely credited with nothing less than the invention of writing about personal technology for consumers, but I've also found him generally principled, journalistic and good-humoured.
Kara Swisher has struck me as someone who wears her opinion on her sleeve and spotlights the concordances or clashes with her interviewees, but Walt is clearly in a comfortable place with his old colleague, and this was a relaxed chat from a pair whose long-spanning study of tech's past informs their views on the future.
Well, Intel was a good stepping stone.
The PowerBook G4 I bought for myself, my "main" computer during my university years, had a powerful and agreeable feeling: the silver finish, the slightly-transparent keys that still had the traditional tall and skewed capital letters, and the truly matte display. I've owned a few Mac laptops since, mainly as secondary machines to use away from my desk until iPads existed, but have never since hungered for a notebook computer. If I'm using a Mac, I want to be grounded at my desk with a good display, a good keyboard, and any other controls and accessories set up just right. So I didn't really watch this event with intent to buy these utterly capable MacBook Pros now or later, but goodness knows I'll be watching closely to see how these chips will be used for desktop machines. A few things that seemed nice anyway:
ProMotion on the Mac. The smoothness of 120 hertz isn't so fundamental as the shift to Retina, though it proved surprisingly hard for me to unsee after using an iPad Pro. But ProMotion is also about efficiently reducing frame rates for static content, and that manner of smartness feels due as a standard for traditional computers, let alone luxury models.
The world is used to the "notch" now, and I think it looks all right on the MacBook Pro (for a 1080p camera, but no Face ID or Center Stage, as much as I'd like a reason to believe those will appear on desktop Macs soon). In software, the macOS menu bar clearly has to be taller to underlap the camera housing area, taking on a shape it's never quite had. When apps are in full screen, the horizontal band containing the notch is rendered black, temporarily creating the appearance of a notchless rectangular display – I think Apple's always avoided this – but developers can elect to use this space once they're ready.
What I found oddly cozy about the new Pros was the physical design. They aren't made to appear to have a thin "edge" which gently expands beneath into a deceptively thicker body, found also on the original iPad and the Intel iMacs. The shape instead resembles older MacBooks and MacBook Pros, where the front and sides are simply rounded by some modest radius into the otherwise flat surface of the underside, essentially making the volume of the body a good old rectangular prism again. I'm sure that space is well-utilized by the internals, which always resemble some sort of highly-advanced microscopic city. Finally, unlike the historically nickel-sized and slightly-rounded rubber feet, these models appear to have four short, extruded aluminum struts with rubber circles on the ends to give their undersides a little clearance from their resting surface. It all feels refreshingly like a statement, or at least an acknowledgement: "you can pick this up and take it around, but it's a substantial machine, a little more of a tank." I imagine you can feel its weight and stability when you place it where you want it, and that feeling seems to mean something.
Someone on MacRumors said Jony Ive's vision was "killed" by these designs. Obviously that's an exaggeration, but I'll concede the feeling of a more even balance between the long-sought ideal of pure thinness, lightness and plainness and the practical needs real pro users have understandably spoken up about when it's seemed like the Apple folks were out of touch. The new Pros are apparently neither thinner nor lighter than their predecessors, and it's nice to get the implied feeling that Apple no longer feels any wisp of shame about this.
Listening to other reviewers, I think my estimation of the iPad mini is less common, but I'll still consciously attempt to stick to things I haven't heard others say.
I remember learning that Steve Jobs wasn't for the idea of an iPad mini: he thought the more page-like display of the iPad was ideal, and noted that other companies trying smaller tablets after the iPad had arrived suffered a major shortfall in their lack of custom-built apps. That led to an abundance of "blown-up phone apps" where, for example, rows of tabular information or list items were stretched from the width of an Android phone to the width of an Android tablet, accomplishing little in itself.
The first iPad mini "solved" this by arriving as a conceptually compact full-size iPad with the same screen aspect ratio and the same number of pixels. iPhone apps wouldn't run on iPads unless developers specified they should, and they would no more run on iPad mini. However, every existing iPad app would run on iPad mini with no attention from developers at all. One marketing slogan was "every inch an iPad." Meanwhile, the Mini was thin, holdable in one hand, and was based on hardware re-engineering that would later give rise to the first iPad Air.
That's how Apple (that is, Steve Jobs and friends) turned out to see it, but I saw it as the closest thing that had ever been invented to a Star Trek PADD. I don't know that I cared what most of its apps would do by default, so long as there was a device that felt rather like a handheld book beneath whose cover could be found access to much of human knowledge and an array of modern communications tools.
In my mind, the remaining differentiator was the inset rectangular display and the extended "chin" containing the Mini's home button, which identified it too clearly to be mistaken. The original Mini was also not particularly high-powered even for its time (it still felt residually like the era in which miniaturization was sufficiently cool to persuade), though the next four models would catch up admirably and thin the devices further. This sixth-generation iPad mini, a high performer for the current year, inherits the modern iPad design introduced by the Pro in 2018 and adopted by the Air in 2020, while weighing about a third less. In my mind, the iPad mini is "finally" the futuristic digital book I've always wanted.
At the time I bought mine, I had a 2018 11-inch iPad Pro whose TrueDepth array had stopped functioning, so I was asking myself whether the iPad mini alone would work. Was the Mini really so much smaller? I looked up the Mini's dimensions and took a ruler to the Pro for comparison. It really didn't seem much smaller, but somehow, that exercise failed to prepare me. The Mini really is quintessentially book-sized in height and width, so that reading an eBook (or an SNES game's instruction manual) feels about perfect. But upon return, the 11-inch iPad Pro feels vast.
The other aspect of the Pro which feels superior is the variable-frame-rate ("ProMotion") display with efficient low frame rates for static graphics against an especially smooth 120 hertz during scrolling, animations, or even gameplay, whereas the Mini's display "merely" refreshes a traditional 60 times per second. 60 hertz has been a standard all my life, so I thought the 120-hertz feature was almost overkill when I'd bought the iPad Pro. Now that I'm looking for it, the difference is perceptible at will.
Liquid crystal displays refresh from the uppermost line of pixels downward, and "jelly scrolling" is another term reviewers have thrown around when talking about the iPad mini 6, specifically in portrait mode when the display has been rotated such that its intrinsic top is on the functional left or right. (I don't know why reviewers have picked on this model in particular; if it's a property of LCDs generally, shouldn't it affect all LCD tablets?) Though some of those reviewers have called the effect barely noticeable, I noticed it easily when scrolling through tall columns of text. The right-hand edge of the column scrolls ahead of the left-hand edge, and the graphical content in between looks linearly skewed. The angle of skew increases with the speed of the scrolling, and I'd say it's a few degrees at worst. Turning the device 180 degrees so that the bottom becomes the top, the predicted opposite is true, and the left-hand edge of the column scrolls ahead of the right. Again, I can't see why I shouldn't have noticed this with every iPad I've ever used including my first-generation iPad. But I didn't, so I have to believe I'm noticing it now because the flaring conversation has drawn my focus.
Another visible shortcoming: the iPad home screen layout (which has clearly been subject to some metric-related decisions around the addition of widgets) looks to have been designed for full-size iPads and merely scaled down to the Mini, so the spacing doesn't feel as natural. This seems mainly because the iPad mini now has the most oblong screen shape of any iPad, whereas the home screen layout was intended for a closer-to-square-shaped space. I've seen it pointed out that the displayed width of a medium-sized widget is now larger on some iPhones than on this iPad mini, and I can see that the iPad mini's built-in weather widget is so cramped that some of the symbols and icons actually end up overlapping each other, which looks almost shockingly shoddy for Apple. But that's as bad as it gets, and it's theoretically improvable in software alone.
I'm not one who aims to own a cutting-edge iPhone, so the iPad mini's front-facing camera might be the first wide-angle camera I've ever owned, and I'm so delighted by it. I've been waving it around and taking test pictures and videos, gaining from it a feeling reminiscent of a first childhood experience such as a bike or train ride. Though I've only previewed it in the FaceTime app, Center Stage clearly redeems the comically primitive and failure-prone videoconferencing face-tracking software I laughed at two decades ago, succeeding in simulating something much more like the alertness and smoothness of a cameraperson behind one of those small donkey-sized cameras on a TV studio floor, with slow and smooth pans, tilts and zooms.
Those are actually my main points, and most of the rest is consensus. Performance is very good, battery life is very good, and it's great that the Mini now supports USB-C and the second-generation Apple Pencil. It should last and feel viable for years. I bought a cheap clone of the magnetic folio cover for the Mini (especially important to me with my penchant for book-like-ness), and it feels great when I relax and read in the evening. I'll be patting it in gratitude and hugging it occasionally.
Sora was a good choice for the final fighter. I remembered Nintendo didn't reveal the results of their poll, stating merely in the Wii U version's final presentation that Bayonetta – the last character to join that game – was the most requested character in Europe. I'd chalked the worldwide most requested character up to a "Scroll Of Truth" moment for Nintendo, but it seems their plans for this moment ran that deep, that deliberately. I've never played Kingdom Hearts, but the appeal of combining the legacies of Disney and Square is clear. Plus, I've gravitated daily to an ambient mix of Kingdom Hearts' emotional music while working, so the trailer still found a way to reach me.
In another corporate cynicism-defeating display, Smash Ultimate must be one of Nintendo's main sources of revenue with its worldwide player base and extensive DLC collection, because it spoke like a laser to players' hearts from the first "everyone is here." The effort involved in adding a dozen more characters to the roster was deceptively large, as a single character's arrival sometimes implied rippling modifications or capacities for others (for example, Kirby's penchant for copying anyone's ability).
Smash Ultimate started huge and became enormous, with over seven dozen playable fighters, tons of stages, and most of a thousand pieces of music. More importantly, it lovingly glorifies characters reaching much of the way back to the earliest video games, including many that aren't Nintendo's. I've played every version of Smash, but played much more seriously starting on the Wii U, when the online mode became robust and pseudonymous rather than anonymous. Preferred rules and omega stages have made my 25,000-plus matches mostly joyful, and I thought I might consider myself to have "beaten the game" when my "Global Smash Power" hit ten million, but I believe I will continue to dabble and improve.
I'm not a huge fan of DLC (and I'm gratified Iwata maintained that Nintendo's approach was to offer a retail game's full value), but I did spend 75 cents on the Goemon Mii outfit. It was my fondest hope that the deserving Goemon would join as a character, but now that we know he won't, the fashion-formed nod feels all the more worthwhile. I have yet to smack an opponent with his proud pipe (which is most definitely not not a pipe).
Mr Sakurai's sign-off was touching, and he deserves a good breath before whatever he does next.
Steve Jobs died on October 5, 2011. There's plenty I could say, but I'll keep it to a few idle thoughts. Jobs came to occupy a spot on the small list of people whose very thinking I found worthy of study and emulation, yet he encapsulated his most deliberate and life-steering thoughts in mere paragraphs. His eyes cast over humanity, he had one foot firmly in the scientific and factual ("where are we?"), and the other in the potential ("where could we be?"). My favourite thought of his was that little monologue on "life," meaning the status quo. "Everything around you that you call life," he said, "was made up by people that were no smarter than you. And you can change it." If you reasonably define a pass-or-fail threshold to evaluate whether someone had successfully "changed life," Steve was about as far above as was possible. Subtract his influence from today's everyday, and it would look and feel like a science fiction-grade alternate timeline.
I think Steve's soul really was the foundation of Apple, as Tim Cook has since declared it would always be. When Steve said "we think," meaning "Apple thinks," he meant "I think," meaning he himself. From those thoughts, Apple's actions followed. After his death, the heavy question became whether Tim's declaration would be honoured and hold true. Steve's suggestion to Apple's staff had been the simple advice not to do what he would have done, but just to do "what's right." Tim Cook has overseen Apple's beyond-tenfold expansion since then. The futuristic, circular campus Steve appeared to present to the Cupertino City Council months before his death has been in regular use for a few years. The "Remembering Steve" page is still up.
People like to debate whether Apple has "done what's right," meandered from the path of stark lucidity Steve seemed to embody, or lost its way entirely, and much of that debate feels healthy and good-spirited.
As a response to enthusiasm for the inspiring work of any company, one component of such debates seems less valuable and more cynical: a pointing-out that they're "merely" a company, that their goal is to make money for their shareholders, that that's all they're good for, as though everything they do is more of a scam than a worthwhile endeavour. I'm not sure what worth even the speaker feels that claim has. That may indeed be the exclusive goal of some companies, but it seems such a foolish one, because any company that requires customers for success will appeal to them by cherishing common values, and the values of customers (unless they're all shareholders) are broader and higher than "fund a company." A company whose people really fathom the desires of customers will turn out to earn more money for shareholders than a company whose people are just pretending to.
Steve Jobs didn't have to choose the corporate world, and would probably have done something great if he hadn't. But he saw it as the vehicle for the realization of whichever visions he had, and whichever were still forming. For Apple (meaning for Steve), the business world was a good servant and a bad master, and so when asked by Mossberg and Swisher on stage about overtaking Microsoft in valuation, he said it was surreal, but that it didn't matter much, that it wasn't "what's important," that it wasn't what kept them coming to work every day, and that it wasn't why customers bought their products.
So, I think that anti-corporate cynicism is at least somewhat misguided, but fully misplaced on Jobs. Steve is a reminder that his sort of attitude can exist in people today, including CEOs, as purely as it did in him. Rare, but possible. And in times of challenge for technology companies, when fear of risk overwhelmed creative impulses at the leadership level, Steve thought the appropriate response was to keep your chin up, trust your intellect and inner voice, and work to innovate your way out and on to greater things.
This week, I rode a fully-electric public transit bus for the first time.
My area is blessed with a modest and fine transit system which I've enjoyed for decades. The drivers are friendly, the buses prompt, the routes comprehensive, and they required face coverings and distancing during the pandemic while providing free transportation for everyone.
I hadn't realized I was on a fully-electric bus until station departure time. Even with in-ear headphones, the feeling was so strange that I took them out – I was so accustomed to hearing that sound of a furious, growing combustion engine that its presence would have been a fraction as jarring as its absence. And then we were off. Few differences could have been more perceptible; it felt almost as though these tons of mechanics were floating along the road by magic. I had fretted that I hadn't brought my even heavier-duty noise-cancelling headphones for this trip, and suddenly that didn't matter.
I've witnessed many changes to the status quo, but this one feels more like a change than possibly any other.
The greenhouse-like effect from human-driven gas emissions is thankfully a subject of active conversation in current media and culture. Practically everyone who really studies it has said our response is too little, too late, and the consequences to be felt on the planet's surface (including by its non-human species, which bear no blame) are largely before us. My optimism lies in my imagined status quo for the future, when the sound and filth of combustion engines are so alien that people will gawk with incredulity to learn a past society apathetically tolerated, if not clung to, roads crawling with them like ants through bustling anthills. Like cassettes to digital audio, they'll be remembered and even appreciated for their charm, which was as real as anything perceived, but by and large, humanity will find it can't go back.
Version 15 is a cool one for Safari users like me. I've heard ongoing opinionation from the personal tech and developer communities as it's been hurriedly refined over the summer after its striking interface changes were criticized, particularly the cramming of too many controls into a single row on the iPhone layouts. Rather than remaining heavy-handed, it looks like the Safari team rethought, refined, and turned many of the boldest changes into deactivatable options, which I think is best for everyone.
The most exciting change for me might be the most striking: the browser chrome's ability to change colour according to the site's specifications. Other browsers have made their interfaces customizable by letting the user pick a pre-designed theme, akin to snapping a decorative plastic plate over your handheld game console. But Safari's approach results in a totally different feel, and it's an interesting choice. "Striking" is right, but I had to chew on it before deciding I loved it.
Here's an example of a thrown-together web page taking advantage of this on an iPad. Rather than a dominating white or grey title bar, the page's lavender colour extends to the screen's upper edge, beneath the controls. Logically, this seems to imply that the browser's controls (including even tabs for other web sites) are "part" of this page, and I suppose that's what has kept other browser teams from implementing or thinking of it. Once I rationalized that, though, the entire web started to feel different. Web pages felt more like "an app" (which is certainly a reasonable term for many of today's web pages – apps written for browsers). If not that, at least it makes web sites feel more "privileged" about their role within the browser, as though they have temporary stewardship or custody of those controls and tabs. It's initially uncomfortable because it's a challenge to long-established convention, like a fiddler at a Victorian dinner party or something: you won't love it if you can't loosen up a little, but once you do, you might find yourself yearning for it again.
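For anyone curious how a page opts in: the mechanism is the standard `theme-color` meta tag, which Safari 15 reads to tint its chrome. A minimal sketch (the hex values are just illustrative, chosen to echo the lavender example above):

```html
<!-- Safari 15 tints its toolbar to match the page's declared theme colour. -->
<meta name="theme-color" content="#e6e6fa">

<!-- Safari also honours a media attribute here, so dark mode can declare its own tint. -->
<meta name="theme-color" media="(prefers-color-scheme: dark)" content="#3b3556">
```

If no tag is present, Safari falls back to deriving a colour from the page's own background, so even sites that never heard of the feature get some of the effect.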
One pattern that seems almost prescribed is to colour the toolbar as one with the page's uppermost banner, creating the appearance of a tall "cap" on the entire site, like the top layer of a cake. If you're using Safari 15, you can see this on the WebKit blog – or if you're not, you can imagine what it would look like for the darkish blue to occupy that upper space. I think it's quite handsome. (It occurred to me that the line between this amalgamated banner and the proper page could be wavy, making that "cake layer" look more "iced." One idea among limitless ones, surely.)
One final little thing about Safari 15. I've tried loading a few web pages with different styles and colours, and I think that – at long last – the page content is not first rendered as a solid white rectangle before it loads, including on the Mac. We're well into the dark-golden era of dark mode as a standard, both on Apple platforms and the web generally, and Safari finally seems to act like it.