Steve Barnes' World of Happiness

Closing note on predics.

While my recent posts made use of the distinguished abbreviation "predic" to mean "prediction," I am aware that recent memetic tides have popularized its use to mean "predicament": certainly a delightful use in itself.

(Neither use stands for "predicate," which remains the only version of the abbreviation in the Oxford English Dictionary.)

Zelda predic inversion.

Well, my predic comically failed: I'd guessed the sequel to Breath Of The Wild would be about exploration of the underworld, and the trailer revealed its main focus would be the skies above Hyrule instead, beginning with a shot of a skydiving Link obviously inspired by Skyward Sword. But, who knows – these are vaster games than ever, and I don't guess they've played their entire hand with this look.

Another shot from a floating island shows that the distant ground below is indeed the terrain from Breath Of The Wild, which I've now spent hundreds of hours exploring. I think faithful continuity is one of the clearest indications of care about a fictional world.

Last-minute Nintendo Direct predic.

Breath Of The Wild's sequel was announced almost two years ago, and Aonuma faded into view during the last major Direct, supposing viewers might expect him to elaborate at last. Somehow I wasn't poised to leap and gawk, and I was right to hold off. Instead, Skyward Sword received its upconversion for Switch. (While it's HD and 60fps – and what a companion amiibo; Loftwings are the goats of birds – I somehow think I'll stick to the original version, most beloved. Amazing it's ten years old this year.)

Now, however, it's time for the unveiling. The two-year-old teaser ended with Breath Of The Wild's version of Hyrule Castle rising monstrously by several stories, as if on a gargantuan mechanical platform from beneath the earth, which comports with the original's lore.

Breath Of The Wild's soul spawned directly from the original Legend Of Zelda's in many ways, applying Miyamoto's earliest dreams for the series to the expansive potential of modern tech. What was profoundly lacking in that translation was the realization of the original game's dungeons. Yes, the series has generalized the term "dungeon" to mean any kind of designated, mappable area of trials and tests in which a game-critical item is found, and Breath Of The Wild has those, but compared to its outdoor realm, it barely honours what was explicitly dubbed the underworld – something I've missed in every installment that seems only to tip its hat to the idea.

This sequel will be Breath Of The Wild's full and neglected embrace of the darkness and toil, the twisting passages and countless chambers, the observation-based puzzles and the calculated facets of combat.

Such is my predic.

WWDC 2021 keynote impressions.

It's Apple's second fully-remote WWDC keynote. Apple spent decades fine-tuning the art of live broadcasts that evolved from Steve Jobs' early presentations, so the strange and fresh feeling of watching them craft a pre-recorded event is only a year old now. Like last year's, it was spliced with whimsical, artful and musical transitions around Apple Park, the kind that never happened during their live events (though, interestingly, other companies have long made a habit of them – and never in any Apple keynote have I heard "ladies and gentlemen, please welcome," which always struck me as bizarre).

No rumoured hardware announcements, which is right for WWDC. It seemed like a more evolutionary year for the OSes – handfuls of exciting features, but nothing totally new, sparkly, or in-your-face (FaceTime notwithstanding). I've missed that and found it refreshing: it's been a decade since Apple switched pace to a major release of OS X every year, rather than every… longer while. The pace of software updates today feels like a subway against a horse and cart, and it's nice to feel like you can take a breath on that timescale – enjoy using technology without worrying about what will have changed before the seasonal weather does.

A lot of stuff about holistic integration into users' lives and relationships. Set a custom quasi-"do not disturb" status which curates work-related versus personal notifications. Share your personal health data with family members or trusted friends. Get a prompt from your watch to reflect on your day. This stuff is a far cry from the calculated focus of Steve Jobs' famous four-quadrant product line: four kinds of products, all Macs, and the operating system that would run them. Even when the iPhone launched nearly a decade later, I doubt the leadership was thinking about anything from mindfulness to measuring cardio fitness. But "Apple's grown like a weed," and now their ability to work across this range of areas reflects something that's always been traceable in their thinking: computers need never be a segregated slice of existential pie, as other companies have conceptualized them. They have their role as helpers and not masters, but their modern sophistication suits them to tasks beyond those of their simpler ancestors.

It's pretty amazing. But it's come so far, so fast, that today I finally found myself asking "wasn't this company called 'Apple Computer' just a few minutes ago, selling just those four Macs? Now they're spending ten minutes talking about home, work, focus, and reminding you to breathe? Is any of this developer conference going to be about using computers?" Happily, there would be plenty of that.

The new Safari design was a highlight I was craving. People have sharply personal browser preferences, and – though happily designed and coded around open, refined standards by several companies rather than one – browsers look and feel different to use. My main gripe with browsers (all the way back to Netscape) is that they feel cluttered, and I've always sought the balance of power, stability and zen-likeness that Safari has honed. Organizationally, the tab groups and tabs-in-the-main-bar thing look like a radical curve, but I suspect they're just the improvement I failed to think of. The visual tab itself is the address bar? Of course – when we use a browser's address bar, we already intend to change the destination of the currently-focused tab. Pages whose primary colours starkly flood the chrome look great. All that, plus the introduction of extensions on iOS, has me really looking forward to a new life of placid Internet journeys.

Shortcuts was previously "Workflow," a third-party app Apple decided to acquire to bring automation to iOS. It's become sophisticated enough that Federighi framed it as the future of automation on Mac, starting a transition from the beloved Automator. (Will AppleScript die a slow death or find a new life?) I'd hoped for this one last year, and I'm nerdishly excited to play around.

FaceTime seems to be doing some catching up, acquiring features that made other apps like Zoom more popular during the pandemic: multi-platform access, ease of use on the web, screen and window sharing, and a plain old grid view. (Craig says the "Portrait Mode" feature was "inspired by" Portrait Mode photography on iPhone, though Skype has included a background-blur feature for years.) Meanwhile, Skype is clunky, Zoom was haunted by security issues, and I'm not aware Discord even makes its privacy policy clear for conferencing transmissions – whereas FaceTime will maintain its always-on end-to-end encryption. Ever since Jobs announced FaceTime would be an open standard (a promise that evidently never materialized), its main obstacle to popularity seems to have been that it works only between Apple customers.

Where Apple might be going a step further: "SharePlay," an API for developers to integrate FaceTime with native apps on devices for super-efficient synchronization that doesn't rely on screen sharing at all. I'll be paying attention to the possibilities this week. (Even with all this, "iChat Theater," which let you share any Mac file preview in-video with wonderful smoothness, seems to remain sadly forgotten.)
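I don't yet know what adoption will look like in practice, but going by the GroupActivities framework Apple previewed, I imagine something roughly like this hypothetical "listen together" activity – the names and flow here are my own guesses from the session descriptions, not shipping code:

    import GroupActivities

    // A hypothetical activity describing what everyone on the FaceTime call is doing together.
    struct ListenTogether: GroupActivity {
        var metadata: GroupActivityMetadata {
            var meta = GroupActivityMetadata()
            meta.title = "Listen Together"  // what the call's UI would show
            meta.type = .generic
            return meta
        }
    }

    // Offer the activity to the current call, if there is one.
    func startListeningTogether() async {
        let activity = ListenTogether()
        switch await activity.prepareForActivation() {
        case .activationPreferred:
            _ = try? await activity.activate()  // everyone on the call receives the invitation
        default:
            break  // no call in progress, or the user declined
        }
    }

    // Each participant's copy of the app receives a session and keeps its own state in
    // sync through it – no one's screen is being streamed.
    func observeSessions() async {
        for await session in ListenTogether.sessions() {
            session.join()
            // Coordinate playback state with the rest of the group via `session` here.
        }
    }

The appeal, if I've read it right, is that only small pieces of shared state travel between participants, which is why it can feel so much smoother than screen sharing.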

The truly magical-looking, Apple-feeling thing of the day was "Universal Control." Craig places an iPad to the right of a MacBook Pro, moves the Mac's cursor past the extreme right, and it automatically peeks into the iPad's screen. Move it a little further to confirm, and the Mac's trackpad is now controlling the iPad's cursor in iPadOS, and is free to move back and forth. He even drags an image from an iPad all the way across the MacBook Pro's screen, all the way onto an iMac sitting to its left, and drops it into a Final Cut Pro timeline. This was the first time I lamented the lack of a real audience, who would have applauded for a solid minute. I use my iPad next to my iMac every day, and that's a feature I would love, but didn't even think to hope for because it didn't seem remotely realistic. Fantastic.

Honourable mentions: lots of good privacy stuff (e.g. Mail blocks tracking pixels), a lot of detail in Maps (even for non-photography-based maps, it looks like models include traffic lights, crosswalks, various kinds of trees, and custom-modelled representations of buildings), Siri can process many voice requests on-device, and government-issued ID on the iPhone (for participating US states).

A new site for the ELT.

The supreme telescope of the decade (under construction for a few years yet) has a new home page with lots of information. One fact illustrating its stature:

Time needed to walk from the entrance of the dome to the top via the set of stairs and walkways fixed to the inner side of the dome: ~30 minutes.

Alex Trebek, knowledge and kindness.

As Jeopardy!'s year of guest hosts has progressed since his death last year, I've paid more attention to Alex Trebek's legacy. His 36 years hosting the show were only part of his career. If he had retired at news of his stage four cancer diagnosis, his streak would have been celebrated as outstanding. Instead, he appeared on set until his final weekend, reportedly reading answers to his contestants amid frailty and pain, disguised with his usual poise. Memorable moments included a short message favouring kindness and generosity, a champion thanking Alex for teaching him English as a child, and the news he'd spent his last day as he hoped, watching the horizon with his wife. What finer final days could I wish upon myself?

I've since watched a little Classic Concentration, reruns of which individuals have surfaced on YouTube, seemingly without objection. (Fascinating that that's now "the way" to get older TV episodes. You'd think providing them would be a way for networks to convincingly illustrate good will and good values, which are worth more to consumers than most companies realize; instead, they seem to believe that saying they value things in advertisements is sufficient.)

Concentration was about recalling the locations of hidden matching tiles on a game board, which would incrementally reveal a rebus beneath. It was a game so simple that I thought it would be better suited to contestants who were children, whose main weakness would have been ignorance of the cultural staples or figures of speech that were the match-deciding rebuses' solutions. (As a child, I played Concentration for DOS enough to gain the advantage of learning and memorizing many of those expressions.) What struck me was that contestants commentated somewhat meaninglessly about their choice of tiles. Rather than "thirteen," they might say "I'd like to open up the middle of the board, so let's go with thirteen," and it was consistent enough to seem unnatural, as though explicitly encouraged. Trebek did the same kind of thing, bordering on over-commentating on those choices, or on the solution, or on a contestant's shortfall in deriving it. It felt like the improvisational churning of a person who couldn't find the intellectual depth he craved, attempted to manufacture it, and came out looking as verbose as he was transparent. (I thought this all happened before Jeopardy, but in actuality, it was one of two other shows Trebek was hosting alongside it. In any case, while interesting and entertaining, it left me unsurprised that the show dropped away in 1991 while Jeopardy persists.)

After he died, I found "dry wit" cited as a common trait of his character. An understandable take, but a shallow one. Whatever he was promoting by staying with Jeopardy to the latest possible moment, he cared sincerely and deeply about it. You couldn't tell which answers' questions he didn't know, but considering his occasional foreign pronunciations seemed informed and his occasional elaborations seemed off-the-cuff, you got the impression he'd do well against the highest-achieving champions, with whom he seemed closer to enamoured than impressed. The question I was left asking: what is the right word for the thing he cared about? Knowledge? Scholarliness? Intellectualism? An academic mindset?

Watching Jeopardy lately, the characteristic that most struck me might be impartiality. No matter what the clue was, Trebek recited it with unwavering clarity and formality (even as categories merited casual, styled or humorous phrasing). Was a topic controversial or ghastly? It didn't seem to matter, so long as it was factual and verifiable. Was it video games, still dismissed as flippant or irrelevant to more important things in life? They've enjoyed the same moments on the blue grid as famous composers and world history. Competitive sports, inconsequential as they seem to external matters? There they are, now and then (even when the contestants reveal themselves as my type of person).

That might be the insight I've gained. Alex didn't seem to discriminate between worthy and unworthy knowledge; he just seemed to care that the mind was open, actively observing and taking note, taking interest, as much in the seemingly boring as the seemingly exciting. After all, to any one of us, the seemingly boring may one day become the genuinely exciting.

All of this seems tangential to kindness. Did Alex see some more direct link between this open-mindedness and the world he envisioned and wished to egg into existence? Here are those presuming words from one of his final shows:

We're trying to build a gentler, kinder society, and if we all pitch in just a little bit, we're gonna get there.

Standing before the Apple Silicon era.

This year is its beginning. There's no shortage of writing and reviews about Apple's first Macs (and iPad!) using their minuscule and powerful new architecture, but my own reaction is to stand back and take a breath.

It's quite a transition. The visual design and internal engineering of the Mac have always been its cornerstone, but its processors have always come from elsewhere. Every Apple nerd knows Steve described the move to Intel processors in 2005 as the logical step on their quest for more computing bang for the electrical buck. No one seems to mention that Apple "could have" moved to their own processors then, but didn't.

I enquote "could have" because perhaps they couldn't have, lacking the company expertise. (Could they have if Steve Wozniak had stayed as close to the company as Steve Jobs did?) That expertise appeared quietly with the iPhone, and less quietly with the first iPad's A4. Johnny Srouji seemed to emerge as the needed Steve Wozniak, ultimately thrust forward as the explainer of Apple Silicon's technical philosophy to the public. The handheld devices achieved their world-class heights on a steady diet of chips, high on integration of processors, graphics cores, and other custom processors – and practically free of carbohydrates – and at long last, the Mac will inherit it all.

I just read the forum post of a Mac newcomer's delights and gripes. "Apple should have made the RAM upgradable on these," they said, framing it as an oversight or mistake Apple would find themselves blushing over, apologizing for, and rescinding. Non-upgradable memory for a leading desktop computer? Admittedly, it sounds unthinkable to long-time computer users. Of course, this was no mistake: the choice was deliberate (though executives refrained on this occasion from invoking the word "courage"), and the visionaries have decided the efficiency bestowed by their fully-integrated chip systems outweighs that ability.

While I'd love to understand more about low-level computer functionality, the nature of the M1's "unified memory" is something I vaguely understand from my days experimenting with programming for dedicated graphics processors. In the naive approach, when dealing with hundreds or thousands of shapes and colours, every relevant data point must be copied from main memory to the discrete "graphics memory" every frame. However, it's possible, in code, to copy reusable data to that graphics memory just once, then submit further instructions on how to reuse it – a significant optimization, which can be critical to improving frame rates or rendering more detail. And that's great, but it's another concept to comprehend and master when your vision is so simple.

The M1 contains only one pool of shared memory, locating it within reach of both its main processors and graphics cores. There's no separate "graphics memory" at all – no copying at all. That's one hint about the advantages of making this memory non-upgradeable. It feels radical, but also seems simple enough to make that aforementioned programming routine feel arcane. And that's just one example of the kind of conceptual change to the nature of computers this first chip heralds.
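If it helps to make that concrete, here's a rough sketch of the two styles in Swift and Metal – the vertex data, buffer names and sizes are made up for illustration, and this reflects my understanding rather than anything Apple has published:

    import Metal

    // A sketch, not production code: contrast the discrete-GPU routine with unified memory.
    let device = MTLCreateSystemDefaultDevice()!
    let queue = device.makeCommandQueue()!

    // Some reusable vertex data we'd like the GPU to keep drawing every frame.
    let vertices: [Float] = [0, 1, -1, -1, 1, -1]
    let length = vertices.count * MemoryLayout<Float>.stride

    // Discrete-GPU style: stage the data in a CPU-visible buffer, then blit it once
    // into GPU-private "graphics memory" and reuse it from there on every frame.
    let staging = device.makeBuffer(bytes: vertices, length: length, options: .storageModeShared)!
    let gpuOnly = device.makeBuffer(length: length, options: .storageModePrivate)!
    let commands = queue.makeCommandBuffer()!
    let blit = commands.makeBlitCommandEncoder()!
    blit.copy(from: staging, sourceOffset: 0, to: gpuOnly, destinationOffset: 0, size: length)
    blit.endEncoding()
    commands.commit()

    // Unified-memory style (M1): one pool, within reach of the CPU and the GPU cores.
    print(device.hasUnifiedMemory)  // true on Apple Silicon
    let shared = device.makeBuffer(bytes: vertices, length: length, options: .storageModeShared)!
    // `shared` can be bound to a render or compute encoder as-is, and the CPU can keep
    // writing through shared.contents() with no transfer step in between.

The blit in the middle is the ritual the M1's design renders unnecessary.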

That's the kind of thing that makes me step back and take a breath. I recall this feeling from earlier childhood, when the Amiga, with its groundbreaking graphics and audio capabilities, seemed the computer of its day. I watched adults around me get excited, resonate with humour and friendliness, recognizing the implications: possibilities to explore, sense to make, uncharted territory whose compass direction was clear. A humbling reminder that "being a computer user" was an activity absent from prior human history, that they were alive near its dawn, equipped to appreciate it together like a sunrise. We're doubly fortunate to live decades into that history, with what would have been called "supercomputers" in our pockets, but this year calls to mind that dear old mood.

For all Apple's growth, mastery and influence, it feels like they've finally grown into something primary that they hadn't before. The transition has gone well, people love the computers, and it's just the barest start. I'm assuming other companies' leaders have been blinking with some bewilderment, asking each other whether they should be doing fully-integrated desktop-class processors of their own. How would they? How could they?

That's all for now. Nothing deathly insightful to say, but I wanted to take that moment to nod at it. I'll have plenty of time to enjoy it all.

It there does not have except on Twitter.

Found on Twitter:

Il n'y a que sur Twitter que vous pouvez voir une vache manger une banane…

I think that's the first time I've seen "il n'y a que" in the wild. "Il y a" (literally "it there has") is the typical way to say "there is." ("Il y a une vache" is likely what a translator would give you for "there is a cow.")

But this is a negation. "Il n'y a pas une vache" means "there is not a cow" – the two pieces of "ne … pas" are split around the verb. "Que" rather than "pas" implies something more like "except." So, roughly, "there is not except on Twitter that you can see a cow eating a banana." More naturally, "only on Twitter," which is what this francophone seems to mean.

"Seulement" is "only," and that seems simple enough to ask why one wouldn't just use “seulement sur Twitter.” Looking through Oxford's English-French dictionary, I can't tell whether that would sound equally fine or slightly unnatural. In any case, the tweeter's phrase sounds like something you'd get used to if you spoke the language – there's a satisfying feeling to that "ne pas" construction with that "except" twist, and the relatively brevity with which you can execute that in French.

(Having said all that, it's also possible on YouTube.)

The glories of pull quotes.

You know, those sophisticated-looking quotes in large text placed off to one side of an article's body, ostensibly to provide a guided peek into a key sentiment, or to highlight the thought as supremely poignant or summarizing. Even more sophisticated: the pull quote's text is in italics. More sophisticated still: the article's text flows around the pull quote's text. Maximum sophistication: there's a solid horizontal bar situated above or below the entire pull quote. Not above and below: above or below.

As heavenly as that sounds, I suppose the problem is that once I've read the pull quote and the article, I feel like the chosen text is neither poignant nor summarizing. If anything, it feels like the sentence chosen for the pull quote was selected on the basis of its seeming as poignant or summarizing as possible without being either, as though being either would somehow ruin it for the person selecting it. It's not "this pie combines blueberries with orange more deliciously than any recipe I've tried," but "the implications of this pie, when considered in context, may baffle the considered diner," even if the article was written essentially to extol the combination of blueberries and oranges. At least, that's what it feels like.

My instinct is that the relevant question, hyperbolic as it might sound, is "who invented these things?". Relevant, because it seems too believable – a classic swipe of Occam's razor – that one person used these, others thought they looked cool, and everyone started using them. And all the people in that hypothetical scenario worked for publication companies, print or digital. None of them were individuals writing for themselves, because they wouldn't have thought of it, because they didn't work for publication companies.

At least, that's what it feels like.

Jony Ive on creativity.

We haven't heard much from Ive since he left Apple, where he seldom spoke outside the confines of the precisely-produced videos nestled within the executive staff's live presentations. He all but identifies himself as a quiet person in this speech, where he talks not about Apple, but about his relationship to design and creativity in general.

Ive has always struck me as a person who speaks so deeply in the abstract that it's difficult to understand what he's talking about. On the plus side, that incomprehensibility makes him seem rather distinguished. More substantially: when you do take some meaning from his thoughts, you realize the utility of that abstraction is broad applicability. Ive has designed objects, but I've found his almost proverbial insights applicable to the creation of visual art, music, scripts, and even groups of people guided by common ideas or directives.

"The Fringe Benefits Of Failure…"

An inspiring speech with witty, bright and dark moments:

Imagination is not only the uniquely human capacity to envision that which is not, and therefore the fount of all invention and innovation. In its arguably most transformative and revelatory capacity, it is the power that enables us to empathise with humans whose experiences we have never shared.

Text and video:

The coveted bivalvial trill.

Incidentally, I took note of the word "bivalve" when sharing the definition of "cockle" earlier – it just refers to a mollusc with a hinged shell.

It reminded me of the word "bilabial" (any sound involving the lips), which reminded me mainly of the linguistic term "bilabial trill": that vibrational, motorboat-like pronunciation. (Other common trills include the lingual trill, or rolling your "r," and the guttural trill, heard when gargling with or without liquid.)

Anyway, this inspired me to consider the possibility of a "bivalvial trill": the hypothetical vibration of a hinged shell. I'm not sure whether this has ever existed in nature, but now that I've wondered about it, I'd be especially enthused to know it does.