Steve Barnes' World of Happiness

Ni No Kuni: a modern classic RPG.

I've claimed to love classic RPGs since playing Final Fantasy VI and Chrono Trigger, but admittedly haven't played many since then. The later works of Square Enix made the genre seem destined (or doomed?) to evolve into some mish-mash of genres and periods, overcomplicating the original recipe for greatness. Those classics were simple in their execution and gameplay, like a time-spanning song or dish.

Ni No Kuni: Wrath Of The White Witch has some of that – in particular, you can move freely around a battlefield to your advantage or detriment – but that's about as daring as LEVEL-5 got. (In some ways, it's even simpler: you control only one character at a time in battle while the others fight or defend at will.)

I didn't play this one either. Instead, Twitch enabled me to watch Lexie play it, and watching someone play their first RPG was as good as watching any friend play through large stretches of any game again. It had been a while.

The title translates to "Other World" (or literally, "World Of Two"), and – especially considering this was apparently anticipated as the "Studio Ghibli game" despite no direct involvement from Miyazaki himself – I'm glad the title went untranslated. An earlier version was released for Nintendo DS, which was expanded for modern consoles years later. The familiar trope of a fantasy world explored within the mind of a child facing life's hardships is handled with care, and while I'd probably leave the voices set to Japanese, I'm glad I didn't miss the voice behind the unabashedly Scottish, inimitable and irreplaceable Drippy, a yellow, somewhat gasoline-container-shaped fairy lord with a lantern dangling from his nose. (Honourable mention to the purple Shonky-Honker, a duck-like species with an orchestral brass horn for a beak, which you typically fight in the wild but can also recruit as a companion. In terms of lovability, they surpass moogles and even rival Yoshis. I mean, come on.)

Also present: a full book. Books can be awkward and tedious in games, but I love their function as unmistakeably serious invitations to temporarily believe the world is real, and this one offers dozens of pages of text and illustrations worthy of print. Far exceeding a typical instruction manual, its collectible sections serve as a valuable reference for the game's creatures, treasures and lore. Its style is distinct and literary, you'll achieve more in-game the more you study it, and it even includes an entirely fictional script and cipher for those willing to learn and translate. Truly impressive.

I won't elaborate further, except to reaffirm that I took this game as a sign that classic RPGs' elements are still recognized and embraced, to players' benefit. A sincere story as the vehicle for simple gameplay that amounts to an adventure.

The open Gates.

I'm thoroughly enjoying Gates McFadden's recent podcast. She leverages her classic Star Trek friends as guests – a common formula that appeals to fans who wish they could have peeked behind the production curtains.

But Gates aims deeper, striving for personal exchanges suitable for proper friendships, and I've enjoyed harvesting insight about what brought her guests to hold the bite-sized opinions which are typically the full entrée. Reviews say Gates is a good "interviewer," but I'd call her a good conversationalist – you can tell she wants to share as much as she asks her guests to.

I always guessed McFadden had more to say than made it to the conventions and special features, and goodness, does she.

Horace And Pete: the feeling of theatre on film.

I heard about this series a while ago and shrugged it off. Louis CK not playing himself? That seemed odd.

But the 68-minute first episode was free on his site (which I love: it's simple, and though oriented to his professional life, the tone is personal and all media and payment stuff is contained neatly within it), so I decided to watch it late last night.

I had decided earlier in the day, actually, once I'd seen the first minute. A static shot of a dark pub, chairs tilted inward toward the tables to leave the floor accessible for sweeping, still and quiet after the music has faded. A guy (Louis CK not playing himself) stumbles down the stairs, looks around, adjusts his clothing awkwardly, and decides to start righting the chairs, very much at his under-excited leisure. There was something about it which I soon put my finger on: it was a film, but it didn't feel like a film. It felt like a play, where – especially in those opening moments – exposition is established by the stage directions, which reach the audience almost entirely by way of the actor's pace, body and face, rather than the engagements of music, lighting or cameras. Films, especially today, rely so heavily on the latter group, but this production seemed content to leave those habits for others to play with (save the bookending theme song by Paul Simon).

No undermining of those laid expectations was forthcoming. It was about as much a play as a film could be. It's possible there were cuts between adjacent shots, but for all I could tell, each scene was genuinely rehearsed and performed straight through by the small cast, shot with no masks or frills of any kind. One actor even stumbled over one line at one point (just once), and that apparently made it in. No big deal, just as it wouldn't have been if a real person had done it. There was even a labelled intermission in the middle of the hour – something I haven't seen in films that weren't decades old.

The themes seemed to include family, tradition and politics. Humour peppered the scenes, of course, but I'd say this opening episode made the series out to be more of a semi-mature, lugubrious, apathetic drama than a comedy. I'm sincerely tempted to carry on: it seems like proper art, and an example of art which fully sloughs popular convention just because it's popular. It feels unconcerned and unhurried, which is something I keep noticing recent TV and movies just can't seem to bear.

(Oh, and having watched it: Louis CK probably is just playing himself. I suppose.)

Beats Flex is superior to AirPods Pro.

Goodness, did I love my original AirPods – enough to wear out the (admittedly meagre) batteries over hundreds of hours of outdoor walking in under two years. I was ready to upgrade at the moment of the long-awaited announcement of something new, and the AirPods Pro seemed like the next serious step, boosting the quality and adding active noise cancellation, which then – before the industry at large attempted it – seemed like a serious accomplishment for something so portable.

I loved them too, at first. I even put aside my standing dislike – which has faded – of devices you insert into your ear canal.

Months later, the active noise cancelling began worsening in one ear. It didn't fade; rather, I could tell it was trying and failing to measure surrounding sound and compensate accurately. When I'd shake my head or take a step while jogging, I'd get noise instead of anti-noise: subtle sympathetic swooshes or squeaks.

Happily, the warranty plus patience took care of that. Unhappily, the same problem hinted at beginning in the other ear several months later, and this time my warranty had just expired. Not the best situation: I was left with a slightly defective, premium-priced product after just a year.

Happily, Apple instated a special replacement program for AirPods Pro exhibiting just this behaviour. Unhappily, some miscommunication led the staff to believe this second replacement occurred both outside the program and out of warranty, leaving me to make multiple calls to Apple over two months, keeping my own records and advocating for a refund of not one but two replacement charges – the second seemingly an unrelated error. I know from experience that Apple generally has their customer relations together, but if I'd been a new customer, that bungle might have been my cue to exit.

On top of that, I reflected on my time with the Pros. It's impressive to have two isolated, high-quality earpieces keeping in perfect sync – but when they occasionally lapsed, they really lapsed. Now and then, for example, when only the right or left bud would respond, I'd spend half a walk trying to disconnect and reconnect them in every way I could think of – a predic which wouldn't have been possible with single-piece headphones. All of this was enough for me to end my time with the AirPods. I cleaned them carefully, and eBay it was.

The Beats Flex had been released, and that's where my sights had landed. Costing a fifth as much as the AirPods Pro, they might seem obviously "not as good," period. But the W1 chip connects them automatically to all your Apple devices just as the AirPods Pro did – no compromise there. They're single-piece: the connecting wire sits gently on your neck, and you won't lose or damage them if they fall out – they even click together magnetically, forming a sort of necklace (which also sends a pause message, much as the Pros sensed they'd left your ears). They fit identically in the ear canal and feel fine. They're black. (At least, mine are.) They have a volume control. They don't have their own charging case, because the battery lasts many times longer – I've charged mine a handful of times over months of use. So far, no issues, no subtle defects predestined by lofty aspirations, no hassles with customer service. I found them on sale for $35. This has been way better, and I'm way happier. I enjoyed riding the hype for a little while, but I'd have missed this stroke of logic if I'd been any more fixated or less disillusioned.

So, what are the disadvantages to the Beats Flex? Beats doesn't try to cancel noise actively, but presence in the ear canal does plenty on its own. I can hear the overall difference in sound quality: the Bluetooth-like fidelity is audible, and the sound signature feels like it folds in a little more Beats-esque dolling-up of the spectrum to make it sound a little brighter and bassier than what it receives (though tastefully less than Beats was famous for doing in a previous decade). The AirPods Pro provided that luxurious, pillowy smoothness for gentle bass which the Beats Flex only imitates, but I'm out walking or jogging, for goodness' sake – not sinking into a velour-padded recliner for a vision quest.

Since this whole transition, Apple has ramped up "spatial audio," involving the gyroscope and other input to compensate for your head movements, dealing out the illusion of unmoving speakers around you. I tried watching Star Trek: TNG this way with the AirPods Pro before selling them – it was cool, but not amazing. Recently, I heard that Eddy Cue thinks this is the next big thing for music, and that the feature will arrive at the system level this year, while Apple Music tracks and tvOS add their own support for it.

If Apple thinks spatial audio is that big, then they're going to be bringing it to more people and more types of headphones in the future. I get the impression they've quietly recognized and addressed the aforementioned issues with the AirPods Pro as well. That's good, and maybe I'll give the headphone-scape another look when I have less doubt about the reliability. For now, the Beats Flex – these humble, inexpensive things – are the ones that have proved themselves, and will remain my companions.

"Momentarily."

Momentarily, derived from "momentary," has historically been used to mean "for a moment" ("the sensation is momentary," therefore "the sensation will last momentarily").

However, it's also commonly used to mean "in a moment," as in "she'll be with you momentarily."

I've heard that called a misuse. I'm not sure when it happened, but Oxford now lists that second usage with no qualification except "North American."

I observe that "North American" use regularly, and I've just noticed it in an old Columbo episode, so it can't be new. Personally, I've weaned myself from it, preferring "shortly," or "soon," or "in a moment," or even "presently" (which deserves its own post).

Does using "momentarily" to mean "soon" sound "wrong" to people on other continents?

Closing note on predics.

While my recent posts made use of the distinguished abbreviation "predic" to mean "prediction," I am aware that recent memetic tides have popularized its use to mean "predicament": certainly a delightful use in itself.

(Neither use stands for "predicate," which remains the only version of the abbreviation in the Oxford English Dictionary.)

Zelda predic inversion.

Well, my predic comically failed: I'd guessed the sequel to Breath Of The Wild would be about exploration of the underworld, and the trailer revealed its main focus would be the skies above Hyrule instead, beginning with a shot of a skydiving Link obviously inspired by Skyward Sword. But, who knows – these are vaster games than ever, and I don't guess they've played their entire hand with this look.

Another shot from a floating island shows that the distant ground below is indeed the terrain from Breath Of The Wild, which I've now spent hundreds of hours exploring. I think faithful continuity is one of the clearest indications of care about a fictional world.

Last-minute Nintendo Direct predic.

Breath Of The Wild's sequel was announced almost two years ago, and Aonuma faded into the last major Direct, supposing viewers might expect him to elaborate at last. Somehow I wasn't poised to leap and gawk, and I was right to hold off. Instead, Skyward Sword received its upconversion for Switch. (While it's HD and 60fps – and what a companion amiibo; Loftwings are the goats of birds – I somehow think I'll stick to the original version, most beloved. Amazing it's ten years old this year.)

Now, however, it's time for the unveiling. The two-year-old teaser ended with Breath Of The Wild's version of Hyrule Castle rising monstrously by several stories, as if on a gargantuan mechanical platform from beneath the earth, which comports with the original's lore.

Breath Of The Wild's soul spawned directly from the original Legend Of Zelda's in many ways, applying Miyamoto's earliest dreams for the series to the expansive potential of modern tech. What was profoundly lacking in that translation was the realization of the original game's dungeons. Yes, the series has generalized the term "dungeon" to mean any designated, mappable area of trials and tests in which a game-critical item is found, and Breath Of The Wild has those, but compared to its outdoor realm, it barely honours what was explicitly dubbed the underworld – something I've missed in every installment that seems only to tip its hat to the idea.

This sequel will be Breath Of The Wild's full and neglected embrace of the darkness and toil, the twisting passages and countless chambers, the observation-based puzzles and the calculated facets of combat.

Such is my predic.

WWDC 2021 keynote impressions.

It's Apple's second fully-remote WWDC keynote. After decades of fine-tuning the art of live broadcasts that evolved from Steve Jobs' early presentations, the strange and fresh feeling of watching Apple craft a pre-recorded event is a year old now. Like last year's, it was spliced with whimsical, artful and musical transitions around Apple Park which never happened during their live events (though, interestingly, other companies have long made a habit of it – and never in any Apple keynote have I heard "ladies and gentlemen, please welcome," which always struck me as bizarre).

No rumoured hardware announcements, which is right for WWDC. It seemed like a more evolutionary year for the OSes – handfuls of exciting features, but nothing totally new, sparkly, or in-your-face (FaceTime notwithstanding). I've missed that and found it refreshing: it's been a decade since Apple switched pace to a major release of OS X every year, rather than every… longer while. The pace of software updates today feels like a subway against a horse and cart, and it's nice to feel like you can take a breath on that timescale – enjoy using technology without worrying about what will have changed before the seasonal weather does.

A lot of stuff about holistic integration into users' lives and relationships. Set a custom quasi-"do not disturb" status which curates work-related versus personal notifications. Share your personal health data with family members or trusted friends. Get a prompt from your watch to reflect on your day. This stuff is a far cry from the calculated focus of Steve Jobs' famous four-quadrant product line: four kinds of products, all Macs, and the operating system that would run them. Even seven years later when the iPhone launched, I doubt the leadership was thinking about anything from mindfulness to measuring cardio fitness. But "Apple's grown like a weed," and their ability to work across this range of areas now reveals something that's always been traceable in their thinking: computers need never be a segregated slice of existential pie, as other companies have conceptualized them. They have their role as helpers and not masters, but their modern sophistication suits them to tasks beyond those of their simpler ancestors.

It's pretty amazing. But it's come so far, so fast, that today I finally found myself asking "wasn't this company called 'Apple Computer' just a few minutes ago, selling just those four Macs? Now they're spending ten minutes talking about home, work, focus, and reminding you to breathe? Is any of this developer conference going to be about using computers?" Happily, there would be plenty of that.

The new Safari design was a highlight I was craving. People have sharply personal browser preferences, and – though browsers are happily designed and coded around open, refined standards by several companies rather than one – they look and feel different to use. My main gripe with browsers (all the way back to Netscape) is that they feel cluttered, and I've always sought the balance of power, stability and zen-likeness that Safari has honed. Organizationally, the tab groups and tabs-in-the-main-bar design look like a radical curveball, but I suspect they're just the improvement I failed to think of. The visual tab itself is the address bar? Of course – when we use a browser's address bar, we already intend to change the destination of the currently-focused tab. Pages whose primary colours starkly flood the chrome look great. All that, plus the introduction of extensions on iOS, has me really looking forward to a new life of placid Internet journeys.

Shortcuts was previously "Workflow," a third-party app Apple decided to acquire to bring automation to iOS. It's become sophisticated enough that Federighi framed it as the future of automation on Mac, starting a transition from the beloved Automator. (Will AppleScript die a slow death or find a new life?) I'd hoped for this one last year, and I'm nerdishly excited to play around.

FaceTime seems to be doing some catching-up, acquiring features that made other apps like Zoom more popular during the pandemic: multi-platform access, ease of use on the web, screen and window sharing, and a plain old grid view. (Craig says the "Portrait Mode" feature was "inspired by" Portrait Mode photography on iPhone, though Skype has included a background-blur feature for years). However, while Skype is clunky, Zoom was haunted by security issues, and I'm not aware Discord even makes their privacy policy clear for conferencing transmissions, FaceTime will maintain its always-on end-to-end encryption. Ever since Jobs announced FaceTime would be an open standard (evidently false), its main obstacle to popularity seems to have been its usability between only Apple customers.

Where Apple might be going a step further: "SharePlay," an API for developers to integrate FaceTime with native apps on devices, for super-efficient synchronization that doesn't rely on screen sharing at all. I'll be paying attention to the possibilities this week. (All this, while "iChat Theater," which let you share any Mac file preview in-video with wonderful smoothness, seems to remain sadly forgotten.)
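
Based purely on the keynote's description of SharePlay, I'd guess the developer-facing shape looks something like this – a speculative sketch on my part, with the activity name and titles invented for illustration:

```swift
import GroupActivities

// Speculative sketch: "MovieNightActivity" and its metadata are my own
// invented examples of the newly-announced GroupActivities API.
struct MovieNightActivity: GroupActivity {
    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Movie Night"   // what participants see in FaceTime
        meta.type = .watchTogether   // asks the system to sync playback
        return meta
    }
}

// Offer the activity to the current FaceTime call; if the group accepts,
// each participant's copy of the app joins the same shared session.
func startMovieNight() async {
    do {
        _ = try await MovieNightActivity().activate()
    } catch {
        print("Couldn't start SharePlay: \(error)")
    }
}
```

If it works anything like that, the appeal is obvious: the app ships only tiny sync messages, and each device renders its own full-quality media – no screen sharing required.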

The truly magical-looking, Apple-feeling thing of the day was "Universal Control." Craig places an iPad to the right of a MacBook Pro, moves the Mac's cursor past the extreme right, and it automatically peeks into the iPad's screen. Move it a little further to confirm, and the Mac's trackpad is now controlling the iPad's cursor in iPadOS, and is free to move back and forth. He even drags an image from an iPad all the way across the MacBook Pro's screen, all the way onto an iMac sitting to its left, and drops it into a Final Cut Pro timeline. This was the first time I lamented the lack of a real audience, who would have applauded for a solid minute. I use my iPad next to my iMac every day, and that's a feature I would love, but didn't even think to hope for because it didn't seem remotely realistic. Fantastic.

Honourable mentions: lots of good privacy stuff (e.g. Mail blocks tracking pixels), a lot of detail in Maps (even for non-photography-based maps, it looks like models include traffic lights, crosswalks, various kinds of trees, and custom-modelled representations of buildings), Siri can process many voice requests on-device, and government-issued ID on the iPhone (for participating US states).

A new site for the ELT.

The supreme telescope of the decade (under construction for a few years yet) has a new home page with lots of information. One fact illustrating its stature:

Time needed to walk from the entrance of the dome to the top via the set of stairs and walkways fixed to the inner side of the dome: ~30 minutes.

Alex Trebek, knowledge and kindness.

As Jeopardy!'s year of guest hosts has progressed since his death last year, I've paid more attention to Alex Trebek's legacy. His 36 years hosting the show were only part of his career. If he had retired at news of his stage-four cancer diagnosis, his streak would have been celebrated as outstanding. Instead, he appeared on set until his final weekend, reportedly reading answers to his contestants amid frailty and pain, disguised with his usual poise. Memorable moments included a short message favouring kindness and generosity, a champion thanking Alex for teaching him English as a child, and the news he'd spent his last day as he hoped, watching the horizon with his wife. What finer final days could I wish upon myself?

I've since watched a little Classic Concentration, reruns of which individuals have surfaced on YouTube, seemingly without objection. (Fascinating that this is now "the way" to get older TV episodes. You'd think providing them would be a way for networks to convincingly illustrate good will and good values – something worth more to consumers than most companies realize; they seem to believe instead that declaring their values in advertisements is sufficient.)

Concentration was about recalling the locations of hidden matching tiles on a game board, which would incrementally reveal a rebus beneath. It was a game so simple that I thought it would be better suited to child contestants, whose main weakness would have been ignorance of the cultural staples or figures of speech that were the match-deciding rebuses' solutions. (As a child, I played Concentration for DOS enough to gain the advantage of learning and memorizing many of those expressions.) What struck me was that contestants commentated somewhat meaninglessly on their choice of tiles. Rather than "thirteen," they might say "I'd like to open up the middle of the board, so let's go with thirteen," and it was consistent enough to seem unnatural, as though explicitly encouraged. Trebek did the same kind of thing, bordering on over-commentating on those choices, or on the solution, or on a contestant's shortfall in deriving it. It felt like the improvisational churning of a person who couldn't find the intellectual depth he craved, attempted to manufacture it, and came out looking as verbose as he was transparent. (I thought this all happened before Jeopardy, but in actuality, this was one of two other shows Trebek was hosting alongside it. In any case, while interesting and entertaining, it left me unsurprised that Concentration fell away in 1991 while Jeopardy persists.)

After he died, I found "dry wit" cited as a common trait of his character. An understandable take, but a shallow one. Whatever he was promoting by staying with Jeopardy to the latest possible moment, he cared sincerely and deeply about it. You couldn't tell which answers' questions he didn't know, but considering his occasional foreign pronunciations seemed informed and his occasional elaborations seemed off-the-cuff, you got the impression he'd do well against the highest-achieving champions, with whom he seemed closer to enamoured than impressed. The question I was left asking: what is the right word for the thing he cared about? Knowledge? Scholarliness? Intellectualism? An academic mindset?

Watching Jeopardy lately, the characteristic that most struck me might be impartiality. No matter what the clue was, Trebek recited it with unwavering clarity and formality (even when categories merited casual, styled or humorous phrasing). Was a topic controversial or ghastly? It didn't seem to matter, so long as it was factual and verifiable. Was it video games, still dismissed as flippant or irrelevant to more important things in life? They've enjoyed the same moments on the blue grid as famous composers and world history. Competitive sports, inconsequential as they seem to external matters? There they are, now and then (even when the contestants reveal themselves as my type of person).

That might be the insight I've gained. Alex didn't seem to discriminate between worthy and unworthy knowledge; he just seemed to care that the mind was open, actively observing and taking note, taking interest, as much in the seemingly boring as the seemingly exciting. After all, to any one of us, the seemingly boring may one day become the genuinely exciting.

All of this seems tangential to kindness. Did Alex see some more direct link between this open-mindedness and the world he envisioned and wished to egg into existence? Here are those presuming words from one of his final shows:

We're trying to build a gentler, kinder society, and if we all pitch in just a little bit, we're gonna get there.

Standing before the Apple Silicon era.

This year is its beginning. There's no shortage of writing and reviews about Apple's first Macs (and iPad!) using their minuscule and powerful new architecture, but my own reaction is to stand back and take a breath.

It's quite a transition. The visual design and internal engineering of the Mac have always been its cornerstones, but its processors have always been found elsewhere. Every Apple nerd knows Steve described the move to Intel processors in 2005 as the logical step in their quest for more computing bang for the electrical buck. No one seems to mention that Apple "could have" moved to their own processors then, but didn't.

I enquote "could have" because perhaps they couldn't have, lacking the in-house expertise. (Could they have if Steve Wozniak had stayed as close to the company as Steve Jobs did?) That expertise appeared quietly with the iPhone, and less quietly with the first iPad's A4. Johny Srouji seemed to emerge as the needed Steve Wozniak, ultimately thrust forward as the explainer of Apple Silicon's technical philosophy to the public. The handheld devices achieved their world-class heights on a steady diet of chips high in integrated processors, graphics cores and other custom silicon – and practically free of carbohydrates – and at long last, the Mac will inherit it all.

I just read a forum post from a Mac newcomer, full of delights and gripes. "Apple should have made the RAM upgradable on these," they said, framing it like an oversight or mistake Apple would end up blushing over, apologizing for, and rescinding. Non-upgradable memory for a leading desktop computer? Admittedly, it sounds unthinkable to long-time computer users. Of course, this was no mistake: the choice was deliberate (though executives refrained on this occasion from invoking the word "courage"), and the visionaries have decided the efficiency bestowed by their fully-integrated chip systems outweighs that ability.

While I'd love to understand more about low-level computer functionality, the nature of the M1's "unified memory" is something I vaguely understand from my days experimenting with programming for dedicated graphics processors. When dealing with hundreds or thousands of shapes and colours, every relevant data point must ordinarily be copied from main memory to discrete "graphics memory" every frame. However, it's possible to copy reusable data to that graphics memory just once, up front, then submit further instructions on how to reuse it. A significant optimization, which can be critical to improving frame rates or rendering more detail. And that's great, but it's another concept to comprehend and master when your vision is so simple.

The M1 contains only one pool of shared memory, located within reach of both its main processor cores and graphics cores. There's no separate "graphics memory" at all – no copying at all. That's one hint about the advantages of making this memory non-upgradeable. It feels radical, but also seems simple enough to make that aforementioned programming routine feel arcane. And that's just one example of the kind of conceptual change to the nature of computers this first chip heralds.
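
For the curious, here's roughly what that difference looks like from a programmer's chair, as a minimal Swift sketch against Metal – my own illustration rather than anything from Apple's materials, with the vertex data invented:

```swift
import Metal

// A sketch contrasting the old copy-to-graphics-memory habit with unified
// memory. The triangle data below is invented for illustration.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable GPU found")
}

// On an M1 Mac this prints true: CPU and GPU address one pool of memory.
print("Unified memory:", device.hasUnifiedMemory)

// Three 2D vertices of a hypothetical triangle.
let vertices: [Float] = [0.0, 1.0, -1.0, -1.0, 1.0, -1.0]

// .storageModeShared keeps the buffer visible to CPU and GPU alike, so the
// "copy it to graphics memory" step simply doesn't exist. With a discrete
// GPU, the fast path would instead be a GPU-only (.private) buffer filled
// once by an explicit blit command, then reused every frame.
let buffer = device.makeBuffer(
    bytes: vertices,
    length: vertices.count * MemoryLayout<Float>.stride,
    options: .storageModeShared
)
```

One allocation, visible to everything – the optimization I once had to learn becomes the default state of affairs.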

That's the kind of thing that makes me step back and take a breath. I recall this feeling from earlier childhood, when the Amiga, with its groundbreaking graphics and audio capabilities, seemed the computer of its day. I watched adults around me get excited, resonate with humour and friendliness, recognizing the implications: possibilities to explore, sense to make, uncharted territory whose compass direction was clear. A humbling reminder that "being a computer user" was an activity absent from prior human history, that they were alive near its dawn, equipped to appreciate it together like a sunrise. We're doubly fortunate to live decades into that history, with what would have been called "supercomputers" in our pockets, but this year calls to mind that dear old mood.

For all Apple's growth, mastery and influence, it feels like they've finally grown into something primary that they hadn't before. The transition has gone well, people love the computers, and it's just the barest start. I'm assuming other companies' leaders have been blinking with some bewilderment, asking each other whether they should be doing fully-integrated desktop-class processors of their own. How would they? How could they?

That's all for now. Nothing deathly insightful to say, but I wanted to take that moment to nod at it. I'll have plenty of time to enjoy it all.