Steve Barnes' World of Happiness

"Can I Use" looks so nice.

It's not just a great resource for web developers to learn which browsers support which features, but the colour theme fits my tastes so closely. (It responds to system-wide dark mode by darkening and de-texturing the background, but the light version appeals almost as much to me, which I can't often say of sites.)

Homebrew's site is a close second.

When spelling frightens.

At best, I find spelling mistakes amusing: perhaps conjuring an unintended conversational phrase, or done purposely in humour, irony or humility. At worst, I find them irksome; say, repeated misusage out of negligence. Today, anything you type on is likely to try to correct you – indeed, to overcorrect you – so I don't see an excuse for that.

But, who am I to talk? I have the advantage of always having loved spelling. We'd have spelling homework in first grade, and I recall correcting my classmates' work before the teacher got to it. As with many things, I suppose interest in spelling correlates with skill. I've never known the feeling of disliking spelling: an aversion even to working to improve.

So, this was an interesting post for me: Aaron Moodie on his realization that his discomfort about spelling made him hesitate to write his own blog.

The fallacy of relative privation.

Someone mentioned this to me on Twitter (hooray for functional Twitter!) when I asked whether there was a name for a particular turn of conversation around a contentious topic. I'm afraid I forget the topic, but the conversation's form resembled the following: person A said "here's a problem," then person B said "that's not nearly as bad as this other problem."

What's wrong with that? Nothing, inherently. What would be wrong, though, would be for the reader to think "problem B is the worse problem, so problem A is therefore not a problem."

Indeed, I had the impression this might have been person B's implication. Strong emphasis on might, of course – I could have tried to find out by asking directly, and one of the commonest and worst mistakes available when discussing on the Internet is to jump to conclusions about what your conversational partners mean. (Instead, I raised that question about the fallacy's name.) That's also why this fallacy, like other fallacies, might be slipperier to identify: for someone aiming to use it, it could be slipped into dialogue under cover of ambiguity.

(Note: this also seems to be what I've sometimes heard Americans call "whataboutism.")

Great street-level video from Tokyo.

Continuous walking shots taken through bustling urban Japan have been a happy genre lately, but this was simply superior: super-smooth camerawork, temperate exposure, and the photographer's intuition to dwell on interesting signs and vistas satisfyingly matched up with mine. I ended up watching a whole chunk of this in one go.

When people call today the 20th anniversary of Mac OS X…

… they're referring to the shelf launch date. I wasn't so attuned to Apple's announcements and launch dates then, but I remember its aesthetic warming past the horizon of my mind, with its plastic-looking stripes and bulbousnesses. The design seemed to match the tones and materials of the first iMacs, whose beigeless and rectangle-less appearance, considered Jony Ive's big splash, would signify that Apple was back. At least, that's what Steve Jobs seemed to hope, and this statement was more than skin-deep: he'd brought NeXT software into the picture, and this would serve as the foundation of Mac OS X, later known as OS X, later known as macOS.

The resolution of the first monitor on which I viewed my own installation, I think, was either 640 by 480 or 800 by 600. It was still the age in which one could easily see the pixels, and if I focused and squinted, I could make out the red, green and blue elements. There was no escaping the feeling that everything boiled down to horizontal and vertical lines, rectangles and squares, and that was a comforting and familiar feeling. It felt harmonious with Mac OS 9 and everything before, where bitmap fonts thrived and anti-aliasing seemed a more theoretical concept. Now, throughout this plasticky paradigm, I was seeing smoothed fonts, soft round buttons, and window edges that seemed to blur into each other. A minimized window would squeeze itself down into the dock like a pair of hands contouring an exaggerated hourglass figure on a mannequin, with no trace of pixellation. I wouldn't have called myself a fluent programmer then, but I could tell something fundamentally different was happening on a technical level. That would have been obvious anyway: apps like Adobe Premiere couldn't even run on Mac OS X – at least, not natively. Booting an app in "classic mode," one of my first tastes of emulation, temporarily plopped its pixellated countenance right into this blearier-looking world.

I've since learned about what was happening technically: terms like "object-oriented," "protected memory," "preemptive multitasking," and "UNIX-based" have unmasked themselves as sources of delight and wonder. Systems can crash, data can be corrupted, and programming takes ages, but these were ways around those problems – real treatments for causes rather than symptoms, things that would excite you if this were your life.

Steve had said on stage that Mac OS X would set Apple up "for the next 20 years," and that turned out to be an understatement. I'm not sure how much proverbial DNA today's version shares with the one launched on that day; it might be like comparing Homo sapiens to Homo erectus. The plastickiness is long gone, and the iMacs the design imitated – while still in use here and there – feel more like relics than modern products. But those aforementioned concepts have stayed right at the heart of the system, having seen it across 17 major releases and three different companies' processor architectures. No sign it's about to end.

(Oh, and Scott says hi.)

Apple discontinues the original HomePod.

Here's the breaking report from TechCrunch.

The HomePod was one of those rarer things I decided to adopt early, and I wrote a few first impressions when I switched to two HomePod minis after using the original HomePod for a couple of years. In summary: the HomePod mini sounded a little cheaper by comparison, but I stress "by comparison." The original's sound was so "good" – deliberately in quotation marks – as to sound almost alien, inspiring worry I'd disturb my adjacent neighbours with its pristine but unadjustable bass.

It didn't take me long to forget the relative cheapness. Now the Minis just sound good to me (without quotation marks, and with pluralization, since I bought a second Mini before long). And that's the main thing: no matter how good the original got, I think I've confirmed through this transition that stereo anything is better than mono everything, so to speak. (At retail, stereo Minis cost 200 USD, and stereo originals cost 600 even after Apple's first price cut.)

The original was a heartfelt experiment, years in the making. I'm a little surprised it ended this way, but the ending helps confirm that my subjective perception of audio quality may not be so far from some allegedly objective ideal. As John Cleese once said of wine, "don't let anyone tell you what wine you should like."

Music from the outside.

It's been about a year since I played much music, which hasn't happened since my first piano lesson, before I was 2,000 days old. From my earliest hours spent learning the Super Mario Bros. theme by ear, to composing the Frontiers soundtrack and working on over 85 musical stage shows, the piano has been much of my leisure and livelihood for decades.

I'd come to yearn to explore other things with the same ardency and leisure, and I wanted to shelve music long enough to forget about it. Not to disown it, but to get an opportunity to see it from the outside again, as non-musicians do. It's the kind of perspective I might want to recapture on the English language: a detachment some monolinguals might barely think to imagine, not even to their deaths.

This week I put on the headphones and improvised for a while. I wasn't really struck until afterward, and what struck me was that it hadn't been a while, but just a few minutes. I'm not sure whether playing had always done that for me, or whether it was plainer because I'd done it less. I was reminded of the rather loaded word "meditation" – perhaps loaded because its users seem to feel little pressure to define it, and because at least one usage refers to a pretty natural state of not recalling all one's concerns at once, which doesn't call for a fancy word. A counterintuitive observation, though: a few minutes spent making minutes seem longer is useful amid a routine where someone feels they can't afford a few minutes for anything.

I've seen composers' lives analyzed into phases, where their body of work seems to change with personal events or revelations. I never aim to change for change's sake, but having gained a small dose of the perspective I wanted, I think I understand that phenomenon a little better. If this all yields some change for me, I hope it feels more like it always should have.

"Right Up Our Alley"

I expect to see this shared a little. Some super-cool… drone photography?… which sort of makes you feel like an invisible entity who can fly around a place.

The enduringly clumsy state of e-mail.

John Gruber, reminding us that e-mail tracking pixels are still common practice, and that, since the arrival of HTML support, e-mail clients have been much like web browsers, except without the modern and sophisticated security features.

I think this really is a reminder. While Apple, Google and Mozilla have all contributed to the effort to make web browsers feel more like sentinel guardians of the user, the only companies I've seen attempting to make e-mail similarly modern, secure and serene are lesser-known ones, like Hey and ProtonMail. Their goals aren't to change the entire e-mail industry, but to wrap individual users in a protective barrier, so to speak.

Otherwise, e-mail feels pretty much the way it did 20 years ago. It works and I use it, but it remains pretty clunky and archaic-feeling, so I only use it when I have to. I disable image loading by default. I only type in plain text. I barely get any unwanted messages, mainly because I spend as much time as necessary to manually unsubscribe from everything I can.

And 20 years later, its oldest, simplest-to-exploit form of user tracking is apparently still the most popular.
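For the curious, the mechanism is simple enough to sketch: a tracking pixel is just a remote image whose URL carries a per-recipient token, and the sender learns you opened the message the moment your client fetches it. Here's a minimal illustration in Python, where the tracker domain, token and addresses are all hypothetical:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Hypothetical per-recipient token; in a real campaign it would
# uniquely identify the recipient.
token = "abc123"

# The "pixel": a 1x1 remote image whose URL carries the token.
html = (
    "<p>Hello!</p>"
    f'<img src="https://tracker.example.com/pixel.gif?id={token}" '
    'width="1" height="1" alt="">'
)

msg = MIMEMultipart("alternative")
msg["Subject"] = "Newsletter"
msg["To"] = "reader@example.com"
msg.attach(MIMEText("Hello!", "plain"))  # plain-text fallback
msg.attach(MIMEText(html, "html"))       # the tracked HTML part

# An HTML-capable client that loads remote images will request
# pixel.gif, telling tracker.example.com when (and roughly where)
# the message was opened. Refusing to load remote images defeats it.
```

No script, no exploit – just an image request, which is partly why the trick has endured for two decades.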

Science Week 2021 – done and done.

Happy landing.

(Science Week, Day 7!)

It’s the last day of Science Week, fellow humans – thank you for joining me.

There can be only one topic for today.

Mario and the future of augmented reality.

(Science Week, Day 6!)

Today’s is a nod not to the process of science, but to the application – by hard-working and creative people – of the knowledge it’s bestowed, which has always fallen well within the scope of Science Week. (Even Oculus’ Michael Abrash has the title “chief scientist,” and I don’t suppose he’s making discoveries so much as helping apply them.)

I've mentioned before that I casually predicted the nature of the now relatively imminent wave of augmented reality as a runner-up in a Nintendo Power contest in 1996. My notion: a head-worn device would map the room using lasers, store the information as a model, and create the illusion before the wearer’s eyes that game elements were milling about the space. 25 years later, I’m not sure there’s any aspect of that I got wrong. As Apple’s first AR device becomes increasingly rumoured and evidenced, “LiDAR” is the abbreviated name for the sensors which can currently model rooms on the iPhone and iPad, using none other than laser light.

Meanwhile, Mario Kart Live: Home Circuit arrived last year from Nintendo: a more modest but clever and heartfelt application of sensing and tracking technology to remote-controlled racers which broadcast to your console’s display. As you control Mario or Luigi’s kart with your thumbs and watch the race on the display as usual, their on-board cameras feed back their view of the room, superimposing obstacles and guard rails. It’s a gorgeous way to convey an imaginary experience which has only just become practical to render.

A number of companies made serious advances in VR and AR over the last decade, and I think we have every reason to suppose these are the final years of a world without digitally enhanced windows on our own world, and other imaginary realms.