Incidentally, I took note of the word "bivalve" when sharing the definition of "cockle" earlier – it just refers to a mollusc with a hinged shell.
It reminded me of the word "bilabial" (any sound involving the lips), which reminded me mainly of the linguistic term "bilabial trill": that vibrational, motorboat-like pronunciation. (Other common trills include the lingual trill, or rolling your "r," and the guttural trill, heard when gargling with or without liquid.)
Anyway, this inspired me to consider the possibility of a "bivalvial trill": the hypothetical vibration of a hinged shell. I'm not sure whether this has ever existed in nature, but now that I've wondered about it, I'd be especially enthused to know it does.
I've just enjoyed the 2016 film Snowden, a lengthy, fact-based drama about someone I already thought a remarkable person.
The captioner knew what he or she was doing: when a character said "sequel," the transcription appropriately read "SQL," a database-related language. What else was there? "Siggint?" That would be "SIGINT," more digital security jargon. Those were satisfying to see throughout, but for a movie costing 40 million dollars, it would have been embarrassing to miss a single one.
Then, up popped this:
Everyday, I go in and I get starting points for SIGINT…
I posted this a few days ago. Tell me my introductory assessment is wrong.
I'd only watched The Dark Crystal (the original film from 1982) recently, on something of a quest to unearth any well-crafted fictional spectacles I'd missed growing up. I knew about it, but the trailer never enticed me – it felt like something "other people liked." In the modern dearth of moods and flavours of the cinematic past, though, its impressionistic orchestral score and brooding announcer (more like a gritty Vincent Price than a gritty James Earl Jones) persuaded me to give it a chance. Whether I happened to like it seemed vanishingly relevant as I watched the astronomical level of craft and care evident in every crevice of every shot. It may be niche, but niche implies deep appreciation, and I think the Netflix-commissioned episodic prequel from 2019 was meant to leverage that appreciation decades later.
It was simply great to see, in this era, the bulk of a show's credits recognizing physical craftspeople and puppeteers. It feels like the second-most populous creditees were actors. But a character's voice artist was credited alongside its puppeteer, with no further specificity. This character was played by these two people, period. Excellent. (What wasn't excellent was that Netflix insisted on shrinking the video to a corner the moment each episode's credits began, throwing a wrench into that recognition as well as one of any episode's most carefully-considered transitional moments.)
Sets and puppets. While these physical and real things provided most of the filmed visuals in the manner of the original film, Age Of Resistance, though seeming to resist over-indulgence in this area, still leaned too heavily on CG effects for my liking, with unreined use of swirls, sparkles, and fully-rendered characters that I wish had been puppets. A sentient rock monster and a clan of arachnoids were fully rendered, and I have no doubt that decades ago, these would have been championed with unwavering confidence and expertise by the Jim Henson troupe. After all, a calm and confident "these are puppets, get over it" seemed the prevailing conceit of this universe, and the highly varied Gelflings, Skeksis and Mystics were put forth as fully alive, fully sentient beings in a world that evolved in total disconnection from the familiarity of humanity. This frees you to watch and imagine how these beings could have evolved on (or off) their planet.
The voice cast seems to embrace this. The characters feel sincere and untethered to Hollywood-style clichés, tones or inflections – perhaps untethered even to the script, as offhand remarks, mutters, and inquiring hums or implicating grunts often overlap the prominent speaker and each other. It lends itself to apparent realness in a way today's directors seldom seem courageous enough to allow.
Weaknesses, aside from the heavy-handed computer-generated visuals, include the cinematography, and, unfortunately, the music. The music is "movie-quality," which isn't saying much today. In terms of music theory (which is perceptible intellectually by musicians and intuitively by laypeople), it's the opposite of bold, and pales in comparison to the score of the 1982 film. It has themes, but no themes, if you will. Watching the 1982 trailer is a better musical experience than watching the new series in its entirety.
Today's directors, I recently thought, seem to think they're in Speed, terrified the production will explode if the camera drops below 50 miles per hour. Sadly, the boldness of Age Of Resistance is sullied by this fear as well. If its sets and characters were a museum exhibit, children and adults would stare at single scenes for minutes at a time and not want to leave, whereas I found myself lucky to regard its stunning vistas, dwellings, and coves for even two seconds. And again (honestly, why?), for simple exchanges between hero or villain conspirators, the camera can't resist circling as though we're watching them from a temperamental merry-go-round. In fact, the show's overall pace seemed equally rushed, even hampering the absorbability of the pilot's crucial exposition. Among the ten episodes, there are two scenes I know I'll remember: one where two frightened main characters work through a language barrier toward a hesitant alliance, and one where another two primaries expose the world's origin story by performing an elaborate puppet show for two others. (So much lends life to these characters, but revealing their artistic whims this way was a bright star on that tree.) Of course the camera cut when appropriate, but refreshingly, these were scenes in which it was freed from the almost neurotic-feeling perfectionism that so often seized it. Here, it was not liberally dollied, shaken, thrown around, flipped, tilted, blended, cement-mixed, or fired into a wormhole. Would that these scenes had inspired all the others.
For all those weaknesses, so much was achieved. I don't see myself watching the series again, but I may return to the original film for its inspired music, its room to breathe, and the spectacle it was designed to offer a theatrical audience. While Age Of Resistance set itself up for a second season, and I might have watched that, the show wasn't renewed.
It was those unusual and courageous traits of Age Of Resistance – the traits that made The Dark Crystal shine – that made me want to write about it. I'd like to see TV and movies overcome their nervous, less courageous tendencies and refocus on achieving their greatness by embracing choices inspired not by what already exists, but by the creator's imagination and gut.
The common mistake is to use "everyday" where you should use "every day." People make it when texting. A dozen businesses in my town make it on their store signs, menus, and painstakingly hand-painted window messaging. Corporations make it right in expensive TV commercials and billboards. But worse than that is the indifference: the habit spans the decades because it's seldom criticized or even recognized.
Yet, solving this requires no complex flow chart. It's simple. Here goes.
"Everyday," as one word, is an adjective. It means "happening every day" or "commonplace," as in "your everyday stroll," "his everyday routine," or "your simple, everyday plyin' shears." (Yes, plyin' shears. If you haven't plied with shears, then they may not be your simple, everyday plyin' shears.)
"Every day" is a common phrase, but it's just an everyday case of two words that happen to be used together. "Every" is a determiner, and it specifies the noun "day," as in "Billiam visited the Heckin' Café every day. Not just some days, not once every four days, but every day." Grammatically, "every day" is identical to "each pear," "no volcano," "some planets," or "that clam." Unsurprisingly, each of those examples is a two-word phrase. Even if you didn't know what a determiner was, your intuition likely lets you feel the relationship of a phrase's first word to its second, and you can feel that relationship between "every" and "day."
This illustrates the mistake's impact on readers: once you understand the difference, reading "we're open everyday" feels like reading "Francine went to the orchard and inspected eachpear," "Jess Phoenix decided novolcano should go unstudied," "perhaps someplanets have atmospheres like Earth's," or "if only thatclam could win the World's Finest Clam award." All who misuse "everyday" are constantly doing that without realizing it.
That's it. If you missed it in grade school, two or three minutes to learn the difference and you're set for life.
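In fact, the rule is simple enough to sketch as a toy checker. This is just an illustrative heuristic, not a real grammar tool: since "everyday" is an adjective, it should normally be followed by a noun, so flagging occurrences that are followed by punctuation or nothing at all catches the commonest misuse (like "we're open everyday").

```python
import re

def flag_everyday(text: str) -> list[str]:
    """Toy heuristic: flag "everyday" when no word follows it.

    An adjective wants a following noun, so "everyday" at the end of
    a sentence (or string) is probably a misspelling of "every day".
    """
    findings = []
    # Negative lookahead: match only when "everyday" is NOT followed
    # by whitespace and another word.
    for m in re.finditer(r"\beveryday\b(?!\s+\w)", text, re.IGNORECASE):
        # Keep a little leading context to show where it happened.
        findings.append(text[max(0, m.start() - 20):m.end()])
    return findings

print(flag_everyday("We're open everyday."))        # one finding
print(flag_everyday("Just your everyday stroll."))  # no findings
```

A real checker would need part-of-speech tagging, of course; "everyday" directly before a comma inside a list, for instance, would fool this one. But as a two-line regex, it catches the store-sign case.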
The Oxford dictionary:
1. an edible burrowing bivalve mollusc with a strong ribbed shell.
Why, yes. The definition includes "edible."
I haven't searched every related definition, but I checked the definitions of "human," "ape," and "mammal," and – counter-intuitively, perhaps – found "edible" in none of them.
(Note: I use it routinely regardless, but in this case, I used the "Oxford comma" slightly more consciously.)
(Note: I've observed that the BBC phrases some headlines as questions, perhaps to keep them journalistic, by their own estimation.)
Not only is it a great resource for web developers to learn which browsers support which features, but its colour theme fits my tastes closely. (It responds to system-wide dark mode by darkening and de-texturing the background, but the light version appeals to me almost as much, which I can't often say of sites.)
Homebrew's site is a close second.
At best, I find spelling mistakes amusing: perhaps they conjure an unintended conversational phrase, or were made purposely in humour, irony or humility. At worst, I find them irksome: say, repeated misuse out of negligence. Today, anything you type on is likely to try to correct you (indeed, to overcorrect you), so I don't see an excuse for that.
But, who am I to talk? I have the advantage of always having loved spelling. We'd have spelling homework in first grade, and I recall correcting my classmates' work before the teacher got to it. As with many things, I suppose interest in spelling correlates with skill. I've never known the feeling of disliking spelling: an aversion even to working to improve.
So, this was an interesting post for me: Aaron Moodie on his realization that his discomfort with spelling made him hesitate to write his own blog.
Someone mentioned this to me on Twitter (hooray for functional Twitter!) when I asked whether there was a name for a particular turn of conversation around a contentious topic. I'm afraid I forget the topic, but the conversation's form resembled the following: person A said "here's a problem," then person B said "that's not nearly as bad as this other problem."
What's wrong with that? Nothing, inherently. What would be wrong, though, is if the reader concluded "problem B is the worse problem, so problem A is therefore not a problem."
Indeed, I had the impression this might have been person B's implication. Strong emphasis on might, of course: I could have tried to find out by asking directly, and one of the commonest and worst mistakes in Internet discussion is jumping to conclusions about what your conversational partners mean. (Instead, I raised that question about the fallacy's name.) That's also why this fallacy, like other fallacies, might be slipperier to identify: someone aiming to use it could slip it into dialogue under cover of ambiguity.
(Note: this also seems to be what I've sometimes heard Americans call "whataboutism.")
Continuous walking shots taken through bustling urban Japan have been a happy genre lately, but this was simply superior: super-smooth camerawork, temperate exposure, and the photographer's intuition to dwell on interesting signs and vistas satisfyingly matched up with mine. I ended up watching a whole chunk of this in one go.
… They're referring to the shelf launch date. I wasn't so attuned to Apple's announcements and launch dates then, but I remember its aesthetic warming past the horizon of my mind, with its plastic-looking stripes and bulbousnesses. The design seemed to match the tones and materials of the first iMacs, whose beigeless and rectangle-less appearance, considered Jony Ive's big splash, would signify that Apple was back. At least, that's what Steve Jobs seemed to hope, and this statement was more than skin-deep: he'd brought NeXT software into the picture, and this would serve as the foundation of Mac OS X, later known as OS X, later known as macOS.
The resolution of the first monitor on which I viewed my own installation, I think, was either 640 by 480 or 800 by 600. It was still the age in which one could easily see the pixels, and if I focused and squinted, I could make out the red, green and blue elements. There was no escaping the feeling that everything boiled down to horizontal and vertical lines, rectangles and squares, and that was a comforting and familiar feeling. It felt harmonious with Mac OS 9 and everything before, where bitmap fonts thrived and anti-aliasing seemed a more theoretical concept. Now, throughout this plasticky paradigm, I was seeing smoothed fonts, soft round buttons, and window edges that seemed to blur into each other. A minimized window would squeeze itself down into the dock like a pair of hands contouring an exaggerated hourglass figure on a mannequin, with no trace of pixellation. I wouldn't have called myself a fluent programmer then, but I could tell something fundamentally different was happening on a technical level. That would have been obvious anyway: apps like Adobe Premiere couldn't even run on Mac OS X – at least, not natively. Booting an app in "classic mode," one of my first tastes of emulation, temporarily plopped its pixellated countenance right into this blearier-looking world.
I've since learned about what was happening technically: terms like "object-oriented," "protected memory," "preemptive multitasking," and "UNIX-based" have unmasked themselves as sources of delight and wonder. Systems can crash, data can be corrupted, and programming takes ages, but these concepts were ways around those problems, real treatments for causes rather than symptoms, things that would excite you if this were your life.
Steve had said on stage that Mac OS X would set Apple up "for the next 20 years," and he turned out to have understated. I'm not sure how much proverbial DNA today's version shares with the one launched on that day; it might be like comparing Homo sapiens to Homo erectus. The plastickiness is long gone, and the iMacs they imitate – while still in use here and there – feel more like relics than modern products. But those aforementioned concepts have stayed right at the heart of the system, having seen it across 17 major releases and three different companies' processor architectures. No sign it's about to end.
(Oh, and Scott says hi.)
Here's the breaking report from TechCrunch.
The HomePod was one of those rarer things I decided to adopt early, and I wrote a few first impressions when I switched to two HomePod minis after using the original HomePod for a couple of years. In summary: the HomePod mini sounded a little cheaper by comparison, but I stress "by comparison." The original's sound was so "good" (deliberately in quotation marks) as to sound almost alien, inspiring worry I'd disturb my adjacent neighbours with its pristine but unadjustable bass.
It didn't take me long to forget the relative cheapness. Now the Minis just sound good to me (without quotation marks, and with pluralization, since I bought a second Mini before long). And that's the main thing: no matter how good the original got, I think I've confirmed through this transition that stereo anything is better than mono everything, so to speak. (At retail, stereo Minis cost 200 USD, and stereo originals cost 600 after Apple's first price cut.)
The original was a heartfelt experiment, years in the making. I'm a little surprised it ended this way, but the ending helps confirm that my subjective perception of audio quality may not be so far from some allegedly objective ideal. As John Cleese once said of wine, "don't let anyone tell you what wine you should like."
I've now gone about a year without playing much music, something that hadn't happened since my first piano lesson, before I was 2,000 days old. From my earliest hours spent learning the Super Mario Bros. theme by ear to composing the Frontiers soundtrack and working on over 85 musical stage shows, the piano has been much of my leisure and livelihood for decades.
I'd come to yearn to explore other things with the same ardency and leisure, and I wanted to shelve music long enough to forget about it. Not to disown it, but to get an opportunity to see it from the outside again, as non-musicians do. It's the kind of perspective I might want to recapture on the English language: a detachment some monolinguals might barely think to imagine, not even to their deaths.
This week I put on the headphones and improvised for a while. I wasn't really struck until afterward, and what struck me was that it hadn't been a while, but just a few minutes. I'm not sure whether playing had always done that for me, or whether it was plainer because I'd done it less. I was reminded of the rather loaded word "meditation" – perhaps loaded because its users seem to feel little pressure to define it, and because at least one usage refers to a pretty natural state of not recalling all one's concerns at once, which doesn't call for a fancy word. A counterintuitive observation, though: a few minutes spent making minutes seem longer is useful amid a routine where someone feels they can't afford a few minutes for anything.
I've seen composers' lives analyzed into phases, where their body of work seems to change with personal events or revelations. I never aim to change for change's sake, but having gained a small dose of the perspective I wanted, I think I understand that phenomenon a little better. If this all yields some change for me, I hope it feels more like it always should have.