Filters

My phone rang today. I didn’t recognise the number so although I pressed the big button to answer the call, I didn’t say anything.

I didn’t say anything because usually when I get a call from a number I don’t know, it’s some automated spam. If I say nothing, the spam voice doesn’t activate.

But sometimes it’s not a spam call. Sometimes after a few seconds of silence a human at the other end of the call will say “Hello?” in an uncertain tone. That’s the point when I respond with a cheery “Hello!” of my own and feel bad for making this person endure those awkward seconds of silence.

Those spam calls have made me so suspicious that real people end up paying the price. False positives caught in my spam-detection filter.

Now it’s happening on the web.

I wrote about how Google search, Bing, and Mozilla Developer Network are squandering trust:

Trust is a precious commodity. It takes a long time to build trust. It takes a short time to destroy it.

But it’s not just limited to specific companies. I’ve noticed more and more suspicion related to any online activity.

I’ve seen members of a community site jump to the conclusion that a new member’s pattern of behaviour was a sure sign that this was a spambot. But it could just as easily have been the behaviour of someone who isn’t neurotypical or who doesn’t speak English as their first language.

Jessica was looking at some pictures on an AirBnB listing recently and found herself examining some photos that seemed a little too good to be true, questioning whether they were in fact output by some generative tool.

Every email that lands in my inbox is like a little mini Turing test. Did a human write this?

Our guard is up. Our filters are activated. Our default mode is suspicion.

This is most apparent with web search. We’ve always needed to filter search results through our own personal lenses, but now it’s like playing whack-a-mole. First we have to find workarounds for avoiding slop, and then when we click through to a web page, we have to evaluate whether it’s been generated by some SEO spammer making full use of the new breed of content-production tools.

There’s been a lot of hand-wringing about how this could spell doom for the web. I don’t think that’s necessarily true. It might well spell doom for web search, but I’m okay with that.

Back before its enshittification—an enshittification that started even before all the recent AI slop—Google solved the problem of accurate web searching with its PageRank algorithm. Before that, the only way to get to trusted information was to rely on humans.

Humans made directories like Yahoo! or DMOZ where they categorised links. Humans wrote blog posts where they linked to something that they, a human, vouched for as being genuinely interesting.

There was life before Google search. There will be life after Google search.

Look, there’s even a new directory devoted to cataloguing blogs: websites made by humans. Life finds a way.

All of the spam and slop that’s making us so suspicious may end up giving us a new appreciation for human curation.

It wouldn’t be a straightforward transition to move away from search. It would be uncomfortable. It would require behaviour change. People don’t like change. But when needs must, people adapt.

The first bit of behaviour change might be a rediscovery of bookmarks. It used to be that when you found a source you trusted, you bookmarked it. Browsers still have bookmarking functionality but most people rely on search. Maybe it’s time for a bookmarking revival.

A step up from that would be using a feed reader. In many ways, a feed reader is a collection of bookmarks, but all of the bookmarks get polled regularly to see if there are any updates. I love using my feed reader. Everything I’ve subscribed to in there is made by humans.

The ultimate bookmark is an icon on the homescreen of your phone or in the dock of your desktop device. A human source you trust so much that you want it to be as accessible as any app.

Right now the discovery mechanism for that is woeful. I really want that to change. I want a web that empowers people to connect with other people they trust, without any intermediary gatekeepers.

The evangelists of large language models (who may coincidentally have invested heavily in the technology) like to proclaim that a slop-filled future is inevitable, as though we have no choice, as though we must simply accept enshittification as though it were a force of nature.

But we can always walk away.

Applying the four principles of accessibility

Web Content Accessibility Guidelines—or WCAG—looks very daunting. It’s a lot to take in. It’s kind of overwhelming. It’s hard to know where to start.

I recommend taking a deep breath and focusing on the four principles of accessibility. Together they spell out the cutesy acronym POUR:

  1. Perceivable
  2. Operable
  3. Understandable
  4. Robust

A lot of work has gone into distilling WCAG down to these four guidelines. Here’s how I apply them in my work…

Perceivable

I interpret this as:

Content will be legible, regardless of how it is accessed.

For example:

  • The contrast between background and foreground colours will meet the ratios defined in WCAG 2.
  • Content will be grouped into semantically-sensible HTML regions such as navigation, main, footer, etc.
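
That second point might look something like this in the markup (a minimal sketch; the content is made up):

<body>
  <header>Site banner</header>
  <nav>Site navigation</nav>
  <main>The unique content of this page</main>
  <footer>Site-wide links and small print</footer>
</body>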

Operable

I interpret this as:

Core functionality will be available, regardless of how it is accessed.

For example:

  • I will ensure that interactive controls such as links and form inputs will be navigable with a keyboard.
  • Every form control will be labelled, ideally with a visible label.
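
A sketch of that second point, with a visible label explicitly tied to its control:

<label for="email">Email address</label>
<input id="email" name="email" type="email">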

Understandable

I interpret this as:

Content will make sense, regardless of how it is accessed.

For example:

  • Images will have meaningful alternative text.
  • I will make sensible use of heading levels.
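
Sketching both of those points together (the image and headings are made up for illustration):

<h1>Irish music sessions in Brighton</h1>
<img src="/images/session.jpg" alt="Musicians playing fiddles and banjos around a pub table.">
<h2>Weekly sessions</h2>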

This is where it starts to get quite collaborative. Working at an agency, there will be some parts of website creation and maintenance that will require ongoing accessibility knowledge even when our work is finished.

For example:

  • Images uploaded through a content management system will need sensible alternative text.
  • Articles uploaded through a content management system will need sensible heading levels.

Robust

I interpret this as:

Content and core functionality will still work, regardless of how it is accessed.

For example:

  • Drop-down controls will use the HTML select element rather than a more fragile imitation.
  • I will only use JavaScript to provide functionality that isn’t possible with HTML and CSS alone.
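
A sketch of that first point; a native drop-down needs no JavaScript at all:

<label for="instrument">Instrument</label>
<select id="instrument" name="instrument">
  <option>Fiddle</option>
  <option>Mandolin</option>
  <option>Concertina</option>
</select>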

If you’re applying a mindset of progressive enhancement, this part comes for free. If you take a different approach, you’re going to have a bad time.

Taken together, these four guidelines will get you very far without having to dive too deeply into the rest of WCAG.

Trust

In their rush to cram in “AI” “features”, it seems to me that many companies don’t actually understand why people use their products.

Google is acting as though its greatest asset is its search engine. Same with Bing.

Mozilla Developer Network is acting as though its greatest asset is its documentation. Same with Stack Overflow.

But their greatest asset is actually trust.

If I use a search engine I need to be able to trust that the filtering is good. If I look up documentation I need to trust that the information is good. I don’t expect perfection, but I also don’t expect to have to constantly be thinking “was this generated by a large language model, and if so, how can I know it’s not hallucinating?”

“But”, the apologists will respond, “the results are mostly correct! The documentation is mostly true!”

Sure, but as Terence puts it:

The intern who files most things perfectly but has, more than once, tipped an entire cup of coffee into the filing cabinet is going to be remembered as “that klutzy intern we had to fire.”

Trust is a precious commodity. It takes a long time to build trust. It takes a short time to destroy it.

I am honestly astonished that so many companies don’t seem to realise what they’re destroying.

Speculation rules and fears

After I wrote positively about the speculation rules API I got an email from David Cizek with some legitimate concerns. He said:

I think that this kind of feature is not good, because someone else (web publisher) decides that I (my connection, browser, device) have to do work that very often is not needed. All that blurred by blackbox algorithm in the browser.

That’s fair. My hope is that the user will indeed get more say, whether that’s at the level of the browser or the operating system. I’m thinking of a prefers-reduced-data setting, much like prefers-color-scheme or prefers-reduced-motion.
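
If such a setting existed, stylesheets could then defer to it. A sketch, assuming the proposed prefers-reduced-data media feature from Media Queries Level 5 ships:

/* Only download the heavy hero image if the user hasn't asked for less data */
@media (prefers-reduced-data: no-preference) {
  .hero {
    background-image: url('/images/hero.jpg');
  }
}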

But this issue isn’t something new with speculation rules. We’ve already got service workers, which allow the site author to unilaterally declare that a bunch of pages should be downloaded.
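
That can be as simple as this sketch of an install handler (the cache name and file paths are made up):

// In the service worker script
self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open('site-v1').then((cache) =>
      cache.addAll([
        '/',
        '/chapter1/',
        '/chapter2/'
      ])
    )
  );
});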

I’m doing that for Resilient Web Design—when you visit the home page, a service worker downloads the whole site. I can justify that decision to myself because the entire site is still smaller in size than one article from Wired or the New York Times. But still, is it right that I get to make that call?

So I’m very much in favour of browsers acting as true user agents—doing what’s best for the user, even in situations where that conflicts with the wishes of a site owner.

Going back to speculation rules, David asked:

Do we really need this kind of (easily turned to evil) enhancement in the current state of (web) affairs?

That question could be asked of many web technologies.

There’s always going to be a tension with any powerful browser feature. The more power it provides, the more it can be abused. Animations, service workers, speculation rules—these are all things that can be used to improve websites or they can be abused to do things the user never asked for.

Or take the elephant in the room: JavaScript.

Right now, a site owner can link to a JavaScript file that’s tens of megabytes in size, and the browser has no alternative but to download it. I’d love it if users could specify a limit. I’d love it even more if browsers shipped with a default limit, especially if that limit is related to the device and network.

I don’t think speculation rules will be abused nearly as much as client-side JavaScript is already abused.

My approach to HTML web components

I’ve been deep-diving into HTML web components over the past few weeks. I decided to refactor the JavaScript on The Session to use custom elements wherever it made sense.

I really enjoyed doing this, even though the end result for users is exactly the same as before. This was one of those refactors that was for me, and also for future me. The front-end codebase looks a lot more understandable and therefore maintainable.

Most of the JavaScript on The Session is good ol’ DOM scripting. Listen for events; when an event happens, make some update to some element. It’s the kind of stuff we might have used jQuery for in the past.

Chris invoked Betteridge’s law of headlines recently by asking Will Web Components replace React and Vue? I agree with his assessment. The reactivity you get with full-on frameworks isn’t something that web components offer. But I do think web components can replace jQuery and other approaches to scripting the DOM.

I’ve written about my preferred way to do DOM scripting: event.target.closest. One of the advantages to that approach is that even if the DOM gets updated—perhaps via Ajax—the event listening will still work.

Well, this is exactly the kind of thing that custom elements take care of for you. The connectedCallback method gets fired whenever an instance of the custom element is added to the document, regardless of whether that’s in the initial page load or later in an Ajax update.
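
A minimal sketch of that shape (the element name is made up):

customElements.define('my-enhancement', class extends HTMLElement {
  connectedCallback() {
    // Fires on initial page load and after any Ajax insertion
    this.addEventListener('click', (event) => {
      // respond to the event here
    });
  }
});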

So my client-side scripting style has updated over time:

  1. Adding event handlers directly to elements.
  2. Adding event handlers to the document and using event.target.closest.
  3. Wrapping elements in a web component that handles the event listening.

None of these progressions were particularly ground-breaking or allowed me to do anything I couldn’t do previously. But each progression improved the resilience and maintainability of my code.

Like Chris, I’m using web components to progressively enhance what’s already in the markup. In fact, looking at the code that Chris is sharing, I think we may be writing some very similar web components!

A few patterns have emerged for me…

Naming custom elements

Naming things is famously hard. Every time you make a new custom element you have to give it a name that includes a hyphen. I settled on the convention of using the first part of the name to echo the element being enhanced.

If I’m adding an enhancement to a button element, I’ll wrap it in a custom element that starts with button-. I’ve now got custom elements like button-geolocate, button-confirm, button-clipboard and so on.
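
In the markup, that looks something like this (a sketch rather than the actual code from The Session):

<button-clipboard>
  <button type="button">Copy to clipboard</button>
</button-clipboard>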

Likewise if the custom element is enhancing a link, it will begin with a-. If it’s enhancing a form, it will begin with form-.

The name of the custom element tells me how it’s expected to be used. If I find myself wrapping a div with button-geolocate I shouldn’t be surprised when it doesn’t work.

Naming attributes

You can use any attributes you want on a web component. You made up the name of the custom element and you can make up the names of the attributes too.

I’m a little nervous about this. What if HTML ends up with a new global attribute in the future that clashes with something I’ve invented? It’s unlikely but it still makes me wary.

So I use data- attributes. I’ve already got a hyphen in the name of my custom element, so it makes sense to have hyphens in my attributes too. And by using data- attributes, the browser gives me automatic reflection of the value in the dataset property.

Instead of getting a value with this.getAttribute('maximum') I get to use this.dataset.maximum. Nice and neat.
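
Putting that together, a made-up button-counter element might look like this:

<button-counter data-maximum="10">
  <button type="button">Add one</button>
</button-counter>

customElements.define('button-counter', class extends HTMLElement {
  connectedCallback() {
    // data-maximum is automatically reflected in the dataset property
    const maximum = Number(this.dataset.maximum);
    console.log(maximum); // 10
  }
});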

The single responsibility principle

My favourite web components aren’t all-singing, all-dancing powerhouses. Rather they do one thing, often a very simple thing.

Here are some examples:

  • Jason’s aria-collapsable for toggling the display of one element when you click on another.
  • David’s play-button for adding a play button to an audio or video element.
  • Chris’s ajax-form for sending a form via Ajax instead of a full page refresh.
  • Jim’s user-avatar for adding a tooltip to an image.
  • Zach’s table-saw for making tables responsive.

All of those are HTML web components in that they extend your existing markup rather than JavaScript web components that are used to replace HTML. All of those are also unambitious by design. They each do one thing and one thing only.

But what if my web component needs to do two things?

I make two web components.

The beauty of custom elements is that they can be used just like regular HTML elements. And the beauty of HTML is that it’s composable.

What if you’ve got some text that you want to be a level-three heading and also a link? You don’t bemoan the lack of an element that does both things. You wrap an a element in an h3 element.

The same goes for custom elements. If I find myself adding multiple behaviours to a single custom element, I stop and ask myself if this should be multiple custom elements instead.

Take some of those button- elements I mentioned earlier. One of them copies text to the clipboard, button-clipboard. Another throws up a confirmation dialog to complete an action, button-confirm. Suppose I want users to confirm when they’re copying something to their clipboard (not a realistic example, I admit). I don’t have to create a new hybrid web component. Instead I wrap the button in the two existing custom elements.
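
The composed markup would look something like this sketch:

<button-confirm>
  <button-clipboard>
    <button type="button">Copy to clipboard</button>
  </button-clipboard>
</button-confirm>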

Rather than having a few powerful web components, I like having lots of simple web components. The power comes with how they’re combined. Like Unix pipes. And it has the added benefit of stopping my code getting too complex and hard to understand.

Communicating across components

Okay, so I’ve broken all of my behavioural enhancements down into single-responsibility web components. But what if one web component needs to have awareness of something that happens in another web component?

Here’s an example from The Session: the results page when you search for sessions in London.

There’s a map. That’s one web component. There’s a list of locations. That’s another web component. There are links for traversing backwards and forwards through the locations via Ajax. Those links are in web components too.

I want the map to update when the list of locations changes. Where should that logic live? How do I get the list of locations to communicate with the map?

Events!

When a list of locations is added to the document, it emits a custom event that bubbles all the way up. In fact, that’s all this component does.

You can call the event anything you want. It could be a newLocations event. That event is dispatched in the connectedCallback of the component.

Meanwhile in the map component, an event listener listens for any newLocations events on the document. When that event handler is triggered, the map updates.

The web component that lists locations has no idea that there’s a map on the same page. It doesn’t need to. It just needs to dispatch its event, no questions asked.
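
Both halves might look something like this sketch (the element names are made up; only the shared event name matters):

// The component that lists locations announces itself
customElements.define('location-list', class extends HTMLElement {
  connectedCallback() {
    this.dispatchEvent(new CustomEvent('newLocations', {
      bubbles: true
    }));
  }
});

// The map component listens at the document level
customElements.define('location-map', class extends HTMLElement {
  connectedCallback() {
    document.addEventListener('newLocations', (event) => {
      // update the map here
    });
  }
});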

There’s nothing specific to web components here. Event-driven programming is a tried and tested approach. It’s just a little easier to do thanks to the connectedCallback method.

I’m documenting all this here as a snapshot of my current thinking on HTML web components when it comes to:

  • naming custom elements,
  • naming attributes,
  • the single responsibility principle, and
  • communicating across components.

I may well end up changing my approach again in the future. For now though, these ideas are serving me well.

Displaying HTML web components

Those HTML web components I made for date inputs are very simple. All they do is slightly extend the behaviour of the existing input elements.

This would be the ideal use-case for the is attribute:

<input is="input-date-future" type="date">

Alas, Apple have gone on record to say that they will never ship support for customized built-in elements.

So instead we have to make HTML web components by wrapping existing elements in new custom elements:

<input-date-future>
  <input type="date">
</input-date-future>

The end result is the same. Mostly.

Because there’s now an additional element in the DOM, there could be unexpected styling implications. Like, suppose the original element was a direct child of a flex or grid container. Now that will no longer be true.

So something I’ve started doing with HTML web components like these is adding something like this inside the connectedCallback method:

connectedCallback() {
  this.style.display = 'contents';
  …
}

This tells the browser that, as far as styling is concerned, there’s nothing to see here. Move along.

Or you could (and probably should) do it in your stylesheet instead:

input-date-future {
  display: contents;
}

Just to be clear, you should only use display: contents if your HTML web component is augmenting what’s within it. If you add any behaviours or styling to the custom element itself, then don’t add this style declaration.

It’s a bit of a hack to work around the lack of universal support for the is attribute, but it’ll do.

Pickin’ dates on iOS

This is a little follow-up to my post about web components for date inputs.

If you try the demo on iOS it doesn’t work. There’s nothing stopping you selecting any date.

That’s nothing to do with the web components. It turns out that Safari on iOS doesn’t support min and max on date inputs. This is also true of any other browser on iOS because they’re all just Safari in a trenchcoat …for now.

I was surprised — input type="date" has been around for a long time now. I mean, it’s not the end of the world. You’d have to do validation on inputted dates on the server anyway, but it sure would be nice for the user experience of filling in forms.

Alas, it doesn’t look like this is something on the interop radar.

What really surprised me was looking at Can I Use. That shows Safari on iOS as fully supporting date inputs.

Maybe it’s just semantic nitpickery on my part but I would consider that the lack of support for the min and max attributes means that date inputs are partially supported.

Can I Use gets its data from here. I guess I need to study the governance rules and try to figure out how to submit a pull request to update the currently incorrect information.

Pickin’ dates

I had the opportunity to trim some code from The Session recently. That’s always a good feeling.

In this case, it was a progressive enhancement pattern that was no longer needed. Kind of like removing a polyfill.

There are a couple of places on the site where you can input a date. This is exactly what input type="date" is for. But when I was making the interface, the support for this type of input was patchy.

So instead the interface used three select dropdowns: one for days, one for months, and one for years. Then I did a bit of feature detection and if the browser supported input type="date", I replaced the three selects with one date input.

It was a little fiddly but it worked.

Fast forward to today and input type="date" is supported across the board. So I threw away the JavaScript and updated the HTML to use date inputs by default. Nice!

I was discussing date inputs recently when I was talking to students in Amsterdam:

They’re given a PDF inheritance-tax form and told to convert it for the web.

That form included dates. The dates were all in the past so the students wanted to be able to set a max value on the datepicker. Ideally that should be done on the server, but it would be nice if you could easily do it in the browser too.

Wouldn’t it be nice if you could specify past dates like this?

<input type="date" max="today">

Or for future dates:

<input type="date" min="today">

Alas, no such syntactic sugar exists in HTML so we need to use JavaScript.

This seems like an ideal use-case for HTML web components:

Instead of all-singing, all-dancing web components, it feels a lot more elegant to use web components to augment your existing markup with just enough extra behaviour.

In this case, it would be nice to augment an existing input type="date" element. Something like this:

 <input-date-past>
   <input type="date">
 </input-date-past>

Here’s the JavaScript that does the augmentation:

 customElements.define('input-date-past', class extends HTMLElement {
     constructor() {
         super();
     }
     connectedCallback() {
         this.querySelector('input[type="date"]').setAttribute('max', new Date().toISOString().substring(0,10));
     }
 });

That’s it.

Here’s a CodePen where you can see it in action along with another HTML web component for future dates called, you guessed it, input-date-future.

See the Pen Date input HTML web components by Jeremy Keith (@adactio) on CodePen.
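
For completeness, here’s a sketch of input-date-future, assuming it simply mirrors its sibling by setting min instead of max:

customElements.define('input-date-future', class extends HTMLElement {
  connectedCallback() {
    this.querySelector('input[type="date"]').setAttribute('min', new Date().toISOString().substring(0,10));
  }
});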

Ad revenue

It’s been dispiriting but unsurprising to see American commentators weigh in on the EU’s Digital Markets Act. I really wish they’d read Baldur’s excellent explainer first.

John has been doing his predictable “leave Britney alone!” schtick with regards to Apple (and in this case, Google and Facebook too). Ian Betteridge does an excellent job of setting him straight:

A lot of commentators seem to have the same issue as John: that it’s weird that a governmental body can or should define how products should be designed.

But governments mandate how products are designed all the time, and not just in the EU. Take another market which is pretty big: cars. All cars have to feature safety equipment, which varies from region to region but will broadly include everything from seatbelts to crumple zones. Cars have rules for emissions, for fuel efficiency, all of which are designing how a car should work.

But there’s one assumption in John’s post that Ian didn’t push back on. John said:

It’s certainly possible that Meta can devise ways to serve non-personalized contextual ads that generate sufficient revenue per user.

That comes with a footnote:

One obvious solution would be to show more ads — a lot more ads — to make up for the difference in revenue. So if contextual ads generate, say, one-tenth the revenue of targeted ads, Meta could show 10 times as many ads to users who opt out of targeting. I don’t think 10× is an outlandish multiplier there — given how remarkably profitable Meta’s advertising business is, it might even need to be higher than that.

It’s almost like an article of faith that behavioural advertising is more effective than contextual advertising. But there’s no data to support this. Quite the opposite. I wrote about this four years ago.

Once again, I urge you to read the excellent analysis by Jesse Frederik and Maurits Martijn.

There’s also Tim Hwang’s book, Subprime Attention Crisis:

From the unreliability of advertising numbers and the unregulated automation of advertising bidding wars, to the simple fact that online ads mostly fail to work, Hwang demonstrates that while consumers’ attention has never been more prized, the true value of that attention itself—much like subprime mortgages—is wildly misrepresented.

More recently Dave Karpf said what we’re all thinking:

The thing I want to stress about microtargeted ads is that the current version is perpetually trash, and we’re always just a few years away from the bugs getting worked out.

The EFF are calling for a ban. Should that happen, the sky would not fall. Contrary to what John thinks, revenue would not plummet. Contextual advertising works just fine …without the need for invasive surveillance and tracking.

Like I said:

Tracker-driven behavioural advertising is bad for users. The advertisements are irrelevant most of the time, and on the few occasions where the advertising hits the mark, it just feels creepy.

Tracker-driven behavioural advertising is bad for advertisers. They spend their hard-earned money on invasive ad tech that results in no more sales or brand recognition than if they had relied on good ol’ contextual advertising.

Tracker-driven behavioural advertising is very bad for the web. Megabytes of third-party JavaScript are injected at exactly the wrong moment to make for the worst possible performance. And if that doesn’t ruin the user experience enough, there are still invasive overlays and consent forms to click through (which, ironically, gets people mad at the legislation—like GDPR—instead of the underlying reason for these annoying overlays: unnecessary surveillance and tracking by the site you’re visiting).

Headsongs

When I play music, it’s almost always instrumental. If you look at my YouTube channel almost all the videos are of me playing tunes—jigs, reels, and so on.

Most of those videos were recorded during The Situation when I posted a new tune every day for 200 consecutive days. Every so often though, I’d record a song.

I go through periods of getting obsessed with a particular song. During The Situation I remember two songs that were calling to me. New York was playing in my head as I watched my friends there suffering in March 2020. And Time (The Revelator) resonated in lockdown:

And every day is getting straighter, time’s a revelator.

Time (The Revelator) on mandolin

The song I’m obsessed with right now is called Foreign Lander. I first came across it in a beautiful version by Sarah Jarosz (I watch lots of mandolin videos on YouTube so the algorithm hardly broke a sweat showing this to me).


There’s a great version by Tatiana Hargreaves too. And Tim O’Brien.

I wanted to know more about the song. I thought it might be relatively recent. The imagery of the lyrics makes it sound like something straight from a songwriter like Nick Cave:

If ever I prove false love
The elements would moan
The fire would turn to ice love
The seas would rage and burn

But the song is old. Jean Ritchie collected it, though she didn’t have to go far. She said:

Foreign Lander was my Dad’s proposal song to Mom

I found that out when I came across this thread from 2002 on mudcat.org where Jean Ritchie herself was a regular contributor!

That gave me a bit of a vertiginous feeling of The Great Span, thinking about the technology that she used when she was out in the field.

In the foreground, Séamus Ennis sits with his pipes. In the background, Jean Ritchie is leaning intently over her recording equipment.

I’ve been practicing Foreign Lander and probably driving Jessica crazy as I repeat it over and over and over. It’s got some tricky parts to sing and play together which is why it’s taking me a while. Once I get it down, maybe I’ll record a video.

I spent most of Saturday either singing the song or thinking about it. When I went to bed that night, tucking into a book, Foreign Lander was going ‘round in my head.

Coco—the cat who is not our cat—came in and made herself comfortable for a while.

I felt very content.

A childish little rhyme popped into my head:

With a song in my head
And a cat on my bed
I read until I sleep

I almost got up to post it as a note here on my website. Instead I told myself to do it in the morning, hoping I wouldn’t forget.

That night I dreamt about Irish music sessions. Don’t worry, I’m not going to describe my dream to you—I know how boring that is for everyone but the person who had the dream.

But I was glad I hadn’t posted my little rhyme before sleeping. The dream gave me a nice little conclusion:

With a song in my head
And a cat on my bed
I read until I sleep
And dream of rooms
Filled with tunes.

PageSpeed Insights bookmarklet

I’m a little obsessed with web performance. I like being able to check a page’s core web vitals quickly and easily.

Four years ago, I made a Lighthouse bookmarklet. Whatever web page you were on, when you clicked on the bookmarklet you’d get the Lighthouse results for that page. Handy!

It doesn’t work anymore. This is probably because Google are in the loop. Four years is a pretty good innings for anything involving that company.

I kid (mostly). Lighthouse itself is still going strong, despite being a Google product. But the bookmarklet needs updating.

Rather than just get Lighthouse results, I figured that the full PageSpeed Insights results would be even better. If your website is in the Chrome UX report, you get to see those CrUX details too.

So here’s the updated bookmarklet:

PageSpeed Insights

Drag that up to your desktop browser’s bookmarks toolbar. Press it whenever you want to test the page you’re on.
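
If you’d rather roll your own, a bookmarklet is just a single line of JavaScript. Something like this sketch (the exact pagespeed.web.dev URL parameter is an assumption):

javascript:void(window.open('https://pagespeed.web.dev/report?url='+encodeURIComponent(document.location.href)))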

This week

It’s been another busy week of evening activities that ended up covering a range of musical styles.

Monday

On Monday night I went to the session at The Fiddler’s Elbow. It’s on every fortnight. The musicians are always great but the crowd can be more variable. Sometimes it’s too rowdy for comfort. But this week was perfect, probably because not many people are going out in late (dry) January.

The session, led by fiddler Ben Paley, was exceptionally enjoyable. Nice and laid back, with a good groove.

Tuesday

On Tuesday night I stayed in and watched a film. Killers Of The Flower Moon. Two thumbs up from me.

Wednesday

On Wednesday evening it was the regular session at The Jolly Brewer. Jolly good it was too.

Thursday

On Thursday night I was back in The Jolly Brewer. My friend Rob roped me into doing a Burns Night thing. “It’s not a session, but it’s not a gig” was how he described it. I wasn’t sure what to expect.

We had been brushing up on our Scottish tunes, but we were mostly faking it. In the end it didn’t matter. I don’t think there was a single Scottish person there. But there was a good crowd enjoying their tatties and neeps with suitably-addressed haggis while we played our tunes in the background.

Some more musicians showed up: a fiddler and two banjo players. “Isn’t there old-time music here tonight?” they asked. We told them that no, it was Burns Night, but why not play some old-time tunes anyway?

So I passed the night jamming along to lots of tunes I didn’t know. I hope I wasn’t too off-putting for them. It was good fun.

Friday

Finally on Friday evening it was my turn to leave my mandolin at home and listen to some music instead. The brilliant DakhaBrakha were playing out at Sussex Uni in the Attenborough Centre.

Imagine if Tom Waits and Cocteau Twins came from Eastern Europe and joined forces. Well, DakhaBrakha are even better than that.

I think I first heard them years ago on YouTube when I came across a video of them playing at KEXP. The first song caught my attention, then mercilessly held it captive—the way it builds and builds is just astonishing! I’ve been a fan ever since.

The gig was brilliant. I was absolutely blown away. I highly recommend seeing them if you can. Not only will you hear some brilliant music, you’ll be supporting Ukraine.

Слава Україні!

Continuous partial ick

The output of generative tools based on large language models gives me the ick.

This isn’t a measured logical response. It’s more of an involuntary emotional reaction.

I could try to justify my reaction by saying I’m concerned about the exploitation involved in the training data, or the huge energy costs involved, or the disenfranchisement of people who create art. But those would be post-facto rationalisations.

I just find myself wrinkling my nose and mentally going “Ew!” whenever somebody posts the output of some prompt they gave to ChatGPT or Midjourney.

Again, I’m not saying this is rational. It’s more instinctual.

You could well say that this is my problem. You may be right. But I wonder what it is that’s so unheimlich about these outputs that triggers my response.

Just to clarify, I am talking about direct outputs, shared verbatim. If someone were to use one of these tools in the process of creating something I’d be none the wiser. I probably couldn’t even tell that a large language model was involved at some point. I’m fine with that. It’s when someone takes something directly from one of these tools and then shares it online, that’s what raises my bile.

I was at a conference a few months back where your badge featured a hallucinated picture of you. Now, this probably sounded like a fun idea. It probably is a fun idea. I can’t tell. All I know is that it made me feel a little queasy.

Perhaps it’s a question of taste. In which case, I’m being a snob. I’m literally turning my nose up at something I deem to be tacky.

But isn’t it tacky, though? It’s not something I can describe, but there’s just something about the vibe of these images—and words—that feels off. It’s sort of creepy, but it’s mostly just the mediocrity that sits so uneasily with me.

These tools do an amazing job of solving the quantity problem—how to produce an image or piece of text quickly. And by most measurements, you could say that they also solve the quality problem. These outputs are good enough to pass for “the real thing.” The outputs are, like, 90% to 95% there. And the gap is closing.

And yet. There’s something in that gap. Something that I feel in my gut. Something that makes me go “nope.”

This week

Socialising in England usually follows a set pattern. You work during the week. You go out on the weekend.

This week I’ve been doing the exact opposite. I’ve been out every weeknight and I plan to stay in all weekend.

Monday

On Monday Jessica and I took a trip up to London. Dinner in Chinatown followed by a film in the Curzon cinema in Soho.

Usually dinner and a movie would be a fun outing, but this was a more sombre affair. The film we saw was The Zone Of Interest followed by an interview with the director, Jonathan Glazer.

The film is officially released in February. This was an advance screening organised by The Wiener Holocaust Library. Jessica is a member, which is how we got our invitations.

I was unsure whether the framing device of The Zone Of Interest would work. The hidden camera set-up could’ve come across as gimmicky. But it worked all too well. The experience was disturbingly immersive, thanks in no small part to the naturalistic performances. Not showing the other side of the wall was the right decision—hearing the other side of the wall was incredibly effective. The depth of research that went into this project was palpable. It not only succeeded in its core task of showing the banality of evil, it also worked on a meta level, displaying the banality of the remembrance of evil.

See this film. And see it projected if you can.

Tuesday

With the heaviness of Monday evening still rightly staying with me, I was glad to have an opportunity to lose myself in music for a while. There was an impromptu Irish music session at the lovely Hand In Hand brewpub in Kemptown. It’s usually more of a jazz venue, but my friend Robb who works there convinced them to try a more folky evening.

The session was nice and intimate—just five of us playing. The pub was busy and everyone seemed to really appreciate the music. Me, I just really got into playing jigs and reels with my talented friends.

Wednesday

Whereas the session in the Hand in Hand was an impromptu affair, the session in the Jolly Brewer is regular as clockwork. Every Wednesday evening at 8 o’clock, rain, hail, or shine.

It was particularly good this week. Sometimes you just lock into a groove and everything clicks.

Thursday

Enough with the culture—time for some good hard science!

I hadn’t been to a Brighton Astro meetup in ages. Their monthly lectures are usually on the first Thursday of the month, which clashes with the session in the Ancient Mariner in Hove. But this month’s gathering was an exception, which meant I could finally make it.

Professor Malcolm Longair from the University of Cambridge was ostensibly speaking about the James Webb Space Telescope, but the talk ended up being larger in scope. The overriding message was that we get the full picture of the universe by looking at all the frequencies of the electromagnetic spectrum—not just visible light, but not just infrared either.

It was so great to see how Brighton Astro has grown. It started life years ago as a meetup in the Clearleft building. Now it gets over a hundred people attending every month.

Friday

The weekend starts now. Apart from Salter Cane band practice tomorrow morning, I plan to stay in and stay cosy.

2023

I try to get back to Ireland a few times a year to see my mother. At some point in each trip there’s a social gathering with her friends or family. Inevitably the talk turns to ailments, illnesses, and complaints. I sit there quietly and nod politely.

2023 was the year I joined in.

If it wasn’t relaying my experience of visits to the emergency room, it was talk of my sinuses acting up and keeping me awake at night with their noises. Nasal polyps perhaps? And lately I’ve been having this wheezy asthma-like issue at night, what with this chesty cough I’ve been trying to sha… you get how uninteresting this is, right?

So I’ve got some nagging health issues. But I consider myself lucky. In the grand scheme of things, they aren’t big deals. Even the allergy which requires me to carry an epi-pen is to the easily-avoidable Ibuprofen, not to some ubiquitous foodstuff.

In fact I’ve had just enough health issues to give me a nice dose of perspective and appreciate all the times when my body is functioning correctly. I often think of what Maciej wrote about perspective:

The good news is, as you get older, you gain perspective. Perspective helps alleviate burnout.

The bad news is, you gain perspective by having incredibly shitty things happen to you and the people you love. Nature has made it so that perspective is only delivered in bulk quantities. A railcar of perspective arrives and dumps itself on your lawn when all you needed was a microgram. This is a grossly inefficient aspect of the human condition, but I’m sure bright minds in Silicon Valley are working on a fix.

Hence my feeling fortunate. 2023 was a perfectly grand year for me.

I went on some great adventures with Jessica. In the middle of the year we crossed the Atlantic on the Queen Mary II with our friends Dan and Sue, then we explored New York, and then we relaxed on Saint Augustine Beach for a week. Lovely!

The week in Ortigia, Sicily was great. So was the week in Cáceres, Spain. And the week spent playing music in Belfast during the trad festival was a blast.

There was lots of music closer to home too. Brighton is blessed with plenty of Irish music sessions and I’m doing my best to get to all of them. Playing mandolin in a session is my happy place.

Other music is also available. The band had an excellent year with the addition of our brilliant drummer, Matthew. We made such fast progress on new material that we managed to get into the studio to record an album’s worth of songs. Expect a new Salter Cane album in 2024!

On the work front, my highlights were event-based. I curated and hosted UX London. I spoke at a bunch of other events, and I think I did a good job. I spoke at no online events, and that’s the way I’d like to keep it. I thrive on giving talks at in-person gatherings. I hope I can continue to do that in 2024.

I very much enjoyed having a four-day work week in 2023. I don’t think I could ever go back to a five-day week. In fact, for 2024 I’m dabbling with a three-day work week. I’m lucky that I can afford to do this. Given the choice, I’d rather have more time than more money. I know not everyone has that choice.

My hope for 2024 is for pretty much more of the same as I got in 2023. More music. More travel. But fewer health issues.

When I was summarising 2022, I said:

I’ve got my health. That’s something I don’t take for granted.

I’ve still (mostly) got my health. I definitely don’t take it for granted. Here’s to a happy and healthy 2024.

After the end

I was doing some housekeeping on my website recently, tidying up some broken links, that kind of thing. I happened on the transcript and video for the talk I gave two years ago called “Sci-fi and Me.”

Sci-Fi & Me – Jeremy Keith – Stay Curious Café by beyond tellerrand

I really enjoyed preparing and giving that talk. It’s the kind of topic I’d love to speak/podcast about more.

Part of the structure of the talk involved me describing ten topics that might be encountered in the literature of science fiction. I describe the topic, mention some examples, and then choose one book as my pick for that topic.

For the topic of post-apocalypse stories, I chose Emily St. John Mandel’s Station Eleven. I love that book, and the equally excellent—though different—television series.

STATION ELEVEN Trailer (2021)

I’ve written in the past about why I love it:

Station Eleven describes a group of people in a post-pandemic world travelling around performing Shakespeare plays. At first I thought this was a ridiculous conceit. Then I realised that this was the whole point. We don’t have to watch Shakespeare to survive. But there’s a difference between surviving and living.

You’ve got a post-apocalyptic scenario where the pursuit of art helps give meaning to life. That’s Station Eleven, but it also describes a film currently streaming on Netflix called Apocalypse Clown. Shakespeare’s been swapped for clowning, the apocalypse is set in Ireland, and the film is a comedy, but in a strange way, it tackles the same issue at the heart of Station Eleven: survival is insufficient.

APOCALYPSE CLOWN Official Trailer Ire/UK 2023

I really enjoyed Apocalypse Clown, mostly down to Natalie Palamides’s scene-stealing performance. It very much slipped by under the radar, unlike the recent Netflix production Leave The World Behind.

Leave The World Behind | Final Trailer | Netflix

If you haven’t watched Leave The World Behind yet, stop reading please. Because I want to talk about the ending of the film.

SPOILERS

I never read the Rumaan Alam novel, but I thoroughly enjoyed this film. The mounting dread, the slow trickle of information, all good vibey stuff.

What I really liked was the way you can read the ending in two different ways.

On the large scale, we hear how everything that has unfolded is leading to the country tearing itself apart—something we see beginning to happen in the distance.

But on the smaller scale, we see people come together. When the final act was introduced as “The Last One” I thought we might be in for the typical trope of people turning on one another until there’s a final survivor. But instead we see people who have been mistrustful of one another come to help each other. It felt very true to the reality described in Rebecca Solnit’s excellent A Paradise Built In Hell.

The dichotomy between the large-scale pessimism and the small-scale optimism rang true. It reminded me of The Situation. The COVID-19 pandemic was like a Rorschach test that changed as you zoomed in and out:

I’ve noticed concentric circles of feelings tied to geography—positive in the centre, and very negative at the edges. What I mean is, if you look at what’s happening in your building and your street, it’s quite amazing how people are pulling together.

But once you look further than that, things turn increasingly sour. At the country level, incompetence and mismanagement seem to be the order of the day. And once you expand out to the whole world, who can blame you for feeling overwhelmed with despair?

But the world is made up of countries, and countries are made up of communities, and these communities are made up of people who are pulling together and helping one another.

Sessions

Brighton has a thriving Irish music scene. Some sessions are weekly—every Sunday afternoon in The Bugle and every Wednesday evening in The Jolly Brewer. Some are every two weeks, like the session in The Fiddler’s Elbow. Others are monthly, like the session in The Dover Castle and the session in The Lord Nelson.

So it sometimes happens that if the calendar aligns just right, there are many sessions in one week. This was one of those weeks. I managed a streak of five sessions in a row.

The first was the regular Sunday afternoon session in The Bugle.

Two women playing fiddle in a pub.

Then on Monday, it was The Fiddler’s Elbow.

Two concertina players and a banjo player sitting at a table in a pub corner.

The night after that there was a one-off session in the Hand in Hand, which will hopefully become a regular monthly occurrence.

A woman playing fiddle and a man playing concertina in an ornate pub. In the foreground another man holds a fiddle.

On Wednesday it was the regular session at The Jolly Brewer.

Two banjo players, a man and a woman, playing at a pub table. Two fiddlers, a man and a woman, in the corner of a pub.

Finally on Thursday it was the monthly session at The Lord Nelson.

A woman playing concertina and a man playing whistle around a pub table with a guitar headstock in the foreground. A woman playing fiddle and a man playing bones at a pub table covered with pints.

I’m very lucky to have so many opportunities to play the music I love with my fellow musicians. I don’t take it for granted.

HTML web components

Web components have been around for quite a while, but it feels like they’re having a bit of a moment right now.

It turns out that the best selling point for web components was “wait and see.” For everyone who didn’t see the benefit of web components over being locked into a specific framework, time is proving to be a great teacher.

It’s not just that web components are portable. They’re also web standards, which means they’ll be around as long as web browsers. No framework can make that claim. As Jake Lazaroff puts it, web components will outlive your JavaScript framework.

At this point React is legacy technology, like Angular. Lots of people are still using it, but nobody can quite remember why. The decision-makers in organisations who chose to build everything with React have long since left. People starting new projects who still decide to build on React are doing it largely out of habit.

Others are making more sensible judgements and, having been bitten by lock-in in the past, are now giving web components a go.

If you’re one of those people making the move from React to web components, there’ll certainly be a bit of a learning curve, but that would be true of any technology change.

I have a suggestion for you if you find yourself in this position. Try not to bring React’s mindset with you.

I’m talking about the way React components are composed. There’s often lots of props doing heavy lifting. The actual component element itself might be empty.

If you want to apply that model to web components, you can. Lots of people do. It’s not unusual to see web components in the wild that look like this:

<my-component></my-component>

The custom element is just a shell. All the actual power is elsewhere. It’s in the JavaScript that does all kinds of clever things with the shadow DOM, templates, and slots.

There is another way. Ask, as Robin does, “what would HTML do?”

Think about composability with existing materials. Do you really need to invent an entirely new component from scratch? Or can you use HTML up until it reaches its limit and then enhance the markup?

Robin writes:

I don’t think we should see web components like the ones you might find in a huge monolithic React app: your Button or Table or Input components. Instead, I’ve started to come around and see Web Components as filling in the blanks of what we can do with hypertext: they’re really just small, reusable chunks of code that extends the language of HTML.

Dave talks about how web components can be HTML with superpowers. I think that’s a good attitude to have. Instead of all-singing, all-dancing web components, it feels a lot more elegant to use web components to augment your existing markup with just enough extra behaviour.

Where does the shadow DOM come into all of this? It doesn’t. And that’s okay. I’m not saying it should be avoided completely, but it should be a last resort. See how far you can get with the composability of regular HTML first.

Eric described his recent epiphany with web components. He created a super-slider custom element that wraps around an existing label and input type="range":

You just take some normal HTML markup, wrap it with a custom element, and then write some JS to add capabilities which you can then style with regular CSS!  Everything’s of the Light Side of the Web.  No need to pierce the Vale of Shadows or whatever.

When you wrap some existing markup in a custom element and then apply some new behaviour with JavaScript, technically you’re not doing anything you couldn’t have done before with some DOM traversal and event handling. But it’s less fragile to do it with a web component. It’s portable. It obeys the single responsibility principle. It only does one thing but it does it well.
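
The shape of that pattern, borrowing the name of Eric’s element (though this is a sketch, not his actual code):

<super-slider>
  <label for="volume">Volume</label>
  <input id="volume" type="range" min="0" max="11" value="5">
</super-slider>

customElements.define('super-slider', class extends HTMLElement {
  connectedCallback() {
    const input = this.querySelector('input[type="range"]');
    // add capabilities here, e.g. respond to the current value
    input.addEventListener('input', () => {
      console.log(input.value);
    });
  }
});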

Jim created an icon-list custom element that wraps around a regular ul populated with li elements. But he feels almost bashful about even calling it a web component:

Maybe I shouldn’t be using the term “web component” for what I’ve done here. I’m not using shadow DOM. I’m not using the templates or slots. I’m really only using custom elements to attach functionality to a specific kind of component.

I think what Eric and Jim are doing is exemplary. See also Zach’s web components.

At the end of his post, Eric says he’d like a nice catchy term for these kinds of web components. In Dave’s catalogue of web components, they’re called “element extensions.” I like that. It’s pretty catchy.

Or we could call them “HTML web components.” If your custom element is empty, it’s not an HTML web component. But if you’re using a custom element to extend existing markup, that’s an HTML web component.

React encouraged a mindset of replacement: “forget what browsers can do; do everything in a React component instead, even if you’re reinventing the wheel.”

HTML web components encourage a mindset of augmentation instead.

A memex in every web browser

When Matthew Modine’s character first shows up in Christopher Nolan’s Oppenheimer, I figured the rest of the cinema audience wouldn’t have appreciated me shouting out “VANNEVAR BUSH IN THE HOUSE!” so I screamed it on the inside.

The Manhattan Project was not his only claim to fame or infamy. When it comes to the world we now live in, Bush’s idea of the memex has been almost equally influential. His article As We May Think became a touchstone for Douglas Engelbart and later Tim Berners-Lee.

But as Matt Thompson points out:

…the device he describes does not resemble the internet or anything I’ve ever found on it.

Then he says:

What Bush was describing sounds to me like what you might get if you turned a browser history — the most neglected piece of the software — into a robust and fully featured machine of its own. It would help you map the path you charted through a web of knowledge, refine those maps, order them, and share them

Yes! This!! I 100% agree with the description of browser history as “the most neglected piece of the software.” While I wouldn’t go as far as Chris when he says web browsers kind of suck, I’m kind of amazed that there hasn’t been more innovation and competition in this space.

If anything we’ve outsourced the management of our browsing history to services like Delicious and Pinboard, or to tools like Obsidian and Roam Research. Heck, the links section of my website is my attempt to manage and annotate my own associative trails.

Imagine if that were baked right into a web browser. Then imagine how beautiful such a rich source of data might look.

Like Matt says:

I don’t think anything like this exists. So Bush’s essay still transfixes me.

border:none 2023

In 2013, I spoke at the border:none event in Nuremberg. I gave a talk called The Power of Simplicity.

It was a great little event. Most of the talks were, like mine, on technical topics: design, development, the usual conference material.

This year Joschi and Marc decided to have another border:none event ten years on from the first one. They invited back all the original speakers, as well as some new folks. They kept the ticket price the same as ten years ago—just thirty euros.

For us speakers from the previous event, the only brief they gave us was to consider what’s happened in the past decade. I played it pretty safe and talked about the web. I’ll post a transcript of my talk soon.

Some of the other speakers were far more ambitious. They spoke about themselves, the world, the meaning of life …my presentation was very tame in comparison.

I really, really admire the honesty and vulnerability that those people displayed. Tobias Baldauf in particular took my breath away. He delivered an intensely personal talk on generational trauma that was meticulously researched and took incredible bravery to deliver. It was worth going to Nuremberg just for the privilege of being present for that talk.

Other talks were refreshingly tech-free. There was a talk on cold-water swimming. There was a talk on paragliding. And I don’t mean they were saying “what designers can learn from cold-water swimming” or “how I became a better developer through paragliding.” The talks were literally about swimming and paragliding.

There was a great variety of speakers this time around, including age ranges from puberty to menopause (quite literally—that was the topic of one of the talks). I had the great pleasure of providing some coaching before the event to fifteen-year-old Maya who was delivering her first talk in English. She did a fantastic job! And the talk she gave—about how teachers in her school aren’t always trusting of the technology they provide to students—was directly relevant to what we’re seeing in the world of work. Give people autonomy, agency and trust.

There was a lot of trust at border:none. Everyone who bought a ticket did so on trust—they had no idea what to expect. Likewise, Marc and Joschi put their trust in the speakers. They gave the speakers the freedom to talk about whatever they wanted. That trust was repaid.

Florian took some superb pictures of the event. Matthias wrote up his experience. So did Tom. Vasilis shared the gist of his excellent talk.

At the end of the event there was some joking about returning in 2033. I love the idea of a conference that happens once every ten years. Count me in!