Journal tags: network


Increment by increment

The bedrock of the World Wide Web is solid. Built atop the protocols of the internet (TCP/IP), its fundamental building blocks remain: URLs of HTML files transmitted over HTTP. Baldur Bjarnason writes:

Even today, the web is like a living fossil, a preserved relic from a different era. Anybody can put up a website. Anybody can run a business over it. I can build an app or service, send the URL to anybody I like, and most people in the world will be able to run it without asking anybody’s permission.

Still, the web has evolved. In fact, that evolution is something that’s also built into its fundamental design. Rather than try to optimise the World Wide Web for one particular use-case, Tim Berners-Lee realised the power of being flexible. Like the internet, the World Wide Web is deliberately dumb.

(I get very annoyed when people talk about the web as being designed for scientific work at CERN. That was merely the first use-case. The web was designed for everything …and nothing in particular.)

Robin Berjon compares the web’s evolution to the ship of Theseus:

That’s why it’s been so hard to agree about what the Web is: the Web is architected for resilience which means that it adapts and transforms. That flexibility is the reason why I’m talking about some mythological dude’s boat. Altogether too often, we consider some aspects of the Web as being invariants when they’re potentially just as replaceable as any other part. This isn’t to say that there are no invariants on the Web.

The web can be changed. That’s both a comfort and a warning. There’s plenty that we should change about today’s web. But there’s also plenty—at the root level—that we should fight to preserve.

And if you want change, the worst way to go about it is to promulgate the notion of burning everything down and starting from scratch. As Erin says in the fourth and final part of her devastating series on Meta in Myanmar:

We don’t get a do-over planet. We won’t get a do-over network.

Instead, we have to work with the internet we made and find a way to rebuild and fortify it to support the much larger projects of repair—political, cultural, environmental—that are required for our survival.

Though, as Robin points out, that doesn’t preclude us from sharing a vision:

Proceeding via small, incremental changes can be a laudable approach, but even then it helps to have a sense for what it is that those small steps are supposed to be incrementing towards.

I’m looking forward to reading what Robin puts forward, particularly because he says “I’m no technosolutionist.”

From a technical perspective, the web has never been better. We have incredible features in HTML, CSS, and JavaScript, all standardised and with amazing interoperability between browsers. The challenges that face the web today are not technical.

That’s one of the reasons why I have no patience for the web3 crowd. Apart from the ridiculous name, they’re focusing on exactly the wrong part of the stack.

Listening to their pitch, they’ll point out that while yes, the fundamental bedrock of the web is indeed decentralised—TCP/IP, HTTP(S)—what’s been constructed on that foundation is increasingly centralised: the power brokers of Google, Meta, and Amazon.

And what’s the solution they propose? Replace the underlying infrastructure with something-something-blockchain.

Would that it were so simple.

The problems of today’s web are not technical in nature. The problems of today’s web won’t be solved by technology. If we’re going to solve the problems of today’s web, we’ll need to do it through law, culture, societal norms, and co-operation.

(Feel free to substitute “today’s web” with “tomorrow’s climate”.)

The syndicate

Social networks come and social networks go.

Right now, there’s a whole bunch of social networks coming (Blewski, Freds, Mastication) and one big one going, thanks to Elongate.

Me? I watch all of this unfold like Doctor Manhattan on Mars. I have no great connection to any of these places. They’re all just syndication endpoints to me.

I used to have a checkbox in my posting interface that said “Twitter”. If I wanted to add a copy of one of my notes to Twitter, I’d enable that toggle.

I have, of course, now removed that checkbox. Twitter is dead to me (and it should be dead to you too).

I used to have another checkbox next to that one that said “Flickr”. If I was adding a photo to one of my notes, I could toggle that to send a copy to my Flickr account.

Alas, that no longer works. Flickr only allows you to post 1000 photos before requiring a pro account. Fair enough. I’ve actually posted 20 times that amount since 2005, but I let my pro membership lapse a while back.

So now I’ve removed the “Flickr” checkbox too.

Instead I’ve now got a checkbox labelled “Mastodon” that sends a copy of a note to my Mastodon account.

When I publish a blog post like the one you’re reading now here on my journal, there’s yet another checkbox that says “Medium”. Toggling that checkbox sends a copy of my post to my page on Ev’s blog.

At least it used to. At some point that stopped working too. I was going to start debugging my code, but when I went to the documentation for the Medium API, I saw this:

This repository has been archived by the owner on Mar 2, 2023. It is now read-only.

I guess I missed the memo. I guess Medium also missed the memo, because developers.medium.com is still live. It proudly proclaims:

Medium’s Publishing API makes it easy for you to plug into the Medium network, create your content on Medium from anywhere you write, and expand your audience and your influence.

Not a word of that is accurate.

That page also has a link to the Medium engineering blog. Surely the announcement of the API deprecation would be published there?

Crickets.

Moving on…

I have an account on Bluesky. I don’t know why.

I was idly wondering about sending copies of my notes there when I came across a straightforward solution: micro.blog.

That’s yet another place where I have an account. They make syndication very straightforward. You can go to your account and point to a feed from your own website.

That’s it. Syndication enabled.

It gets better. Micro.blog can also cross-post to other services. One of those services is Bluesky. I gave permission to micro.blog to syndicate to Bluesky so now my notes show up there too.

It’s like dominoes falling: I post something on my website which updates my RSS feed which gets picked up by micro.blog which passes it on to Bluesky.

I noticed that one of the other services that micro.blog can post to is Medium. Hmmm …would that still work given the abandonment of the API?

I gave permission to micro.blog to cross-post to Medium when my feed of blog posts is updated. It seems to have worked!

We’ll see how long it lasts. We’ll see how long any of them last. Today’s social media darlings are tomorrow’s Friendster and MySpace.

When the current crop of services wither and die, my own website will still remain in full bloom.

Portability

Exactly sixteen years ago on this day, I wrote about Twitter, a service I had been using for a few weeks. I documented how confusing yet compelling it was.

Twitter grew and grew after that. But at some point, it began to feel more like it was shrinking, shrivelling into a husk of its former self.

Just over ten years ago, there was a battle for the soul of Twitter from within. One camp wanted it to become an interoperable protocol, like email. The other camp wanted it to be a content farm, monetised by advertisers. That’s the vision that won. They declared war on the third-party developers who had helped grow Twitter in the first place, and cracked down on anything that didn’t foster e N g A g E m E n T.

The muskofication of Twitter is the nail in the coffin. In the tradition of all scandals since Watergate, I propose we refer to the shocking recent events at Twitter as Elongate.

Post-Elongate Twitter will limp on, I’m sure, but it can never be the fun place it once was. The incentives just aren’t there. As Bastian wrote:

Twitter was once an amplifier for brilliant ideas, for positivity, for change, for a better future. Many didn’t understand the power it had as a communication platform. But that power turned against the exact same people who needed this platform so urgently. It’s now a waste of time and energy at best and a threat to progress and society at worst.

I don’t foresee myself syndicating my notes to Twitter any more. I’ve removed the site from my browser’s bookmarks. I’ve removed it from my phone’s home screen too.

For someone who’s been verified on Twitter for years, with over 140,000 followers, it should probably feel like a bigger deal than it does. I echo Robin’s observation:

The speed with which Twitter recedes in your mind will shock you. Like a demon from a folktale, the kind that only gains power when you invite it into your home, the platform melts like mist when that invitation is rescinded.

Meanwhile, Mastodon is proving to be thoroughly enjoyable. Some parts are still rough around the edges, but compared to Twitter in 2006, it’s positively polished.

Interestingly, the biggest complaint that I and my friends had about Twitter all those years ago wasn’t about Twitter per se, but about lock-in:

Twitter is yet another social network where we have to go and manually add all the same friends from every other social network.

That’s the very thing that sets the fediverse apart: the ability to move from one service to another and bring your social network with you. Now Matt is promising to add ActivityPub to Tumblr. That future we wanted sixteen years ago might finally be arriving.

That fediverse feeling

Right now, Twitter feels like Dunkirk beach in May 1940. And look, here comes a plucky armada of web servers running Mastodon instances!

Others have written some guides to getting started on Mastodon.

There are also tools like Twitodon to help you migrate from Twitter to Mastodon.

Getting on board isn’t completely frictionless. Understanding how Mastodon works can be confusing. But then again, so was Twitter fifteen years ago.

Right now, many Mastodon instances are struggling with the influx of new sign-ups. But this is temporary. And actually, it’s also very reminiscent of the early unreliable days of Twitter.

I don’t want to go into the technical details of Mastodon and the fediverse—even though those details are fascinating and impressive. What I’m really struck by is the vibe.

In a nutshell, I’m loving it! It feels …nice.

I was fully expecting Mastodon to be full of meta-discussions about Mastodon, but in the past few weeks I’ve enjoyed people posting about stone circles, astronomy, and—obviously—cats and dogs.

The process of finding people to follow has been slow, but in a good way. I’ve enjoyed seeking people out. It’s been easier to find the techy folks, but I’ve also been finding scientists, journalists, and artists.

On the one hand, the niceness of the experience isn’t down to technical architecture; it’s all about the social norms. On the other hand, those social norms are very much directed by technical decisions. The folks working on the fediverse for the past few years have made very thoughtful design decisions to amplify niceness and discourage nastiness. It’s all very gratifying to experience!

Personally, I’m posting to Mastodon via my own website. As much as I’m really enjoying Mastodon, I still firmly believe that nothing beats having control of your own content on your domain.

But I also totally get that not everyone has the same set of priorities as me. And frankly, it’s unrealistic to expect everyone to have their own domain name.

It’s like there’s a spectrum of ownership. On one end, there’s publishing on your own website. On the other end, there’s publishing on silos like Twitter, Facebook, Medium, Instagram, and MySpace.

Publishing on Mastodon feels much closer to the website end of the spectrum than it does to the silo end of the spectrum. If something bad happens to the Mastodon instance you’re on, you can up and move to a different instance, taking your social graph with you.

In a way, it’s like delegating domain ownership to someone you trust. If you don’t have the time, energy, resources, or interest in having your own domain, but you trust someone who’s running a Mastodon instance, it’s the next best thing to publishing on your own website.

Simon described it well when he said Mastodon is just blogs:

A Mastodon server (often called an instance) is just a shared blog host. Kind of like putting your personal blog in a folder on a domain on shared hosting with some of your friends.

Want to go it alone? You can do that: run your own dedicated Mastodon instance on your own domain.

And rather than compare Mastodon to Twitter, Simon makes a comparison with RSS:

Do you still miss Google Reader, almost a decade after it was shut down? It’s back!

A Mastodon server is a feed reader, shared by everyone who uses that server.

Lots of other folks are feeling the same excitement in the air that I’m getting:

Bastian wrote:

Real conversations. Real people. Interesting content. A feeling of a warm welcoming group. No algorithm to mess around with our timelines. No troll army to destroy every tiny bit of peace. Yes, Mastodon is rough around the edges. Many parts are not intuitive. But this roughness somehow added to the positive experience for me.

This could really work!

Brent Simmons wrote:

The web is wide open again, for the first time in what feels like forever.

I concur! Though, like Paul, I love not being beholden to either Twitter or Mastodon:

I love not feeling bound to any particular social network. This website, my website, is the one true home for all the stuff I’ve felt compelled to write down or point a camera at over the years. When a social network disappears, goes out of fashion or becomes inhospitable, I can happily move on with little anguish.

But like I said, I don’t expect everyone to have the time, means, or inclination to do that. Mastodon definitely feels like it shares the same indie web spirit though.

Personally, I recommend experiencing Mastodon through the website rather than a native app. Mastodon instances are progressive web apps so you can add them to your phone’s home screen.

You can find me on Mastodon as @adactio@mastodon.social

I’m not too bothered about what instance I’m on. It really only makes a difference to my local timeline. And if I do end up finding an instance I prefer, then I know that migrating will be quite straightforward, by design. Perhaps I should be on an instance with a focus on front-end development or the indie web. I still haven’t found much of an Irish traditional music community on the fediverse. I’m wondering if maybe I should start a Mastodon instance for that.

While I’m a citizen of mastodon.social, I’m doing my bit by chipping in some money to support it: sponsorship levels on Patreon start at just $1 a month. And while I can’t offer much technical assistance, I opened my first Mastodon pull request with a suggested improvement for the documentation.

I’m really impressed with the quality of the software. It isn’t perfect but considering that it’s an open source project, it’s better than most VC-backed services with more and better-paid staff. As Giles said, comparing it to Twitter:

I’m using Mastodon now and it’s not the same, but it’s not shit either. It’s different. It takes a bit of adjustment. And I’m enjoying it.

Most of all, I love, love, love that Mastodon demonstrates that things can be different. For too long we’ve been told that behavioural advertising was an intrinsic part of being online, that social networks must inevitably be monolithic centralised beasts, that we have to relinquish control to corporations in order to be online. The fediverse is showing us a better way. And this isn’t just a proof of concept either. It’s here now. It’s here to stay, if you want it.

Syndicating to Mastodon

I’ve been contemplating a checkbox. The label for this checkbox reads:

This is a bot account

Let me back up…

In what seems like decades ago, but was in fact just a few weeks, Elon Musk bought Twitter and began burning it to the ground. His admirers insist he’s playing some form of four-dimensional chess, but to the rest of us, his actions are indistinguishable from a spoilt rich kid not understanding what a social network is.

It wasn’t giving me much cause for anguish personally. For the past eight years, I’ve only used Twitter as a syndication endpoint for my own notes. But I understand that’s a very privileged position to be in. Most people on Twitter don’t have the same luxury of independence. It’s genuinely maddening and saddening to see their years of sharing destroyed by one cruel idiot.

Lots of people started moving to Mastodon. I figured I should do the same for my syndicated notes.

At first, I signed up for an account on mastodon.cloud. No particular reason. But that’s where I saw this very insightful post from Anil Dash:

When it came time to reckon with social media’s failings, nobody ran to the “web3” platforms. Nobody asked “can I get paid per message”? Nobody asked about the blockchain. The community of people who’ve been quietly doing this work for years (decades!) ended up being the ones who welcomed everyone over, as always.

I was getting my account all set up and beginning to follow some other folks, when I realised that I actually already had an existing account over on mastodon.social. Doh! Turns out that I signed up back in 2017 to kick the tyres, but never did much else because there weren’t many other people around back then. Oh, how times have changed!

Anyway, I thought I had really screwed up by having two accounts but this turned out to be an opportunity to experience some of the thoughtfulness in Mastodon’s design. The process of migrating from one Mastodon account to another—on a completely different instance—was very smooth! It was clear that this wasn’t an afterthought. This is an essential part of the fediverse and the design of the migration flow reflects that.

This gives me enormous peace of mind. If I ever want to switch to a different instance and still keep my network intact, I know it won’t be a problem. Mastodon is like the opposite of the roach-motel mentality that permeates most VC-backed so-called social networks.

As I played around some more—reading, following, exploring—my feelings of fondness only grew stronger. I like this place a lot!

I definitely wanted to syndicate my notes to Mastodon. At first, I implemented a straightforward RSS-to-Mastodon syndication using IFTTT (IF This, Then That), thanks to Matthias’s excellent tutorial.

But that didn’t feel quite right. When I syndicate to Twitter, I make a conscious choice each time. There’s a “Twitter” toggle that I can enable or disable in my posting interface. Mastodon deserved the same level of thoughtfulness.

So I switched off the IFTTT recipe and started exploring the Mastodon API. It’s going to sound like a humblebrag when I tell you that I got cross-posting working in almost no time at all, but that’s not a testament to my coding prowess (I’m really not very good), but rather a testament to the Mastodon API, which was a joy to work with.

  1. On your Mastodon instance, go to /settings/applications.
  2. Click on New Application.
  3. Fill in the details about your website and select write:statuses (and probably write:media) from the Scopes list.
  4. Copy Your access token to use in API calls.
  5. Write some sloppy code (in my case, PHP that uses CURL).
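That last step might look something like this minimal sketch (the instance URL, the token, the file path, and the status text are all placeholders, and there’s no error handling whatsoever):

<?php
// Step one (optional): upload an image and note its ID.
// This needs the write:media scope.
$instance = 'https://mastodon.social'; // your instance
$token = 'YOUR_ACCESS_TOKEN';          // from step 4

$ch = curl_init($instance . '/api/v2/media');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => ['Authorization: Bearer ' . $token],
    CURLOPT_POSTFIELDS => [
        'file' => new CURLFile('/path/to/photo.jpg', 'image/jpeg'),
        'description' => 'Alt text for the image',
    ],
]);
$media = json_decode(curl_exec($ch), true);
curl_close($ch);

// Step two: post the status itself, attaching the uploaded image.
// This needs the write:statuses scope.
$ch = curl_init($instance . '/api/v1/statuses');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => ['Authorization: Bearer ' . $token],
    CURLOPT_POSTFIELDS => [
        'status' => 'Hello from my own website!',
        'media_ids[]' => $media['id'],
    ],
]);
$status = json_decode(curl_exec($ch), true);
curl_close($ch);

echo $status['url']; // the URL of the syndicated copy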

I did hit a wall when it came to posting images. That took me a while to get working, and I couldn’t figure out why. Was it something at Mastodon’s end while it was struggling under the influx of new users? As it turns out, no. It was entirely down to me being an idiot. (You know that situation where you’re working on a problem for ages and you’ve become convinced it’s an extremely gnarly rocket-science problem, but then turns out to be something stupid like a typo? Yeah. That.)

Then there’s the whole question of how to receive replies, likes, and reboosts from Mastodon here on my own site. Luckily, that was super easy, thanks to Brid.gy. One click and I was done. I love Brid.gy!

Take this note, for example. There’s a version on Twitter and a version on Mastodon. The original version on my own site gets responses from both places.

If I’m replying to a response on Twitter, I do not syndicate that to Mastodon.

Likewise, if I’m replying to a response on Mastodon, I do not syndicate that to Twitter.

Oh, one thing worth mentioning: if you’re sending a reply to something on Mastodon using the API, there’s an in_reply_to_id field for you to provide. But you should also include the full @username@instance of the person you’re replying to at the beginning of the message to ensure that it’s displayed as a reply rather than showing up as a regular post. Note the difference between this note on my site and its syndicated version on Mastodon.
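A reply sketched along the same lines as above would look like this (again, every value is a placeholder):

<?php
// Sketch of replying via the API: in_reply_to_id threads the post,
// and the leading @username@instance mention makes it display as a
// reply rather than as a regular post.
$instance = 'https://mastodon.social';
$token = 'YOUR_ACCESS_TOKEN';

$ch = curl_init($instance . '/api/v1/statuses');
curl_setopt_array($ch, [
    CURLOPT_POST => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_HTTPHEADER => ['Authorization: Bearer ' . $token],
    CURLOPT_POSTFIELDS => [
        'status' => '@alice@example.social Thanks, glad you liked it!',
        'in_reply_to_id' => '123456789012345678', // ID of the post being replied to
    ],
]);
$reply = json_decode(curl_exec($ch), true);
curl_close($ch);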

Anyway, now I’m posting to Mastodon, but I’m doing it through the interface of my own website. Which brings me to that checkbox in Mastodon’s profile settings:

This is a bot account

The help text reads:

Signal to others that the account mainly performs automated actions and might not be monitored

If I were doing the automatic cross-posting from RSS, I’d definitely tick that box. But as I’m making a conscious decision whenever I syndicate to Mastodon, I think I’m going to leave that checkbox unticked.

My cross-posting is not automated and I’m very much monitoring my Mastodon account …because I’m enjoying my Mastodon experience more than I’ve enjoyed anything online for quite some time. Highly recommended!

A curl in every port

A few years back, Zach Bloom wrote The History of the URL: Path, Fragment, Query, and Auth. He recently expanded on it and republished it on the Cloudflare blog as The History of the URL. It’s well worth the time to read the whole thing. It’s packed full of fascinating tidbits.

In the section on ports, Zach says:

The timeline of Gopher and HTTP can be evidenced by their default port numbers. Gopher is 70, HTTP 80. The HTTP port was assigned (likely by Jon Postel at the IANA) at the request of Tim Berners-Lee sometime between 1990 and 1992.

Ooh, I can give you an exact date! It was January 24th, 1992. I know this because of the hack week in CERN last year to recreate the first ever web browser.

Kimberly was spelunking down the original source code, when she came across this line in the HTUtils.h file:

#define TCP_PORT 80 /* Allocated to http by Jon Postel/ISI 24-Jan-92 */

We showed this to Jean-François Groff, who worked on the original web technologies like libwww, the forerunner to libcurl. He remembers that day. It felt like they had “made it”, receiving the official blessing of Jon Postel (in the same RFC, incidentally, that gave port 70 to Gopher).

Then he told us something interesting about the next line of code:

#define OLD_TCP_PORT 2784 /* Try the old one if no answer on 80 */

Port 2784? That seems like an odd choice. Most of us would choose something easy to remember.

Well, it turns out that 2784 is easy to remember if you’re Tim Berners-Lee.

Those were the last four digits of his parents’ phone number.

Web talk

At the start of this month I was in Amsterdam for a series of back-to-back events: Indie Web Camp Amsterdam, View Source, and Fronteers. That last one was where Remy and I debuted a talk we’d been working on.

The Fronteers folk have been quick off the mark so the video is already available. I’ve also published the text of the talk here:

How We Built The World Wide Web In Five Days

This was a fun talk to put together. The first challenge was figuring out the right format for a two-person talk. It quickly became clear that Remy’s focus would be on the events of the five days we spent at CERN, whereas my focus would be on the history of computing, hypertext, and networks leading up to the creation of the web.

Now, we could’ve just done everything chronologically, but that would mean I’d do the first half of the talk and Remy would do the second half. That didn’t appeal. And it sounded kind of boring. So then we came up with the idea of interweaving the two timelines.

That worked remarkably well. The talk starts with me describing the creation of CERN in the 1950s. Then Remy talks about the first day of the hack week. I then talk about events in the 1960s. Remy talks about the second day at CERN. This continues until we join up about half way through the talk: I’ve arrived at the moment that Tim Berners-Lee first published the proposal for the World Wide Web, and Remy has arrived at the point of having running code.

At this point, the presentation switches gears and turns into a demo. I do not have the fortitude to do a live demo, so this was all down to Remy. He did it flawlessly. I have so much respect for people brave enough to do live demos, and do them well.

But the talk doesn’t finish there. There’s a coda about our return to CERN a month after the initial hack week. This was an opportunity for both of us to close out the talk with our hopes and dreams for the World Wide Web.

I know I’m biased, but I thought the structure of the presentation worked really well: two interweaving timelines culminating in a demo and finishing with the big picture.

There was a forcing function on preparing this presentation: Remy was moving house, and I was already going to be away speaking at some other events. That limited the amount of time we could be in the same place to practice the talk. In the end, I think that might have helped us make the most of that time.

We were both feeling the pressure to tell this story well—it means so much to us. Personally, I found that presenting with Remy made me up my game. Like I said:

It’s been a real treat working with Remy on this. Don’t tell him I said this, but he’s kind of a web hero of mine, so this was a real honour and a privilege for me.

This talk could have easily turned into a boring slideshow of “what we did on our holidays”, but I think we managed to successfully avoid that trap. We’re both proud of this talk and we’d love to give it again some time. If you’d like it at your event, get in touch.

In the meantime, you can read the text, watch the video, or look at the slides (but the slides really don’t make much sense in isolation).

The Weight of the WWWorld is Up to Us by Patty Toland

It’s Patty Toland’s first time at An Event Apart! She’s from the fantabulous Filament Group. They’re dedicated to making the web work for everyone.

A few years ago, a good friend of Patty’s had a medical diagnosis that required everyone to pull together. Another friend shared an article about how not to say the wrong thing. This is ring theory. In a moment of crisis, the person involved is in the centre. You need to understand where you are in this ring structure, and only ever help and comfort inwards and dump concerns and problems outwards.

At the same time, Patty spent time with her family at the beach. Everyone reads the same books together. There was a book about a platoon leader in Vietnam. 80% of the story was literally a litany of stuff—what everyone was carrying. This was peppered with the psychic and emotional loads that they were carrying.

A month later there was a lot of coverage of Syrian refugees arriving in Europe. People were outraged to see refugees carrying smartphones as though that somehow showed they weren’t in a desperate situation. But smartphones are absolutely a necessity in that situation, and most of the phones were less expensive, lower-end devices. Refugeeinfo.eu was a useful site for people in crisis, but the navigation was designed to require JavaScript.

When people think about mobile, they think about freedom and mobility. But with that JavaScript decision, the developers piled baggage on to the users.

There was a common assertion that slow networks were a third-world challenge. Remember Facebook’s network challenges? They always talked about new markets in India and Africa. The implication is that this isn’t our problem in, say, Omaha or New York.

Pew Research provided a lot of data back then that showed that this thinking was wrong. Use of cell phones, especially smartphones and tablets, escalated dramatically in the United States. There was a trend towards mobile-only usage. This was in low-income households—about one third of the population. Among 5,400 panelists, 15% did not have a JavaScript-enabled device.

Pew Research provided updated data this year. The research shows an increase in those trends. Half of the population access the web primarily on mobile. The cost of a broadband subscription is too expensive for many people. Sometimes broadband access simply isn’t available.

There’s a term called “the homework gap.” Two thirds of teachers assign broadband-dependent homework, while one third of students have no access to broadband.

At most 37% of people have unlimited data. Most people run out of data on a frequent basis.

Speed also varies wildly. 4G doesn’t really mean anything. The data is all over the place.

This shows that network issues are definitely not just a third world challenge.

On the 25th anniversary of the web, Tim Berners-Lee said the web’s potential was only just beginning to be glimpsed. Everyone has a role to play to ensure that the web serves all of humanity. In his contract for the web, Tim outlined what governments, companies, and users need to do. This reminded Patty of ring theory. The user is at the centre. Designers and developers are in the next circle out. Then there’s the circle of companies. Then there are platforms, browsers, and frameworks. Finally there’s the outer circle of governments.

Are we helping in or dumping in? If you look at the data for the average web page size (2 megabytes), we are definitely dumping in. The size of third-party JavaScript has octupled.

There’s no way for a user to know before clicking a link how big and bloated the page is going to be. Even if they abandon the page load, they’ve still used (and wasted) a lot of data.

Third party scripts—like ads—are really bad at dumping in (to use the ring theory model). The best practices for ads suggest that up to 100 additional HTTP requests is totally acceptable. Unbelievable! It doesn’t matter how performant you’ve made a site when this crap gets piled on top of it.

In 2018, the internet’s data centres alone may already have had the same carbon footprint as all global air travel. This will probably triple in the next seven years. The amount of carbon it takes to train a single AI algorithm is more than the entire life cycle of a car. Then there’s fucking Bitcoin. A single Bitcoin transaction could power 21 US households. It is designed to use—specifically, waste—more and more energy over time.

What should we be doing?

Accessibility should be at the heart of what we build. Plan, test, educate, and advocate. If advocacy doesn’t work, fear can be a motivator. There’s an increase in accessibility lawsuits.

Our websites should be as light as possible. Ask, measure, monitor, and optimise. RequestMap is a great tool for visualising requests. You can see the size and scale of third-party requests. You can also see when images are far, far bigger than they need to be.

Cast a critical eye over everything and pare it down. Set performance budgets—file size budgets, for example. Optimise images, subset custom fonts, lazyload images and videos, get third-party tools out of the critical path (or out completely), and seek out lighter frameworks.

Test on real devices that real people are using. See Alex Russell’s data on the differences between the kind of devices we use and typical low-end devices. We need to stop burying people in JavaScript.

Push the boundaries. See the amazing work that Adrian Holovaty did with Soundslice. He had to make on-the-fly sheet music generation work on old iPads that musicians like to use. He recommends keeping old devices around to see how poorly your product works on them.

If you have some power, then your job is to empower somebody else.

—Toni Morrison

Service workers and browser extensions

I quite enjoy a good bug hunt. Just yesterday, myself and Cassie were doing some bugfixing together. As always, the first step was to try to reproduce the problem and then isolate it. Which reminds me…

There’ve been a few occasions when I’ve been trying to debug service worker issues. The problem is rarely in reproducing the issue—it’s isolating the cause that can be frustrating. I try changing a bit of code here, and a bit of code there, in an attempt to zero in on the problem, but with no luck. Before long, I’m tearing my hair out staring at code that appears to have nothing wrong with it.

And that’s when I remember: browser extensions.

I’m currently using Firefox as my browser, and I have extensions installed to stop tracking and surveillance (these technologies are usually referred to as “ad blockers”, but that’s a bit of a misnomer—the issue isn’t with the ads; it’s with the invasive tracking).

If you think about how a service worker does its magic, it’s as if it’s sitting in the browser, waiting to intercept any requests to a particular domain. It’s like the service worker is the first port of call for any requests the browser makes. But then you add a browser extension. The browser extension is also waiting to intercept certain network requests. Now the extension is the first port of call, and the service worker is relegated to be next in line.

This, apparently, can cause issues (presumably depending on how the browser extension has been coded). In some situations, network requests that should work just fine start to fail, executing the catch clauses of fetch statements in your service worker.

So if you’ve been trying to debug a service worker issue, and you can’t seem to figure out what the problem might be, it’s not necessarily an issue with your code, or even an issue with the browser.

From now on when I’m troubleshooting service worker quirks, I’m going to introduce a step zero, before I even start reproducing or isolating the bug. I’m going to ask myself, “Are there any browser extensions installed?”

I realise that sounds as basic as asking “Are you sure the computer is switched on?” but there’s nothing wrong with having a checklist of basic questions to ask before moving on to the more complicated task of debugging.

I’m going to make a checklist. Then I’m going to use it …every time.

Empire State

I’m in New York. Again. This time it’s for Google’s AMP Conf, where I’ll be giving ‘em a piece of my mind on a panel.

The conference starts tomorrow so I’ve had a day or two to acclimatise and explore. Seeing as Google are footing the bill for travel and accommodation, I’m staying at a rather nice hotel close to the conference venue in Tribeca. There’s live jazz in the lounge most evenings, a cinema downstairs, and should I request it, I can even have a goldfish in my room.

Today I realised that my hotel sits in the apex of a triangle of interesting buildings: carrier hotels.

32 Avenue of the Americas: “Telephone wires and radio unite to make neighbors of nations”

Looming above my hotel is 32 Avenue of the Americas. On the outside the building looks like your classic Gozer the Gozerian style of New York building. Inside, the lobby features a mosaic on the ceiling, and another on the wall extolling the connective power of radio and telephone.

The same architects also designed 60 Hudson Street, which has a similar Art Deco feel to it. Inside, there’s a cavernous hallway running through the ground floor but I can’t show you a picture of it. A security guard told me I couldn’t take any photos inside …which is a little strange seeing as it’s splashed across the website of the building.

60 Hudson: “HEADQUARTERS The Western Union Telegraph Co. and telegraph capitol of the world 1930-1973”

I walked around the outside of 60 Hudson, taking more pictures. Another security guard asked me what I was doing. I told her I was interested in the history of the building, which is true; it was the headquarters of Western Union. For much of the twentieth century, it was a world hub of telegraphic communication, in much the same way that a beach hut in Porthcurno was the nexus of the nineteenth century.

For a 21st century hub, there’s the third and final corner of the triangle at 33 Thomas Street. It’s a breathtaking building. It looks like a spaceship from a Chris Foss painting. It was probably designed more like a spacecraft than a traditional building—its primary purpose was to withstand an atomic blast. Gone are niceties like windows. Instead there’s an impenetrable monolith that looks like something straight out of a dystopian sci-fi film.

33 Thomas Street, New York

Brutalist on the outside, its interior is host to even more brutal acts of invasive surveillance. The Snowden papers revealed this AT&T building to be a centrepiece of the Titanpointe programme:

They called it Project X. It was an unusually audacious, highly sensitive assignment: to build a massive skyscraper, capable of withstanding an atomic blast, in the middle of New York City. It would have no windows, 29 floors with three basement levels, and enough food to last 1,500 people two weeks in the event of a catastrophe.

But the building’s primary purpose would not be to protect humans from toxic radiation amid nuclear war. Rather, the fortified skyscraper would safeguard powerful computers, cables, and switchboards. It would house one of the most important telecommunications hubs in the United States…

Looking at the building, it requires very little imagination to picture it as the lair of villainous activity. Laura Poitras’s short film Project X basically consists of a voiceover of someone reading an NSA manual, some ominous background music, and shots of 33 Thomas Street looming in its oh-so-loomy way.

A top-secret handbook takes viewers on an undercover journey to Titanpointe, the site of a hidden partnership. Narrated by Rami Malek and Michelle Williams, and based on classified NSA documents, Project X reveals the inner workings of a windowless skyscraper in downtown Manhattan.

Assumptions

Last year Benedict Evans wrote about the worldwide proliferation and growth of smartphones. Nolan referenced that post when he extrapolated the kind of experience people will be having:

As Benedict Evans has noted, the next billion people who are poised to come online will be using the internet almost exclusively through smartphones. And if Google’s plans with Android One are any indication, then we have a fairly good idea of what kind of devices the “next billion” will be using:

  • They’ll mostly be running Android.
  • They’ll have decent specs (1GB RAM, quad-core processors).
  • They’ll have an evergreen browser and WebView (Android 5+).
  • What they won’t have, however, is a reliable internet connection.

This is the same argument that Tom made in his presentation at Responsive Field Day. The main point is that network conditions are unreliable, and I absolutely agree that we need to be very, very mindful of that. But I’m not so sure about the other conditions either. They smell like assumptions:

Assumptions are the problem. Whether it’s assumptions about screen size, assumptions about being able-bodied, assumptions about network connectivity, or assumptions about browser capabilities, I don’t think any assumptions are a safe bet. Now you might quite reasonably say that we have to make some assumptions when we’re building on the web, and you’d be right. But I think we should still aim to keep them to a minimum.

It’s not necessarily true that all those new web users will be running a WebView browser like Chrome—there are millions of Opera Mini users, and I would expect that number to rise, given all the speed and cost benefits that proxy browsing brings.

I also don’t think that just because a device is a smartphone it necessarily means that it’s a pocket supercomputer. It might seem like a reasonable assumption to make, given the specs of even a low-end smartphone, but the specs don’t tell the whole story.

Alex gave a great presentation at the recent Polymer Summit. He dives deep into exactly how smartphones at the lower end of the market deal with websites.

I don’t normally enjoy listening to talk of hardware and specs, but Alex makes the topic very compelling by tying it directly to how we build websites. In short, we’re using waaaaay too much JavaScript. The message here is not “don’t use JavaScript” but rather “use JavaScript wisely.” Alas, many of the current crop of monolithic frameworks aren’t well suited to this.

Alex’s talk prompted Michael Scharnagl to take a look back at past assumptions and lessons learned on the web, from responsive design to progressive web apps.

We are consistently improving and we often have to realize that our assumptions are wrong.

This is particularly true when we’re making assumptions about how people will access the web.

It’s not enough to talk about the “next billion” in abstract, like an opportunity to reach teeming masses of people ripe for monetization. We need to understand their lives and their priorities with the sort of detail that can build empathy for other people living under vastly different circumstances.

That’s from an article Ethan linked to.

Ice cold in Copenhagen

I went to Copenhagen last week for the Coldfront conference. It was lovely to be back in Denmark’s capital. I used to go over there every year when the Reboot conference was running, but that wrapped up a few years back so it’s been quite a while since I had the opportunity to savour Copenhagen’s architecture, culture, coffee, food, and beer.

Coldfront was fun. Kenneth has modelled the format of the event on Remy’s Full Frontal conference—one day of a single track of front-end dev talks in a comfy cinema.

Going to a focused conference like this is a great way of getting a short sharp shock of what’s hot—like a State of the Union address for the web. At Coldfront there were some very clear themes around building for resilience, and specifically routing around the damage of inconsistent connectivity. There was a very clear message—from Paul, Alex, and Patrick (blog imminent)—that the network is not always on our side. Making our sites work offline should be much more of a priority than it currently is.

On a related note, the technology that was mentioned the most was Service Workers …and Jake wasn’t even there! Heck, even I mentioned it in glowing terms in my own little presentation. I was admiring the way it has been designed specifically to be used in a progressive enhancement kind of way.

So if I were Mr. McGuire in The Graduate, my line to a web developer equivalent of Dustin Hoffman would be “I want to say one word to you, just one word. Are you listening? …Service Workers.”

Relinkification

On Jessica’s recommendation, I read a piece on the Guardian website called The eeriness of the English countryside:

Writers and artists have long been fascinated by the idea of an English eerie – ‘the skull beneath the skin of the countryside’. But for a new generation this has nothing to do with hokey supernaturalism – it’s a cultural and political response to contemporary crises and fears

I liked it a lot. One of the reasons I liked it was not just for the text of the writing, but the hypertext of the writing. Throughout the piece there are links off to other articles, books, and blogs. For me, this enriches the piece and it set me off down some rabbit holes of hyperlinks with fascinating follow-ups waiting at the other end.

Back in 2010, Scott Rosenberg wrote a series of three articles over the course of two months called In Defense of Hyperlinks:

  1. Nick Carr, hypertext and delinkification,
  2. Money changes everything, and
  3. In links we trust.

They’re all well worth reading. The whole thing was kicked off with a well-rounded debunking of Nicholas Carr’s claim that hyperlinks harm text. Instead, Rosenberg finds that hyperlinks within a text embiggen the writing …providing they’re done well:

I see links as primarily additive and creative. Even if it took me a little longer to read the text-with-links, even if I had to work a bit harder to get through it, I’d come out the other side with more meat and more juice.

Links, you see, do so much more than just whisk us from one Web page to another. They are not just textual tunnel-hops or narrative chutes-and-ladders. Links, properly used, don’t just pile one “And now this!” upon another. They tell us, “This relates to this, which relates to that.”

The difference between a piece of writing being part of the web and a piece of writing being merely on the web is something I talked about a few years back in a presentation called Paranormal Interactivity at ‘round about the 15 minute mark:

Imagine if you were to take away all the regular text and only left the hyperlinks on Wikipedia, you could still get the gist, right? Every single link there is like a wormhole to another part of this “choose your own adventure” game that we’re playing every day on the web. I love that. I love the way that Wikipedia uses links.

That ability of the humble hyperlink to join concepts together lies at the heart of Tim Berners-Lee’s World Wide Web …and Ted Nelson’s Project Xanadu, and Douglas Engelbart’s Dynamic Knowledge Environments, and Vannevar Bush’s idea of the Memex. All of those previous visions of a hyperlinked world were—in many ways—superior to the web. But the web shipped. It shipped with brittle, one-way linking, but it shipped. And now today anyone can create a connection between two ideas by linking to resources that represent those ideas. All you need is an HTML document that contains some A elements with href attributes, and a URL to act as that document’s address.

Like the one you’re accessing now.

Not only can I link to that article on the Guardian’s website, I can also pair it up with other related links, like Warren Ellis’s talk from dConstruct 2014:

Inventing the next twenty years, strategic foresight, fictional futurism and English rural magic: Warren Ellis attempts to convince you that they are all pretty much the same thing, and why it was very important that some people used to stalk around village hedgerows at night wearing iron goggles.

There is definitely the same feeling of “the eeriness of the English countryside” in Warren’s talk. If you haven’t listened to it yet, set aside some time. It is enticing and disquieting in equal measure …like many of the works linked to from the piece on the Guardian.

There’s another link I’d like to make, and it happens to be to another dConstruct speaker.

From that Guardian piece:

Yet state surveillance is no longer testified to in the landscape by giant edifices. Instead it is mostly carried out by software programs running on computers housed in ordinary-looking government buildings, its sources and effects – like all eerie phenomena – glimpsed but never confronted.

James Bridle has been confronting just that. His recent series The Nor took him on a tour of a parallel, obfuscated English countryside. He returned with three pieces of hypertext:

  1. All Cameras Are Police Cameras,
  2. Living in the Electromagnetic Spectrum, and
  3. Low Latency.

I love being able to do this. I love being able to add strands to this world-wide web of ours. Not only can I say “this idea reminds me of another idea”, but I can point to both ideas. It’s up to you whether you follow those links.

Visionaries

In 2005 I went to South by Southwest for the first time. It was quite an experience. Not only did I get to meet lots of people with whom I had previously only interacted online, but I also got to meet lots and lots of new people. Many of my strongest friendships today started in Austin that year.

Back before it got completely unmanageable, Southby was a great opportunity to mix up planned gatherings with serendipitous encounters. Lunchtime, for example, was often a chaotic event filled with happenstance: you could try to organise a small group to go to a specific place, but it would inevitably spiral into a much larger group going to wherever could seat that many people.

One lunchtime I found myself sitting next to a very nice gentleman and we got on to the subject of network theory. Back then I was very obsessed with small-world networks, the strength of weak ties, and all that stuff. I’m still obsessed with all that stuff today, but I managed to exorcise a lot of my thoughts when I gave my 2008 dConstruct talk, The System Of The World. After giving that magnum opus, I felt like I had got a lot of network-related stuff off my chest (and off my brain).

Anyway, back in 2005 I was still voraciously reading books on the subject and I remember recommending a book to that nice man at that lunchtime gathering. I can’t even remember which book it was now—maybe Nexus by Mark Buchanan or Critical Mass by Philip Ball. In any case, I remember this guy making a note of the book for future reference.

It was only later that I realised that that “guy” was David Isenberg. Yes, that David Isenberg, author of the seminal Rise of the Stupid Network, one of the most important papers ever published about telecommunications networks in the twentieth century (you can watch—and huffduff—a talk he gave called Who will run the Internet? at the Oxford Internet Institute a few years back).

I was reminded of that lunchtime encounter from seven years ago when I was putting together a readlist of visionary articles today. The list contains:

  1. As We May Think by Vannevar Bush
  2. Information Management: A Proposal by Tim Berners-Lee (vague but exciting!)
  3. Rise of the Stupid Network by David Isenberg
  4. There’s Plenty of Room at the Bottom by Richard Feynman
  5. The Coming Technological Singularity: How to Survive in the Post-Human Era by Vernor Vinge

There are others that should be included on that list but these are the ones I could find in plain text or HTML rather than PDF.

Feel free to download the epub file of those five articles together and catch up on some technology history on your Kindle, iPad, iPhone or other device of your choosing.

Things

Put the kettle on, make yourself a cup of tea, and settle down to read a couple of thought-provoking pieces about networked devices.

First up, Scott Jenson writes Of Bears, Bats, and Bees: Making Sense of the Internet of Things:

The Internet of Things is a growing, changing meme. Originally it was meant to invoke a giant swarm of cheap computation across the globe but recently has been morphing and blending, even insinuating, into established product concepts.

Secondly, Charles Stross has published an abridged version of a talk he gave back in June called How low (power) can you go?:

The logical end-point of Moore’s Law and Koomey’s Law is a computer for every square metre of land area on this planet — within our lifetimes. And, speaking as a science fiction writer, trying to get my head around the implications of this technology for our lives is giving me a headache. We’ve lived through the personal computing revolution, and the internet, and now the advent of convergent wireless devices — smartphones and tablets. Ubiquitous programmable sensors will, I think, be the next big step, and I wouldn’t be surprised if their impact is as big as all the earlier computing technologies combined.

And I’ll take this opportunity to once again point to one of my favourite pieces on the “Internet of Things” by Russell Davies:

The problem, though, with the Internet Of Things is that it falls apart when it starts to think about people. When big company Internet Of Things thinkers get involved they tend to spawn creepy videos about sleek people in sleek homes living optimised lives full of smart objects. These videos seem to radiate the belief that the purpose of a well-lived life is efficiency. There’s no magic or joy or silliness in it. Just an optimised, efficient existence. Perhaps that’s why the industry persists in inventing the Internet Fridge. It’s top-down design, not based on what people might fancy, but on what technologies companies are already selling.

Fortunately, though, there’s another group of people thinking about the Internet of Things - enthusiasts and inventors who are building their own internet connected things, adding connectivity and intelligence to the world in their own ways.

You can read it on your networked device or you can listen to it on your networked device …while you’re having your cup of tea …in a non-networked cup …with water from a non-networked kettle.

BBC - Podcasts - Four Thought: Russell M. Davies 21 Sept 2011 on Huffduffer

Of Time and the Network and the Long Bet

When I went to Webstock, I prepared a new presentation called Of Time And The Network:

Our perception and measurement of time has changed as our civilisation has evolved. That change has been driven by networks, from trade routes to the internet.

I was pretty happy with how it turned out. It was a 40 minute talk that was pretty evenly split between the past and the future. The first 20 minutes spanned from 5,000 years ago to the present day. The second 20 minutes looked towards the future, first in years, then decades, and eventually in millennia. I was channeling my inner James Burke for the first half and my inner Jason Scott for the second half, when I went off on a digital preservation rant.

You can watch the video and I had the talk transcribed so you can read the whole thing.

It’s also on Huffduffer, if you’d rather listen to it.

Adactio: Articles—Of Time And The Network on Huffduffer

Webstock: Jeremy Keith

During the talk, I pointed to my prediction on the Long Bets site:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

I made the prediction on February 22nd last year (a terrible day for New Zealand). The prediction will reach fruition on 02022-02-22 …I quite like the alliteration of that date.

Here’s how I justified the prediction:

“Cool URIs don’t change” wrote Tim Berners-Lee in 01999, but link rot is the entropy of the web. The probability of a web document surviving in its original location decreases greatly over time. I suspect that even a relatively short time period (eleven years) is too long for a resource to survive.

Well, during his excellent Webstock talk Matt announced that he would accept the challenge. He writes:

Though much of the web is ephemeral in nature, now that we have surpassed the 20 year mark since the web was created and gone through several booms and busts, technology and strategies have matured to the point where keeping a site going with a stable URI system is within reach of anyone with moderate technological knowledge.

The prediction has now officially been added to the list of bets.

We’re playing for $1000. If I win, that money goes to the Bletchley Park Trust. If Matt wins, it goes to The Internet Archive.

The sysadmin for the Long Bets site is watching this bet with great interest. I am, of course, treating this bet in much the same way that Paul Gilster is treating this optimistic prediction about interstellar travel: I would love to be proved wrong.

The detailed terms of the bet have been set as follows:

On February 22nd, 2022 from 00:01 UTC until 23:59 UTC,
entering the characters http://www.longbets.org/601 into the address bar of a web browser or command line tool (like curl)
OR
using a web browser to follow a hyperlink that points to http://www.longbets.org/601
MUST
return an HTML document that still contains the following text:
“The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.”
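Those terms are concrete enough to check by machine. Here’s a playful sketch of an automated adjudicator (this is not the official adjudication method, and it assumes PHP with allow_url_fopen enabled):

<?php
// Fetch the URL on the day and look for the prediction text.
$needle = 'The original URL for this prediction (www.longbets.org/601) '
        . 'will no longer be available in eleven years.';
$html = file_get_contents('http://www.longbets.org/601');
if ($html !== false && strpos($html, $needle) !== false) {
    // The URL survives: Matt wins.
    echo "The money goes to The Internet Archive.\n";
} else {
    // Link rot wins the day: Jeremy wins.
    echo "The money goes to the Bletchley Park Trust.\n";
}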

The suspense is killing me!

The Audio of the System of the World

Four months after the curtain went down on dConstruct 2008, the final episode of the podcast of the conference has just been published. It’s the audio recording of my talk The System Of The World.

I’m very happy indeed with how the talk turned out: dense and pretentious …but in a good way, I hope. It’s certainly my favourite from the presentations I have hitherto delivered.

Feel free to:

The whole thing is licensed under a Creative Commons attribution licence. You are free—nay, encouraged—to share, copy, distribute, remix and mash up any of those files as long as you include a little attribution lovin’.

If you’ve got a Huffduffer account, feel free to huffduff it.

Moral panic

Thanks to Tom’s always excellent linkage, I came across an in-depth article by Brenda Brathwaite called The Myth of the Media Myth, all about the perception of videogames by non-gamers. The research was prompted by a dinner conversation that highlighted the typical reactions:

It happens the same way every time: People listen and then they say what they’ve been feeling. Videogames are not good for you. Videogames are a waste of time. They isolate children. Kids never go outside to play. They just sit there and stare at the TV all day.

The tone of the opinions reminded me of the Daily Mail attitude to social networking sites. The resonances were so strong that I decided to conduct a quick experiment using my hacky little text substitution script. Here are the terms I swapped, with a rough sketch of the idea after the table:

Original → Substitution
videogame → social networking site
gaming → social networking
game designer → web designer
game → website
play → surf
GTA → Facebook
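
The script itself is nothing clever, just fetch and find-and-replace. It works something like this sketch (not the actual script, just the gist), assuming a JavaScript runtime with a global fetch:

    // Rough sketch of the transmogrifier idea: fetch a page, then swap each term.
    // Longer terms come first so that "game designer" isn't mangled by the "game" rule.
    const SWAPS = [
      ['game designer', 'web designer'],
      ['videogame', 'social networking site'],
      ['gaming', 'social networking'],
      ['game', 'website'],
      ['play', 'surf'],
      ['GTA', 'Facebook'],
    ];

    async function transmogrify(url) {
      let html = await (await fetch(url)).text();
      for (const [from, to] of SWAPS) {
        // A naive global, case-insensitive replace: crude, but that's half the charm.
        html = html.replace(new RegExp(from, 'gi'), to);
      }
      return html;
    }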

Because the original article is paginated, I ran the print version through the transmogrifier. Please excuse any annoying print dialogue boxes. Here’s the final result.

The results are amusing, even accurate. The original article begins:

There are six of us around the table, and the conversation turns to what I do for a living, also known as “my field of study” in academia. “I’m a game designer and a professor,” I say. The dinner had been arranged by a third party in order to connect academics from various institutions for networking purposes.

“You mean videogames?” one of the teachers asks. It’s said with the same professional and courteous tone that one might reserve for asking, “Did you pass gas?”

“Videogames, yes,” I answer. “I’ve been doing it over 20 years now.” Really without any effort at all, I launch into a little love manifesto of sorts, talking about how much I enjoy being a game designer, how wonderful it is to make games, all kinds of games.

After substitution:

There are six of us around the table, and the conversation turns to what I do for a living, also known as “my field of study” in academia. “I’m a web designer and a professor,” I say. The dinner had been arranged by a third party in order to connect academics from various institutions for networking purposes.

“You mean social networking sites?” one of the teachers asks. It’s said with the same professional and courteous tone that one might reserve for asking, “Did you pass gas?”

“social networking sites, yes,” I answer. “I’ve been doing it over 20 years now.” Really without any effort at all, I launch into a little love manifesto of sorts, talking about how much I enjoy being a web designer, how wonderful it is to make websites, all kinds of websites.

The comments from interviewees also hold up. Before:

One friend complained about GTA, admitted she’d never played the game and then offered this: “If you really are interested in deep psychoanalysis… the truth of my disdain for games is from a negative relationship — [a former boyfriend] would play for hours, upon hours, upon hours. Maybe I felt neglected, ignored and disrespected.”

After:

One friend complained about Facebook, admitted she’d never surfed the website and then offered this: “If you really are interested in deep psychoanalysis… the truth of my disdain for websites is from a negative relationship — [a former boyfriend] would surf for hours, upon hours, upon hours. Maybe I felt neglected, ignored and disrespected.”

Even the analysis of the language offers parallels. Original:

“I haven’t found this kind of attitude about games per se. But in my version of your dinner party anecdote, I start with ‘I make games,’ not ‘I make videogames,’ and I’ve never had a response like the one you describe. This leads me to wonder if the very term ‘videogames’ is the problem meme.”

Substitution:

“I haven’t found this kind of attitude about websites per se. But in my version of your dinner party anecdote, I start with ‘I make websites,’ not ‘I make social networking sites,’ and I’ve never had a response like the one you describe. This leads me to wonder if the very term ‘social networking sites’ is the problem meme.”

But most telling of all are the quotes in the closing passages that haven’t been changed one jot from the original:

“If I had a choice, I would want to include these distrustful folks in finding solutions. I would prefer it if they understood. I would prefer it if they could see the long sequence of events that is going to address their fears and create the medium they will inevitably love and participate in, whether they expect to or not. What’s sad is that their ideological, ignorant, hostile, one-dimensional attitudes oversimplify one of the most beautiful problems in human history.”

Back to school

When I went to the Reboot conference in Copenhagen earlier this year, I met plenty of people who were interesting, cool and just plain nice. In fact, I met half of those lovely people before I even arrived in Denmark—it was at Stansted airport, waiting for a delayed flight, that I first met Riccardo Cambiassi, Lee Bryant and David Smith.

David is a teacher at St Paul’s School in London. Lately he’s been organising an ongoing series of guest speakers to come in and talk to the students. Ted Nelson came in and gave a talk a little while back—yes, that Ted Nelson. As you can imagine, I was simultaneously honoured and intimidated when David asked me to come along to the school to give a talk on Designing the Social Web.

Yesterday was the big day. I walked across Hammersmith Bridge and stepped inside a school for the first time in almost twenty years. Despite my nervousness, I felt the talk went well. I put together some slides but they were mostly just notes for myself. I had a whole grab-bag of things I wanted to discuss and while I might have done it in a very unstructured way, I think I managed to cover most of them.

Obviously this was a very different audience than I’m used to speaking to but I really enjoyed that. It was illuminating to go straight to the source and find out how teenagers are using social networking sites. Once the talk and questions were done, we adjourned to lunch—a good old fashioned school dinner—where the discussion continued. I really enjoyed talking with such sharp, savvy young gentlemen.

It isn’t surprising that they’re all so Web-savvy; the Web has always been there for them. Thinking back on my own life, it almost seems in retrospect as if I was just waiting for the Web to come along. Maybe I was born too soon or maybe I’m just young at heart, but I found that I was able to relate very closely with these people who are half my age.

I took the opportunity to test a theory of Jeff Veen’s on the difference in generational attitudes towards open data. Given the following two statements:

  1. my data is private except what I explicitly choose to make public or
  2. my data is public except what I explicitly choose to keep private,

…the overwhelming consensus amongst the students was with the second viewpoint, which happens to be the one I share, though I suspect many people my age don’t.

There were plenty of other stimulating talking points—the Facebook/Beacon debacle was a big topic. It was a great way to spend an afternoon. My thanks to David for inviting me along to the school and my thanks to the young men of St Paul’s for their graciousness in listening to me natter on about small world networks, the strength of weak ties, portable social networks and, inevitably, microformats.

Seeing as I was in London anyway, I took the tube across town to see my collaborators at New Bamboo. That meant that by the time I was leaving London, it was rush hour. Oh joy. Despite the knackering experience of the commute, I managed to stay on my feet long enough to enjoy a great gig in Brighton that evening. It was a long but very fulfilling day.

Open?

The nerdier nether-regions of blogland have been burning through the night with the news of the OpenSocial initiative spearheaded by Google and supported by what Chris so aptly calls the coalition of the willing.

Like Simon, I’ve been trying to get my head around exactly what OpenSocial is all about ever since reading Brady Forrest’s announcement. Here’s what I think is going on:

Facebook has an API that allows third parties to put applications on Facebook profile pages (substitute the word “widget” for “application” for a more accurate picture). Developers have embraced Facebook applications because, well, Facebook is so damn big. But developing an app/widget for Facebook is time-consuming enough that rewriting the same app for a dozen other social networking sites is an unappealing prospect. That’s where OpenSocial comes in. It’s a set of conventions. If you develop to these conventions, your app can live on any of the social networking sites that support OpenSocial: LinkedIn, MySpace, Plaxo and many more.
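
To give a flavour of those conventions: an OpenSocial app is essentially a gadget, an XML wrapper around ordinary HTML and JavaScript that any supporting container can host. A minimal sketch (the exact feature names varied between early versions of the spec):

    <?xml version="1.0" encoding="UTF-8"?>
    <Module>
      <ModulePrefs title="Hello, social world">
        <!-- ask the host container to provide the OpenSocial API -->
        <Require feature="opensocial-0.7"/>
      </ModulePrefs>
      <Content type="html"><![CDATA[
        <div id="msg">Loading…</div>
        <script type="text/javascript">
          // the container calls this once the gadget has loaded
          function init() {
            document.getElementById('msg').innerHTML = 'Hello from a portable widget.';
          }
          gadgets.util.registerOnLoadHandler(init);
        </script>
      ]]></Content>
    </Module>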

Some of the best explanations of OpenSocial are somewhat biased, coming as they do from the people supporting this initiative, but they are still well worth reading.

There’s no doubt that this set of conventions built upon open standards—HTML and JavaScript—is very good for developers. They no longer have to choose what “platforms” they want to support when they’re building widgets.

That’s all well and good but frankly, I’m not very interested in making widgets, apps or whatever you want to call them. I’m interested in portable social networks.

At first glance, it looks like OpenSocial might provide a way of exporting social network relationships. From the documentation:

The People and Friends data API allows client applications to view and update People Profiles and Friend relationships using AtomPub GData APIs with a Google data schema. Your client application can request a list of a user’s Friends and query the content in an existing Profile.
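
Inside a container, the JavaScript flavour of that request looks something like this rough sketch, using the 0.7-era method names (they shifted from version to version):

    // Ask the host container for the current viewer and their friends.
    function fetchFriends() {
      var req = opensocial.newDataRequest();
      req.add(req.newFetchPersonRequest(opensocial.DataRequest.PersonId.VIEWER), 'viewer');
      req.add(req.newFetchPeopleRequest(opensocial.DataRequest.Group.VIEWER_FRIENDS), 'friends');
      req.send(function (response) {
        // The response is a collection of Person objects supplied by the container.
        var friends = response.get('friends').getData();
        friends.each(function (person) {
          console.log(person.getDisplayName());
        });
      });
    }
    gadgets.util.registerOnLoadHandler(fetchFriends);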

But it looks like these API calls are intended for applications sitting on the host platform rather than separate sites hoping to extract contact information. As David Emery points out, this is a missed opportunity:

The problem is, however, that OpenSocial is coming at completely the wrong end of the closed-social-network problem. By far and away the biggest problem in social networking is fatigue, that to join yet another site you have to sign-up again, fill in all your likes and dislikes again and—most importantly—find all your friends again. OpenSocial doesn’t solve this, but if it had it could be truly revolutionary; if Google had gone after opening up the social graph (a term I’m not a fan of, but it seems to have stuck) then Facebook would have become much more of an irrelevance—people could go to whatever site they wanted to use, and still preserve all the interactions with their friends (the bit that really matters).

While OpenSocial is, like OAuth, a technology for developers rather than end users, it does foster a healthy atmosphere of openness that encourages social network portability. Tantek has put together a handy little table to explain how all these technologies fit together:

portability          technology           primary beneficiary
social application   OAuth, OpenSocial    developers
social profile       hCard                users
friends list         XFN                  users
login                OpenID               users
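
The technologies in the bottom three rows are nothing more than semantic HTML, which is why they benefit users directly. A friends list marked up with XFN, for instance, is just links with rel values, and an hCard is just class names. The URLs here are placeholders:

    <!-- XFN: the rel attribute describes my relationship to the person I link to -->
    <a href="http://example.com/alice/" rel="friend met">Alice</a>

    <!-- hCard: class names mark up contact details on any element -->
    <div class="vcard">
      <a class="url fn" href="http://example.com/alice/">Alice</a>
    </div>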

I was initially excited that OpenSocial might be a magic bullet for portable social networks but after some research, it doesn’t look like that’s the case—it’s all about portable social widgets.

But like I said, I’m not entirely sure that I’ve really got a handle on OpenSocial so I’ll be digging deeper. I was hoping to see Patrick Chanezon talk about it at the Web 2.0 Expo in Berlin next week but, wouldn’t you know it, I’m scheduled to give a talk at exactly the same time. I hope there’ll be plenty of livebloggers taking copious notes.