Journal tags: networks

Increment by increment

The bedrock of the World Wide Web is solid. Built atop the protocols of the internet (TCP/IP), its fundamental building blocks remain: URLs of HTML files transmitted over HTTP. Baldur Bjarnason writes:

Even today, the web is like a living fossil, a preserved relic from a different era. Anybody can put up a website. Anybody can run a business over it. I can build an app or service, send the URL to anybody I like, and most people in the world will be able to run it without asking anybody’s permission.

Still, the web has evolved. In fact, that evolution is something that’s also built into its fundamental design. Rather than try to optimise the World Wide Web for one particular use-case, Tim Berners-Lee realised the power of being flexible. Like the internet, the World Wide Web is deliberately dumb.

(I get very annoyed when people talk about the web as being designed for scientific work at CERN. That was merely the first use-case. The web was designed for everything …and nothing in particular.)

Robin Berjon compares the web’s evolution to the ship of Theseus:

That’s why it’s been so hard to agree about what the Web is: the Web is architected for resilience which means that it adapts and transforms. That flexibility is the reason why I’m talking about some mythological dude’s boat. Altogether too often, we consider some aspects of the Web as being invariants when they’re potentially just as replaceable as any other part. This isn’t to say that there are no invariants on the Web.

The web can be changed. That’s both a comfort and a warning. There’s plenty that we should change about today’s web. But there’s also plenty—at the root level—that we should fight to preserve.

And if you want change, the worst way to go about it is to promulgate the notion of burning everything down and starting from scratch. As Erin says in the fourth and final part of her devastating series on Meta in Myanmar:

We don’t get a do-over planet. We won’t get a do-over network.

Instead, we have to work with the internet we made and find a way to rebuild and fortify it to support the much larger projects of repair—political, cultural, environmental—that are required for our survival.

Though, as Robin points out, that doesn’t preclude us from sharing a vision:

Proceeding via small, incremental changes can be a laudable approach, but even then it helps to have a sense for what it is that those small steps are supposed to be incrementing towards.

I’m looking forward to reading what Robin puts forward, particularly because he says “I’m no technosolutionist.”

From a technical perspective, the web has never been better. We have incredible features in HTML, CSS, and JavaScript, all standardised and with amazing interoperability between browsers. The challenges that face the web today are not technical.

That’s one of the reasons why I have no patience for the web3 crowd. Apart from the ridiculous name, they’re focusing on exactly the wrong part of the stack.

Listen to their pitch and they’ll point out that while yes, the fundamental bedrock of the web is indeed decentralised—TCP/IP, HTTP(S)—what’s been constructed on that foundation is increasingly centralised: the power brokers of Google, Meta, Amazon.

And what’s the solution they propose? Replace the underlying infrastructure with something-something-blockchain.

Would that it were so simple.

The problems of today’s web are not technical in nature. The problems of today’s web won’t be solved by technology. If we’re going to solve the problems of today’s web, we’ll need to do it through law, culture, societal norms, and co-operation.

(Feel free to substitute “today’s web” with “tomorrow’s climate”.)

The syndicate

Social networks come and social networks go.

Right now, there’s a whole bunch of social networks coming (Blewski, Freds, Mastication) and one big one going, thanks to Elongate.

Me? I watch all of this unfold like Doctor Manhattan on Mars. I have no great connection to any of these places. They’re all just syndication endpoints to me.

I used to have a checkbox in my posting interface that said “Twitter”. If I wanted to add a copy of one of my notes to Twitter, I’d enable that toggle.

I have, of course, now removed that checkbox. Twitter is dead to me (and it should be dead to you too).

I used to have another checkbox next to that one that said “Flickr”. If I was adding a photo to one of my notes, I could toggle that to send a copy to my Flickr account.

Alas, that no longer works. Flickr only allows you to post 1000 photos before requiring a pro account. Fair enough. I’ve actually posted 20 times that amount since 2005, but I let my pro membership lapse a while back.

So now I’ve removed the “Flickr” checkbox too.

Instead I’ve now got a checkbox labelled “Mastodon” that sends a copy of a note to my Mastodon account.

When I publish a blog post like the one you’re reading now here on my journal, there’s yet another checkbox that says “Medium”. Toggling that checkbox sends a copy of my post to my page on Ev’s blog.

At least it used to. At some point that stopped working too. I was going to start debugging my code, but when I went to the documentation for the Medium API, I saw this:

This repository has been archived by the owner on Mar 2, 2023. It is now read-only.

I guess I missed the memo. I guess Medium also missed the memo, because developers.medium.com is still live. It proudly proclaims:

Medium’s Publishing API makes it easy for you to plug into the Medium network, create your content on Medium from anywhere you write, and expand your audience and your influence.

Not a word of that is accurate.

That page also has a link to the Medium engineering blog. Surely the announcement of the API deprecation would be published there?

Crickets.

Moving on…

I have an account on Bluesky. I don’t know why.

I was idly wondering about sending copies of my notes there when I came across a straightforward solution: micro.blog.

That’s yet another place where I have an account. They make syndication very straightforward. You can go to your account and point to a feed from your own website.

That’s it. Syndication enabled.

It gets better. Micro.blog can also cross-post to other services. One of those services is Bluesky. I gave permission to micro.blog to syndicate to Bluesky so now my notes show up there too.

It’s like dominoes falling: I post something on my website which updates my RSS feed which gets picked up by micro.blog which passes it on to Bluesky.

I noticed that one of the other services that micro.blog can post to is Medium. Hmmm …would that still work given the abandonment of the API?

I gave permission to micro.blog to cross-post to Medium when my feed of blog posts is updated. It seems to have worked!

We’ll see how long it lasts. We’ll see how long any of them last. Today’s social media darlings are tomorrow’s Friendster and MySpace.

When the current crop of services wither and die, my own website will still remain in full bloom.

Portability

Exactly sixteen years ago on this day, I wrote about Twitter, a service I had been using for a few weeks. I documented how confusing yet compelling it was.

Twitter grew and grew after that. But at some point, it began to feel more like it was shrinking, shrivelling into a husk of its former self.

Just over ten years ago, there was a battle for the soul of Twitter from within. One camp wanted it to become an interoperable protocol, like email. The other camp wanted it to be a content farm, monetised by advertisers. That’s the vision that won. They declared war on the third-party developers who had helped grow Twitter in the first place, and cracked down on anything that didn’t foster e N g A g E m E n T.

The muskofication of Twitter is the nail in the coffin. In the tradition of all scandals since Watergate, I propose we refer to the shocking recent events at Twitter as Elongate.

Post-Elongate Twitter will limp on, I’m sure, but it can never be the fun place it once was. The incentives just aren’t there. As Bastian wrote:

Twitter was once an amplifier for brilliant ideas, for positivity, for change, for a better future. Many didn’t understand the power it had as a communication platform. But that power turned against the exact same people who needed this platform so urgently. It’s now a waste of time and energy at best and a threat to progress and society at worst.

I don’t foresee myself syndicating my notes to Twitter any more. I’ve removed the site from my browser’s bookmarks. I’ve removed it from my phone’s home screen too.

For someone who’s been verified on Twitter for years, with over 140,000 followers, this should probably feel like a bigger deal than it does. I echo Robin’s observation:

The speed with which Twitter recedes in your mind will shock you. Like a demon from a folktale, the kind that only gains power when you invite it into your home, the platform melts like mist when that invitation is rescinded.

Meanwhile, Mastodon is proving to be thoroughly enjoyable. Some parts are still rough around the edges, but compared to Twitter in 2006, it’s positively polished.

Interestingly, the biggest complaint that I and my friends had about Twitter all those years ago wasn’t about Twitter per se, but about lock-in:

Twitter is yet another social network where we have to go and manually add all the same friends from every other social network.

That’s the very thing that sets the fediverse apart: the ability to move from one service to another and bring your social network with you. Now Matt is promising to add ActivityPub to Tumblr. That future we wanted sixteen years ago might finally be arriving.

That fediverse feeling

Right now, Twitter feels like Dunkirk beach in May 1940. And look, here comes a plucky armada of web servers running Mastodon instances!

Others have written some guides to getting started on Mastodon:

There are also tools like Twitodon to help you migrate from Twitter to Mastodon.

Getting on board isn’t completely frictionless. Understanding how Mastodon works can be confusing. But then again, so was Twitter fifteen years ago.

Right now, many Mastodon instances are struggling with the influx of new sign-ups. But this is temporary. And actually, it’s also very reminiscent of the early unreliable days of Twitter.

I don’t want to go into the technical details of Mastodon and the fediverse—even though those details are fascinating and impressive. What I’m really struck by is the vibe.

In a nutshell, I’m loving it! It feels …nice.

I was fully expecting Mastodon to be full of meta-discussions about Mastodon, but in the past few weeks I’ve enjoyed people posting about stone circles, astronomy, and—obviously—cats and dogs.

The process of finding people to follow has been slow, but in a good way. I’ve enjoyed seeking people out. It’s been easier to find the techy folks, but I’ve also been finding scientists, journalists, and artists.

On the one hand, the niceness of the experience isn’t down to technical architecture; it’s all about the social norms. On the other hand, those social norms are very much directed by technical decisions. The folks working on the fediverse for the past few years have made very thoughtful design decisions to amplify niceness and discourage nastiness. It’s all very gratifying to experience!

Personally, I’m posting to Mastodon via my own website. As much as I’m really enjoying Mastodon, I still firmly believe that nothing beats having control of your own content on your domain.

But I also totally get that not everyone has the same set of priorities as me. And frankly, it’s unrealistic to expect everyone to have their own domain name.

It’s like there’s a spectrum of ownership. On one end, there’s publishing on your own website. On the other end, there’s publishing on silos like Twitter, Facebook, Medium, Instagram, and MySpace.

Publishing on Mastodon feels much closer to the website end of the spectrum than it does to the silo end of the spectrum. If something bad happens to the Mastodon instance you’re on, you can up and move to a different instance, taking your social graph with you.

In a way, it’s like delegating domain ownership to someone you trust. If you don’t have the time, energy, resources, or interest in having your own domain, but you trust someone who’s running a Mastodon instance, it’s the next best thing to publishing on your own website.

Simon described it well when he said Mastodon is just blogs:

A Mastodon server (often called an instance) is just a shared blog host. Kind of like putting your personal blog in a folder on a domain on shared hosting with some of your friends.

Want to go it alone? You can do that: run your own dedicated Mastodon instance on your own domain.

And rather than compare Mastodon to Twitter, Simon makes a comparison with RSS:

Do you still miss Google Reader, almost a decade after it was shut down? It’s back!

A Mastodon server is a feed reader, shared by everyone who uses that server.

Lots of other folks are feeling the same excitement in the air that I’m getting:

Bastian wrote:

Real conversations. Real people. Interesting content. A feeling of a warm welcoming group. No algorithm to mess around with our timelines. No troll army to destroy every tiny bit of peace. Yes, Mastodon is rough around the edges. Many parts are not intuitive. But this roughness somehow added to the positive experience for me.

This could really work!

Brent Simmons wrote:

The web is wide open again, for the first time in what feels like forever.

I concur! Though, like Paul, I love not being beholden to either Twitter or Mastodon:

I love not feeling bound to any particular social network. This website, my website, is the one true home for all the stuff I’ve felt compelled to write down or point a camera at over the years. When a social network disappears, goes out of fashion or becomes inhospitable, I can happily move on with little anguish.

But like I said, I don’t expect everyone to have the time, means, or inclination to do that. Mastodon definitely feels like it shares the same indie web spirit though.

Personally, I recommend experiencing Mastodon through the website rather than a native app. Mastodon instances are progressive web apps so you can add them to your phone’s home screen.

You can find me on Mastodon as @adactio@mastodon.social

I’m not too bothered about what instance I’m on. It really only makes a difference to my local timeline. And if I do end up finding an instance I prefer, then I know that migrating will be quite straightforward, by design. Perhaps I should be on an instance with a focus on front-end development or the indie web. I still haven’t found much of an Irish traditional music community on the fediverse. I’m wondering if maybe I should start a Mastodon instance for that.

While I’m a citizen of mastodon.social, I’m doing my bit by chipping in some money to support it: sponsorship levels on Patreon start at just $1 a month. And while I can’t offer much technical assistance, I opened my first Mastodon pull request with a suggested improvement for the documentation.

I’m really impressed with the quality of the software. It isn’t perfect but considering that it’s an open source project, it’s better than most VC-backed services with more and better-paid staff. As Giles said, comparing it to Twitter:

I’m using Mastodon now and it’s not the same, but it’s not shit either. It’s different. It takes a bit of adjustment. And I’m enjoying it.

Most of all, I love, love, love that Mastodon demonstrates that things can be different. For too long we’ve been told that behavioural advertising was an intrinsic part of being online, that social networks must inevitably be monolithic centralised beasts, that we have to relinquish control to corporations in order to be online. The fediverse is showing us a better way. And this isn’t just a proof of concept either. It’s here now. It’s here to stay, if you want it.

Syndicating to Mastodon

I’ve been contemplating a checkbox. The label for this checkbox reads:

This is a bot account

Let me back up…

What seems like decades ago, but was in fact just a few weeks back, Elon Musk bought Twitter and began burning it to the ground. His admirers insist he’s playing some form of four-dimensional chess, but to the rest of us, his actions are indistinguishable from those of a spoilt rich kid who doesn’t understand what a social network is.

It wasn’t giving me much cause for anguish personally. For the past eight years, I’ve only used Twitter as a syndication endpoint for my own notes. But I understand that’s a very privileged position to be in. Most people on Twitter don’t have the same luxury of independence. It’s genuinely maddening and saddening to see their years of sharing destroyed by one cruel idiot.

Lots of people started moving to Mastodon. I figured I should do the same for my syndicated notes.

At first, I signed up for an account on mastodon.cloud. No particular reason. But that’s where I saw this very insightful post from Anil Dash:

When it came time to reckon with social media’s failings, nobody ran to the “web3” platforms. Nobody asked “can I get paid per message”? Nobody asked about the blockchain. The community of people who’ve been quietly doing this work for years (decades!) ended up being the ones who welcomed everyone over, as always.

I was getting my account all set up and beginning to follow some other folks, when I realised that I actually already had an existing account over on mastodon.social. Doh! Turns out that I signed up back in 2017 to kick the tyres, but never did much else because there weren’t many other people around back then. Oh, how times have changed!

Anyway, I thought I had really screwed up by having two accounts but this turned out to be an opportunity to experience some of the thoughtfulness in Mastodon’s design. The process of migrating from one Mastodon account to another—on a completely different instance—was very smooth! It was clear that this wasn’t an afterthought. This is an essential part of the fediverse and the design of the migration flow reflects that.

This gives me enormous peace of mind. If I ever want to switch to a different instance and still keep my network intact, I know it won’t be a problem. Mastodon is like the opposite of the roach-motel mentality that permeates most VC-backed so-called social networks.

As I played around some more—reading, following, exploring—my feelings of fondness only grew stronger. I like this place a lot!

I definitely wanted to syndicate my notes to Mastodon. At first, I implemented a straightforward RSS-to-Mastodon syndication using IFTTT (If This Then That), thanks to Matthias’s excellent tutorial.

But that didn’t feel quite right. When I syndicate to Twitter, I make a conscious choice each time. There’s a “Twitter” toggle that I can enable or disable in my posting interface. Mastodon deserved the same level of thoughtfulness.

So I switched off the IFTTT recipe and started exploring the Mastodon API. It’s going to sound like a humblebrag when I tell you that I got cross-posting working in almost no time at all, but that’s not a testament to my coding prowess (I’m really not very good); it’s a testament to the Mastodon API, which was a joy to work with.

  1. On your Mastodon instance, go to /settings/applications.
  2. Click on New Application.
  3. Fill in the details about your website and select write:statuses (and probably write:media) from the Scopes list.
  4. Copy Your access token to use in API calls.
  5. Write some sloppy code (in my case, PHP that uses cURL, along the lines of the sketch below).
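For the record, here’s roughly what that sloppy code can look like. This is a minimal sketch rather than the exact code running on this site: the instance URL, the access token, and the wording of the note are all placeholders.

<?php
// A sketch of cross-posting a note to Mastodon with PHP and cURL.
// The instance URL and access token below are placeholders.
$instance = 'https://mastodon.social';
$token    = 'YOUR_ACCESS_TOKEN'; // copied from /settings/applications

// POST the text of the note to the statuses endpoint.
$ch = curl_init($instance . '/api/v1/statuses');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer ' . $token],
    CURLOPT_POSTFIELDS     => ['status' => 'Hello from my own website!'],
]);
$status = json_decode(curl_exec($ch), true);
curl_close($ch);

// The response includes the URL of the newly created post.
echo $status['url'];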

I did hit a wall when it came to posting images. That took me a while to get working, and I couldn’t figure out why. Was it something at Mastodon’s end while it was struggling under the influx of new users? As it turns out, no. It was entirely down to me being an idiot. (You know that situation where you’re working on a problem for ages and you’ve become convinced it’s an extremely gnarly rocket-science problem, but then it turns out to be something stupid like a typo? Yeah. That.)

Then there’s the whole question of how to receive replies, likes, and reboosts from Mastodon here on my own site. Luckily, that was super easy, thanks to Brid.gy. One click and I was done. I love Brid.gy!

Take this note, for example. There’s a version on Twitter and a version on Mastodon. The original version on my own site gets responses from both places.

If I’m replying to a response on Twitter, I do not syndicate that to Mastodon.

Likewise, if I’m replying to a response on Mastodon, I do not syndicate that to Twitter.

Oh, one thing worth mentioning: if you’re sending a reply to something on Mastodon using the API, there’s an in_reply_to_id field for you to provide. But you should also include the full @username@instance of the person you’re replying to at the beginning of the message to ensure that it’s displayed as a reply rather than showing up as a regular post. Note the difference between this note on my site and its syndicated version on Mastodon.
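To make that concrete, here’s a sketch of a syndicated reply. The handle and the id below are invented for illustration:

<?php
// A sketch of a syndicated reply, posted to the same statuses endpoint.
$ch = curl_init('https://mastodon.social/api/v1/statuses');
curl_setopt_array($ch, [
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POST           => true,
    CURLOPT_HTTPHEADER     => ['Authorization: Bearer YOUR_ACCESS_TOKEN'],
    CURLOPT_POSTFIELDS     => [
        // The @username@instance prefix is what makes it display as a reply…
        'status'         => '@example@mastodon.social Thanks, that worked a treat!',
        // …and in_reply_to_id threads it under the post being replied to.
        'in_reply_to_id' => '109372081994491031',
    ],
]);
$reply = json_decode(curl_exec($ch), true);
curl_close($ch);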

Anyway, now I’m posting to Mastodon, but I’m doing it through the interface of my own website. Which brings me to that checkbox in Mastodon’s profile settings:

This is a bot account

The help text reads:

Signal to others that the account mainly performs automated actions and might not be monitored

If I were doing the automatic cross-posting from RSS, I’d definitely tick that box. But as I’m making a conscious decision whenever I syndicate to Mastodon, I think I’m going to leave that checkbox unticked.

My cross-posting is not automated and I’m very much monitoring my Mastodon account …because I’m enjoying my Mastodon experience more than I’ve enjoyed anything online for quite some time. Highly recommended!

A curl in every port

A few years back, Zach Bloom wrote The History of the URL: Path, Fragment, Query, and Auth. He recently expanded on it and republished it on the Cloudflare blog as The History of the URL. It’s well worth the time to read the whole thing. It’s packed full of fascinating tidbits.

In the section on ports, Zach says:

The timeline of Gopher and HTTP can be evidenced by their default port numbers. Gopher is 70, HTTP 80. The HTTP port was assigned (likely by Jon Postel at the IANA) at the request of Tim Berners-Lee sometime between 1990 and 1992.

Ooh, I can give you an exact date! It was January 24th, 1992. I know this because of the hack week in CERN last year to recreate the first ever web browser.

Kimberly was spelunking down the original source code, when she came across this line in the HTUtils.h file:

#define TCP_PORT 80 /* Allocated to http by Jon Postel/ISI 24-Jan-92 */

We showed this to Jean-François Groff, who worked on the original web technologies like libwww, the forerunner to libcurl. He remembers that day. It felt like they had “made it”, receiving the official blessing of Jon Postel (in the same RFC, incidentally, that gave port 70 to Gopher).

Then he told us something interesting about the next line of code:

#define OLD_TCP_PORT 2784 /* Try the old one if no answer on 80 */

Port 2784? That seems like an odd choice. Most of us would choose something easy to remember.

Well, it turns out that 2784 is easy to remember if you’re Tim Berners-Lee.

Those were the last four digits of his parents’ phone number.

Web talk

At the start of this month I was in Amsterdam for a series of back-to-back events: Indie Web Camp Amsterdam, View Source, and Fronteers. That last one was where Remy and I debuted the talk we’d been working on.

The Fronteers folk have been quick off the mark so the video is already available. I’ve also published the text of the talk here:

How We Built The World Wide Web In Five Days

This was a fun talk to put together. The first challenge was figuring out the right format for a two-person talk. It quickly became clear that Remy’s focus would be on the events of the five days we spent at CERN, whereas my focus would be on the history of computing, hypertext, and networks leading up to the creation of the web.

Now, we could’ve just done everything chronologically, but that would mean I’d do the first half of the talk and Remy would do the second half. That didn’t appeal. And it sounded kind of boring. So then we came up with the idea of interweaving the two timelines.

That worked remarkably well. The talk starts with me describing the creation of CERN in the 1950s. Then Remy talks about the first day of the hack week. I then talk about events in the 1960s. Remy talks about the second day at CERN. This continues until we join up about half way through the talk: I’ve arrived at the moment that Tim Berners-Lee first published the proposal for the World Wide Web, and Remy has arrived at the point of having running code.

At this point, the presentation switches gears and turns into a demo. I do not have the fortitude to do a live demo, so this was all down to Remy. He did it flawlessly. I have so much respect for people brave enough to do live demos, and do them well.

But the talk doesn’t finish there. There’s a coda about our return to CERN a month after the initial hack week. This was an opportunity for both of us to close out the talk with our hopes and dreams for the World Wide Web.

I know I’m biased, but I thought the structure of the presentation worked really well: two interweaving timelines culminating in a demo and finishing with the big picture.

There was a forcing function on preparing this presentation: Remy was moving house, and I was already going to be away speaking at some other events. That limited the amount of time we could be in the same place to practice the talk. In the end, I think that might have helped us make the most of that time.

We were both feeling the pressure to tell this story well—it means so much to us. Personally, I found that presenting with Remy made me up my game. Like I said:

It’s been a real treat working with Remy on this. Don’t tell him I said this, but he’s kind of a web hero of mine, so this was a real honour and a privilege for me.

This talk could have easily turned into a boring slideshow of “what we did on our holidays”, but I think we managed to successfully avoid that trap. We’re both proud of this talk and we’d love to give it again some time. If you’d like it at your event, get in touch.

In the meantime, you can read the text, watch the video, or look at the slides (but the slides really don’t make much sense in isolation).

The Weight of the WWWorld is Up to Us by Patty Toland

It’s Patty Toland’s first time at An Event Apart! She’s from the fantabulous Filament Group. They’re dedicated to making the web work for everyone.

A few years ago, a good friend of Patty’s had a medical diagnosis that required everyone to pull together. Another friend shared an article about how not to say the wrong thing. This is ring theory. In a moment of crisis, the person involved is in the centre. You need to understand where you are in this ring structure, and only ever help and comfort inwards and dump concerns and problems outwards.

At the same time, Patty spent time with her family at the beach. Everyone reads the same books together. There was a book about a platoon leader in Vietnam. 80% of the story was literally a litany of stuff—what everyone was carrying. This was peppered with the psychic and emotional loads that they were carrying.

A month later there was a lot of coverage of Syrian refugees arriving in Europe. People were outraged to see refugees carrying smartphones as though that somehow showed they weren’t in a desperate situation. But smartphones are absolutely a necessity in that situation, and most of the phones were less expensive, lower-end devices. Refugeeinfo.eu was a useful site for people in crisis, but the navigation was designed to require JavaScript.

When people think about mobile, they think about freedom and mobility. But with that JavaScript decision, the developers piled baggage on to the users.

There was a common assertion that slow networks were a third-world challenge. Remember Facebook’s network challenges? They always talked about new markets in India and Africa. The implication is that this isn’t our problem in, say, Omaha or New York.

Pew Research provided a lot of data back then that showed that this thinking was wrong. Use of cell phones, especially smartphones and tablets, escalated dramatically in the United States. There was a trend towards mobile-only usage. This was in low-income households—about one third of the population. Among 5,400 panelists, 15% did not have a JavaScript-enabled device.

Pew Research provided updated data this year. The research shows an increase in those trends. Half of the population access the web primarily on mobile. The cost of a broadband subscription is too expensive for many people. Sometimes broadband access simply isn’t available.

There’s a term called “the homework gap.” Two thirds of teachers assign broadband-dependent homework, while one third of students have no access to broadband.

At most 37% of people have unlimited data. Most people run out of data on a frequent basis.

Speed also varies wildly. 4G doesn’t really mean anything. The data is all over the place.

This shows that network issues are definitely not just a third world challenge.

On the 25th anniversary of the web, Tim Berners-Lee said the web’s potential was only just beginning to be glimpsed. Everyone has a role to play to ensure that the web serves all of humanity. In his contract for the web, Tim outlined what governments, companies, and users need to do. This reminded Patty of ring theory. The user is at the centre. Designers and developers are in the next circle out. Then there’s the circle of companies. Then there are platforms, browsers, and frameworks. Finally there’s the outer circle of governments.

Are we helping in or dumping in? If you look at the data for the average web page size (2 megabytes), we are definitely dumping in. The size of third-party JavaScript has octupled.

There’s no way for a user to know before clicking a link how big and bloated the page is going to be. Even if they abandon the page load, they’ve still used (and wasted) a lot of data.

Third-party scripts—like ads—are really bad offenders when it comes to dumping in (to use the ring theory model). The best practices for ads suggest that up to 100 additional HTTP requests is totally acceptable. Unbelievable! It doesn’t matter how performant you’ve made a site when this crap gets piled on top of it.

In 2018, the internet’s data centres alone may already have had the same carbon footprint as all global air travel. This will probably triple in the next seven years. The amount of carbon it takes to train a single AI algorithm is more than the entire life cycle of a car. Then there’s fucking Bitcoin. A single Bitcoin transaction could power 21 US households. It is designed to use—specifically, waste—more and more energy over time.

What should we be doing?

Accessibility should be at the heart of what we build. Plan, test, educate, and advocate. If advocacy doesn’t work, fear can be a motivator. There’s an increase in accessibility lawsuits.

Our websites should be as light as possible. Ask, measure, monitor, and optimise. RequestMap is a great tool for visualising requests. You can see the size and scale of third-party requests. You can also see when images are far, far bigger than they need to be.

Take a critical look at everything and pare everything down. Set performance budgets—file size budgets, for example. Optimise images, subset custom fonts, lazyload images and videos, get third-party tools out of the critical path (or out completely), and seek out lighter frameworks.

Test on real devices that real people are using. See Alex Russell’s data on the differences between the kind of devices we use and typical low-end devices. We need to stop drowning people in JavaScript.

Push the boundaries. See the amazing work that Adrian Holovaty did with Soundslice. He had to make on-the-fly sheet music generation work on old iPads that musicians like to use. He recommends keeping old devices around to see how poorly your product works on them.

If you have some power, then your job is to empower somebody else.

—Toni Morrison

Empire State

I’m in New York. Again. This time it’s for Google’s AMP Conf, where I’ll be giving ‘em a piece of my mind on a panel.

The conference starts tomorrow so I’ve had a day or two to acclimatise and explore. Seeing as Google are footing the bill for travel and accommodation, I’m staying at a rather nice hotel close to the conference venue in Tribeca. There’s live jazz in the lounge most evenings, a cinema downstairs, and should I request it, I can even have a goldfish in my room.

Today I realised that my hotel sits in the apex of a triangle of interesting buildings: carrier hotels.

32 Avenue of the Americas. “Telephone wires and radio unite to make neighbors of nations.”

Looming above my hotel is 32 Avenue of the Americas. On the outside the building looks like your classic Gozer the Gozerian style of New York building. Inside, the lobby features a mosaic on the ceiling, and another on the wall extolling the connective power of radio and telephone.

The same architects also designed 60 Hudson Street, which has a similar Art Deco feel to it. Inside, there’s a cavernous hallway running through the ground floor but I can’t show you a picture of it. A security guard told me I couldn’t take any photos inside …which is a little strange seeing as it’s splashed across the website of the building.

60 Hudson. “Headquarters: The Western Union Telegraph Co. and telegraph capitol of the world, 1930–1973.”

I walked around the outside of 60 Hudson, taking more pictures. Another security guard asked me what I was doing. I told her I was interested in the history of the building, which is true; it was the headquarters of Western Union. For much of the twentieth century, it was a world hub of telegraphic communication, in much the same way that a beach hut in Porthcurno was the nexus of the nineteenth century.

For a 21st century hub, there’s the third and final corner of the triangle at 33 Thomas Street. It’s a breathtaking building. It looks like a spaceship from a Chris Foss painting. It was probably designed more like a spacecraft than a traditional building—its primary purpose was to withstand an atomic blast. Gone are niceties like windows. Instead there’s an impenetrable monolith that looks like something straight out of a dystopian sci-fi film.

33 Thomas Street, New York.

Brutalist on the outside, its interior is host to even more brutal acts of invasive surveillance. The Snowden papers revealed this AT&T building to be a centrepiece of the Titanpointe programme:

They called it Project X. It was an unusually audacious, highly sensitive assignment: to build a massive skyscraper, capable of withstanding an atomic blast, in the middle of New York City. It would have no windows, 29 floors with three basement levels, and enough food to last 1,500 people two weeks in the event of a catastrophe.

But the building’s primary purpose would not be to protect humans from toxic radiation amid nuclear war. Rather, the fortified skyscraper would safeguard powerful computers, cables, and switchboards. It would house one of the most important telecommunications hubs in the United States…

Looking at the building, it requires very little imagination to picture it as the lair of villainous activity. Laura Poitras’s short film Project X basically consists of a voiceover of someone reading an NSA manual, some ominous background music, and shots of 33 Thomas Street looming in its oh-so-loomy way.

A top-secret handbook takes viewers on an undercover journey to Titanpointe, the site of a hidden partnership. Narrated by Rami Malek and Michelle Williams, and based on classified NSA documents, Project X reveals the inner workings of a windowless skyscraper in downtown Manhattan.

Relinkification

On Jessica’s recommendation, I read a piece on the Guardian website called The eeriness of the English countryside:

Writers and artists have long been fascinated by the idea of an English eerie – ‘the skull beneath the skin of the countryside’. But for a new generation this has nothing to do with hokey supernaturalism – it’s a cultural and political response to contemporary crises and fears

I liked it a lot. One of the reasons I liked it was not just for the text of the writing, but the hypertext of the writing. Throughout the piece there are links off to other articles, books, and blogs. For me, this enriches the piece and it set me off down some rabbit holes of hyperlinks with fascinating follow-ups waiting at the other end.

Back in 2010, Scott Rosenberg wrote a series of three articles over the course of two months called In Defense of Hyperlinks:

  1. Nick Carr, hypertext and delinkification,
  2. Money changes everything, and
  3. In links we trust.

They’re all well worth reading. The whole thing was kicked off with a well-rounded debunking of Nicholas Carr’s claim that hyperlinks harm text. Instead, Rosenberg finds that hyperlinks within a text embiggen the writing …providing they’re done well:

I see links as primarily additive and creative. Even if it took me a little longer to read the text-with-links, even if I had to work a bit harder to get through it, I’d come out the other side with more meat and more juice.

Links, you see, do so much more than just whisk us from one Web page to another. They are not just textual tunnel-hops or narrative chutes-and-ladders. Links, properly used, don’t just pile one “And now this!” upon another. They tell us, “This relates to this, which relates to that.”

The difference between a piece of writing being part of the web and a piece of writing being merely on the web is something I talked about a few years back in a presentation called Paranormal Interactivity at ‘round about the 15 minute mark:

Imagine if you were to take away all the regular text and only left the hyperlinks on Wikipedia, you could still get the gist, right? Every single link there is like a wormhole to another part of this “choose your own adventure” game that we’re playing every day on the web. I love that. I love the way that Wikipedia uses links.

That ability of the humble hyperlink to join concepts together lies at the heart of Tim Berners-Lee’s World Wide Web …and Ted Nelson’s Project Xanadu, and Douglas Engelbart’s Dynamic Knowledge Environments, and Vannevar Bush’s idea of the Memex. All of those previous visions of a hyperlinked world were—in many ways—superior to the web. But the web shipped. It shipped with brittle, one-way linking, but it shipped. And now today anyone can create a connection between two ideas by linking to resources that represent those ideas. All you need is an HTML document that contains some A elements with href attributes, and a URL to act as that document’s address.

Like the one you’re accessing now.

Not only can I link to that article on the Guardian’s website, I can also pair it up with other related links, like Warren Ellis’s talk from dConstruct 2014:

Inventing the next twenty years, strategic foresight, fictional futurism and English rural magic: Warren Ellis attempts to convince you that they are all pretty much the same thing, and why it was very important that some people used to stalk around village hedgerows at night wearing iron goggles.

There is definitely the same feeling of “the eeriness of the English countryside” in Warren’s talk. If you haven’t listened to it yet, set aside some time. It is enticing and disquieting in equal measure …like many of the works linked to from the piece on the Guardian.

There’s another link I’d like to make, and it happens to be to another dConstruct speaker.

From that Guardian piece:

Yet state surveillance is no longer testified to in the landscape by giant edifices. Instead it is mostly carried out by software programs running on computers housed in ordinary-looking government buildings, its sources and effects – like all eerie phenomena – glimpsed but never confronted.

James Bridle has been confronting just that. His recent series The Nor took him on a tour of a parallel, obfuscated English countryside. He returned with three pieces of hypertext:

  1. All Cameras Are Police Cameras,
  2. Living in the Electromagnetic Spectrum, and
  3. Low Latency.

I love being able to do this. I love being able to add strands to this world-wide web of ours. Not only can I say “this idea reminds me of another idea”, but I can point to both ideas. It’s up to you whether you follow those links.

Visionaries

In 2005 I went to South by Southwest for the first time. It was quite an experience. Not only did I get to meet lots of people with whom I had previously only interacted online, but I also got to meet lots and lots of new people. Many of my strongest friendships today started in Austin that year.

Back before it got completely unmanageable, Southby was a great opportunity to mix up planned gatherings with serendipitous encounters. Lunchtime, for example, was often a chaotic event filled with happenstance: you could try to organise a small group to go to a specific place, but it would inevitably spiral into a much larger group going to wherever could seat that many people.

One lunchtime I found myself sitting next to a very nice gentleman and we got on to the subject of network theory. Back then I was very obsessed with small-world networks, the strength of weak ties, and all that stuff. I’m still obsessed with all that stuff today, but I managed to exorcise a lot of my thoughts when I gave my 2008 dConstruct talk, The System Of The World. After giving that magnum opus, I felt like I had got a lot of network-related stuff off my chest (and off my brain).

Anyway, back in 2005 I was still voraciously reading books on the subject and I remember recommending a book to that nice man at that lunchtime gathering. I can’t even remember which book it was now—maybe Nexus by Mark Buchanan or Critical Mass by Philip Ball. In any case, I remember this guy making a note of the book for future reference.

It was only later that I realised that that “guy” was David Isenberg. Yes, that David Isenberg, author of the seminal Rise of the Stupid Network, one of the most important papers ever published about telecommunications networks in the twentieth century (you can watch—and huffduff—a talk he gave called Who will run the Internet? at the Oxford Internet Institute a few years back).

I was reminded of that lunchtime encounter from seven years ago when I was putting together a readlist of visionary articles today. The list contains:

  1. As We May Think by Vannevar Bush
  2. Information Management: A Proposal by Tim Berners-Lee (vague but exciting!)
  3. Rise of the Stupid Network by David Isenberg
  4. There’s Plenty of Room at the Bottom by Richard Feynman
  5. The Coming Technological Singularity: How to Survive in the Post-Human Era by Vernor Vinge

There are others that should be included on that list but these are the ones I could find in plain text or HTML rather than PDF.

Feel free to download the epub file of those five articles together and catch up on some technology history on your Kindle, iPad, iPhone or other device of your choosing.

Of Time and the Network and the Long Bet

When I went to Webstock, I prepared a new presentation called Of Time And The Network:

Our perception and measurement of time has changed as our civilisation has evolved. That change has been driven by networks, from trade routes to the internet.

I was pretty happy with how it turned out. It was a 40 minute talk that was pretty evenly split between the past and the future. The first 20 minutes spanned from 5,000 years ago to the present day. The second 20 minutes looked towards the future, first in years, then decades, and eventually in millennia. I was channeling my inner James Burke for the first half and my inner Jason Scott for the second half, when I went off on a digital preservation rant.

You can watch the video and I had the talk transcribed so you can read the whole thing.

It’s also on Huffduffer, if you’d rather listen to it.

During the talk, I pointed to my prediction on the Long Bets site:

The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.

I made the prediction on February 22nd last year (a terrible day for New Zealand). The prediction will reach fruition on 02022-02-22 …I quite like the alliteration of that date.

Here’s how I justified the prediction:

“Cool URIs don’t change” wrote Tim Berners-Lee in 01999, but link rot is the entropy of the web. The probability of a web document surviving in its original location decreases greatly over time. I suspect that even a relatively short time period (eleven years) is too long for a resource to survive.

Well, during his excellent Webstock talk Matt announced that he would accept the challenge. He writes:

Though much of the web is ephemeral in nature, now that we have surpassed the 20 year mark since the web was created and gone through several booms and busts, technology and strategies have matured to the point where keeping a site going with a stable URI system is within reach of anyone with moderate technological knowledge.

The prediction has now officially been added to the list of bets.

We’re playing for $1000. If I win, that money goes to the Bletchley Park Trust. If Matt wins, it goes to The Internet Archive.

The sysadmin for the Long Bets site is watching this bet with great interest. I am, of course, treating this bet in much the same way that Paul Gilster is treating this optimistic prediction about interstellar travel: I would love to be proved wrong.

The detailed terms of the bet have been set as follows:

On February 22nd, 2022 from 00:01 UTC until 23:59 UTC,
entering the characters http://www.longbets.org/601 into the address bar of a web browser or command line tool (like curl)
OR
using a web browser to follow a hyperlink that points to http://www.longbets.org/601
MUST
return an HTML document that still contains the following text:
“The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.”
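Come the day, checking those terms shouldn’t take more than a few lines. Here’s a sketch (just an illustration, not the official adjudication method):

<?php
// A sketch of checking the bet: fetch the page and look for the prediction text.
$html = file_get_contents('http://www.longbets.org/601');
$text = 'The original URL for this prediction (www.longbets.org/601) will no longer be available in eleven years.';
echo (strpos((string) $html, $text) !== false) ? 'The URL survived.' : 'The URL did not survive.';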

The suspense is killing me!

The Audio of the System of the World

Four months after the curtain went down on dConstruct 2008, the final episode of the podcast of the conference has just been published. It’s the audio recording of my talk The System Of The World.

I’m very happy indeed with how the talk turned out: dense and pretentious …but in a good way, I hope. It’s certainly my favourite from the presentations I have hitherto delivered.

Feel free to:

The whole thing is licensed under a Creative Commons attribution licence. You are free—nay, encouraged—to share, copy, distribute, remix and mash up any of those files as long as you include a little attribution lovin’.

If you’ve got a Huffduffer account, feel free to huffduff it.

Back to school

When I went to the Reboot conference in Copenhagen earlier this year, I met plenty of people who were interesting, cool and just plain nice. In fact, I met half of those lovely people before I even arrived in Denmark—it was at Stansted airport, waiting for a delayed flight, that I first met Riccardo Cambiassi, Lee Bryant and David Smith.

David is a teacher at St Paul’s School in London. Lately he’s been organising an ongoing series of guest speakers to come in and talk to the students. Ted Nelson came in and gave a talk a little while back—yes, that Ted Nelson. As you can imagine then, I was simultaneously honoured and intimidated when David asked me to come along to the school to give a talk on Designing the Social Web.

Yesterday was the big day. I walked across Hammersmith bridge and stepped inside a school for the first time in almost twenty years. Despite my nervousness, I felt the talk went well. I put together some slides but they were mostly just notes for myself. I had a whole grab-bag of things I wanted to discuss and while I might have done it in a very unstructured way, I think I managed to cover most of them.

Obviously this was a very different audience than I’m used to speaking to but I really enjoyed that. It was illuminating to go straight to the source and find out how teenagers are using social networking sites. Once the talk and questions were done, we adjourned to lunch—a good old fashioned school dinner—where the discussion continued. I really enjoyed talking with such sharp, savvy young gentlemen.

It isn’t surprising that they’re all so Web-savvy; the Web has always been there for them. Thinking back on my own life, it almost seems in retrospect as if I was just waiting for the Web to come along. Maybe I was born too soon or maybe I’m just young at heart, but I found that I was able to relate very closely with these people who are half my age.

I took the opportunity to test a theory of Jeff Veen’s on the difference in generational attitudes towards open data. Given the following two statements:

  1. my data is private except what I explicitly choose to make public or
  2. my data is public except what I explicitly choose to keep private,

…the overwhelming consensus amongst the students was with the second viewpoint, which happens to be the viewpoint I share but I suspect many people my age don’t.

There were plenty of other stimulating talking points—the Facebook/Beacon debacle was a big topic. It was a great way to spend an afternoon. My thanks to David for inviting me along to the school and my thanks to the young men of St Paul’s for their graciousness in listening to me natter on about small world networks, the strength of weak ties, portable social networks and, inevitably, microformats.

Seeing as I was in London anyway, I took the tube across town to see my collaborators at New Bamboo. That meant that by the time I was leaving London, it was rush hour. Oh joy. Despite the knackering experience of the commute, I managed to stay on my feet long enough to enjoy a great gig in Brighton that evening. It was a long but very fulfilling day.

Open?

The nerdier nether-regions of blogland have been burning through the night with the news of the OpenSocial initiative spearheaded by Google and supported by what Chris so aptly calls the coalition of the willing.

Like Simon, I’ve been trying to get my head around exactly what OpenSocial is all about ever since reading Brady Forrest’s announcement. Here’s what I think is going on:

Facebook has an API that allows third parties to put applications on Facebook profile pages (substitute the word “widget” for “application” for a more accurate picture). Developers have embraced Facebook applications because, well, Facebook is so damn big. But developing an app/widget for Facebook is time-consuming enough that the prospect of rewriting the same app for a dozen other social networking sites is an unappealing prospect. That’s where OpenSocial comes in. It’s a set of conventions. If you develop to these conventions, your app can live on any of the social networking sites that support OpenSocial: LinkedIn, MySpace, Plaxo and many more.

Some of the best explanations of OpenSocial are somewhat biased, coming as they do from the people who are supporting this initiative, but they are still well worth reading:

There’s no doubt that this set of conventions built upon open standards—HTML and JavaScript—is very good for developers. They no longer have to choose what “platforms” they want to support when they’re building widgets.

That’s all well and good but frankly, I’m not very interested in making widgets, apps or whatever you want to call them. I’m interested in portable social networks.

At first glance, it looks like OpenSocial might provide a way of exporting social network relationships. From the documentation:

The People and Friends data API allows client applications to view and update People Profiles and Friend relationships using AtomPub GData APIs with a Google data schema. Your client application can request a list of a user’s Friends and query the content in an existing Profile.

But it looks like these API calls are intended for applications sitting on the host platform rather than separate sites hoping to extract contact information. As David Emery points out, this is a missed opportunity:

The problem is, however, that OpenSocial is coming at completely the wrong end of the closed-social-network problem. By far and away the biggest problem in social networking is fatigue, that to join yet another site you have to sign-up again, fill in all your likes and dislikes again and—most importantly—find all your friends again. OpenSocial doesn’t solve this, but if it had it could be truly revolutionary; if Google had gone after opening up the social graph (a term I’m not a fan of, but it seems to have stuck) then Facebook would have become much more of an irrelevance—people could go to whatever site they wanted to use, and still preserve all the interactions with their friends (the bit that really matters).

While OpenSocial is, like OAuth, a technology for developers rather than end users, it does foster a healthy atmosphere of openness that encourages social network portability. Tantek has put together a handy little table to explain how all these technologies fit together:

portability          technology           primary beneficiary
social application   OAuth, OpenSocial    developers
social profile       hCard                users
friends list         XFN                  users
login                OpenID               users

I was initially excited that OpenSocial might be a magic bullet for portable social networks but after some research, it doesn’t look like that’s the case—it’s all about portable social widgets.

But like I said, I’m not entirely sure that I’ve really got a handle on OpenSocial so I’ll be digging deeper. I was hoping to see Patrick Chanezon talk about it at the Web 2.0 Expo in Berlin next week but, wouldn’t you know it, I’m scheduled to give a talk at exactly the same time. I hope there’ll be plenty of livebloggers taking copious notes.

Portability

In case you hadn’t noticed, I’ve got a real thing about portable social networks. And I’m not the only one. At a recent meetup in San Francisco a bunch of the Web’s finest minds got together to tackle this issue. You can track the progress (and contribute) on the microformats wiki.

Ever since then, Brian Oberkirch has been doing a sterling job documenting the issues involved:

Head on over there, read what Brian has to say and join in the conversation in the comments.

Lest you think that this is some niche itch that needs to be scratched, I can tell you from personal experience that everybody I’ve spoken to thinks that is a real issue that needs tackling. Heck, even Wired News is getting upset in the article Slap in the Facebook: It’s Time for Social Networks to Open Up:

We would like to place an open call to the web-programming community to solve this problem. We need a new framework based on open standards. Think of it as a structure that links individual sites and makes explicit social relationships, a way of defining micro social networks within the larger network of the web.

Weirdly, the same article then dismisses XFN, saying “Trouble is, the data format doesn’t yet offer any tools for managing friends.” That’s kind of like dismissing HTML because it doesn’t offer you a way of managing your bookmarks. XFN is a format—a really simple format. Building a tool to manage relationships would be relatively easy. But you have to have the format before you can have the tool.

Life streams and Jaiku

After I wrote about mashing up RSS feeds to create a sort of life stream, some people have taken this idea and run with it. Probably my favourite implementation is Deliciously Meta from Steve Ivy, which looks very classy. For WordPress users, Chris J. Davis has created a plug-in. Check out his own life stream to see it in action.

Just the other day, I came across a site which allows you to create a life stream by entering a series of URLs. The site is Jaiku.

Jaiku is a Finnish competitor to Twitter—with the added benefit of a life stream thrown in. You send the site little updates of what you’re doing (via the Web or mobile) and you track what your friends are up to.

In many ways, Jaiku is superior to Twitter. It certainly looks a lot better. It feels snappier. The markup is clean. There’s also a dedicated mobile client for Nokia smart phones. All in all, it’s a slick, fun site.

And yet… simply by virtue of the fact that I discovered it after Twitter, I’m unlikely to use Jaiku as much. It all comes back to the issue of creating yet another network of friends on yet another social networking site: I don’t feel very motivated to do it and I suspect that none of my contacts on Twitter relish the prospect either.

Khoi posited the idea that the exclusivity of social networks may be a feature, not a bug. That may be true to a certain extent. On Last.fm, my criteria for adding a contact are not just my relationship with that person, but also whether or not they have crappy taste in music. On Twitter, I only add people I’ve met in real life. Perhaps I’ll end up using Jaiku for a limited subset of people I know: maybe I’ll use it just for tracking my Central European Tribe comrades.

But what I really want is to be able to take all my friends from Twitter and quickly and easily port them over to Jaiku. Alas, in the absence of hCard and XFN on Twitter, this seems unlikely. A movement in the other direction seems more likely given that Jaiku is using hCard.

Meanwhile, I could kill two birds with one stone and add my RSS feed from Twitter to my life stream on Jaiku. That way, every time I post to Twitter, it would show up on Jaiku. I wonder if that would constitute “gaming” the system?

If I wanted to game the system in a harmless but fun way, I could have some fun with the query string on Jaiku and post the results to Flickr.

More thoughts on portable social networks

I’m not the only one thinking about portable social networks:

There are some good comments on these posts, though I keep noticing the trend for things to get too complex too quickly. Tom Carden mentions FOAF but I have a number of issues with that:

  1. Publishing XML is hard, certainly harder than publishing HTML.
  2. Out of sight is out of mind. I’ve actually got a FOAF file here at adactio but I haven’t updated it in years. Invisible metadata rots.

A lot of people are talking about the need for some kind of centralised service (à la Gravatar) for storing a social network. But surely the last thing we need is yet another walled garden or roach motel?

I’d much prefer a distributed solution and, frankly, I wish Gravatar had gone down that route given its often sluggish ways. I realise that a centralised service is needed for people who don’t have their own URL but it should, in my opinion, be second choice rather than the default.

In any case, I think we may be barking up the wrong tree with all this talk of needing something new. Personally, I don’t think the solution need be complicated at all. It’s within reach right now and it doesn’t require the creation of any new service.

Suppose, just suppose, that…

… were marked up with XFN (update: or more importantly, hCard—see below). Now all I have to do is provide one of those URLs to the next social networking site I join.

Far-fetched? Two of the sites I listed are already walking the walk. All that’s needed is for the sign-up form on the next fadsite I join to at least include the option of importing a buddy list by pointing to a URL.

Sure, it won’t work perfectly. People might have different names from site to site. But that’s okay. It’ll work well enough. It will probably get 80% of my contacts imported. And that’s a lot better than the current count of zero.

We don’t need yet another centralised service. The information is already out there, it just needs to be explicitly marked up.

Once you populate a network on one site, that information should be easily portable to another site. That’s doable. It isn’t even that hard: all it requires is the addition of a few rel attributes and possibly some hCard encoding.

Let’s not go chasing a complicated solution when a simpler one will do.

So here’s my plea—nay, my demand—to the next Web X.X social networking doohickey that wants me to join up:

  1. Give me a simple input field for entering a URL that lists my contacts.
  2. Parse that URL for people and relationships (there’s a rough sketch of this step below, after the list).
  3. Voila! I’ve added a bunch of friends. I may repeat from step one with a different URL.
  4. Mark up my contacts on your doohickey in an easily exportable way.
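Step two is the only part that involves any real programming, and even that is fairly mundane. Here’s a rough sketch in Python of how a site might parse a submitted URL, assuming the page marks people up with hCard (class="vcard" containers holding class="fn" or class="nickname") and, optionally, XFN values in rel attributes. It’s an illustration of the idea, not anybody’s actual implementation:

    # A sketch of step two: fetch the page at the given URL and pull out
    # anyone marked up with hCard, plus any XFN values on their links.
    from urllib.request import urlopen
    from bs4 import BeautifulSoup   # third-party: pip install beautifulsoup4

    def import_contacts(url):
        soup = BeautifulSoup(urlopen(url).read(), "html.parser")
        contacts = []
        for card in soup.select(".vcard"):           # each hCard is one person
            name = card.select_one(".fn")            # real name, if marked up
            nick = card.select_one(".nickname")      # username, if marked up
            link = card.find("a", href=True)
            contacts.append({
                "name": name.get_text(strip=True) if name else None,
                "nickname": nick.get_text(strip=True) if nick else None,
                "url": link["href"] if link else None,
                # XFN values such as "contact", "friend" or "met" live in rel
                "xfn": link.get("rel", []) if link else [],
            })
        return contacts

    # e.g. import_contacts("https://www.flickr.com/people/adactio/contacts/")

No new format, no new service: it’s just reading class and rel attributes out of a page that’s already public.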

Who wants to get the ball rolling? Why can’t this become as ubiquitous as gradients, closed betas, giant text and wet-floor reflections?

For all the talk of social media and the strength of weak ties, there isn’t much action being taken to really try to “harness collective intelligence®”. Within the confines of their own walls, these Web X.X sites might be all about social this and social that, but I want to see more sites practice what they preach on a wider scale… the scale of the World Wide (semantic) Web.

Update

Following on from some comments and Twitter chat, I wanted to clarify a few points:

  1. Yes, social networks differ depending on context. That’s why I want the ability to point at more than one URL. If I join up to a new music site, I might want to point to my Last.fm contacts, but not my Flickr contacts. If I join a new site about food or drink, I’d probably want to point to my Cork’d drinking buddies, but not my LinkedIn network. Or I might want to point to any combination thereof: Flickr + Twitter - Last.fm, for example.

  2. The issue of whether the people you’re adding even want to be your friend is a red herring. That’s an issue regardless of portability. I can quite easily add people as my friends on Flickr who don’t want to reciprocate. The same goes for Twitter. Portability will allow me to add friends en masse but it won’t ever automatically add me as a friend to the people I’m importing: that’s still up to them.

  3. No, this won’t move 100% of contacts from network to network. But it will move a lot. My user name is adactio on Flickr, Last.fm, Twitter, Upcoming, Technorati and Cork’d. I suspect a lot of people use the same user name across sites. For sites that use real names, there’s an even greater chance of portability.

  4. None of this is irreversible; it’s just a shortcut. If I get false positives—people imported that I don’t want as contacts—I can just remove that relationship. Likewise, if I fail to import some people automatically, I’ve still got the old-fashioned way of adding them by hand (which we all have to do now anyway).

  5. Forget about XFN for a minute. The important thing is that I’m pointing to a page and saying, “any people listed on this page are contacts I want to import.” Now, there is no <person> element in HTML, so how does a parser know which strings are people? Well, we need some way of saying “this is a real name” or “this is a nickname”. We have that already: class="fn" and class="nickname". These are properties of hCard. So I guess it’s hCard usage that really matters. That said, XFN can add an extra level of granularity: contact vs. friend, at least. But I stand corrected: the really important thing to mark up here is “who are the people on this page?” rather than “what are the relationships on this page?” The URL itself conveys the information that everyone listed is a contact.
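For illustration, with entirely made-up markup:

    <span class="vcard"><span class="fn">Jane Doe</span></span>
    <span class="vcard"><a class="url nickname" href="http://twitter.com/example">example</a></span>

The first says that the string “Jane Doe” is somebody’s real name; the second says that “example” is a nickname, with the link doubling as the hCard’s url property.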

Just take a look at these URLs:

  • http://corkd.com/people/adactio/buddies/
  • https://www.flickr.com/people/adactio/contacts/
  • http://last.fm/user/adactio/friends/

A semantic consensus is already emerging across sites in URL structure:

http://sitename/[people|user]/username/[buddies|contacts|friends]/

All that’s needed is to explicitly mark up any people on those pages. That’s easily done with hCard. All these sites have to do is edit a template. For extra relationship richness, XFN can help.
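To be clear about how small that template edit could be, here’s a sketch of what a single entry on one of those friends pages might look like once it’s marked up (example.com and the surrounding structure are invented for illustration):

    <li class="vcard">
      <a class="url fn" rel="contact" href="http://example.com/people/jane/">Jane Doe</a>
    </li>

The class values are the hCard part, enough to say “this is a person and here’s their page”; the rel value is the optional XFN part.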