The Essentials of Ubiquity

I stood waiting for a bus last night. I knew it was a number 4; I knew it was supposed to arrive at 10:01pm, and that it should take about 14 minutes to get me to Waterloo, where I could catch a train home.

What I didn't know was when, or whether, it would actually turn up. There was no "countdown" display on the bus-shelter, so the information available was entirely based on paper, ink, and "should".

This struck me, in an age where the capabilities of our daily, hand-held technologies are developing at an ever-increasing rate, as frankly somewhat poor, even disappointing. Someone wasn't trying hard enough.

I own an iPhone. It's an incredibly polarising device; excessively loved by some, unnecessarily maligned by others. It does, however, have one point very much in its favour:

What it claims to do, it does. Well, and with excellent stability. In Apple parlance, It Just Works.

There are of course things it won't do. Copy and paste functionality is the classic example of what it lacks, much lambasted for the omission of such a simple, widely-supported feature. It's a bit of a riddle until you then ask yourself "but what is the standard, accepted way of supporting copy and paste on a multi-touch, gesture-driven device?" Of course there isn't one, and Apple's perfectionist attitude was that they'd rather not do it if they couldn't do it well. We'll see how well their implementation works in the mass market in a few months.

iPhones, after all, evolve. The device I bought on the launch day of the 1st gen phone was a very different machine when I sold it to get the 3G (mainly for the memory). The rule on "iPhone 1.0" was "This is a totally closed system; we give you maps, calendars, and the other standard PDA/smartphone basics, and if you want to do something clever, you go to the web". When I sold it of course it supported third-party apps, and had mapping capabilities vastly greater than when I bought it, among numerous bug-fixes & small enhancements. That process means that on those rare occasions when you do find a fault in the system, you don't feel you bought a dud device, just that you may have to wait for it to improve.

For copy and paste, of course, that wait was rather longer than we expected. But I digress.

The point I made above was that the iPhone does what it claims to do. I've been using smartphones and PDAs, generally from the Palm/Handspring stable for years. Their basic features all worked, but they were self-enclosed (trying to get them to talk to a PC or Mac was always hit-or-miss) and frankly clunky. Third party apps were of course supported, but generally pretty darn poor. Different versions of the Palm platform confused apps which frequently fell into disrepair, or just didn't work in the first place. Of course, the iPhone platform changes too, but with apps sold on a subscription model, they tend to evolve and be fixed in just the same way that the phone itself does.

My point, though, is not to write a paean to the iPhone; there's plenty of those around already. The focus is this: there now exists a stable, high-res, high-power, widely adopted portable platform with decent autonomy, and capabilities that until 5 years ago or less were science fiction, or at best divided into dedicated (and generally fairly dumb) gadgets.

The best example (and the best described, in terms of human impact) of these integrated Sci-Fi gadgets, by the way, is the wrist computer from Kim Stanley Robinson's Mars trilogy. The iPhone is, pretty much, that device. To use a phrase that's very popular these days, it is a "game-changer". And with the v3 OS coming out soon, it'll knock the game up to another level.

Fortunately, other companies are also trying to join the new game, but whether they're succeeding, or even outdoing the iPhone, is irrelevant to this post. The point is:

The platform exists, be it in one or many devices. What are we going to do with it?

And why did I start out waffling about buses and trains, and what do they have to do with the iPhone?

The latter question is the easier one to answer. On my iPhone, I can (simplifying only very slightly) click "Trains -> Next train home from where I'm standing" and it'll give me the answer, quickly and accurately. There are a few things it needs to know to do that:

1) Where I am.
2) What the nearest station to that location is.
3) How I define "home".
4) What trains are scheduled to run from 2) to 3), directly or indirectly.
5) Whether they're running to time.

It uses quite a range of technologies to achieve this:

1) it solves by consulting the iPhone's "Core Location" service. This combines GPS location (gathered from a range of satellites thousands of miles away, of course) with ambient electronic clues in the form of WiFi hotspot idents, correlates them via consultation with a remote system, and returns the data to the app. Quite a trick in and of itself, but much more useful when it's an input to a system.
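The fusion step can be pictured with a toy sketch: several positioning sources each return an estimate with an accuracy radius, and the tightest one wins. The `best_fix` helper and the numbers here are purely illustrative, not Apple's actual Core Location logic, which does rather more (including the remote correlation mentioned above).

```python
# Toy model of multi-source positioning: each source gives (lat, lon, accuracy_m).
# Values and the best_fix() helper are invented for illustration.

def best_fix(candidates):
    """Return the estimate with the smallest accuracy radius, i.e. the most precise."""
    return min(candidates, key=lambda fix: fix[2])

fixes = [
    (51.5033, -0.1195, 650.0),  # cell-tower estimate: coarse
    (51.5031, -0.1197, 30.0),   # WiFi hotspot lookup: better
    (51.5032, -0.1196, 8.0),    # GPS with a clear view of the sky: best
]
```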

2) it solves by asking a remote database the question "Where's the nearest station to this location?" This requires network communications, a remote server infrastructure, and a geographically tagged list of every rail station in the UK. That's a fairly large database, although the proximity maths is reasonably basic trigonometry. In other words, it's consulting an expert third party.
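That "reasonably basic trigonometry" is essentially a great-circle distance calculation. A minimal sketch, with a three-entry stand-in for the national station database (coordinates approximate, everything else hypothetical):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # Earth's mean radius is roughly 6371 km

# Tiny stand-in for the geographically tagged station list.
STATIONS = {
    "Waterloo": (51.5031, -0.1132),
    "Charing Cross": (51.5074, -0.1247),
    "Victoria": (51.4952, -0.1441),
}

def nearest_station(lat, lon):
    """Brute-force nearest neighbour; fine for a few thousand stations."""
    return min(STATIONS, key=lambda s: haversine_km(lat, lon, *STATIONS[s]))
```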

3) Well, it asked me. If it was feeling really clever, it might have been able to look in my address book and find the entry marked "me" and work out a station from there, but frankly, people find that spooky. And interacting with the user, particularly when you give them a large amount of "usefulness" in return for minimal data, is still a pretty good idea.

4), like 2), is a remote database lookup, coupled with a routing algorithm that's probably quite complex.
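At its simplest, that routing step is a shortest-path search over a weighted graph of stations; Dijkstra's algorithm is the textbook version, though real journey planners layer timetables and interchange times on top. A sketch, with an invented toy network:

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra's shortest path; graph maps station -> [(neighbour, minutes), ...]."""
    queue = [(0, start, [start])]  # (elapsed minutes, current station, path so far)
    seen = set()
    while queue:
        minutes, station, path = heapq.heappop(queue)
        if station == goal:
            return minutes, path
        if station in seen:
            continue
        seen.add(station)
        for neighbour, leg in graph.get(station, []):
            if neighbour not in seen:
                heapq.heappush(queue, (minutes + leg, neighbour, path + [neighbour]))
    return None  # no route exists

# Invented three-station network: the direct line is slower than changing trains.
network = {
    "Origin": [("Home", 20), ("Junction", 5)],
    "Junction": [("Home", 8)],
}
```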

5) again is a remote lookup, but with the crucial difference that it's live data. It's not something I could find in a paper timetable; rather, it's a representation of the current state of the world, specifically of a really pretty complex railway system, updated every minute or so. That in itself is quite a trick of data transfer and management.
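Combining 4) and 5) is conceptually just "scheduled time plus currently reported delay", with the delay feed refreshed every minute or so. A sketch (the feed format and train IDs here are invented):

```python
def expected_departures(scheduled, live_delays):
    """scheduled: {train_id: departure, minutes past the hour};
    live_delays: {train_id: minutes late, from the live feed}."""
    return {
        train: time + live_delays.get(train, 0)  # trains absent from the feed assumed on time
        for train, time in scheduled.items()
    }
```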

So, by use of massively complex national, international, and even orbital systems, I save a few minutes' time, or gain access to information that would have been unfeasibly complex to get by trying to follow the steps manually. Which it is depends on where I am, and how much I already know about the local area and services. Even if I know nothing, it can find me a route.
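The whole chain, steps 1) to 5), composes into a single query. A hypothetical sketch in which each lookup is passed in as a function (none of these names correspond to any real API):

```python
def next_train_home(home, locate, nearest_station, timetable, live_delay):
    """Compose the five lookups: where am I, nearest station, trains towards home,
    each adjusted by its live delay; return the earliest expected departure."""
    origin = nearest_station(locate())  # steps 1) and 2)
    trains = timetable(origin, home)    # step 4); step 3) is the 'home' argument
    return min((dep + live_delay(t), t) for dep, t in trains)  # step 5)
```

Each argument stands in for one of the on-device or remote services described above; the app's real work is stitching them together.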

So, why the heck can't it find a bus?

Well, obviously buses don't run on rails or have to be scheduled through stations. But most of them do now report their locations to central systems, and there are databases and routing systems that can report on their intended locations and capabilities.

Basically, it can't find a bus because no-one's joined together all the systems that it needs to use to find a bus. The main step missing is a public interface to the "Countdown" data that's displayed on some bus stops, but it could also quite practically report on traffic and weather conditions along the route.

But there are a couple of other issues. There may be resistance to the public knowing exactly where all buses are at all times - people on those buses may feel uneasy, and I'm sure someone in Whitehall or TfL would consider it a terrorism risk. There are social aversions to this sort of sharing of data.

The company that's been most on the receiving end of those aversions recently is Google - both for their Latitude and Street View applications. The tabloid outcry (and you can classify most of the media in that category these days) has been somewhere between hilarious and utterly depressing. Classifying Latitude as "Google's spy in your pocket" has been one of the most impressive displays of hyperbolic, point-missing technophobia of modern times. Yes, it's technologically (if not actually) possible to create a system that will track people without their knowledge, but Latitude is categorically not that system; and if you claim "but Google are just saying that", you might as well believe that the spy's already in your phone, at the OS level. Latitude changes nothing at the technical level.

What it changes is the capacity for sharing that information - in scope, in precision, and in audience.

I know people all over the UK. All over the world, in fact, and they (and I) tend to travel. I can only guess how many times I've walked two streets away from a friend who lives in another country and never met them, simply because I didn't know they were there. This might seem a slightly tenuous example of trying to force serendipity, but on occasion I've managed to do so, and enjoyed the results. Not by automated location sharing, but via Twitter and Facebook status - I, like millions of others worldwide, am manually pushing data out there to increase my friends' "ambient awareness" of me in the hope that it may lead to a meeting, or a laugh, or useful information. In extremis, it can lead to a new job, career, or lifesaving information.

I actually want quite a lot of people to know quite a lot about me.

Of course, there is also some information I don't want widely known, and possibly even some people I'd rather knew very little. Most people would generally prefer, for example, that their employers didn't know every pub and club they'd been into recently. And we've already heard too many cases where uptight employers have seen things on Facebook that they've deemed grounds for dismissal, often such heinous crimes as "I wish they'd give me something more interesting to do", or "pissed again".

Of course in the latter case, the crime's not getting pissed. It's getting caught. After all, chances are the employer's doing exactly the same thing. It's hypocrisy, and it's not a technical problem, it's a social one. And it stinks. If we're going to make even comparatively innocuous data risky or guilty, we're going to have one hell of a problem with real ambient awareness, and geo-aware assistive tools.

Of course, it's not all just curtain-twitching; there are some real reasons why certain people don't want their location to be widely known - one friend of mine has a stalker whose life they'd rather not make easier. But, unfortunately, even without actively broadcasting ambience (that may be an oxymoron) no-one's location, or at least their dwelling, is an impenetrable secret. Ambience just makes it easier.

I don't claim to have a solution to that one, and we'll need one at some point, but, prudish attitudes aside, it's not a problem that needs to apply to most people. Frankly, the entity that most people mistrust with their location and ambience isn't other individuals, or their employers, but the government. And with good reason; their abilities to contain and manage data are on a par with an igloo's ability to contain a blast furnace. Even beyond that, there's too much evidence that they don't always act in our best interests - the number of people arrested or investigated under terrorist legislation for anything from peaceful protest to putting their bins out too soon attests to that.

If we could just fix the government, and people's attitudes, it'd all be so simple.

Well OK, that's obviously far from trivial; but it's worth recognising that:

1) Ambient awareness, and location-aware services, have the potential to be a massive benefit to us.
2) The problems with these systems tend to be more of attitude than of significant social or technical issues, and
3) If we can't solve 2), we're massively limiting the use and usefulness of 1), and
4) At the moment most of our apparent privacy and secrecy (and sometimes security) is a shared myth that it does us very few favours to perpetuate.

It's difficult to discuss these topics and maintain a fully consistent attitude with regards to personal privacy (although Emerson's comment that "A foolish consistency is the hobgoblin of little minds" may be applied; the world itself may not be consistent). We really have to ask what privacy we need, and why, and how we can maintain our freedoms and abilities if we shift that balance around for technical benefit.

That's *not* a topic I'm going to try to cover in depth right now, though.

I mentioned Street View above, too. Many people are, apparently, outraged that they've been caught in the act of walking in a street at an undisclosed date and time (although frankly, given the fuzzing, most people can only identify themselves, and that very rarely). Or they're energetically objecting to the fact that people can see their houses (from here). It's often widely forgotten that such ancient technologies as feet and eyes have provided this capability for more than a few years. And there's probably more sensitive data in the phone book.

Again, though, there are edge cases where it matters more - if you're pissed and throwing up in the street you have little of my sympathy for your self-inflicted plight, but rather more for your colleagues' or employer's subsequent self-righteousness. If you're being treated by Paramedics, then I think it's fair that Google swap out that content when notified. But if you're upset about the deer being knocked over, please go watch Bambi and get a grip.

We might need to tidy the data up. But let's not just trash it on knee-jerk technophobia or future shock.

And you'd better get used to future shock too, because the future's accelerating. No, we don't have flying cars yet (thankfully), but we may still be at the first hints of the Accelerando.

At this point it's incumbent on me to mention Charlie Stross, not merely an excellent and humorous author, but quite possibly the UK's best futurologist. His Cthuloid spy stories and world-walking tales may not prove entirely predictive, but his near-future vision in Halting State is spot on (too much so sometimes, having seen two Halting State incidents in Eve Online recently). Equally, his canonical "Accelerando" is possibly the best tale of human reaction to future technological change, and even his far-future and whimsical Eschaton novels are excellent studies of humanity in technological extremis.

I don't want my flying car. I want my phone to tap my ear and put up a subtle glyph in my glasses if the Northern Line's packed up when I leave the office, or I'm looking at museums to visit online and the Overground's shut for maintenance. I want it to update my list of local eateries when that new Japanese place opens, and make me aware that Porcupine Tree are releasing a new album. I'd like to know that Jacques, who I worked with in Paris, just moved to Kensington. I want it to make my life subtly simpler, and help me connect with friends and old acquaintances.

Give or take the pretty poor state of eyeglass projectors at the moment, it's all entirely possible - and not merely possible in the Tomorrow's World, Martlesham Heath sense that "given enough boffins, we can make a proof of concept", but rather in the sense that 90% of it's already on the shelves and in people's pockets.

The future's very close. At some point, your phone may realise this.
Posted by parsingphase, 2009-03-27 19:53
