The iWatch Cometh…



The most highly anticipated Apple launch event since the arrival of the iPad in 2010 is rumoured to feature the long-awaited unveiling of the iWatch – Apple’s bid to truly launch the era of wearable computing. I’m fortunate enough to be attending the event and will post thoughts on what is unveiled, but in the meantime, here is a column I wrote on smartwatches early last year. It will be interesting to see how prescient, or otherwise, I was.

iWatch this space: will Apple reinvent the timepiece?

From The Irish Times, March 25th, 2013

Last week I was travelling in a different time zone, so to keep my bearings I tried to reset the time on my prized but cheap Casio watch – this should not prove an onerous task, I thought. Wrong. I was quickly plunged into an excruciating rigmarole of continuous button pressing and mode changing and inadvertent mistake-making. This should be a lot easier, I thought to myself as I wasted yet more time in search of the right time. And judging from recent rumblings in the technology world, I’m not the only person thinking that.

If the rumour mill is to be believed, the humble wristwatch is next in line for some Apple-flavoured disruption. Recent stories in the mainstream US press, presumed to be strategic leaks out of Cupertino, suggest that Apple has a team of 100 engineers working on the inevitably nicknamed iWatch. The gadget sites are in overdrive, already predicting ways in which Apple can bring that trademark design flair to our lower arms. Analysts are all over the business networks, spouting extraordinary numbers suggesting that Apple could add another $6 billion business to their iPhone, iPad and Mac lines with the introduction of an iWatch. Not to be outdone, Samsung has already announced it has its own smartwatch in the works.

You don’t even need to mention Google Glass to realise the era of wearable computing is imminent, and the wrist is going to be prime real estate. Or rather, wearable computing is imminent once again, because experiments in feature-addled watches have been going on for longer than Dick Tracy has been busting criminals with the aid of a wrist-borne radio.

The effort to add computing utility to our wristwatches is a more recent trend, but the results have tended to be inglorious failures – think of those ickle Casio calculators, or the ones with in-built IR remote controls.

Indeed, Samsung has long been one of the pioneers of the smartwatch space – they were the first company to ship a watchphone back in 1999, the SPH-WP10, a rather ill-conceived and bulky piece of kit that unsurprisingly failed to become a fashion statement or replace the Nokia handsets everyone was using back then.

More recently, in 2009, Samsung announced the considerably better-looking S9110, a touchscreen watchphone that was hobbled by mediocre specs and poor software. Around the same time, LG announced the GD910 watchphone, which went on sale in the summer of 2009 for an eye-watering sticker price of about €1,000, while Hyundai also introduced a watch phone that year. None took off, unsurprisingly.

And while we’re recalling dimly remembered wrist gadgets, it’s worth pointing out that Bill Gates unveiled a networked Microsoft smartwatch in 2002 – the Smart Personal Object Technology (SPOT) watch was a sales disaster that almost immediately went into the ledger as one of those foolish Redmond endeavours alongside their early tablets and Windows Vista.

More recent efforts, such as the crowdfunded Pebble watch or the Nike FuelBand, have scaled back their ambitions to act primarily as intelligent accessories for our smartphones, offering notifications of incoming messages or monitoring our exercise levels.

Given this long history of products in the smartwatch space, it’s worth asking why everybody is waiting for Apple to effectively “invent” the smartwatch all over again. Is it yet more evidence that the technology industry is overly dependent on Apple to define and crystallise the design of gadgets, and our relationship with them? And above all, can any such device prove to be as revolutionary as the iPhone and iPad?

I suspect not – I’m of the school that thinks any iWatch will not be much more than a new, diminutive iPod model with a range of sensors, Bluetooth 4.0 to communicate with your iPhone and the possibility for third-party developers to create apps to take advantage of the form factor. That could be cool, certainly, but not groundbreaking.

The hype, I reckon, is fuelled by the excitement surrounding wearable computing and, to a lesser degree, the quantified self, our lives recorded and measured by an array of devices. Google Glass is the pre-eminent example, though I suspect also the most overblown.

But it seems obvious to me that we are already in that era, for all intents and purposes – in a practical sense, those smartphones in our pockets are being “worn” just as much as bracelets or badges or necklaces or glasses are worn. The coming array of smartwatches might add marginal convenience for users, but it will be a while before they can completely usurp the smartphone for a whole host of reasons.

Imagining how things might unfold, I’d wager that wearable computing will take the form of a constellation of devices, from sensor-filled watches to camera-equipped badges to, possibly, networked spectacles, that will all interoperate, communicating with one another over Bluetooth, sending data to the cloud, inspiring a host of innovative apps we can barely begin to imagine.

It will be cool, ultimately it will probably be revolutionary, but let’s not ignore the obvious perils of such complexity – there’s no reason to think these devices will eradicate the frustrations I experienced trying to change the time on my Casio.

Searching for Satoshi


So it seems as if the famously pseudonymous Satoshi Nakamoto has finally been discovered – and his name really is Satoshi Nakamoto. At least that’s what Newsweek is claiming. But Newsweek isn’t the first publication to go searching for Satoshi – back in 2011, the New Yorker published a piece claiming that Satoshi was actually Irish computer science PhD candidate Michael Clear. Here’s my interview with Clear, published a few days after the New Yorker story broke.

From The Irish Times, October 8th, 2011

When we hear politicians and business leaders talking about the need to create a knowledge economy, they probably don’t mean our brightest and best should go out and literally create a new economy using their knowledge.

But that is exactly what the New Yorker magazine accused one of the country’s brightest and best students of doing this week in a lengthy article on the virtual currency Bitcoin. 

High-profile US technology writer Joshua Davis went sleuthing for “Satoshi Nakamoto”, the pseudonymous figure who revolutionised virtual currency and electronic finance when he announced Bitcoin in 2008. Davis combed through the world’s best programmers and cryptographers and eliminated them until he had what he thought was a plausible suspect, unveiling his likely Satoshi in the New Yorker as . . . 23-year-old Trinity College computer science postgrad Michael Clear.

Clear, understandably, was mightily surprised – when Davis contacted him at a cryptography conference in California in August, Clear thought he had made it plain that he wasn’t Satoshi. “I thought I might feature in one paragraph as a possible candidate who was quickly eliminated,” he explains over a coffee, equal parts bemused and amused by the whole thing – he had no idea that Davis would paint him as a prime suspect. He has spent the week since the article was published denying he is Satoshi, writing on his website that “Although I am flattered that Josh had reason to think I could be Satoshi, I am certainly the wrong person… It seems that even limited searches yield candidates who fit the profile far better than I think I do.”

Clear pointed Davis towards a Finnish virtual currency researcher as a more likely Satoshi, more as a light-hearted illustration that even a cursory search could yield a huge number of credible Satoshi candidates who might be considered a closer fit.

“I think he didn’t get my sense of humour there, maybe it was a little dry and Irish for him,” Clear says. Above all, though, Clear points out that “I’m not even very interested in economics.”

And economics is clearly central to Bitcoin – it is controlled by peer-to-peer software, with a finite number of bitcoins to be created over the next few years, earned or “mined” by people using the software to crunch numbers and help with the decentralised accountability such a system requires, so they can’t be duplicated or stolen. Bitcoins can be bought and sold in online exchanges, or even be used to buy things – some real-world cafes and motels accept the things. By all accounts, it’s a masterpiece of programming. As Nobel prize-winning economist Paul Krugman described it: “Bitcoin has created its own private gold standard world, in which the money supply is fixed rather than subject to increase via the printing press.”
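The “number crunching” that mining involves can be sketched as a toy proof-of-work loop – a deliberate simplification of Bitcoin’s real mechanism, which double-hashes binary block headers with SHA-256 against a dynamically adjusted difficulty target; the data string and difficulty here are illustrative only:

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Search for a nonce whose SHA-256 digest starts with `difficulty` zero hex digits.

    This is the essence of proof-of-work: finding a valid nonce takes
    brute-force trial and error, but anyone can verify it with one hash.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra zero digit makes the search roughly 16 times harder on average,
# which is how a network can tune how much work a "mined" coin represents.
nonce = mine("example block", 4)
print(f"found nonce {nonce}")
```

The asymmetry on display – expensive to find, cheap to check – is what gives the decentralised ledger its accountability without any central authority.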

As interest in Bitcoin increased and its value rose – at its peak it was worth nearly $33 before tumbling back down to about $5 – the mystique about its creator became a core part of its appeal. But there was no evidence that Satoshi ever existed – he was the Keyser Söze of digital currency creation. So when Davis revealed that he finally had a suspect, it created a huge flurry of interest in the online tech and economics press, with many of the subsequent articles and blog posts removing the ambiguity of Davis’s piece altogether, despite all of Clear’s denials.

In true New Yorker style, Davis’s article is a very compelling narrative, reminiscent of those pieces where some Catcher in the Rye fan would go hunting for JD Salinger – Clear himself says he enjoyed it, despite what he felt were some misrepresentations. But plenty of Bitcoin observers have also cast doubt on Davis’s detective work, questioning the rigour of his search among other things – he relied on Satoshi’s excellent use of the Queen’s English to limit his search to British and Irish researchers, and settled on Clear because Davis expected to find Satoshi at that Californian conference.

There is no doubting Clear’s capabilities, that’s for sure – he was elected a Trinity scholar in 2008 (not top undergraduate, as the New Yorker said); graduated with a gold medal in 2010; and won the computer science category of the Undergraduate Awards, for which he’ll be getting another gold medal from President McAleese at the end of the month. “I like talking about projects that I was actually involved in, or things that I have designed and developed,” he says, happily discussing his security software start-up and his PhD research on something called “fully homomorphic encryption”, but he has no interest in taking credit for other people’s work.

Despite the hassle of having to fend off allegations of being the man who invented Bitcoin, Clear does now have a reputation as a world-class programmer and cryptographer, which might be a consolation when the dust settles. And it also says a lot about the high standards being set by Trinity’s computer science department that one of their own could be so easily mistaken as the mastermind behind a vastly ambitious virtual currency. Maybe that long-promised knowledge economy isn’t so far-fetched after all.

Growing old in the age of social apps



In the wake of Facebook’s gargantuan acquisition of messaging app WhatsApp, time to revisit a column I wrote for The Irish Times at the end of last year about the inevitable plurality of social apps.

From The Irish Times, December 2nd, 2013

At a certain point, everybody who writes about technology has to attempt to capture a developing trend in a pithy phrase and hope that it enters into such widespread use that it becomes a universally recognised principle.

So here’s my effort at coining Davin’s Law: “Technological old age is the point at which you have more social apps on your smartphone than you have friends whom you actually want to communicate with on a regular basis.”

Admittedly, this is partly a tongue-in-cheek observation of how even the most ardent of social butterflies inevitably finds their social circle dwindling with every passing year, but more than that, it is supposed to point to the seemingly endless profusion of social and messaging apps that keep pouring on to the market.

You wouldn’t have to be a particularly voracious early adopter to be faced with a choice of Gmail, iMessage, Facebook, Twitter, Skype, Line, Viber, BBM, MessageMe, WhatsApp, Snapchat, Branch, LinkedIn or Instagram the next time you wanted to shoot a message off to someone. Some people apparently even make use of Google+, for God’s sake.

The bewildering unbundling of the digital communications market is certainly worthy of a punchline or two, but it’s also worth pointing out because it paints Facebook’s huge $3 billion bid for private messaging and photo-sharing app Snapchat in a rather unflattering light.

There was no shortage of sceptics who looked at Snapchat’s current revenues – zero dollars, there or thereabouts – and thought Mark Zuckerberg had taken leave of his senses.

But I can’t help but feel that Zuckerberg’s huge bid for Snapchat isn’t eye-poppingly daft just because it’s inflated, though it certainly seems to be that. No, I reckon the real reason the bid is daft is because Zuckerberg basically offered to pay $3 billion for something he already had and intentionally destroyed – the trust of his users.

Before its IPO last year, I wrote a column about the failure of Facebook – a failure not in a business sense, obviously, but a more elemental sort of failure. Initially, if we recall the days of enthusiastic early adoption circa 2007, Facebook promised to build a platform to help people to keep in touch with their friends in a remarkably low-friction way.

But the relentless series of Facebook privacy fiascos marked a betrayal of that promise. “Rather than being about sharing content with a chosen circle of friends, it was attempting to coerce its users into essentially curating public blogs,” I wrote.

Our “social graph” had been a way of maintaining contact with friends we would otherwise lose touch with, and therefore a source of joy and indeed comfort. But all those privacy changes undermined that, and in the process turned the site into something vaguely threatening, a potential trap that needed to be constantly negotiated, forever just beyond our control.

Snapchat, which is evanescent by design, at least appears to offer a corrective to that anxiety, returning a sense of control, security and privacy to its users.

In that sense it is very intentionally an anti-Facebook, and the decision of founders Bobby Murphy and Evan Spiegel to reject Zuckerberg’s offer is eminently sensible when seen through that prism.

Ultimately, it’s important to realise that privacy isn’t merely about who has access to our data. It is in fact much more fundamental, and elusive, than that – the level of perceived privacy and control determines the level of our trust in the medium, and that in turn determines the way we communicate with one another on the medium.

It is this way with all our speech and communication – we behave one way in public and another in private; write one way in a work-related email and another way texting a friend; speak one way with our boss and another way with our family. The tonal variations are often subtle and subconscious, on occasions more obvious and determined, but controlling such variations is a fundamental part of what it is to be human.

And this is one reason why we have a never-ending variety of “social” apps – they fulfil different uses, facilitating variations in tone in different social contexts, albeit digital social contexts. Thus, our tone in workplace emails is decidedly different from our tone on Tumblr; the character of our text messages might be different to the character of our tweets; the sort of photos we post to Tumblr might be different than those we post to Instagram, and they are most definitely different to the photos we swap on Snapchat.

I’m sure Zuckerberg is aware of this – he’d need to be a veritable automaton not to be. But both his statements regarding the dwindling importance of “privacy” and his actions, such as building an ill-fated Facebook email service and buying every new social app that grows fast, suggest that he really thinks Facebook can achieve a sort of hegemony in digital social communications. Zuckerberg’s vision is a world where the only “social” app on our smartphones is the Facebook app.

But that vision disregards our natural inclination to converse in infinitely varying ways using infinitely varying tools. And while that law I just coined is more facetious than fact, it is probably closer to reality than Zuckerberg’s dream of a Facebook-centric digital social universe ever will be.

Exploring Obama’s Victory Lab

Barack Obama’s re-election has prompted a lot of shock among American conservatives, convinced they were about to get their man back in the White House. In some of the rancorous fallout, Mitt Romney’s ground game software, Project Orca, has come in for some serious criticism – this Ars Technica piece details how poorly it performed….

Who invented the iPhone?

So Apple won a sizeable victory against Samsung in the big patent trial of the year/decade/century. Fair play for originality, I think, but still no vindication for the barmy patent system. But as I put it in this recent Irish Times column, “if Samsung could have invented the iPhone, Samsung would have invented the iPhone”….

Why Microsoft is losing the battle and the war

Microsoft has just announced the first quarterly loss in its corporate history, losing $492 million, albeit with a $6.2 billion write-down after its ill-fated (read: stupid) purchase of online advertising business aQuantive. That write-down is enough to get an asterisk beside this loss, although Microsoft’s ludicrously misjudged Online Services strategy meant something like this was only a matter…

RIM and Nokia’s grand unravelling

In this month’s Innovation Talk, I look at the startling decline of Nokia and RIM. Once the twin pillars of the mobile industry, they’re flailing now, with no sign of hope. In this week’s Monday Note, former Apple executive and all-round sharp guy Jean-Louis Gassée was even more blunt – “Nokia, once the emperor of mobile…

Facebook floats away…

The reaction to Facebook’s IPO is pretty extraordinary, with coverage akin to a major sporting event. This is what we’ll have to live with, now, this FB ticker watching, an unholy extra element to consider every time there’s another cack-handed redesign or privacy-smashing misstep. I think we’d all do well to ignore the minutiae of…

Can magazines go iPad?

This is an illuminating, and rather dispiriting, account of creating iPad versions of print magazines from Jason Pontin, the editor in chief and publisher of Technology Review, who you’d think would be pioneers of iPad publishing. The reality is a lot more problematic: “We fought amongst ourselves, and people left the company. There was untold expense…”

Minority Report Syndrome: Picturing the future and making it happen

My Innovation Talk column from yesterday, about Google’s smart glasses and the mystery of concept videos. “ROSE-TINTED FANTASIES NOT AT ALL BETTER THAN THE REAL THING” From The Irish Times, May 7th, 2012 Last month, Google gave the world their vision of the future, almost literally. Project Glass was unveiled in a stylish concept…