How smartphones killed off boredom – and ushered in an age of distraction


From The Irish Times, May 11th, 2017

Next month sees the 10th anniversary of the release of the first iPhone and the start of the smartphone revolution that has shaped the intervening decade in many ways, both obvious and subtle.
But the birth of the smartphone also presaged the death of something else, something whose absence we hardly remark on but which is of momentous importance – the smartphone has in many ways eliminated boredom in the developed world.
Those moments of idle reflection and those minutes of frustrating inactivity have been rendered a thing of the past by the devices in our pockets, which provide short bursts of diversion whenever we have a moment to fill.
Very soon after getting my first iPhone, I realised that boredom was over – I had an infinite supply of articles and books to read at any moment. And this was before Twitter and Facebook became fine-honed mobile experiences, cunningly engineered to hog users’ attention.
And sure enough, I haven’t experienced boredom in the classic sense in years – it exists for me now more as a kind of vivid childhood memory rather than as a human emotion I might experience in daily life. And I’m certainly not alone in not missing it remotely.
Like most people, I had a strong aversion to boredom, and felt its imminent demise would be one of the great achievements of the smartphone, both on the individual level and in the aggregate. I found persuasive the prediction of the US public intellectual Clay Shirky in his 2010 book Cognitive Surplus that the spread of the internet would unleash a huge wave of previously untapped human potential – the “cognitive surplus” of the title – leading to a burst of extra productivity and creativity.
Needless to say, it will be a very long time before we comprehensively reckon with how significant the extinction of boredom will be for our species, but it is already clear that we can’t simply dispense with a basic human experience like boredom without some unintended consequences to grapple with.
A decade on, and it’s clear that we are already facing many of those unintended consequences. That is because we have replaced boredom not with constant engagement, but rather with distraction. And it turns out constant distraction, with frequent hits of dopamine at every hint of novelty, isn’t necessarily a healthy state of mind.
Around the same time as Shirky’s book came out, Nicholas Carr was selling an altogether more pessimistic vision of how we were going to be affected by immersing ourselves in technology in his book “The Shallows: What the Internet Is Doing to Our Brains”. In Carr’s view, the internet was doing nothing good for our brains, eroding our ability to focus and think deeply.
Carr seemed to echo the warnings of previous generations about the dangers of television, or even the famous warning by Socrates on the potential damage of writing – it epitomised the predictable moral panic that greets every new technology. But while Carr’s polemic was a little histrionic for my taste, the intervening years have proven him rather prescient, and his thesis has been elaborated on by a multitude of writers, catering to an audience who are clearly feeling the ill effects of constant distraction.
Books such as Deep Work: Rules for Focused Success in a Distracted World by Cal Newport, Mindful Tech: How to Bring Balance to Our Digital Lives by David Levy, and The Distracted Mind: Ancient Brains in a High-Tech World by psychologist Larry Rosen and neuroscientist Adam Gazzaley are just some of the high-profile examples of the genre in recent years, the sense of techno-induced anxiety palpable in each of those subtitles.
Indeed, our compulsive need for smartphone distraction is drawing parallels with another form of mass addiction with harmful consequences. In a recent essay on the corrosive effect of smartphone dependence, the millennial self-help writer Mark Manson made the explicit comparison with smoking: “It’s attention pollution when somebody else’s inability to focus or control themselves then interferes with the attention and focus of those around them…The same way second-hand smoke harms the lungs of people around the smoker, smartphones harm the attention and focus of people around the smartphone user. It hijacks our senses.”
Perhaps these are all just growing pains, and the smartphone era is a mere phase, a temporarily jarring transition until we reach a point when we realise our brains are actually perfectly capable of diving in and out of different streams of information, and that it is society that needs to catch up with our diversion-filled environment.
But I suspect the conclusion will be something quite different. Constant distraction, obviously enough, is bad for us. But less intuitively, perhaps we also need to admit that doses of boredom are good for us.

Democracy in the age of social media

From The Irish Times, May 14th, 2016

For political addicts in the western world, recent elections have offered more unpredictable twists than this year’s Premier League.

We have just had an electoral split between the Civil War parties and a rise in Independents that was perhaps anticipated but certainly unprecedented. In the UK, Jeremy Corbyn led an astonishing coup in the Labour Party, dragging it far to the left. In Spain, new parties sprang up during the economic crisis and have entirely recast the political landscape, leading to months of parliamentary deadlock. In the race for the US presidential nominations, Donald Trump is redefining demagoguery for the world while Bernie Sanders is redefining socialism for an American audience.

There has been no shortage of theories for each of these scenarios, and this sort of political upheaval is probably to be expected in the wake of massive economic turbulence. But are there any longer-term patterns that might be a factor in these previously unlikely outcomes?

Democracy, it is often pointed out, depends on an informed citizenry – the currents of information that shape the direction of politics, elections, policy and ultimately our lives have traditionally flowed through the mainstream media. So we need to start considering the implications of what happens when the free press gets disrupted, which is evidently happening right now.

We have been hearing for many years about this election or that vote being the first “digital” or “social media” election, and the superficial impact has been plain to see – candidates with a huge social media presence, Q&A sessions on internet forums, get-out-the-vote efforts on Facebook, that sort of thing.

But the larger implications are far harder to discern – what happens when the mainstream press loses its ability to determine the contours of mainstream discourse?

In a long series of tweets last month, writer and theorist Clay Shirky discussed this scenario. “Social media is breaking the political ‘Overton Window’ — the ability of elites to determine the outside edges of acceptable conversation,” he wrote. “Politically acceptable discourse is limited by supply, not demand. The public is hungry for more than politicians are willing to discuss. These limits were enforced by party discipline, and mass media whose economics meant political centrism was the best way to make money.”

The Overton Window Shirky refers to is a concept conceived in the mid-1990s by US free-market advocate Joseph Overton, describing the range of acceptable political discourse on any given topic. While predating Overton’s theory, Edward Herman and Noam Chomsky’s 1988 book Manufacturing Consent offers a seminal analysis of the role played by the US press in disseminating establishment “propaganda”, in effect describing how the Overton Window of acceptable thought is maintained and policed. Just look at the coverage of Corbyn in much of the UK press to see the Overton Window in action.

The political order of the past few decades has come to rely on a relatively stable dynamic between politicians, press and the public. But we are now at a point where that dynamic is in flux, due to the changing nature of the press and the way the public engages with media.

The incisive technology analyst Ben Thompson goes so far as to suggest that “politics is just the latest industry to be transformed by the internet”. That might be overemphatic, but I feel he is correct in diagnosing an imminent and fundamental shift.

“In a Facebook world, information suppliers are modularised and commoditised as most people get their news from their feed,” he writes. “The likelihood any particular message will ‘break out’ is based not on who is propagating said message but on how many users are receptive to hearing it. The power has shifted from the supply side to the demand side.”

In this scenario, Thompson is right to identify Facebook as key to the changing landscape. While Twitter is a hugely influential broadcasting platform, Facebook’s reach is far greater. According to an article in Mother Jones magazine in 2014, Facebook conducted an experiment in the three months leading up to the 2012 US presidential election, increasing “the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change – which users were not alerted to – measurably increased civic engagement and voter turnout.”

Concerns over concentrated media ownership will be nothing compared to the anxiety over the power wielded by Zuckerberg and Facebook, whose secretive algorithm will determine what we see, or don’t see, on our newsfeeds.

None of this is to suggest that the rise of Independent TDs, Podemos, Corbyn, Trump or Sanders can be directly attributed to the disruption of traditional media by social media – that would be absurdly reductive and would be to ignore the wider context in which each has succeeded.

But it would also be foolish to think that one form of media is merely replacing another without any political repercussions, without any underlying change in the established dynamic by which voters inform themselves about and then choose their leaders.

The great Canadian intellectual Marshall McLuhan, in coining the phrase “the medium is the message”, got to the core of the situation. “The personal and social consequences of any medium,” he wrote, “result from the new scale that is introduced into our affairs by…any new technology.”

To like or to dislike, that is the question

From The Irish Times, September 19th, 2015


Since its introduction in 2009, Facebook’s “like” button has become a ubiquitous unit of online expression. However, its limitations were always obvious: it’s far too reductive even as a swift mode of interaction.

One example I can recall occurred earlier this year when a good friend wrote a deeply affecting Facebook post about his devastation at the sudden death of a colleague.

He expressly asked people not to like it, but inevitably the post received about a dozen likes, thoughtless little gestures that were presumably well meant but which in some small way exacerbated my friend’s grief. These feelings were too real, the emotions too raw, for a reaction as trite as a like.

Those limitations were acknowledged this week when founder Mark Zuckerberg announced that the company was experimenting with a “Dislike” button and other variations.

“I think people have asked about the dislike button for many years,” Zuckerberg said at a meeting with staff. “What they really want is the ability to express empathy. Not every moment is a good moment . . . your friends want to be able to express that they understand and relate to you, so I do think it’s important to give people more options than just like as a quick way to emote.”

The announcement generated a fair degree of alarm: Facebook has a rather inglorious habit of infuriating users with its tinkering.

One concern is whether the move will add to the online world’s already deep well of negativity; many fear that a dislike button would be a baton in the hands of bullies, a tool of snide derision rather than an expression of empathy.

A larger concern, however, is how the like button and its imminent variations can actually limit our interactions; in a technical sense, our ability to communicate is limited only by our vocabulary, but when we swap language, with its rich variety, for a series of buttons crudely approximating feelings, we constrain that communication in very real ways.

Obviously, the like button is not a replacement for language, but the rise of that blue “thumbs up” logo demonstrated a widespread desire to abbreviate our communications in the digital domain, “a quick way to emote”, as Zuckerberg put it. As such, there’s an undeniable vapidity to the like button, in that it encapsulates a shallowness in our online relationships that is potentially quite corrosive.

Particularly for the millennial generation of 20-somethings that has grown up interacting with their peers on Facebook, the like button might ultimately be seen less as a signature mode of interaction and more as a signifier of superficiality.

Three layers, two sides and one vote

A few weeks ago, I heard the most beautiful, inspiring story. A friend of a friend, a woman in her late 30s, had summoned the courage to be her true self – first she told her close friends, then she told her siblings, and finally she told her parents, that she was attracted to women….

A chat with David Carr

So saddened to hear about the sudden death of David Carr. I was fortunate enough to interview him at the Web Summit in 2013, and he was terrific company, and just as hilariously irascible as you would imagine. His legacy, not just as a media critic but as a chronicler of our times, will be…

The iWatch Cometh…

The most highly anticipated Apple launch event since the arrival of the iPad in 2010 is rumoured to see the long-awaited unveiling of the iWatch – Apple’s bid to truly launch the era of wearable computing. I’m fortunate enough to be attending the event and will post thoughts on what is unveiled, but in…

Distinctive voices push the boundaries of journalism

My opinion column on the changing nature of journalism, from The Irish Times, Monday June 16th, 2014. When the Pulitzer Prizes were announced in April, the prestigious prize for public service was awarded jointly to the Guardian and the Washington Post for their role in breaking the series of stories about vast government surveillance…

Searching for Satoshi

So it seems as if the famously pseudonymous Satoshi Nakamoto has finally been discovered – and his name really is Satoshi Nakamoto. At least that’s what Newsweek is claiming. But Newsweek isn’t the first publication to go searching for Satoshi – back in 2011, the New Yorker published a piece claiming that Satoshi was actually…

Assessing Assange

So Andrew O’Hagan has written a looooong piece in the London Review of Books about his experience ghostwriting the still-born memoir of Julian Assange. What was ultimately published was an extremely odd book, for obvious reasons, so I dug up my review of the book from the time. A lot of the conclusions I…

Growing old in the age of social apps

In the wake of Facebook’s gargantuan acquisition of messaging app WhatsApp, time to revisit a column I wrote for The Irish Times at the end of last year about the inevitable plurality of social apps. From The Irish Times, December 2nd, 2013. At a certain point, everybody who writes about technology has to attempt…