
Digital aesthetics: me, myself and myselfie

By Farooq Sulehria
July 18, 2018

Expressing one’s love for one’s partner on a wedding anniversary used to be a very private affair. Not anymore. Now, love is not real unless it is expressed on Facebook or Instagram. Now, before the dreaded angel of death has even returned to the heavens, mourners have already posted pictures of their dear departed on Facebook.

Consider the following facts: Instagram, launched in 2010, had over 400 million subscribers who had uploaded over 40 billion pictures by 2015. By 2011, there were already 500 billion pictures on Facebook and six billion on Flickr, and we were taking over a trillion pictures annually. A sizable majority of users on such sites are below 30 years of age, and they mostly post selfies. The very term ‘selfie’, added by the Oxford English Dictionary in 2013, saw its use increase by 17,000 percent over that year.

Such is the urge to take selfies that we now have terms like the ‘Auschwitz selfie’ and ‘Bridge Girl’ (the latter denoting an incident in which a girl took a selfie ‘with’ somebody committing suicide on New York’s Brooklyn Bridge). Such is the urge that even heads of state such as Obama could not resist it while attending the funeral of Nelson Mandela. At home, Komal Rizvi attracted online censure for her selfie with the late Abdul Sattar Edhi while he was terminally ill. In September 2015, a press report highlighted that more people had died while taking selfies (12) than in shark attacks (8). In 2014-2015, 127 selfie deaths were recorded, mostly in India, which spurred a ban on selfies in June this year, at least in Goa’s coastal areas.

What explains this selfie craze? An apparent answer is narcissism. There is mounting evidence that various social platforms and networking sites are making their users increasingly narcissistic. But how does digital networking drive us to narcissism?

There are two possible explanations. On a socio-psychological level, there is the fear of loneliness. According to MIT professor Sherry Turkle: “Afraid of being alone, we struggle to pay attention to ourselves. And what suffers is our ability to pay attention to each other. If we can’t find our own centre, we lose confidence in what we have to offer others.”

There is much evidence that online ‘community life’ generates offline loneliness. For instance, a 2013 study of 600 Facebook users by the Institute of Information Systems at Berlin’s Humboldt University found that Facebook made more than 30 percent of its users feel lonelier, angrier, or more frustrated. Likewise, a 2014 Pew Research Center report shows that only 19 percent of millennials trust others, compared with 31 percent of Gen Xers and 40 percent of boomers. To quote Andrew Keen, a fierce critic of mainstream accounts glorifying social media: “After all, if we can’t even trust our own existence without Instagramming it, then who can we trust?”

The other possible explanation is peer pressure and the bandwagon effect. When everyone from heads of state to your next-door neighbour is posting selfies, from their presence at funerals to their dash to holiday resorts, narcissistic culture becomes generalised and normalised, not least because the virtual ‘communities’ of Facebook and Instagram now outnumber real-life nations.

Curiously, in terms of consequences, this narcissistic digital culture symbolised by the selfie mania is perfectly in sync with these neoliberal times. On the one hand, social media constitutes the ultimate example of do-it-yourself capitalism; on the other, it breeds fierce individualism (bordering on xenophobia).

The other feature of digital aesthetics that stands out is ephemerality. This ephemerality, in turn, is continuously shortening our attention span. It operates in two ways. First, there is a race, indeed a craze, for online visibility and, hence, a stress on newness. Second, online distractions abound. In fact, distraction is built into the system because distraction benefits our digital godfathers. Take, for instance, the case of Google. According to American technology writer Nicholas Carr, Google’s advertising system “is explicitly designed to figure out which messages are most likely to grab our attention and then to place those messages in our field of view. Every click we make on the Web marks a break in our concentration, a bottom-up disruption of our attention – and it’s in Google’s economic interest to make sure we click as often as possible. The last thing the company wants is to encourage leisurely reading or slow, concentrated thought. Google is, quite literally, in the business of distraction.” Online distraction is moulding our social behaviours.

Turkle aptly describes how this online distraction translates into offline distraction: “We are always elsewhere. At class or at church or business meetings, we pay attention to what interests us and then when it doesn’t we look to our devices to find something that does. There is now a word in the dictionary called ‘phubbing’. It means maintaining eye contact while texting. My students tell me they do it all the time and that it’s not that hard.”

The element of newness, coupled with the internet’s unmatched capacity to distract, has taken its toll on our ability to concentrate. Study after study shows that the internet simply subverts human attention.

Why does this happen? In his book ‘The Shallows’, Carr cites the experience of Christine Rosen, a fellow at the Ethics and Public Policy Center in Washington, DC. Many of us would identify with Rosen’s experience. Writing about reading Charles Dickens’ ‘Nicholas Nickleby’ on a Kindle, Rosen states: “Although mildly disorienting at first, I quickly adjusted to the Kindle’s screen and mastered the scroll and page-turn buttons. Nevertheless, my eyes were restless and jumped around as they do when try[ing] to read for a sustained time on the computer. Distractions abound. I looked up Dickens on Wikipedia, then jumped straight down the Internet rabbit hole following a link about a Dickens short story, ‘Mugby Junction.’ Twenty minutes later I still hadn’t returned to my reading of Nickleby on the Kindle.”

Carr, while justifiably acknowledging the benefits and inevitability of online books, mounts a valid critique of online reading from an aesthetic viewpoint: “the inevitability of turning the pages of books into online images should not prevent us from considering the side effects. To make a book discoverable and searchable is also to dismember it. The cohesion of its text, the linearity of its argument or narrative as it flows through scores of pages, is sacrificed. What that ancient Roman craftsman wove together when he created the first codex is unstitched. The quiet that was ‘part of the meaning’ of the codex is sacrificed as well. Surrounding every page or snippet of text on Google Book Search is a welter of links, tools, tabs, and ads, each eagerly angling for a share of the reader’s fragmented attention.”

Ephemerality, characterised by distraction and a short attention span, is breeding superficiality. Columbia University, together with the French National Institute, conducted a study based on 60,000 articles from global mainstream media that were shared 2.8 million times on social sites. The study found that 59 percent of the articles shared by social media users were not read before sharing. Aleksandr Solzhenitsyn mourned the superficiality of the twentieth-century press. Pity his mighty pen did not survive to demolish digital superficiality.

The writer is a freelance contributor.

Email: mfsulehria@hotmail.com