Web 2.0 is one of those terms that everyone uses but no one likes to define. It’s been around for a long time (in popular use since 2004) and it’s time we understood that it’s not just one kind of use for the web: it is the web.
Web 2.0 is the web made by us, ordinary users, as opposed to being a repository of static information placed online by an organisation or authority. Wikipedia, blogs, eBay, YouTube – if this is the kind of place where you usually go to find information, entertainment and services online then Web 2.0 is your web.
Writing on the news and information website Lifewire, Daniel Nations sums up his broad definition of Web 2.0 as follows:
Most people generally have some idea that Web 2.0 is an interactive and social web facilitating collaboration between people.
This is distinct from the early, original state of the web (Web 1.0) which was a static information dump where people read websites but rarely interacted with them.
It’s hard to think of pure ‘Web 1.0’ sites nowadays. Small businesses with a very small online presence might still have a simple, non-interactive web page giving information and offline contact details, but they won’t be getting much business from that: if you want to attract custom online you need to involve existing and potential customers in the fabric of the purchase. I just bought my daughter a £2.50 t-shirt from the Marks and Spencer website and there are 40 reviews telling me how it fits, how it washes, and how the price compares to other retailers. Even the simplest business websites (like mine!) can incorporate user-generated comment, reviews and information online by including their Facebook or Twitter feeds.
This brings us on to social media in general – the behemoth of Web 2.0. Facebook and Twitter in particular have played, in my mind, the biggest role in revolutionising our relationship with the web and in making user-generated content central to how we operate online. This is because they are based on conversation – words, rather than visual media, were their original backbone, which gives them depth and diversity of use for different audiences and purposes (I’ll come back to YouTube and company later).
Online news sites show how integral this conversation has become. A quick look at the BBC news homepage today reveals a high proportion of front page stories born entirely from online conversations. These range from traditional reporting of old problems with a modern context (“Facebook ‘failed to remove sexualised images of children’“) to very modern stories being reported in the news but which have already unfolded entirely on social media (“Celine’s depression: ‘My selfies tell a story’“, “‘Say My Name’: The Chinese students fighting racism“).
Even more interesting is the story about Emma Watson’s controversial Vanity Fair cover, “Is Emma Watson anti-feminist for exposing her breasts?“. This is an article about a debate which was born on social media – a debate which would never have arisen without social media’s facilitation. The journalist picks up the issue and gives it academic analysis with comment by feminism researchers, but many of the quotes are still drawn from Twitter and the article finishes by asking for comment: “Are you a feminist? Has someone challenged whether you are a feminist because of something you’ve said, done or worn? Tell us about your experiences.” The user creates, then reads, then creates some more.
This kind of use of social media isn’t the problem – however important you think Emma Watson’s cleavage is (or isn’t), it’s an amazing thing that we can be so enmeshed with cultural debate, if we choose to be. The problem is that we mainly use social media for much more trivial purposes – for chats and arguments and logistical arrangements that should be conducted in private, whether online or in our living rooms.
Social media is no longer an add-on to what’s happening online. It is what’s happening. We need to realise that our online conversations are part of the great public conversation, and take our private chatting out of it. Where we engage in debate, we should do so knowing that we are contributing to public information on the topic – whether it be a product review, a comment on a news article, or this blog post!
I am no enemy of social media, but what I hate is the fact that it’s seen as a separate entity, rather than just a part of our online lives. As its integration into retail, news and everything else becomes cemented, my hope is that social media will lose the glamour of novelty, separate itself from our obsession with self-publication and become a channel for real public conversation once more.
This is happening to some extent. Snapchat, the channel of choice amongst young adults, doesn’t store information – it’s just for private chatting. I don’t use it – I’m far too old and uncool – but I like its immediacy and so do teenagers, who generally care less about posterity than they do about this Saturday night. YouTube and Flickr, meanwhile, are the grandparents of Snapchat – the old guard of Web 2.0. No one really uses them for conversation any more and that’s fine by me, because they have instead become directories, encyclopaedias of media.
Can we even use social media as an umbrella term any more? I’m not sure it’s useful when the purposes and potentials of these channels are so different.
Anyway, I hope the rise of Snapchat points the way we’re going, with personal and private conversation taken out of the public arena, leaving user-generated content in its rightful place – informing us and giving us access to free media and rich, diverse content. As Web 2.0 reaches maturity maybe we can all grow up too, and small talk can stay where it should have been all along – in private.
‘Social media’ is (I hope) dead – long live Web 2.0.