Whether it’s new friends and acquaintances or school mates we have not seen in the past ten years, the phrase ‘add me on Facebook’ is likely to come up at some point during that meeting. The intention behind such an invitation really translates to just one thing: to keep in touch. But now, available in so many different languages, Facebook invites its users to create and share their thoughts, topics and pages of interest, groups, advertisements, videos, pictures… practically everything. And the whole ‘sharing’ idea is not limited to friends, but extends to friends of friends and strangers who just happen to ‘like’ the same thing. Nor is it an abnormal phenomenon to become ‘friends’ with people you have never met in real life. You don’t even have to ask a friend who may be 12,000 miles away what she or he had for lunch, because everyone on her or his newsfeed has already seen the plate.
Bearing all this in mind, defining Facebook as a social networking site is simply not enough. Social media networking site, maybe. Indeed, Facebook is now itself a source and cause of news that warrants the attention of the public eye (or at least, of those who bother to watch) – since late October 2011, calls to regulate content on the site have been made continuously. From pages on disturbing topics (such as rape and jokes about dead babies) to snapshots of bullying and beheadings, the authorities at Facebook struggle to decide whether such content really crosses the line drawn by their professed belief in freedom of speech and expression.
Though Facebook has apologised and promised to revise the way it handles “controversial, harmful and hateful content” after receiving pressure from campaign groups, advertisers and the traditional media, who can we really say is responsible for the matter? Did Mark Zuckerberg ever envision himself or his creation eventually becoming responsible for filtering all the content that members of the public upload, share, see or hear? Has anyone asked about the actual source of this ‘offensive’ content?
Facebook started off humbly as a website for just a bunch of university friends. It then catered to other university students, as people outside the ring of friends accidentally stumbled upon it (and liked it very much indeed), and now to the rest of the world. Regulating content was never an issue at the beginning – who was the audience, and who else was watching? And with the numbers so small, it was easy to remove bad material without kicking up so much of a fuss with others. But now that everyone is involved, not everyone can, or will, be pleased.
Even though Facebook has acknowledged and caved in to society’s demands, people seem to forget that removing offensive content does not stop the groups responsible from continuing to harbour such thoughts or carry out such actions in real life. Facebook could not possibly carry out censorship alone – after all, it is its audience that supplies the content and acts as the interpreter of what is ‘good’ or ‘bad’. Of course, stopping them on social media sites would slow the spread of such ideas and their communication to young and innocent minds, but it does not stop them from arising within, and being acted on by, members of society altogether.