Is it time for businesses to hold social media more accountable? | Cavalletti Communications copywriters

Why Businesses Must Challenge Social Media Platforms

Following recent significant events, is it time for businesses to stand up to social media? I think so. We all need to hold them more accountable for their actions and lack of social responsibility.

Written by Daniela Cavalletti

8 min read

Along with most of the rest of the world, I have watched with an unsettling mix of horror and hope as events unfolded in Christchurch on 15 March 2019 and beyond. From the hate and despair of the attacks to the compassion and care of New Zealand’s response.

The abhorrent killings sent shock waves around the globe. To maximise the impact, the perpetrator added a new and frightening level to the terror he spread – by live-streaming his actions for 17 minutes on Facebook.     

Is it Time for Us to ‘Unfriend’ Social Media?

The global reach of that video was swift and staggering.

But the reaction from social media in taking it down was woefully slow. It took a full 29 minutes before Facebook even knew it was hosting such a graphic video, and then roughly another hour to take it down.

That might seem a quick response – but it wasn’t.

Facebook is a social media platform built on highly sophisticated algorithmic filters and AI (artificial intelligence) technology – technology that can detect a copyright breach within minutes, or spot your passing interest in a pair of shoes and show you relevant advertising.

So rightly a universal clarion call went out demanding to know why it had taken so long to detect and remove the horrific video.

During the time it was available, the video was viewed, shared, commented on and – worst of all – liked by a staggering number of social media users. The New York Times reported that the video was uploaded 1.5 million times in the first 24 hours. Of those 1.5 million copies, Facebook’s automatic detection systems blocked 1.2 million, leaving roughly 300,000 copies still in circulation.

Facebook partially blamed its reliance on viewers to flag problematic live streams: the first user report on the video came in 29 minutes after the broadcast started, and 12 minutes after it ended.

Would you blame your clients for not reporting a troll on your business’s social media channel – or would you take responsibility for moderating the content on your own account?

… I thought so.

Who’s to Blame, And Where Does That Leave Us?

In the wake of the attacks, politicians, businesses, and the public have joined together to damn social media’s lack of social responsibility and their inability to curtail ‘weaponised content’.

New Zealand’s Prime Minister, Jacinda Ardern, was one of the first and clearest voices to call on social media platforms to do more to combat terrorism. And I could not agree more with her when she says:

“They are the publisher, not just the postman. There cannot be a case of all profit, no responsibility.”

Supporting her views, Australia’s Prime Minister Scott Morrison and cabinet met with Google, Facebook and Twitter executives last month to urge them to take increased responsibility for content control in their role as ‘good corporate citizens’.

Reportedly during the meeting the social media bosses failed to impress and offered little reassurance about future content moderation changes. This prompted the government to commit to new laws that could see companies and executives hit with criminal penalties if they do not take terrorist material down quickly enough. Prime Minister Morrison has also requested that the issue of social media governance be added to the top of the agenda at June’s G20 summit in Osaka.

‘If they build it, if they make it, they have to […] make it safe’, is how the Prime Minister put it. A rare moment of agreement between Mr Morrison and myself.

But it’s more complicated than that: exactly how a government plans to make social media safe leaves room for ambiguity – and for other kinds of issues as well.

Social Media Watch: The Role of Business

So, in light of everything that has happened – from egg boy to the gentle grace of Jacinda Ardern – how do we as business owners feel about our own role and responsibility when it comes to social media? It is, after all, one of the central communication links to our customers. And most of us also spend good money on advertising through these platforms.

Is it really enough to nod in agreement to a need for more accountability?

Or is it time to reflect and consider our own values and beliefs, and take a stand against those who are happy to take our money – but are not prepared to do more to curtail the spread of extremist ideas?

I think it is.

Big business led the way swiftly in New Zealand, with companies including Lotto, Westpac and local bank TSB pulling their social media advertising just three days after the Christchurch attacks.

Lotto said it had removed social media advertising ‘because the tone did not feel right’, while TSB stated it was disappointed by the role social media played and felt it was ‘inappropriate to continue to support the channel’. Other New Zealand businesses, such as Hungry Jacks, ASB Bank, many telecommunications companies and five major super funds, also halted their social media advertising spend, demanding action from global companies to stop the spread of harmful content.

The Association of New Zealand Advertisers (ANZA) and the Commercial Communications Council issued a statement that questioned the responsibility taken by social media platforms and urged advertisers to ‘recognise they have a choice where their advertising dollars are spent, and carefully consider, with their agency partners, where their ads appear’.

Should We Put Our Trust in Social Media?

For small business owners, the decision to pull social media advertising spend is not an easy one, nor in many cases practical for those that rely on Instagram, Twitter, LinkedIn and Facebook rather than traditional advertising to attract clients.

Security Risks

But it is not just the social responsibility of these platforms that is being questioned. Running a business, we also need to consider some of the other major security and technical issues plaguing Facebook in particular, and how they may affect our business.

Coincidentally, the Christchurch attack occurred on the eve of the first anniversary of Facebook’s Cambridge Analytica scandal, when it emerged that the British consulting company had harvested the data of 50 million Facebook profiles to target voters for the Donald Trump election campaign.

There has also been a more recent incident: earlier this month, more than 540 million records about Facebook users were publicly exposed on Amazon’s cloud computing service. The information included account names, IDs and details about comments and reactions to posts.

Site Outages Cost You Money

And then, just two days before the Christchurch attack, Facebook, Instagram and WhatsApp experienced a nearly day-long outage, causing major disruption for millions of businesses worldwide. While influencer-sponsored blogs, ad campaigns and marketing posts went up as scheduled, very few people actually saw them – resulting in major losses of revenue and customer engagement.

Interestingly, even though it was the worst disruption to the platform since 2008, Facebook offered little explanation other than blaming it on a ‘server configuration change’ – whatever that means. Many suspected it was actually caused by a distributed denial-of-service (DDoS) attack – a malicious attempt to disrupt a service by overwhelming it with a flood of Internet traffic.

Let’s Hold Social Media Accountable

If we combine Facebook’s lack of reassuring information on the outage and data breaches with the less than enthusiastic response to their role in the Christchurch incident, the company is looking pretty iffy in the transparency and responsibility stakes.

Online Platforms Reluctant to Do More

As a response to the terrorist attack, the tech giant is on the record as saying it will look to further bolster its detection technology, react faster to live video complaints, combat hate groups and expand industry collaboration.

Whether this will translate into practical measures – and adequate ones – remains to be seen.

Surprisingly, they continue to defend live-streaming. In a recent interview, Facebook’s chief executive, Mark Zuckerberg, resisted the proposed solution of adding a delay to the live-stream feature, saying it would ‘fundamentally break what live-streaming is for people’.

Disappointing Response

This seems a fairly flippant statement in light of the fact that this was the technology that allowed such a deplorable act to be broadcast freely around the world. It does little to foster trust or confidence in their concern for us or their moral responsibility.

We wanted Facebook, YouTube and other social media to be as outraged as we were and immediately commit to ensuring that such an event could never happen again. Instead, their general response was restrained, cold and corporate.

What Can We Do?

Legal ramifications and the prospect of individual executives facing jail time could be the pressure needed to make social media giants take their undeniably powerful role in our lives much more seriously. But laws need to be carefully crafted. I don’t have all the answers on this; it’s a multi-faceted issue.

But we, as individuals and as advertising businesses, should expect – and must demand – transparency and socially responsible conduct in everything social media platforms do. They must put care of us, their customers and consumers, before making an extra buck.

Unlike the New Zealand banks, small business may not be in a position of power to pull large advertising contracts. But we should only entrust our marketing dollars and customer data to global tech companies that we believe are doing everything possible to protect us – whether it be from data breaches, outages or extremist content.

Because as we know in business: customer trust is everything.

  • Bruce Carr
    Posted at 10:29h, 12 April Reply

    Thought provoking article. It has always struck me that there is a wide disparity in the way different media channels are required to, or choose to comply with the law. If a journalist writes an article and wishes it to be published in a major newspaper for example, that process is subject to legal and editorial review and the publisher as well as the author have to take legal responsibility for what they publish. If a reader wishes to comment on an article newspaper editors submit that to a vetting process to ensure that the comment complies with their policies and presumably does not put them in a vulnerable legal position.

    I think the simplest approach is to treat all publishers equally whether they be newspapers, online publishers or other media distributors. If that means that Facebook, Google or whichever must delay what it publishes so it can exercise its responsibilities as a publisher then so be it!

    Legislation is frequently outdated by technological advances. In this case it is high time that they catch up.

    • Daniela Cavalletti
      Posted at 14:49h, 16 April Reply

      That makes eminent sense. I wonder whether this approach will succeed. Or whether we will get so much push-back on the idea – including from social media users who are used to instant news and gratification – that there will remain a gap in accountability and responsibility between the mediums. The next months and years will be an interesting time regarding how we interact with online platforms, and what we’re willing to accept as a community.

  • Tara Bufton
    Posted at 08:16h, 21 April Reply

    The platforms do need to create a safe space. 🌸

    • Daniela Cavalletti
      Posted at 09:44h, 22 April Reply

      They do indeed, Tara. It’ll be a long road to that goal, but a worthwhile journey.
