With increasing pressure from investors and regulators, the party may be over for social media giants.
Tech billionaire Elon Musk describes himself as a “free speech absolutist”. In the midst of discussions over his US$44 billion deal to purchase Twitter, he criticised the platform’s policies on hate speech and disinformation, feeding rumours that he plans eventually to reinstate the accounts of incendiary figures like Donald Trump.
Well, the deal has been finalised, and Musk is the so-called ‘Chief Twit’ and ‘Twitter Complaint Hotline Operator’, after laying off hundreds of Twitter’s staff within the first week.
It remains to be seen how far he will go in supporting free speech. United Nations High Commissioner for Human Rights Volker Türk sent an open letter to Musk, calling for him to “ensure human rights are central to the management of Twitter”.
Musk’s uncontested purchase of the platform is another example of one individual exerting a huge amount of influence in the minimally regulated space, according to Charles Radclyffe, Partner at ESG data company EthicsGrade.
“There are many shades of grey in social media, so it’s a big opportunity for Musk to recast Twitter as a responsible (and profitable) platform. I’m not very optimistic that he’s started on the right foot by disbanding what little corporate governance exists,” he tells ESG Investor.
As high growth companies with low carbon footprints, social media firms have been a popular investment choice for investors looking to drive down their portfolio emissions while ensuring strong financial returns, says Michael Connor, Executive Director of Open MIC, an NGO working with shareholders to encourage corporate accountability in the information technology sector.
“However, social media companies have failed in many respects to fulfil their roles as responsible corporate actors, especially with regard to the social concerns,” he says.
They have a fraught history of poor treatment of staff, data privacy breaches, and limited oversight and management of harmful content – even failing to act against a conspiracy to overthrow a government.
These social risks, paired with undemocratic governance and capital structures, have prompted a growing number of investors to try to hold these largely unaccountable platforms to account, not just on social issues but also on unsustainable and overly aggressive tax practices.
Could the party be over? As the regulatory noose gradually begins to tighten, we are possibly reaching the end of the “golden era of high growth with no regulation and no challenge”, according to Lourdes Montenegro, Research and Digitisation Director at the World Benchmarking Alliance (WBA).
Taking a stand
Investors are filing and supporting shareholder resolutions at annual general meetings (AGMs) demanding increased transparency and better management of social-related risks.
Meta – formerly known as Facebook – has often been in the headlines for its poor social-related performance. One notable example is the unchecked spread of hate speech on the platform playing a “determining role” in an estimated 10,000 Rohingya Muslims being killed during a military crackdown in Myanmar in 2017.
It is unsurprising, then, that earlier this year eight social and governance-focused resolutions were filed, including a request for Meta to issue a report by February 2023 that assesses the risks of increased sexual exploitation of children as the platform develops and implements additional privacy tools, such as end-to-end encryption.
Another shareholder proposal called for Meta’s board to commission a third-party assessment of the potential psychological, civil and human rights harms of its proposed metaverse on users. Co-filers included investment manager Storebrand Asset Management and Open MIC.
None of these resolutions achieved a majority, which experts speaking to ESG Investor largely attributed to a single factor.
“Dual class share structures (DCSS) are a continuing example of outrageously bad corporate governance,” says Open MIC’s Connor.
DCSSs allow companies to issue more than one class of share, with those held by the founder and C-suite typically carrying greater voting weight, allowing them to override shareholder proposals they disagree with more easily.
“Being a responsible investor and active owner necessarily includes having this possibility to use our shareholder right to vote, file resolutions and make our opinion heard,” Victoria Lidén, Sustainability Analyst at Storebrand Asset Management, tells ESG Investor.
“Checks and balances are especially important for [social media] companies, given their enormous influence globally. As institutional investors, we believe that the establishment of a robust system of accountability and oversight at the board level is critical to risk management and securing brand reputation, which in turn can help protect shareholders’ long-term value.”
Thanks to its DCSS, Meta Co-Founder and CEO Mark Zuckerberg controls 58% of the vote.
Andrew Behar, CEO of US-based shareholder advocacy NGO As You Sow, says platforms like Meta are operating under “an authoritarian regime”.
“We’ve been filing social-focused resolutions at Meta for six years, but change doesn’t happen if [Zuckerberg] and his board don’t agree,” he says. “Although it’s technically a public company, it operates like a private one, with one man at the helm.”
In instances where a shareholder proposal fails to secure a majority vote, Lidén says that Storebrand will consider taking additional action, such as publishing a public statement outlining its views, collaborating with other investors, and voting against the appointment of directors.
Storebrand has also signed the Investor Statement on Corporate Accountability for Digital Rights, an initiative led by the Investor Alliance for Human Rights which outlines investor expectations for IT companies in line with the Ranking Digital Rights Corporate Accountability Index.
Asset owners are taking action. The Church of England’s (CoE) Ethical Investment Advisory Group (EIAG) published a report advising investors with Christian values on how to approach IT companies, covering issues such as data storage, human rights and AI ethics, which will inform future engagement efforts across all the CoE’s investing bodies.
UK-based pension fund Railpen and the Council of Institutional Investors co-founded the Investor Coalition for Equal Votes (ICEV), which aims to curb the use of DCSSs.
But EthicsGrade’s Radclyffe emphasises the importance of instilling core ESG principles into businesses from the outset.
“What we really need is for venture capital firms to encourage more social responsibility in social media start-ups before they go public,” he says. “They need to ensure that these companies understand that high growth at all costs simply isn’t compatible with long-term sustainability.”
Institutional investors can also exert their influence by encouraging venture capital firms to incorporate ESG considerations into their management of innovative start-ups, he adds.
From a legal perspective, it remains unclear the extent to which social media companies are responsible for the content on their platforms.
In the US, social media companies often cite Section 230 of the Communications Decency Act, enacted in 1996. It essentially provides website platforms with immunity over third-party content, on the basis that they are distributors of content created by their users, as opposed to publishers.
“We have been long saying that these companies must own the content on their sites. The argument that they merely own the pipes – and don’t own what travels through them – is ridiculous,” says As You Sow’s Behar.
Governments will have to find the line between what is considered acceptable under free speech versus controlling the narrative through censorship, an issue which Radclyffe says “only interoperability [between jurisdictions] can solve”.
Lax content moderation on TikTok allowed influencer Andrew Tate to spread harmful misogynistic views amongst his largely teenage audience. Although he was eventually banned from the platform (it is still possible to watch videos featuring him), this happened only after his content had been viewed millions of times, with his opinions inciting dangerous behaviour from his audience.
In comparison, Douyin – the Chinese version of TikTok – has been criticised for overly censoring content, including blocking videos promoting a mortgage boycott.
A more regulated future
Better late than never, governments are beginning to understand what effective regulation looks like.
EU policymakers have agreed on the Digital Services Act (DSA) package, which will come into force in 2024 and require companies, including social media operators, to better address harmful content on their platforms or face fines of up to 6% of their global turnover. The new rules include a ban on targeted advertising aimed at children or based on a consumer’s sensitive data, such as sexual orientation, political affiliation or religion. Repeat offences could result in companies being banned from doing business in the EU.
Thierry Breton, EU Commissioner for the Internal Market, issued a warning to Musk, noting that “in Europe, the bird will fly by our rules”.
“Although the legislation only applies in the EU, it might influence their operations worldwide and could contribute to the development of a global regulatory standard, which would be preferred given the global reach of these platforms,” says Lidén.
In the UK, a watered-down version of the long-awaited Online Safety Bill is expected to be brought back to Parliament at the end of this month. It will impose criminal sanctions on internet and social media platforms that fail to moderate content. The proposed rules have been softened, however: companies will no longer be required to address legal but harmful content aimed at adults, such as material promoting suicide or self-harm.
Connor says these measures are “a good start”, but “the big question is whether governments will have the political will and resources to enforce them”.
Given how integral social media has become to the average politician’s communications with the public, there are questions around the power dynamic between social media companies and governments, says EthicsGrade’s Radclyffe.
“In the same way as we recognise the role of banks, law, media and healthcare in society, regulators need to recognise the role social media plays and apply the same legal standards these other industries face, such as a requirement to maintain impartiality,” he notes.
Regulatory constraints may limit growth: there is evidence to suggest that the financial performance of social media companies could be negatively affected by ethics-focused rule changes.
“Overall, confidence in the future for these companies has changed,” says Lidén. “Earnings remain good, but the valuations, which are partly based on expectations about growth rates, naturally are changing. So, now [social media] companies must instead focus on the profitability and sustainability of their business models and their financing.”
WBA’s Montenegro echoes Lidén’s sentiments, calling for “more humility” from those in charge of social media platforms going forward.
“There needs to be more willingness to listen to society and stakeholders, and more transparency around social risks. Social media companies need to be less anti-social.”