With eight shareholder proposals filed, investors are sending “a strong signal” that Meta must address social and governance issues.
Investors are concerned that social media giant Meta’s proposed ‘metaverse’ will “amplify” the negative effects of the firm’s existing poor governance structure and weak performance on social issues.
A combination of long-standing criticisms and new risks means that Meta’s upcoming AGM could be one of the most watched of the 2022 US proxy season.
Despite a controversial track record on social and governance issues, internal reports suggest the firm “is unwilling to address its flaws”, Victoria Lidén, Sustainability Analyst at Norwegian Storebrand Asset Management, told ESG Investor.
“Now the company is dedicating significant resources to developing a virtual universe, where there is a high risk that these negative effects will amplify, if no effective risk management measures are put in place.”
Mark Zuckerberg, Founder and CEO of Facebook – now known as Meta – has said the metaverse will be a virtual reality space where users’ avatars are able to work and interact. But the firm has not yet addressed safety concerns raised by investors and others.
In December, a shareholder proposal was filed requesting Meta’s board to commission a third-party assessment of the potential psychological, civil and human rights harms of the metaverse on users.
The proposal also asks the company to outline how these harms can be mitigated and which are “unavoidable risks”. It was filed by Storebrand Asset Management, alongside investment management firm Arjuna Capital, SHARE and SumOfUs, and was supported by Open MIC, an NGO working with shareholders to encourage corporate accountability in the tech sector.
An advisory shareholder vote on the proposal is set to take place at the company’s AGM on 25 May.
“We are simply seeking an advisory shareholder vote which would give investors an opportunity to weigh in and advise the board and company management on the development of the metaverse,” said Kamil Zabielski, Storebrand’s Head of Sustainable Investment.
Despite protests from Meta, the US Securities and Exchange Commission (SEC) ruled that the vote should go ahead, which Michael Connor, Executive Director of Open MIC, noted is “a win for all those who are deeply troubled by Meta’s appalling track record of dodging accountability and failing to address human and civil rights abuses, as well as privacy concerns affecting billions of people globally”.
Earlier this month, the SEC similarly rejected Amazon’s challenge against a shareholder resolution calling for improved tax transparency.
Even if the Meta vote does not pass, Lidén said, it would nonetheless send “a strong signal” to company management and the public that governance and social-related issues matter to investors and should be addressed.
Tip of the iceberg
The metaverse is not investors’ only concern about the social media company, which also owns WhatsApp and Instagram.
Seven other shareholder proposals have been filed by investors for consideration at next month’s AGM.
One of the proposals calls for a vote on an outside assessment of Meta’s Audit and Risk Oversight Committee, and the potential expansion of its charter responsibilities.
“We also oppose ownership structures that allow a combined Chair/CEO role, or a lack of independent directors on the board. Checks and balances are especially important for a company such as Meta, given its enormous influence globally. As institutional investors, we believe that the establishment of a robust system of accountability and oversight at the board level is key to protecting shareholders’ long-term value,” said Lidén.
However, Meta’s dual class share structure (DCSS) makes it far more challenging for investors to push for positive ESG-related changes.
A DCSS allows companies to sell more than one class of share, with some classes carrying more votes per share than others. Zuckerberg owns 14% of Meta’s total shares but controls 58% of the votes, meaning he can overrule proposals he doesn’t agree with.
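The arithmetic behind that gap between ownership and control can be sketched as follows. This is an illustrative example only, not Meta’s actual capitalisation table: the share counts and the `ownership_and_votes` helper are hypothetical, and the assumed weighting (Class A shares carrying one vote each, Class B shares ten) is a common dual-class arrangement rather than a figure confirmed by the article.

```python
# Illustrative sketch of a dual class share structure (DCSS).
# Assumption: Class A = 1 vote per share, Class B = 10 votes per share.
# All share counts below are hypothetical, not Meta's real cap table.

def ownership_and_votes(founder_a, founder_b, total_a, total_b,
                        votes_a=1, votes_b=10):
    """Return (economic stake, voting power) as fractions of the company."""
    economic = (founder_a + founder_b) / (total_a + total_b)
    voting = (founder_a * votes_a + founder_b * votes_b) / (
        total_a * votes_a + total_b * votes_b)
    return economic, voting

# Hypothetical: a founder holds all 350m high-vote Class B shares,
# while 2,000m low-vote Class A shares trade publicly.
econ, votes = ownership_and_votes(founder_a=0, founder_b=350,
                                  total_a=2_000, total_b=350)
print(f"economic stake: {econ:.0%}, voting power: {votes:.0%}")
# → economic stake: 15%, voting power: 64%
```

With these made-up numbers, a roughly 15% economic stake translates into a voting majority, which is the mechanism that lets a founder defeat any shareholder proposal regardless of how the remaining investors vote.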
Despite publishing a new human rights policy, Meta has failed to commit fully to upholding international human rights in its development and use of algorithmic systems, according to a scorecard recently published by Ranking Digital Rights (RDR), the independent research programme at policy think tank New America.
In April 2021, more than 500 million Meta (then Facebook) users’ sensitive personal data was leaked. However, Meta has only publicly disclosed information on data breach management in its response to the earlier Cambridge Analytica scandal, the RDR report said, adding that Meta “says nothing about whether policies and practices are in place that can systematically address a data breach when it occurs”.
Meta is not the only big tech company coming under greater investor scrutiny. Tesla Co-Founder Elon Musk has controversially spent US$44 billion to purchase Twitter, having previously proposed spending US$6 billion to end world hunger.
“Investors in technology companies are growing increasingly anxious as to whether the boards of their investments are acting as appropriate stewards of their capital,” said Charles Radclyffe, CEO of ratings company EthicsGrade.
“Not just in pursuing capital growth – which the technology industry has been historically very good at – but also in their wider alignment to ESG goals.”
Storebrand’s Zabielski said it is too early to say what changes Musk may bring to Twitter, and whether these will present social or governance-related risks.
“We see that there is currently a widespread discussion about how he will address these issues and secure the right to free speech, a discussion and scrutiny that probably will continue also going forward. We will follow the situation closely as it evolves,” he said.
As well as increased pressure from investors, social media firms are facing greater oversight from regulators.
This month, EU policymakers agreed on the Digital Services Act (DSA), which aims to push companies to better tackle illegal and harmful content on their platforms. It will apply from 2024.
“The purpose of the DSA is to hold the platforms accountable for what happens on social networks, by forcing them to better moderate the content on their platforms,” said Lidén.
Tech companies will face fines of up to 6% of their global turnover if they violate the rules of the DSA, with repeated offences potentially resulting in their being banned from doing business in the EU. The new rules include banning targeted advertising that is specifically aimed at children or based on sensitive data, such as religion, sexuality, race or political opinions.
In a measure triggered by Russia’s invasion of Ukraine, the DSA will also require the largest online platforms to take specific steps during a crisis, including managing the spread of disinformation.
Companies will be asked to pay a yearly fee of 0.05% of their worldwide revenue to cover the costs of monitoring their compliance.
“Although the legislation only applies in the EU, it might influence [tech companies’] operations worldwide and could contribute to the development of a global regulatory standard,” Lidén noted.