‘Virtual influencers’ are here, but should Meta really be setting the ethical ground rules?
By Tama Leaver and Rachel Berryman
Earlier this month, Meta announced it is working on a set of ethical guidelines for “virtual influencers” — animated, typically computer-generated, characters designed to attract attention on social media.
When Facebook renamed itself Meta late last year, it heralded a pivot towards the “metaverse” — where virtual influencers will presumably one day roam in their thousands.
Even Meta admits the metaverse doesn’t really exist yet. The building blocks of a persistent, immersive virtual reality for everything from business to play are yet to be fully assembled. But virtual influencers are already online, and are surprisingly convincing. Given its recent history, though, is Meta (née Facebook) really the right company to be setting the ethical standards for virtual influencers and the metaverse more broadly?
WHO (OR WHAT) ARE VIRTUAL INFLUENCERS?
Meta’s announcement notes the “rising phenomenon” of synthetic media – an umbrella term for images, video, voice or text generated by computerized technology, typically using artificial intelligence (AI) or automation.
Many virtual influencers incorporate elements of synthetic media in their design, ranging from completely digitally rendered bodies, to human models that are digitally masked with characters’ facial features. At both ends of the scale, this process still relies heavily on human labor and input, from art direction for photo shoots to writing captions for social media. Like Meta’s vision of the metaverse, influencers that are entirely generated and powered by AI are a largely futuristic fantasy.
But even in their current form, virtual influencers are of serious value to Meta, both as attractions for their existing platforms and as avatars of the metaverse.
Interest in virtual influencers has rapidly expanded over the past five years, attracting huge audiences on social media and partnerships with major brands, including Audi, Bose, Calvin Klein, Samsung, and Chinese e-commerce platform TMall.
A competitive industry specializing in the production, management, and promotion of virtual influencers has already sprung up, although it remains largely unregulated.
So far, India is the only country to address virtual influencers in national advertising standards, requiring that brands “disclose to consumers that they are not interacting with a real human being” when posting sponsored content.
ETHICAL GUIDELINES
There is an urgent need for ethical guidelines, both to help producers and their brand partners navigate this new terrain, and more importantly to help users understand the content they’re engaging with.
Meta has warned that “synthetic media has the potential for both good and harm,” listing “representation and cultural appropriation” as specific issues of concern.
Indeed, despite their short lifespan, virtual influencers already have a history of overt racialization and misrepresentation, raising ethical questions for producers who create digital characters with different demographic characteristics from their own.
But it’s far from clear whether Meta’s proposed guidelines will adequately address these questions. Becky Owen, head of creator innovation and solutions at Meta Creative Shop, said the planned ethical framework “will help our brand partners and VI creators explore what’s possible, likely and desirable, and what’s not.”
This seeming emphasis on technological possibilities and brand partners’ desires leads to an inevitable impression that Meta is once again conflating commercial potential with ethical practice.
By its own count, Meta’s platforms already host more than 200 virtual influencers. But virtual influencers exist elsewhere too: they do viral dance challenges on TikTok, upload vlogs to YouTube, and post life updates on Sina Weibo. They appear “offline” at malls in Beijing and Singapore, on 3D billboards in Tokyo, and star in television commercials.
GAMEKEEPER, OR POACHER?
This brings us back to the question of whether Meta is the right company to set the ground rules for this emerging space.
The company’s history is tarred by unethical behavior, from Facebook’s questionable beginnings in Mark Zuckerberg’s Harvard dorm room (as depicted in The Social Network) to large-scale privacy failings demonstrated in the Cambridge Analytica scandal.
In February 2021 Facebook showed how far it was willing to go to defend its interests, when it briefly banned all news content on Facebook in Australia to force the federal government to water down the Australian News Media Bargaining Code. Last year also saw former Facebook executive Frances Haugen very publicly turn whistleblower, sharing a trove of internal documents with journalists and politicians.
These so-called “Facebook Papers” raised numerous concerns about the company’s conduct and ethics, including the revelation that Facebook’s own internal research showed Instagram can harm young people’s mental health, even leading to suicide.
Today, Meta is fighting US antitrust litigation that aims to restrain the company’s monopoly by potentially compelling it to sell key acquisitions including Instagram and WhatsApp.
Meanwhile, Meta is scrambling to integrate its messaging service across all three apps, effectively making them different interfaces for a shared back end that Meta will doubtless argue cannot feasibly be separated, no matter the outcomes of the current litigation.
Given this back story, Meta seems far from the ideal choice as ethical guardian of the metaverse.
The already extensive distribution of virtual influencers across platforms and markets highlights the need for ethical guidelines that go beyond the interests of one company — especially a company that stands to gain so much from the impending spectacle. — The Conversation
Tama Leaver is Professor of Internet Studies at Curtin University in Australia.
Rachel Berryman is a PhD Candidate in Internet Studies at Curtin University in Australia.
This article is republished from The Conversation under a Creative Commons license. Read the original article.