With the Joe Rogan podcast controversy, Spotify has officially joined the ranks of media platforms publicly defending their governance practices.
Rogan’s podcast is a harbinger of the future of business — and of social media. Platforms that did not consider themselves social are now faced with managing content and user interaction. In the industry, we’d say Spotify has a “trust and safety” issue.
Spotify, and all other platforms featuring user-generated content, are learning the hard way that they can’t stand aside and rely on users to post appropriate content that doesn’t violate company policies or social norms. Platforms are discovering that they need to become legitimate, active authority figures, not passive publishers. Research shows that they can start by building user trust and setting expectations of good behavior.
Rogan is just one example. With Spotify’s acquisition of Anchor and its partnership with WordPress providing “easier access to podcast creation,” user-generated podcasts about politics, health and social issues are part of Spotify’s new frontier.
To this we can add platform integration: users can now connect Spotify with other platforms, such as Facebook, Twitter and Peloton. This means that Spotify’s user experience is shaped by content created elsewhere on the Internet, on platforms with distinct rules and codes of conduct. Without common industry standards, “misinformation” flagged on, say, Twitter won’t necessarily be flagged by Spotify’s systems.
Welcome to the future of social media. Companies once thought they could rely on algorithms to detect inappropriate content and intervene with public relations in high-profile cases. Today, the challenges are bigger and more complicated as consumers redefine where and how to be social online.
Tech companies can adapt by working on two fronts. First, they must establish themselves as legitimate authorities in the eyes of their community. It starts with making the rules readily available, easily understood, and applicable to all users.
Think of the rules of the road, another large-scale system that works by making sure people know the rules and share a common understanding of traffic lights and rights of way. Simple reminders of the rules, like stop signs, can be very effective. In experiments with Facebook users, reminding people of the rules decreased the likelihood of continued bad behavior. To create safety on platforms serving thousands or even millions of users, a company must also develop clear and understandable procedures.
Try to find Spotify’s rules. We could not. Imagine driving without stop signs or traffic lights. It’s hard to follow the rules if you can’t find them. Tech companies have always been reluctant to act as responsible authority figures. Silicon Valley’s first efforts to manage user content were anti-spam teams blocking bad actors who hacked into their systems for fun and profit. Companies genuinely believed that disclosing the rules would only help users game the platform, and that people would change their behavior only when punished.
We call this approach “deterrence,” and it works for adversarial actors like spammers. It’s not as effective for more complicated rule-breaking behaviors, like racist rants, misinformation, and incitement to violence. The offenders here aren’t necessarily motivated by money or a love of hacking. They have a cause, and they can see themselves as rightfully voicing an opinion and building a community.
To influence the content of these users, companies must abandon reactive sanctions and instead adopt proactive governance – setting standards, rewarding good behavior and, if necessary, enforcing rules quickly and with dignity to avoid being perceived as arbitrary authority figures.
The second key step is to be transparent with the community and set clear expectations for appropriate behavior. Transparency means disclosing what the company is doing, and how well it is doing, at keeping users safe. Reinforcing these “platform norms” helps users understand how their actions might impact the wider community. The Joe Rogans of the world start to look less appealing as people come to view them as a threat to the safe and healthy experience of that community.
“We’re defining a whole new space in technology and media,” Spotify founder and CEO Daniel Ek said at a recent employee meeting. “We are a very different kind of company, and the rules of the road are being written as we innovate.”
It’s just not true. Sorry, Spotify, but you’re not that special. There are already proven “rules of the road” for tech platforms – rules that hold great promise for building trust and security. The company just has to accept them and follow them.
You’ll still get incidents of “road rage” online from time to time, but the public might just be more forgiving when it happens.