FDA should regulate Instagram’s algorithm as a drug


The Wall Street Journal on Tuesday reported Silicon Valley’s worst-kept secret: Instagram harms teens’ mental health; in fact, its impact is so negative that it can introduce suicidal thoughts.

Thirty-two percent of teenage girls who feel bad about their bodies say Instagram makes them feel worse. Among teens with suicidal thoughts, 13% of UK users and 6% of US users trace those thoughts to Instagram, according to the WSJ report. This is Facebook’s own internal data. The truth is surely worse.

President Theodore Roosevelt and Congress created the Food and Drug Administration in 1906 precisely because Big Food and Big Pharma had failed to protect the general welfare. As its leaders march to the Met Gala to celebrate the inaccessible lifestyles and bodies of the 0.01% that we mere mortals will never achieve, Instagram’s reluctance to do what’s right is a clear call for regulation: the FDA must assert its codified right to regulate Instagram’s algorithm as a drug.

The FDA should view algorithms as drugs impacting our nation’s mental health: the Federal Food, Drug, and Cosmetic Act gives the FDA the right to regulate drugs, defining them in part as “articles (other than food) intended to affect the structure or any function of the body of man or other animals.” Instagram’s internal data shows that its technology is an article that alters our brains. If this effort fails, Congress and President Joe Biden should create a mental health FDA.

The public needs to understand what Facebook and Instagram’s algorithms prioritize. Our government is equipped to study, through clinical trials, products that can physically harm the public. Researchers can likewise study what Facebook privileges and the impact those choices have on our minds. How do we know this? Because Facebook is already doing it – they just bury the results.

In November 2020, as Cecilia Kang and Sheera Frenkel report in “An Ugly Truth,” Facebook made an emergency change to its news feed, placing more emphasis on “News Ecosystem Quality” (NEQ) scores. High NEQ scores went to trustworthy news sources; low scores to untrustworthy ones. Facebook changed the algorithm to favor high NEQ scores. As a result, for five days around the election, users saw a “nicer news feed” with less fake news and fewer conspiracy theories. But Mark Zuckerberg reversed the change because it reduced engagement and could provoke a conservative backlash. The public suffered for it.

Facebook has also studied what happens when its algorithm favors “good for the world” content over “bad for the world” content. Lo and behold, engagement drops. Facebook knows that its algorithm has a remarkable impact on the minds of the American public. How can the government let one man set the standard based on his business imperatives rather than the general welfare?

Upton Sinclair memorably exposed dangerous abuses in “The Jungle,” which sparked a public outcry. The free market had failed; consumers needed protection. The Pure Food and Drug Act of 1906 enacted the first safety standards, regulating consumer goods that impact our physical health. Today, we need to regulate the algorithms that impact our mental health. Depression among adolescents has risen alarmingly since 2007. Likewise, suicide among those aged 10 to 24 increased by almost 60% between 2007 and 2018.

It is of course impossible to prove that social media alone is responsible for this increase, but it is absurd to claim it has not contributed. Filter bubbles distort our views and make them more extreme. Online bullying is easier and more constant. Regulators must audit the algorithm and question Facebook’s choices.

When it comes to Facebook’s biggest problem – what the product does to us – regulators have struggled to articulate the issue. Section 230 is correct in its intent and application; the internet cannot function if platforms are liable for every user statement. And a private company like Facebook loses the trust of its community if it applies arbitrary rules that target users based on their background or political beliefs. Facebook as a business has no explicit duty to uphold the First Amendment, but the public’s perception of its fairness is essential to the brand.

So Zuckerberg has equivocated over the years before belatedly banning Holocaust deniers, Donald Trump, anti-vaccine activists and other bad actors. In deciding which speech is privileged or allowed on its platform, Facebook will always be too slow to react, too cautious and ineffective. Zuckerberg cares only about engagement and growth. Our hearts and minds hang in the balance.

The scariest part of “An Ugly Truth,” the part that got everyone in Silicon Valley talking, was the eponymous memo: Andrew “Boz” Bosworth’s 2016 missive, “The Ugly.”

In the note, Bosworth, Zuckerberg’s longtime deputy, writes:

“So we connect more people. That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good.”

Zuckerberg and Sheryl Sandberg made Bosworth walk back his statements when employees objected, but to outsiders the memo represents Facebook’s unvarnished identity, the ugly truth. Facebook’s monopoly, its stranglehold on our social and political fabric, its growth-at-all-costs mantra of “connection,” is de facto not good. As Bosworth acknowledged, Facebook causes suicides and allows terrorists to organize. This concentrated power in the hands of a single company, run by one man, is a threat to our democracy and our way of life.

Critics of FDA regulation of social media will claim it is a Big Brother invasion of our personal freedoms. But what is the alternative? Why would it be bad for our government to demand that Facebook be publicly accountable for its internal calculations? Are session counts, time spent and revenue growth the only outcomes that matter? What about the collective mental health of the country and the world?

Refusing to study the problem does not mean it does not exist. In the absence of action, we are left with one man deciding what is right. What price do we pay for “connection”? That is not up to Zuckerberg. The FDA should decide.

