How would you prefer to be manipulated? If we weight your social media feed by what we know you like to see, then you will be showered with commercial clickbait. If we weight your social media feed by what is likely to spark intense emotions, then you will be flooded with hate speech.
How many levels of manipulation are we seeing here? Facebook is adjusting its flow to keep you engaged and spending time and energy on the platform. Advertisers are constantly working the algorithm to spin more views and tweaking their messages to spark more clicks. Haters gonna hate, and they have learned how to turn that hate into views and clicks. Everyone wants to manipulate you, but they need to manage Facebook first.
Recently leaked documents show that Facebook’s concern over passivity in its user base led the company to re-prioritize its feed algorithm to reward “meaningful social interactions” over interesting articles. As the Washington Post observed, “The downside of this approach was that the posts that sparked the most comments tended to be the ones that made people angry or offended them, the documents show. Facebook became an angrier, more polarizing place. It didn’t help that, starting in 2017, the algorithm had assigned reaction emoji — including the angry emoji — five times the weight of a simple ‘like,’ according to company documents.” So a change intended to spark conversations between family and friends instead prioritized anger and disgust. The recent increase in anger and partisanship in the US may or may not trace back to this change.
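The weighting the documents describe can be sketched as a toy scoring function. Only the five-to-one ratio of reactions to likes comes from the reporting above; the field names, the comment weight, and the sample posts here are all hypothetical, for illustration only.

```python
# Toy sketch of reaction-weighted feed ranking. The 5x weight for reaction
# emoji versus a plain "like" is from the leaked documents as reported;
# every other name and number here is an assumption for illustration.

REACTION_WEIGHT = 5  # angry, love, haha, etc. — all reactions, per the reporting
LIKE_WEIGHT = 1

def engagement_score(post):
    """Score a post by weighted interactions, in the spirit of the leak."""
    return (LIKE_WEIGHT * post["likes"]
            + REACTION_WEIGHT * post["reactions"]   # includes the angry emoji
            + post.get("comments", 0))              # hypothetical comment weight

# Two invented posts: one broadly liked, one that draws angry reactions.
posts = [
    {"id": "calm",     "likes": 100, "reactions": 2,  "comments": 5},
    {"id": "enraging", "likes": 20,  "reactions": 30, "comments": 40},
]

# Rank the feed: under this weighting, the post provoking reactions
# outranks the one with five times as many likes.
feed = sorted(posts, key=engagement_score, reverse=True)
```

Under these (invented) numbers, the "enraging" post wins the ranking despite far fewer likes, which is the dynamic the Post's reporting describes.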
Many people believe such a system is bad for Facebook users (more on that below) and bad for democracy, as people at the extremes whip up everyone else’s emotions and distrust. Not to mention the fact that highly motivated bad actors, like government-sponsored bots and sock puppets from Russia and Iran, take the next step in manipulation – fomenting wrath from lies and outrageous claims simply to drive Americans further apart. But the engagement seems superficially good for Facebook.
Which brings us to the crux of the problem. Should the government be regulating the news feed of a private company, forcing the company to make different decisions about how it chooses content to display to its own customers? Facebook works to eliminate sexually oriented content, avoiding one common area of government content restrictions. We regulate honesty in consumer speech, so it is reasonable to force Facebook to crack down on lies and misinformation where it can. Companies can use their dominant positions in one market to gain an unfair advantage in another market, and we regulate this through antitrust laws. We have rules about keeping consumer information private, and we punish companies for breaking these rules. But Congressional calls to regulate Facebook’s method of providing content to its subscribers are not based on any of these well-established regulatory themes.
Facebook’s behavior may be wrong or even illegal. But driving its users toward or away from one set of essentially honest (if gross and despicable) content does not provide a basis for government control. There is something about the depth of social media’s impact on people that makes us inherently suspicious and afraid. We know the manipulation is happening. Such manipulation seems wrong. With nearly three billion Facebook members, we know that people aren’t stopping themselves, and we suspect the manipulation and involuntary addiction are to blame. So why can’t Congress do something to stop it?
There is also a First Amendment overlay to this problem. American legislators are opposed to telling the New York Times and Fox News that they must change the messages they publish to readers and viewers. Why is Facebook any different? Doesn’t Facebook have the right to push its subscribers in any direction it chooses, as long as it is honest about doing so? If users don’t like it, they can drop out or move to a different service, platform or medium. There is a strong argument that the Facebook algorithm is speech, simply choosing speech options for each user. What lines would this speech need to cross before we exercise prior restraint? Is it the place of the government to restrain this speech?
Maybe we should do this to protect democracy. There are lots of media outlets that I believe are harmful to democracy, but legislators are not looking into regulating them more tightly. Why is Congress most interested in services that are essentially neutral third parties, and not the screaming troublemakers on one side or the other? All major social media – Facebook, Instagram, TikTok, YouTube, Twitter – provide a platform for people to build their own experiences. At no time in our history have we passed laws to regulate what a neutral platform can show people.
Facebook’s own research shows that its Instagram service is bad for the mental health of teenage girls. Internal company documents showed that Instagram exacerbates body-image problems for minors. Congressional representatives are now requesting that YouTube, Snap and TikTok turn over research they have conducted evaluating users’ mental health. Children’s health is clearly an area of historic government regulation, and we may yet see legislation promoting children’s wellbeing directed at Facebook.
Of course, Section 5 of the FTC Act allows regulation and penalties for unfair and deceptive conduct. This is an intentionally broad standard. So the FTC could simply decide that manipulative algorithms are unfair and deceptive, and push to have them altered. It would be interesting to see whether this would be presented as a “stop doing this bad thing” regulatory action or more of a “do things this way” action.
Facebook now presents some of the same problems that television presented in the 1950s. Once many people are using a medium and being influenced by its content, Congress becomes itchy to act. But there is a key difference: television networks dominated the public airwaves, while Facebook is just one of millions of options available on the internet. So Congressional influence may remain confined to hearings and lectures, rather than active legislation. In any case, as the Metaverse enters our public consciousness, we are likely to see Mr. Zuckerberg in legislative hearings for many years.