What to do about domestic social media disinformation

A new report argues for social media platforms to take action

Now that the goals and tactics of foreign manipulators such as the Kremlin are better understood, experts in media and disinformation are turning their attention to the problems of domestic disinformation, homegrown in the United States. A new report authored by Paul Barrett, Deputy Director of the Center for Business and Human Rights at NYU’s Stern School of Business, chronicles the problem and makes specific recommendations to social media platforms on what to do.

The report is full of sensible recommendations, including that platforms should invest more in content moderation to remove content that is demonstrably false, and that such moderation decisions should be transparent to users. “If statements are meant intentionally to mislead, they should be taken down,” said Dipayan Ghosh, the co-director of the Platform Accountability Project at the Harvard Kennedy School.

Of course, the mechanism for making such decisions is incredibly difficult to manage at scale, even if arguments about freedom of speech (dubious in the case of private social media companies, this report suggests) are put to bed. It follows that other recommendations, such as hiring senior executives responsible for addressing disinformation and retooling algorithms to reduce the ‘outrage factor’, are presented alongside it.

One of the more interesting counterpoints in the report is presented by a former Facebook executive. “I’m very afraid of what happens five or 10 years out, when the companies have machine-learning systems that understand human languages as well as humans,” Alex Stamos, the former Facebook Chief Security Officer, now at Stanford University, told Barrett. “We could end up with machine-speed, real-time moderation of everything we say online. Do we want to put machines in charge that way?” We seem to be some way from such a scenario; at present, the job of content moderation is very human indeed. But the question is a good one, as it forces us to consider what rules we set today.

Summary of Recommendations to Social Media Companies

Read the full report here, and find earlier reports from the Center for Business and Human Rights here.
