Monday, November 16, 2020

What Should Be Done About Social Media?

By Moshe Y. Vardi
Communications of the ACM, November 2020, Vol. 63 No. 11, Page 5
10.1145/3424762

One of the most basic and urgent policy questions is how to tackle the rising role of social media in our public sphere. As social media has proliferated across the globe, societies have had to grapple with its implications for both exercising and constraining speech. While social media has provided a platform for countless individuals to express their opinions, many argue that social-media companies must accept greater accountability for harmful content published on their sites.

The case of Facebook offers a timely illustration of the salience of technological change in the world of social media. Over the past several years, Facebook has been embroiled in a long series of controversies, from Cambridge Analytica to hate speech in Myanmar. Responding to the bad publicity that accompanied these disclosures, Facebook CEO Mark Zuckerberg wrote a Washington Post op-ed in 2019 calling for increased regulation of the Internet in four areas: harmful content, election protection, effective privacy and data protection, and data portability.

Until 2014, Facebook's motto was, "Move fast and break things. Unless you are breaking stuff, you are not moving fast enough." Despite the benefits of innovation and change, "breaking things" can have profound and dangerous unintended societal consequences. The case that social media has become an instrument for undermining democracy is a strong one. It is now widely accepted that social media seriously affected the 2016 Brexit referendum in Britain and the presidential election in the U.S. It is this cavalier attitude about breaking things that led Wall Street Journal columnist Peggy Noonan to describe Silicon Valley executives as "moral Martians."

A discussion of modern Internet regulation merits mention of Section 230 of the Communications Decency Act of 1996, a fundamental piece of U.S. legislation that provides immunity from liability for providers and users of an "interactive computer service" who publish information provided by third-party users. By allowing Facebook and other Internet companies to operate as platforms, rather than as publishers, Section 230 frees them from liability for the content they publish. The explosive growth of social-media platforms would not have been possible without Section 230. At the same time, it is doubtful the Congress of 1996 could have conceptualized anything like social media. One can also argue that Facebook is quite far from being a neutral platform, because its algorithm-based system selects and ranks content according to users' preferences. In fact, the proliferation of "bad speech" on social-media platforms has become politically untenable, and all social-media platforms are now actively fighting it. Thus, in spite of Section 230, social-media platforms seem to be accepting some responsibility for the content they publish; in other words, they are starting to behave with some restraint, like publishers rather than platforms.

It is not at all clear, however, whether a platform like Facebook, which also owns Instagram and WhatsApp, with more than 2.5 billion active users, can behave like a traditional publisher. First, there is the difficulty of vetting content from a very large number of users. With just over 50,000 employees, Facebook clearly cannot have people review all its content; algorithmic filtering is a must. But, if we have learned anything over the last few years, it is how good people are at outsmarting algorithms. More fundamentally, however, do we really want Facebook to regulate the speech of more than 2.5 billion people? No government in the world has such power to regulate the speech of almost a third of humanity. Of course, traditional publishers regulate speech on their platforms, but there is a multiplicity of such outlets with no single authority having a monopoly on deciding for or against certain content. In contrast, there is only one Facebook.
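
To make the scale argument concrete, here is a back-of-envelope sketch in Python. Only the 2.5 billion users and roughly 50,000 employees come from the column; the posting rate and per-post review time below are illustrative assumptions, not reported figures.

# Back-of-envelope estimate of why human-only content review cannot scale.
# The user count and workforce size come from the column; the posting rate
# and per-post review time are illustrative assumptions.

users = 2_500_000_000            # active users across Facebook's platforms
posts_per_user_per_day = 1       # assumed average posting rate
seconds_per_review = 30          # assumed time for a human to vet one post
shift_seconds = 8 * 3600         # one moderator's eight-hour workday

posts_per_day = users * posts_per_user_per_day
reviews_per_moderator_per_day = shift_seconds // seconds_per_review   # 960
moderators_needed = posts_per_day / reviews_per_moderator_per_day

print(f"Posts per day:     {posts_per_day:,}")
print(f"Moderators needed: {moderators_needed:,.0f}")   # about 2.6 million
# That is roughly 50 times Facebook's entire workforce of ~50,000 people.

Even if each assumption is off by an order of magnitude in Facebook's favor, the gap between what a human workforce can review and what users produce remains vast, which is precisely why algorithmic filtering is unavoidable.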

The basic policy question—how to regulate speech on social-media platforms—seems inseparable from another policy concern, namely how to deal with the concentration of power in technology. The five largest U.S. corporations are all tech companies—Alphabet, Amazon, Apple, Facebook, and Microsoft—with a combined market capitalization approaching seven trillion dollars. For this reason, the tech sector is often called "Big Tech" these days. In his 2018 book, The Curse of Bigness: Antitrust in the New Gilded Age, legal scholar Tim Wu argues the U.S. must enforce antitrust laws against such corporations.

Breaking up well-integrated corporations is difficult, but there are some easier options. Should Facebook be forced to spin off Instagram and WhatsApp? Should Google be forced to spin off YouTube? The time has come to put these questions on the table!

Follow me on Facebook and Twitter.


Author

Moshe Y. Vardi (vardi@cs.rice.edu) is the Karen Ostrum George Distinguished Service Professor in Computational Engineering and Director of the Ken Kennedy Institute for Information Technology at Rice University, Houston, TX, USA. He is the former Editor-in-Chief of Communications.


Copyright held by author.



Comments


M. Grossman
October 26, 2020 12:56

You say that the question of how to regulate speech seems inseparable from the question of Big Tech's bigness. But I don't feel you've really made a solid connection between the two. A platform company 1/3 the size of Facebook can either succeed or fail at regulating speech too. And what does success look like?


Moshe Vardi
October 27, 2020 09:55

No one complains about the ability of the New York Times to regulate speech on its "platform", because there are many competing newspapers, each making its own editorial decisions. But Facebook is effectively the sole public square of cyberspace, with the ability to regulate the speech of more than 2.5 billion users.


Richard Altmaier
October 30, 2020 03:03

I have two points.
1. There is clearly the path of break-up: divide Facebook into three companies by geographic region, let them compete with each other, and hopefully they will trend toward different censorship policies.
2. We cannot accept the notion of "bad speech." There is free speech, and there is a narrow segment of violent speech. Hate speech is a form of free speech. There is no such thing as fact checking; there is merely the opportunity to put out your own opinion. Politicians cannot be fact-checked, by definition.





from Hacker News https://ift.tt/36Jl5Hg
