Via America’s Lawyer: Mike Papantonio talks to Sarah T. Roberts, Professor of Information Studies at UCLA, to discuss the legal issues surrounding violent content posted on social media and the responsibility to stop its proliferation.
Transcript of the above video:
Papantonio: By now most people are familiar with the murder of 74-year-old Robert Godwin. Cleveland resident Steve Stephens murdered Godwin last month and proceeded to post a video of the murder on Facebook, where it remained for more than two hours before being taken down. This case has reignited the debate about the legal and ethical responsibilities of social media sites in preventing violent content from spreading. While sites like Facebook have been sued in the past for failing to prevent the proliferation of graphic content, these cases are nearly impossible to win. Social media sites are protected by the 1996 Communications Decency Act, which states that companies can’t be held liable for graphic content posted by their users.
Here to discuss this case further is Sarah Roberts, a professor of information studies at UCLA. Sarah, do you think this specific case exposed a major flaw in how content is moderated by Facebook, or is it just an aspect of our modern, socially connected world that we simply have to accept and can’t do anything about? What’s your take?
Sarah Roberts: My take is that indeed this is a major downside of the social media economy in which most of us are engaged on a daily basis at this point. I’ve been looking at this for seven years. I’ve been online for over 20 years myself. As the number of people engaged in social media has increased exponentially, so too has the potential for people to use the platforms for all kinds of human self-expression, including the very worst kind, and that’s what we’ve seen with this case.
Papantonio: Can you explain how the content moderation process works at Facebook? If a violent crime is recorded, how does it end up posted on the site? How does the process work? That’s what I think most people are wondering. Is there any gatekeeper? Does somebody say, “Gee whiz, we can’t allow this to go out”? Tell us about the process.
Sarah Roberts: One of the things that this case has done, in a tragic way, is bring to light this very question of moderation. Contrary to what many people believe, it isn’t computers and algorithms that engage on a large scale with all the content that goes up on social media websites. That’s really not true at all. In fact, not only is adjudicating the content on a site not entirely computational, there are legions of human beings around the world who look at this material and make decisions about it.
The other piece is that the volume is so enormous that even with computers engaged in that process to a certain extent, which they are, all of that content cannot possibly be reviewed before it goes up. What ends up happening on Facebook, on Instagram, on Twitter, on any number of platforms that people use, is that these platforms are essentially empty vessels that rely on user-generated content to fill them. People fill them with content in droves, it goes up online, and then in most cases it’s incumbent upon the very same users, people like you and me, to see something and alert the platform.
Papantonio: All the platform is doing is looking at the metrics. They’re seeing that the numbers are going up. They’re really not paying any attention until somebody says, “You’ve got to take this down. This isn’t right.” We think that there’s some infrastructure to oversee this, but as you point out, there really is not much of an infrastructure at all when it comes right down to it. This video was reportedly on the site for more than two hours. Do you think that more human curators could’ve gotten the video taken down sooner had there been a system in place?
Sarah Roberts: To the point about the infrastructure, there is infrastructure and there are human beings working diligently on this issue. It’s actually a full-time job for a lot of people, and it’s one that has its own serious repercussions, as you can imagine. The issue, again, is one of scope and scale. What we’re actually talking about is finding a needle in a haystack. Just to give you an idea of what we’re talking about here, in 2014 YouTube released a statistic about the volume of material it receives, and it was something like 100 hours of video uploaded to the site for every minute of every day that the platform is live. There’s just no way for the platforms to review and look at that volume of content. I think you pointed out in the beginning of your comments that platforms are largely immune from being prosecuted for transmitting this kind of material under that 1996 telecommunications law.
What ends up happening is basically they get all of the reward and very little of the responsibility for what they’re transmitting.
Papantonio: There is another side to it, though, isn’t there? If we start tampering with this too much, then we run the risk of going to the other extreme: are we tampering too much? Are we closing too much down? Is there too much censorship? Isn’t that kind of the balancing part of this whole thing? We’re right on that edge. How do we find that sweet spot? What’s your take on that?
Sarah Roberts: You’re absolutely right that people are very concerned, and rightly so, about maintaining their ability to express themselves on these platforms. One thing that we have to keep in mind is that these platforms are private entities. They’re for-profit enterprises. They’re actually not the public square. They’re not the commons that people want them to be and need them to be, and we’ve seen them come to prominence. It’s no coincidence that this happens as we see a shuttering of the public sphere, a closing down of those spaces.
When we think about this notion of free speech in these arenas it’s less like the public square. It’s more like the private shopping mall. When they don’t like a particular kind of speech they’re actually incredibly empowered to shut it down and they do so already all over the world, in places like Turkey and elsewhere.
Papantonio: Legally, Facebook is allowed to block whatever content they see fit. I mean, isn’t that the quick answer here? It is a private enterprise. This is not like our public airwaves. It’s not part of the public commons, as you say, and because of that they do have a lot of control. They can shut down what they want, so with that shouldn’t we hold them to a higher standard to do a better job? I know, look, it’s all about profit, and I recognize the numbers are extraordinary, but with those extraordinary profits and extraordinary numbers comes an extraordinary responsibility. That responsibility is to get ahead of this. Do you ever see them really getting ahead of this problem, Sarah?
Sarah Roberts: Mike, I appreciate you highlighting this issue, because one of the most important pressure levers on firms like this is the court of public opinion, and this horrible tragedy in Cleveland has put people in the position of asking Facebook and other platforms: just what do you intend to do to protect our rights to express ourselves while protecting people’s rights to safety, and to keep violence like this off the platforms? Because they’re largely immune in the court of law and in other ways, this is one of the few places where pressure can be applied. I think through that pressure the firms will ask themselves how they can do better. Unfortunately, in the case of Facebook, the forward-facing, public-facing side of that really isn’t acknowledging this problem significantly and hasn’t really satisfied the public in that regard to date.
Papantonio: In about 30 seconds, Sarah: I’ve been an attorney for 35 years. I’ve seen the ways the law evolves, the ways good lawyers find ways to solve these problems. Do you see them making inroads? Do you see lawyers making any inroads toward solving this problem with the 1996 act?
Sarah Roberts: It’s certainly put it on the map. It’s put this issue on notice. One thing I’ve been thinking about in the fallout of this horrible case is that as that material stayed up, the murder of this senior citizen in Cleveland stayed up for two hours, we know that people viewed it. Thousands of people viewed it. Where are those people, and what harm has been done to them by perhaps coming across this material quite innocently, without knowing what they were going to see, and being exposed to that? What about their harm? That’s one way I’m thinking about it. From a legal perspective, is there something to that?
Papantonio: That’s the way I see it too, Sarah. Thank you for joining me. Thank you for your work out there.
Sarah Roberts: I appreciate you. Thank you.