What to Make of Google's Decision to Block the 'Innocence of Muslims' Movie
SEP 14 2012, 4:50 PM ET
The attacks on U.S. missions abroad this week have been a test for Google's "bias in favor of free expression."
Wednesday morning must have been a nightmare for the people who work at YouTube. Late the night before, angry demonstrators had attacked the U.S. missions in Cairo and Benghazi, killing four Americans, purportedly provoked by an American-made video that vilified and mocked Muhammad. That video, like pretty much all videos these days, was available on YouTube, a site where Google (which owns YouTube) has the power to block access to content on a country-by-country basis. By midday Wednesday, the company had decided that that was just what it was going to do.
A YouTube spokesperson explained via email:

"We work hard to create a community everyone can enjoy and which also enables people to express different opinions. This can be a challenge because what's OK in one country can be offensive elsewhere. This video -- which is widely available on the Web -- is clearly within our guidelines and so will stay on YouTube. However, given the very difficult situation in Libya and Egypt we have temporarily restricted access in both countries. Our hearts are with the families of the people murdered in Tuesday's attack in Libya."

YouTube is in a tough spot here. It certainly doesn't want to play any part, even an indirect one, in fueling violence that has already resulted in four American deaths. But censoring the video also cuts against Google's stated ideology, which has a "bias in favor of free expression -- not just because it's a key tenet of free societies, but also because more information generally means more choice, more power, more economic opportunity and more freedom for people." Google's top leaders have championed the power of the Internet to make society more free by making the Internet more free, and the company has been a vocal and constant critic of China's efforts to control what people do and say online. In certain instances, Google has prominently defied a government's request to remove content, such as when it protected videos documenting police brutality here in the United States.

This is not to say that Google is absolutist about free expression. Quite the contrary: Google has made a point of saying that it takes takedown requests very seriously, and for the past two years it has released a Transparency Report every six months detailing just how much content it has removed and where. Google has a careful but somewhat opaque process for determining whether to comply with a government's request, taking into account a country's local laws (e.g., Germany's prohibition on pro-Nazi content) and whether the request is appropriately narrow, as Google analyst Dorothy Chou explained to me earlier this year. Additionally, YouTube has a pretty reasonable set of "Community Guidelines" that prohibit sexually explicit content, "bad stuff like animal abuse, drug abuse, under-age drinking and smoking, or bomb making," and hate speech.
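To see why that process matters here, it may help to sketch it as a decision procedure. This is a toy model of my own, assuming the two criteria Chou described (grounding in local law, and narrowness of scope) are the whole test; the names TakedownRequest and evaluate are my illustration, not anything Google has published:

```python
from dataclasses import dataclass

@dataclass
class TakedownRequest:
    """A government's request to remove or restrict a piece of content."""
    country: str           # requesting government's country code
    video_id: str
    cites_local_law: bool  # does the request point to a specific statute?
    is_narrow: bool        # targets specific content, not a broad category

def evaluate(req: TakedownRequest, violates_guidelines: bool) -> str:
    # Content that breaks the site's own Community Guidelines comes down
    # everywhere -- no government request required.
    if violates_guidelines:
        return "remove globally"
    # Otherwise, honor only requests grounded in local law (think of
    # Germany's ban on pro-Nazi content) and appropriately narrow in
    # scope -- and even then, restrict only within that country.
    if req.cites_local_law and req.is_narrow:
        return f"restrict in {req.country} only"
    return "decline"

# The puzzle of this week's decision, in these terms: a video within
# guidelines, with no local law cited -- and yet restricted anyway.
print(evaluate(TakedownRequest("EG", "innocence-of-muslims", False, True),
               violates_guidelines=False))  # -> "decline"
```

By its own stated logic, in other words, this video should have fallen through to the last branch.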
So content removal is nothing new to Google, and it works hard and throws a lot of people at managing takedown requests. But, even considering that context -- or, perhaps, especially considering that context -- Google's decision to block access to the "Innocence of Muslims" video is unprecedented. Even by Google's own assessment, the video is "clearly within our guidelines" -- meaning it is not hate speech and does not otherwise violate the website's terms of service. (An update from a YouTube spokesperson by email late today added that the video had additionally been blocked in India and Indonesia, where it was, in fact, in violation of local law.)
Why did Google make such an extraordinary decision? Perhaps it felt somehow culpable for the deaths or feared that if the video spread further, more would come; perhaps -- as the Los Angeles Times thinly suggested -- it felt pressured to do so by Obama administration officials. Google, for its part, won't say, and we are left guessing. (I have twice criticized Google's Transparency Report for, ironically, not being very transparent on this key question of how Google makes decisions regarding what content to remove.)
There are reasons both practical and abstract to be concerned about Google's decision here. The first is the head-scratcher: Did Google really think that blocking access to the video would quell the outrage? The video has been discussed on Egyptian television, and of course will continue to be available online on non-Google sites. It wouldn't be so hard for one person to mask his or her IP address, access and screen-capture the video, and then upload and share it through any number of means. Once something like this is out of the bottle, blocking it on YouTube does not put it back in. (To complicate matters, although early reports pointed to the video, it seems that the attacks in Libya were long planned and had other provocations entirely.) The protests have since spread to many other cities, including Khartoum, Tunis, and Chennai. Is YouTube going to extend its ban in lockstep?
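The futility is easy to make concrete. A country-level block is, at bottom, a lookup keyed on where a request appears to originate; here is a minimal sketch (my own illustration, not YouTube's actual implementation -- the table and function are hypothetical) of why a proxy or masked IP address defeats it:

```python
# Hypothetical per-video restriction table, for illustration only.
BLOCKED_IN = {"innocence-of-muslims": {"EG", "LY", "IN", "ID"}}

def is_viewable(video_id: str, apparent_country: str) -> bool:
    """The server can only check where the request *appears* to come from."""
    return apparent_country not in BLOCKED_IN.get(video_id, set())

assert not is_viewable("innocence-of-muslims", "EG")  # blocked in Egypt...
# ...but the same viewer, routed through a proxy or VPN abroad, simply
# appears to be elsewhere, and the block evaporates.
assert is_viewable("innocence-of-muslims", "US")
```

The block never touches the video itself; it only filters one path to it, for viewers who make no effort to disguise where they are.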
It would be one thing if blocking the video somehow restored calm in these parts, but without such benefits to consider, why pursue the ban? At best it does nothing; at worst it provokes an even more dramatic backlash. Google has put a lot at stake in this move: its reputation as a defender of free expression, its perceived independence from the wishes of the U.S. government, its ability to weather similar storms in the future. As Kevin Bankston, director of the free expression project at the Center for Democracy and Technology, told The New York Times, blocking the video "sends the message that if you violently object to speech you disagree with, you can get it censored."
Free expression is not pretty. People like to gussy it up as though it were the white knight of those who speak truth to power, as some sort of superpower for activists and journalists around the world. That is the dream, the First Amendment we inscribe on the facades of our museums and drape above the daises of awards dinners. But freedom of speech, in its deepest sense, is the foundation of our centuries-old experiment with self-governance. It is about whether society can take ideas, process them, and somehow build something out of them. That can be a very ugly and even dangerous process, something you commit to only because you have unwavering faith in humans to work it all out in the end, or because you believe that the alternative -- any alternative -- is worse.
This week was a test for Google's commitment to that idea, and the company seems to have come up short. But that misstep is, paradoxically, a symptom of Google's admirable and hard-fought struggle to do right by its users when it comes to freedom online. Long ago, Google could have simply said that it would not, under any circumstances, remove any content. It wouldn't have had to make this choice, nor the thousands of others it has weighed over the years. Absolutism is easy.
But Google is trying to do something much harder, and much more thoughtful, than that. As it said in its statement outlining its approach to freedom of expression, "We recognize that there are limits. In some areas it's obvious where to draw the line. For example, we have an all-product ban on child pornography. But in other areas, like extremism, it gets complicated because our products are available in numerous countries with widely varying laws and cultures."
In choosing that path, Google is bound to make mistakes, because navigating that terrain -- particularly as its sands shift constantly underfoot -- is just plain difficult. But we should not wish that it had chosen a different path. What we should wish is that it learns from those mistakes and becomes a wiser, more forward-thinking Google. If not, that's when we'll really have reason for concern. Because if nothing else, what this whole incident shows, once again, is that Google is acting like a court, deciding what content it keeps up and what it pulls -- all without the sort of democratic accountability or transparency we have come to expect on questions of free expression and censorship. It has gone into uncharted waters, and it has taken us along with it.