COO Sheryl Sandberg has acknowledged that Facebook was wrong to delete posts showing an iconic image of a naked girl fleeing a napalm attack during the Vietnam War, according to a Reuters report. The admission came in a letter to Norwegian Prime Minister Erna Solberg, which Reuters obtained on Monday.
Facebook last week repeatedly deleted the Pulitzer Prize-winning photo, “The Terror of War,” on grounds that it violated its nudity restrictions.
The company’s reversal of course came in response to a global firestorm over its censorship. The controversy began when Norwegian author Tom Egeland posted the image as one of seven war-related photographs accompanying an entry on the history of warfare. The photo shows Phan Thi Kim Phuc, age 9 — naked because her clothes had been burned off her body — fleeing in terror with several other children.
The editor of Aftenposten, Norway’s largest daily newspaper, wrote an open letter to CEO Mark Zuckerberg, criticizing Facebook’s handling of the issue. Facebook’s Hamburg office had sent his publication an email demanding that it remove the photo from its page. Before the paper’s editor, Espen Egil Hansen, could respond, Facebook deleted the image and the accompanying article from the paper’s social media page.
Hansen blasted the action in his letter, telling Zuckerberg that Facebook, which functions as a powerful media platform, was engaging in censorship.
“I think you are abusing your power and I find it hard to believe that you have thought it through thoroughly,” Hansen wrote.
Support for the original post quickly became so widespread that Solberg reposted the image to her page, which Facebook also deleted.
“What Facebook does by removing images of this kind, good as the intentions may be, is to edit our common history,” Solberg wrote in a Facebook post. “I hope that Facebook uses this opportunity to review its editing policy, and assumes the responsibility a large company that manages a broad communications platform should take.”
Before Facebook reversed course, free speech advocates, journalists and other critics called it out for ignoring the historical context of the photograph and relying far too much on its computerized algorithms.
The image of a naked child normally would violate Facebook’s community standards and might be classified as pornographic in some countries, noted Facebook spokesperson Andrea Saul. However, the company realized after the outcry that the specific image in question had great historical significance.
“Because of its status as an iconic image of historical importance, the value of permitting sharing outweighs the value of protecting the community by removal, so we have decided to reinstate the image on Facebook where we are aware it has been removed,” Saul explained.
Facebook would adjust its review mechanisms to permit sharing of the image going forward, she added.
The company will engage with publishers and other members of Facebook’s global community to improve its policies in order to promote free expression, Saul said.
The controversy follows a series of moves Facebook and other social media companies have made to monitor the growing problems of online harassment, hate speech, and the use of social media for terrorist activities. Facebook and other social media platforms have made changes to their service agreements and algorithms to monitor more closely what many consider abusive behavior online.
Facebook came under fire in May after a report accused the company of biasing trending topics against conservatives. In response, it made several changes to the way it curates content, in an effort to take the human bias out of the picture.
It’s possible Facebook went too far in relying on its technology to make common-sense decisions about the content appearing on its pages, with censorship of legitimate imagery and speech becoming an unintended consequence.
“I think it’s one more demonstration that Facebook has taken a pivotal role in publishing,” said Rick Edmonds, media business analyst at the Poynter Institute for Media Studies.
“Its standards bear scrutiny, whether they are algorithmic or not,” he said. “Human editors make blunders too, but this action is egregiously illogical, as the affected Norwegian journalists say.”