The Anne Frank Center for Mutual Respect last week called on its 112,000 followers to sign a petition demanding that Facebook remove pages denying the Holocaust from its site. That didn't happen.
Instead, a week later, Facebook removed a news article that the center posted about Holocaust education "for apparently violating community standards," the center said. The violation: It included a photo showing naked, emaciated children from a Nazi concentration camp.
You read that right.
It was the second time in as many months that Facebook has drawn criticism over its handling of Holocaust-related content. Just last month, Facebook CEO Mark Zuckerberg, who is Jewish, defended Holocaust deniers on the Recode Decode podcast, saying "I don't think that they're intentionally getting it wrong." Many people disagreed, pointing out that Holocaust denial is a well-documented form of malicious anti-Semitism.
It's also the second time Facebook has gotten in trouble for removing photojournalistic images.
In 2016, Facebook pulled down a story from a Norwegian newspaper, Aftenposten, because it included the Pulitzer Prize-winning photograph of a naked Vietnamese girl fleeing a napalm attack.
Over the past few months, the tech company's moves have drawn criticism from lawmakers, pundits and President Donald Trump for what they claim is censorship. "Social Media Giants are silencing millions of people," Trump tweeted at the time.
Earlier this month, Facebook -- along with Google, Apple and others -- banned the conspiracy theorist Alex Jones and many parts of his Infowars publication. Jones is known for using social media and his site to spread false assertions that school shootings were faked and that the survivors were part of a conspiracy.
The incidents indicate Facebook's human employees, not its algorithms, continue to struggle to tell the difference between false news, propaganda, pornography and legitimate content. That's despite the company's move to hire more security and content moderators and revamp its community standards.
This and other questions are likely to come up on Sept. 5, when Facebook COO Sheryl Sandberg, Twitter CEO Jack Dorsey and an as-yet-unnamed Google executive testify before the Senate Select Committee on Intelligence and the House Energy and Commerce Committee.
The wrong call
The center's post, a link to an article, got Facebook's attention because it included a photo of naked, emaciated children from a Nazi concentration camp. That image, Facebook said in a statement, violated its rules against nudity. "As our Community Standards explain, we don't allow people to post nude images of children on Facebook," it said.
When Facebook initially removed the post on Aug. 27, the center sent a request for an explanation. It didn't receive a response from Facebook until after its public tweet, when the company made an exception because the photo was "newsworthy, significant or important to the public interest."
Facebook didn't immediately respond to a follow-up question asking why it waited two days.
"If Facebook is serious about its community standards, it should start tackling Holocaust denial and not the organizations who are trying to educate people on discrimination, facts, and history," the center said in a statement.
First published Aug. 29 at 6:30 p.m. PT.
Correction, 8:37 p.m. PT: Corrects to indicate that Zuckerberg's interview with Recode was in July, not August.