A couple of questions describe the problem with censorship: Who controls the censors? What biases do they have?
– Mike “Mish” Shedlock
For a few hours after The New York Times published an article about conflict and hunger in Yemen, Facebook temporarily removed posts from readers who had tried to share the report on the social platform.
At issue was a photograph of a starving child.
The article included several images of emaciated children. Some were crying. Some were listless. One, a 7-year-old girl named Amal, was shown gazing to the side, with flesh so paper-thin that her collarbone and rib cage were plainly visible. Tens of thousands of readers shared the article on Facebook, but some got a message notifying them that the post was not in line with Facebook’s community standards.
Facebook had addressed the issue by Friday night.
“As our community standards explain, we don’t allow nude images of children on Facebook, but we know this is an important image of global significance,” a spokeswoman said in an emailed statement. “We’re restoring the posts we removed on this basis.”
It took Facebook a few hours to realize it had made a mistake in removing brutally honest images of the effects of the civil war in Yemen.
The images expose the blatant hypocrisy of the US in backing the corrupt Saudi Arabia regime in its war in Yemen.
This was not a nude image. It was not a "community standards" violation. Nor was there any doubt about the image's authenticity.
Any censor can judge “community standards” however they want, but Facebook is an international phenom, not Podunk USA.
Facebook could have and should have said "we f*ed up yet again," but don't expect that.
Rather than rejecting that image, Facebook should have promoted it.
Instead, we had temporary censorship. Next time it might not be temporary.