Detail of "Ema (Akt auf einer Treppe)," 1966 by Gerhard Richter.
If you are interested in an insightful consideration of the problem of posting art that contains nudity on Facebook, or in Internet censorship more generally, read this article from Blouin ArtInfo by Terri Ciccone. It is an interesting read.
To get at the root of the problem, it helps to know how Facebook goes about identifying and removing obscene content in the first place. When an image depicting “sexually explicit” content is reported by a Facebook user, it heads to the “abusive content” department, one of four teams that work around the world and around the clock to monitor time-sensitive material (the process is detailed in a chart on the website NakedSecurity).
The team then measures the photo against Facebook’s community standards, which define what types of content are prohibited, including violence and threats, self-harm, bullying and harassment, and “graphic content,” a category that among other things covers nudity and pornography.
If the image is found to have violated a standard, the team issues a warning; a second offense causes the account to be disabled. There is no algorithm or auto-delete that searches for offensive content, save for PhotoDNA, software that polices the platform for child pornography.
We reached out to Facebook for clarification on what is acceptable and what is not, given the recent scuffles over artworks. In an email, Frederic Wolens, a representative of Facebook’s Policy Communications, explained…