BBC finds Facebook failed to remove child abuse (2017)

Author: notRobot

Score: 39

Comments: 10

Date: 2020-10-28 10:50:57


________________________________________________________________________________

codnee wrote at 2020-10-28 11:00:50:

"A BBC investigation has revealed that 80 per cent of child abuse images it reported to Facebook were not removed. Facebook responded to the allegations by requesting the BBC send examples of the material to it, then reporting the team to the authorities for sending them."

Edit: The title previously read (I am paraphrasing) "BBC finds FB failed to remove child abuse. FB reports BBC to authorities."

Kim_Bruning wrote at 2020-10-28 18:21:49:

Making the information illegal, instead of the act, leads to these kinds of funny situations, where the reporter gets reported.

Evidence of child abuse, war crimes, and certain forms of violence in general seems to be illegal, or at least unwanted, in and of itself; and thus tends to get deleted by social networks.

ulucs wrote at 2020-10-28 12:13:01:

Why was the title changed? It looks like the previous one was much better.

It is also editorialized now, which the HN mods claim not to like?

notRobot wrote at 2020-10-28 12:30:52:

I wasn't able to fit the year into the original title, so that's probably the reason.

ethanwillis wrote at 2020-10-28 11:11:59:

"The moderation clearly isn't being effective. I would question whether humans are moderating this, are looking at this, and also I think it is failing to take account of the context of these images."

Once again: unless something gets big enough, a human is never involved. The digitization of bureaucracy is a problem for our society. You would reckon that surely a company, especially one of Facebook's size, would have all reports in this category reviewed by humans and not _just_ algorithms. A false negative in these cases is not acceptable.

holtkam2 wrote at 2020-10-28 16:21:27:

I feel like there is a perverse incentive for FB to police its content in a very un-stringent manner, due to the fact that more content = more means of holding its users' attention. Like, yeah, we know there's some bad stuff on our site, but ugh, taking it down would lose some viewers' attention and cost us money... hmm, maybe we'll pretend we didn't see it.

swiley wrote at 2020-10-28 12:20:07:

I've been saying this for a while. People think these huge moderated platforms are somehow better than other parts of the internet but they're really not.

Lots of pedophiles and abusers seem to just use Facebook and a decent number get away with it.

dathinab wrote at 2020-10-28 12:42:26:

Yup, it's super depressing, but you don't need to go to the "dark web" at all to do any of these bad things, and you still have a good chance of getting away with it.

I mean, one time a few years ago I did use youporn (private browsing, no account) and ran into obvious child pornography :-(. (I'm really glad it became obvious what was going on before anything explicit was shown, but damn, those were little kids on a supposedly serious porn website.)

And while they're maybe not the best example, you can find endless ~studies~ stories (edit: autocorrect mess-up) about criminals using Facebook and similar platforms to connect to, pressure, and blackmail children.

Nextgrid wrote at 2020-10-28 20:38:06:

Brian Krebs once reported on several card-fraud groups operating in the open on Facebook with total impunity.

I have personally seen livestreams of bottom-of-the-barrel “gangsters” boasting about stolen bikes and trying to sell them live.

Both of these operate completely in the open and could be detected with simple regular expressions. The problem is that Facebook doesn't care, and will not care until it becomes liable for it.
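To make the comment's claim concrete, here is a minimal sketch of the kind of pattern matching it alludes to: a regex for card-number-like digit runs combined with a Luhn checksum to cut false positives. The pattern, the `flag_post` helper, and the thresholds are illustrative assumptions, not anything Facebook is known to use.

```python
import re

# Digit runs of 13-19 digits, optionally separated by spaces or hyphens
# (typical payment-card formats). Purely illustrative.
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,19}\b")

def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def flag_post(text: str) -> bool:
    """Flag a post if any matched digit run looks like a valid card number."""
    for match in CARD_RE.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if 13 <= len(digits) <= 19 and luhn_valid(digits):
            return True
    return False
```

A real moderation pipeline would of course need far more than this (context, allow-lists, human review), but the point of the comment stands: the crudest possible filter already catches posts that openly paste card dumps.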

malloreon wrote at 2020-10-28 21:23:55:

I'm curious if Facebook has done anything to fix this in the 3 years since this article.