Once, I encountered the funny story of an AI image descriptor with a sheep obsession. It had been trained on pictures of fields of sheep, so it tagged anything in a field as 'sheep', including an empty field, because it works on statistical probability. It sees a field and thinks "ah, a field! there's probably a sheep here." (It's a bit more complicated than that, but basically that.) It also couldn't recognise sheep in places that weren't fields, such as petrol stations or barns.
Now, the alarming aspect of this story is that the very same technology is probably what tumblr is using to identify porn. If it can't tell that an empty field is not, in fact, full of sheep, what hope do we have that it can tell an empty room isn't full of writhing human forms engaged in passionate coitus?
this really does sound like an episode of black mirror