From what I've heard, a lot of the flagging was done by bots. The problem with bots is that you train an image-identifying program by showing it a bunch of labeled pictures, 'this is porn' and 'this is not porn', and it develops its own working definition of 'porn' and 'not porn'. You run it through enough labeled tests that it hits whatever degree of accuracy you want, then you let it loose on the unlabeled stuff.
In practice, most porn-seeking bots learn that porn has a lot of skin tones and curves. They never learn what human genitals look like, because telling the porn apart from the not-porn examples doesn't require it. So any picture with a lot of curves and skin tones will get flagged, from half-naked people snuggling to desert sand dunes.
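A minimal sketch of the idea, assuming entirely made-up data and a deliberately crude feature (skin-tone fraction), not any platform's actual filter: the classifier only ever sees what fraction of the image is skin-colored, so once it's trained on human-applied labels, anything sandy and curvy scores the same way porn does.

```python
# Hypothetical sketch: a binary "porn / not porn" classifier trained on labeled
# examples. The only feature it can see is the fraction of skin-toned pixels,
# so it will happily flag sand dunes. Data here is random noise just so the
# script runs end to end.
import numpy as np
from sklearn.linear_model import LogisticRegression

def skin_tone_fraction(image_rgb: np.ndarray) -> float:
    """Fraction of pixels falling in a rough skin-tone RGB range."""
    img = image_rgb.astype(int)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    skin = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)
    return skin.mean()

# Pretend training set: images plus human-applied labels (1 = porn, 0 = not).
rng = np.random.default_rng(0)
images = [rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8) for _ in range(200)]
labels = rng.integers(0, 2, size=200)

# Train until accuracy on the labeled tests looks good enough...
X = np.array([[skin_tone_fraction(img)] for img in images])
clf = LogisticRegression().fit(X, labels)

# ...then let it loose on unlabeled images. It can only judge by skin-tone
# fraction, so a sandy-colored dune photo gets scored like everything else.
dune_photo = rng.integers(150, 220, size=(64, 64, 3), dtype=np.uint8)
print(clf.predict([[skin_tone_fraction(dune_photo)]]))
```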