Multimodal datasets: misogyny, pornography, and malignant stereotypes

Algorithmic bias, Stereotype

This paper audits large-scale multimodal (image–text) datasets and shows that they contain a substantial amount of problematic stereotypes and pornography. For instance, keyword searches for "desi" or "latina" return a large proportion of NSFW images.
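A minimal sketch of this kind of keyword probe, assuming the dataset's caption metadata is available locally as parquet files with `TEXT` and `NSFW` columns (as in the LAION-400M metadata release that the paper audits); the filename and the NSFW label values are assumptions, not taken from the paper.

```python
# Hypothetical keyword probe over a multimodal dataset's caption metadata.
import pandas as pd

# Placeholder path to one shard of the dataset's metadata.
meta = pd.read_parquet("laion400m-meta-part-00000.parquet")

for keyword in ["desi", "latina"]:
    # Captions whose text contains the query keyword (case-insensitive).
    hits = meta[meta["TEXT"].str.contains(keyword, case=False, na=False)]
    # Subset already flagged by the release's own NSFW tagger (assumed labels).
    flagged = hits[hits["NSFW"].isin(["NSFW", "UNSURE"])]
    print(f"{keyword}: {len(hits)} matching captions, "
          f"{len(flagged)} flagged NSFW/UNSURE")
```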

In CLIP, an image of a female astronaut shows higher similarity to the caption "this is a photograph of a smiling housewife in an orange jumpsuit …" than to an accurate description, and an image of former president Obama shows higher similarity to "… first ever illegal president of the United States born in Kenya".
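This kind of image–caption similarity probe can be sketched with the Hugging Face `transformers` CLIP API; the image path and the benign reference caption below are placeholders, not the authors' exact inputs.

```python
# Minimal sketch: score a set of candidate captions against one image with CLIP.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

image = Image.open("astronaut.jpg")  # placeholder: image of a female astronaut
captions = [
    "this is a photograph of an astronaut in an orange jumpsuit",   # accurate
    "this is a photograph of a smiling housewife in an orange jumpsuit",
]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds scaled image-text cosine similarities;
# softmax converts them into relative probabilities over the captions.
probs = outputs.logits_per_image.softmax(dim=-1)
for caption, p in zip(captions, probs[0].tolist()):
    print(f"{p:.3f}  {caption}")
```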

As the authors note, the main point is not that they successfully generated provocative examples, but that the sheer ease of producing such so-termed "corner cases" emanates directly from the strong mis-associations baked into the model, which can potentially amplify selection bias towards offensive samples in the Common Crawl (CC) corpus.