
Facebook’s algorithm identifies an image of onions as “sexually explicit” (and it is not the first time something like this has happened)

May 27, 2021

A few days ago, The Seed Company, a Canadian firm in the agricultural sector, decided to start an advertising campaign on Facebook to promote seeds of a popular French variety of onion known as Walla Walla.

But, to its manager’s surprise, the social network notified him that the ad had failed the usual review process, because it is forbidden to “position products or services in a sexually suggestive way”.


The problem (and the reason for the surprise) is that the photo accompanying the ad showed nothing more than several Walla Walla onions in a wicker basket, one of them cut in half.

What happened, then? The fault lies not with some fetishist Facebook content moderator, but with the platform’s automated moderation systems, which are based on artificial intelligence.

Jackson McLean, the manager in question, assumes that, in the eyes of the algorithm, “the two round shapes in the image could have been misinterpreted as breasts or something like that”. Still, he is surprised that Facebook speaks of ‘sexually explicit’ content, “as if it were impossible not to see something sexual there”.

The company’s own Facebook page took the controversy with humor: “We think this is what the Facebook algorithm saw.”

McLean was confident that “some real human can take a look at the photo and conclude that it is not really sexual at all: they are just onions.”

Sure enough, shortly afterwards Facebook approved the ad and sent a statement of apology to The Seed Company. Meg Sinclair, Facebook Canada’s communications manager, confirmed that the problem lay with the artificial intelligence:

“We use automated technology to keep nudity out of our apps, but sometimes it doesn’t distinguish an onion from, well, you know…”

Does artificial intelligence have a dirty mind?

This is not, however, the first case of its kind: three years ago, London’s Metropolitan Police developed image-recognition software to help detect crimes, confident that the task could soon be made fully automatic.

However, they realized that the AI they had developed identified desert dunes (a ubiquitous element in wallpapers) as images of nude bodies. Mark Stokes, head of the department’s digital forensics section, had to admit that the tool was perhaps not yet “sophisticated enough”.

A year earlier, Instagram’s AI had also shown its ‘dirty mind’ by confusing a typical British cake with a woman’s breast, which led to the suspension of the bewildered baker’s account.

The reason for all this confusion is that artificial intelligence does not see an image the way the human eye does: we mostly perceive shapes, whereas machines are better at recognizing textures. That lets them spot things that escape us, but it also makes them easier to manipulate.
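To make that last point concrete, here is a minimal sketch of the classic fast gradient sign method (FGSM), a well-known way of manipulating an image classifier: it nudges every pixel by an amount a human barely notices, leaving the shapes intact while shifting the texture-level statistics the network relies on. The pretrained ResNet-18 model and the onions.jpg input file are illustrative assumptions; this is not Facebook’s actual moderation system.

```python
# A minimal FGSM sketch (Goodfellow et al., 2014) against a generic
# pretrained classifier. The model and "onions.jpg" are illustrative
# assumptions, not Facebook's actual moderation pipeline.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])
image = preprocess(Image.open("onions.jpg").convert("RGB")).unsqueeze(0)
image.requires_grad_(True)

# The model's original prediction.
logits = model(image)
label = logits.argmax(dim=1)

# FGSM: push every pixel slightly in the direction that increases the loss.
loss = torch.nn.functional.cross_entropy(logits, label)
loss.backward()
epsilon = 0.03  # perturbation small enough to be barely visible to a human
adversarial = (image + epsilon * image.grad.sign()).clamp(0.0, 1.0)

with torch.no_grad():
    new_label = model(adversarial).argmax(dim=1)

# The shapes a human sees are unchanged, but the texture-level statistics
# have shifted enough that the prediction often changes.
print("before:", label.item(), "after:", new_label.item())
```

Tiny perturbations like this routinely flip a classifier’s output even though a person cannot tell the two images apart, which is exactly the gap between shape-driven human vision and texture-driven machine vision described above.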

Via | BBC