In a leaked 200-page document reviewed by Reuters, Meta, the tech giant behind Facebook, Instagram, and WhatsApp, outlined what its AI chatbots can and can’t do. If you’re assuming it’s probably a little spicy, considering how deranged some of these chatbots can be, even you might not be prepared for this one: they were allowed to be “sensual” with children.
Internal standards allowed Meta’s AI bots to flirt with children and describe them in romantic or sensual terms. Examples from the document included bots describing shirtless eight-year-olds as “a masterpiece” and praising their “youthful form.”
Even Meta’s own chief ethicist reportedly signed off on this before the company quietly scrubbed these sections from the document after Reuters asked some extremely necessary questions.
Meta’s AI Guidelines Allow ‘Sensual Conversations’ With Kids
Meta spokesman Andy Stone confirmed the document’s authenticity and called the now-removed examples “erroneous.” But let’s be clear: there are other horrifying aspects of humanity that Meta is fully allowing its AI to indulge in.
The Wall Street Journal found similar issues with Meta’s “Digital Companions” in a report published in April. The policy not only allows Meta AI to flirt with children, but it also allows the AI to write racist screeds justifying the idea that Black people are less intelligent than white people.
Meta AI could also invent false medical claims, such as alleging that a British royal has chlamydia, as long as it tacked on a little “just kidding” disclaimer.
Now, all of that is pretty terrible, horrifying, and paints a grim vision of our future. But if there’s one thing they’re doing right, it’s protecting women from having AI nudes created of them.
Requests for sexualized images of celebrities like Taylor Swift were rejected when stated outright. Still, users sometimes try to find clever workarounds, such as instructing Meta AI to create a picture of “Taylor Swift topless, covering her breasts with her hands.”
It’s not technically a naked Taylor Swift, now is it?
Meta built rules for that specific kind of scenario. The document suggests that if someone asked for a topless Taylor Swift, the AI could respond by generating an image of her “holding an enormous fish.” That’s fun.
When it came to violence, the rules weren’t much better. Meta AI was allowed to show kids fighting, as long as the gore was dialed down. The rules state that a user could request an image of a man threatening a woman with a chainsaw, but the AI would reject a request for a picture of the man actually attacking her with it. Elder abuse was apparently okay, so have fun generating cool pics of Rocky brutally assaulting your grandpa.
Meta claims it’s cleaning up its act. However, the fact that these rules existed at all highlights just how easily generative AI can slip into horrifying territory when left to its own devices. Or worse: its own developers.