A woman booked an Airbnb for two and a half months, left after seven weeks, and thought that was the end of it. Then came the host’s damage claim: $16,000 worth of broken furniture, stained mattresses, and destroyed appliances.
That’s what happened to a London-based woman who rented a Manhattan apartment earlier this year. After leaving the unit early because she didn’t feel safe, her host accused her of breaking a robot vacuum, trashing a microwave, damaging a couch, ruining a mattress, and cracking a coffee table—among other things. He submitted photos to Airbnb to prove it.
Except she says those photos weren’t real. Not “staged” or “misleading.” Fabricated. As in: potentially generated with AI.
Airbnb Guest Denies Causing $16K in Damages, Claims Host Faked Evidence With AI
“I clearly demonstrated visual discrepancies in images of the same object,” she told The Guardian. “These inconsistencies are simply not possible in genuine, unedited photographs of the same object.” The coffee table, for instance, appeared to have different patterns of damage in each photo.
The host, listed as a “superhost” on Airbnb, didn’t respond to The Guardian’s questions. But Airbnb initially sided with him, telling the guest she owed $7,000. She appealed. She offered an eyewitness who could confirm the condition of the apartment. She pointed out the image inconsistencies. Nothing moved the needle until The Guardian started asking questions.
Five days later, Airbnb reversed course. They refunded her $670. Then $1,140. Finally, they offered her a full refund of the $5,700 she’d paid for the stay and removed the negative review the host had left on her profile.
Airbnb also issued an apology and launched an internal review. They warned the host that another incident could get him kicked off the platform. “We take damage claims seriously,” the company said. “Our specialist team reviews all available evidence.”
But the guest worries that next time, the person on the receiving end of a deepfaked damage claim won’t have the energy—or evidence—to push back. “Given the ease with which such images can now be AI-generated and apparently accepted by Airbnb despite investigations,” she said, “it should not be so easy for a host to get away with forging evidence in this way.”
In a world where photos can lie and superhosts might be testing the limits of Photoshop or AI filters, your best defense might not be your credit card. It’s a witness, screenshots, and a good lawyer—or at least a journalist.