Amazon rolled out Rufus, a new chatbot, to all US shoppers in September. It’s designed to help you find things to buy and answer questions.
It can also help find cheaper versions of popular name-brand items, but you have to ask it indirectly: It won’t engage with direct asks for “dupes” or copycats of more expensive products.
The new chatbot illustrates Amazon’s complicated relationship with copycat items: The company forbids counterfeit or illegally replicated products on its platform. But it sells plenty of products that are merely similar to the brands you know from the mall. That’s where this AI chatbot can come in.
I tested the bot and found it helpful when I asked for gift suggestions for a 2-year-old’s birthday. It could also answer questions on product details and help with customer service issues.
The one thing Rufus simply will not do is help you find a “dupe.”
Maria Boschetti, a spokesperson for Amazon, couldn’t confirm that the term “dupe” is forbidden in Rufus, but told Business Insider, “as we continue to test and learn with generative AI, Rufus, and search may provide different experiences.”
“Dupe” is common parlance for a cheaper product that closely resembles a luxury or trendy item. On social media, influencers often talk about being excited to find a “dupe” for items like makeup, skincare, or clothing. My former colleague Jennifer Ortakales Dawkins wrote last year about how Gen Z is the “dupe generation” — they aren’t ashamed of having the off-brand version of a fancier item; they’re even proud of it. Notably, the dupes most people talk about online aren’t for luxury or high-end designer items; they’re often for mall brands like Uggs, Lululemon, and Skims — or Sephora staples like Charlotte Tilbury makeup or Supergoop sunscreen.
When I asked Rufus to find me a “dupe” — using the word “dupe” — of various products that are known for having dupes, it shut me down immediately with: “Sorry, I can’t help with that.”
However, if I worded my request differently and instead asked for an item similar to a name brand — without using the word “dupe” — it was happy to oblige:
It’s worth noting that the leggings Rufus recommended as similar to Lululemon pants — “CRZ Yoga Women’s Naked Feeling Workout Leggings” — really are very similar to the Lululemons.
I tried asking Rufus for items that people on social media often seek “dupes” for, like Drunk Elephant skincare (popular with the preteen and teen set at Sephora), Ugg boots, and Adidas Samba sneakers. Nope, nope, and nope. Rufus refused my requests for “dupes,” although it would happily offer up alternate products if I asked for a cheaper version of any brand-name product — without using the word “dupe.”
The popularity of these look-for-less items, often fueled by TikTok, could very well be driving a lot of sales for Amazon: The CRZ leggings Rufus recommended have more than 10,000 reviews on the platform.
It’s unclear exactly what Amazon’s parameters are for a product to be considered a “dupe.” The e-commerce giant works to take down counterfeit items and products that violate copyright or intellectual-property rights. And Amazon’s policies forbid sellers from using the term “dupe” in their product names or descriptions.
Amazon has warehouses full of products that aren’t brand names people would immediately recognize — but might be similar to them. There is a wide array of yoga leggings in the $30 range from brands you’ve never heard of, like “CRZ.” How is a shopper supposed to know which one to buy? In theory, Rufus should be able to help — AI should help people compare similar items and answer questions about the products.
Still, Amazon’s AI likely won’t give you the juicy reviews you really want — like those from TikTokers who recommend women’s fashion, or YouTubers who compare vacuums. Those creators have the editorial freedom to speak openly about whether a product is a high-quality replica of a more expensive version. For now, Amazon’s Rufus probably won’t.
The post Amazon’s AI chatbot will help you shop. Just don’t ask it for a ‘dupe.’ appeared first on Business Insider.