Anker's Eufy security camera division ran a controversial campaign offering customers $2 per video to help train AI theft-detection systems, raising privacy and security concerns about the company's broader data collection practices.

Anker's Eufy security camera division ran an unusual campaign from December 2024 to February 2025, offering customers $2 per video to help train AI theft-detection systems. The program specifically sought footage of package thefts and car-door break-ins, and users were encouraged to stage fake theft scenarios if they lacked real incidents. According to campaign comments, more than 120 users participated, and the company aimed to collect 20,000 videos of each theft type. The campaign marks a shift toward tech companies paying users directly for valuable training data rather than harvesting it without explicit compensation.
The scope of Eufy's data collection extends far beyond the paid campaign. Through ongoing "Video Donation Programs," the company has accumulated hundreds of thousands of user videos; the app's "Honor Wall" shows the top contributor has donated more than 201,531 video clips. Users earn badges, gift cards, and camera equipment in exchange for sharing footage. Eufy explicitly requests videos involving humans and promises the data won't be shared with third parties, but the company's record of privacy violations casts doubt on those commitments.
Eufy's assurances about data protection ring hollow given that record. In late 2022, The Verge revealed that Eufy had misled customers about end-to-end encryption: camera streams remained unencrypted when accessed through the company's web portal, despite marketing claims to the contrary. After initially denying the problem, Anker admitted in early 2023 that it had misled users and promised fixes. This history of deceptive privacy practices makes current promises about protecting AI training data particularly concerning, especially as the company extends its collection to baby monitor footage.
The Eufy case exemplifies the emerging risks of user data monetization, where companies offer small payments for potentially sensitive personal information. Similar concerns surfaced recently with the Neon calling app, which paid users to share call recordings and transcripts before a security flaw exposed all of its user data. Direct compensation gives users some agency over their data, but it also creates new vulnerabilities and privacy risks. As AI training datasets grow more valuable, expect more companies to experiment with pay-for-data models, making user awareness of data rights and security practices more critical than ever.