Apps like Kled AI, Silencio, Neon Mobile, and Luel AI are paying users for personal data including phone calls, texts, and videos that AI companies use to train their models.
A new wave of gig economy apps is emerging that pays users for their personal data, which is then sold to AI companies for model training. According to a Guardian investigation, apps like Kled AI, Silencio, Neon Mobile, and Luel AI are compensating users worldwide for sharing intimate moments of their lives, including phone calls, text messages, and videos of their surroundings.
The business model is straightforward: users download these apps and agree to share various types of personal data in exchange for small payments. The apps then package this data and sell it to AI companies that need diverse, real-world training data for their models. This includes everything from casual conversations to videos of public spaces.
This practice raises significant privacy concerns. Users may not fully understand how their personal information is being used or who ultimately has access to it. The data collected can include sensitive information that users might not want shared with third parties, even if anonymized.
From the AI companies' perspective, these apps offer a way to obtain training data that would be difficult or impossible to collect through other means. Real-world conversations, diverse accents, and varied environments are valuable for training AI systems that must work reliably across different populations and contexts.
The trend reflects the growing demand for training data in the AI industry. As companies race to develop more sophisticated models, the need for vast amounts of diverse data continues to increase. Traditional data sources like public websites and books are no longer sufficient for many AI applications.
However, the ethical implications are substantial. Users may be trading away personal information for minimal compensation without understanding the long-term consequences. There are also questions about informed consent, data ownership, and the potential for misuse of the collected information.
This business model represents a new frontier in the gig economy, where workers are essentially selling their personal experiences and communications rather than traditional labor. It highlights the increasing commodification of personal data in the age of AI and raises important questions about privacy, consent, and the value of personal information in the digital economy.
The practice is particularly concerning because many users may be economically vulnerable, and therefore more willing to accept small payments for their data without weighing the privacy implications. In effect, personal data is extracted from those least able to protect it.
As AI development continues to accelerate, the demand for training data is likely to grow, potentially leading to more apps and services that monetize personal information in similar ways. This trend underscores the need for stronger data protection regulations and greater transparency about how personal information is collected, used, and shared in the AI ecosystem.
