Hassan, 15, spent a lot of time online, as teenagers do. Before the pandemic, he played sports with the other kids in his neighborhood in Burewala, in Pakistan's Punjab region. But Covid lockdowns changed his routine, turning him into something of a recluse who relied heavily on his mobile phone. "I only left my room when absolutely necessary," says Hassan, now 18, who asked to go by his surname rather than his full name, citing privacy concerns. Unlike many of his peers, though, he wasn't playing games or scrolling through TikTok. From his bedroom, he was working in the supply chain of artificial intelligence, uploading and labeling data for some of the world's leading AI companies.
Before raw data can be used to train machine-learning models, it must first be labeled by humans, who also verify the results for accuracy. Labeling tasks range from identifying simple objects, such as street lamps, to moderating potentially harmful content scraped from across the web. Crowdsourcing platforms such as Toloka, where Hassan got his start, distribute these tasks to remote workers.
A friend introduced Hassan to the platform, and he was drawn by the promise of flexible hours worked from anywhere. He says an hour of work could earn him $1 to $2, well above the national minimum wage, then around $0.26. His father is a manual laborer and his mother makes home remedies. "You could say I come from a disadvantaged background," he says. The crisis pushed him to look for more work, and he soon branched out beyond Toloka.
According to Saiph Savage, head of the Civic AI Lab at Northeastern University, AI is often assumed to function autonomously, without human intervention. In reality, an entire workforce operates behind the scenes.
Some of these workers are minors, even though the platforms require users to be over 18. Like many others, Hassan got around the restriction by signing up with a relative's details and matching payment methods. WIRED's conversations with workers in Pakistan and Kenya revealed a pattern of minors taking on such tasks, suggesting the practice is widespread.
Platforms such as Appen likewise stipulate a minimum age of 18, but underage workers bypass the rule using family members' information. Hassan's experience illustrates the conditions young workers face in this industry: long hours on tasks ranging from data labeling to content moderation, often for minimal pay.
Despite the potential earnings, Hassan acknowledges the toll the work can take on mental well-being, especially when it involves exposure to distressing content. He recalls moderating material he considered harmful, underscoring the ethical concerns surrounding such tasks.
The exploitation of young workers in the AI sector is not confined to any one region: evidence of underage workers has surfaced around the world. The ease with which age restrictions can be circumvented raises questions about how accountable these platforms are for ensuring ethical practices.
Hassan's story reflects the intricate dynamics of the AI industry, in which young workers navigate real hardship to earn a livelihood. The ethical implications of relying on underage labor underscore the need for stringent regulation and oversight in this fast-evolving sector.