Topic: News - Labor Law / Employment Law
Source: BBC News
A BBC News report reveals a growing and concerning trend within the artificial intelligence industry: human workers hired to generate high-quality, human-like data for training AI models are increasingly outsourcing their own work to AI. These individuals, often working as freelance data labelers, content writers, or programmers, are tasked with creating examples that AI systems can learn from. However, to increase their efficiency and earnings, many are secretly using generative AI tools to complete their assignments, and then passing off the AI-generated output as their own human work.
This practice raises significant questions about the integrity and quality of the data used to train the next generation of AI systems. Experts cited in the article warn that training AI models on data produced by other AIs could create a feedback loop in which models amplify one another's biases, errors, and stylistic quirks, a phenomenon sometimes called "model collapse." Such a loop could degrade the performance and reliability of future AI technologies. The report also highlights the economic pressures these workers face and the difficulty companies have in verifying the authenticity of ostensibly human-generated data, exposing a critical vulnerability in the AI supply chain.