AI systems trained on historical data can inherit past societal biases. Because such data often reflects the norms and inequalities of its time, models trained on it can perpetuate outdated stereotypes, affecting decision-making and fairness.
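As a rough illustration of the mechanism, the sketch below trains a simple classifier on synthetic "historical hiring" data in which one group was systematically penalized. The scenario, feature names, and numbers are all hypothetical, but the model recovers the historical disparity and reapplies it to new, equally qualified applicants.

```python
# Minimal sketch (synthetic data, hypothetical hiring scenario) of how a model
# trained on historically biased labels reproduces that bias at prediction time.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two groups; qualification scores drawn from the same distribution for both.
group = rng.integers(0, 2, size=n)      # 0 or 1 (protected attribute)
skill = rng.normal(0.0, 1.0, size=n)    # identically distributed across groups

# "Historical" hiring labels: skill matters, but group 1 was systematically penalized.
logit = 1.5 * skill - 1.2 * group
hired = rng.random(n) < 1 / (1 + np.exp(-logit))

# Train on features that include the group attribute, as historical records often do.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, hired)

# Score new applicants with identical skill across both groups.
test_skill = np.zeros(1000)
for g in (0, 1):
    X_test = np.column_stack([test_skill, np.full(1000, g)])
    rate = model.predict_proba(X_test)[:, 1].mean()
    print(f"Predicted hiring probability for group {g}: {rate:.2f}")
# The model assigns a lower probability to group 1 despite identical skill:
# it has inherited the penalty embedded in the historical labels.
```

The point of the toy example is that nothing in the training step is overtly unfair; the disparity enters entirely through the labels the past produced, which is why audits typically compare model outputs across groups rather than inspecting the training code alone.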