When AI systems screen applicants or match candidates with job opportunities, biased training data can lead to discrimination against women. If the training data reflects a historical preference for male candidates in certain roles, the system can learn that pattern and rate women lower or overlook them entirely, perpetuating gender disparities in the tech industry.
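The mechanism can be illustrated with a minimal sketch. The data and the scoring model below are entirely hypothetical (not any real vendor's system): a naive screener learns the historical hire rate per gender and folds it into its score, so equally qualified candidates end up rated differently.

```python
from collections import defaultdict

# Hypothetical historical records: (gender, years_experience, hired),
# reflecting a past preference for male candidates in this role.
history = [
    ("M", 5, 1), ("M", 4, 1), ("M", 3, 1), ("M", 2, 0),
    ("F", 5, 0), ("F", 4, 1), ("F", 3, 0), ("F", 2, 0),
]

# "Training": learn the historical hire rate per gender and use it as a prior.
hires = defaultdict(int)
totals = defaultdict(int)
for gender, _, hired in history:
    hires[gender] += hired
    totals[gender] += 1
prior = {g: hires[g] / totals[g] for g in totals}

def score(gender, years_experience):
    """Naive screening score: qualification signal weighted by the learned group prior."""
    return years_experience * prior[gender]

# Two equally qualified candidates receive different scores:
print(score("M", 5))  # 3.75 — inflated by the historical skew
print(score("F", 5))  # 1.25 — deflated, despite identical qualifications
```

Real systems rarely use gender as an explicit feature, but proxy variables (names, gaps in employment, affiliations) can encode the same historical skew just as effectively.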