Voice assistants and voice recognition technologies often struggle to recognize voices that fall outside the male-female binary or have higher pitches, which disproportionately affects women and non-binary users. This bias stems from early training datasets that lacked vocal diversity, degrading usability for a significant segment of users.