Investigation Reveals AI Algorithms Objectify Women’s Bodies Despite Lack of Standard

Credit: The Guardian

Biased AI algorithms are being used to rate women’s bodies, study finds

A new study has found that biased artificial intelligence (AI) algorithms are being used to rate the attractiveness of women’s bodies. The research, conducted by a team at the University of Cambridge and published in the journal Nature Machine Intelligence, highlights how AI can be used to reinforce gender stereotypes and perpetuate discrimination against women.

The researchers developed an algorithm that was trained on more than 10 million Instagram images tagged with terms such as ‘sexy’ or ‘hot girl’. They then tested it on a set of images featuring men and women wearing swimsuits or lingerie. The results showed that the algorithm rated female bodies more highly than male ones – even when the subjects were wearing similar clothing.
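As a loose illustration of the kind of audit described above, and not the study’s actual code, the sketch below scores two visually matched image sets with a rating model and compares the group means. The rate_image function, its stubbed scores, and the file paths are all hypothetical stand-ins.

```python
# Hypothetical sketch of a gender-bias probe for an image-rating model.
# Nothing here is from the study; rate_image is a stand-in for the
# trained classifier, and the hard-coded scores merely mimic the finding.

from statistics import mean

def rate_image(path: str) -> float:
    """Stand-in for the trained rating model; returns a score in [0, 1]."""
    # Stub: return a higher score for the 'female' set to mirror the
    # reported result. A real audit would run the actual model here.
    return 0.8 if "female" in path else 0.5

# Two visually matched sets: same clothing category, different gender.
female_set = [f"swimwear/female_{i}.jpg" for i in range(100)]
male_set = [f"swimwear/male_{i}.jpg" for i in range(100)]

female_mean = mean(rate_image(p) for p in female_set)
male_mean = mean(rate_image(p) for p in male_set)

# A large gap between the means on matched inputs is evidence that the
# model has absorbed a gendered bias from its training data.
print(f"female mean: {female_mean:.2f}, male mean: {male_mean:.2f}")
print(f"gap: {female_mean - male_mean:+.2f}")
```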

This is concerning because it suggests that AI systems may reinforce existing gender biases rather than provide unbiased assessments of people’s physical appearance. That could have serious implications for areas such as recruitment, where employers might use automated systems to assess job applicants on their looks rather than their qualifications or experience.

The authors suggest that this bias could be reduced if developers take steps to ensure fairness in machine learning models, for example by using datasets that represent different genders equally and by removing features that could lead to discriminatory outcomes; a rough sketch of both steps follows below. They also point out that much work remains before we can trust AI-based decision making in areas like recruitment, where fairness is essential to ensuring equal opportunities for all candidates regardless of gender or other characteristics.
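As a minimal sketch of those two mitigation steps, and not the paper’s method, the code below balances a toy training set across genders and then drops the sensitive attribute before the model would see it; every field name and value here is invented.

```python
# Hypothetical sketch: gender-balance a training set, then strip the
# sensitive attribute. Field names and data are invented for illustration.

import random

def balance_by_gender(samples: list[dict]) -> list[dict]:
    """Downsample the larger group so both genders appear equally often."""
    women = [s for s in samples if s["gender"] == "female"]
    men = [s for s in samples if s["gender"] == "male"]
    n = min(len(women), len(men))
    return random.sample(women, n) + random.sample(men, n)

def drop_sensitive(sample: dict) -> dict:
    """Remove the attribute the model should not condition on."""
    return {k: v for k, v in sample.items() if k != "gender"}

# Toy, deliberately imbalanced dataset: twice as many 'female' samples.
training_set = [
    {"image": f"img_{i}.jpg", "gender": "female" if i % 3 else "male"}
    for i in range(90)
]

balanced = [drop_sensitive(s) for s in balance_by_gender(training_set)]
print(len(balanced), "samples, gender-balanced, sensitive attribute removed")
```

Note that dropping a sensitive column is rarely enough on its own, since other features can act as proxies for it; the sketch only illustrates the two specific steps the authors name.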

It is clear from this research that biased artificial intelligence algorithms pose a real threat when it comes to assessing people’s physical appearance, particularly women’s, because they can reinforce existing gender stereotypes and enable discrimination in contexts such as recruitment, where employers rely heavily on automated systems instead of human judgement alone. As the technology continues its rapid advancement, it will become increasingly important for developers and engineers not only to consider the ethical implications of their work but also to strive actively for fairer machine learning models, built on datasets that represent different genders equally and free of features that lead to discriminatory outcomes. Only then can we begin to trust these technologies enough that they do not impede our fundamental right to equal opportunity regardless of race, sex, age or other characteristics.

Original source article rewritten by our AI:

The Guardian
