Do employment portals really need to know candidate gender?

February 24, 2019

This article was written and published in Spanish and has been translated into English via Google Translate. Click here to read the original article.

The experience of a user who received better offers after changing the gender on her profile is a fresh warning about the danger of discriminatory bias in artificial intelligence.

It all started as a joke that ended up once again calling into question the neutrality of the algorithms that select our profiles and send us job offers. E. G., who prefers not to be identified, is a doctor and researcher. Her fellowship ends this spring, so she registered on job portals. Tired of receiving offers "in the feminine", for secretarial and administrative roles, she tried changing the gender on her résumé. A few days later, she began receiving offers for more senior and better-paid jobs. The platform denies that gender is taken into account and attributes the case to a change in her search behavior, but data-analysis experts warn that the discrimination arises not from the platform but because machines learn from reality. E. G. poses a key question: if they do not take gender into account, why is it the first piece of data they ask for at registration?

Caroline Ragot, product director at Infojobs, the platform where E. G.'s case occurred, says that gender and age are "common statistical data in human resources" and that they are "only used internally".

"It is not used as a variable," he says bluntly.

The executive says it is the first such case reported among the portal's 2.5 million candidates, and offers several possible explanations: a change in the candidate's search pattern after she changed her gender ("The algorithm returns keywords entered in the search engine," she argues); a larger pool of offers available on the days the résumé was altered; an error; the profile being used to search for a job for a friend with different criteria; the addition of jobs in high demand; or what they call "collaborative filters", which send candidates offers that other users with similar profiles have viewed but that had gone unnoticed by the user.
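
A "collaborative filter" of the kind Ragot describes recommends to a user what similar users have already viewed. The following minimal Python sketch illustrates the general idea only; the interaction matrix and identifiers are invented, and nothing here reflects Infojobs' actual implementation.

```python
# Minimal user-based collaborative filtering sketch. Candidates are
# rows, job offers are columns; a 1 means the candidate viewed that
# offer. All identifiers are hypothetical.
import numpy as np

offers = ["offer_1", "offer_2", "offer_3", "offer_4"]

# views[i][j] = 1 if candidate i viewed offer j.
views = np.array([
    [1, 1, 0, 0],  # candidate 0
    [1, 1, 1, 0],  # candidate 1
    [0, 0, 1, 1],  # candidate 2
])

def recommend(user: int, k: int = 2) -> list[str]:
    """Offers that candidates similar to `user` viewed, but `user` did not."""
    norms = np.linalg.norm(views, axis=1)
    # Cosine similarity between `user` and every other candidate.
    sims = (views @ views[user]) / (norms * norms[user] + 1e-9)
    sims[user] = 0.0                    # ignore self-similarity
    scores = sims @ views               # similarity-weighted view counts
    scores[views[user] == 1] = -np.inf  # drop offers already seen
    ranked = np.argsort(scores)[::-1][:k]
    return [offers[j] for j in ranked if scores[j] > 0]

print(recommend(0))  # candidate 0 resembles candidate 1 -> ['offer_3']
```

Note that such a filter never needs an explicit gender field to reproduce segregation: if women have historically viewed secretarial offers, a new profile that resembles theirs will be steered the same way.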

Ragot insists that Infojobs promotes inclusive vacancies, equality, and offers free of gender discrimination, and that it sends companies good-practice recommendations, including describing the position offered in gender-neutral terms or with both masculine and feminine endings.

This newspaper maintained two identical profiles, one male and one female, for a week. Throughout this period, the offers received were the same, and they changed only slightly when the search pattern of both profiles was modified.

Gemma Galdón, a doctor in public policy specializing in surveillance and the social, legal and ethical impact of technology, warns that the key point is that the bias lies in reality, not in the platform. "The algorithm learns what it sees, but failing to correct it makes things worse," she warns, arguing for the need to eliminate gender and any other data that may lead to discrimination.

Galdón, partner, founder and research director at Eticas Research & Consulting, recalls the case of Tay, a bot created by Microsoft to learn about the interaction between computers and humans through Twitter. It had to be shut down a few days later after it began reproducing racist and sexist messages. "Tay saw and learned the worst, the most offensive content," said Diana Kelley, Microsoft's head of cybersecurity, at the recent Cybertech conference in Israel. "If we do not learn inclusion, we will continue to create exclusion," she said, arguing that diversity is "more effective and productive".

"We must have formulas to correct undesirable algorithms because they collect mechanisms of society," recommends Galdón, who recalls that, through reverse engineering, you can "accompany" the machine to avoid unwanted effects such as the one detected her team in a health portal that ended up recommending women to paint their nails.

Research from Carnegie Mellon University found that women were less likely to be shown ads for well-paid jobs on Google, because the programs used in some companies' hiring departments tended to be used mostly by men.

Amazon had to scrap its artificial-intelligence recruiting system after detecting serious biases that hurt the hiring of women, according to a report by Business Insider.

There can also be racial discrimination, as Galdón's team found when analyzing a public housing allocation program that sought a more efficient distribution and ended up creating ghettos.

Bias introduced through programmers' lack of training or through a lack of follow-up can be aggravated by the incorporation of image recognition tools, as happened to Jacky Alciné, a young man from Brooklyn who discovered that Google Photos had labeled a photo of him with a friend as "gorillas". The company rectified this after his complaint.

"Companies rectify when they are facing a crisis of reputation, but can not hide in the neutrality of mathematical calculation," concludes Galdón.

180 BIASES DEFINED AND CLASSIFIED

"Artificial intelligence systems are as good as the information we put in them.The bad data may contain implicit racial, gender or ideological biases." Many of these artificial intelligence systems will continue to learn to use bad data, making the data grow. problem, but we believe that it can be corrected and the systems that do it will prosper," says IBM on its research page.

Eliminating prejudice and discrimination, according to the company, is "crucial for humans and machines", so it argues for the need to develop reliable systems and to "teach" them with impartial information. IBM calculates that there are more than 180 defined and classified biases that affect decision-making processes; when these are incorporated into artificial intelligence systems, they end up undermining trust in those systems.
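
IBM's point that systems must be "taught" with impartial information can be made concrete with a standard fairness check such as the disparate impact ratio: the rate of positive outcomes for one group divided by the rate for another. IBM's own open-source AI Fairness 360 toolkit implements this metric among many others; the self-contained sketch below uses plain Python and invented numbers.

```python
# Disparate impact ratio on hypothetical historical hiring data:
# P(positive outcome | group A) / P(positive outcome | group B).
# A common rule of thumb flags ratios below 0.8.

def positive_rate(outcomes: list[int]) -> float:
    return sum(outcomes) / len(outcomes)

# Invented example labels: 1 = candidate was offered an interview.
women = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]   # 20% positive
men   = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]   # 50% positive

ratio = positive_rate(women) / positive_rate(men)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.40, well below 0.8

# A model trained to imitate these labels would learn the same skew,
# which is why the data must be audited before, not after, training.
```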

IBM researcher Francesca Rossi, in a company presentation, argues that not all biased information is bad (it can be used to highlight reliable and relevant sources) and that harmful bias could be eliminated within five years by the companies most sensitive to the importance of diversity and to the effects of incorporating prejudice into their systems.


