Sydney-based, ASX-listed tech company Appen has been very publicly called out over a racist question on its job application, which asks applicants who identify as Black or African American to specify the tone of their complexion.
Appen provides and improves data used in the development of AI and machine learning products, and now has nine offices across the US, Europe and Asia.
Last week, would-be applicant Charné Graham, a digital content specialist and social media strategist based in Texas, shared a screenshot of the question on Twitter, asking: “Has anyone ever seen this on a job application?”
This is the company guys, a recruiter reached out to me on LinkedIn to apply for a role. I did not continue with the app after seeing the paper bag test. pic.twitter.com/P6xmv1ngPH
— Charné Graham (@CharneGraham) May 10, 2021
The image shows two drop-down menus, one asking candidates to identify their ethnicity, in which Graham has selected ‘African American’.
The next box is labelled ‘complexion’. It offers six options ranging from ‘type 1’, for light or ‘very pale’ skin, through ‘moderate brown’ and ‘dark brown’, to ‘type 6’: “very dark brown to black”.
Predictably, Twitter users were outraged at what Graham called a “paper bag test”, referring to a system by which people of colour in America would only be considered for employment or allowed into certain places if their skin tone was lighter than a paper bag.
Many expressed shock, disbelief and anger. Others, however, suggested that the question could have been related to an effort to hire more people of colour and address racial bias in AI algorithms.
In its response, that’s basically what Appen went with.
While apologising for “the way the question is phrased”, the business also took to Twitter to try to “add context” as to why the question is asked.
“Our goal is to eliminate bias and make AI that works for everyone,” the statement said.
— Appen (@AppenGlobal) May 10, 2021
The question is used to “ensure diverse datasets are included in the collection and annotation used to train computer vision algorithms”.
For many, that wasn’t satisfactory. In fact, it raised more questions. Are applicants also required to submit an image of themselves to help train the AI? Or are successful applicants expected to be test subjects for the tech?
There is a well-known issue in AI stemming from a lack of diversity within the industry. A weighting towards white engineers, and therefore white test subjects, means algorithms are far better at recognising white skin than darker skin tones.
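To make the point concrete, here is a minimal sketch, in Python, of what a dataset-balance check at the engineering level could look like. Everything in it is illustrative: the six skin-type labels simply mirror the scale on Appen’s form, and the dataset counts are invented.

```python
# Illustrative sketch only: assumes each training image carries a
# self-reported skin-type label on a 1-6 scale like the one in Appen's form.
from collections import Counter

def skin_type_shares(labels: list[int]) -> dict[int, float]:
    """Return the share of the dataset held by each skin-type label."""
    counts = Counter(labels)
    total = len(labels)
    return {skin_type: counts[skin_type] / total for skin_type in range(1, 7)}

# An invented, heavily skewed dataset: types 5 and 6 barely appear.
labels = [1] * 500 + [2] * 300 + [3] * 120 + [4] * 50 + [5] * 20 + [6] * 10
for skin_type, share in skin_type_shares(labels).items():
    print(f"type {skin_type}: {share:.1%}")
```

A skew like that, caught before training, is exactly the kind of thing that can be fixed through deliberate, consent-based data collection.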
To try to address this inequality is fine. To do that via the recruitment process, without the consent or knowledge of the applicant, is not.
As one response to Graham’s post put it: “This should be handled at the engineering level, not candidate level”.
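In practice, “the engineering level” could be as simple as reporting a model’s accuracy separately for each skin-type group, so any gap surfaces in testing rather than in a job application. The sketch below is hypothetical: the predictions, ground truths and group labels are toy values.

```python
# Hypothetical sketch: accuracy disaggregated by skin-type group (1-6).
from collections import defaultdict

def accuracy_by_group(predictions, truths, groups):
    """Return per-group accuracy for a set of model predictions."""
    hits, totals = defaultdict(int), defaultdict(int)
    for pred, truth, group in zip(predictions, truths, groups):
        totals[group] += 1
        hits[group] += int(pred == truth)
    return {group: hits[group] / totals[group] for group in sorted(totals)}

# Toy results: the model does far worse on the darkest skin type.
predictions = [1, 1, 0, 1, 0, 0, 1, 0]
truths      = [1, 1, 0, 1, 1, 1, 0, 1]
groups      = [1, 1, 1, 1, 6, 6, 6, 6]
print(accuracy_by_group(predictions, truths, groups))  # {1: 1.0, 6: 0.0}
```

If accuracy on type 6 lags type 1, that is a data and model problem for engineers to fix with consenting, compensated participants.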
Equally, if the role includes being a tester for training anti-racist AI, that should be in the job description.
To top it all off, Graham confirmed she did not continue with the application after seeing this question, suggesting it actively hampered Appen’s diversity efforts.
Her original screenshot has now been retweeted more than 10,000 times, with almost 6,000 quote tweets. And she’s used the opportunity to spruik the fact that she is still available for hire. If that’s not enterprising, I don’t know what is.