A retailer in Indianapolis is looking for a “Retail Sales Ninja who thrives in a competitive environment, focused on being the Champion in your division.” The pay isn’t mentioned, but the job posting dangles a $1,500 bonus.
Tempted? Not so much if you’re a woman.
Job description software would have flagged the words before the ad ever got posted. “Ninja,” “competitive,” and “champion” are words research has found dissuade women; they carry masculine overtones. Ironically, the job posting is for a cashier, an occupation other research has found to be female-dominated.
AI-powered job description software is being used today by thousands of employers to avoid the unintentional traps that can discourage qualified candidates from applying. In today’s highly competitive recruiting environment, no company can afford to lose candidates because of the wording in a job posting.
Though gender-coded language has been most widely studied, bias isn’t limited to gender. “Tech savvy,” “high energy,” and “go-getter” are associated with ageism. “Brown bag” and “blacklist” both have racial connotations.
Even in otherwise neutral job postings, clever titles can be a turnoff. Women are 38% less likely than men to apply to a job with “guru” in the title. Using “hacker” discourages 90% of jobseekers from applying, regardless of gender.
“Based on data analytics on the kinds of jobs men and women apply for, research shows that the adjectives matter,” says Iris Bohnet, Albert Pratt Professor of Business and Government and co-director of the Women and Public Policy Program at Harvard Kennedy School.
AI-based Job Description Software
Job description software is tuned to identify words in job descriptions that may seem perfectly innocent to the hiring manager or recruiter who used them, but which carry baggage.
Writing technology has been around since the first word processors were developed. Those tools are rules-based, meaning they flag poor grammar and spelling and identify overly complicated sentences and overlong paragraphs.
Job description software can do that as well. But as AI-enabled tools, these products rely on data to highlight words that have been shown to limit applications, especially from diverse candidates. The best job description software will suggest neutral-language substitutes for dozens of words carrying some implication of bias. Most focus on gender, race, and age.
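At its core, this kind of check can be sketched as a lookup against a research-derived word list, with a suggested substitute for each flagged term. The list and substitutes below are purely illustrative, not any vendor's actual data, and real products weight terms statistically rather than matching them one by one:

```python
import re

# Illustrative word list; real tools draw on much larger, research-backed databases.
FLAGGED_TERMS = {
    "ninja": "specialist",
    "competitive": "motivated",
    "champion": "top performer",
    "guru": "expert",
    "hacker": "developer",
}

def flag_biased_terms(posting: str) -> list[tuple[str, str]]:
    """Return (flagged word, suggested substitute) pairs found in the posting."""
    findings = []
    for term, substitute in FLAGGED_TERMS.items():
        # Whole-word, case-insensitive match so "Ninja" in a title is caught.
        if re.search(rf"\b{term}\b", posting, flags=re.IGNORECASE):
            findings.append((term, substitute))
    return findings

posting = "Retail Sales Ninja who thrives in a competitive environment"
print(flag_biased_terms(posting))
# [('ninja', 'specialist'), ('competitive', 'motivated')]
```

A rules-based pass like this is where the first generation of tools stopped; the AI-driven products described above go further by scoring whole postings and predicting response rates.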
The data these tools use to identify bias comes from comparisons of thousands of job descriptions against lists of words research has shown to carry baggage. Increasingly, the AI that powers job description software will also compare the responses these job postings get to make fine adjustments to the process.
Other software will strip incoming resumes and applications of names, photos, and other elements that might be a tip-off to age, gender or race. Not strictly job description software, these inbound de-biasing tools are beginning to be incorporated into job description software and applicant tracking systems.
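The redaction step can be sketched as simple pattern substitution. This is a minimal illustration covering only email addresses and phone numbers; production tools also use named-entity recognition to catch names and image handling to remove photos, which plain regexes cannot do:

```python
import re

def redact_resume(text: str) -> str:
    """Replace common identity signals with placeholder tokens.

    Illustrative only: names and photos require NER and image
    processing, not regex matching.
    """
    # Email addresses often reveal names and sometimes birth years.
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[EMAIL]", text)
    # US-style phone numbers, e.g. 555-123-4567.
    text = re.sub(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b", "[PHONE]", text)
    return text

print(redact_resume("Contact: jane.doe@example.com, 555-123-4567"))
# Contact: [EMAIL], [PHONE]
```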
Job Description Software Tools
More than a dozen vendors offer job description software. Textio, one of the oldest, calls itself an “augmented writing tool.” Backed by a huge database amassed from hundreds of companies over several years, Textio’s AI algorithms not only identify words with hidden and unintended meaning, they predict the results a job description is likely to get.
Ongig’s Text Analyzer software identifies gender-coded words as well as words with racial, age, disability, and other connotations, suggesting alternatives. It scores each job description for bias and will also help improve readability.
There is also free job description software. Though rudimentary and limited in approach, tools like Gender Decoder and the Gender Bias Decoder made available by TotalJobs offer at least some help in identifying unintentional bias.
Job description software makes the task of crafting neutral job postings easier, but simply being aware that words can carry unintended messages helps recruiters avoid creating job descriptions like the one that introduced this article.
John Zappe contributed this article.