source link: https://finance.yahoo.com/news/doj-warns-misuse-algorithmic-hiring-191048160.html

DOJ warns that misuse of algorithmic hiring tools could violate accessibility laws

Devin Coldewey
Sat, May 14, 2022

AI tools for the hiring process have become a hot category, but the Department of Justice warns that careless use of these processes could lead to violations of U.S. laws protecting equal access for people with disabilities. If your company uses algorithmic sorting, facial tracking or other high-tech methods for sorting and rating applicants, you may want to take a closer look at what they're doing.

The DOJ, together with the Equal Employment Opportunity Commission (the agency that monitors and advises on matters of employment discrimination), has issued guidance on how companies can safely use algorithm-based tools without risking the systematic exclusion of people with disabilities.

"New technologies should not become new ways to discriminate. If employers are aware of the ways AI and other technologies can discriminate against persons with disabilities, they can take steps to prevent it," said EEOC Chair Charlotte A. Burrows in the press release announcing the guidance.

The general sense of the guidance is to think hard (and solicit the opinions of affected groups) about whether these filters, tests, metrics and so on measure qualities or quantities genuinely relevant to doing the job. The guidance offers a few examples:

  • An applicant with a visual impairment must complete a test or task with a visual component, such as a game, to qualify for an interview. Unless the job itself has a visual component, this unfairly screens out blind applicants.

  • A chatbot screener asks poorly phrased or designed questions, such as whether a person can stand for several hours straight, with "no" answers disqualifying the applicant. A person who uses a wheelchair could certainly do many jobs that others perform standing, just from a seated position.

  • An AI-based resume analysis service downranks an application because of a gap in employment, but that gap may stem from a disability or a condition it is improper to penalize.

  • An automated voice-based screener requires applicants to respond to questions or test problems vocally. Naturally this excludes deaf and hard-of-hearing applicants, as well as anyone with a speech disorder. Unless the job involves a great deal of speech, this is improper.

  • A facial recognition algorithm evaluates a candidate's emotions during a video interview. But if the person is neurodivergent, or has facial paralysis due to a stroke, their scores will be outliers.
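To make the chatbot example above concrete, here is a minimal, entirely hypothetical sketch of the kind of screening rule the guidance warns about: one version disqualifies on the raw physical-ability question, while an alternative asks about the job's essential function with reasonable accommodation. The function names, question keys and the "register" job are illustrative assumptions, not anything from the EEOC's actual guidance.

```python
# Hypothetical screening rules (names and keys are made up for illustration).

def naive_screen(answers: dict) -> bool:
    # Disqualifies anyone who answers "no" to the raw posture question,
    # which screens out applicants who use wheelchairs.
    return answers.get("can_stand_several_hours", False)

def function_based_screen(answers: dict) -> bool:
    # Asks instead about the essential job function, performed with or
    # without reasonable accommodation.
    return answers.get("can_perform_job_with_accommodation", False)

# A wheelchair user who can do the job from a seated position:
applicant = {
    "can_stand_several_hours": False,
    "can_perform_job_with_accommodation": True,
}

print(naive_screen(applicant))           # improperly disqualified
print(function_based_screen(applicant))  # advances
```

The difference is the one the guidance stresses: the second rule measures something actually relevant to doing the job, while the first measures a proxy that correlates with disability.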

