Algorithms Behaving Badly: New NYC Law Tackles Bias in Hiring Technology

Auditing the tools making automated employment decisions gives employers a chance to get it right

by Lofred Madzou
June 2, 2022
in HR Compliance

From recruitment to retention, technology has long been crucial to effective workforce management. And while companies may be flocking to tools powered by AI and machine learning, a New York City law set to go into effect in 2023 calls attention to the need to ensure that automated tools don’t multiply society’s ills.

The use of technology in workforce management is nothing new. But concern is growing over the adoption of advanced data analytics and AI-based tools, particularly the risk that these tools will widen employment disparities between demographic groups.

To that end, in early 2023, New York City will begin requiring employers to conduct annual bias audits of any AI-based employment decision tools, including those supported by machine learning, statistical modeling and data analytics solutions.

The city’s move comes on the heels of several high-profile incidents of “algorithms behaving badly,” including the revelation that an Amazon hiring tool was biased against women. But while concern over bias in AI technology is understandable, such issues are hardly unique to algorithmic decision-making.

Multiple studies have demonstrated that unconscious biases deeply affect the human decisions of recruiters and business managers as well. Ultimately, business leaders need to maximize the benefits of technology to improve their workforce decisions while mitigating the associated risks through robust governance processes. Simply abandoning new technology would be of little help and could undermine their DEI commitments.

Law goes into effect in 2023, but questions remain

New York City’s new measure, passed Nov. 10, 2021, requires employers to publicly display the results of their annual AI hiring audits, to inform candidates that their applications have been reviewed by automated hiring software and to offer an alternative selection process if requested.

However, there remains uncertainty around various aspects of these bias audits. In the legislation, the term “bias audit” refers to “an impartial evaluation by an independent auditor” and the bill says that such an audit “shall include but not be limited to the testing of an automated employment decision tool to assess the tool’s disparate impact on persons.” 

Such a broad definition, while helpful in providing flexibility to the providers and users of such technology, leaves the full set of audit requirements unspecified. It is also unclear who can serve as an independent auditor, what skills such auditors need and how their independence can be ensured.

How companies can prepare now 

Companies should not simply react to the city’s legislation but should proactively build capabilities to ensure their automated decisioning tools are robust and trustworthy. This would not only reduce their regulatory exposure but, more fundamentally, also drive business value and build trust with their current and future workforce.

To achieve this, they should run all of their models, both during development and after they have been deployed in real-life situations, through a comprehensive fairness workflow structured around four key steps:

  • Bias identification. The purpose of any hiring process is to identify and recruit the best talent for a role. In doing so, it will necessarily be “biased” against those deemed less suited to the role. The first challenge is to identify whether significant differences in outcomes are emerging between groups (e.g., based on race, gender or other identified groups of interest); a simple quantitative check is sketched after this list.
  • Root cause assessment and determination of whether bias can be justified. Correlation does not imply causation. In other words, simply observing disparate outcomes between groups does not help in determining if they are justified, let alone in addressing them. So an essential corollary is to identify the root causes of the apparent differences between groups and reach a considered conclusion on whether such differences are justified.
  • Bias mitigation. Assuming that the bias is not considered justified, targeted interventions will be needed to mitigate it. For machine learning models, this could include rebalancing the data set used to train the model to make it more representative or even to correct historic imbalances if appropriate; dropping input data variables that are proxies for protected groups (e.g., years of uninterrupted work experience as a proxy for gender); or enforcing parity between groups through changes in decision thresholds where appropriate (see the second sketch after this list).
  • Bias reporting. It is essential that companies document bias mitigation strategies and report results to external auditors upon request. Such documentation should include information about defined protected groups, fairness objectives, observed disadvantages between different demographic groups, identified root causes, targeted interventions and final fairness testing results.
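A simple way to make the bias-identification step concrete is to compare selection rates across groups and compute each group’s impact ratio relative to the most-selected group. The sketch below does this in Python with pandas; the column names, sample data and the 0.8 flagging threshold (the familiar “four-fifths” rule of thumb in US adverse-impact analysis) are illustrative assumptions, not requirements drawn from the New York City law.

```python
# Minimal sketch of the bias-identification step, assuming screening decisions
# in a pandas DataFrame with hypothetical "group" and "selected" columns.
import pandas as pd

def impact_ratios(df: pd.DataFrame, group_col: str = "group",
                  outcome_col: str = "selected") -> pd.DataFrame:
    """Selection rate per group and its ratio to the most-selected group."""
    rates = df.groupby(group_col)[outcome_col].mean().rename("selection_rate")
    ratios = (rates / rates.max()).rename("impact_ratio")
    return pd.concat([rates, ratios], axis=1)

# Hypothetical decisions produced by an automated screening tool.
candidates = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "selected": [1,   1,   0,   1,   1,   0,   0,   0],
})

report = impact_ratios(candidates)
print(report)

# Groups whose impact ratio falls below ~0.8 (a common rule of thumb, not a
# threshold set by the NYC law itself) are flagged for root-cause assessment.
print("Flagged:", list(report[report["impact_ratio"] < 0.8].index))
```

In practice, the same calculation would be repeated for every group of interest (and their intersections) and re-run on live decisions after deployment, not only on development data.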

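Of the mitigation options listed above, enforcing parity through threshold changes is the simplest to sketch. The example below, again in Python, chooses a separate score cutoff per group so that each group’s selection rate lands near a common target; the scores, group labels and the 25% target are illustrative assumptions, and whether such an intervention is appropriate is a judgment the root-cause assessment must support.

```python
# Minimal sketch of one mitigation option: per-group decision thresholds chosen
# so that selection rates are approximately equal across groups. The scores,
# groups and target rate below are illustrative assumptions.
import numpy as np

def per_group_thresholds(scores: np.ndarray, groups: np.ndarray,
                         target_rate: float = 0.25) -> dict:
    """For each group, the score cutoff that selects roughly the top
    `target_rate` fraction of that group's candidates."""
    cutoffs = {}
    for g in np.unique(groups):
        group_scores = scores[groups == g]
        # The (1 - target_rate) quantile leaves about target_rate of candidates above it.
        cutoffs[g] = float(np.quantile(group_scores, 1 - target_rate))
    return cutoffs

# Hypothetical model scores where candidates from group B are scored lower on average.
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0.60, 0.10, 100), rng.normal(0.50, 0.10, 100)])
groups = np.array(["A"] * 100 + ["B"] * 100)

cutoffs = per_group_thresholds(scores, groups)
selected = scores >= np.array([cutoffs[g] for g in groups])
for g in ("A", "B"):
    print(g, round(float(selected[groups == g].mean()), 2))  # both near 0.25
```

Whatever combination of interventions is chosen, the bias-reporting step then documents the before-and-after fairness results alongside the rationale for the intervention.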
Taking action

It is important that companies likely to be affected act promptly to maximize the benefits of machine learning, statistical models and data analytics solutions for workforce management while complying with these new rules in New York City, a community in which about 4.5 million people work. 

Failure to deploy such diagnostic and monitoring capabilities may result in significant cost in terms of preventable harms, increased distrust between employers and employees and reputational damage. 


Tags: Artificial Intelligence (AI), DEI, Machine Learning
Lofred Madzou

Lofred Madzou, Director of Strategy and Business Development at TruEra, is a leading expert in responsible AI and AI governance and has spent most of his career driving responsible AI in government and corporate settings. In his role he works with organizations to strengthen their AI governance, prepare for regulatory requirements and emerging guidelines and establish processes that allow them to use AI in more effective and responsible ways. Prior to TruEra he worked at the World Economic Forum, where he led various global and multi-stakeholder AI governance projects. In practice, he advised various EU and Asia-Pacific governments on AI regulation and supported organizations in their implementation of responsible AI practices.
