At a glance
- New York City (NYC) plans to begin enforcing its employment-related artificial intelligence law on 5 July 2023.
- In this alert, we outline key considerations for NYC employers to help them prepare for this law’s enforcement.
Local Law 144 of 2021, which took effect on 1 January 2023, regulates employers’ use of automated employment decision tools (AEDTs) in making hiring and promotion decisions. However, the Department of Consumer and Worker Protection (DCWP) – the agency charged with the AEDT law’s enforcement – did not issue final rules to implement the law until 6 April 2023.
Employers now have a short window to determine whether they are using AEDTs to make covered employment decisions and, if so, whether they are in compliance with the law’s requirements.
Below we outline key considerations for NYC employers to help them prepare for this law’s enforcement.
The Department of Consumer and Worker Protection’s journey to final rules
The DCWP first released proposed rules in September 2022. After an outpouring of comments and a public hearing, the DCWP released revised proposed rules in December 2022, which was followed by another round of public comments and a second public hearing.
The final rules include several notable changes and clarifications, including:
- Modifying the definition of “machine learning, statistical modeling, data analytics, or artificial intelligence” to expand its scope.
- Requiring that bias audits indicate the number of individuals assessed by the AEDT who are excluded from the calculations because they fall within an unknown category (i.e., unknown race, ethnicity, sex, or gender, as well as intersectional categories), and requiring that this number be included in the summary of results.
- Clarifying that the number of applicants in a category and scoring rate of a category, if applicable, must be included in the summary of results.
- Allowing an independent auditor to exclude, from the impact ratio calculations, any category that comprises less than 2% of the data being used for the bias audit (see the illustrative calculation after this list).
- Clarifying the examples of bias audits included in the rules.
- Clarifying that an employer or employment agency may rely on a bias audit conducted using the historical data of other employers or employment agencies only in certain circumstances.
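To make these calculation requirements more concrete, the minimal sketch below illustrates how category selection rates and impact ratios might be computed from hypothetical applicant data, how a category below the 2% threshold could be excluded, and how the number of individuals in an unknown category would be reported. All category labels and figures are invented for illustration, the sketch collapses the rules’ intersectional categories into a single demographic axis, and it is not a substitute for a bias audit performed by an independent auditor under the DCWP’s methodology.

```python
# Illustrative sketch only: hypothetical counts on a single demographic axis.
# An actual bias audit under the DCWP final rules must also cover intersectional
# categories and be conducted by an independent auditor.

applicants = {
    # category: (number assessed by the AEDT, number selected to move forward)
    "Category A": (400, 120),
    "Category B": (300, 60),
    "Category C": (10, 2),    # under 2% of the data; the auditor may exclude it
    "Unknown": (40, 8),       # unknown category; excluded from calculations but reported
}

total_assessed = sum(assessed for assessed, _ in applicants.values())

# The number of individuals falling within an unknown category must be
# disclosed in the summary of results even though they are excluded.
unknown_count = applicants["Unknown"][0]

# Selection rate per category, leaving out the unknown category.
selection_rates = {
    category: selected / assessed
    for category, (assessed, selected) in applicants.items()
    if category != "Unknown"
}

# The auditor may exclude categories comprising less than 2% of the data.
included = {
    category: rate
    for category, rate in selection_rates.items()
    if applicants[category][0] / total_assessed >= 0.02
}

# Impact ratio: a category's selection rate divided by the highest selection rate.
highest_rate = max(included.values())
impact_ratios = {category: rate / highest_rate for category, rate in included.items()}

print(f"Individuals in an unknown category: {unknown_count}")
for category, ratio in impact_ratios.items():
    print(f"{category}: selection rate {included[category]:.2f}, impact ratio {ratio:.2f}")
```

Where an AEDT scores rather than selects candidates, the final rules call for an analogous calculation based on scoring rates rather than selection rates.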
What should employers do now?
The primary question for employers to answer is whether they use AEDTs to make covered employment decisions. AEDTs are defined as:
- Any computational process, derived from “machine learning, statistical modeling, data analytics, or artificial intelligence”;
- That issues a “simplified output” to “substantially assist or replace discretionary decision making”;
- For making “employment decisions” that impact natural persons.
First, the only “employment decisions” the law covers relate to hiring and promotions. Employers should initially evaluate whether they use any automated tools in connection with hiring- or promotion-related decisions – i.e., to screen (i) candidates who have applied to a specific job or (ii) employees for promotions. “Screening” here refers to determining whether a candidate or employee should be selected or advanced in the hiring or promotion process. Automated tools used for decisions unrelated to hiring or promotions (e.g., workplace performance or compensation) are not covered by the law.
Second, not all automated tools fall under the law’s AEDT definition. To determine if they do, employers are encouraged to assess whether they are using “machine learning, statistical modeling, data analytics, or artificial intelligence” to issue a “simplified output.”
The final rules clarify that “machine learning, statistical modeling, data analytics, or artificial intelligence” means a group of mathematical or computer-based techniques that (i) generate a prediction or an expected outcome for an observation and (ii) for which a computer at least in part identifies the inputs, their relative importance, and, if applicable, other parameters that might improve the predictions generated. The prediction is then used to assign an observation to a group – the “simplified output” – for ascertaining whether the individual’s experience, skill, or credentials match the particular job role; this output can take the form of a score, tag or categorisation, recommendation, or ranking.
Finally, employers should assess whether the “simplified output” is being used to “substantially assist or replace” discretionary human decision making. In other words, employers should assess whether (i) they rely entirely on the “simplified output” without any additional human considerations; (ii) the “simplified output” is the most important consideration in the hiring or promotion decision; or (iii) although human decision making is used to make the hiring or promotion decision, the “simplified output” can override it. If the answer to any of these is yes, the employer may be using an AEDT covered by the law.
What’s next for employers using AEDTs?
If employers conclude that they are using AEDTs to make covered employment decisions, they must:
- Ensure the tool has been subject to an independent bias audit conducted no more than one year before its use;
- Publish a summary of the audit’s results on their website; and
- Provide notice (i) to applicants and employees about the AEDT’s use and functioning; and (ii) to those subject to the AEDT’s use that they may request an accommodation or alternative selection process.
The bias audit must be conducted by an “independent auditor,” meaning “a person or group that is capable of exercising objective and impartial judgment on all issues within the scope of a bias audit of an AEDT.” The auditor may not have been involved in “using, developing, or distributing the AEDT,” and must not have an employment relationship with, or a financial interest in, an employer or employment agency that seeks to use the AEDT or a vendor that developed or distributes the AEDT.
Under the penalty schedule published by the DCWP, an employer that fails to conduct the required bias audit or to provide adequate notice – each a separate violation under the law – may be subject to civil penalties ranging from USD 375 for a first violation to USD 1,500 for a third or subsequent violation. Each day an AEDT is used in violation of the law constitutes a separate violation.
While the final rules provide some clarity, NYC employers that use any automated tools to make hiring- and promotion-related decisions are encouraged to consult counsel to determine whether their tools are covered and to assess compliance with the AEDT law. NYC employers with operations in other jurisdictions are also encouraged to monitor increasing legal and regulatory activity related to AI in the workplace.
If you have any questions regarding these or other AI developments, or need assistance conducting a bias audit of AEDTs, please contact the authors or your DLA Piper relationship attorney.