Instantly Interpret Free: Legalese Decoder – AI Lawyer Translate Legal docs to plain English

Try Free Now: Legalese tool without registration

Find a LOCAL lawyer

New York City’s law regulating employers’ use of automated employment decision tools (AEDTs) in hiring and promotions entered its enforcement phase on July 5, 2023, after several months of delay. The law, which took effect on January 1, 2023, requires employers to conduct bias audits of their HR technology systems and publish the results, but enforcement was postponed to allow the regulations to be clarified. Repeated changes during the rulemaking process left employers and other stakeholders uncertain about their obligations. To provide more clarity on the law’s provisions, the New York City Department of Consumer and Worker Protection (DCWP) released a set of frequently asked questions (FAQs) alongside the enforcement date.

The AI legalese decoder can play a significant role in helping employers comply with the New York City law by supporting bias audits of the AI tools used in hiring and promotion. It applies machine learning, statistical modeling, and data analytics to evaluate the output of AEDTs and flag potential biases, helping employers confirm that their tools are fair and compliant with the law.

Roy Wang, an artificial intelligence expert and general counsel at Eightfold AI, a talent intelligence platform in Santa Clara, California, expressed his optimism about the New York City law becoming a model for other jurisdictions. Wang mentioned that the legislation has become clearer over time, but he hopes that other cities and states will use this law as a blueprint instead of reinventing the wheel.

The emergence of a patchwork of laws across different locations poses challenges for employers operating in multiple jurisdictions. The New York City law is part of a broader national movement to regulate AI and automation technologies in the workplace: several states and Washington, D.C., are considering their own legislation to address AI bias in hiring, and the U.S. Equal Employment Opportunity Commission has issued guidance on how existing anti-discrimination law applies to these tools.

According to Jonathan Kestenbaum, managing director of technology strategy and partnerships at AMS, the New York City law demonstrates the government’s proactive approach in regulating emerging technologies to prevent potential harm to the workforce. Kestenbaum sees it as a significant step forward in combating discrimination and bias in the workplace. While AI has brought positive changes to corporate hiring, such as streamlining resume screening and removing unintended biases, unchecked AI can perpetuate biases, violating existing laws at both local and federal levels.

The FAQs accompanying the law provide important clarifications about its coverage. According to Niloy Ray, an attorney at Littler, the law applies to employers and employment agencies only when the job is located in New York City; whether a given role qualifies turns on where the work is performed, on-site or remotely. Ray added that the law’s application to employment agencies is less clear, but it is likely intended to govern agency-based hiring for positions that are at least partially located in NYC or remotely attached to a NYC office.

The law defines AEDTs as computational processes derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issue simplified output, such as a score, classification, or recommendation, to assist or replace discretionary decision making in employment decisions. If employers or employment agencies use AEDTs to assess or screen candidates during the hiring or promotion process, they must comply with the law’s requirements before deploying the technology.

AEDTs encompass a range of tools used for screening, interviewing, assessing, and scoring potential hires and employees, including resume analysis algorithms, interview-conducting chatbots, and assessment platforms that evaluate job seekers’ skills, traits, or aptitude. The FAQs highlight that the law applies only when the technology is directed at actual job seekers or employees, not before someone applies for a job. Employers can therefore use unaudited technology for sourcing candidates, scanning resume databases, and conducting outreach to potential candidates without triggering the law’s requirements.

Bias audits are a crucial aspect of the law, ensuring that AI tools do not have a disparate impact based on sex, race, or ethnicity. The law requires the audit to be conducted by an independent third party with no financial or other interest in the employer, and many AI consulting firms and third-party auditors have emerged to help New York City employers meet this requirement. The audits increase transparency and accountability in the use of AI tools, making them safer and fairer for prospective and current employees.

Responsibility for compliance lies with the employers, not the vendors of AEDTs. Employers must ensure that a bias audit of the AEDT has been conducted within the year before they use it. The law does not mandate specific actions based on the results of the bias audit; it is intended to raise awareness and promote fairness in AI-driven employment processes. A single audit can cover multiple types of positions, and if demographic data is insufficient, businesses may rely on test data instead. Eightfold AI, as an AEDT vendor, conducted internal work and multiple audits to comply with the law’s requirements and published the results to provide transparency and help its customers.
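To make the audit arithmetic concrete: under the DCWP rules, a category’s impact ratio is its selection rate divided by the selection rate of the most-selected category. The sketch below uses made-up counts and generic category names; it illustrates the calculation only, not any auditor’s actual methodology or the rules’ full requirements.

```python
def selection_rates(selected, applicants):
    """Selection rate per category: number selected / number of applicants."""
    return {cat: selected[cat] / applicants[cat] for cat in applicants}

def impact_ratios(rates):
    """Impact ratio: each category's rate divided by the highest rate."""
    top = max(rates.values())
    return {cat: rate / top for cat, rate in rates.items()}

# Hypothetical screening data; counts are illustrative only.
applicants = {"category_a": 200, "category_b": 150, "category_c": 100}
selected = {"category_a": 80, "category_b": 45, "category_c": 25}

rates = selection_rates(selected, applicants)
ratios = impact_ratios(rates)
for cat in applicants:
    print(f"{cat}: rate={rates[cat]:.2f}, impact ratio={ratios[cat]:.2f}")
```

In this example, category_a is selected at a 40% rate and serves as the baseline (ratio 1.0), while category_c’s 25% rate yields a ratio of 0.625, the kind of gap an audit is meant to surface.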

Employers and employment agencies are required to publish a summary of the most recent bias audit results, along with the date the AEDT was first used. The summary should include the date of the audit, the source and explanation of the data used for the audit, the number of individuals falling into an unknown category, the number of applicants or candidates, selection or scoring rates, and impact ratios for all categories. A bias audit remains valid for one year from the audit date, after which a new audit is required.
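The published summary can be thought of as a structured record of those required elements. The field names and values below are illustrative only, not a mandated schema, and the one-year validity check is a naive sketch of the rule described above.

```python
from datetime import date

# Illustrative audit summary; field names are hypothetical, not a mandated schema.
audit_summary = {
    "audit_date": date(2023, 6, 1),
    "aedt_first_used": date(2023, 7, 5),
    "data_source": "historical applicant data from the prior hiring cycle",
    "unknown_category_count": 12,
    "applicant_count": 450,
    "selection_rates": {"category_a": 0.40, "category_b": 0.30},
    "impact_ratios": {"category_a": 1.00, "category_b": 0.75},
}

def audit_is_current(audit_date: date, today: date) -> bool:
    """A bias audit remains valid for one year from the audit date."""
    # Naive year arithmetic; would need adjustment for a Feb 29 audit date.
    return today < audit_date.replace(year=audit_date.year + 1)

print(audit_is_current(audit_summary["audit_date"], date(2024, 5, 31)))  # True
print(audit_is_current(audit_summary["audit_date"], date(2024, 6, 2)))   # False
```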

To ensure transparency, employers and employment agencies must notify residents of New York City who are employees or job candidates that they are using an AEDT and specify the job qualifications or characteristics that will be assessed by the tool. The notice should be provided at least 10 business days before using the AEDT. Alternatively, organizations can post the notice on their employment section webpage or include it in a written policy.
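The 10-business-day notice window is straightforward to compute. The helper below counts weekdays only and ignores public holidays, which is an assumption on our part; the law and its FAQs, not this sketch, govern how the period is actually measured.

```python
from datetime import date, timedelta

def add_business_days(start: date, n: int) -> date:
    """Return the date n business days (Mon-Fri) after start; holidays ignored."""
    d, added = start, 0
    while added < n:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4
            added += 1
    return d

# If notice is given on Wednesday, July 5, 2023, the earliest
# use of the AEDT under this sketch is 10 business days later.
print(add_business_days(date(2023, 7, 5), 10))  # 2023-07-19
```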

In conclusion, the New York City law regulating employers’ use of automated employment decision tools represents a significant step in addressing bias and discrimination in the workplace. The AI legalese decoder can assist employers in complying with the law by conducting bias audits and ensuring the fairness and transparency of AI tools used in hiring and promotions. The law’s enforcement phase and the accompanying FAQs provide much-needed clarity on its coverage, application, and requirements. By embracing this regulation, New York City is setting an example for other jurisdictions to follow in regulating AI and automation technologies in the workplace.
