It’s becoming increasingly important for employers to be aware of new and changing AI regulations around the world, as more and more companies are using AI in their recruiting and HR processes.
Employers who hire in New York City need to prepare for the regulatory changes coming on January 1st, 2023 under the Automated Employment Decision Tool (AEDT) law. While this new legislation is top of mind for businesses that will be directly affected by it, there's a much bigger picture to consider. New York is not the first city to regulate AI in general, but it is the first to impose these specific types of AI regulations, and it certainly won't be the last.
Employers must strive to be compliant on a global scale, not only to avoid fines and bad press, but also to use AI ethically and fairly in recruiting and to create an inclusive, diverse workplace. Business leaders should start taking steps now so they can adapt to AI regulations as they emerge and evolve.
If you use AI for recruiting (or if you plan to), what steps can you take to make sure you’re compliant?
Steps to staying compliant with AI for hiring
Have your AI audited for bias by a third party
One of the biggest concerns AI regulations seek to address is the potential for bias in recruiting. AI learns from human behaviors, and if those human traits have even an unconscious level of bias, it can feed into the AI and the technology can develop bias as well.
To make sure your AI-driven hiring tool is as fair as possible, a great first step to take is to have your AI audited by a third party every year. In fact, this will become necessary in places like New York City. The AEDT law will require employers who use AI for recruiting to have their technology audited ‘independently’ for bias at least once every twelve months. Each employer will also need to provide a summary of the most recent AI bias audit on their website.
An annual bias audit adds a layer of accountability for employers who use AI for recruiting. Having a third party inspect your software for bias can help you avoid non-compliance issues and claims of discrimination in your hiring practices.
Notify candidates that you’re using AI
While this is another regulatory requirement coming soon to NYC, it’s also a general ‘best practice’ for using AI in HR. Transparency is a fundamental part of being compliant with AI.
If you use AI in your recruiting process, let your candidates know. Tell them that an AI-driven tool is being used in the process, and tell them exactly what qualifications and characteristics will be evaluated.
If your AI software is compliance-focused, it should be clear to candidates that the process is explainable and fair. It should be easy to explain (and to understand) how the AI you're using arrived at a particular decision or recommendation.
At Beamery, it's important to us that our own recruitment decisions are based on skills, previous experience, and potential, and that is why we have designed our products the same way.
Disclose your data retention policy
Data privacy is more important than ever, and it's something many of today's job seekers care about, especially if you're using AI to help make hiring decisions. Data protection laws already require employers to share information about the data they collect.
Make sure your candidates are aware of what types of data you collect and store during the recruiting process — and let them know how long you will keep their personal data in your system. This information should either be made available on your website or provided directly to candidates shortly after they enter the hiring process.
Consider having an ‘alternative selection process’
According to the new regulations coming to New York City, when candidates are notified that an AI-driven tool is being used in the hiring process, they will have the right to request an alternative process or accommodation. This means that candidates can ask for an ‘AI-free’ hiring process if that’s what they prefer. However, the law doesn’t state exactly what employers are required to do when an applicant makes this kind of request.
As an HR team, you should start thinking now about what alternative process you could offer a candidate in these cases, such as a manual assessment of their skills instead of an AI-driven one. While this likely won't be a common request from job seekers, as AI is becoming a more "normal" part of our world, it's worth thinking through the logistics and creating a plan.
AI is still relatively new, and as regulations begin to catch up to the technology, it will be important for employers to be able to handle these types of candidate requests, in order to remain compliant.
Plan to accommodate candidates with disabilities
There are already many rules in place that protect and accommodate individuals with disabilities, both as consumers and in the job market.
For example, the Americans with Disabilities Act (ADA) and the European Accessibility Act set out important regulations covering how companies hire people with disabilities, and how businesses can best serve customers with disabilities (e.g. making sure their website graphics have alt text for those who are visually impaired).
For recruiting and hiring specifically, accessibility is critical, especially if you're using AI. The US Equal Employment Opportunity Commission (EEOC) discusses in depth how candidates with disabilities can be "screened out" — which occurs when a disability prevents a candidate from meeting (or lowers their performance on) a selection criterion, and the candidate loses the job opportunity as a result.
In order to be compliant and accessible to all applicants, make sure your HR team has a plan in place to accommodate those with disabilities. For example, if a great candidate is required to take an assessment as part of the hiring process, but their disability may prevent them from showing their true knowledge by taking the assessment online, consider allowing them to complete a verbal assessment instead.
Every disability is different and will require a unique solution, but if you want to create a truly inclusive work environment, it's critical to accommodate candidates with varying disabilities.
Beamery’s AI-driven software
As an employer, you're responsible for being compliant in how and when you use AI in your HR processes. Choose your AI vendor carefully, and make sure their product has gone through the necessary due diligence and complies with new and changing regulations.
Beamery's AI-driven software helps companies manage every stage of their talent lifecycle, from sourcing and recruiting to talent mobility and career development, all while helping meet DE&I goals.
Learn more about what Beamery has to offer and how you can stay compliant while staying ahead of the curve with AI for recruiting.
The information provided on this website does not, and is not intended to, constitute legal advice; instead, all information, content, and materials available here are for general informational purposes only.