AI and Personal Data – Ready for the Great Reset Yet?
You don’t need to answer that question. You do have a right to privacy. Trouble is, New York City and California are transforming their concerns about privacy (not yours, employer) into new laws that come into effect on January 1, 2023.
The new laws concern Automated Employment Decision Tools (AEDTs).
And even though the “good news” is that total confusion over the laws and the “high volume of public comments” has led to a delay in enforcement – April 15, 2023 for New York City and July 1, 2023 for California – legal eagles are advising employers to get prepared now.
Inviting a couple of lawyers round for Christmas dinner is something to consider.
No? OK – let’s look into it.
Automated Employment Decision Tools – and You
A non-comprehensive list of tools in use by employers would include:
Under the new law in New York City, requirements will include:
“Violations of the provisions of the bill would be subject to a civil penalty.” — NYC
Well, that part is pretty clear, even if they did neglect to add “Happy new year!” Touted fines for NYC are $500 for a first violation and up to $1,500 for each subsequent violation.
Just to be perfectly unclear on what AEDTs are, according to the NYC law, here’s a quote:
“The term ‘automated employment decision tool’ (or ‘AEDT’) is broadly defined as ‘any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.’”
In California, Attorney General Rob Bonta was thinking along these lines in November:
Try to bear in mind that he’s only trying to help.
Burdens of Sourcing, Hiring, Retention – And Help!
As an employer or hiring team, your motive for using AEDTs is to improve the process of identifying, interviewing, and ultimately hiring the best-fit candidates for open positions, with a view to enhancing productivity, retention, and growth.
Increasingly easy online job applications mean potentially massive responses to advertised positions, with possibly many more fake-it-till-you-make-it hopefuls than highly qualified experts in the relevant field. What to do? Enter Applicant Tracking Systems and any other automated help that becomes available to support analysis, assessment, and hiring the best.
And now, as you eagerly endeavor to optimize these complex AI assistants to achieve optimal outcomes, you find yourself needing legal assistance just to start pushing the buttons.
Still, progress.
With that in mind, related or similar local, state and even federal mandates are looming over the new year and our collective festive cheer, so getting on top of this one may be the first step to being capable of protecting yourself from the next one(s).
Unfortunately, getting on top of this one could be an uphill battle, even for the lawyered-up. That “high volume of public comments” mentioned above includes complaints about how vague many of the key terms actually are, and how many questions have been left unanswered.
So the cynical could think that the delay in enforcement may be more for the people writing those “key terms” to figure out what they actually mean than for the employers who must start abiding by them as soon as humanly possible.
In September, the Department of Consumer and Worker Protection (DCWP) proffered what it calls “Proposed Rules” to help employers get to grips with the question of how to comply with the new law – a nice gesture during the season of giving (and taking). Apparently, they’re still not finalized, but they have already been summarized in this article and can be explored in greater depth here.
So here's that help promised above:
What’s the Point?
The point of all this is essentially to assess potential bias against anybody in a protected category: race, ethnicity, or sex, for example. Those selected to move forward or given a classification by an AEDT would be analyzed with a view to identifying bias, or the lack of it. In New York City, the following questions can tentatively be answered here (next step: ask your lawyer):
Independent Auditors?
The audit would require an “independent auditor” – a person or group not connected to the development or use of the AEDT being audited. Potentially, this would allow consultants or contractors to be brought in, and it could mean legitimately using an in-house compliance team, provided it is independent in the way mandated.
Informing Candidates and/or Employees?
Giving notice could entail posting the notice of AEDT use in the careers or jobs section of the company website (as applicable in each instance), in the job posting itself, or in an email to candidates or employees.
Plan of Attack?
In October, the Biden administration published the Blueprint for an AI Bill of Rights. Here’s a quote to get you in the festive mood:
“Among the great challenges posed to democracy today is the use of technology, data, and automated systems in ways that threaten the rights of the American public. Too often, these tools are used to limit our opportunities and prevent our access to critical resources or services. These problems are well documented. In America and around the world, systems supposed to help with patient care have proven unsafe, ineffective, or biased. Algorithms used in hiring and credit decisions have been found to reflect and reproduce existing unwanted inequities or embed new harmful bias and discrimination. Unchecked social media data collection has been used to threaten people’s opportunities, undermine their privacy, or pervasively track their activity—often without their knowledge or consent.”
After all the cynicism prevalent in 2022, doesn’t that remind you of the final scene of "A Christmas Carol"?