Do we need new laws for new technology?
How many times have you heard that our laws are struggling to keep up with the rapid pace of new technology?
Enforcement of existing technology-neutral laws while we work out AI regulation
A light bulb moment for me was reading in Ed Santow’s book on human rights and AI that the assumption that existing law doesn’t apply to AI is a false premise.
“While it’s true that Australia has very few AI-specific laws, the vast majority of our laws are technology neutral… However, the regulatory ecosystem can take time to figure out how to enforce existing law in a new technological context, and so perhaps it should be no surprise that there’s a period where new technology, like AI, appears to be operating in a lawless space.”
With new or amending legislation often taking years to pass (if it passes at all), it’s important to identify opportunities to focus advocacy on resourcing the enforcement of existing technology-neutral laws, alongside campaigns for longer-term law reform.
With the United States signalling its intention to avoid AI guardrails and, perhaps going further, to retaliate with tariffs against attempts by foreign governments to regulate AI in a way that could “inhibit the growth or intended operation of United States companies”, enforcement of existing laws to protect well-established rights in Australia becomes even more critical, and may be better able to withstand this regulatory backlash.
Workplace surveillance as an example
Take, for example, Australian laws on workplace surveillance. Basic Rights Queensland’s Working Women Queensland (WWQ) is already seeing surveillance evidence weaponised by employers against working women who attempt to exercise their rights as workers or who make complaints against management. Recognising that the use of AI tools for workplace surveillance and monitoring stands to exacerbate these existing issues, Wotton Kearney has partnered with WWQ to explore potential legal safeguards, particularly for women who are vulnerable to excessively intrusive workplace surveillance. See the WWQ submission to the recent Australian Parliament Inquiry into the Future of Work.
In a forthcoming article with WWQ, WK Associate Emma Campbell looks at the case study of Grace.
"Grace who is 25, works in retail and is of Asian ethnicity. As she is young, female, of colour and low paid, she is 49% more likely than her counterparts to be subject to workplace surveillance. If she had no union representation, this figure would increase to 69%."
Australia’s surveillance law framework is significantly fragmented. Only New South Wales, Victoria and the Australian Capital Territory have specific legislation governing how organisations can monitor and conduct surveillance in their workplaces, and even that legislation often only requires employers to provide notice rather than obtain workers’ informed consent. Recommendations made for over a decade to replace existing state surveillance laws with a technology-neutral Commonwealth Act have not yet produced the legal development needed for consistent worker protections against overreach or misuse of surveillance across Australia. In the meantime, we need to think about ways to interpret, strengthen and expand the enforcement of existing laws to protect workers from AI tools that collect surveillance data without their consent, and to provide recourse where that data is used in a discriminatory manner.
Neurotechnology – also rapidly emerging
AI isn’t the only new technology of concern in the context of surveillance. According to Dr Allan McCay, Australia is also quietly entering an era of workplace neuro-surveillance. As far back as a decade ago, the Sydney Morning Herald reported on mineworkers having their brains monitored for alertness to minimise the risk of accidents at work.
"What if lawyers in the not-too-distant future had their brains monitored so their employers or clients could pay for their work, not by billable hours, but by billable units of attention?"
I’m hosting a Wotton Kearney event on 12 March about the implications of neurotechnology for lawyers, where we will be asking Dr McCay this question, as well as what new legal and human rights protections are needed for privacy and integrity of thought. Through our research with the Australian Human Rights Commission, we have also been looking at what existing laws we can lean on in the meantime.
Thanks, Leanne, for highlighting these critical issues and for your support of WWQ’s work in this space. The rapid expansion of AI-driven workplace surveillance is already impacting working women, and your insights reinforce the urgency of both enforcing existing laws and advocating for stronger safeguards. Looking forward to continuing the conversation!
Co-Director of the Human Technology Institute and Professor of Responsible Technology at UTS
Brilliantly said! (And thank you for the generous shout out!)
AI and data - strategy, value, risk, policy
Guidance can be very helpful to plug the gap - it's something we should be doing more of. Hence the recent joint work by the Actuaries Institute and Human Rights Commission on AI, discrimination and insurance. https://actuaries.logicaldoc.cloud/download-ticket?ticketId=36aea01e-e5e5-4b08-9016-640051021053