Words by Eloise Dalton, Director of Working Women Queensland.

 

Artificial intelligence is on the agenda at this week’s economic reform roundtable, given its potential to boost productivity. The debate is already lively: some call for less regulation to unlock productivity gains, while unions press for stronger protections for workers.

In these conversations and debates, we mustn’t lose sight of the very real impact of AI and surveillance technologies on the people exposed to them every day, especially workers who have the least power or voice.

At Working Women Queensland, we’ve already seen surveillance and monitoring evidence weaponised against our clients. We’re concerned not just about bias in AI decision-making, but also about the rise of workplace surveillance, which is becoming more invasive, less transparent and harder to challenge.

 

Rising levels of workplace surveillance

Most workers now accept a certain level of monitoring at work as commonplace. But here are some numbers to put that into sharp relief:

In a survey conducted by global law firm Herbert Smith Freehills:
• 90% of employers use software to track the location of remote employees
• 82% plan to adopt digital tools to monitor staff wellbeing, and
• 43% already use sentiment analysis software to detect and address wellbeing issues.¹

Surveillance once meant things like CCTV cameras, webcams or email monitoring. Now, emerging AI systems can track keystrokes, scan facial expressions, analyse vocal tone and even monitor heart rates to infer how a worker is feeling. Most troubling, it is possible to infer a worker’s intentions: their likelihood of leaving a job, whether they are likely to be a union member, or even their political orientation.

The risks are obvious. Surveillance becomes excessive when it extends beyond the workplace, collects data disproportionate to its stated purpose, or is weaponised to justify unfair or discriminatory decisions.

Workplace surveillance doesn’t affect everyone equally. UK research shows women in the private sector are at greater risk of workplace surveillance than their male counterparts, while young people and Black workers also face higher risks.² Unionised workers are twice as likely as non-unionised workers to be consulted before monitoring tools are introduced. That’s a stark reminder of how power and inequality can shape these systems.

 

Consent and control

WWQ often hears from women who are asked to hand over personal information, usually when they try to access basic entitlements like leave or flexible work. Emerging technologies stand to make the problem worse by giving employers access to sensitive health or personal data, sometimes by accident, sometimes by design.

The issue of consent is woven into both the conduct of workplace surveillance and the use and control of the data collected. Workers might “agree” to surveillance, but how genuine is that agreement when the alternative is to risk your job and economic security?

Our laws aren’t keeping pace. Workplace surveillance is regulated in only three of seven State and Territory jurisdictions, leaving huge gaps in protection. The Australian Law Reform Commission has already warned that these inconsistencies undermine workers’ privacy.³

The idea of “letting it rip” and forgoing regulation in favour of productivity is pretty troubling when you consider the most vulnerable workers. Workers should know what is being collected about them, and be able to correct their personal information and challenge decisions made with it.

 

When bias becomes discrimination

The dangers aren’t hypothetical. The use of AI systems to monitor employees, and the collection of vast quantities of employee data (both incidental and intentional), increase the potential for employers to use that information in punitive or discriminatory ways.

Consider Adora*, who contacted us for advice after learning her redundancy was decided by an AI-generated matrix.

Adora was pregnant and performing modified duties because her employer’s policies prevented her from doing her substantive role. She had no attendance or disciplinary issues, but because her performance in the modified role was compared against that of colleagues performing full duties, she didn’t stack up. The algorithm didn’t account for safe duties, injuries or pregnancy.

Put simply: the system was biased. This cost Adora her job and her access to safety net benefits.
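
To make the failure mode concrete, here is a minimal sketch in Python of how a ranking that compares raw output scores without adjusting for modified duties will reliably put a worker like Adora at the bottom. The names, scores and field layout are invented for illustration; they are not the employer’s actual system.

# A hypothetical redundancy "matrix": rank workers for redundancy by
# raw output score, ignoring whether they are on modified duties.
workers = [
    # (name, output_score, on_modified_duties)
    ("Worker A", 92, False),
    ("Worker B", 88, False),
    ("Adora", 61, True),  # lower score reflects a reduced, safe-duties role
]

# Biased ranking: modified-duty output is compared directly against
# full-duty output, so the pregnant worker "doesn't stack up".
ranked = sorted(workers, key=lambda w: w[1])
print("Selected for redundancy:", ranked[0][0])  # -> Adora

# One possible correction: compare like with like.
full_duty_only = [w for w in workers if not w[2]]
print("Lowest among true peers:", min(full_duty_only, key=lambda w: w[1])[0])  # -> Worker B

The point is not the arithmetic but the design choice: unless the scoring rule is told about safe duties, injury or pregnancy, it cannot account for them, and that is exactly the gap Adora fell through.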

 

What needs to change

At WWQ, we’re calling for:

1. Workplace surveillance laws:
Unifying and strengthening Australia’s workplace surveillance laws to properly regulate excessive surveillance practices and to ensure monitoring is proportionate to its purpose.

2. Requiring appropriate consent:
Ensuring privacy laws clarify that consent must be voluntary, informed, current, specific and unambiguous, and that employers in practice restrict the collection of information to what is necessary for, or related to, their business activities.

3. Bias audits:
Requiring AI systems to undergo regular bias audits so they do not further increase the risk of discrimination based on race, gender, age or other protected characteristics (a simple example of one such check follows this list).

4. Union involvement:
Involving unions in decision-making about surveillance tools, to ensure deployment aligns with employee interests and to ease the burden on individual employees of having to understand information-handling practices on their own.
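
Point 3 deserves a concrete illustration. One widely used check, offered here as a sketch rather than a prescribed method, is the “four-fifths” (80%) rule from US disparate-impact analysis: flag any group whose rate of favourable outcomes falls below 80% of the best-off group’s rate. All figures and group labels below are invented.

# Hypothetical audit of an AI tool's redundancy decisions using the
# four-fifths rule. "Retained" means the tool recommended keeping the worker.
outcomes = {
    "group_a": (45, 50),  # (retained, assessed)
    "group_b": (30, 48),
}

rates = {group: kept / total for group, (kept, total) in outcomes.items()}
best_rate = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best_rate
    flag = "FLAG for review" if ratio < 0.8 else "ok"
    print(f"{group}: retention rate {rate:.0%}, impact ratio {ratio:.2f} -> {flag}")

A real audit would go further, testing statistical significance, intersectional groups and the features driving the scores, but even a simple ratio test like this can surface the kind of skew that caught Adora.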

 

Behind these conversations and debates is a person like Adora. Someone just trying to work, provide for their family and be treated fairly. If we want technology to truly drive productivity, it must be designed and regulated in a way that protects workers, not undermines them.

Recognising that the use of AI tools to conduct workplace surveillance and monitoring stands to exacerbate existing issues, Wotton Kearney has partnered with WWQ to explore potential legal safeguards. The partnership is a case study in how targeted pro bono assistance can enable community legal organisations to engage in proactive advocacy.

 

*Not her real name.

1 Herbert Smith Freehills, ‘Remote/Controlled: The Future of Work Report’ (2021).
2 Henry Parkes, ‘Watching me, watching you: Worker Surveillance in the UK after the Pandemic’ (Report, March 2023).
3 Australian Law Reform Commission, ‘Serious Invasions of Privacy in the Digital Era’ (ALRC Report 123, June 2014).