Failure to regulate AI will fail Australian workers
Unregulated use of artificial intelligence has the power to irreparably damage the Australian workforce and society as a whole, according to a new report.
‘For all of us: making artificial intelligence work for working people’, co-authored by Dr Dominic Meagher, a research fellow at the Crawford School of Public Policy, and published by the John Curtin Research Centre, highlights how Australia could be left floundering without a strategy to regulate AI in the workplace.
“New technology doesn't change what having a fair society is all about, and Australia has already worked out how to be a pretty fair place,” Dr Meagher said.
The report recommends that politicians create a policy to ensure the ‘fair go’ attitude in Australia survives this digital revolution. With the rapid growth of AI, it is paramount that action is taken now, not left too late, as it was with social media.
“We need to shape AI, not be shaped by it,” the report states. This means a deliberate policy agenda that uses the new technology to strengthen jobs, deepen skills and build a more capable, resilient workforce, rather than allowing companies to use it as a tool to cut jobs in a bid to boost their bottom line.
One key element of this is that companies and staff need to be held accountable for how AI is used. “If AI suggests something, and you action it, you are still responsible, and you need to be held responsible in the eyes of the law,” Dr Meagher said. In the context of workplaces, this means a company must be held accountable for its fair work obligations, no matter how AI is being implemented.
If Australia fails to put in place these firm guardrails around accountability and repercussions, Dr Meagher says the economic damage will fall on workers first, before causing broader social costs. “We already know how the self-regulation story ends and there is too much at stake with AI,” he notes.
Any new framework for AI in the workplace needs to be built with worker consultation and embedded into existing frameworks so that workplace rules remain coherent, says Dr Meagher. He gave the example of new equipment arriving on a construction site: you don’t throw out the safety rule book and start from scratch, you add to it. The same should apply to AI, with additional regulations folded into existing mechanisms.
Taking this pro-worker approach will ensure technology is used to enhance human work rather than replace it, safeguarding skills, transparency, human oversight and lifelong learning. Such a framework, the report argues, will give Australians the confidence to embrace AI, knowing the gains will be shared rather than monopolised and the dignity of work will remain. Getting this right means the disruption of AI can be turned into shared prosperity, as is the Australian way.
The message is being heard around the world: Dr Meagher has been interviewed by ABC News online, ABC News Breakfast, ABC Radio Canberra, ABC Radio Melbourne, the Weekend Australian and SBS News, with coverage reaching as far as the Philippines.
You can read the report here. The next report in the series will focus on information authentication: how we can trust that what we are seeing is reliable.