
Guide Helps Australian Workers Expose Tech Wrongdoing
The Human Rights Law Centre has released new guidance that empowers Australian tech workers to speak out against harmful company practices or products.
The guide, Technology-Related Whistleblowing, summarizes the legally protected avenues for raising concerns about technology's harmful impacts, along with practical considerations.
“We’ve heard a lot about harmful behavior from tech companies this year and there’s no doubt there will be more to come,” Alice Dawkins, executive director of Reset Tech Australia, which co-authored the report, said in a statement.
She added: “We know that providing Australians with comprehensive protection against digital harm will take time – opening the door to public accountability through reporting is particularly urgent.”
Technology’s potential harms come into focus in Australia
There have been relatively few cases of technology-related whistleblowing in Australia. Indeed, Kieran Pender, deputy legal director of the Human Rights Law Centre, said “the tech whistleblowing wave has not yet hit Australia.”
However, thanks to new Australian government legislation and a string of tech-related scandals and media reports, the potential harms of technology and platforms have come into focus.
Australia bans teenagers under 16 from using social media
Australia has passed legislation banning those under 16 from using social media, which will take effect at the end of 2025.
Tech companies’ ‘digital duty of care’
Australian legislation for a “digital duty of care” is underway following a review of the Online Safety Act 2021. It takes a similar legislative approach to versions in the UK and EU.
Flawed automation in the Robodebt scandal
Technology-assisted automation, in the form of taxpayer data matching and income averaging, resulted in the Australian Taxation Office pursuing 470,000 incorrectly issued tax debts. The so-called Robodebt scheme was found to be unlawful and triggered a sweeping royal commission.
AI’s use of data and its impact on Australian jobs
An Australian Senate select committee recently recommended enacting an artificial intelligence law to regulate AI companies. LLMs from OpenAI, Meta, and Google would be classified as “high risk” under the proposed law.
Many concerns relate to the possible unauthorized use of copyrighted material in AI training data, as well as AI’s impact on the livelihoods of creators and other workers. A recent OpenAI whistleblower shared some of these concerns in the U.S.
Consent issues over health data used in AI models
The whistleblowing guide cites reports that an Australian radiology company handed over patients’ medical scans, without their knowledge or consent, to a healthcare AI startup that used the scans to train its AI models.
Photos of Australian children used to train AI models
Analysis by Human Rights Watch found that LAION-5B, a dataset compiled by scraping internet data and used to train some popular AI tools, contains links to identifiable photos of Australian children. Neither the children nor their families gave consent.
Compensation after Facebook’s Cambridge Analytica scandal
The Office of the Australian Information Commissioner recently approved a $50 million settlement with Meta over allegations that Facebook user data was collected by an app, potentially disclosed to Cambridge Analytica and other companies, and possibly used for political profiling.
Concerns over immigration detainee algorithm
The whistleblowing guide also cites an algorithm used to assess the risk level of immigration detainees. Although the underlying data and ratings have been questioned, the algorithm’s ratings are said to influence how immigration detainees are managed.
Detailed whistleblower protections are available to Australian tech workers
The guide outlines in detail the protections available to tech employee whistleblowers. For example, it explains that in Australia’s private sector, whistleblower laws cover certain “disclosable matters” that make employees eligible for legal protection.
Under the Corporations Act, “disclosable matters” arise when there are reasonable grounds to suspect that the information concerns misconduct, or an improper state of affairs or circumstances, within the organization.
SEE: Accenture, SAP leaders on AI bias, diversity issues and solutions
Public sector employees can take advantage of public interest disclosure legislation where there is a significant risk to health, safety or the environment.
“Digital technology issues can arise in both the public and private sectors, meaning your disclosures may be caught under private sector whistleblower laws or PID schemes – depending on the organization your report relates to,” the guidance advises Australian employees.
“In most cases this is easy to determine, but if not we encourage you to seek legal advice.”
Australia: A testing ground for “good, bad and illegal” technology
Whistleblower Frances Haugen, whose disclosure of internal Facebook documents led to The Wall Street Journal’s investigation into the company, wrote a foreword for the Australian guide. She said the Australian government was signaling moves toward tech accountability, but its plans were “still in their infancy.”
“In many ways, Australia is a testing ground for many of the world’s existing technology giants and an incubator for good, bad and illegal behavior,” she writes in the guide.
SEE: Australia proposes mandatory guardrails for artificial intelligence
The authors note in a press release that more people in Australia than ever before are facing harm from new technologies, digital platforms, and artificial intelligence. Yet, they add, the role of whistleblowers in exposing wrongdoing has been largely overlooked in policy debates.
“The depth, breadth and velocity of new digital risks are emerging in real time,” Haugen wrote.
“Timely disclosure remains critical in order to gain a clearer understanding of the risks and potential harms posed by digital products and services,” she concluded.
2024-12-23 13:30:00