Tech Bytes: RMIT Experts Critique Australian AI Safety Standard for Insufficient Measures

September 09, 2024 03:41 PM AEST | By Team Kalkine Media

Last week, the Australian Federal Government introduced the Voluntary Artificial Intelligence (AI) Safety Standard, a guideline setting out ten recommended AI guardrails and how organizations should put them into practice.

The government says these guardrails are intended to help organizations capture the benefits of AI while managing the associated risks to people and groups. However, experts from RMIT University are skeptical about the standard's likely impact: they view it as a step in the right direction but criticize the guidance as too vague.

Balancing Safeguards with Practicality 

RMIT’s Kok-Leong Ong, Professor of Business Analytics and Director of the Enterprise AI and Data Analytics Hub, acknowledges the importance of building in safeguards as AI usage expands. Nevertheless, Ong points out that the voluntary nature of the measures may blunt their effect: because the guidelines are ambiguous, businesses are left to self-assess their own risks, which could lead to inconsistent application.

Moreover, Ong highlights concerns about workforce readiness: many employees lack the training needed to implement the new safeguards effectively. Balancing safety against efficiency is also a challenge, as some measures, such as mandatory disclosure of AI use, could disrupt existing processes. An ABC survey found that a significant share of businesses using AI do not inform their employees or customers, and that many have not conducted human rights or risk assessments of their AI practices.

Insufficient Current Standards 

The Federal Government is working on more comprehensive regulations for AI. Three options are being considered: adjusting existing digital software laws, creating a standalone act, or imposing restrictions on AI tools deemed too risky. One proposal includes mandating that developers and deployers of AI disclose their use of these tools to Australians, particularly when decisions about individuals are made by AI or when interactions involve AI-generated content. 

Lisa Given, Professor of Information Sciences and Director of the RMIT Centre for Human-AI Information Environments and the Social Change Enabling Impact Platform, notes that while the voluntary standard is a positive interim measure, mandatory regulation is crucial for ensuring consumer protection. Mandatory rules would also align Australia with international standards, such as those in the European Union. Given emphasizes that binding safeguards are essential for ensuring transparency in how AI is designed, applied, and used.

Addressing Public Concerns 

A University of Queensland survey conducted in March revealed a divided public: 40% of Australians support the development of AI, while 30% oppose it. Opinion on whether AI will ultimately benefit society is similarly split, with 40% expecting a positive impact and 40% disagreeing. There is, however, strong consensus on the need for greater regulatory oversight, with 90% of respondents backing the establishment of a new regulatory body dedicated to AI.

Federal Industry Minister Ed Husic has acknowledged these concerns and encourages businesses to take proactive measures to address them. He emphasizes that while AI holds significant potential, ensuring that adequate protections are in place is crucial to maintaining public trust and safety. 

