CSIRO publishes report on ethical use of AI
CSIRO has published a new report designed to help businesses implement ethical, responsible artificial intelligence.
The report, ‘Implementing Australia’s AI Ethics Principles: A Selection of Responsible AI Practices and Resources’, was developed for the agency by the Gradient Institute.
Released under a Creative Commons licence, the report includes recommendations for practices including impact assessments, data curation, fairness measures, pilot studies and organisational training, all aimed at helping businesses develop robust and responsible AI.
It was published following research from Fifth Quadrant and the Gradient Institute, which found that despite 82% of Australian businesses believing they were practising AI responsibly, fewer than 24% had measures in place to ensure alignment with responsible AI practices.
National AI Centre Director Stela Solar said many businesses don’t know how to responsibly navigate the fast-evolving AI environment and meet customer expectations for ethical use of the technology.
“We hear from businesses that their ability to innovate with AI is directly correlated with their ability to earn trust from the communities they serve. This is also reflected in Australia’s AI Ecosystem Momentum report, which found that trust, privacy, security, data quality and skills were the top blockers to AI projects,” she said.
“AI systems that are developed without appropriate checks and balances can have unintended consequences that can significantly damage company reputation and customer loyalty.”