Guardrails and guidelines as AI embeds itself deeper

Business leaders, scientists and politicians know the spread of artificial intelligence is unstoppable and are trying to erect guardrails before it oversteps community expectations.

The impact of AI at work and school is uncertain and trust is lagging, even as the technologies bring advances in automation and breakthroughs in medicine, fraud detection and road and air safety.

Experts released guidelines at an industry event in Canberra on Tuesday as digital technologies become more deeply embedded in everyday life.

To build trust, the advice written by consultants KPMG and the Australian Information Industry Association calls for strong governance around how AI is developed and used.

Education and health care are key users along with financial services, agriculture, mining and logistics.

But opinion is divided on whether tools such as ChatGPT are job-killers or economic growth accelerators.

The user guide and checklist launched by Science Minister Ed Husic alongside industry leaders come amid fears AI is helping would-be cyber criminals and the spread of misinformation.

Chatbots that can draft computer code are ripe for criminal abuse, according to law enforcement agencies.

To reduce fear and win trust, the new industry guide urges governments to promote the benefits, such as the success of AI in helping people fill in tax forms, and to educate consumers about technology safeguards.

With no specific laws in place, organisations are being asked to self-regulate and be more open about what tools they’re using.

A recent study by KPMG and the University of Queensland found just over one-third (35 per cent) of people believe there are enough safeguards, laws or regulations in place to make AI use safe, while 40 per cent of respondents trust the use of AI at work.

The survey also found Australians want an independent regulator to monitor AI use, and say they need more information on data privacy.

Marion Rae
(Australian Associated Press)
