Professor Robyn Ward - 'AI won’t replace us, but it will redefine how we work'


At the AI Health Summit in Sydney, Professor Robyn Ward AM, Chair of the Pharmaceutical Benefits Advisory Committee (PBAC), offered a human perspective on artificial intelligence, reflecting on its disruptive potential, the lessons from past technological revolutions, and why education, ethics, and equity must remain the guiding principles of its role in healthcare and health technology assessment (HTA).

Professor Ward, who is also Deputy Vice-Chancellor (Research and Enterprise) and Senior Vice-President at Monash University, began her address with an observation drawn from the education sector’s experience with AI, recalling how universities confronted the arrival of generative tools like ChatGPT in 2022. Many expected the sector to respond with strict rules and prohibitions. Instead, universities adopted an open but values-based approach, permitting the use of AI under principles of responsible and ethical application.

“They didn’t ban it,” she said. “They said, use it responsibly.”

Yet despite this institutional endorsement, most educators remained cautious. Around 60 per cent of lecturers did not allow AI use in student assessments, while 83 per cent of students were using AI “because it made life easier and faster,” she said, “but they didn’t really trust it.” Most feared they might be breaking rules, and fewer than one in three felt they had received any real guidance on how to use AI appropriately.

Professor Ward said this disconnect between institutional policy and individual behaviour reveals something fundamental about how societies adapt to disruptive technologies.

“Even when we say we’re embracing AI, humans vary enormously in how we adopt new tools,” she said. “Some will move fast, others will be fearful, and that variation is something we have to plan for.”

The lesson for HTA, she suggested, is that AI adoption must begin with education, guidance, and transparency. “We can’t assume everyone will engage in the same way, and we must not allow AI to exclude people or patient voices who aren’t comfortable with it.”

Ward pushed back against the long-standing fear that AI will eliminate professional roles. Citing the oft-quoted 2016 prediction from computer scientist Geoffrey Hinton that “we should stop training radiologists,” she noted how reality has proved the opposite. “Radiologists weren’t replaced. Their salaries have gone up, and there’s still a shortage,” she said. “AI didn’t remove the need for them, but it changed how they work.”

That shift, she argued, has repeated across sectors. A 2025 report found that AI-competent workers command a significant remuneration premium and that companies effectively using AI reported three times the revenue growth per employee. The same principle, she suggested, should apply to HTA. “We have an opportunity to use AI to amplify our expertise, not to replace it.”

The PBAC Chair distilled her message into three imperatives for how Australia should approach AI in health technology assessment: it must remain evidence-based, be led by values, and stay patient-centred.

The evidence question, she acknowledged, is becoming more complex. “What counts as evidence, and how do we know it’s verifiable, is changing,” she said. Young people, she noted, are more sceptical about AI than many assume. She highlighted research that showed 97 per cent of students say they worry about AI ‘hallucinations’, or false outputs. 

On the issue of values, Professor Ward warned against letting technology widen inequalities. “We don’t want new technologies that further entrench inequity in healthcare,” she said. “AI must be a tool that supports and augments human care, not one that divides or replaces it.” That, she argued, requires constant engagement with patients and communities, understanding their cultural and social contexts, and ensuring that technological change reflects their needs.

When asked whether AI is already influencing PBAC’s work, Ward was pragmatic. “We have to assume everyone is using AI in some way. In letters, in correspondence, in preparation,” she said. “Why would HTA be any different?” She described AI as both a disruptor and a positive force, likely to accelerate the discovery, testing, and development of new therapies. “Pre-clinical and phase 2 trials are already being transformed by AI. Submissions to the TGA and PBAC will start to reflect that, and that’s a good thing. It will demand new skills of interpretation, but it should improve outcomes.”

Professor Ward said she sees AI not as a source of more uncertainty but as a way to manage it better. “We live in an uncertain world, and AI won’t change that,” she said. “But it can help us understand uncertainty more precisely, and that’s valuable.”

Asked whether PBAC had received government direction on AI, Ward said there had been no formal request, but she expected the issue to evolve quickly. Evaluators, she predicted, would increasingly need AI skills as a basic competency. “In many areas, this will become mandated,” she said. “AI literacy will just be part of the workforce requirement.”

She also welcomed the entry of technology companies into the life-sciences sector, describing their presence as “positive and inevitable.” These new players, she said, were already transforming drug development by improving accuracy and reducing the waste of failed trials. “We shouldn’t be impervious to that,” she said. “Tech companies are improving access and reducing the trial-and-error process that’s defined drug development for decades.”

When asked whether AI might add to the complexity of HTA, Ward smiled. “Humans have an extraordinary ability to create complexity even where there is none,” she said. “We’ll have to see.”

She said that AI will not replace evaluators. “There will be plenty of jobs for people,” she said. “It just changes what we do. The healthier model is adaptation, not replacement.”

Looking ahead, Professor Ward warned against the risks of slow institutional adoption, highlighting a lesson learned the hard way during Australia’s delayed embrace of genomic testing. “It took years to integrate genomics into mainstream care,” she said. “Without the testing, patients couldn’t access the drugs they needed. We can’t make that mistake again with AI.”

AI, she said, could easily fragment the health system, with new models of care emerging outside established institutions. “That’s already happening,” she said. “It can’t be something we avert our eyes from. Patient groups played a crucial role in pulling genomics back into the mainstream, and they’ll need to do the same with AI.”

Professor Ward’s message was one of optimism tempered by pragmatism. “AI is already everywhere. It’s in every letter and every workflow, and it will be in every submission soon enough,” she said. “Our job is not to stop it, but to shape it.”

And when asked how she personally uses AI, the PBAC Chair smiled. “I’m a novice,” she admitted. “I use Co-pilot, but I still prefer what I write myself. I put my text into AI, but I usually like what I wrote better than what she writes.”