Associate Professor Darren Saunders is known for keeping misinformation in check and helping the public to better understand science.
The medical researcher also won a prestigious Eureka Prize for his outstanding contribution to public science communication.
“I’m a scientist who communicates; I make both my science and other people’s science accessible to the public using evidence-based practice,” A/Prof Saunders said.
“I have a conversation with people – I try to understand where they’re coming from and what information they’re looking for; I listen to and acknowledge their beliefs, fears, anxieties and questions, and then help them to navigate through the evidence.”
A/Prof Saunders said in order to get their message across, it was key for science communicators to show that scientists were real people who were there to help.
“That’s really important in the health space, in the context of the COVID-19 pandemic, because people are stressed and anxious, and they have a lot of big questions that are affecting their day-to-day lives,” he said.
“So, it’s okay for people to be scared and confused about what’s going on, but it’s critical for us to be able to help people find accurate information and put facts and evidence into the discussion.”
Know your audience
A/Prof Saunders said it was important for science communicators to understand who their audience was in order to communicate effectively.
“We need to distinguish between the everyday person who’s looking for information and wants to make good decisions for themselves and their family, versus people who are manipulating the situation for profit and have been doing so in various forms for years,” he said.
“There are some influential voices out there trying to direct eyeballs to their websites to sell stuff, who spread and amplify misinformation: that’s their identity, their business model.
“And a new, heightened aspect of this crisis is that there are people seeking political gain as well. So, it’s essential to know who you’re talking to, because it changes how you approach engaging with them.”
Talking to friends, family and children
A/Prof Saunders said acknowledging fears or anxieties was key in talking to people you knew who believed in myths and conspiracies.
“If you’re talking to a colleague, friend or family member, you already have a level of trust in the relationship – so, build off that: avoid ridiculing them, but address their fears,” he said.
“Acknowledge that they have an interesting idea and ask them to think it through, then suggest sources you trust, and explain what those sources say on the issue and why you trust them.
“Avoid engaging in a partisan or political way – that’s probably going to make things worse and we’re seeing that play out in public: the rampant politicisation of health and medical information, which has always existed on the fringes of medicine but never before on such a scale.”
A/Prof Saunders said another approach he used was to ask people how such myths or conspiracies worked in practice.
“I always like to ask how the conspiracy is maintained; for example: ‘Are you suggesting that every scientist, journalist and politician on the planet is keeping XYZ a secret – how is that possible?’” he said.
“So, use questions to show that the myth or conspiracy might not be what it seems.”
A/Prof Saunders said it was also important to know how to discuss myths and conspiracies with children.
“Kids are a great one. I’ve got kids and I talk about this stuff with them all the time and it’s really about helping them to understand the nature of the information, and the difference between what someone thinks and what factual evidence looks like,” he said.
“With kids, you can help them to learn what sort of information you can trust. So, it’s the same approach, but at a basic level.
“You can also help them to understand the motivation behind the information. For example, you can ask them: ‘Is that person saying that because they want you to buy something from their website?’ Kids get that.”
The misinformation superhighway
A/Prof Saunders said technology such as the internet was a “double-edged sword” when it came to science communication.
“In some ways, the proliferation of online information has increased the challenge for science communicators – it’s difficult for people to change their beliefs and people have a natural bias towards information that reinforces those beliefs,” he said.
“Particularly with social media, information travels fast: there might be a tiny study with a few participants and a very small effect that gets picked up by someone with a big platform, and suddenly it’s blown out of all proportion.
“But in other ways we are seeing people gravitate towards trusted voices – so, there’s been a bit of a shift back towards embracing and listening to people who are experts in the subject they’re talking about.”
A/Prof Saunders said there were also parallels between how myths and conspiracies spread online and how viruses spread amongst people.
“Researchers have been studying how information spreads in various forms, particularly on the internet, and there’s really good evidence that you can treat it in much the same way you try to stop the spread of a physical virus,” he said.
“Understand the network: there are people who are misinformation ‘super-spreaders’ with a large platform and a large audience, who can disseminate misinformation rapidly.
“So, in this context you try to arm people with the skills and knowledge to help them filter out this misinformation, which is also known as ‘nerd immunity’ – a play on ‘herd immunity’.”
A/Prof Saunders said one pitfall to be mindful of when myth-busting or fact-checking, however, was inadvertently reinforcing or promoting the myth or conspiracy.
“There is some evidence that you could, if you are not careful in how you frame the discussion, inadvertently amplify a belief in someone’s mind,” he said.
“If you’re dealing with big influencers who spread misinformation online, avoid arguing or engaging with them, but insert the evidence straight under their social media post so people can see the facts immediately.
“And avoid the trap of posting links to myths or conspiracies, because that signals the platform’s algorithm to show the misinformation to more people in their social media feeds. Screenshot it instead.”
A/Prof Saunders said people also needed to be wary of fake social media accounts spreading misinformation.
“In terms of trolls, block and ignore them. But it’s important to note that evidence is emerging that something like half of all social media accounts discussing COVID-19 are fake – they’re bots,” he said.
“So, you might just be arguing with a computer.”
Show, don’t tell
A/Prof Saunders summarised his principles of best practice science communication in the minefield of COVID-19 myths and conspiracies.
“First, amplify facts and evidence; second, elevate experts in the discussion; and third, show, don’t tell,” he said.
“Demonstrate how a scientist thinks: help to arm people with the skills and knowledge to spot the holes in misinformation or identify where there are conflicts of interest at play.
“As scientists, we should be out there demonstrating effective communication in the way we want it to be done, in the same space as those spreading misinformation, so we can crowd out the bad message with a good message.
“It’s also important for scientists to work with journalists and the media to help them communicate science to the public, because if we’re not out there doing it, that leaves space for misinformation to come in and take over.”