Woman blowing out smoke from her mouth

How a chatbot can give (ethical) advice on how to quit smoking

Can a chatbot help you quit smoking, or help you make safer choices in sexual health? Research by Erkan Başar, who will defend his PhD at Radboud University on November 25, shows that it's possible – but only if the chatbot is given context and structure in advance by actual medical professionals.

Millions of people around the world already turn to chatbots on a daily basis for medical advice and mental health support. The problem? ChatGPT and its competitors are not equipped to provide safe, ethical and healthy guidance. “They calculate the most probable next word, not what is best for the person”, explains computer scientist Erkan Başar. “They lack emotional depth and the ability to understand a user’s motivations, and are only trained to respond coherently. In health contexts like quitting smoking or seeking sexual advice, that can lead to manipulation or even harm.”

Motivational interviewing

Başar explored whether proven counselling strategies could make chatbot conversations safer. He drew on motivational interviewing, a technique often employed by professionals and proven to be effective. Motivational interviewing evokes a person’s own reasons for change instead of confronting them. You don’t just tell someone why smoking is bad, you help them explore their own motivation for quitting.

“We applied this to chatbots by designing conversations that follow expert strategies rather than leaving everything to chance”, explains Başar. “The chatbot asks questions and responds in ways that respect autonomy, helping users reflect on their own motivations. This structured approach ensures ethical guidance, something ChatGPT alone cannot guarantee.”
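To make the idea concrete, here is a minimal sketch (not Başar's actual system; all names and prompts are illustrative) of how expert-authored structure can steer a counselling chatbot. Each dialogue stage carries a fixed, professionally written open question, and the flow between stages is predetermined, so the bot evokes the user's own motivation rather than lecturing:

```python
# A hypothetical sketch of an expert-structured, motivational-interviewing-
# style dialogue flow. The strategy (which question comes when) is fixed
# by experts, not left to a language model.

from dataclasses import dataclass, field


@dataclass
class MIDialogue:
    """Walks through motivational-interviewing stages in a fixed order."""
    stages: list = field(default_factory=lambda: [
        ("engage", "What brings you to think about quitting smoking today?"),
        ("evoke", "What would be the best things about quitting, for you?"),
        ("reflect", "So in your own words, why does quitting matter to you?"),
        ("plan", "What small first step feels realistic this week?"),
    ])
    step: int = 0

    def next_prompt(self, user_utterance: str = "") -> str:
        # The user's answer could be passed to a language model for varied
        # phrasing, but the counselling *strategy* stays under expert control.
        if self.step >= len(self.stages):
            return "Thanks for talking. We can pick this up again anytime."
        _stage, prompt = self.stages[self.step]
        self.step += 1
        return prompt


dialogue = MIDialogue()
opening = dialogue.next_prompt()  # always starts with the engaging question
follow_up = dialogue.next_prompt("I want to be healthier")
```

The key design choice is that the chatbot can vary its wording, but never its stance: every turn respects the user's autonomy because the stage sequence itself encodes the counselling method.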

The rise of engaging chatbots

Başar began his research in 2020, when chatbots were still limited in their capabilities. “Most of the time, they were like interactive forms. You’d have a very narrow range of questions you could ask, and then you’d often get the same answers back. Successful health counselling requires that you as a patient feel engaged, that you come back and keep talking to your counsellor.” The more traditional chatbots were incapable of providing that form of engagement, being too restrictive and repetitive.

The rise of large language models over the years, and the introduction of ChatGPT in 2022, changed matters, explains Başar. “This generation of chatbots uses large language models to bring more flexibility and variety to conversations. You never get the same response twice, and conversations can be tailored to your needs much more effectively. Suddenly, people were talking to chatbots for weeks or even months, which is also crucial for long-term behaviour change. However, this could have unintended consequences for health, since these models are designed by AI companies to keep users engaged, not to offer real counselling or expert advice. For this reason, we recommend against using general-purpose AI for health behaviour change advice. Instead, we need to find the middle ground, and develop engaging chatbots that are specifically designed for certain topics, so they can provide guidance safely and ethically.”
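One common way to build such a "middle ground" system, sketched here under stated assumptions (the frame text and function names are hypothetical, not from Başar's thesis), is to let a large language model supply varied phrasing while an expert-authored frame fixes what the model may talk about and how:

```python
# A hypothetical illustration of constraining a general-purpose LLM with
# an expert-written frame. Only the prompt construction is shown; the
# actual model call would depend on the API being used.

EXPERT_FRAME = """You are a smoking-cessation support assistant.
Follow motivational-interviewing principles: ask open questions,
affirm the user's autonomy, and never prescribe medical treatment.
If the user asks for medical advice, refer them to a professional."""


def build_constrained_prompt(user_message: str) -> list:
    """Combine the fixed expert frame with the user's turn, using the
    system/user chat-message format that most LLM APIs accept."""
    return [
        {"role": "system", "content": EXPERT_FRAME},
        {"role": "user", "content": user_message},
    ]


messages = build_constrained_prompt("I keep relapsing at night.")
# messages[0] always carries the expert guardrails, whatever the user types.
```

The point is that engagement comes from the model's flexible language, while safety comes from the frame the model cannot override on its own.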

Human expertise remains essential

Despite these advances, Başar does not see chatbots as a replacement for medical professionals. “A counsellor is crucial to understanding your specific situation, and to providing regular feedback and adjusting the treatment you need. But imagine you see your counsellor regularly because you want to quit smoking or discuss sexual health: you can’t reach them 24/7. If you’re craving a cigarette at midnight, a chatbot can be there in those moments when a human can’t. It can’t do everything human counsellors can, but it can provide just enough support when you need it. This is about increasing availability, not replacing human expertise.”

Contact information

For further information, please contact Erkan Başar or the Press office & science communication via 024 361 6000 or media[at]ru[dot]nl.

Theme
Behaviour, Artificial intelligence (AI), Health & Healthcare