Chatbots on the rise but failing customers

The market value of chatbots is expected to reach more than $162 billion within four years, yet users are still frustrated by the online assistants.

Queensland researchers surveyed 145 participants and found that aggression increased when customers were told only late in the service encounter that a human interaction option was available.

The Queensland University of Technology research demonstrates how a chatbot service failure during an encounter can have different effects on customers' intention to engage in aggression, depending on how the encounter is handled.

For that reason, the researchers suggest companies using chatbots rewrite their scripts to offer human interaction as an option early on.

The mini online helpers, set to be used by 95 per cent of online customers within three years, are found across online service industries worldwide.

When scripted correctly their benefits are enormous, guiding customers directly to their desired purchases through an automated conversation.

But they often fail to meet customers’ expectations, can undermine the customer service experience and lead to service failures, QUT Associate Professor Paula Dootson said.

“In Japan, a hotel virtual assistant robot was fired in 2019 for repeated malfunctions such as mistaking snoring for voice commands and waking guests – a critical service delivery failure,” she said.

“Beyond issues of voice recognition, the scripts chatbots rely on to respond to customers can become problematic when the chatbot does not correctly interpret a request, which makes it challenging for the chatbot assistant to respond in a meaningful way to the customer.

“Users can then feel frustrated and angry, become reluctant to use chatbots in the future, are less likely to make the purchase or even switch to using another service provider entirely.”

Prof Dootson’s research, conducted with Assistant Professor Yu-Shan (Sandy) Huang of Texas A&M University-Corpus Christi, found that late notice of the availability of human assistance prompted aggression.

“Our results indicate that disclosing the option to engage with a human employee late in the chatbot interaction, after the service failure, increased the likelihood of emotion-focused coping, which can lead to customer aggression,” Prof Dootson said.

As a result, potential customers can be driven to alternative websites.

The solution, Prof Dootson said, is for organisations to disclose options for human interaction early in the interaction, “thereby making customers aware of the possible human intervention prior to the occurrence of chatbot service failures”.
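As a loose illustration only, and not something drawn from the QUT study, a chatbot script built on this advice might surface the human-handoff option in its opening message rather than after a failure. The Python sketch below is hypothetical; every name in it is invented for the example.

```python
# Hypothetical sketch only: one way a chatbot script could disclose the
# human-handoff option up front rather than after a failure. All names
# are invented for illustration; nothing here comes from the QUT study.

HANDOFF_NOTICE = "You can ask to speak with a human team member at any time."

# A toy knowledge base standing in for whatever the real bot can answer.
FAQ = {
    "hours": "We're open 9am-5pm on weekdays.",
    "fees": "Our fee schedule is available on the pricing page.",
}

def greet() -> str:
    # Early disclosure: the human option appears in the opening message,
    # before any chatbot service failure can occur.
    return f"Hi! I'm the virtual assistant. {HANDOFF_NOTICE}"

def wants_human(message: str) -> bool:
    # Detect an explicit request for a person.
    return any(word in message.lower() for word in ("human", "agent", "person"))

def handle(message: str) -> str:
    # Hand off as soon as the customer asks, or when the bot cannot
    # match the request, instead of looping through failed replies.
    if wants_human(message):
        return "No problem, connecting you with a team member now."
    for topic, answer in FAQ.items():
        if topic in message.lower():
            return answer
    return f"Sorry, I didn't catch that. {HANDOFF_NOTICE}"

if __name__ == "__main__":
    print(greet())
    print(handle("What are your hours?"))
    print(handle("I'd rather talk to a person."))
```

The design choice this sketch is meant to highlight is simply where the handoff notice sits: it is stated in the greeting, so customers already know a person is reachable before anything goes wrong.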

Fraser Barton
(Australian Associated Press)
