
Australia Post customer reveals bizarre messaging with chatbot, exposing a major flaw in the system


An Australia Post customer has shared a frustrating conversation with an online chatbot, revealing a major flaw in the system.

The customer posted the text conversation they had with the company's chatbot while attempting to update their mailing address.

The customer shared the strange chat last week via a post on social media site Reddit.

“Hello, I am moving into a new construction home and would like to forward my mail,” the customer wrote.

An Australia Post customer revealed an angry message exchange with a company chatbot


The Australia Post website states: ‘Where possible the chatbot will attempt to answer your question’ (stock image shown)

What follows is a simple misunderstanding: the chatbot asks the customer for a tracking number.

“Sorry, to check if the redirect is eligible, we need a valid tracking number,” the chatbot says, before asking the customer if they need anything else.

The customer explains that there is no tracking number, and a frustrating back-and-forth ensues.

The customer asks to speak to a human representative, but the chatbot seems stuck in a repetitive cycle.

“Please let me know what your question is about so I can direct you to the appropriate team for assistance,” it says.

In one response to the fed-up customer, the chatbot writes the bizarre word ‘doinks’.

Social media users shared in the poster’s frustration.

“It’s hard to find good employees these days,” read one comment.

“Sometimes you have to program a response that consists of an exclamation… so that customers feel like the agent is listening to them, interacting with them, or empathizing with them,” said another.

While a third said: “We just use ‘Oh dear!’ or ‘I see!’ — ‘Doinks’ is an odd choice.”

Another advised to “always use that old-fashioned thing called a telephone” when dealing with such matters.

The text exchange included a bizarre response telling the customer that their email information was private

The messages show the customer attempting to speak to a human representative

Others related to the post and revealed their own similar experiences with online chatbots.

“The point is not to help, but to give up in frustration,” said one person, calling the chatbot “hilariously bad.”

The Australia Post website states: ‘Where possible the chatbot will attempt to answer your question’.

‘If the chatbot doesn’t know the answer, it will connect you to one of our support agents, who are available during business hours to resolve more complex questions. And you can ask to speak to someone if you prefer,’ it says.

‘Our chatbot is built using machine learning, which means it gets better and smarter over time. If it doesn’t know the answer to a question you ask today, it will learn how to answer it in the future.’

In a statement to 7NEWS, Australia Post said the postal giant ‘continually works to improve the digital experience for customers’.

“Australia Post apologizes to customers for this poor experience and encourages them to call us,” a spokesperson said.

“While the majority of our chatbot interactions are working well, we continue to work to improve and enhance the service.”
