Digital advisors: Trust in digital advisors
Trust is considered the basis of good customer service. With growing digitization, customer contact increasingly takes place across a variety of channels, including digital advisors. This raises the question of how trust can be built via a digital channel.
Trust is a key factor when customers decide to contact a company or to purchase its products and services. The definition of trust differs depending on the field, but all definitions agree that it is an attitude or expectation towards someone or something one has to rely on, and that it involves accepting a degree of risk. In other words, trust implies a relationship and an element of dependence. In offline retail, employees play a key role in building trust: during interpersonal contact, trust is shaped by whether a person is perceived as likeable and competent. But how do things look when it comes to trust in the digital advisors used in customer contact?
Customers now expect to be able to access information at any time and to receive a response to their concerns or questions 24/7. To meet this demand for constant availability, chatbots are increasingly being deployed in customer service. Chatbots are digital dialogue systems which, for example, answer recurring customer queries around the clock, and they are one of the fastest-growing communication channels: more than 67% of consumers worldwide have interacted with a chatbot over the past 12 months. As deployment expands, user expectations rise as well, and these often include unrealistic expectations of what chatbots can do (Klein et al., 2019).
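A minimal sketch can make the idea of a chatbot that handles recurring queries more concrete. The example below is purely illustrative and not based on any particular product: the questions, keywords and answers are invented, and the bot simply matches keywords in an incoming message against a small set of FAQ rules, handing over to a human advisor otherwise.

```python
# Illustrative sketch of a rule-based FAQ bot for recurring customer queries.
# All rules and answers are invented for demonstration purposes.

FAQ_RULES = {
    ("opening", "hours"): "Our branches are open Monday to Friday, 8:00-18:00.",
    ("track", "parcel"): "You can track your parcel with its consignment number on our website.",
    ("reset", "password"): "Use the 'Forgot password' link on the login page to reset your password.",
}

FALLBACK = "I'm not sure I understood that. I'll forward your question to a human advisor."


def answer(user_message: str) -> str:
    """Return the canned answer whose keywords all appear in the message."""
    text = user_message.lower()
    for keywords, reply in FAQ_RULES.items():
        if all(word in text for word in keywords):
            return reply
    return FALLBACK


if __name__ == "__main__":
    print(answer("What are your opening hours?"))
    print(answer("Where can I track my parcel?"))
    print(answer("Can I pay in bitcoin?"))  # no rule matches, so the bot hands over
```

Real customer-service chatbots typically use trained intent classifiers rather than keyword lists, but the basic pattern of mapping recurring questions to vetted answers around the clock is the same.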
The literature describes the following factors that influence trust in chatbots (based on Nordheim et al., 2019 and Følstad et al., 2018):
- Competence and expertise of chatbots
- Humanness
- Appearance
- User-friendliness
A chatbot’s competence and humanness can, for example, be conveyed through its style of writing. If a chatbot provides factually correct information in a humorous and friendly way, user trust increases. A name or an avatar can also influence the humanness factor (Følstad et al., 2018). Avatars can be given a brand-specific design and used to create a persona (Mema et al., 2020), and they can also improve user engagement (Pearl, 2016). Studies indicate that abstract, artificial figures come across as more likeable and are perceived as more trustworthy than human-like avatars (Sieber, 2019; Mori et al., 2012). The extent to which communication between humans and machines will change in the future remains to be seen.
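These factors can also be read as design decisions. The following sketch, again purely illustrative with invented names and fields, shows one possible way to capture a persona (a name, an abstract avatar, a friendly tone) and to wrap a factually correct answer in that friendly style.

```python
# Illustrative sketch only: one way the trust factors above (competence,
# humanness via name/avatar, friendly writing style) could map onto a
# simple persona configuration. All names and fields are hypothetical.

from dataclasses import dataclass


@dataclass
class Persona:
    name: str       # a name supports the "humanness" factor
    avatar: str     # e.g. an abstract, brand-specific figure rather than a photo
    greeting: str   # friendly tone wrapped around the factual core of each answer
    sign_off: str


def render_reply(persona: Persona, factual_answer: str) -> str:
    """Wrap a factually correct answer (competence) in a friendly style (humanness)."""
    return f"{persona.greeting} {factual_answer} {persona.sign_off}"


bot = Persona(
    name="Dina",
    avatar="abstract_owl.svg",  # abstract figures were perceived as more trustworthy
    greeting="Happy to help!",
    sign_off="Is there anything else I can do for you?",
)

print(render_reply(bot, "Your parcel is scheduled for delivery tomorrow before noon."))
```

The key point is that the factual content and the persona are separate concerns: competence comes from correct answers, while the name, avatar and tone are a deliberate layer on top.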
Due to the current situation, Connecta Bern will again be held as a digital event in 2021. Connecta is renowned for shining a light on the diverse nature of digitization and this year will be no different with content presented across the three formats of Connecta Blog, Connecta TV and Connecta Talk. Find out more here: www.swisspost.ch/connecta.
Følstad, A., Nordheim, C. B., & Bjørkli, C. A. (2018). What Makes Users Trust a Chatbot for Customer Service? An Exploratory Interview Study. In S. Bodrunova (Ed.), Internet Science. INSCI 2018. Lecture Notes in Computer Science, vol. 11193. Springer, Cham. https://doi.org/10.1007/978-3-030-01437-7_16
Mema, D., Goehlich, V., & Morelli, F. (2020). Evaluierung der Wahrnehmung von Chatbots im Kundenservice zur Optimierung der Mensch-Maschine-Interaktion. Anwendungen und Konzepte der Wirtschaftsinformatik, Nr. 12. https://ojs-hslu.ch/ojs3211/index.php/akwi
Nordheim, C. B., Følstad, A., & Bjørkli, C. A. (2019). An Initial Model of Trust in Chatbots for Customer Service - Findings from a Questionnaire Study. Interacting with Computers, 31, 317-335.
Pearl, C. (2016). Designing Voice User Interfaces: Principles of Conversational Experiences. O'Reilly Media.
Sieber, A. (2019). Conversational Design. In: Dialogroboter. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-24393-7_5
Mori, M., MacDorman, K. F., & Kageki, N. (2012). The Uncanny Valley [From the Field]. IEEE Robotics & Automation Magazine, 19(2), 98-100.