Trusting Smart Speakers: Understanding the Different Levels of Trust between Technologies
Alec Wells, Aminu Bello Usman, Justin McKeown
Pages: 72-81   |   Revised: 31-05-2020   |   Published: 30-06-2020
Volume: 14   |   Issue: 2   |   Publication Date: June 2020
KEYWORDS
Direct Voice Input, Risk, Security, Technology and Trust.
ABSTRACT
The growing use of smart speakers raises privacy and trust concerns beyond those associated with other technologies such as smartphones and computers. In this empirical study, a proxy measure of trust is used to gauge users' opinions of three technologies and to understand which of them people are most likely to trust. The collected data were analyzed using the Kruskal-Wallis H test to determine whether users' trust levels differ statistically across the three technologies: smart speaker, computer and smartphone. The findings reveal that, despite the wide acceptance, ease of use and reputation of smart speakers, people find it difficult to trust them with sensitive information via Direct Voice Input (DVI) and prefer the keyboard or touchscreen input offered by computers and smartphones. These findings can inform future work on users' trust in technology as shaped by perceived ease of use, reputation, perceived credibility and the risk of using technologies via DVI.
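For illustration, a Kruskal-Wallis H test of the kind described in the abstract can be run on ordinal survey data with SciPy. The sketch below is not taken from the paper: the trust scores and variable names are hypothetical placeholders that only demonstrate the form of the analysis.

# Illustrative Kruskal-Wallis H test comparing trust ratings across three
# technologies. The scores below are hypothetical placeholders, not study data.
from scipy.stats import kruskal

# Hypothetical proxy-trust scores (e.g. 1-5 Likert responses) per technology
smart_speaker = [2, 3, 2, 1, 3, 2, 4, 2]
computer = [4, 5, 4, 3, 4, 5, 3, 4]
smart_phone = [3, 4, 4, 3, 5, 4, 3, 4]

# The test compares rank distributions of independent groups without assuming
# normality, which suits ordinal questionnaire responses.
h_stat, p_value = kruskal(smart_speaker, computer, smart_phone)
print(f"H = {h_stat:.3f}, p = {p_value:.4f}")

# A p-value below the chosen significance level (commonly 0.05) suggests that
# at least one technology's trust ratings differ from the others.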
Mr. Alec Wells
Department of Computer Science, York St John University, York, YO31 8JY - United Kingdom
alec.wells@yorksj.ac.uk
Mr. Aminu Bello Usman
Department of Computer Science, York St John University, York, YO31 8JY - United Kingdom
Dr. Justin McKeown
Department of Computer Science, York St John University, York, YO31 8JY - United Kingdom