Currently, businesses use a variety of artificial intelligence (AI) applications, such as service robots (Wirtz et al., 2018). Beyond their innumerable benefits, the quick and broad deployment of these applications has also raised a number of problematic issues (Honig & Oron-Gilad, 2018). For example, several studies focused on how people react to failing algorithms (Srinivasan & Sarial-Abi, 2021), while only a few investigated how people react when robots fail (e.g., Choi et al., 2021). Prominent marketing strategies have depicted resilient, well-engineered robots falling, failing, or being beaten up, aiming to evoke feelings such as empathy, warmth, or comfort. Despite it being a significant phenomenon, almost no previous research has investigated how consumers react to robots depicted as falling, failing, beaten up, or lost (see Table 1 for a list of popular robot failures and falls).
The most prominently portrayed type of robot failure in popular media is “the fall.” In this research, our goal is to investigate what people think and feel about the phenomenon of “failing robots” in the context of a “fall,” as consumers evaluate the same technology (i.e., robots) in somewhat diverse ways (Siino & Hinds, 2005; Gretzel & Murphy, 2019). We present the initial findings of our content analysis, which pinpoints the specific concepts consumers focused on when formulating their thoughts and feelings about falling robots.
We began our in-depth exploratory research by gathering open-ended consumer verbatims from a convenience sample of university students. Participants responded to a specific robot-fall news item, accompanied by a visual, which aided in the discovery of new and relevant issues centered on the “failing robots” theme. Visuals are frequently used in exploratory studies to aid in the probing of meanings and reactions (Christodoulides et al., 2021).
A convenience sample of eighty-eight (42 female) undergraduates studying business at a major European university took part in the current study in partial fulfillment of their course requirements. We did not prioritize generalizability or sample size, as these are not key criteria in qualitative sampling (Holloway & Jefferson, 2000). Participants’ mean age was 23.09 (SD = 2.50, range 19–34), and on average they reported a medium income level.
Participants were first shown a piece of news about a robot falling (Appendix 1). The accompanying photograph was captioned, “The fall of one of a robotics company’s robots at a trade show.” Participants then answered a few questions about their evaluation of the robot and the news depiction, as well as some control measures, including their involvement in robotics, their level of anthropomorphism toward the robot, and demographics.
Two researchers concurrently open-coded the short-essay reactions to the robot and the news, using manual coding to gain insight into the characteristics and dimensions of attitudes toward malfunctioning robots. The initial analysis of the consumer verbatims revealed several broad themes in how participants formed an opinion or attitude toward the issue. The following are some of the most notable examples in the three thematic categories: (1) seeing robots as the futuristic technological advancement of humans and getting upset about their falling; (2) seeing robots as yet another simple machine and not minding their falling much; and (3) in between: having mixed feelings about the robots’ falling.
Seeing robots as a futuristic technological advancement for humans:
“The sad part is that this shows that we are behind in terms of technology”
“They can develop themselves and improve their technology. Every success comes from after failure.”
“It wasn’t just a falling robot; the ideas and experiences fell, too.”
“I am not interested in how and why it fell off the stage at all, but I am curious about what is planned for the future of this robot in terms of AI improvements.”
Seeing robots as yet another simple machine:
“I feel nothing about the robot’s fall”
“I have no emotion about the robot’s fall”
“Since it’s a machine, a fault could appear at any time, which, on the other hand, shows that we cannot rely on robots 100% and that human interaction is gonna be always needed.”
“Since that was a robot, I did not feel anything.”
Having mixed feelings:
“This news is partly fun and partly sad”
“They’re exciting, but a little scary”
The first group was saddened by the robot’s fall and interpreted it as a sign that the technological advancement and the effort invested in it had been wasted; at best, they wanted to understand the key lessons so that the desired progress could still be achieved in future technologies. The second group did not regard the robot’s fall as important, seeing it as just another of the many machines that surround modern daily life; however, they still felt somewhat disappointed that humans were needed to compensate for the robot’s falls. Lastly, the third group had mixed feelings and reported confusion when they saw a robot fall.
Consumers may experience unexpected and mixed emotions after witnessing robotic failures, and these emotions may then influence their attitudes and related behavioral intentions. For example, previous research demonstrated that using service robots attenuated consumers’ embarrassment (Pitardi et al., 2021). It remains unknown which emotions and mechanisms lead consumers to evaluate failing robots in a favorable or unfavorable light. The apparent polarity reflected in the verbatims suggests that robot-fall strategies are a double-edged sword for robotic service providers, producers, and even retailers, and the mechanisms involved need to be analyzed in depth in order to avoid unanticipated and unintended consequences.