
Playful framings of social robots in dementia care: reconsidering the principle of transparency in interactions with robot animals

Published online by Cambridge University Press:  14 November 2024

Clara Iversen*
Affiliation:
Department of Social Work, Uppsala University, Uppsala, Sweden
Marcus Persson
Affiliation:
Department of Behavioural Sciences and Learning, Division of Education and Sociology, Linköping University, Linköping, Sweden
David Redmalm
Affiliation:
School of Health, Care and Social Welfare, Division of Sociology, Mälardalen University, Västerås, Sweden
Corresponding author: Clara Iversen; Email: clara.iversen@uu.se

Abstract

Research on social robots in dementia care has focused on their effects, for example in relation to the patients’ wellbeing or the care-givers’ working environment. Such approaches to social robots treat them as stable objects with a singular function. Combining social gerontology with social studies of science, the current study offers a new angle by asking: How do patients and care-givers in care homes for older people establish a shared definition of the situation in interactions involving robot animals? Drawing on ethnography and multimodal conversation analysis of 211 minutes of video recordings in two care homes in Sweden, we demonstrate the embodied work by which participants in interactions establish activities with robot animals. In contrast to the ideal of transparency in social robotics, we show that a central affordance of the robots is their vagueness, which allows for their inclusion in playful interactions. Playful framings of the robots highlight their social functions and downplay care-giver–patient asymmetries. However, situations where patients resist a playful frame actualise a dilemma of social inclusion, on the one hand, and the right to not participate in play, on the other. Showing this, the article contributes to knowledge on how people age with technology; in particular, it draws attention to the limits of an ideal of transparency when social robots are included in dementia care.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press.

Introduction

Scholars working at the intersection of ageing and science and technology studies have acknowledged that humans today become older with multiple others – not only other humans but also animals, technologies and things (e.g. Jenkins 2017). One such ‘other’ increasingly present in care for older persons is the robot animal. Robot animals are designed to react to talk, touch and movement – functions that aim to give the user a sense of interacting socially with another living being (see Alač 2016). Research on robot animals in dementia care has so far focused on their effects (e.g. Abbott et al. 2019; Moyle et al. 2018), for example in calming agitated patients or offering a conversation starter (Pfadenhauer and Dukat 2015), but less is known about how care-givers and persons with dementia actually approach and make sense of social robots (Persson et al. 2021). To advance the understanding of older people in their social and cultural contexts, we ask the following research question: How do patients and care-givers in care homes for older people establish a shared definition of the situation in interactions involving robot animals?

In social robotics, the principle of transparency has been promoted to avoid interactional problems related to deception or disappointment when people interact with robots. This means that robots’ appearance and behaviour should display their actual capacities (e.g. Złotowski et al. 2020). The principle can be understood as particularly relevant in dementia care, where patients may have difficulties navigating reality (Kontos 2004; Örulv and Hydén 2006). While there is a general acceptance of a limited use of deception to promote wellbeing in dementia care (e.g. Hunter et al. 2016), guidelines concerning social robots tend to describe avoiding deception as a key part of treating patients with dignity (Blackman 2013). For instance, in Sweden, where the current study takes place, the Medical Ethics Council states that patients should not be led to believe that the robot has capabilities that it in fact lacks (Statens medicin-etiska råd [SMER (Swedish Medical Ethics Council)] 2014, 60). One way to minimise this type of misunderstanding is to ensure that everyone involved has complete information about the robot’s skills (SMER 2014). Such a transparent approach may seem like an obvious ethical standard at first sight. However, always prioritising information over situational sensitivity can lead to instrumental interactions (see Fine 1983), potentially interfering with how care-givers and patients relate to each other. By studying how meaning emerges when social robots are used, we can shed light on how ethical guidelines relate to actual practice (see Peräkylä and Vehviläinen 2003), which is important as technology is increasingly introduced in care.

When studying how people with dementia make sense of themselves and their surroundings, analyses based on video-recorded interactions can make up for reduced verbal capacity (Kontos 2004; Schneider et al. 2019). Therefore, we draw on the detailed qualitative approach of multimodal conversation analysis (Mondada 2018) to study the embodied and verbal resources in play when meaning emerges between humans and robots. In this way, we make the methodological choice to set aside ‘normative assumptions [as] to what it means to be human’ (Jenkins 2017, 1487) and account for both humans’ and robots’ verbal and embodied conduct.

Literature review: interactions with social robots in care for older people

Social robots in care for older people

Research on how people manage the reality of social robots is still scarce. In studies of social robots in care for older people, there is an overwhelming focus on robots’ impact on patients’ wellbeing. For instance, robots can be used in dementia care to reduce stress and anxiety and help people remember their life stories (e.g. Abbott et al. 2019; Moyle et al. 2018; Pfadenhauer and Dukat 2015). Studies have also analysed the effects on the workforce in care settings, as robots can replace and/or facilitate care-giver–patient interactions (e.g. Jung et al. 2017; Roger et al. 2012). Related to the reduced involvement of care-givers, there are also studies that caution against the use of robot animals, arguing that they can increase older people’s isolation (Sharkey 2014) and negatively affect their dignity (Parks 2010).

Several interview studies have focused on patients’ and care-givers’ views of social robots (e.g. Birks et al. 2016; Johansson-Pajala et al. 2020); such studies tend to show humans’ idealised understanding of robots rather than their actual use. Moreover, they often take for granted a distinction between autonomous human actors and passive tools or instruments; humans are seen as ascribing meanings to robots, which are waiting to be deployed. This approach downplays how the robots’ own affordances can shape human–robot interactions, thus neglecting the co-constitution of ageing and technology (Peine and Neven 2021). In a video-based study where dementia patients in a hospital setting were interviewed while using an animal robot, Hung et al. (2021) note that the robot is treated as a ‘buddy’, a conversational piece or a source of comfort during the potentially stressful hospital visit. While drawing on situations framed by the interviewers’ questions, the study shows how video-based research can highlight different ways of understanding and responding to robot animals.

Interactions with social robots

The details of how care-givers use and make sense of robots in interaction with patients have received less attention (Persson et al. 2023), although such knowledge can inform us about people’s own orientations towards the benefits and risks of social robots (Flinkfeldt et al. 2022). Existing studies of actual encounters between humans and social robots in other contexts than care for older persons have examined how humans understand robots’ interactional contributions and how agency is distributed between people and robots. In a study of a social robot in preschool, Alač (2016) shows that the robot is treated as a living creature while it is handled as a material thing. For example, a teacher’s description, ‘she is just a robot’, illustrates the way in which the robot is gendered as if it were a real person, while at the same time being denoted ‘just’ a robot (also see Hung et al. 2021). Studying a tele-operated childlike robot interacting with people with acquired brain injuries, Krummheuer (2016) also found flexible orientations to the robot’s ontological status: it was treated as an autonomous actor, a mediator of the operator and a hybrid, in which robot and operator merged into one actor. The robot made it possible to play with established identities; for example, a patient could become a care-giver (Krummheuer 2016; also see Hung et al. 2021).

In these studies, the interactions with robots took place with a facilitator who could stage, monitor and repair problems in interactions (also see Chevallier 2023). By contrast, Tuncer et al. (2023) show that in unsupervised interactions between persons and robots, for example during museum visits, problems arose when the robot was not able to understand relevant responses to its own questions, such as pointing in response to a ‘where?’ question. By examining video-recorded interactions in which care-givers and several persons with dementia engage with robot animals, the current study adds to the small but growing field of research on actual interactions with social robots. Showing how the reality of social robots is managed in dementia care advances our knowledge of how ageing with technology is produced in situated practice.

Theoretical framework: working consensus in an unknown territory

Frames as an approach to sense-making

Drawing on Goffman’s (1974) notion of frames, we examine the situated emergence of a working consensus. The concept of frames refers to the temporary, situational ‘schemata of interpretation’ (Goffman 1974, 21) of social actions that help interactional participants interpret the social situations in which they participate. Two main points of the concept are that (1) utterances need a reference to an activity to be understood and (2) the understanding of a situation is not static or given but built as people start to engage in interaction; their actions both assume and produce specific frames. Thus, Goffman’s (1974, 173–186) interactional approach to sense-making allows us to examine how care-givers and patients in interaction develop a temporary tacit agreement – a working consensus – related to ongoing activities with robots. In the analysis, we show the emergence of two such main frames: play and small talk.

The fact that play and small talk were the main activities highlights the social function of robots in the situations we recorded: both activities have often been defined in terms of their non-institutionality; rather than having instrumental goals, play and small talk sustain relations (Huizinga 1964; Malinowski 1936). At the same time, studies have shown that play and small talk are relevant for institutional tasks, for example how play can be a part of training social competence in group homes (Finlay et al. 2008) and how small talk can pave the way for advice in helplines (Iversen et al. 2022). In relation to dementia care, the line between activities having a purpose in themselves and a goal outside the situation is especially blurred, since being social is considered a crucial quality that dementia care aims to maintain and train (e.g. Birks et al. 2016).

Transparency and playfulness in care

Given the suggestions in social robotics for a principle of transparency to avoid interactional problems related to deception or disappointment, the focus on sense-making in this setting is particularly relevant (e.g. Złotowski et al. 2020). While the ideal of transparency – described in the introduction – can appear as a given, it may actually collide with central aspects of dementia care. For instance, prioritising knowledge-related interactional actions, such as giving information, runs the risk of preventing actors from relating to one another in playful ways (Fine 1983).

The importance of playfulness in relation to dementia is increasingly acknowledged (e.g. Killick 2013); for example, studies have identified playfulness as a key factor in activities aimed at promoting the health of persons with dementia, such as dancing (Kontos 2004; Wright 2018), making music (Dowlen et al. 2022) and gardening (Buse et al. 2023). Play can highlight patients’ abilities to initiate, modify and co-construct moments of engagement and imagination using verbal and non-verbal communication (Kontos et al. 2017; also see Buse et al. 2023). Our focus on the emergence of joint activity frames (Goffman 1974) highlights how robot animals can be involved in building playful activities, but also how this raises ethical questions regarding social inclusion versus integrity and the right not to play (also see Hansen 2023; Nordenfelt 2004).

By adopting a theoretical framework that enables us to examine how care-givers and patients approach robot animals in flexible ways, we can shed new light on the discussion about transparency in social robotics related to care for people with dementia. Our research question is therefore: How do patients and care-givers establish a shared definition of the situation in interactions involving robot animals?

Methods

Sampling and recruitment

The study draws on ethnographic fieldwork in five care homes in Sweden and multimodal conversation analysis of video recordings from two of the homes. Our inclusion criterion was that the care homes had to be using robot animals in their daily work. The care homes were all not-for-profit because our project was part of a research programme about the work environment in municipal organisations. We thus excluded for-profit homes and care homes that had just begun using robots (see Table 1).

Table 1. Care homes

All authors contacted care homes based on recommendations from our expert advisory team (with representatives from organisations working with care and social robotics) as well as via regional dementia care coordinators. We first presented the study to managers and care-givers, who gave their written consent to participate. Patients were given time to consider participation after receiving letters about the study from their care-givers. Iversen and Persson recorded situations based on the care-givers’ suggestions, often moments when the patients participating in the study were together in a social area. During our ethnographic fieldwork, the care-givers introduced us to patients who had provided written consent to be filmed during the ethnography, and we again requested permission to film after showing our video equipment. Because tablets are easy to use, we used them as our recording devices so that care-givers and researchers could set up quickly when the need arose. We could also show the recordings to the patients on the tablets to make sure that they understood what they had consented to.

Data collection and analysis

Ethnography is a suitable methodology for studying people’s sense-making in situated contexts (Hammersley and Atkinson 2007) and has proven useful for studying technology in care practice (e.g. Beedholm et al. 2015; Lipp 2024) – especially in regard to the use of social robots in care (Chevallier 2023; Wright 2018). Multimodal conversation analysis of video-recorded situations enriches this approach by making it possible to re-examine in detail how activity frames based on verbal and embodied resources emerge sequentially (Sidnell and Stivers 2013; Tuncer et al. 2023). Rather than adopting a normative frame, conversation analysis starts from ‘unmotivated looking’, that is, an effort to describe what the research participants treat as meaningful activities (Flinkfeldt et al. 2022). Combining these approaches, we make a methodological contribution – a contextualised and detailed analysis of situated sense-making.

Observations focusing on the care-givers’ work were conducted by one researcher at a time, and the duration spanned from one-day visits to smaller care homes (Redmalm) to week-long visits to larger care homes (Iversen, Persson and Redmalm). In total, the research team spent about 100 hours at care homes, documented in field notes taken during and after visits. In the current study, the notes are used to contextualise the recordings; for example, they provide information about the recurrence of situations that we filmed, what happened before and after the recordings, and other situations involving robots that we did not film. Studies focusing on the ethnographic fieldwork have been published elsewhere (Persson et al. 2023; Persson, Iversen et al. 2024; Persson, Thunman et al. 2024).

Video recordings were made by Iversen (S4–S8, Appendix 1) and Persson (S1–S3, Appendix 1). After obtaining the participants’ consent as described earlier, we started filming, placing the tablet on a nearby stand and positioning it to avoid filming patients who were not involved in the study. We sometimes participated and sometimes left the interaction, depending on the situation. In total, we recorded 211 minutes of interactions, including eight social situations (see Appendix 1).

Iversen used multimodal conversation analysis to examine the video recordings, while the ethnographic notes collected by all authors helped us contextualise this detailed analysis. Multimodal conversation analysis is based on conversation analysis, which is an empirically driven and microanalytic method developed from ethnomethodology that is used to analyse how people, turn-by-turn, make themselves understandable in social situations (e.g. Sidnell and Stivers 2013). In addition to analysing verbal conduct, multimodal conversation analysis adds the material aspects of interaction by drawing on video recordings and transcripts that show how embodied practices are timed in relation to verbal practices; for example, how a gaze can show that an utterance is directed at a particular other (Mondada 2018; see Appendix 2 for transcription conventions). The method has proven useful to study the interaction between people and social robots (Alač 2016) as well as how people with dementia use their material environment to accomplish social actions (Majlesi and Ekström 2016).

The analysis was conducted according to the following steps: (1) identifying a relevant phenomenon (the establishment of a working consensus) in the video recordings through data sessions with researchers in the conversation analysis community; (2) building a collection of instances (see Sidnell and Stivers 2013) where participants established a working consensus (20 instances), based on verbatim transcripts with notes about bodily behaviour; (3) transcribing in detail (Mondada 2018; Appendix 2); (4) examining each instance in the collection in terms of how communicative turns (including non-verbal) were composed (e.g. grammatically, prosodically and lexically) and how they built sequences of actions (i.e. how participants initiated actions and how others responded; see Sidnell and Stivers 2013); and (5) examining patterns in identified practices and actions, including identifying actions that pointed to similar activities (e.g. how exaggerated surprise, joking and laughter built a playful activity frame). Through these steps, we identified playfulness as well as small talk as key frames for establishing a working consensus. While playfulness, initiated by the care-giver or the patient, was present in the majority of examples in our collection (in 18 of 20 instances), small talk was also a common activity (12 of 20 instances).

Ethical issues

The study was approved by the Swedish Ethical Review Authority. A main ethical dilemma was, on the one hand, to make sure that patients who wanted to participate in the research were not excluded based on others’ judgements and, on the other, to avoid patients participating in research that they did not understand or consent to.

The participants could stop the filming at any time and request that recordings be deleted (i.e. process consent). Some patients who had given their written consent to participate declined participation when we were there, either via the care-giver or directly to us. There were also a care-giver and a patient who agreed to being recorded on audio but not on video (S4, Appendix 1). Being sensitive to such requests, we aimed to ensure that only care-givers and patients who understood what their participation involved were included in the study.

In the next section, we present the findings based on specific examples of activities, analysed in fine detail. The transcripts include only English translations to facilitate reading, but the Swedish originals can be found in Appendix 3. All names have been replaced by pseudonyms.

Findings: framing robots in play and small talk

The patients in the study have mild to moderate dementia and have lived in the care facilities from a few days to several years, so their familiarity with the robot animals and one another varies (see Table 2). The robot animals are of the same brand, mimicking real cats and dogs. They have pressure sensors under the fur, causing them to react to touch. The robot cat can move its head and one front paw. It can lie down as well as meow and purr with sound and vibrations. The robot dog has a small internal motor imitating a heartbeat. It can move its head and bark, triggered by both touch and sound.

Table 2. Patients participating in the video recordings

The analysis shows two primary activities in which social robots are involved: play and small talk. Using two examples that show patterns in the whole dataset, we first demonstrate how play emerges as a frame for making sense of robot animals. We then examine an example of how care-givers and patients can manage threats to a working consensus, drawing on practices connected to both small talk and play.

The emergence of a playful framing

In first encounters with robots, people rely on their own assumptions and on what they discover about the robot’s competencies in the course of their interaction (Tuncer et al. 2023). Using excerpts from two situations, we show how care-givers and patients establish a playful framing with embodied and verbal resources. Such orientations to playfulness were present in the majority of instances in our collection (18 out of 20), and in all but one of the recorded interactions (S6, Appendix 1). Playfulness was also a recurrent theme in our ethnographic notes.

In the first example (S1, Appendix 1), Lisbet, who had arrived at the care home the day before, meets the robot cat for the first time. A notable feature of this situation, recurrent in other interactions, is how the care-giver initially invites Lisbet to decide on how to treat the robot (as alive or not) and then participates in building a playful framing. Before we join them, Lisbet and Maj, another patient who had been staying at the care home longer, are sitting at a kitchen table, and the care-giver comes up from behind Lisbet, carrying the robot cat while reaching for the button that activates it. She addresses Lisbet, ‘Have you seen, Lisbet,’ thereby calling attention to the robot without specifying how it should be seen. Lisbet asks if the robot is alive and the care-giver holds the robot cat so that Lisbet can examine it. Lisbet looks and touches it, then answers her own question with ‘No.’ The care-giver not answering thus allows Lisbet to decide for herself about the robot’s character.

Maj, who is sitting on the other side of the care-giver, then asks, ‘Is it the cat?’ The definite form demonstrates recognition and assumes that there is a cat in the ward known to both the care-giver and Maj (see Schegloff 1996). The care-giver confirms Maj’s assumption, and the participants have thus established a working consensus about the robot – it is non-living and recognisable as ‘the cat’. Lisbet then makes a positive assessment:

Excerpt 1

08   LIS:   It is so beau:::tiful.
09          (.)
10   LIS:   Yes::
11          (1.7)
12   LIS:   ∘It is truly +really^ beautiful.∘
     car    -->+
     car    ^turns ROB and puts it in front of LIS^
13          (1.4)
14   ROB:   Meo*::w=
     car    *looks at LIS-->
15   CAR:   =.HHHo!
16          (.)
17   MAJ:   ∘∘>M[eow=meow<∘∘
18   ROB:        [Meo::w::
19          (0.5)
20   LIS:   Oh ¨dear!
     lis    ¨looks intensively at ROB¨

Here we see how the participants approach the robot cat in playful ways. Lisbet’s stance towards the robot is ambiguous (see Alač 2016; Hung et al. 2021): she treats it as an object by referring to the robot as ‘it’ (lines 8, 12), yet the petting and her soft voice are directed to the robot in a way that is common when people talk to babies, animals or other less competent actors (Ferguson 1964). Having established that the robot is not alive, she thus builds a pretend play frame in which she treats it as a real animal.

The care-giver gives Lisbet more time to interact with the robot by placing it in front of her (line 12). In response to Lisbet’s petting, the robot meows (line 14). This prompts a look from the care-giver at Lisbet, inviting a joint stance of surprise (line 15). Such tokens of surprise in response to the robots’ behaviour were common in the care-givers’ interactions with the patients. Showing surprise at an expected robot behaviour can be understood as exaggerated and enacted (see Wilkinson and Kitzinger 2006), contributing to the playful frame and building joyful anticipation towards the robot (Drew 1987). Maj’s response (‘meow=meow’, line 17) mimics the cat, thereby showing what she has heard and responding to the cat in line with the playful frame. Harjunpää (2022) has shown that mimicking is a recurrent practice in human–pet interaction, where the human uses an onomatopoeic expression imitating a pet’s sound to establish a brief conversational exchange. In the current situation, it can be understood as contributing to the pretend play where the robot is treated as a real pet.

The robot meows again, and Lisbet responds with ‘Oh dear’ (line 20), joining the care-giver’s stance towards the robot as doing something remarkable. It is usual for an extraordinary reality to be shared in play (Holt 2016), so Lisbet can be seen as building on the care-giver’s exaggerated surprise while at the same time emphasising her endearment with the cat.

Thus, we see the emerging meaning of the robot as not alive but still something that is relevant to respond to with endearment and excitement. In establishing this working consensus, embodied action is central. The care-giver’s vague presentation of the robot and lack of verbal responses give Lisbet first-hand access to examine and interact with the robot. This places her on an equal footing with the care-giver and encourages direct, rather than mediated, interaction between Lisbet and the robot.

In the next example (S5, Appendix 1), videotaped at a different care home, the patients Nora, Stig and Elisabet as well as the care-giver and the researcher are sitting by a coffee table. The patients know each other well and tell the researcher about one another (e.g. ‘Stig loves his plush dog’). The care-giver has brought a robot dog and is calling attention to it with the directive ‘Look’, inviting a joint focus and demonstrating her own interest in it (see Siitonen et al. 2021). She lifts it towards Nora, and when the robot barks, Nora responds to it with a confirming ‘Yes.’ Similar to the care-giver’s introduction of the robot cat to Lisbet, this practice allows Nora to decide upon a variety of actions concerning the robot.

Stig, who is sitting across the table from Nora, holding his own plush dog, comments on the robot dog’s barking (line 4):

Excerpt 2a

04   STI:   &That one* barks indeed.&
     car    -->*
     sti    &looks at ROB&
05   NOR:   ↑Haof=haof↑
06   STI:   ↑woof=woof↑
07   CAR:   +((Can)) you hea::r!+
     car    +Looks at RES then STI+
08          (.)
09   STI:   ^It [does?
     nor    ^shakes ROB-->
10   NOR:       [Hov hov.^
     nor    -->^
11   CAR:   /Ye:::s, ∘if you li[*ste@n∘
     nor    /looks at camera, then ROB-->
     car    *leans to STI, points up then at ROB-->
     sti    @leans against CAR, looks at ROB@
12   ROB:                       [r:vo: /rvo::[:
     nor    -->/
13   RES:                                    [Ye[a::h=
14   NOR:                                    [^↑Ye::::s!↑^
15   nor                                     ^nods towards ROB^
16   NOR:   =Hao [↑skvo↑ [^yes (yes yes)^
     nor    ^nodding^
17   CAR:        +[Oh dear*
     car    -->* +looks at STI-->
18   STI:        [£What’s this£ hah [hah hah
19   CAR:                           [Hah hah
20   STI:   [Hah hah hah] @is it him doing that=
     sti    @points-->
21   CAR:   [hah hah hah]
22   NOR:   [^Haov=haov^ heh heh heh heh
     nor    ^shakes ROB^

Nora responds to Stig by mimicking the dog’s bark (line 5), followed by a similar response from Stig (line 6). Besides creating an exchange with the robot (see Harjunpää 2022), by mimicking the robot Nora collaborates with Stig’s project of exploring it, offering her own experience to the other humans present. The care-giver collaborates in this exploration by acting as one of the observers, ‘Can you hear?’ (line 7). Similar to how she used ‘Look’ when she came with the robot, this prompts an action from the patient and describes what she herself is doing while refraining from instructing the others on how to approach the robot (see Siitonen et al. 2021).

Stig then asks, ‘It does?’ (line 9), treating the previous observation as remarkable and pursuing further exploration of the robot. Nora collaborates by shaking the robot, as a thing, and continuing to mimic its sound. The care-giver confirms and elaborates with the instruction, ‘if you listen’, leaning against Stig and pointing (line 11), thereby initiating joint listening with Stig (see Yasui 2023).

The robot makes a ringing bark, and we can see how the playful frame is now established among the participants: Nora involves the robot by mimicking its pitch and nodding towards it (lines 15–16); the care-giver gazes at Stig, inviting him to an exaggerated stance of disbelief (line 17); and Stig joins her by asking ‘what’s this’ with a smiley voice followed by laughter, thus taking a stance of wonder (line 18). This is followed by comments and laughter from the others. After more laughter, Stig again comments on the dog’s barking, now treating it as a person (‘he’), and then addresses it:

Excerpt 2b

25   STI:   £He barks£@ a hah You are a [funny one hah
     sti    -->@
     CAR:                               [hah hah hah
26          ah [ha hah
27   NOR:      [>£I’m &barking at you too£=>ah
     sti    &looks at NOR
28          woof!/<=[heh heh heh
     nor    -->/looks at RES
29   RES:           [Hah hah [hah
30   CAR:                    [Hah hah
31   STI:                    [woof woof woof

Nora furthers this playful frame by describing her reciprocal behaviour as ‘barking at you too’ before she produces an emphatic ‘Ah woof!’ (lines 27–28). Nora is talking to the dog, treating it as the recipient (lines 27–28). However, her gaze towards the researcher (line 28) indicates that her contribution is also for the benefit of the other participants. This generates heartfelt laughter from the researcher and the care-giver, and Stig smilingly offers a bark of his own.

In these two initial cases, care-givers and patients thus build playful frames – a shared fantasy (Fine 1983) – around the robots, starting with the care-givers designing their actions to be on the same level as the patients, exploring the robots’ nature by looking and hearing together rather than instructing and clarifying how the robots should be understood. We see patients taking different approaches to the robot, treating its characteristics as more or less known. The care-givers’ presentations of the robots and the subsequent enactment of exaggerated surprise allow for such different approaches while inviting a shared playful frame of excitement and joy. In the next section, we see how participants can manage potentially conflicting frames.

Navigating conflicting frames

Patients’ different stances towards robots sometimes generated interactional trouble, which characterised the interaction in five of our 20 examined instances (in S3, S4, S6, S8, Appendix 1). Three of them still ended up in playfulness (S4, S8) and two involved care-givers backing down from such efforts (S3, S6). We will now examine a case (S8) where a working consensus about activities involving a robot dog is established in the face of patients’ different initial definitions of the situation: a playful treatment of the dog as if it were alive, and a sceptical approach that does not include the robot as a participant. We will show how the participants navigate between these two framings and use small talk to establish a working consensus in the face of a robot-related conflict.

The care-giver, the patient Sigrid and the researcher have been sitting and talking around a kitchen table with a robot dog on it for about 20 minutes. Sigrid has approached the robot as a pet, asking who takes care of it and interacting playfully with it, asking what it wants. Sigrid has also mentioned that she does not have time to care for the dog because she needs to leave soon. In this sense, she can be understood as confabulating – that is, producing false narratives or statements owing to some pathological factors, but with no intention of lying. As a social phenomenon, confabulations can be understood as a way of making sense of the current situation, the self and the world, but they can also complicate the building of a shared world with others (Örulv and Hydén 2006). Care-givers therefore often respond to confabulation minimally, avoiding commitment to patients’ stories without explicitly challenging them (Lindholm 2015).

In this situation, the care-giver and the researcher have been answering Sigrid’s questions in ways that do not challenge her assumptions about the world (e.g. saying that the robot stays at the care home and that the care-giver will take it when Sigrid needs to go). Meanwhile, the patient Cristina has been sitting by herself in a chair nearby. Cristina and Sigrid know each other and talk sometimes, but the care-giver has reported that Cristina can become jealous if Sigrid gets ‘too much attention’. The care-giver has previously asked Cristina to join them, but she has declined, saying, ‘You don’t want me there.’ We enter the situation as the care-giver again tries to involve Cristina, saying, ‘We are sitting here talking a bit, Cristina, do you want to join us?’ This is a general description of their activity as small talk, not mentioning the robot dog but not excluding it either. This time, Cristina accepts the invitation but displays a disengaged stance (‘guess I can’). As Cristina comes over to the table, Sigrid is singing a short tune to the robot dog while moving it towards Cristina.

Excerpt 3a

34          (1.0)
35   SIG:   *Damdidamdidam%bap *(.) %dam dam di- ((singsong))
     sig    *turns ROB towards CRI*
     cri    %sits down%
36   SIG:   &I- Is it your dog?
     sig    &looks at CRI-->
37          (2.0)
38   CRI:   @What/Why?
     cri    @looks ahead-->
39          (0.7)
40   SIG:   Is it your dog?
41          (.)@(0.5)
     cri    @looks at SIG-->
42   SIG:   Do you have a dog?
43   CRI:   Hey I’m not @talk- I don’t answer such @ stupid &questions.
     cri    -->@looks away from SIG at CAR@
     sig    &looks at CAR-->

Sigrid’s singing and embodied actions (line 35) are vague, but turning the robot towards Cristina offers her the possibility to acknowledge it (see Butler et al. 2016). When Cristina does not do so, Sigrid upgrades her efforts to involve Cristina by looking at her and asking the yes-or-no question, ‘Is it your dog?’ (line 36). This can be understood as a way of including Cristina in the previous activity by making her presence relevant. Cristina delays her answer (line 37) and then asks ‘What’ or ‘Why’ (line 38) without reciprocating Sigrid’s gaze. Sigrid treats it as trouble related to hearing by repeating the question. Cristina delays her answer again (line 41), and Sigrid redesigns her question, asking, ‘Do you have a dog?’ (line 42). This question stays with the topic, but, since it does not concern the robot dog, Sigrid can be understood as moving from the playful frame that involves the robot as a participant to a small talk frame not dependent on this specific robot dog, thus adjusting to Cristina’s resistance to play.

Cristina quickly returns with a meta comment, assessing Sigrid’s questions as ‘stupid’ (line 43), thus resisting Sigrid’s project of small talk, too. She thereby threatens the working consensus. Cristina looks at the care-giver, further dismissing Sigrid as a competent actor. Sigrid also gazes at the care-giver, with a puzzled and sad look on her face (line 43). In this sense, both patients are oriented towards a communicative breakdown. Sigrid is in a vulnerable situation as her interactional efforts have been criticised, and Cristina risks being told that her behaviour is unacceptable. The care-giver responds to this threatening situation by leaning towards Cristina, designing her intervening turn as information (lines 44–45):

44          (0.7)¨(0.5)
     car    ¨leans towards CRI-->
45   CAR:   We are sitting and, y’know, talking about [different-
46   ROB:                                             [Woof woof
47   CAR:   you know that we have had dogs and I
48          have y’know too had ((a)) dog &Cristina=
     sig    &looks at CRI and smiles
49   CRI:   =I haven’t had a dog.
50   CAR:   No.
51   SIG:   O::::h you haven’t had a dog.

By telling Cristina about the activity, ‘We’re sitting and, y’know, talking about different-’ (line 45), the care-giver orients to the need to re-establish a working consensus based on a lack of understanding rather than ill will. She also supports Sigrid’s project of small talk about dogs. The use of the common knowledge token ‘y’know’ treats this as something Cristina should have known and can therefore be heard as a gentle reprimand (see Heinemann et al. 2011). The care-giver then changes tack to inform Cristina about what they have said, treating her as though she does not know about their previous conversation. The reprimand is thereby embedded in information and does not require a remedial response from Cristina. The care-giver switches from Sigrid’s present tense ‘Do you have a dog?’ (line 42) to the present perfect tense ‘have had dogs’ (line 47), which relaxes the need to relate this talk to the present situation and the robot’s ontological status. Thus, while supporting Sigrid, treating the activity she initiated as legitimate, the care-giver is sensitive to Cristina’s resistance.

Then the care-giver tells Cristina that she also has had a dog herself, moving from instructing Cristina about the activity to acting it out – doing small talk. By including herself in the self-disclosing activity, she levels asymmetries related to the care-giver–caretaker roles, potentially invoked as she instructed Cristina about their activity. As the care-giver talks, Sigrid looks at her and smiles, thus aligning with the activity of talking about dogs (line 48). By finishing with this declarative (‘I have y’know too had a dog’), the care-giver also relaxes the need for Cristina to respond. However, Cristina then says, ‘I haven’t had a dog’ (line 49). She thus moves away from her strong resistance to instead participate in the activity of talking about previous dogs, accepting the care-giver’s efforts to re-establish a working consensus. With a drawn-out change-of-state token (‘Oh’), Sigrid responds to Cristina’s response as an account; the contrastive stress on ‘had’ (line 51) treats not having had a dog as the reason for the trouble before.

They talk more about dogs and pets, and when the researcher reveals that she has pet rats at home, this generates a shared horrified reaction from the others, with Cristina remarking, ‘I wouldn’t even tell people about this’, and the others laughing. The situation is wrapped up with the care-giver saying that she will put the robot on a table, where he can sit on guard. Sigrid asks, ‘Are you kidding, does he sit there?’ and Cristina responds, ‘Yes, it’s not a toy.’ The care-giver agrees and says, ‘but one can touch him’, and hugs the robot, then moves it towards Cristina:

01   ROB:   Woof!
02   CAR:   Yeah, he ¨tries to flirt with you too Cristina=¨
     car    ¨bends towards CRI¨
03   CRI:   =I saw that=[%yes
     cri    %smiles and looks at ROB-->
04   CAR:               [You did, yeah he tr[ies y’know
05   CRI:                                   [Yeah I saw
06          that, Yeah yeah
07   CRI:   Yeah that’s fine
08          (.)
09   CAR:   He probably thinks that you are ¨%a bit of a hard flirt.¨
     car    ¨leans forward, hand on CRI¨
     cri    %big smile-->
10   SIG:   HAH HAH HAH

Here we can note how the care-giver, closing down the small talk about pets, builds a playful frame around the robot involving Cristina. She ascribes a flirting behaviour to the dog, which is connected to playfulness as it touches upon a potential taboo (see Holt 2016). Cristina joins in the playful stance by confirming – now smiling – that she herself has observed this (lines 5–6). The care-giver takes it one step further by implicitly bringing up Cristina’s previous resistance and framing it playfully as ‘a bit of a hard flirt’ (line 9), to which Cristina responds with a smile. Sigrid joins with a big laugh. The care-giver’s actions here are retrospectively reproducing Cristina’s resistant actions in a playful frame.

To sum up, in a situation where one patient treats the robot as a ratified participant and another treats this as wrong, the working consensus about what activity to engage in is threatened. By adjusting the playful frame to small talk about past dogs and acting as a role model, the care-giver averts the threat, with collaboration from Sigrid. From there, the participants eventually build a playful frame together. The care-giver’s work can be understood as moving between helping the patients notice certain aspects of the dog, averting trouble and acting as a participant on the same level as the patients.

Discussion

By combining multimodal conversation analysis and ethnography to examine in detail embodied and verbal practices in interactions with robot animals, this study has answered the question of how care-givers and patients establish a shared definition of the situation. The findings show that the interactions were characterised by playfulness. Together with the vague introductions of the robots, playfulness allowed the patients to differ in their approaches to the robots but still interact together. In addition, we showed how participants achieved a working consensus in the face of conflicting frames. Based on these findings, we next discuss two ethical questions: first, the ideal of transparency; and second, how the right not to play can be understood and recognised.

Playfulness, vagueness and the ideal of transparency

By detailing how a technical innovation becomes socially meaningful in care for older persons, our study contributes to research on ageing and society (e.g. Jenkins 2017; Mok and Müller 2014; Peine and Neven 2021; Roger et al. 2012). In particular, by showing how vagueness is an interactional resource when care-givers and patients interact with robots, our findings shed new light on the transparency ideal in social robotics related to dementia care (SMER 2014; Złotowski et al. 2020).

Studies in human–robot interaction have shown that vagueness related to humanoid robots’ capabilities can disrupt interactional progress (Tuncer et al. 2023). Furthermore, in dementia care, vagueness can be seen as a threat because it increases the risk of deceiving people who are vulnerable to experiencing distortions of reality (Blackman 2013; Örulv and Hydén 2006; SMER 2014). In the current study, vague practices did not engender problems related to misunderstanding or deception. In contrast, we show that the ambiguous nature of robots, which has been noted before (see Alač 2016; Hung et al. 2021; Krummheuer 2016; Persson, Iversen et al. 2024), makes flexible use possible. This flexibility might be accentuated by the pet form: animals can be treated as both present and absent in interactions, and their interactional contributions do not necessarily warrant responses. Therefore, we argue that while transparency is likely necessary in activities related to information transmission, situations characterised by play, humour and small talk may benefit from vague definitions of the situation. The findings of this study therefore point to the relevance of reconsidering the principle of transparency in social robotics (see Dunn et al. 2013).

Vagueness is a known interactional resource in situations where sensitive topics are discussed, such as in end-of-life care (Parry et al. 2014). Similar to findings in such settings, in our study vagueness was a resource for care-givers to increase patients’ opportunities to decide on activity frameworks when interacting with robot animals. In addition, we found that a lack of understanding was treated as generating wonder and surprise – making the robots exciting rather than difficult. While care-givers had a central role as facilitators (see Alač 2016; Ferm et al. 2015; Krummheuer 2016), this role was connected to encouraging and enabling playful interactions, thus acting as a participant rather than an authority. Play can be an inclusive activity; our analysis shows examples where it includes pretence that may be related to difficulties distinguishing between reality and fabrication (Lindholm 2015) as well as playfulness related to just laughing together (Holt 2016). Thus, the findings support studies that have highlighted the intertwined nature of sociability and institutional goals in institutional talk (e.g. Finlay et al. 2008; Iversen et al. 2022). The aim of therapeutic interventions for people with dementia is often external to social interactions, such as improving patients’ memory or practising their cognitive skills, thereby adhering to a specific normative ideal of what it is to be an adult human (Jenkins 2017). Quinn and Blandon (2020), among others, have called for an extended use of playful interventions that turn the focus away from attempts to push back a decline in cognitive abilities towards finding ways of validating patients’ ‘energy, vitality and resistance’ (Quinn and Blandon 2020, 19; also see Buse et al. 2023; Dunn et al. 2013; Kontos et al. 2017). Interactions involving robots that support playfulness and experimentation could potentially contribute to therapeutic interventions in dementia care, validating patients’ ways of being.

Social inclusion versus the right not to play

Our focus on collaborative sense-making (Goffman 1974) also highlights how different frames might result in schisms and suggests that this is a question at the core of the ethical issues surrounding robot animals in dementia care. In relation to ageing with technology, dignity is sometimes equated with managing ‘without the help of human hands’ (Hansen 2023, 1015). Previous research has therefore warned that social robots risk increasing social isolation and impacting negatively on patients’ dignity (e.g. Parks 2010; Sharkey 2014). However, Hung et al. (2021) show that a sense of inclusion – for example, an experience of being liked by the robot – is a key benefit that social robots can promote. In line with this, our findings suggest a different ethical problem: when using social robots, care-givers need to balance inclusion with the right not to be playful.

Ethical guidelines concerning dignity stress the importance of respecting a person’s right to integrity (e.g. United Nations 2006, Article 17) and scholars have acknowledged that care interventions in older age can potentially threaten this (e.g. Hansen 2023; Nordenfelt 2004). In our final example, both the care-giver and Sigrid responded flexibly to Cristina’s resistance by moving from play to small talk, thereby prioritising inclusion over radically different ways of making sense of the robot. The care-giver’s subsequent work to include Cristina in a playful frame is reminiscent of findings by Finlay et al. (2008), which show how the staff in a care home for persons with learning difficulties treat a lack of responses to play initiations as temporary resistance and a part of teasing and playfulness. By reframing initial resistance, staff members could overcome schisms. While this is related to the institutional imperative to encourage social interaction, it also involves the risk of disrespecting patients’ choice to stay passive – and thus their integrity (compare with Nordenfelt 2004).

Given the potential threats to integrity that an illness such as dementia carries with it (see Tranvåg et al. 2013), it may be especially important that care-givers respect that not all patients approach robot animals in playful ways. Overzealous directions and encouragement can tip over into treating patients as incompetent, and, if care-givers pursue playfulness in the face of resistance, patients may feel that care-givers are laughing at them rather than with them (see Finlay et al. 2008). Although the care-giver’s actions in our example successfully included all participants in a small talk frame and then in playfulness, they downplayed Cristina’s resistance to treating the robot as a ratified participant. This example thus highlights the complex decisions care-givers need to be prepared to make in the moment when playfulness is part of care (Kontos et al. 2017; Złotowski et al. 2020).

Limitations

The ethical principles guiding the study have brought limitations to it. One limitation is that we asked for consent when care-givers thought that this was suitable, and as a result we included only patients with mild to moderate dementia. Previous studies (e.g. Jøranson et al. 2016; Takayanagi et al. 2014) have shown that care-givers are more supportive when patients with mild dementia interact with social robots and that patients with severe dementia may experience issues with robots owing to attention problems. However, our ethnography (Persson, Iversen et al. 2024) indicates that patients with severe dementia are sometimes more ready to accept robot animals. The use of social robots by patients with severe dementia is thus an area in need of study that the current article does not address.

Another limitation is that, while providing insights into the details of how patients and care-givers establish a working consensus, the findings are restricted to showing examples that the care-givers and patients agreed were suitable to film. While we did not notice any different behaviours when the camera was off, the fact that we participated in some interactions meant that the patients and care-givers treated us as guests. We tried to blend in, adjusting our behaviour to that of the other participants, but both care-givers and patients knew that we were interested in the robots, which may have led to their having a more prominent role in interactions than they would have had without us present. Similarly, care-givers may have wanted to demonstrate good practice. This means that we have not necessarily covered the most relevant frameworks when social robots are involved. While our ethnographic work confirmed that social situations such as the ones we filmed were regular at the care home and not staged for us, they were not the only ones. For example, patients often used robot animals alone in their own rooms or apartments, and robot animals were regularly introduced when a patient was agitated (Persson et al. 2023; Persson, Thunman et al. 2024). Thus, our study is limited to examining the robots’ emerging meaning in social situations seen as non-sensitive, involving care-givers and multiple patients with mild to moderate dementia.

Conclusion

The study contributes to showing certain aspects of how robot technologies ‘become[] relevant in the complex relations and social setting in which they unfold their social and interactive power’ (Krummheuer 2016, 887) – namely, the robots’ construction, which allows people to interact playfully with them and with each other without firmly establishing their ontological status. By showing the stepwise emergence of different frames, the study also calls for complementing interview studies and the rich ethnographic accounts typical of science and technology studies with detailed interactional analyses. The unproblematic treatment of the lack of transparency about the robots’ nature, as well as the prioritisation of inclusion over the choice not to play, highlights the importance of studying in detail how care-givers and patients interact with social robots in their everyday lives, not just in situations guided by researchers’ interview questions. To fully explore the co-constitution of ageing and technology (Peine and Neven 2021), we need to study how it unfolds in interaction.

Implications for practice

Although the study draws on a small dataset, the findings demonstrate the need to revisit ethical questions related to the use of robot animals in dementia care. We show that the principle of transparency as an ethical ideal may conflict with activities such as play and small talk, which can validate patients’ ways of being. However, when playfulness involves boundary-pushing practices such as teasing, extra care is required to safeguard patients’ right not to participate in play.

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/S0144686X24000539.

Author contributions

The contributions were: conception of the work, CI, MP and DR; acquisition of the data, CI and MP; analysis of the data, CI; drafting of the work, CI; revising it critically for important intellectual content, CI, MP and DR; and final approval of the version to be published, CI, MP and DR.

Competing interests

The authors declare none.

Ethical standards

This study was approved by the Ethics Authority in Sweden (Dnr. 2020-01853).

References

Abbott, R, Orr, N, McGill, P, Whear, R, Bethel, A, Garside, R, Stein, K and Thompson-Coon, J (2019) How do ‘robopets’ impact the health and well-being of patients in care homes? A systematic review of qualitative and quantitative evidence. International Journal of Older People Nursing 14. https://doi.org/10.1111/opn.12239.
Alač, M (2016) Social robots: Things or agents? AI and Society 31, 519–535. https://doi.org/10.1007/s00146-015-0631-6.
Beedholm, K, Frederiksen, K, Frederiksen, AMS and Lomborg, K (2015) Attitudes to a robot bathtub in Danish elder care: A hermeneutic interview study. Nursing and Health Sciences 17, 280–286. http://doi.org/10.1111/nhs.12184.
Birks, M, Bodak, M, Barlas, J, Harwood, J and Pether, M (2016) Robotic seals as therapeutic tools in an aged care facility: A qualitative study. Journal of Aging Research 2016. https://doi.org/10.1155/2016/8569602.
Blackman, T (2013) Care robots for the supermarket shelf: A product gap in assistive technologies. Ageing & Society 33, 763–781. https://doi.org/10.1017/S0144686X1200027X.
Buse, C, Balmer, A, Keady, J, Nettleton, S and Swift, S (2023) ‘Ways of being’ in the domestic garden for people living with dementia: Doing, sensing and playing. Ageing & Society, 1–25. https://doi.org/10.1017/S0144686X22001489.
Butler, CW, Duncombe, R, Mason, C and Sandford, R (2016) Recruitments, engagements, and partitions: Managing participation in play. International Journal of Play 5, 47–63. https://doi.org/10.1080/21594937.2016.1147287.
Chevallier, M (2023) Staging Paro: The care of making robot(s) care. Social Studies of Science 53, 635–659. https://doi.org/10.1177/03063127221126148.
Dowlen, R, Keady, J, Milligan, C, Swarbrick, C, Ponsillo, N, Geddes, L and Riley, B (2022) In the moment with music: An exploration of the embodied and sensory experiences of people living with dementia during improvised music-making. Ageing & Society 42, 2642–2664. https://doi.org/10.1017/S0144686X21000210.
Drew, P (1987) Po-faced receipts of teases. Linguistics 25, 219–253. https://doi.org/10.1515/ling.1987.25.1.219.
Dunn, J, Balfour, M, Moyle, W, Cooke, M, Martin, K, Crystal, C and Yen, A (2013) Playfully engaging people living with dementia: Searching for Yum Cha moments. International Journal of Play 2, 174–186. https://doi.org/10.1080/21594937.2013.852052.
Ferguson, CA (1964) Baby talk in six languages. American Anthropologist 66, 103–114. https://doi.org/10.1525/aa.1964.66.suppl_3.02a00060.
Ferm, UM, Claesson, BK, Ottesjö, C and Ericsson, S (2015) Participation and enjoyment in play with a robot between children with cerebral palsy who use AAC and their peers. Augmentative and Alternative Communication 31, 108–123. https://doi.org/10.3109/07434618.2015.1029141.
Fine, GA (1983) Shared Fantasy: Role Playing Games as Social Worlds. Chicago, IL: University of Chicago Press.
Finlay, WML, Antaki, C, Walton, C and Stribling, P (2008) The dilemma for staff in ‘playing a game’ with a person with profound intellectual disabilities: Empowerment, inclusion and competence in interactional practice. Sociology of Health and Illness 30, 531–549. https://doi.org/10.1111/j.1467-9566.2007.01080.x.
Flinkfeldt, M, Iversen, C, Jørgensen, SE, Monteiro, D and Wilkins, D (2022) Conversation analysis in social work research: A scoping review. Qualitative Social Work 21, 1011–1042. https://doi.org/10.1177/14733250221124215.
Goffman, E (1974) Frame Analysis: An Essay on the Organization of Experience. New York: Harper & Row.
Hammersley, M and Atkinson, P (2007) Ethnography: Principles in Practice. Abingdon: Routledge.
Hansen, AM (2023) Dignity equals distance? Pursuing dignity in care for older adults. Ageing & Society 43, 1003–1021. https://doi.org/10.1017/S0144686X21000891.
Harjunpää, K (2022) Repetition and prosodic matching in responding to pets’ vocalizations. Langage et Société 176, 69–102. https://doi.org/10.3917/ls.176.0071.
Heinemann, T, Lindström, A and Steensig, J (2011) Addressing epistemic incongruence in question-answer sequences through the use of epistemic adverbs. In Stivers, T, Mondada, L and Steensig, J (eds), The Morality of Knowledge in Conversation. New York: Cambridge University Press, 107–130.
Holt, E (2016) Laughter at last: Playfulness and laughter in interaction. Journal of Pragmatics 100, 89–102. https://doi.org/10.1016/j.pragma.2016.04.012.
Huizinga, J (1964) Homo Ludens: A Study of the Play Element in Culture, vol. 4. Boston, MA: Beacon Press.
Hung, L, Gregorio, M, Mann, J, Wallsworth, C, Horne, N, Berndt, A, Liu, C, Woldum, E, Au-Yeung, A and Chaudhury, H (2021) Exploring the perceptions of people with dementia about the social robot Paro in a hospital setting. Dementia 20, 485–504. https://doi.org/10.1177/1471301219894141.
Hunter, PV, Hadjistavropoulos, T and Kaasalainen, S (2016) A qualitative study of nursing assistants’ awareness of person-centred approaches to dementia care. Ageing & Society 36, 1211–1237. https://doi.org/10.1017/S0144686X15000276.
Iversen, C, Flinkfeldt, M, Tuncer, S and Laurier, E (2022) The uses of small talk in social work: Weather as a resource for informally pursuing institutional tasks. Qualitative Social Work 21, 1043–1062. https://doi.org/10.1177/14733250221124218.
Jenkins, N (2017) No substitute for human touch? Towards a critically posthumanist approach to dementia care. Ageing & Society 37, 1484–1498. https://doi.org/10.1017/S0144686X16000453.
Johansson-Pajala, RM, Thommes, K, Hoppe, JA, Tuisku, O, Hennala, L, Pekkarinen, S, Melkas, H and Gustafsson, C (2020) Care robot orientation: What, who and how? Potential users’ perceptions. International Journal of Social Robotics 12, 1103–1117. https://doi.org/10.1007/s12369-020-00619-y.
Jøranson, N, Pedersen, I, Rokstad, AM, Aamodt, G, Olsen, C and Ihlebæk, C (2016) Group activity with Paro in nursing homes: Systematic investigation of behaviors in participants. International Psychogeriatrics 28, 1345–1354. https://doi.org/10.1017/S1041610216000120.
Jung, M, van der Leij, L and Kelders, S (2017) An exploration of the benefits of an animal-like robot companion with more advanced touch interaction capabilities for dementia care. Frontiers in ICT 4. https://doi.org/10.3389/fict.2017.00016.
Killick, J (2013) Playfulness in Dementia. London: Jessica Kingsley.
Kontos, P (2004) Ethnographic reflections on selfhood, embodiment and Alzheimer’s disease. Ageing & Society 24, 829–849. https://doi.org/10.1017/S0144686X04002375.
Kontos, P, Miller, KL, Mitchell, GJ and Stirling-Twist, J (2017) Presence redefined: The reciprocal nature of engagement between elder-clowns and persons with dementia. Dementia 16, 46–66. https://doi.org/10.1177/1471301215580895.
Krummheuer, AL (2016) Who am I? What are you? Identity construction in encounters between a teleoperated robot and people with acquired brain injury. Social Robotics 9979, 880–889. https://doi.org/10.1007/978-3-319-47437-3_86.
Lindholm, C (2015) Parallel realities: The interactional management of confabulation in dementia care encounters. Research on Language and Social Interaction 48, 176–199. https://doi.org/10.1080/08351813.2015.1025502.
Lipp, B (2024) Robot drama: Investigating frictions between vision and demonstration in care robotics. Science, Technology, and Human Values 49, 318–343. https://doi.org/10.1177/016224392211201.
Majlesi, AR and Ekström, A (2016) Baking together: The coordination of actions in activities involving people with dementia. Journal of Aging Studies 38, 37–46. https://doi.org/10.1016/j.jaging.2016.04.004.
Malinowski, B (1936) Culture as a determinant of behavior. Scientific Monthly 43, 440–449.
Mok, Z and Müller, N (2014) Staging casual conversations for people with dementia. Dementia 13, 834–853. https://doi.org/10.1177/1471301213488609.
Mondada, L (2018) Multiple temporalities of language and body in interaction: Challenges for transcribing multimodality. Research on Language and Social Interaction 51, 85–106. https://doi.org/10.1080/08351813.2018.1413878.
Moyle, W, Bramble, M, Jones, C and Murfield, J (2018) Care staff perceptions of a social robot called Paro and a look-alike plush toy: A descriptive qualitative approach. Aging and Mental Health 22, 330–335. https://doi.org/10.1080/13607863.2016.1262820.
Nordenfelt, L (2004) The varieties of dignity. Health Care Analysis: Journal of Health Philosophy and Policy 12, 69–89. https://doi.org/10.1023/B:HCAN.0000041183.78435.4b.
Örulv, L and Hydén, L-C (2006) Confabulation: Sense-making, self-making and world-making in dementia. Discourse Studies 8, 647–673. https://doi.org/10.1177/1461445606067333.
Parks, J (2010) Lifting the burden of women’s care work: Should robots replace the ‘human touch’? Hypatia 25, 100–120. http://doi.org/10.1111/j.1527-2001.2009.01086.x.
Parry, R, Land, V and Seymour, J (2014) How to communicate with patients about future illness progression and end of life: A systematic review. BMJ Supportive & Palliative Care 4, 331–341. https://doi.org/10.1136/bmjspcare-2014-000649.
Peine, A and Neven, L (2021) The co-constitution of ageing and technology: A model and agenda. Ageing & Society 41, 2845–2866. https://doi.org/10.1017/S0144686X20000641.
Peräkylä, A and Vehviläinen, S (2003) Conversation analysis and the professional stocks of interactional knowledge. Discourse and Society 14, 727–750. https://doi.org/10.1177/09579265030146003.
Persson, M, Ferm, L, Redmalm, D and Iversen, C (2023) Working with robotic animals in dementia care: The significance of caregivers’ competences. Nordic Journal of Working Life Studies 13, 49–69. https://doi.org/10.18291/njwls.136521.
Persson, M, Iversen, C and Redmalm, D (2024) Making robots matter in dementia care: Conceptualising the triadic interaction between caregiver, resident and robot animal. Sociology of Health and Illness 46, 1192–1211. https://doi.org/10.1111/1467-9566.13786.
Persson, M, Redmalm, D and Iversen, C (2021) Caregivers’ use of robots and their effect on work environment: A scoping review. Journal of Technology in Human Services 40, 251–277. https://doi.org/10.1080/15228835.2021.2000554.
Persson, M, Thunman, E, Iversen, C and Redmalm, D (2024) Robotic misinformation in dementia care: Emotions as sense-making resources in residents’ encounters with robot animals. Frontiers in Sociology 9. https://doi.org/10.3389/fsoc.2024.1354978.
Pfadenhauer, M and Dukat, C (2015) Robot caregiver or robot-supported caregiving? International Journal of Social Robotics 7, 393–406. https://doi.org/10.1007/s12369-015-0284-0.
Quinn, J and Blandon, C (2020) Lifelong Learning and Dementia: A Posthumanist Perspective. Cham: Springer Nature.
Roger, K, Guse, L, Mordoch, E and Osterreicher, A (2012) Social commitment robots and dementia. Canadian Journal on Aging 31, 87–94. http://doi.org/10.1017/s0714980811000663.
Schegloff, EA (1996) Some practices for referring to persons in talk-in-interaction: A partial sketch of a systematics. In Fox, BA (ed), Studies in Anaphora. Amsterdam: John Benjamins, 437–486.
Schneider, J, Hazel, S, Morgner, C and Dening, T (2019) Facilitation of positive social interaction through visual art in dementia: A case study using video-analysis. Ageing & Society 39, 1731–1751. https://doi.org/10.1017/S0144686X1800020X.
Sharkey, A (2014) Robots and human dignity: A consideration of the effects of robot care on the dignity of older people. Ethics and Information Technology 16, 63–75. https://doi.org/10.1007/s10676-014-9338-5.
Sidnell, J and Stivers, T (eds) (2013) The Handbook of Conversation Analysis. Chichester: Wiley Blackwell.
Siitonen, P, Rauniomaa, M and Keisanen, T (2021) Language and the moving body: Directive actions with the Finnish kato ‘look’ in nature-related activities. Frontiers in Psychology 12. https://doi.org/10.3389/fpsyg.2021.661784.
Statens medicin-etiska råd [SMER (Swedish Medical Ethics Council)] (2014) Robotar och övervakning i vården av äldre – etiska aspekter [Robots and Surveillance in the Care of the Elderly – Ethical Aspects]. Stockholm: SMER.
Takayanagi, K, Kirita, T and Shibata, T (2014) Comparison of verbal and emotional responses of elderly people with mild/moderate dementia and those with severe dementia in responses to seal robot, Paro. Frontiers in Aging Neuroscience 6. https://doi.org/10.3389/fnagi.2014.00257.
Tranvåg, O, Petersen, KA and Nåden, D (2013) Dignity-preserving dementia care: A metasynthesis. Nursing Ethics 20, 861–880. https://doi.org/10.1177/0969733013485110.
Tuncer, S, Licoppe, C, Luff, P and Heath, C (2023) Recipient design in human–robot interaction: The emergent assessment of a robot’s competence. AI and Society 39, 1795–1810. https://doi.org/10.1007/s00146-022-01608-7.
United Nations (2006) Convention on the Rights of Persons with Disabilities. New York: United Nations.
Wilkinson, S and Kitzinger, C (2006) Surprise as an interactional achievement: Reaction tokens in conversation. Social Psychology Quarterly 69, 150–182. https://doi.org/10.1177/019027250606900203.
Wright, J (2018) Tactile care, mechanical hugs: Japanese caregivers and robotic lifting devices. Asian Anthropology 17, 24–39. https://doi.org/10.1080/1683478X.2017.1406576.
Yasui, E (2023) Sequence-initial pointing: Spotlighting what just happened as a cause of a new sequence. Discourse Studies 25, 409–429. https://doi.org/10.1177/14614456221132464.
Złotowski, J, Khalil, A and Abdallah, S (2020) One robot doesn’t fit all: Aligning social robot appearance and job suitability from a Middle Eastern perspective. AI and Society 35, 485–500. https://doi.org/10.1007/s00146-019-00895-x.
