Clark and Fischer (C&F) offer a new account of how people construe social robots, arguing that people interpret social robots not as social agents but as interactive depictions. We agree with the authors that, as with voice assistants, people expect to interact with social robots. However, in contrast to C&F, we argue that children do not construe social robots as interactive toys but instead treat them as social learning partners. This distinction is important, as children's environments are increasingly filled with robots: According to Allied Business Intelligence Inc., by 2024 over 79 million homes will have at least one robot. Thus, an examination of research on how children selectively learn from social others is critical for exploring children's early understanding of and interaction with social robots.
Young children can learn about the world around them through their own first-hand observation, exploration, and experimentation, and by actively seeking information from social learning partners, including testimony from adults (e.g., caregivers, teachers, peers; Harris, Koenig, Corriveau, & Jaswal, 2018; Wang, Tong, & Danovitch, 2019) as well as from nonhuman agents such as voice assistants (e.g., Siri, Alexa; Aeschlimann, Bleiker, Wechner, & Gampe, 2020; Girouard-Hallam & Danovitch, 2022; Girouard-Hallam, Streble, & Danovitch, 2021; Oranç & Ruggeri, 2021), computers (e.g., Danovitch & Alzahabi, 2013; Noles, Danovitch, & Shafto, 2015), or social robots (Breazeal et al., 2016; Oranç & Ruggeri, 2021). Indeed, prior work demonstrates that toddlers (aged 18–24 months; Movellan, Eckhardt, Virnes, & Rodriguez, 2009) and children (aged 3–6; Tanaka & Matsuzoe, 2012) can learn new words from social robots, suggesting that from an early age children treat such agents as learning partners and critical sources of information.
To date, an extensive body of literature examining children's trust in testimony from others indicates that preschoolers are surprisingly selective when deciding whom to learn from (Harris, 2012; Mills, 2013; see Harris et al., 2018 for a review). Preschoolers attend to an informant's epistemic characteristics, such as prior accuracy or expertise (e.g., Birch, Vauthier, & Bloom, 2008; Harris & Corriveau, 2011; Sobel & Kushnir, 2013), as well as social characteristics, including familiarity, eye contact, confidence (or uncertainty), social group, and contingent interactions (e.g., Brink & Wellman, 2020; Corriveau & Harris, 2009; Corriveau, Kinzler, & Harris, 2013; Koenig, Clement, & Harris, 2004).
Importantly, and contrary to what C&F propose, children appear to employ similar strategies when determining the credibility of social robots as they do when making inferences about humans (e.g., Danovitch & Alzahabi, 2013; Oranç & Ruggeri, 2021). As with human informants, children prefer to learn from an accurate over an inaccurate computer (Danovitch & Alzahabi, 2013) and from an accurate over an inaccurate social robot (aged 3; Brink & Wellman, 2020; Geiskkovitch, Thiessen, Young, & Glenwright, 2019). Similarly, they prefer to seek information from a robot that engages in more contingent behavior (aged 3–5; Breazeal et al., 2016) or that uses a more interactive teaching style (aged 4–6; Okita, Ng-Thow-Hing, & Sarvadevabhatla, 2009). Additionally, young children (aged 3–6) even attribute domain-specific expertise to social robots: They are more likely to direct questions to a robot (vs. an adult informant) when the topic concerns machines, but more likely to ask humans psychological or physics-related questions. Taken together, these data support the idea that children engage with social robots in much the same way as they do with other social informants – and importantly, not simply as interactive depictions.
Further, although children as young as 3 recognize that nonhuman agents are not alive (Jipson & Gelman, 2007), they treat them as they would other interlocutors. Indeed, children view computers and social robots as factual sources of information (Danovitch & Keil, 2008) and attribute mental capacities as well as moral and psychological characteristics to social robots (Bernstein & Crowley, 2008 [aged 4–7]; Breazeal et al., 2016 [aged 3–5]; Kahn et al., 2012 [aged 9–12]) and to voice assistants (e.g., Girouard-Hallam et al., 2021 [aged 6–10]). Moreover, such judgments about the capacities of nonhuman agents can also shape children's learning preferences. For example, 3–6-year-olds who attributed greater perceptual abilities to robots were more likely to choose to learn from a robot rather than a human informant (Oranç & Ruggeri, 2021). These data support the notion that children view such agents as true social learning partners, not simply as interactive toys akin to dolls.
In sum, C&F argue that children construe social robots as interactive toys. However, we argue that equating social robots with the toys children use in pretend play does not account for the critical role that robots play in children's early learning. We urge C&F to consider this more sophisticated view of social robots and how it would affect their theoretical perspective. Such a view is increasingly important in today's society: Children now spend a great deal of time interacting with and learning from nonhuman agents, including social robots and voice assistants, highlighting the need to consider children's use of social robots as social learning partners across the lifespan.
Financial support
This work received no specific grant from any funding agency.
Competing interest
None.