
Social robots as social learning partners: Exploring children's early understanding and learning from social robots

Published online by Cambridge University Press:  05 April 2023

Amanda Haber
Affiliation:
Wheelock College of Education and Human Development, Boston University, Boston, MA 02215, USA haber317@bu.edu; kcorriv@bu.edu
Kathleen H. Corriveau
Affiliation:
Wheelock College of Education and Human Development, Boston University, Boston, MA 02215, USA haber317@bu.edu; kcorriv@bu.edu

Abstract

Clark and Fischer propose that people interpret social robots not as social agents, but as interactive depictions. Drawing on research focusing on how children selectively learn from social others, we argue that children do not view social robots as interactive toys but instead treat them as social learning partners and critical sources of information.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

Clark and Fischer (C&F) offer a new approach for how people construe social robots, arguing that people interpret social robots not as social agents, but as interactive depictions. We agree with the authors that, as with voice assistants, people expect to interact with social robots. However, in contrast to C&F, we argue that children do not construe social robots as interactive toys but instead treat them as social learning partners. This distinction is important, as children's environments are increasingly filled with robots: According to Allied Business Intelligence Inc., by 2024 over 79 million homes will have at least one robot. Thus, an examination of research focusing on how children selectively learn from social others is critical to exploring children's early understanding of and interaction with social robots.

Young children can learn about the world around them through their own first-hand observations, exploration, and experimentation, and by actively seeking information from social learning partners, including testimony from adults (e.g., caregivers, teachers, peers; Harris, Koenig, Corriveau, & Jaswal, 2018; Wang, Tong, & Danovitch, 2019) as well as from nonhuman agents such as voice assistants (Siri, Alexa; Aeschlimann, Bleiker, Wechner, & Gampe, 2020; Girouard-Hallam & Danovitch, 2022; Girouard-Hallam, Streble, & Danovitch, 2021; Oranç & Ruggeri, 2021), computers (e.g., Danovitch & Alzahabi, 2013; Noles, Danovitch, & Shafto, 2015), or social robots (Breazeal et al., 2016; Oranç & Ruggeri, 2021). Indeed, prior work demonstrates that toddlers (aged 18–24 months; Movellan, Eckhardt, Virnes, & Rodriguez, 2009) and children (aged 3–6; Tanaka & Matsuzoe, 2012) are able to learn new words from social robots, suggesting that from an early age, children treat such agents as learning partners and critical sources of information.

To date, an extensive body of literature examining children's trust in testimony from others indicates that preschoolers are surprisingly selective when deciding whom to learn from (Harris, 2012; see Harris et al., 2018 for review; Mills, 2013). Preschoolers attend to an informant's epistemic characteristics, such as prior accuracy or expertise (e.g., Birch, Vauthier, & Bloom, 2008; Harris & Corriveau, 2011; Sobel & Kushnir, 2013), as well as social characteristics, including familiarity, eye contact, confidence (or uncertainty), social group, and contingent interactions (e.g., Brink & Wellman, 2020; Corriveau & Harris, 2009; Corriveau, Kinzler, & Harris, 2013; Koenig, Clement, & Harris, 2004).

Importantly, contrary to what C&F propose, children appear to employ similar strategies when determining the credibility of social robots as when they make inferences about humans (e.g., Danovitch & Alzahabi, 2013; Oranç & Ruggeri, 2021). Moreover, as they do with humans, children prefer to learn from an accurate over an inaccurate computer (Danovitch & Alzahabi, 2013) and from an accurate over an inaccurate social robot (aged 3; Brink & Wellman, 2020; Geiskkovitch, Thiessen, Young, & Glenwright, 2019). Similarly, they prefer to ask for information from a robot that engaged in greater contingent behavior (aged 3–5; Breazeal et al., 2016) or that used a more interactive teaching style (aged 4–6; Okita, Ng-Thow-Hing, & Sarvadevabhatla, 2009). Additionally, young children (aged 3–6) even attribute expertise to social robots in certain domains: Children were more likely to direct questions to a robot (vs. an adult informant) when the topic concerned machines, but more likely to ask humans psychological or physics-related questions. Taken together, these data support the idea that children engage with social robots in much the same way as they do with other social informants – and importantly, not simply as interactive depictions.

Further, although children as young as 3 recognize that nonhuman agents are not alive (Jipson & Gelman, 2007), they treat them as they would other interlocutors. Indeed, children view computers and social robots as factual sources of information (Danovitch & Keil, 2008) and attribute mental capacities as well as moral and psychological characteristics to social robots (Breazeal et al., 2016 [children aged 3–5]; Kahn et al., 2012 [children aged 9–12]; Bernstein & Crowley, 2008 [children aged 4–7]) and to voice assistants (e.g., Girouard-Hallam et al., 2021 [children aged 6–10]). Moreover, such judgments about the capacities of nonhuman agents can also shape children's learning preferences. For example, 3–6-year-olds who attributed greater perceptual abilities to robots were more likely to choose to learn from a robot rather than a human informant (Oranç & Ruggeri, 2021). These data support the notion that children view such agents as true social learning partners, not simply as interactive toys akin to dolls.

In sum, the authors argue that children construe social robots as interactive toys. However, we argue that equating social robots with other toys children may use in pretend play does not account for the critical role that robots play in children's early learning. We urge C&F to consider this more sophisticated view of social robots and how it would affect their theoretical perspective. Such a view is increasingly important in today's society: Children now spend a great deal of time interacting with and learning from nonhuman agents, including social robots and voice assistants, highlighting the need to consider children's use of social robots as social learning partners across the lifespan.

Financial support

This work received no specific grant from any funding agency.

Competing interest

None.

References

Aeschlimann, S., Bleiker, M., Wechner, M., & Gampe, A. (2020). Communicative and social consequences of interactions with voice assistants. Computers in Human Behavior, 112, 106466. https://doi.org/10.1016/j.chb.2020.106466
Bernstein, D., & Crowley, K. (2008). Searching for signs of intelligent life: An investigation of young children's beliefs about robot intelligence. Journal of the Learning Sciences, 17(2), 225–247. https://doi.org/10.1080/10508400801986116
Birch, S. A., Vauthier, S. A., & Bloom, P. (2008). Three- and four-year-olds spontaneously use others' past performance to guide their learning. Cognition, 107(3), 1018–1034. https://doi.org/10.1016/j.cognition.2007.12.008
Breazeal, C., Harris, P. L., DeSteno, D., Westlund, J. M. K., Dickens, L., & Jeong, S. (2016). Young children treat robots as informants. Topics in Cognitive Science, 8(2), 481–491. https://doi.org/10.1111/tops.12192
Brink, K. A., & Wellman, H. M. (2020). Robot teachers for children? Young children trust robots depending on their perceived accuracy and agency. Developmental Psychology, 56(7), 1268–1277. https://doi.org/10.1037/dev0000884
Corriveau, K., & Harris, P. L. (2009). Choosing your informant: Weighing familiarity and recent accuracy. Developmental Science, 12(3), 426–437. https://doi.org/10.1111/j.1467-7687.2008.00792.x
Corriveau, K. H., Kinzler, K. D., & Harris, P. L. (2013). Accuracy trumps accent in children's endorsement of object labels. Developmental Psychology, 49(3), 470–479. https://doi.org/10.1037/a0030604
Danovitch, J. H., & Alzahabi, R. (2013). Children show selective trust in technological informants. Journal of Cognition and Development, 14(3), 499–513. https://doi.org/10.1080/15248372.2012.689391
Danovitch, J. H., & Keil, F. C. (2008). Young humeans: The role of emotions in children's evaluation of moral reasoning abilities. Developmental Science, 11(1), 33–39. https://doi.org/10.1111/j.1467-7687.2007.00657.x
Geiskkovitch, D. Y., Thiessen, R., Young, J. E., & Glenwright, M. R. (2019). What? That's not a chair!: How robot informational errors affect children's trust towards robots. In ACM/IEEE International Conference on Human-Robot Interaction (pp. 48–56).
Girouard-Hallam, L. N., & Danovitch, J. H. (2022). Children's trust in and learning from voice assistants. Developmental Psychology, 58(4), 646. https://doi.org/10.1037/dev0001318
Girouard-Hallam, L. N., Streble, H. M., & Danovitch, J. H. (2021). Children's mental, social, and moral attributions toward a familiar digital voice assistant. Human Behavior and Emerging Technologies, 5, hbe2.321. https://doi.org/10.1002/hbe2.321
Harris, P. L. (2012). Trusting what you're told: How children learn from others. Harvard University Press.
Harris, P. L., & Corriveau, K. H. (2011). Young children's selective trust in informants. Philosophical Transactions of the Royal Society Biological Sciences, 366(1567), 1179–1187. https://doi.org/10.1098/rstb.2010.0321
Harris, P. L., Koenig, M., Corriveau, K. H., & Jaswal, V. K. (2018). Cognitive foundations of learning from testimony. Annual Review of Psychology, 69, 251–273. https://doi.org/10.1146/annurev-psych-122216-011710
Jipson, J. L., & Gelman, S. A. (2007). Robots and rodents: Children's inferences about living and nonliving kinds. Child Development, 78(6), 1675–1688. https://doi.org/10.1111/j.1467-8624.2007.01095.x
Kahn, P. H., Kanda, T., Ishiguro, H., Gill, B. T., Ruckert, J. H., Shen, S., … Severson, R. L. (2012). Do people hold a humanoid robot morally accountable for the harm it causes? In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human–Robot Interaction (pp. 33–40). https://doi.org/10.1145/2157689.2157696
Koenig, M., Clement, F., & Harris, P. L. (2004). Trust in testimony: Children's use of true and false statements. Psychological Science, 15(10), 694–698. https://doi.org/10.1111/j.0956-7976.2004.00742.x
Mills, C. M. (2013). Knowing when to doubt: Developing a critical stance when learning from others. Developmental Psychology, 49(3), 404–418. https://doi.org/10.1037/a0029500
Movellan, J., Eckhardt, M., Virnes, M., & Rodriguez, A. (2009). Sociable robot improves toddler vocabulary skills. In Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction – HRI '09 (p. 307). https://doi.org/10.1145/1514095.1514189
Noles, N. S., Danovitch, J. H., & Shafto, P. (2015). Children's trust in technological and human informants. Proceedings of the Cognitive Science Society, 6, 1721–1726.
Okita, S. Y., Ng-Thow-Hing, V., & Sarvadevabhatla, R. (2009). Learning together: ASIMO developing an interactive learning partnership with children. In RO-MAN 2009 – The 18th IEEE International Symposium on Robot and Human Interactive Communication (pp. 1125–1130). IEEE. https://doi.org/10.1109/ROMAN.2009.5326135
Oranç, C., & Ruggeri, A. (2021). "Alexa, let me ask you something different": Children's adaptive information search with voice assistants. Human Behavior and Emerging Technologies, 3(4), 595–605. https://doi.org/10.1002/hbe2.270
Sobel, D. M., & Kushnir, T. (2013). Knowledge matters: How children evaluate the reliability of testimony as a process of rational inference. Psychological Review, 120(4), 779–797. https://doi.org/10.1037/a0034191
Tanaka, F., & Matsuzoe, S. (2012). Children teach a care-receiving robot to promote their learning: Field experiments in a classroom for vocabulary learning. Journal of Human–Robot Interaction, 1(1), 78–95. https://doi.org/10.5898/JHRI.1.1.Tanaka
Wang, F., Tong, Y., & Danovitch, J. (2019). Who do I believe? Children's epistemic trust in internet, teacher, and peer informants. Cognitive Development, 50, 248–260. https://doi.org/10.1016/j.cogdev.2019.05.006