
Using the sender–receiver framework to understand the evolution of languages-of-thought

Published online by Cambridge University Press:  28 September 2023

Ronald J. Planer*
Affiliation:
School of Liberal Arts, Faculty of the Arts, Social Sciences and Humanities, University of Wollongong, Wollongong, NSW, Australia rplaner@uow.edu.au https://scholars.uow.edu.au/display/ronald_planer

Abstract

This commentary seeks to supplement the case Quilty-Dunn et al. make for the psychological reality of languages-of-thought (LoTs) in two ways. First, it focuses on the reduced physical demands which LoT architectures often make compared to alternative architectures. Second, it embeds LoT research within a broader framework that can be leveraged to understand the evolution of LoTs.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

Quilty-Dunn et al. adduce evidence for the psychological reality of languages-of-thought (LoTs) from a wide range of empirical domains. Their case inherits support from each domain, while depending on none. This is a powerful way to make such a case. Their article is, moreover, timely. It is a most welcome antidote to the steady rise in antirepresentationalist sentiment in many philosophy of cognitive science circles in recent years. Overarching theories of cognition that eschew any role for computational procedures applied to structured symbols are not serious contenders unless and until they adequately account for detailed empirical information of the sort discussed by Quilty-Dunn et al.

So, my impression of their article is strongly positive. Here, my aim is to supplement their case in two ways: first, by drawing attention to a distinct empirical rationale for LoTs; and second, by situating LoT research within a broader framework that promises to shed light on the evolution of LoTs.

LoT-based architectures often place much smaller demands on physical resources than alternative architectures do. Symbols are constructed in a combinatorial fashion, and their sequence properties play a role in individuating them; this allows for efficient representation. Additionally, the meaning of a (complex) symbol is a function of that symbol's parts, together with their mode of composition. Symbols, to some extent, analytically deconstruct their referents. Such symbols allow for the use of compact computational procedures (as opposed to, say, lookup tables). Together, these principles can reduce the demand on physical resources (e.g., neurons) by orders of magnitude.
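As a rough, hypothetical illustration of this economy (the figures below are my own, purely illustrative, and not drawn from the commentary), consider how many primitive symbols each scheme needs to distinguish a fixed number of states:

```python
# Hypothetical back-of-the-envelope comparison (numbers are illustrative):
# how many primitive symbols does it take to represent 10,000 distinct
# cache records?
n_states = 10_000

# Atomic scheme: one dedicated, unstructured symbol per state.
atomic_primitives = n_states

# Combinatorial scheme: fixed-length sequences over a small alphabet,
# where the order of the parts helps individuate the symbol.
alphabet_size = 10
sequence_length = 1
capacity = alphabet_size
while capacity < n_states:
    capacity *= alphabet_size
    sequence_length += 1

print(atomic_primitives)               # 10000 primitive symbols
print(alphabet_size, sequence_length)  # 10 primitives, in sequences of length 4
```

The gap between the two schemes grows exponentially: doubling the sequence length squares the number of representable states while the stock of primitives stays fixed.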

These points have been most forcefully argued by Gallistel and colleagues (Gallistel, 1990, 2008; Gallistel & King, 2011), often with examples drawn from animal cognition. A good case is the caching behavior of western scrub jays. These birds are estimated to encode the location of thousands of caches (Clayton & Krebs, 1995). Moreover, for each location, they encode what was cached, when it was cached, and whether they were watched while caching it (their caches are often pilfered) (Clayton, Yu, & Dickinson, 2001). Additionally, they make flexible use of this information (e.g., to retrieve cached items in an efficient way) (Clayton et al., 2001). Arguably, scrub jays could not physically realize the requisite symbols and computations except by instantiating an LoT. And even if they could, an LoT architecture might still have been selectively favored for its increased economy. Brain tissue is expensive, after all.

But how might such symbol systems evolve in the first place? Progress on this question can be made by using the "sender–receiver framework." This framework is inspired by the signaling games first presented by David Lewis (1969). At their simplest, a signaling game features a sender who can observe the variable state of the world and send a signal (but cannot act), and a receiver who can observe the signal (but not the world) and act. Acts have consequences for both sender and receiver, and both have preferences regarding which act should be done when. Lewis showed that, given certain conditions (e.g., rationality, common interest, common knowledge), informative signaling can arise and stabilize. Decades later, these games were revisited by Skyrms, who showed how Lewis's constraints could be significantly relaxed (Skyrms, 1995, 2004, 2010). Indeed, Skyrms showed how even completely mindless agents can evolve informative signaling under many conditions.
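A minimal Skyrms-style simulation conveys the point. The sketch below is my own toy model, not one from the cited works: two agents with no rationality or common knowledge, just urn-style reinforcement learning (Roth–Erev), settle into informative signaling in a two-state Lewis game.

```python
import random

random.seed(1)

N_STATES = N_SIGNALS = N_ACTS = 2

# Urn weights: every option starts equally weighted, so the agents begin
# with no "knowledge" at all ("mindless" in Skyrms's sense).
sender = [[1.0] * N_SIGNALS for _ in range(N_STATES)]   # state  -> signal
receiver = [[1.0] * N_ACTS for _ in range(N_SIGNALS)]   # signal -> act

def draw(weights):
    """Choose an option with probability proportional to its weight."""
    return random.choices(range(len(weights)), weights=weights)[0]

# Learning phase: common interest means both players are reinforced
# whenever the receiver's act matches the sender's observed state.
for _ in range(20_000):
    state = random.randrange(N_STATES)
    signal = draw(sender[state])
    act = draw(receiver[signal])
    if act == state:
        sender[state][signal] += 1.0
        receiver[signal][act] += 1.0

# Evaluation phase: no further learning, just measure coordination.
hits = 0
trials = 2_000
for _ in range(trials):
    state = random.randrange(N_STATES)
    if draw(receiver[draw(sender[state])]) == state:
        hits += 1
success_rate = hits / trials
print(success_rate)  # typically far above the 0.5 chance baseline
```

In the two-state game this dynamic is known to converge on a signaling system; with more states or unequal state probabilities, partial pooling equilibria become possible, which is part of what makes the framework interesting for modeling.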

Skyrms's generalization of the Lewis model allows us to apply that model within organisms, not just between them (Godfrey-Smith, 2014; Planer, 2019; Planer & Godfrey-Smith, 2021). Two cognitive mechanisms (or one and the same cognitive mechanism at different times) can serve as sender and receiver in a Lewis–Skyrms-style setup. And this allows us to see (with the aid of the theory and results that have grown up around signaling games in recent decades) how signaling systems, including rather complex ones, can arise and stabilize in brains over phylogenetic and ontogenetic timeframes. This includes systems that are plausibly conceived of as LoTs (Planer, 2019).

Using the sender–receiver framework, Planer and Godfrey-Smith (2021) present a taxonomy of signs displaying different forms of structure (Table 1). Unfortunately, there is not scope here to go through the details of this taxonomy. Suffice it to say that the taxonomy is structured by two tripartite distinctions among signs, namely, atomic-composite-combinatorial and nominal-organized-encoding, which are envisaged as plausible, incremental evolutionary pathways. On this taxonomy, an LoT is a sign system (used in cognition) that is simultaneously combinatorial and encoding. As a combinatorial sign system, it contains signs that are constructed out of other signs belonging to the system (and hence, there is sharing of parts across signs); moreover, the order of the parts of a sign matters to how the sign functions in communication and/or computation. And as an encoding sign system, there is a systematic principle (or set of such principles) that assigns meaning to complex signs based not only on the identity of their parts, but also on where those parts occur in the sign (and so, particular locations within a complex sign have meaning). Combinatoriality is what allows for maximally efficient representation, and encoding principles are what allow for the use of compact, efficient algorithms. These properties are very close to those Quilty-Dunn et al. call "discrete constituency" and "role-filler independence" (while "predicate–argument structure" [target article, sect. 2, para. 9] can be understood as a special case of encoding).
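To make the combinatorial/encoding pair concrete, here is a toy sign system of my own construction (the vocabulary is invented for illustration, not taken from the taxonomy itself). Signs share parts (combinatoriality), and a part's meaning-contribution depends on where it occurs (encoding):

```python
# Toy sign system (hypothetical illustration). Signs are two-character
# strings. Combinatorial: complex signs are built from shared parts.
# Encoding: a part's contribution depends on WHERE it occurs --
# slot 0 is read as a predicate, slot 1 as an argument.
PREDICATE = {"E": "edible", "P": "perishable"}
ARGUMENT = {"w": "worm", "n": "nut"}

def interpret(sign: str):
    """Assign meaning as a function of the parts plus their positions."""
    return (PREDICATE[sign[0]], ARGUMENT[sign[1]])

print(interpret("Ew"))  # ('edible', 'worm')
print(interpret("Pw"))  # same argument part, different predicate slot
```

Because slots are interpreted independently of their fillers, any predicate part can combine with any argument part without new interpretive machinery, which is the connection to Quilty-Dunn et al.'s "role-filler independence."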

Table 1. Taxonomy of signs based on their formal and semantic structure

Adapted from Planer and Godfrey-Smith (2021).

A final methodological point. The sender–receiver framework is closely associated with a family of formal signaling models. And although the orientation to sign use that the framework fosters is not inherently formal (Planer & Godfrey-Smith, 2021), these models are very useful. For they make testing ideas about the emergence of various forms of structure tractable. Research on the evolution of LoTs can no doubt benefit from these formal tools. Most obviously, signaling models might be used to investigate whether and under what conditions Quilty-Dunn et al.'s six core properties indeed cluster (or form subclusters). Additionally, such models might be used to test the idea that LoTs evolve at interfaces between other systems, as interface systems can be naturally modeled as intermediaries in so-called signaling chains.
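The chain idea can be sketched in the same reinforcement-learning style (again, a toy model of my own, with an assumed payoff structure): an intermediary observes only the sender's signal and emits a second signal, and all three parties are reinforced on end-to-end success.

```python
import random

random.seed(2)

N = 2  # number of states, of signals at each link, and of acts

# Three urns-per-agent setups, all starting uninformed.
sender = [[1.0] * N for _ in range(N)]        # state    -> signal 1
intermediary = [[1.0] * N for _ in range(N)]  # signal 1 -> signal 2
receiver = [[1.0] * N for _ in range(N)]      # signal 2 -> act

def draw(weights):
    return random.choices(range(len(weights)), weights=weights)[0]

# Learning: the intermediary never sees the world, only the first signal,
# yet the whole chain is reinforced when the final act matches the state.
for _ in range(50_000):
    state = random.randrange(N)
    s1 = draw(sender[state])
    s2 = draw(intermediary[s1])
    act = draw(receiver[s2])
    if act == state:
        sender[state][s1] += 1.0
        intermediary[s1][s2] += 1.0
        receiver[s2][act] += 1.0

# Evaluation without learning.
hits = 0
trials = 2_000
for _ in range(trials):
    state = random.randrange(N)
    if draw(receiver[draw(intermediary[draw(sender[state])])]) == state:
        hits += 1
chain_success = hits / trials
print(chain_success)
```

A natural experiment in this setup is to vary what structure the intermediary must preserve or recode, which is one way the interface hypothesis about LoTs could be probed.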

Financial support

This research received no specific grant from any funding agency, commercial, or not-for-profit sectors.

Competing interest

None.

References

Clayton, N. S., & Krebs, J. R. (1995). Memory in food-storing birds: From behaviour to brain. Current Opinion in Neurobiology, 5(2), 149–154.
Clayton, N. S., Yu, K. S., & Dickinson, A. (2001). Scrub jays (Aphelocoma coerulescens) form integrated memories of the multiple features of caching episodes. Journal of Experimental Psychology: Animal Behavior Processes, 27(1), 17–29.
Gallistel, C. R. (1990). The organization of learning. MIT Press.
Gallistel, C. R. (2008). Learning and representation. Learning and Memory: A Comprehensive Reference, 1, 227–242.
Gallistel, C. R., & King, A. P. (2011). Memory and the computational brain: Why cognitive science will transform neuroscience. John Wiley.
Godfrey-Smith, P. (2014). Sender–receiver systems within and between organisms. Philosophy of Science, 81(5), 866–878.
Lewis, D. (1969). Convention: A philosophical study. Harvard University Press.
Planer, R. J. (2019). The evolution of languages of thought. Biology & Philosophy, 34, 127.
Planer, R. J., & Godfrey-Smith, P. (2021). Communication and representation understood as sender–receiver coordination. Mind & Language, 36(5), 750–770.
Skyrms, B. (1995). Evolution of the social contract. Cambridge University Press.
Skyrms, B. (2004). The stag hunt and the evolution of social structure. Cambridge University Press.
Skyrms, B. (2010). Signals: Evolution, learning, and information. Oxford University Press.