Why cardinalities are the “natural” natural numbers
Published online by Cambridge University Press: 11 December 2008
Abstract
According to Rips et al., numerical cognition develops out of two independent sets of cognitive primitives – one that supports enumeration, and one that supports arithmetic and the concepts of natural numbers. I argue against this proposal because it incorrectly predicts that natural number concepts could develop without prior knowledge of enumeration.
Type: Open Peer Commentary
Copyright © Cambridge University Press 2008