
Efficiency, information theory, and neural representations

Published online by Cambridge University Press: 30 August 2019

Joseph T. Devlin
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England. jtd21@cam.ac.uk; csl.psychol.cam.ac.uk/~jdevlin
Matt H. Davis
Affiliation:
Medical Research Council Cognition and Brain Sciences Unit, Cambridge, CB2 2EF, England. matt.davis@mrc-cbu.cam.ac.uk
Stuart A. McLelland
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England. sam26@cam.ac.uk
Richard P. Russell
Affiliation:
Centre for Speech and Language, Department of Experimental Psychology, University of Cambridge, Cambridge, CB2 3EB, England. rpr23@cam.ac.uk

Abstract

We contend that if efficiency and reliability are important factors in neural information processing, then distributed, not localist, representations are “evolution's best bet.” We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing adaptive advantage to an organism.
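As a minimal illustration of the efficiency argument (not drawn from the article itself), the following Python sketch compares how many binary units a localist code and a distributed code need in order to distinguish the same number of items; the unit count serves here only as a crude proxy for metabolic cost. The function names are hypothetical and chosen for clarity.

```python
import math

# Illustrative sketch (assumption, not the authors' analysis):
# a localist code dedicates one unit per item, whereas a distributed
# binary code uses activity patterns across units.

def localist_units_needed(n_items: int) -> int:
    """A localist code needs one unit per item, so n_items units."""
    return n_items

def distributed_units_needed(n_items: int) -> int:
    """A distributed binary code can distinguish 2**n patterns with n
    units, so it needs only ceil(log2(n_items)) units."""
    return math.ceil(math.log2(n_items))

if __name__ == "__main__":
    n_items = 1_000_000
    print(f"Localist units for {n_items:,} items:    {localist_units_needed(n_items):,}")
    print(f"Distributed units for {n_items:,} items: {distributed_units_needed(n_items)}")
    # Localist: 1,000,000 units; distributed: 20 units.
```

The exponential gap in unit counts is the information-theoretic sense in which distributed codes are more efficient, on the simplifying assumption that each active unit carries a comparable metabolic cost.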

Type
Brief Report
Copyright
2000 Cambridge University Press
