Program at the California Institute of Technology
The Computation and Neural Systems (CNS) program was established at the California Institute of Technology in 1986 with the goal of training PhD students interested in exploring the relationship between the structure of neuron-like circuits/networks and the computations performed in such systems, whether natural or synthetic. The program was designed to foster the exchange of ideas and collaboration among engineers, neuroscientists, and theoreticians.
In the early 1980s, having laid out the foundations of VLSI,[1] Carver Mead became interested in exploring the similarities between computation in the brain and the kinds of computation that could be carried out in analog silicon electronic circuits. Mead joined with John Hopfield, who was studying the theoretical foundations of neural computation,[2] to pursue this line of inquiry. Mead and Hopfield's first joint course in this area was entitled “Physics of Computation”, with Hopfield teaching about his work on neural networks and Mead about his work on replicating neuronal structures in highly integrated electronic circuits.[3] Given the interest among both students and faculty, they decided to expand upon these themes the following year. Richard Feynman joined them, and three separate courses resulted: Hopfield's on neural networks, Mead's on neuromorphic analog circuits,[4] and Feynman's on the physics of computation.[3][5] At this point, Mead and Hopfield realized that a new field was emerging, with neuroscientists and the researchers building computational models and circuits all talking to one another.
In the fall of 1986, John Hopfield championed the formation of an interdisciplinary PhD program to create a scholarly community studying questions arising at the interface of neurobiology, electrical engineering, computer science, and physics. It was called Computation and Neural Systems (CNS). The unifying theme of the program was the relationship between the physical structure of a computational system (physical or biological hardware), the dynamics of its operation, and the computational problems that it can efficiently solve. The creation of this multidisciplinary program stemmed largely from progress on several previously unrelated fronts: the analysis of complex neural systems at both the single-cell and the network level[6] using a variety of techniques (in particular, patch-clamp recordings, intracellular and extracellular single- and multi-unit electrophysiology in the awake animal, and functional brain imaging techniques such as functional magnetic resonance imaging (fMRI)); the theoretical analysis of nervous structures (computational neuroscience); and the modeling of artificial neural networks for engineering purposes.[2] The program started out with a small number of existing faculty from the various divisions. Among the early founding faculty were Carver Mead, John Hopfield, David Van Essen, Geoffrey Fox, James Bower, Mark Konishi, John Allman, Ed Posner and Demetri Psaltis. In that year, the first external professor, Christof Koch, was hired.
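The flavor of this unifying theme can be illustrated with a Hopfield network: a recurrent network of threshold units with symmetric connections whose update dynamics descend an energy function, so that an associative-memory computation (recovering a stored pattern from a corrupted cue) emerges from the network's structure. The sketch below is a minimal Python illustration of the idea; it is not drawn from CNS course material, and the specific patterns and parameters are arbitrary.

import numpy as np

# Minimal Hopfield-network sketch (illustrative only): a recurrent network
# with symmetric weights whose asynchronous updates can only lower an
# energy function, so pattern retrieval follows from the network's structure.

def train(patterns):
    """Hebbian outer-product rule; patterns are +/-1 vectors."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)           # no self-connections
    return w / patterns.shape[0]

def recall(w, state, steps=20):
    """Asynchronous threshold updates; each flip leaves the energy no higher."""
    state = state.copy()
    rng = np.random.default_rng(0)
    for _ in range(steps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

def energy(w, state):
    return -0.5 * state @ w @ state

# Store one pattern and recover it from a corrupted cue.
pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
w = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                       # flip two bits
print(recall(w, noisy))               # converges back to the stored pattern
print(energy(w, pattern) <= energy(w, noisy))   # stored pattern sits lower in energy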
Since 1990, about 110 graduate students have been awarded a PhD in CNS and 14 an MS in CNS. About two-thirds of CNS graduates have pursued an academic career, with the remainder founding or joining start-up companies. Over this time, the average duration of a PhD has been 5.6 years.
During this time, the executive officers of the CNS Program were John Hopfield, Demetri Psaltis, Christof Koch, and Pietro Perona. The current executive officer is Thanos Siapas.[7]
CNS faculty founded and co-founded a number of conferences and workshops: