A comparative survey of automated parameter-search methods for compartmental neural models

Michael C. Vanier, James M. Bower

Research output: Contribution to journal › Article › peer-review

126 Scopus citations

Abstract

One of the most difficult and time-consuming aspects of building compartmental models of single neurons is assigning values to free parameters to make models match experimental data. Automated parameter-search methods potentially represent a more rapid and less labor-intensive alternative to choosing parameters manually. Here we compare the performance of four different parameter-search methods on several single-neuron models. The methods compared are conjugate-gradient descent, genetic algorithms, simulated annealing, and stochastic search. Each method has been tested on five different neuronal models ranging from simple models with between 3 and 15 parameters to a realistic pyramidal cell model with 23 parameters. The results demonstrate that genetic algorithms and simulated annealing are generally the most effective methods. Simulated annealing was overwhelmingly the most effective method for simple models with small numbers of parameters, but the genetic algorithm method was equally effective for more complex models with larger numbers of parameters. The discussion considers possible explanations for these results and makes several specific recommendations for the use of parameter searches on neuronal models.
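To make the class of method concrete, the sketch below shows a minimal simulated-annealing parameter search of the kind the abstract describes: a cost function measures the mismatch between simulated and target traces, and an annealing loop perturbs the free parameters. The "model" here is a toy stand-in with two parameters, and the cost function, cooling schedule, and parameter names are illustrative assumptions, not the paper's actual compartmental models or fitness measures.

```python
# Minimal sketch of a simulated-annealing parameter search (illustrative only;
# the paper's models, cost functions, and annealing schedule may differ).
import math
import random


def toy_model(params, t):
    """Hypothetical stand-in for a neuron simulation: a damped oscillation
    with two free parameters (amplitude, decay rate)."""
    amplitude, decay = params
    return amplitude * math.exp(-decay * t) * math.cos(2.0 * math.pi * t)


def cost(params, target, times):
    """Sum-of-squares mismatch between model output and target data."""
    return sum((toy_model(params, t) - y) ** 2 for t, y in zip(times, target))


def simulated_annealing(target, times, initial, n_steps=5000,
                        t_start=1.0, t_end=1e-3, step_size=0.1):
    """Basic Metropolis-style annealing over the free parameters."""
    current = list(initial)
    current_cost = cost(current, target, times)
    best, best_cost = list(current), current_cost
    for i in range(n_steps):
        # Exponential cooling schedule (one common choice).
        temp = t_start * (t_end / t_start) ** (i / n_steps)
        # Propose a random perturbation of one parameter.
        candidate = list(current)
        j = random.randrange(len(candidate))
        candidate[j] += random.gauss(0.0, step_size)
        candidate_cost = cost(candidate, target, times)
        delta = candidate_cost - current_cost
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = list(current), current_cost
    return best, best_cost


if __name__ == "__main__":
    times = [0.01 * k for k in range(200)]
    target = [toy_model((1.5, 0.8), t) for t in times]  # synthetic "data"
    fit, err = simulated_annealing(target, times, initial=[0.5, 0.1])
    print("fitted parameters:", fit, "residual:", err)
```

In a realistic setting the toy model would be replaced by a full compartmental simulation and the cost function by a comparison of simulated and recorded voltage traces, which is what makes each cost evaluation expensive and the choice of search method important.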

Original language: English (US)
Pages (from-to): 149-171
Number of pages: 23
Journal: Journal of Computational Neuroscience
Volume: 7
Issue number: 2
DOIs
State: Published - Sep 24 1999

Keywords

  • Compartmental model
  • Genetic algorithm
  • Parameter search
  • Simulated annealing

ASJC Scopus subject areas

  • Sensory Systems
  • Cognitive Neuroscience
  • Cellular and Molecular Neuroscience

