Research Article
Learning in a Distributed Software Architecture for Large-Scale Neural Modeling
@INPROCEEDINGS{10.1007/978-3-642-32615-8_65,
  author={Jasmin L\'{e}veill\'{e} and Heather Ames and Benjamin Chandler and Anatoli Gorchetchnikov and Ennio Mingolla and Sean Patrick and Massimiliano Versace},
  title={Learning in a Distributed Software Architecture for Large-Scale Neural Modeling},
  proceedings={Bio-Inspired Models of Network, Information, and Computing Systems. 5th International ICST Conference, BIONETICS 2010, Boston, USA, December 1-3, 2010, Revised Selected Papers},
  proceedings_a={BIONETICS},
  year={2012},
  month={10},
  keywords={Large-scale system; learning laws; neural networks; neural network software; heterogeneous computing},
  doi={10.1007/978-3-642-32615-8_65}
}
Jasmin Léveillé
Heather Ames
Benjamin Chandler
Anatoli Gorchetchnikov
Ennio Mingolla
Sean Patrick
Massimiliano Versace
Year: 2012
BIONETICS
Springer
DOI: 10.1007/978-3-642-32615-8_65
Abstract
Progress on large-scale simulation of neural models depends in part on the availability of suitable hardware and software architectures. Heterogeneous hardware computing platforms are becoming increasingly popular as substrates for general-purpose simulation. However, recent work highlights that certain constraints must be imposed on neural and synaptic dynamics in order to take advantage of such systems. In this paper we focus on the constraints related to learning in a simple visual system, and on those imposed by Cog, a new neural simulator for heterogeneous hardware systems.