A Python interface to NEST

10.2417/1200912.1703

Jochen Martin Eppler

With PyNEST, stimulus generation, simulation, and data analysis can be performed in a single programming language.

NEST1 is a simulator with a long history of use in computational neuroscience for the simulation of large networks of spiking neurons. It runs on many different architectures (ranging from ordinary desktop computers to computer clusters with thousands of processor cores2), is written in C++, and has a built-in simulation language interpreter (SLI) that helps the user set up the network. NEST's simulation language is stack-based and inspired by PostScript,3 which means that each function expects its arguments to be on the stack and returns its results back to it. SLI is a high-level language with functional operators like Map and data structures like associative arrays. However, learning SLI has turned out to be difficult for many users: a more convenient simulation language was required.

When we started thinking about a new scripting interface for NEST, Python was almost unknown in computational neuroscience. However, we noticed a strong trend towards it in the scientific community in general.4 Python has a number of advantages over commercial programming environments like Matlab5 or Mathematica:6 it is installed on almost all Linux- and macOS-based computers, is free, and is developed by an active community. Thanks to the multitude of packages for scientific computing (http://www.scipy.org), Python can be used for stimulus generation, data analysis, and plotting in the same way as its commercial alternatives. As a result, a number of other neuroscience laboratories are also using Python.7

The usual approach to creating Python interfaces for existing software is to write a wrapper library that exposes all data structures and functions of the application to Python.
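The stack discipline described above can be illustrated with a minimal sketch of a PostScript-style evaluator. This is not NEST code, just a toy interpreter showing how operators consume their arguments from the operand stack and push results back onto it:

```python
# Toy stack-based evaluator in the spirit of SLI/PostScript (illustration
# only, not part of NEST). Literals are pushed onto the operand stack;
# operators pop their arguments from it and push their result back.
def evaluate(tokens):
    stack = []
    ops = {
        "add": lambda a, b: a + b,
        "mul": lambda a, b: a * b,
    }
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()   # arguments come from the stack
            stack.append(ops[tok](a, b))      # the result goes back onto it
        else:
            stack.append(tok)                 # literals are simply pushed
    return stack
```

For example, the program `2 3 add 4 mul` evaluates to `(2 + 3) * 4`, leaving a single result on the stack.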
We decided to deviate from this approach and keep the existing SLI interface as an intermediate layer between the new user interface and the simulation engine. The reason for this is threefold: first, a lot of SLI code has already been written, and we did not want to render this code useless with newer versions of NEST. Second, Python is not yet available on some of the exotic hardware platforms that we still need to use. Third, NEST needs to remain independent of third-party software to guarantee long-term sustainability.

Figure 1. The NEST architecture. The lowest level is the simulation engine, which is used by the simulation language interpreter (SLI) and by the PyNEST low-level API. The PyNEST high-level API uses the low-level API to communicate with the simulation engine. The user's simulation code can use functions from PyNEST, from Python, and from its extension modules.

As a result, the PyNEST interface consists of two separate layers. The low-level API, written using the Python C API,8 is responsible for the data conversion from SLI to Python and back. It provides access to SLI with only three functions: sli_push() for pushing data onto the operand stack, sli_pop() for getting data from the SLI stack back to Python, and sli_run() for executing SLI commands. The high-level API uses these functions of the low-level API to provide Python versions of all important SLI commands. The functions of the high-level API are used by the user's simulation code. The complete architecture of PyNEST is shown in Figure 1.

SLI already provides all the commands necessary to build and simulate a neural network. Thus, we can create a complete Python interface to NEST by writing wrapper functions that simply call the respective functions in SLI. This technique is illustrated in the following listing, which defines a function that returns the list of available models:

    def Models():
        sli_run("modeldict")
        return sli_pop().keys()

In general, each function has three parts: first, we push the arguments onto the SLI stack with sli_push(); second, we execute one or more SLI commands inside of NEST using sli_run(); third, we retrieve the results from the stack via sli_pop(). Note that the example above does not contain a call to sli_push(), as no arguments are required. The low-level API catches all errors of NEST and raises an appropriate Python exception.

The function sli_push() has to convert a given Python object to the corresponding SLI data type. We first determine the type of the Python object and then instantiate a new SLI datum of the right type. sli_pop() converts an SLI datum to a Python object. This conversion has a more elegant implementation, because it can exploit the fact that each SLI datum knows its own type: an SLI datum can thus convert itself to a Python object of the right type. To avoid making SLI datums depend directly on Python, we use the acyclic visitor pattern,9 which moves all Python-dependent code into a separate class. The details of this technique are explained in Eppler et al. 2008.10

We have shown an alternative approach for creating Python bindings for an application: a generic interpreter-interpreter interaction instead of a direct wrapping of the underlying functions and data structures. The implementation of PyNEST is described in detail, together with examples and the complete API reference, in Eppler et al. 2008.10 NEST's source code is available under an open-source license for noncommercial use on the homepage of the NEST Initiative at http://nest-initiative.org.

Currently, we are investigating methods for writing neuron and synapse models in Python instead of C++, in order to ease development for users not familiar with the internal workings of NEST. In another project, we are improving the scalability of NEST for very large clusters with tens of thousands of processors, e.g., the IBM BlueGene architecture.
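To summarize the three-part wrapper mechanism described above, here is a small, self-contained sketch. Everything in it is a mock for illustration: the real sli_push(), sli_run(), and sli_pop() are implemented in C++ in the compiled low-level module, and the command names and the toy model dictionary below are stand-ins, not NEST's actual contents:

```python
# Illustrative mock of PyNEST's interpreter-interpreter coupling. The real
# low-level API is written in C++ on top of the Python C API; this Python
# stand-in only mimics its stack discipline.
_stack = []                                          # mock SLI operand stack
_modeldict = {"iaf_neuron": {}, "iaf_psc_alpha": {}}  # toy model dictionary

def sli_push(obj):
    """Step 1: push a Python argument onto the (mock) SLI operand stack."""
    _stack.append(obj)

def sli_run(cmd):
    """Step 2: execute a (mock) SLI command operating on the stack."""
    if cmd == "modeldict":                 # pushes the model dictionary
        _stack.append(_modeldict)
    elif cmd == "known":                   # pops a name, pushes a boolean
        _stack.append(_stack.pop() in _modeldict)

def sli_pop():
    """Step 3: retrieve a result from the (mock) SLI operand stack."""
    return _stack.pop()

def Models():
    """Wrapper without arguments: only steps 2 and 3 are needed."""
    sli_run("modeldict")
    return list(sli_pop().keys())

def ModelExists(name):
    """Wrapper with an argument, exercising all three steps."""
    sli_push(name)            # step 1: argument onto the stack
    sli_run("known")          # step 2: run the command
    return sli_pop()          # step 3: fetch the result
```

Each wrapper leaves the operand stack balanced: whatever a call pushes is consumed before the wrapper returns, which is what keeps the high-level API free of explicit stack bookkeeping.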
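The acyclic visitor idea behind the SLI-to-Python conversion can also be pictured in a short sketch. The real implementation is in C++; the class names below are invented for illustration, and Python's isinstance() check plays the role of the dynamic cross-cast used in the C++ pattern:

```python
# Sketch of the acyclic visitor pattern for datum-to-Python conversion.
# The datum classes know nothing about Python conversion; all Python-
# dependent code is concentrated in the converter class (names illustrative).
class DatumVisitor:
    """Degenerate visitor base: deliberately declares no visit methods."""

class IntegerVisitor:
    def visit_integer(self, datum): raise NotImplementedError

class DoubleVisitor:
    def visit_double(self, datum): raise NotImplementedError

class IntegerDatum:
    def __init__(self, value): self.value = value
    def accept(self, visitor):
        # Cross-cast: only visitors that opted in can handle integer datums.
        if isinstance(visitor, IntegerVisitor):
            return visitor.visit_integer(self)

class DoubleDatum:
    def __init__(self, value): self.value = value
    def accept(self, visitor):
        if isinstance(visitor, DoubleVisitor):
            return visitor.visit_double(self)

class PyObjectConverter(DatumVisitor, IntegerVisitor, DoubleVisitor):
    """The only place where Python-specific conversion code lives."""
    def visit_integer(self, datum): return int(datum.value)
    def visit_double(self, datum): return float(datum.value)
```

Because the datum classes depend only on the empty visitor base, new converters (or new datum types) can be added without touching the other side, which is exactly the decoupling the article attributes to the pattern.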

Author Information

Jochen Martin Eppler
Honda Research Institute Europe GmbH, Offenbach am Main, Germany
Bernstein Center for Computational Neuroscience, Freiburg im Breisgau, Germany

J. M. Eppler received his Diploma in Computer Science from the University of Freiburg in 2006. Currently, he is pursuing his PhD in a joint project of the Honda Research Institute Europe and the Bernstein Center for Computational Neuroscience in Freiburg.

References

1. M.-O. Gewaltig and M. Diesmann, NEST (Neural Simulation Tool), Scholarpedia 2 (4), p. 1430, 2007.
2. H. E. Plesser, J. M. Eppler, A. Morrison, M. Diesmann, and M.-O. Gewaltig, Efficient Parallel Simulation of Large-Scale Neuronal Networks on Clusters of Multiprocessor Computers, in A.-M. Kermarrec, L. Bougé, and T. Priol (eds.), Euro-Par 2007: Parallel Processing, pp. 672-681, Springer-Verlag, Berlin, 2007. doi:10.1007/978-3-540-74466-5_71
3. Adobe Systems Inc., The PostScript Language Reference Manual, 2nd ed., Addison-Wesley, 1991.
4. P. F. Dubois, Guest Editor's Introduction: Python: Batteries Included, Computing in Science and Engineering 9 (3), pp. 7-9, 2007. doi:10.1109/MCSE.2007.51
5. The MathWorks, MATLAB: The Language of Technical Computing. Natick, MA, 2002.
6. S. Wolfram, The Mathematica Book, 5th ed., Wolfram Media, 2003.
7. R. Kötter, J. A. Bednar, A. Davison, M. Diesmann, M.-O. Gewaltig, M. Hines, and E. Muller (eds.), Frontiers in Neuroinformatics: Special Topic on Python in Neuroscience, Frontiers Research Foundation, 2009. http://frontiersin.org/neuroinformatics/specialtopics/8
8. G. van Rossum, Python/C API Reference Manual, 2008. http://docs.python.org/api/api.html
9. A. Alexandrescu, Modern C++ Design, Addison-Wesley, Boston, 2001.
10. J. M. Eppler, M. Helias, E. Muller, M. Diesmann, and M.-O. Gewaltig, PyNEST: a convenient interface to the NEST simulator, Front. Neuroinform. 2, p. 12, 2008. doi:10.3389/neuro.11.012.2008

The author is grateful for the support of the NEST Initiative, in particular to Markus Diesmann, Marc-Oliver Gewaltig, Moritz Helias, Eilif Muller, and Hans Ekkehard Plesser. Partially funded by DIP F1.2, BMBF Grant 01GQ0420 to the Bernstein Center for Computational Neuroscience Freiburg, EU Grant 15879 (FACETS), the Next-Generation Supercomputer Project of MEXT (Japan), and the Helmholtz Alliance on Systems Biology (Germany).

© 2009 Institute of Neuromorphic Engineering
