Welcome to The Neuromorphic Engineer

Silicon synapse implements multiple neural computational primitives


Chiara Bartolozzi and Giacomo Indiveri

4 February 2008

A compact analog circuit produces realistic excitatory postsynaptic currents in response to presynaptic action potentials and is compatible with circuits that model plasticity on both short and long time scales.

Synapses are highly specialized structures that transmit information between neurons. When an action potential generated by a neuron reaches a presynaptic terminal, a cascade of events produces a flow of ionic currents into or out of the postsynaptic neuron's membrane, with a characteristic time course that can last up to several hundred milliseconds.1 Modeling the detailed dynamics of postsynaptic currents can be a crucial step for learning neural codes and encoding spatio-temporal patterns of spikes.2 However, both software computational models and VLSI (very large scale integration) implementations of neural systems have often neglected the dynamic aspects of synaptic currents.

Modeling the temporal dynamics of each individual synapse in a network of integrate and fire (I&F) neurons can be very onerous in terms of silicon real-estate for dedicated VLSI implementations. A compromise between highly detailed models of synaptic dynamics and no dynamics at all is to use computationally efficient models that account for the basic macroscopic properties of synaptic transmission. We recently proposed a novel VLSI synaptic circuit, the differential-pair integrator (DPI),3 that implements one of these efficient models based on exponentials4 and supports a wide range of synaptic properties. The design of the DPI synapse was inspired by a series of similar circuits proposed in the literature: these collectively share many of the advantages of our solution but individually lack one or more of the features of our design.3

Figure 1. Schematic diagram of the silicon synapse. The functional parts of the circuit implement exponential ligand-gated postsynaptic current generation (DPI), short-term depression (STD), NMDA (N-methyl-D-aspartate) postsynaptic-receptor voltage gating, and conductance-based functionality (G).

Figure 1 shows the schematic diagram of the silicon synapse. The basic DPI block reproduces the functionality of ligand-gated AMPA (α-amino-3-hydroxy-5-methylisoxazole-4-propionic acid) receptors. Assuming subthreshold operation and saturation regime, the transfer function of the circuit is a first-order differential equation:

τ (dIsyn/dt) + Isyn = (Igain Iw / Iτ) · Isyn/(Isyn + Igain)    (1)

where τ = C UT/(κ Iτ), and the term Igain represents a p-type subthreshold current controlled by Vthr. Similarly, the amplitude of the current Iτ is set by the voltage Vτ, and that of Iw is set by Vw.

Although this differential equation is non-linear, it simplifies when the circuit is stimulated with an input step signal: in this case the DPI output current Isyn rises to values such that Isyn ≫ Igain, provided Iw ≫ Iτ. Under these conditions the Isyn dependence in the second term of Equation 1 cancels out, and the non-linear differential equation reduces to the canonical first-order low-pass-filter equation:

τ (dIsyn/dt) + Isyn = Igain Iw / Iτ    (2)
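These dynamics can be illustrated with a behavioural sketch (not a transistor-level model) obtained by Euler-integrating Equation 1; all parameter values below are illustrative choices, not measurements from the chip.

```python
import numpy as np

def dpi_response(spikes, dt=1e-4, tau=10e-3, i_w=1e-9, i_tau=10e-12, i_gain=50e-12):
    """Euler integration of the DPI equation
        tau * dIsyn/dt = -Isyn + (Igain*Iw/Itau) * Isyn/(Isyn + Igain).
    `spikes` is a boolean array: the weight current Iw flows only while
    a presynaptic pulse is high. All parameters are illustrative."""
    i_syn = np.zeros(len(spikes))
    i_syn[0] = 1e-13  # small seed current: the equation has a fixed point at zero
    for t in range(1, len(spikes)):
        drive = i_w if spikes[t] else 0.0
        gain_term = (i_gain * drive / i_tau) * i_syn[t - 1] / (i_syn[t - 1] + i_gain)
        i_syn[t] = i_syn[t - 1] + (dt / tau) * (-i_syn[t - 1] + gain_term)
    return i_syn

# a 50 ms presynaptic pulse followed by 50 ms of silence:
spikes = np.concatenate([np.ones(500, bool), np.zeros(500, bool)])
epsc = dpi_response(spikes)
```

During the pulse the output charges toward its steady state like a first-order low-pass filter; after the pulse it decays exponentially with time constant τ, reproducing the slow EPSC tail described above.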

The synaptic efficacy depends on both the currents Iw and Igain. The presence of the Mw nFET (n-type field-effect transistor) makes the DPI compatible with previously proposed circuits for implementing synaptic plasticity, both on short time scales with models of short-term depression5,6 (STD block in Figure 1) and on longer time scales with spike-based learning mechanisms, such as spike-timing-dependent plasticity7 (STDP).
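The qualitative effect of the STD block can be captured by a phenomenological sketch in which each presynaptic spike depletes the available efficacy and the efficacy recovers exponentially between spikes; the depression factor and recovery time constant below are illustrative assumptions, not the circuit's equations.

```python
import math

def std_weights(spike_times, w0=1.0, depress=0.6, tau_rec=0.3):
    """Effective weight at each spike of a depressing synapse:
    every spike multiplies the available efficacy by `depress`, and the
    efficacy recovers exponentially toward 1 with time constant tau_rec.
    All parameters are illustrative."""
    w, last_t, weights = 1.0, None, []
    for t in spike_times:
        if last_t is not None:
            w = 1.0 - (1.0 - w) * math.exp(-(t - last_t) / tau_rec)
        weights.append(w0 * w)  # efficacy seen by this spike
        w *= depress            # depletion caused by this spike
        last_t = t
    return weights

# a 50 Hz burst: successive EPSCs shrink; a late spike finds a recovered synapse
weights = std_weights([0.00, 0.02, 0.04, 0.06, 1.0])
```

Within the burst each successive weight is smaller than the last, while the spike arriving one second later sees an almost fully recovered synapse.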

The thresholding property of the differential pair in the NMDA block of Figure 1 is used to reproduce the voltage gating property of NMDA receptors. Similarly, the differential pair in the G block is used to implement a linear dependence of the postsynaptic current on the difference between the postsynaptic membrane depolarization and the synapse reversal potential Vgthr, implementing conductance-based behaviour. A complementary version of the DPI circuit can be used to implement inhibitory GABA (γ-aminobutyric acid) receptors.
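The combined effect of the NMDA and G blocks can be sketched as multiplicative factors applied to the DPI output current: a soft threshold (the differential pair acts as a sigmoid) for the voltage gating, and a linear distance-from-reversal term for the conductance behaviour. The voltage values and the sigmoid slope below are illustrative assumptions.

```python
import math

def postsynaptic_current(i_syn, v_mem, v_nmda=-0.04, v_rev=0.0, slope=0.004):
    """Apply NMDA-style voltage gating and conductance-based scaling to a
    DPI output current i_syn. The gating differential pair behaves as a
    soft threshold (sigmoid) around v_nmda; the G block scales the current
    with the distance of the membrane voltage from the reversal potential
    v_rev. All voltage parameters are illustrative."""
    gate = 1.0 / (1.0 + math.exp(-(v_mem - v_nmda) / slope))  # NMDA block
    drive = max(v_rev - v_mem, 0.0)                           # G block
    return i_syn * gate * drive

low = postsynaptic_current(1e-9, -0.07)    # hyperpolarized: gated off
strong = postsynaptic_current(1e-9, -0.02) # above threshold, far from reversal
near = postsynaptic_current(1e-9, -0.001)  # near reversal: little drive left
```

A depolarized membrane far from the reversal potential receives the largest current; a hyperpolarized one is gated off, and one near the reversal potential receives almost nothing, matching the behaviour described above.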

Figure 2. Response properties of the silicon synapse: (a) synaptic scaling of the excitatory postsynaptic current (EPSC) amplitude by changing either Vw or Vthr independently; (b) NMDA voltage gating (the blue presynaptic spikes have a visible effect only when the postsynaptic membrane voltage is greater than a set NMDA threshold, dashed line); (c) conductance-based behaviour, as the postsynaptic amplitude decreases when the membrane potential approaches the reversal potential of the synapse (dashed line).

An important additional computational primitive that can be implemented in the DPI, thanks to the extra synaptic efficacy parameter Igain, is “homeostatic plasticity”.8 This slow adaptation mechanism is independent of spike-based learning rules that act on Vw. It can be used to compensate for the intrinsic inhomogeneities of VLSI circuits, improving the stability, robustness, and mismatch tolerance of VLSI networks of spiking neurons without requiring fine tuning of individual circuit parameters. Figure 2 shows the effect of changing Igain and Iw on the synaptic efficacy, as well as the effects of the NMDA and G blocks on the postsynaptic membrane potential.
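Such a homeostatic mechanism can be sketched as a slow integral controller that nudges the gain current so that the neuron's average firing rate tracks a target; the target rate, learning rate, and current values below are illustrative assumptions, not the adaptation circuit itself.

```python
def homeostatic_gain(rates, i_gain=50e-12, target=20.0, eta=1e-14):
    """Slow integral control of the synaptic gain current: if the measured
    firing rate is below target, increase i_gain; otherwise decrease it.
    This operates far more slowly than spike-based learning on Iw.
    All parameters are illustrative."""
    history = []
    for r in rates:
        i_gain += eta * (target - r)  # slow drift toward the target rate
        i_gain = max(i_gain, 0.0)     # a subthreshold current cannot go negative
        history.append(i_gain)
    return history

# a neuron firing too fast has its gain slowly scaled down...
gains_down = homeostatic_gain([60.0] * 100)
# ...while one firing too slowly is scaled up
gains_up = homeostatic_gain([5.0] * 100)
```

Because the adjustment acts on Igain rather than Vw, it rescales all of a neuron's synapses together without interfering with the spike-based learning rule.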

In summary, we have presented a new hardware model of synaptic dynamics that narrows the gap between silicon and biological synapses. The exponential time course of the postsynaptic currents implements linear synaptic summation, an essential property of synaptic transmission observed in real neurons and used in computational models.4 Very simple additional circuits extend the basic DPI functionality to include phenomenological implementations of further synaptic properties. Such circuits enrich the ensemble of computational primitives that can be emulated in silicon, within a unified framework that encompasses them all in a single compact circuit. When integrated in VLSI, these synapses make it possible to implement massively parallel linear and non-linear synaptic-transmission primitives and to carry out elaborate neural computation experiments in real time.


Chiara Bartolozzi
Robotics, Brain, and Cognitive Sciences, Italian Institute of Technology

Dr. Chiara Bartolozzi obtained her PhD at the Institute for Neuroinformatics, ETH Zurich, developing a multi-chip system for visual selective attention. She is currently in a post-doctoral position at the Italian Institute of Technology, working on the design of neuromorphic sensors for robotic platforms and circuits for the processing and transmission of neural signals.

Giacomo Indiveri
Institute of Neuroinformatics, University of Zurich and Swiss Federal Institute of Technology (UZH|ETHZ)

Dr. Giacomo Indiveri received his Bachelors degree and PhD in Electronic Engineering from the University of Genoa, Italy, and the venia legendi in Neuromorphic Engineering from ETH Zurich, Switzerland. He is currently an Assistant Professor at the UZH|ETH Institute of Neuroinformatics. His current research interests include the design and implementation of distributed artificial neural systems for sensory processing and spike-based plasticity mechanisms.

  1. C. Koch, Synaptic input, in Biophysics of Computation: Information Processing in Single Neurons, pp. 85-116, Oxford University Press, 1999.

  2. R. Gütig and H. Sompolinsky, The tempotron: a neuron that learns spike timing-based decisions, Nature Neuroscience 9, pp. 420-428, 2006.

  3. C. Bartolozzi and G. Indiveri, A spiking VLSI selective attention multi-chip system with dynamic synapses and integrate and fire neurons, Proc. Neural Info. Proc. Sys. 2006, 2008. (In press.)

  4. A. Destexhe, Z. Mainen and T. Sejnowski, Kinetic models of synaptic transmission, in Methods in Neuronal Modeling: From Ions to Networks, pp. 1-25, MIT Press, 1998.

  5. C. Rasche and R. Hahnloser, Silicon Synaptic Depression, Biological Cybernetics 84 (1), pp. 57-62, 2001.

  6. M. Boegerhausen, P. Suter and S.-C. Liu, Modeling short-term synaptic depression in silicon, Neural Computation 15 (2), pp. 331-348, 2003.

  7. G. Indiveri, E. Chicca and R. Douglas, A VLSI array of low-power spiking neurons and bistable synapses with spike-timing dependent plasticity, IEEE Trans. Neural Networks 17 (1), pp. 211-221, 2006.

  8. G. Turrigiano, Homeostatic plasticity in neural networks: the more things change, the more they stay the same, Trends in Neurosci. 22 (5), pp. 221-227, 1999.

DOI:  10.2417/1200802.0053

