Unit | Neural Coding and Brain Computing Unit (Tomoki Fukai) |
Teacher | Milena Menezes Carvalho |
Lecture Material | https://github.com/oist-ncbc/skill-pill-plus/blob/master/Tutorial-3_Synapses-and-Networks |
Library | https://github.com/iwasakishuto/University/tree/gh-pages/3A/theme/program/OIST/tutorial3.py |
import sys
sys.path.append("path/to/library") # Like "University/3A/theme/program"
Synapses¶
Neuronal connectivity¶
- Neurons are connected at specific sites called synapses.
- Usually, the axon of a presynaptic neuron makes contact with the dendrite (or soma) of a postsynaptic neuron in what's called a chemical synapse:
- neurotransmitters are transferred via the tiny gap between the two neurons (synaptic cleft) thanks to biochemical processes.
- This generates a change in membrane potential at the postsynaptic site, called postsynaptic potential.
- Electrical synapses, also known as gap junctions, can also be present -- in this case, specialized proteins make a direct electrical connection between neurons. [1]
In this tutorial we will connect two or more neurons in NEST to create our first neuronal networks.
- First, we will learn the basic commands following [3] and [4]
- Then apply these to a well-known balanced network known as the Brunel network.
- By the end of this tutorial, you should be able to understand that a network's stability and activity behavior is deeply influenced by its parametrization.
Sources:¶
[1] Wulfram Gerstner and Werner M. Kistler, "Spiking Neuron Models". Cambridge University Press, 2002
[2] Peter Dayan and L. F. Abbot, "Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems". The MIT Press, 2001
[3] PyNEST tutorial: Part 2, Populations of Neurons
[4] PyNEST tutorial: Part 3, Connecting Networks with Synapses
Synapse types¶
- The simplest synapse type is the `static_synapse`.
  - The synaptic weight, a measure of how strongly the presynaptic neuron influences the postsynaptic neuron, is static, i.e. it does not change over time.
  - Synaptic transmission, however, is not instantaneous.
  - The synaptic delay is defined as the time between the presynaptic neuron's activation (action potential) and the moment the postsynaptic potential is generated.
- In NEST, we can check the default values of all parameters using the `GetDefaults` function:

nest.GetDefaults('static_synapse')
- A model better suited to real biological events is spike-timing-dependent plasticity (STDP).
- The synaptic weight changes according to the temporal order of the pre- and postsynaptic spike times.
- In NEST, the most common STDP mechanisms are implemented in `stdp_synapse`.
- The change of normalized synaptic weights is described by
$$
\begin{aligned}
\Delta w &= \begin{cases}
- \lambda f_{-}(w) \times K_{-}(\Delta t) & \text{if } \Delta t \leq 0, \\
\lambda f_{+}(w) \times K_{+}(\Delta t) & \text{if } \Delta t > 0,
\end{cases} \\
\Delta t &\equiv t_{\text{post}} - t_{\text{pre}} \\
K_{\pm}(\Delta t) &= \exp(-|\Delta t| / \tau_{\pm})
\end{aligned}
$$
- The update functions $$f_{+}(w) = (1-w)^{\mu_{+}} \text{ and } f_{-}(w) = \alpha w^{\mu_{-}}$$ create synaptic potentiation (stronger weights) when causal spiking is detected ($\Delta t > 0$), otherwise generating synaptic depression (weaker weights).
- This rule is also known as temporally asymmetric Hebbian plasticity and has been thoroughly studied under different parametrizations (see the table below).
| STDP Type | Parametrization | Ref. |
|---|---|---|
| multiplicative STDP | $\mu_{+}=\mu_{-}=1.0$ | [6] |
| additive STDP | $\mu_{+}=\mu_{-}=0.0$ | [7] |
| Guetig STDP | $\mu_{+}=\mu_{-}\in[0.0, 1.0]$ | [5] |
| van Rossum STDP | $\mu_{+}=0.0,\ \mu_{-}=1.0$ | [8] |
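The update rule above can be sketched in plain Python. This is a minimal illustration of the equations, not the NEST implementation; the default parameter values (`lam`, `alpha`, the time constants) are illustrative choices:

```python
import math

def stdp_dw(w, dt, lam=0.01, alpha=1.0, mu_plus=1.0, mu_minus=1.0,
            tau_plus=20.0, tau_minus=20.0):
    """Change of the normalized weight w for a pre/post spike pair
    separated by dt = t_post - t_pre (ms)."""
    if dt > 0:  # causal pairing (pre before post) -> potentiation
        return lam * (1.0 - w) ** mu_plus * math.exp(-abs(dt) / tau_plus)
    else:       # anti-causal pairing -> depression
        return -lam * alpha * w ** mu_minus * math.exp(-abs(dt) / tau_minus)

# Causal pairs strengthen the synapse, anti-causal pairs weaken it:
print(stdp_dw(0.5, +10.0))  # positive (potentiation)
print(stdp_dw(0.5, -10.0))  # negative (depression)
```

Setting `mu_plus = mu_minus = 0.0` reproduces the additive rule of the table (the update no longer depends on the current weight), while `1.0` gives the multiplicative rule.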
Sources:¶
[5] Guetig et al. (2003). Learning input correlations through nonlinear temporally asymmetric hebbian plasticity. Journal of Neuroscience, 23:3697-3714 DOI: https://doi.org/10.1523/JNEUROSCI.23-09-03697.2003
[6] Rubin J, Lee D, Sompolinsky H (2001). Equilibrium properties of temporally asymmetric Hebbian plasticity. Physical Review Letters, 86:364-367. DOI: https://doi.org/10.1103/PhysRevLett.86.364
[7] Song S, Miller KD, Abbott LF (2000). Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience 3(9):919-926. DOI: https://doi.org/10.1038/78829
[8] van Rossum MCW, Bi G-Q, Turrigiano GG (2000). Stable Hebbian learning from spike timing-dependent plasticity. Journal of Neuroscience, 20(23):8812-8821. DOI: https://doi.org/10.1523/JNEUROSCI.20-23-08812.2000
Introduction: synapses in NEST¶
- Check which synapse types you can use:
  - They are listed in the NEST documentation, or
  - can be queried by running the following command:

import nest
nest.Models('synapses')
- For STDP synaptic models, the time constant of the depressing window of STDP (`tau_minus`) is, exceptionally, a parameter of the postsynaptic neuron:

nest.Create("iaf_psc_alpha", params={"tau_minus": 30.0})
- However, `tau_plus` is a synapse parameter:

nest.SetDefaults("stdp_synapse", {"tau_plus": 15.0})
- Customized variants of a synapse model can be created using `CopyModel()`, and can be used anywhere a built-in model name can be used:

nest.CopyModel("stdp_synapse", "layer1_stdp_synapse", {"Wmax": 90.0})
- Connectivity rules for multiple neurons can be defined. Please check [3] to learn more details about creating and connecting neuron populations.
- Synaptic parameters can also be randomly distributed by assigning a dictionary to the parameter. This should contain the target distribution and its optional parameters, as listed below:
| Distributions | Keys |
|---|---|
| `normal` | `mu`, `sigma` |
| `lognormal` | `mu`, `sigma` |
| `uniform` | `low`, `high` |
| `uniform_int` | `low`, `high` |
| `binomial` | `n`, `p` |
| `exponential` | `lambda` |
| `gamma` | `order`, `scale` |
| `poisson` | `lambda` |
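For intuition, the distributions in the table correspond to standard samplers: a parameter dictionary like `{"distribution": "uniform", "low": 0.1, "high": 2.0}` tells NEST to give each created connection an independent draw. The sketch below illustrates this with Python's stdlib `random` module (not the NEST sampling machinery; all parameter values are arbitrary):

```python
import random

random.seed(0)  # reproducible illustration

# Stdlib equivalents of some of the table's distributions.
samplers = {
    "normal":      lambda: random.normalvariate(mu=1.0, sigma=0.2),
    "lognormal":   lambda: random.lognormvariate(mu=0.0, sigma=0.5),
    "uniform":     lambda: random.uniform(0.1, 2.0),
    "exponential": lambda: random.expovariate(lambd=1.0),
    "gamma":       lambda: random.gammavariate(alpha=2.0, beta=0.5),
}

# One independent draw per connection, e.g. five uniform weights:
weights = [samplers["uniform"]() for _ in range(5)]
print(weights)
```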
- Synapse information can be retrieved from its source, target and synapse model using `GetConnections()`:
Example.¶
num_neurons = 10
epop1 = nest.Create("iaf_psc_delta", num_neurons, params={"tau_m": 30.0})
epop2 = nest.Create("iaf_psc_delta", num_neurons)
conn_dict = {"rule": "fixed_indegree", "indegree": 5}
syn_dict = {"model": "stdp_synapse", "alpha": 1.0}
nest.Connect(epop1, epop2, conn_dict, syn_dict)
neuron = nest.Create("iaf_psc_alpha")
syn_dict = {"model": "stdp_synapse",
"alpha": {"distribution": "uniform", "low": 0.1, "high": 2.},
"weight": {"distribution": "uniform", "low": 0.5, "high": 5.},
"delay": 1.0}
nest.Connect(epop1, neuron, "all_to_all", syn_dict)
# Synapse information can be retrieved
print(nest.GetConnections(epop1, target=epop2, synapse_model="stdp_synapse"))
# We can then extract the data using `GetStatus()`.
# Specific information can be retrieved by providing a list of desired parameters.
conns = nest.GetConnections(epop1, synapse_model="stdp_synapse")
conn_vals = nest.GetStatus(conns, ["target","weight"])
Networks¶
Example1: connecting two neurons¶
- In this example, we will create and connect two neurons, A and B.
- Neuron A is of type `iaf_psc_delta`:
  - It receives an external current $I_e = 376.0\ \mathrm{pA}$.
  - This current is sufficient to elicit a spike every $\approx 50\ \mathrm{ms}$.
- Neuron B is solely connected to neuron A:
  - It can only spike if the input from A is strong enough.

Let's observe how B can be influenced by A.
import nest
import matplotlib.pyplot as plt
from OIST.tutorial3 import NeuronA, NeuronB, plotMultipleNeurons
# First, verify the activity of neuron A.
nest.ResetKernel()
modelA = NeuronA()
modelA.build()
ax = plotMultipleNeurons(modelA)
plt.show()
Now let's create neuron B as `iaf_psc_delta` with no external current:
modelB = NeuronB()
modelB.build()
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
Neuron B is still inactive. We need to connect both neurons:
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse"})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
Q. Suggest at least 3 alterations we can make in this example to elicit spiking activity from B.
- Lower the threshold `V_th`: `nest.SetStatus(neuron_B, {'V_th': -69.})` (default: -55)
- Raise the external current `I_e`
- Raise the weight of the connection between `neuron_A` and `neuron_B`
# Lower the threshold V_th.
nest.ResetKernel()
modelA = NeuronA()
modelA.build()
modelB = NeuronB()
modelB.set_params("neuron", V_th=-69.)
modelB.build()
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse"})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
# Raise the current I_e.
nest.ResetKernel()
modelA = NeuronA()
modelA.build()
modelB = NeuronB()
modelB.set_params("neuron", I_e=350.)
modelB.build()
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse"})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
# Raise the connection weight between neuron A and neuron B.
nest.ResetKernel()
modelA = NeuronA()
modelA.build()
modelB = NeuronB()
modelB.build()
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse", "weight": 14.9})
nest.Simulate(300.)
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse", "weight": 15.})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
Because of the shape of B's trace and the scale of observation, the spikes are hard to see, but neuron B does spike once the connection weight is increased.
Q. Check what happens if the neuron model is different: for example, change `iaf_psc_delta` to `hh_psc_alpha`.
# Lower the threshold V_th.
nest.ResetKernel()
modelA = NeuronA(model_name="hh_psc_alpha")
modelA.set_params("neuron", I_e=250.)
modelA.build()
modelB = NeuronB(model_name="hh_psc_alpha")
modelB.build()
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse"})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
Q. Connect B to A reciprocally and check their dynamics. How can you describe their behavior?
nest.ResetKernel()
modelA = NeuronA()
modelA.build()
modelB = NeuronB()
modelB.set_params("neuron", V_th=-65.)
modelB.build()
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse"})
nest.Connect(modelA.neuron, modelB.neuron, {"rule": "one_to_one"}, {"model": "static_synapse", "weight": 5.})
nest.Connect(modelB.neuron, modelA.neuron, {"rule": "one_to_one"}, {"model": "static_synapse", "weight": 1.})
ax = plotMultipleNeurons(modelA, modelB)
plt.show()
Example2: Random Balanced Network¶
We will now simulate a network of excitatory and inhibitory integrate-and-fire (IAF) spiking neurons connected randomly with fixed weights. This network, called the Brunel network, was proposed and studied in [10]:
- The network is composed of:
  - excitatory neurons (80%)
  - inhibitory neurons (20%)
  - connection probability $\epsilon = 10\%$ (`epsilon`).
- All neuron types can be connected (E$\rightarrow$E, E$\rightarrow$I, I$\rightarrow$E and I$\rightarrow$I).
- Neurons receive external input from outside the network, modeled by a Poisson spike train generator [11]:
  - rate `nu_ex`, $\nu_{ex} = \eta \cdot \nu_{th}$
  - the threshold rate $\nu_{th}$ (`nu_th`) is calculated as $$\nu_{th} = \frac{\theta}{J \cdot C_E \cdot \tau_{m}}$$
  - In general, the interspike intervals drawn are exponentially distributed.
- The synaptic input current to each neuron is calculated as
$$I_i(t) = \tau \sum_j J_j \sum_k \delta(t - \delta - t_j^k)$$
  - The sum over $j$ covers all synapses (internal and external to the network).
  - The sum over $k$ covers all spikes arriving at times $t_j^k$, each with transmission delay $\delta$ (`delta`).
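The external Poisson drive can be sketched in plain Python: since the interspike intervals of a Poisson process are exponentially distributed, a spike train is obtained by accumulating draws from `random.expovariate`. This illustrates the input statistics only (it is not NEST's `poisson_generator`; the rate value is arbitrary):

```python
import random

def poisson_spike_train(rate_hz, t_max_ms, seed=0):
    """Generate spike times (ms) of a Poisson process by summing
    exponentially distributed interspike intervals."""
    rng = random.Random(seed)
    times, t = [], 0.0
    while True:
        # mean ISI = 1/rate; rate is converted from Hz to 1/ms
        t += rng.expovariate(rate_hz / 1000.0)
        if t >= t_max_ms:
            break
        times.append(t)
    return times

spikes = poisson_spike_train(rate_hz=10.0, t_max_ms=1000.0)
print(len(spikes))  # on average ~10 spikes in one second at 10 Hz
```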
This simple postsynaptic current model is called a delta synapse: for simplicity, the stimulation is applied at a single point in time using the Dirac delta function. In biological systems, synaptic models are diverse and can vary even within the same neuron, depending, for example, on the function of the synapse.
In NEST, we will create a network of `iaf_psc_delta` neurons connected using `static_synapse` with weights $J_E$ and $J_I = -g \cdot J_E$. We will study how this network changes its behavior depending on its parametrization.
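For intuition, the threshold rate can be computed directly from the formula above. The values below are assumptions based on the standard Brunel parametrization used in this tutorial ($\theta = 20$ mV, $J = 0.1$ mV, $\tau_m = 20$ ms, 12500 neurons of which 80% are excitatory):

```python
# Assumed standard Brunel-network parameters (cf. [10]); illustrative only.
theta = 20.0    # firing threshold (mV)
J     = 0.1     # excitatory synaptic weight J_E (mV)
N     = 12500   # total number of neurons
eps   = 0.1     # connection probability epsilon
C_E   = int(eps * 0.8 * N)   # excitatory inputs per neuron = 1000
tau_m = 20.0    # membrane time constant (ms)

# nu_th = theta / (J * C_E * tau_m); tau_m converted to seconds for Hz
nu_th = theta / (J * C_E * (tau_m / 1000.0))
print(nu_th)  # 10.0 Hz

# The external rate is then nu_ex = eta * nu_th, e.g. for eta = 2.0:
eta = 2.0
print(eta * nu_th)  # 20.0 Hz
```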
Sources:¶
[10] Brunel N (2000). Dynamics of sparsely connected networks of excitatory and inhibitory spiking neurons. Journal of Computational Neuroscience 8, 183-208.
[11] Poisson distribution
from OIST.tutorial3 import BrunelNetwork
model = BrunelNetwork()
# Model parameters.
model.__dict__
neuron_params = {
"C_m" : 1.0,
"tau_m" : 20.0,
"t_ref" : 2.0,
"E_L" : 0.0,
"V_reset": 0.0,
"V_m" : 0.0,
"V_th" : 20.0
}
model.build(N_total_neuron=12500, N_rec_volt=5, N_rec=50, J=0.1, **neuron_params)
model.simulate(T=900.)
Below we plot the voltage, raster, and activity rate of the excitatory and inhibitory populations.
model.plotVoltageEx()
model.plotVoltageIn()
model.plotVoltageCompare(0)
model.plotRaster()
- One of the main findings in Brunel's paper is the variety of activity states that can be observed in this network, depending on:
- $g$: the ratio between inhibitory and excitatory synaptic weights
- $\eta$: the external excitation rate relative to the threshold rate
- $\delta$: the synaptic delay
- $\epsilon$: the connection probability.
- By altering these parameters, Brunel observed different collective behaviors, identified as:
- asynchronous irregular (AI)
- asynchronous regular (AR)
- synchronous irregular (SI)
- synchronous regular (SR).
- These behaviors are exemplified in the images below.
Q. Use the graphs as a starting point. Try to elaborate on how each of the four parameters affects the state of this network.