
Short Survey of

Brain Machine Interfaces


Jose C. Principe, Ph.D.
Distinguished Professor ECE, BME

Computational NeuroEngineering Laboratory
Electrical and Computer Engineering Department
University of Florida
www.cnel.ufl.edu
principe@cnel.ufl.edu
Acknowledgments
Dr. Justin Sanchez, University of Florida
Dr. Phil Kim, Brown University

My students: Yiwen Wang
Antonio Paiva
Il Park
Aysegul Gunduz

NSF (DDDAS, CRCNS), NIH NIBIB, DARPA

Outline
Brain Machine Interface
Definitions
Types
Hardware challenges
BMI Models using rate codes
BMI Models using spike trains
Why is a New Neurotechnology Emerging?





Synergistic advances in
Neuroscience
Understanding the brain as an information-processing
system: localization, coding
Physical Interfaces
Tiny size, stable materials, effective at sensing and
stimulating
Device miniaturization
Low power electronics, fast computers, algorithms
Brain Machine Interfaces (BMI)





A man-made device that either substitutes a
sensory input to the brain, repairs functional
communication between brain regions, or
translates intention of movement.


Types of BMIs
Sensory (Input BMI): Providing sensory input to form percepts when natural
systems are damaged.
Ex: Visual, Auditory Prosthesis

Motor (Output BMI): Converting motor intent to a command output (physical
device, damaged limbs)
Ex: Prosthetic Arm Control

Cognitive BMI: Interpret internal neuronal state to deliver feedback to the
neural population.
Ex: Hippocampus bypass

Clinical BMIs: Treat neurological disorders
Parkinson's disease
Epilepsy
Mood disorders

One can also tap into the peripheral nervous system (PNS) to use the
nerves to actuate limbs

Deep Brain Stimulation (DBS)





Donoghue, 2005
Parkinson's disease
Medtronic
Vagal Nerve Stimulation





Epilepsy
Mood disorders
Cyberonics
Cochlear Implants
(http://www.cochlear.com)
[Figure: external microphone, battery, and electronics (12-22 channels) transmit
through a headpiece (antenna) to an internal receiver/stimulator, whose wires drive
12-22 electrodes on the auditory nerve.]
Visual prosthesis





Prof. J. Lobo Antunes was involved in this project

http://www.artificialvision.com/vision/asaio1.html
Cognitive/Memory Prosthesis





Berger et al., 2008
[Figure: hippocampal prosthesis bridging a damaged region.]
Neuromotor Prosthesis
From Thought to Action
Many neuropathies leave cognition intact but disrupt the
control of the motor system
Spinal cord injury
ALS
Cerebral palsy
Stroke
Locked-in syndrome
Muscular dystrophy/atrophy
Limb loss
Goal is to bypass the motor system and create a direct
path between the cortex and an external device.

Proof of Concept, and media fanfare!
What are the underlying principles behind neural control of devices?
J.R. Wolpaw et al. 2002
BCI (BMI) bypasses the brain's normal pathways of peripheral nerves (and muscles)
General Architecture
[Diagram: brain and machine connected through a neural interface and a physical
interface; intent is decoded from the brain into machine action, and stimuli are
coded back into percepts.]
The Fundamental Concept
Coding: stimulus given, neural response to be inferred.
Decoding: neural response given, stimulus to be inferred.
Need to understand how the brain processes information.
Levels of Abstraction for Neurotechnology

Brain is an extremely
complex system
10^12 neurons
10^15 synapses
Specific
interconnectivity
Tapping into the Nervous System

The choice and availability of brain signals and
recording methods can greatly influence the ultimate
performance of the BMI.

The level of BMI performance may be attributed to
selection of electrode technology, choice of model, and
methods for extracting rate, frequency, or timing codes.
http://ida.first.fhg.de/projects/bci/bbci_official/
Choice of Scale for Neuroprosthetics

Recording site                         Bandwidth (approximate)   Localization
Scalp electrodes (coarse, mm scale)    0 ~ 80 Hz                 Volume conduction
Cortical surface: electrocorticogram   0 ~ 500 Hz                Cortical surface
(ECoG)
Implanted electrodes                   0 ~ 7 kHz                 Single neuron
Spatial Resolution of Recordings

Moran
Florida Multiscale Signal Acquisition
EEG, ECoG, and microelectrodes span the range from least invasive (EEG) to
highest resolution (microelectrodes).
NRG IRB approval for human studies; NRG IACUC approval for animal studies.
Develop an experimental paradigm with a nested hierarchy
for studying neural population dynamics.
Examples of Multiscale Signals
[Examples: scalp EEG (Penfield); spikes and LFPs from in vivo extracellular recordings.]
Common BMI-BCI Methods
BMIs --- Invasive, work with intention of movement
Spike trains, field potentials, ECoG
Very specific, potentially better performance

BCIs --- Noninvasive, subjects must learn how to control their
brain activity
EEG
Very small bandwidth
Brain Computer Interfaces (BCI) EEG







Sensory Motor Rhythm
Wadsworth Center, NY
How to put it together?
NeoCortical Brain Areas Related to Movement
Posterior Parietal (PP)
Visual to motor
transformation

Premotor (PM) and Dorsal
Premotor (PMD) -
Planning and guidance
(visual inputs)

Primary Motor (M1)
Initiates muscle contraction

Electrophysiology:
Electrode Arrays
50 µm polyimide-insulated
tungsten
250 µm separation
Wire impedance of 500 kΩ to
1.5 MΩ

footing
two polyimide cables
Flexible polyimide cable
integrated with rigid metal
electrodes
Design Strategy
Metal electrodes
(array of 16)
Glass (Pyrex) wafer
Cured Polyimide

Sputter nickel, pattern via lift-off
Coat with polyimide,
Etch polyimide from contact pads &
probe tip, Insulate free probe tips
(CVD Parylene C)
Remove from substrate
Cut out individual probes
Footing to prescribe insertion depth
Batch fabricated to
reduce assembly time
UF Electrode Arrays
[Figure: recorded extracellular waveform, microvolts versus time (s).]
J. C. Sanchez, N. Alba, T. Nishida, C. Batich, and P. R. Carney,
"Structural modifications in chronic microwire electrodes for cortical
neuroprosthetics: a case study," IEEE Transactions on Neural
Systems and Rehabilitation Engineering, 2006
[Figure: FWIRE device layout. Callouts: 28 mm, 15 mm, 12 mm; through-vias to RX/power
coil; 12.5 mm coil winding; 3.5 mm; 50 µm pitch electrodes; coin battery (10 x 2.5 mm);
through-vias to battery; supporting screws; flexible substrate; TX antenna; modular
electrodes; electrode attachment sites; IF-IC; RFIC; 18 mm coil; battery; patterned
substrate; supporting substrate; electrode array IC; flip-chip connection.]
Specifications:
16 flexible microelectrodes (40 dB, 20 kHz)
Wireless (500 kpulses/sec)
2 mW of power (72-96 hours between charges)
FWIRE: Florida Wireless Implantable
Recording Electrodes
RatPack
Low-Power, Wireless, Portable BMIs

Requirements
Total Weight: < 100g
Small Form Factor
Battery Powered: Run for 4
hours
64 channels
Methods
Customized electronics
Novel discriminative coders
achieving 64:1 compression
with high SNRs



UF PICO System (Backpack)
PICO system = DSP + Wireless
Generation 3
Motor Tasks Performed
[Figure: 2-D reaching trajectories for Task 1 and Task 2 (axes from -40 to 40).]
Data
2 Owl monkeys Belle,
Carmen
2 Rhesus monkeys
Aurora, Ivy
54-192 sorted cells
Cortices sampled: PP,
M1, PMd, S1, SMA
Neuronal activity rate
and behavior are time-
synchronized and
downsampled to 10 Hz

100 msec Binned Counts Raster of 105 neurons (spike sorted)

Firing Rates
[Figure: raster of 100 ms binned counts, neuron number (1-105) versus time bin.]
Ensemble Correlations Local in Time are Averaged with
Global Models
Computational Models of Neural Intent

Two different levels of neurophysiology realism

Black box models: no realism; a functional relation between the
input and the desired response

Generative models: minimal realism; state-space models built with
neuroscience elements
Signal Processing Approaches with Black
Box Modeling
Accessing 2 types of signals (cortical activity and behavior) leads us to a
general class of I/O models.

Data for these models are rate codes obtained by binning spikes in 100
msec windows (see the binning sketch after the model list below).

Optimal FIR filter: linear, feedforward
TDNN: nonlinear, feedforward
Multiple FIR filters: mixture of experts
RMLP: nonlinear, dynamic
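As a minimal illustration of the binning step referenced above (a sketch with made-up spike times; the function name, bin width, and toy data are only placeholders, not part of the original work):

import numpy as np

def bin_spike_counts(spike_times_per_neuron, bin_width=0.1, duration=2.0):
    # Convert per-neuron lists of spike times (seconds) into a neurons x bins count matrix
    edges = np.arange(0.0, duration + bin_width, bin_width)
    return np.vstack([np.histogram(times, bins=edges)[0]
                      for times in spike_times_per_neuron])

# Toy usage: 3 synthetic neurons over 2 seconds, 100 ms bins
rng = np.random.default_rng(0)
spikes = [np.sort(rng.uniform(0, 2, size=rng.integers(5, 40))) for _ in range(3)]
counts = bin_spike_counts(spikes, bin_width=0.1, duration=2.0)
print(counts.shape)   # (3, 20)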
Optimal Linear Model

Ten-tap embedding with 105
neurons
The 1-D topology contains
1,050 parameters (3,150 for 3-D)
The Wiener solution (coincides
with linear regression)

Optimal Linear Model

Let us assume that an M-dimensional multiple time series is
generated by a stationary, stable vector autoregressive (VAR)
model (b is a column vector and the W_i are M x M coefficient
matrices):

x(n) = b + W_1 x(n-1) + \dots + W_L x(n-L) + u(n)

In matrix notation this can be written X = AZ + U, with

X = [x(1), ..., x(T)]                    (M x T)
A = [b, W_1, ..., W_L]                   (M x (ML+1))
Z_n = [1, x(n)^T, ..., x(n-L+1)^T]^T     ((ML+1) x 1)
Z = [Z_0, ..., Z_{T-1}]                  ((ML+1) x T)
U = [u(1), ..., u(T)]                    (M x T)
vec(X): MT x 1,  vec(A): (M^2 L + M) x 1,  vec(U): MT x 1

The multivariate least-squares estimation chooses the A that minimizes

J(A) = tr[ (X - AZ)^T \Sigma_u^{-1} (X - AZ) ]

Setting the gradient to zero,

\partial J / \partial A = -2 \Sigma_u^{-1} (X - AZ) Z^T = 0,

gives the least-squares (Wiener) solution

\hat{A} = X Z^T (Z Z^T)^{-1}
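A minimal numpy sketch of this least-squares solution, assuming binned firing rates and hand kinematics stored as synthetic arrays (the function name, toy dimensions, and data are illustrative only):

import numpy as np

def fit_linear_decoder(rates, kinematics, L=10):
    # Least-squares fit of x_hat(n) = b + sum_l W_l r(n-l+1); equivalent to A = X Z^T (Z Z^T)^-1
    # rates:      (M_neurons, T) binned firing rates
    # kinematics: (D, T) desired signal (e.g., hand position)
    M, T = rates.shape
    # Build the regressor matrix Z: a constant plus L lags of every neuron
    rows = [np.ones(T - L + 1)]
    for lag in range(L):
        rows.append(rates[:, L - 1 - lag : T - lag])
    Z = np.vstack(rows)                      # (M*L + 1, T - L + 1)
    X = kinematics[:, L - 1 :]               # align targets with the embedding
    # Solve X ~ A Z in the least-squares sense
    A = np.linalg.lstsq(Z.T, X.T, rcond=None)[0].T
    return A, Z

# Synthetic check: 105 neurons, 10 taps -> 105*10 + 1 = 1051 coefficients per output
rng = np.random.default_rng(1)
rates = rng.poisson(2.0, size=(105, 2000)).astype(float)
kin = rng.standard_normal((3, 2000))
A, Z = fit_linear_decoder(rates, kin, L=10)
print(A.shape)  # (3, 1051)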

Optimal Linear Model



Effectively we use a regularized
solution

w = (R + \delta I)^{-1} p

Normalized LMS with weight
decay is a simple starting point:

w(n+1) = w(n) + \frac{\eta}{\gamma + ||x(n)||^2} e(n) x(n)

Four multiplies, one divide and
two adds per weight update.
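A sketch of one possible implementation of the update above (the step size, the regularizer gamma, and the placement of the weight-decay term are assumptions made for illustration):

import numpy as np

def nlms_weight_decay(x_stream, d_stream, eta=0.1, gamma=1e-3, decay=1e-4):
    # Normalized LMS with weight decay: w <- (1 - eta*decay)*w + eta*e*x/(gamma + ||x||^2)
    n_taps = x_stream.shape[1]
    w = np.zeros(n_taps)
    for x, d in zip(x_stream, d_stream):
        e = d - w @ x                                   # prediction error
        w = (1.0 - eta * decay) * w + eta * e * x / (gamma + x @ x)
    return w

# Toy usage: identify a known 5-tap filter from noisy data
rng = np.random.default_rng(2)
w_true = np.array([0.5, -0.2, 0.1, 0.0, 0.3])
X = rng.standard_normal((5000, 5))
d = X @ w_true + 0.01 * rng.standard_normal(5000)
print(np.round(nlms_weight_decay(X, d), 2))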
Time-Delay Neural Network (TDNN)

The first layer is a bank of linear
filters followed by a nonlinearity.
The number of delays spans 1 second.
y(n) = W_2 f(W_1 x(n))
Trained with backpropagation.
Topology contains a ten-tap
embedding and five hidden
PEs: 5,255 weights (1-D).
Principe, UF
Multiple Switching Local Models

Multiple adaptive filters that compete to win the modeling of a signal
segment.
The structure is trained all together with normalized LMS/weight decay.
Needs to be adapted for input-output modeling.
We selected 10 FIR experts of order 10 (105 input channels).
Recurrent Multilayer Perceptron (RMLP)
Nonlinear Black Box
Spatially recurrent dynamical
systems
Memory is created by feeding
back the states of the hidden
PEs.
Feedback allows for continuous
representations on multiple
timescales.
If unfolded into a TDNN it can be
shown to be a universal mapper
in R^n.
Trained with backpropagation
through time.

y_1(t) = f(W_1 x(t) + W_f y_1(t-1) + b_1)
y_2(t) = W_2 y_1(t) + b_2
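A sketch of the forward pass defined by these equations, assuming a tanh hidden nonlinearity (the slide does not specify f) and toy dimensions matching the 105-input, 5-hidden-PE topology:

import numpy as np

def rmlp_forward(X, W1, Wf, b1, W2, b2):
    # X: (T, n_inputs); returns hidden states Y1 (T, n_hidden) and outputs Y (T, n_out)
    # y1(t) = tanh(W1 x(t) + Wf y1(t-1) + b1);  y(t) = W2 y1(t) + b2
    T = X.shape[0]
    n_hidden = W1.shape[0]
    Y1 = np.zeros((T, n_hidden))
    Y = np.zeros((T, W2.shape[0]))
    y1_prev = np.zeros(n_hidden)
    for t in range(T):
        y1_prev = np.tanh(W1 @ X[t] + Wf @ y1_prev + b1)
        Y1[t] = y1_prev
        Y[t] = W2 @ y1_prev + b2
    return Y1, Y

# Toy dimensions: 105 neural inputs, 5 hidden PEs, 3-D hand position output
rng = np.random.default_rng(3)
W1, Wf = 0.1 * rng.standard_normal((5, 105)), 0.1 * rng.standard_normal((5, 5))
W2 = 0.1 * rng.standard_normal((3, 5))
b1, b2 = np.zeros(5), np.zeros(3)
Y1, Y = rmlp_forward(rng.poisson(2.0, (200, 105)).astype(float), W1, Wf, b1, W2, b2)
print(Y.shape)  # (200, 3)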
Model Building Techniques
Train the adaptive system with neuronal firing rates
(100 msec) as the input and hand position as the
desired signal.
Training - 20,000 samples (~33 minutes of neuronal
firing)
Freeze weights and present novel neuronal data.
Testing - 3,000 samples (5 minutes of neuronal
firing)



Results (Belle)

                 Signal-to-error ratio (dB)     Correlation coefficient
                 (average)     (max)            (average)     (max)
LMS              0.8706        7.5097           0.6373        0.9528
Kalman           0.8987        8.8942           0.6137        0.9442
TDNN             1.1270        3.6090           0.4723        0.8525
Local Linear     1.4489        23.0830          0.7443        0.9748
RNN              1.6101        32.3934          0.6483        0.9852

Based on 5 minutes of test data, computed over 4 sec
windows (training on 30 minutes).
Computing Sensitivities Through the
Models
Feedforward RMLP equations:

y_1(t) = f(W_1 x(t) + W_f y_1(t-1) + b_1)
y_2(t) = W_2 y_1(t) + b_2

General form of the RMLP sensitivity, with D(t) the diagonal matrix of the
derivatives of the hidden nonlinearity at time t:

\partial y(t) / \partial x(t-\Delta) = W_2 [ \prod_{k=0}^{\Delta-1} D(t-k) W_f ] D(t-\Delta) W_1

Feedforward linear equation and its sensitivity:

y(t) = W x(t),    \partial y(t) / \partial x(t) = W
Identify the neurons that affect the output the most.
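A sketch of this sensitivity computation for the tanh RMLP of the earlier forward-pass example (it reuses Y1, W1, Wf, W2 from that sketch; ranking neurons by the average column norm is one reasonable criterion, not necessarily the one used in the original study):

import numpy as np

def rmlp_sensitivity(Y1, W1, Wf, W2, t, delta):
    # dy(t)/dx(t-delta) = W2 [prod_{k=0}^{delta-1} D(t-k) Wf] D(t-delta) W1,
    # with D(s) = diag(1 - y1(s)^2) for a tanh hidden layer (Y1 from rmlp_forward)
    D = lambda s: np.diag(1.0 - Y1[s] ** 2)
    J = W2.copy()
    for k in range(delta):
        J = J @ D(t - k) @ Wf
    return J @ D(t - delta) @ W1        # shape: (n_outputs, n_inputs)

def rank_neurons(Y1, W1, Wf, W2, delta=0):
    # Average sensitivity magnitude per input channel over time, most sensitive first
    S = np.mean([np.linalg.norm(rmlp_sensitivity(Y1, W1, Wf, W2, t, delta), axis=0)
                 for t in range(delta, Y1.shape[0])], axis=0)
    return np.argsort(S)[::-1]

# Example (reusing the toy RMLP variables from the previous sketch):
# print(rank_neurons(Y1, W1, Wf, W2, delta=0)[:10])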
Data Analysis : The Effect of Sensitive Neurons on Performance

[Figure: movements (hits) of the test trajectory reconstructed using the 10 highest-
sensitivity, 84 intermediate-sensitivity, and 10 lowest-sensitivity neurons, with the
cumulative probability of the 3-D error radius (mm) for each subset versus all neurons.]
[Figure: sorted neuron sensitivities, Primate 1, Session 1; labeled neurons:
93, 19, 29, 5, 4, 84, 7, 26, 45, 104.]
Decay trend appears in all
animals and behavioral
paradigms
Cortical Contributions Belle Day 2
[Figure: reconstructed trajectories using every combination of cortical areas as input:
Area 1, Area 2, Area 3, Area 4, Areas 12, 13, 14, 23, 24, 34, 123, 124, 134, 234, 1234.]
Area 1 PP
Area 2 M1
Area 3 PMd
Area 4 M1 (right)
Train 15 separate RMLPs with every combination of cortical input.
Is there enough information in spike
trains for modeling movement?
Analysis is based on the time-embedded model
Correlation with the desired signal is based on a linear filter output for
each neuron
Utilize a non-stationary tracking algorithm
Parameters are updated by LMS
Build a spatial filter
Adaptive in real time
Sparse structure based on regularization enables channel selection

Adapted by LMS
Adapted by on-line LAR
(Kim et al., MLSP, 2004)
Architecture

[Diagram: each neuronal channel x_i(n), i = 1..M, feeds its own tapped delay line
(z^{-1} elements) with FIR weights w_{i1}...w_{iL}, producing a channel output y_i(n);
the channel outputs are mixed by spatial coefficients c_1...c_M to approximate the
desired kinematics d(n).]
Training Algorithms

Tap weights for every time lag are updated by LMS:

w_{ij}(n+1) = w_{ij}(n) + 2 \eta e(n) x_{ij}(n)

Then the spatial filter coefficients are obtained by an on-line version of
least angle regression (LAR) (Efron et al., 2004):

1. Start with all coefficients \beta_i = 0, so the residual is r = y - X\beta = y.
2. Find the predictor most correlated with the residual, argmax_i |x_i^T r|, say x_j.
3. Increase \beta_j (r = y - x_j \beta_j) until some other predictor x_k reaches
   |x_k^T r| = |x_j^T r|.
4. Adjust \beta_j and \beta_k jointly (r = y - x_j \beta_j - x_k \beta_k) until a new
   predictor x_q reaches |x_q^T r| = |x_k^T r| = |x_j^T r|, and so on.
Application to BMI Data: Tracking Performance

Application to BMI Data: Neuronal Subset Selection
[Figure: hand trajectory (z) and the selected neuronal channel indices for the early
and late parts of the data.]
Generative Models for BMIs

Use partial information about the physiological system, normally
in the form of states.

They can be either applied to binned data or to spike trains
directly.

Here we will only cover the spike train implementations.

Difficulty of spike train analysis:
Spike trains are point processes, i.e., all the information is contained
in the timing of events, not in the amplitude of the signals!
Build an adaptive signal processing framework for
BMI decoding in the spike domain.

Features of spike domain analysis:
Binning window size is not a concern
Preserves the randomness of the neuron behavior
Provides more understanding of neuron physiology (tuning) and
interactions at the cell-assembly level
Infer kinematics online
Deal with nonstationarity
More computation with millisecond time resolution
Goal
Recursive Bayesian Approach

State (kinematics): \tilde{X}_t
Time-series model (prediction): \tilde{X}_t = F(\tilde{X}_{t-1}, v_{t-1})
Continuous observation: \tilde{Z}_t = H(\tilde{X}_t, n_t)
Updating yields P(state | observation).
Recursive Bayesian approach

State space representation:

x_{t+1} = f(x_t) + v_t
z_t = h(u_t, x_t) + n_t

The first equation (system model) defines a first-order Markov process.
The second equation (observation model) defines the likelihood of the
observations, p(z_t | x_t). The problem is completely defined by the
prior distribution p(x_0).
Although the posterior distribution p(x_{0:t} | u_{1:t}, z_{1:t}) constitutes the
complete solution, the filtering density p(x_t | u_{1:t}, z_{1:t}) is normally
used for on-line problems.
The general solution methodology is to integrate over the unknown
variables (marginalization).
Recursive Bayesian approach


There are two stages to update the filtering density:

Prediction (Chapman-Kolmogorov): the system model p(x_t | x_{t-1}) propagates
the posterior density into the future,

p(x_t | u_{1:t-1}, z_{1:t-1}) = \int p(x_t | x_{t-1}) p(x_{t-1} | u_{1:t-1}, z_{1:t-1}) dx_{t-1}

Update: Bayes' rule updates the filtering density,

p(x_t | u_{1:t}, z_{1:t}) = p(z_t | x_t, u_t) p(x_t | u_{1:t-1}, z_{1:t-1}) / p(z_t | u_{1:t}, z_{1:t-1})

The following densities are needed in the solution:

p(x_t | x_{t-1}) = \int \delta(x_t - f(x_{t-1}) - v_{t-1}) p(v_{t-1}) dv_{t-1}
p(z_t | x_t, u_t) = \int \delta(z_t - h(u_t, x_t) - n_t) p(n_t) dn_t
p(z_t | u_{1:t}, z_{1:t-1}) = \int p(z_t | x_t, u_t) p(x_t | u_{1:t-1}, z_{1:t-1}) dx_t
State estimation framework for BMI decoding

Kinematic state, neural tuning function, and multi-spike train observation:

x_k = F_{k-1}(x_{k-1}, v_{k-1})
z_k = H_k(x_k, n_k)
[Figure: recorded spike train and the corresponding velocity trace over time (ms).]
Decoding
Kinematic dynamic model

Key idea: work with the probability of spike firing, which is a
continuous random variable.
Kalman filter for BMI decoding

[Diagram: kinematic state -> linear neuron tuning function -> firing rate (continuous
observation); Gaussian noise, linear prediction and updating, P(state | observation).]
[Wu et al. 2006]
For Gaussian noises and linear prediction and observation models, there
is an analytic solution called the Kalman Filter.
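A sketch of the Kalman recursion used in this type of decoder; in the BMI setting of Wu et al. the matrices F, Q (kinematic model) and H, R (linear tuning model) are fit from training data, which is not shown here:

import numpy as np

def kalman_decode(Z, F, Q, H, R, x0, P0):
    # Standard Kalman recursion for x_t = F x_{t-1} + v (cov Q), z_t = H x_t + n (cov R)
    # Z: (T, dim_obs) binned firing rates; returns decoded states (T, dim_state)
    x, P = x0.copy(), P0.copy()
    out = np.zeros((Z.shape[0], x0.size))
    for t, z in enumerate(Z):
        # Prediction
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(x.size) - K @ H) @ P
        out[t] = x
    return out

# Toy usage: 2-D state observed through 20 noisy synthetic "neurons"
rng = np.random.default_rng(5)
H = rng.standard_normal((20, 2)); F = np.eye(2); Q = 0.01 * np.eye(2); R = np.eye(20)
Z = rng.standard_normal((100, 20))
print(kalman_decode(Z, F, Q, H, R, np.zeros(2), np.eye(2)).shape)   # (100, 2)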
Particle Filter for BMI decoding

[Diagram: kinematic state -> exponential neuron tuning function -> firing rate
(continuous observation); non-Gaussian noise, linear prediction, P(state | observation).]
[Brockwell et al. 2004]
In general the integrals need to be approximated by sums using
Monte Carlo integration with a set of samples drawn from the
posterior distribution of the model parameters.
Step 2- Tuning Function Estimation

Neural firing Model




Assumption :

generation of the spikes depends only on the kinematic
vector we choose.
Linear filter -> nonlinearity f -> Poisson spike generation (velocity in, spikes out):

\lambda_t = f(k \cdot v_t)
spike_t = Poisson(\lambda_t)
Step 2- Linear Filter Estimation

Spike Triggered Average (STA)

Geometry interpretation
k = \alpha (E[v v^T] + \delta I)^{-1} E[v | spike]

[Figure: geometric interpretation for neuron 72; spike-triggered velocities (VpS) versus
all velocities (Vp) projected onto the first two principal components.]
Step 2- Nonlinear f estimation


Step 2- Diversity of neural nonlinear properties



Ref: Paradoxical cold
[Hensel et al. 1959]
Step 2- Estimated firing probability and
generated spikes


Step 3: Sequential Estimation Algorithm for
Point Process Filtering

Consider the neuron as an inhomogeneous Poisson point process with
conditional intensity

\lambda(t_k) = \exp(\mu + k \cdot v_k)

The probability of observing an event in \Delta t is defined through

\lambda(t | x(t), \theta(t), H(t)) = \lim_{\Delta t \to 0}
    Pr(N(t + \Delta t) - N(t) = 1 | x(t), \theta(t), H(t)) / \Delta t

Observing \Delta N_k spikes in an interval \Delta t, the likelihood of the spike
model is

P(\Delta N_k | x_k, H_k) = (\lambda(t_k | x_k, H_k) \Delta t)^{\Delta N_k}
    \exp(-\lambda(t_k | x_k, H_k) \Delta t)

The one-step prediction density (Chapman-Kolmogorov) is

p(x_k | \Delta N_{1:k-1}, H_k) = \int p(x_k | x_{k-1}) p(x_{k-1} | \Delta N_{1:k-1}, H_{k-1}) dx_{k-1}

and the posterior of the state vector, given an observation \Delta N_k, is

p(x_k | \Delta N_k, H_k) = P(\Delta N_k | x_k, H_k) p(x_k | H_k) / p(\Delta N_k | H_k)
Step 3: Sequential Estimation Algorithm for
Point Process Filtering

Monte Carlo methods are used to estimate the integrals. Let
{x_{0:k}^i, w_k^i}, i = 1, ..., N_S, be a random measure on the posterior density,
with proposal density q(x_{0:k} | \Delta N_{1:k}).

The posterior density can then be approximated by

p(x_{0:k} | \Delta N_{1:k}) \approx \sum_{i=1}^{N_S} w_k^i \, k(x_{0:k} - x_{0:k}^i)

Generating samples from q(x_{0:k} | \Delta N_{1:k}) and applying the principle of
importance sampling, the weights are updated recursively as

w_k^i \propto w_{k-1}^i \, \frac{P(\Delta N_k | x_k^i) \, p(x_k^i | x_{k-1}^i)}
                               {q(x_k^i | x_{k-1}^i, \Delta N_k)}

By MLE we can find the maximum, or use direct estimation with kernels of
mean and variance:

\tilde{x}_k = \sum_{i=1}^{N_S} w_k^i x_k^i
V_k = \sum_{i=1}^{N_S} w_k^i (x_k^i - \tilde{x}_k)(x_k^i - \tilde{x}_k)^T
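A sketch of a generic bootstrap particle filter with a Poisson observation model for spike counts; the tuning function lam(x), the resampling rule, and the use of the posterior mean (rather than the MLE or kernel-collapse estimators discussed in the slides) are assumptions of this sketch:

import numpy as np

def point_process_pf(dN, F, q_v, lam, x0, n_particles=500, seed=0):
    # dN: (T, C) spike counts per time step and neuron
    # F: state transition matrix; q_v: process-noise std
    # lam(x): returns a length-C vector of firing rates (spikes per step) for state x
    rng = np.random.default_rng(seed)
    dim = x0.size
    particles = x0 + 0.1 * rng.standard_normal((n_particles, dim))
    w = np.full(n_particles, 1.0 / n_particles)
    est = np.zeros((dN.shape[0], dim))
    for t, counts in enumerate(dN):
        # Prediction: propagate each particle through the kinematic model
        particles = particles @ F.T + q_v * rng.standard_normal(particles.shape)
        # Update: Poisson likelihood of the observed counts under each particle
        rates = np.clip(np.array([lam(p) for p in particles]), 1e-9, None)
        loglik = (counts * np.log(rates) - rates).sum(axis=1)
        w = w * np.exp(loglik - loglik.max())
        w /= w.sum()
        est[t] = w @ particles                      # posterior mean (expectation estimate)
        # Resample when the effective sample size collapses
        if 1.0 / (w ** 2).sum() < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=w)
            particles, w = particles[idx], np.full(n_particles, 1.0 / n_particles)
    return est

# Example tuning function (exponential, as in the LNP model), with hypothetical B, b0:
# lam = lambda x: np.exp(B @ x + b0)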
Posterior density at a time index
[Figure: posterior density of velocity at time index 45.092 s, with the desired velocity
and the estimates from sequential estimation (collapse and MLE) and adaptive filtering.]
Step 3: Causality concerns

[Figure: spike train and velocity traces illustrating the neural-to-kinematic delay.]

The lag is chosen by maximizing the mutual information between the spike and the
lagged, linearly filtered kinematics KX:

I(lag) = \sum_{KX} p(KX(lag)) \sum_{spike \in \{0,1\}} p(spike | KX(lag))
         \log_2 [ p(spike | KX(lag)) / p(spike) ]
For 185 neurons, average delay is 220.108 ms
[Figure 3-14: Mutual information I(spk, KX) as a function of time delay (0-500 ms)
for 5 neurons (80, 72, 99, 108, 77).]
Step 3: Information Estimated Delays
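A sketch of a plug-in estimator for this criterion, assuming spikes holds the binned counts of one neuron and drive holds its linearly filtered kinematics K.X (variable names and the quantile discretization are illustrative):

import numpy as np

def delay_information(spikes, drive, lags, n_bins=8):
    # I(spike; K.X(lag)) = sum_x p(x) sum_s p(s|x) log2(p(s|x)/p(s)), s in {0, 1}
    edges = np.quantile(drive, np.linspace(0, 1, n_bins + 1)[1:-1])
    info = []
    for lag in lags:
        s = (spikes[lag:] > 0).astype(int)          # spike / no-spike at time t + lag
        x = np.digitize(drive[:len(drive) - lag], edges)
        I, ps = 0.0, s.mean()
        for xb in np.unique(x):
            sel = s[x == xb]
            px = sel.size / s.size
            for sval, p_s in ((1, ps), (0, 1 - ps)):
                p_cond = (sel == sval).mean()
                if p_cond > 0 and p_s > 0:
                    I += px * p_cond * np.log2(p_cond / p_s)
        info.append(I)
    return np.array(info)

# Pick the lag that maximizes the information for this neuron:
# best_lag = list(lags)[int(np.argmax(delay_information(spikes, drive, lags)))]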
Step 4:
Monte Carlo sequential kinematics estimation

) (
i
i
t
t
X k f =
Kinematic
State
Neural Tuning
function
spike trains
Prediction
i
t
i
t
t
i
t v X F X
1
1

+ =
Updating
) | (
) (
1
i
t
j
t
i
t
i
t
N p w w A

) ( j
t
N A
NonGaussian
P(state|observation)

=
~ A
N
i
i
t t
i
t
j
t t
x x k w N x p
1
: 0 : 0
) (
: 1 : 0
) ( ) | (

=
~
N
i
i
k k
i
k k
k k W N p
1
: 1
) ( ) | ( x x x

Reconstruct the kinematics from neuron spike
trains
[Figure: decoded position (Px, Py), velocity (Vx, Vy), and acceleration (Ax, Ay) traces
versus the desired kinematics for t = 650-800; each panel reports the correlation
coefficients of the expectation (cc_exp) and MLE (cc_MLE) estimates, ranging from
about 0.02 to 0.97.]
Table 3-2. Correlation coefficients between the desired kinematics and the
reconstructions

CC            Position          Velocity          Acceleration
              x        y        x        y        x        y
Expectation   0.8161   0.8730   0.7856   0.8133   0.5066   0.4851
MLE           0.7750   0.8512   0.7707   0.7901   0.4795   0.4775

Table 3-3. Correlation coefficients evaluated over sliding windows (mean +/- std)

CC            Position               Velocity               Acceleration
              x          y           x          y           x          y
Expectation   0.8401     0.8945      0.7944     0.8142      0.5256     0.4460
              +/-0.0738  +/-0.0477   +/-0.0578  +/-0.0658   +/-0.0658  +/-0.1495
MLE           0.7984     0.8721      0.7805     0.7918      0.4950     0.4471
              +/-0.0963  +/-0.0675   +/-0.0491  +/-0.0710   +/-0.0430  +/-0.1399
Results comparison
[Sanchez, 2004]


Conclusion
Our results and those from other laboratories show it is possible to
extract the intent of movement for trajectories from multielectrode array
data.
The current results are very promising, but the setups have limited
difficulty, and the performance seems to have reached a ceiling at an
uncomfortable CC < 0.9.
Recently, spike-based methods have been developed in the hope of
improving performance, but these models face many difficulties.
Experimental paradigms to move the field beyond the present level need
to address issues of:
Training (no desired response in paraplegics)
How to cope with coarse sampling of the neural population
How to include more neurophysiology knowledge in the design
