
"Strictly positive definite spike train kernels for point process divergence" by Park, Seth, Rao and Principe

Exploratory tools that are sensitive to arbitrary statistical variations in spike train observations open up the possibility of novel neuroscientific discoveries. Developing such tools, however, is difficult because the space of spike trains lacks Euclidean structure, and an experimenter usually prefers simpler tools that only capture limited statistical features of the spike train, such as mean spike count or mean firing rate. We explore strictly positive definite kernels on the space of spike trains to offer both a structural representation of this space and a platform for developing statistical measures that explore features beyond count or rate. We apply these kernels to construct measures of divergence between two point processes and use them for hypothesis testing, that is, to test whether two sets of spike trains originate from the same underlying probability law. Although positive definite spike train kernels exist in the literature, we establish that these kernels are not strictly positive definite and, thus, do not induce measures of divergence. We discuss the properties of both these existing non-strict kernels and the novel strict kernels in terms of their computational complexity, choice of free parameters, and performance on both synthetic and real data through kernel principal component analysis and hypothesis testing.
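To make the kernel-based divergence idea concrete, below is a minimal sketch of one common construction: a Gaussian (Schoenberg-type) kernel built on the distance induced by the memoryless cross-intensity (mCI) inner product, plugged into a squared maximum mean discrepancy (MMD) estimate between two sets of spike trains. The kernel choice, the `tau` and `sigma` parameters, and the MMD statistic are illustrative assumptions for exposition; they are not necessarily the exact kernels or divergence estimators proposed in the paper.

```python
import numpy as np

def mci(x, y, tau=0.05):
    """Memoryless cross-intensity inner product between two spike trains
    (lists of spike times), in closed form: sum over spike pairs of
    exp(-|t_i - s_j| / tau)."""
    if len(x) == 0 or len(y) == 0:
        return 0.0
    diffs = np.abs(np.asarray(x)[:, None] - np.asarray(y)[None, :])
    return float(np.exp(-diffs / tau).sum())

def schoenberg_kernel(x, y, tau=0.05, sigma=1.0):
    """Gaussian kernel on the distance induced by the mCI inner product.
    Exponentiating a (squared) kernel-induced distance in this way is a
    standard route to a strictly positive definite kernel."""
    d2 = mci(x, x, tau) + mci(y, y, tau) - 2.0 * mci(x, y, tau)
    return np.exp(-d2 / sigma ** 2)

def mmd2(set_a, set_b, tau=0.05, sigma=1.0):
    """Biased squared-MMD estimate between two sets of spike trains:
    mean within-set kernel values minus twice the cross-set mean."""
    k = lambda u, v: schoenberg_kernel(u, v, tau, sigma)
    kaa = np.mean([k(a, b) for a in set_a for b in set_a])
    kbb = np.mean([k(a, b) for a in set_b for b in set_b])
    kab = np.mean([k(a, b) for a in set_a for b in set_b])
    return kaa + kbb - 2.0 * kab
```

With a strictly positive definite kernel, the resulting MMD vanishes only when the two point processes share the same probability law, which is what makes it usable as a test statistic; in practice, its null distribution would be approximated by permuting spike trains between the two sets.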

Download [not available yet] | BibTeX [not available yet] | MATLAB code

Related publications

I. Park, S. Seth, M. Rao, J. C. Principe, Estimating symmetric chi-square divergence for point processes, in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2011.

S. Seth, A. J. Brockmeier, J. C. Principe, A metric based approach toward point process divergence, in IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2011.

S. Seth, I. Park, A. Brockmeier, M. Semework, J. Choi, J. Francis, J. C. Principe, A novel family of non-parametric cumulative based divergences for point processes, in Advances in Neural Information Processing Systems (NIPS), 2010. [description and code]