Assessing dependence between two sets of spike trains, or between a set of input stimuli and the corresponding generated spike trains, is crucial in many neuroscientific applications, such as analyzing functional connectivity among neural assemblies and studying neural coding. Dependence between two random variables is traditionally assessed in terms of mutual information. However, although well explored in the context of real- or vector-valued random variables, estimating mutual information remains a challenging problem when the random variables exist in more exotic spaces, such as the space of spike trains. In the statistical literature, on the other hand, dependence between two random variables has been characterized in many other ways, e.g., using copulas, or using measures of association such as Spearman's rho and Kendall's tau. Although these methods are usually applied on the real line, their simplicity, both in terms of understanding and estimation, makes them worth investigating in the context of spike train dependence. In this paper, we generalize the concept of association to arbitrary abstract metric spaces. This new approach is an attractive alternative to mutual information, since it can be easily estimated from realizations without binning or clustering. It also provides an intuitive understanding of what dependence implies in the context of realizations. We show that this new methodology effectively captures dependence between sets of stimuli and spike trains. Moreover, the estimator has desirable small-sample characteristics, and it often outperforms an existing similar metric-based approach.
example.m    Example demonstrating the use of generalized association
genassorep.m Generalized association between two sets of observations
distspike.m  Pairwise distance matrix for a set of spike trains
skpd.m       Victor-Purpura spike train distance (by Daniel Reich and Jonathan Victor)
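To illustrate the general idea in a language-agnostic way, the following Python sketch measures association between two sets of observations using only their pairwise distances: distances are computed in each metric space separately, and Kendall's tau-b is evaluated between the two distance profiles. This is a hypothetical instantiation of distance-based association for illustration only; it is not a transcription of the algorithm in genassorep.m, and the function names (`generalized_association`, `pairwise_dist`, `kendall_tau_b`) are mine.

```python
import numpy as np

def pairwise_dist(points, metric):
    """Upper-triangular pairwise distances as a flat vector."""
    n = len(points)
    return np.array([metric(points[i], points[j])
                     for i in range(n) for j in range(i + 1, n)])

def kendall_tau_b(x, y):
    """Kendall's tau-b (tie-corrected), brute-force O(m^2) over pairs."""
    m = len(x)
    conc = disc = ties_x = ties_y = 0
    for i in range(m):
        for j in range(i + 1, m):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0:
                ties_x += 1
            if dy == 0:
                ties_y += 1
            if dx != 0 and dy != 0:
                if dx * dy > 0:
                    conc += 1
                else:
                    disc += 1
    n0 = m * (m - 1) // 2
    return (conc - disc) / np.sqrt((n0 - ties_x) * (n0 - ties_y))

def generalized_association(a, b, metric_a, metric_b):
    """Rank correlation between the pairwise-distance profiles of a and b.

    Values near 1 indicate that points close in one space tend to be
    close in the other; values near 0 indicate no such relationship.
    """
    return kendall_tau_b(pairwise_dist(a, metric_a),
                         pairwise_dist(b, metric_b))
```

For spike trains, the plug-in metric would be a spike train distance such as Victor-Purpura (cf. skpd.m) instead of the absolute difference used below:

```python
a = [0.0, 1.0, 2.0, 3.0, 4.0]
metric = lambda u, v: abs(u - v)
generalized_association(a, [2 * v for v in a], metric, metric)  # → 1.0
```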