This paper proposes a unified framework for several available measures of independence by generalizing the concept of information theoretic learning (ITL). The key component of ITL is the use of the inner product between two density functions as a measure of similarity between two random variables. We show that by generalizing this inner product using a symmetric strictly positive definite kernel, and by choosing appropriate kernels, it is possible to reproduce a number of popular measures of independence. The unified framework also allows the design of new strictly positive definite kernels and corresponding measures of independence. Following this framework, we explore a new measure of independence and apply it in the context of linear independent component analysis (ICA). An attractive property of the proposed method is that it involves no free parameters, and we demonstrate that it performs on par with existing ICA methods.
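As a brief mathematical sketch of the generalization described above (the notation here is assumed for illustration and is not taken verbatim from the paper): in ITL, the similarity between two densities $p$ and $q$ is measured by their $L_2$ inner product, which a symmetric strictly positive definite kernel $\kappa$ then generalizes, and independence of $X$ and $Y$ can be quantified by the resulting distance between the joint density and the product of the marginals.

```latex
% Plain L2 inner product between two densities (the ITL cross-information potential):
V(p, q) = \int p(u)\, q(u)\, \mathrm{d}u .

% Generalized inner product induced by a symmetric strictly
% positive definite kernel \kappa:
V_{\kappa}(p, q) = \iint p(u)\, \kappa(u, v)\, q(v)\, \mathrm{d}u\, \mathrm{d}v .

% A measure of independence: the induced squared distance between the
% joint density p_{XY} and the product of marginals p_X p_Y, which
% vanishes iff X and Y are independent (for strictly p.d. \kappa):
D_{\kappa}(X, Y) = V_{\kappa}\!\left(p_{XY} - p_X p_Y,\; p_{XY} - p_X p_Y\right) .
```

Different choices of $\kappa$ then recover different existing independence measures, which is the sense in which the framework is unifying.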