A quick and easy way to estimate entropy and mutual information for neuroscience
Mickael Zbili, Sylvain Rama
2021
Calculations of the entropy of a signal or of the mutual information between two variables are valuable analytical tools in the field of neuroscience. They can be applied to all types of data, capture nonlinear interactions and are model independent. Yet the limited size and number of recordings one can collect in a series of experiments make their calculation highly prone to sampling bias. Mathematical methods to overcome this so-called “sampling disaster” exist, but they require significant expertise and come at great time and computational costs. There is therefore a need for a simple, unbiased and computationally efficient tool for estimating levels of entropy and mutual information. In this paper, we propose that applying the entropy-encoding compression algorithms widely used in text and image compression fulfills these requirements. By simply saving the signal as a PNG image and measuring the size of the file on the hard drive, we can estimate entropy changes across different conditions. Furthermore, with some simple modifications of the PNG file, we can also estimate the evolution of the mutual information between a stimulus and the observed responses across different conditions. We first demonstrate the applicability of this method using white-noise-like signals. Then, although the method can be used in all kinds of experimental conditions, we provide examples of its application to patch-clamp recordings, place-cell detection and histological data. While this method does not give an absolute value of entropy or mutual information, it is mathematically correct, and its simplicity and broad availability make it a powerful tool for estimating these quantities across experiments.
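As a rough illustration of the approach (a minimal sketch under stated assumptions, not the authors' exact pipeline; see the linked repository for that), the snippet below quantizes a 1D signal to 8-bit grayscale, writes it as a PNG with Pillow, and uses the compressed file size in bytes as a relative entropy estimate. The function name png_size_bytes, the image width and the min-max quantization are illustrative choices, not taken from the paper.

import io

import numpy as np
from PIL import Image

def png_size_bytes(signal, width=256):
    """Quantize a 1D signal to 8 bits, reshape it into a 2D image and
    return the size in bytes of its PNG (DEFLATE) encoding."""
    s = np.asarray(signal, dtype=float)
    # Min-max scaling to the 0-255 range expected by an 8-bit PNG.
    s = (s - s.min()) / (s.max() - s.min() + 1e-12)
    img = (s * 255).astype(np.uint8)
    # Zero-pad so the samples fill a rectangular image.
    pad = (-img.size) % width
    img = np.pad(img, (0, pad)).reshape(-1, width)
    buf = io.BytesIO()
    Image.fromarray(img, mode="L").save(buf, format="PNG", optimize=True)
    return buf.getbuffer().nbytes

# Sanity check: white noise should compress poorly (large file, high
# entropy estimate) while a smooth sine of equal length compresses well.
rng = np.random.default_rng(0)
print(png_size_bytes(rng.normal(size=65536)))                       # larger
print(png_size_bytes(np.sin(np.linspace(0, 100 * np.pi, 65536))))   # smaller

Comparing such file sizes across conditions then tracks relative entropy changes; a mutual-information proxy could, for instance, be built from the difference between the compressed sizes of stimulus-aligned and shuffled response matrices, in the spirit of the abstract above, though the exact PNG modifications are those described in the paper and repository.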
https://github.com/Sylvain-Deposit/PNG-Entropy
Entropy, Mutual Information, Electrophysiology, Histology, PNG, DEFLATE, Place Fields
Electrophysiology
2020-08-06 13:36:25
Haudur Freyja Ólafsdóttir
Federico Stella, Anonymous