The Tsallis entropy of natural information
Estimating the information contained in natural data, such as electroencephalography data, is unusually difficult because the relationship between the physical data and the information it encodes is unknown. This unknown relationship is often called the encoding problem. The present work provides a solution to this problem by deriving a method to estimate the Tsallis entropy of natural data. The method is based on two findings. The first finding is that the physical instantiation of any information event, that is, the physical occurrence of a symbol of information, must begin and end at a discontinuity or critical point (maximum, minimum, or saddle point) in the data. The second finding is that, in certain data types such as the electroencephalogram (EEG), the variance of an EEG waveform event is directly proportional to its probability of occurrence.
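The two findings in the abstract suggest a simple computational pipeline: segment the waveform at critical points, treat each segment's variance as proportional to its probability, and evaluate the Tsallis entropy of the resulting distribution. The sketch below is a minimal, hypothetical illustration of that pipeline, not the paper's actual estimator; the segmentation rule, the variance-based probability estimate, and all function names are assumptions, while the entropy formula S_q = (1 - Σ p_i^q)/(q - 1) is the standard Tsallis definition.

```python
import math

def tsallis_entropy(probs, q=2.0):
    """Standard Tsallis entropy S_q = (1 - sum(p_i^q)) / (q - 1).

    As q -> 1 this reduces to the Shannon entropy, handled separately."""
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def segment_at_critical_points(signal):
    """Hypothetical segmentation: split a sampled waveform wherever the
    discrete slope changes sign (a local maximum or minimum), echoing
    the paper's first finding that events begin and end at critical
    points. Single-sample pieces carry no variance and are dropped."""
    cuts = [0]
    for i in range(1, len(signal) - 1):
        if (signal[i] - signal[i - 1]) * (signal[i + 1] - signal[i]) < 0:
            cuts.append(i)
    cuts.append(len(signal))
    return [signal[a:b] for a, b in zip(cuts, cuts[1:]) if b - a > 1]

def event_probabilities(segments):
    """Hypothetical estimator based on the paper's second finding:
    each segment's variance is taken as proportional to its
    probability of occurrence, then normalized to sum to one."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    v = [var(s) for s in segments]
    total = sum(v)
    return [x / total for x in v]
```

A usage sketch: `tsallis_entropy(event_probabilities(segment_at_critical_points(eeg_samples)), q=2.0)`, where `eeg_samples` is a list of voltage readings. Any real application would need to address sampling noise and the choice of q, which this toy pipeline ignores.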
Year of publication: 2007
Authors: Sneddon, Robert
Published in: Physica A: Statistical Mechanics and its Applications. - Elsevier, ISSN 0378-4371. - Vol. 386.2007, 1, p. 101-118
Publisher: Elsevier
Subject: Information | Tsallis | Entropy | EEG | Electroencephalography | Alzheimer's | ADRD | Encoding | Memory | Non-extensive
Saved in: Online Resource
Similar items by subject
- Uva, Tomás, (2015)
- Pagan, Natália Munari, (2024)
- Examining consumer psychology and marketing professionals' perceptions towards neuromarketing / Selvalakshmi, V., (2023)
- More ...
Similar items by person
- Empirical Comparisons of Bilinear and Nonbilinear Utility Theories / Sneddon, Robert, (2001)
- More ...