What is the definition of information from Shannon?
Information is the reduction of uncertainty
Shannon’s axioms of uncertainty measure
For equiprobable states, uncertainty grows with the number of states
Uncertainty should be continuous in probability of states
“Branching” property (let’s discuss some other time)
Shannon’s Entropy
Entropy of a system is a measure of uncertainty
Entropy is maximal when all outcomes are equally likely
Measures how much knowing one variable (e.g., stimulus S) reduces uncertainty about another (e.g., response R).
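A minimal sketch of these two properties in code (the distributions are made-up examples): entropy is computed as H(p) = -Σ p log₂ p, and the uniform distribution gives the maximal value log₂(number of states).

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) = -sum p log2 p, in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # treat 0 * log(0) as 0
    return -np.sum(p * np.log2(p))

# Uniform distribution over 4 states: maximal uncertainty, log2(4) = 2 bits
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A skewed distribution over the same 4 states has lower entropy
print(entropy([0.7, 0.1, 0.1, 0.1]))
```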
From uncertainty to information
Stimulus (S) -> Channel (e.g. a neuron) -> Response (R)
Mutual information is the overlap between the stimulus entropy and the response entropy
What is mutual information
I(S;R) = 0 → no info about S in R;
I(S;R) = H(S) → full info.
🔑 Mutual information generalizes correlation to nonlinear relationships.
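The two limiting cases above can be checked numerically from a joint probability table, using the identity I(S;R) = H(S) + H(R) - H(S,R). The 2x2 tables below are toy examples: a noiseless channel and an independent one.

```python
import numpy as np

def mutual_information(p_sr):
    """I(S;R) = H(S) + H(R) - H(S,R) from a joint probability table p(s, r), in bits."""
    p_sr = np.asarray(p_sr, dtype=float)
    p_s = p_sr.sum(axis=1)  # marginal over responses
    p_r = p_sr.sum(axis=0)  # marginal over stimuli
    def H(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return H(p_s) + H(p_r) - H(p_sr.ravel())

# Noiseless channel: R fully determines S -> I(S;R) = H(S) = 1 bit
print(mutual_information(np.array([[0.5, 0.0], [0.0, 0.5]])))
# Independent S and R -> I(S;R) = 0
print(mutual_information(np.array([[0.25, 0.25], [0.25, 0.25]])))
```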
Efficient Coding Hypothesis
Information maximisation as a normative objective for neural systems
Applications in Neuroscience of Spike Coding & Entropy Estimation
Neural responses are treated as words (binary or multi-letter depending on spike binning).
Finer time resolution (smaller bins) increases response vocabulary and entropy.
Estimation techniques allow quantifying the information content in spike trains.
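The binning-and-words idea can be sketched as follows. The spike train, the word length, and the bin sizes are all arbitrary toy choices; the entropy estimate is the naive plug-in estimator, which real analyses must correct for sampling bias.

```python
import numpy as np

rng = np.random.default_rng(0)
spike_times = np.sort(rng.uniform(0, 1.0, size=200))  # toy spike train over 1 s

def word_entropy(spike_times, bin_ms, word_len=8, duration=1.0):
    """Bin spikes into a binary sequence, cut it into words, estimate word entropy (bits)."""
    n_bins = int(duration / (bin_ms / 1000.0))
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0, duration))
    binary = (counts > 0).astype(int)          # 1 = at least one spike in the bin
    n_words = len(binary) // word_len
    words = binary[:n_words * word_len].reshape(n_words, word_len)
    _, freq = np.unique(words, axis=0, return_counts=True)
    p = freq / freq.sum()
    return -np.sum(p * np.log2(p))             # naive plug-in entropy estimate

# Finer bins enlarge the response vocabulary, raising the entropy ceiling
for bin_ms in (10, 5, 2):
    print(bin_ms, "ms bins:", word_entropy(spike_times, bin_ms))
```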
🔸 Experiment Example:
Recordings from fly motion-sensitive H1 neuron.
Precise spike timing reveals sub-millisecond structure.
Natural stimuli evoke more structured, information-rich responses.
Why does a moth fly against the wind toward a smell and then suddenly cast perpendicular when the wind stops?
To increase the probability of re-encountering the odor
What is the Efficient Coding Hypothesis?
Proposed by Barlow (1961): Sensory systems evolve to maximize information transmission.
Laughlin (1981): Fly photoreceptors match response distribution to input statistics (histogram equalization).
Fairhall et al. (2001): Sensory systems can dynamically adapt to changes in stimulus distribution.
What’s the goal of population coding & image statistics?
Population codes reflect information-maximizing structure.
Neuronal receptive fields (e.g. in V1) are shaped to efficiently represent natural image statistics.
Sparse coding (Olshausen & Field, 1996) and Independent Component Analysis (ICA) model how V1 cells decorrelate and represent features.
Information in Behavior: What is Infotaxis?
Infotaxis (Vergassola et al., 2007): A search strategy that maximizes expected information gain, not just odor concentration.
Used to model animal foraging behavior (e.g., C. elegans, Calhoun et al. 2014).
Explains exploration vs. exploitation using entropy reduction.
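A toy 1D sketch of the infotaxis idea, not the full Vergassola et al. algorithm: a searcher keeps a Bayesian belief over the source location and greedily picks the move whose expected posterior entropy is lowest. The grid size, detection model (hit probability falling off as 1/(1 + distance)), and true source position are all invented for illustration.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n = 20                       # 1D grid of candidate source locations
source = 15                  # true source (unknown to the searcher)
p_hit = lambda agent: 1.0 / (1.0 + np.abs(np.arange(n) - agent))  # toy detection model

rng = np.random.default_rng(2)
belief = np.full(n, 1.0 / n)  # uniform prior over source location
agent = 0

for step in range(30):
    # Choose among stay/left/right the move with lowest expected posterior entropy
    best_move, best_h = agent, np.inf
    for move in (max(agent - 1, 0), agent, min(agent + 1, n - 1)):
        ph = p_hit(move)
        p_detect = np.sum(belief * ph)
        post_hit = belief * ph / max(p_detect, 1e-12)
        post_miss = belief * (1 - ph) / max(1 - p_detect, 1e-12)
        h = p_detect * entropy(post_hit) + (1 - p_detect) * entropy(post_miss)
        if h < best_h:
            best_h, best_move = h, move
    agent = best_move
    # Sample an observation from the true source and update the belief (Bayes' rule)
    hit = rng.random() < p_hit(agent)[source]
    likelihood = p_hit(agent) if hit else 1 - p_hit(agent)
    belief = belief * likelihood
    belief /= belief.sum()

print("best guess for source:", int(np.argmax(belief)))
```

The point is the objective: each move is scored by expected entropy reduction of the belief, not by expected signal strength, which is what separates infotaxis from simple gradient climbing.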
Please, explain the key principles of the concept of population coding and describe which parameters determine the coding precision of a population code. Which statistical measure can be used to quantify the coding precision of a population code?
Instead of relying on single neurons, population coding refers to the idea that a group (population) of neurons collectively encodes information about a stimulus (e.g., direction of motion, sound frequency, etc.).
Each neuron responds to a range of stimuli but with a preferred value (e.g., a certain angle or frequency).
Statistical Measure of Coding Precision: Fisher information. It lower-bounds the variance of any unbiased stimulus estimator via the Cramér–Rao bound, Var(ŝ) ≥ 1/J(s). Coding precision depends on the number of neurons, the tuning-curve width and peak firing rate, and the neurons' noise (variability and correlations).
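A small worked example of Fisher information for a population code, assuming Gaussian tuning curves and independent Poisson spiking, for which J(s) = Σᵢ fᵢ'(s)² / fᵢ(s). The number of neurons, tuning width, and peak rate below are illustrative choices.

```python
import numpy as np

prefs = np.linspace(-40, 40, 17)   # preferred stimuli (deg), assumed layout
sigma = 10.0                       # tuning-curve width (deg)
r_max = 20.0                       # peak rate (spikes per trial)

def rates(s):
    """Gaussian tuning curves: mean spike count of each neuron for stimulus s."""
    return r_max * np.exp(-(s - prefs) ** 2 / (2 * sigma ** 2))

def fisher_info(s, ds=1e-4):
    """J(s) = sum_i f_i'(s)^2 / f_i(s) for independent Poisson neurons."""
    f = rates(s)
    df = (rates(s + ds) - rates(s - ds)) / (2 * ds)  # numerical derivative
    return np.sum(df ** 2 / f)

J = fisher_info(0.0)
print("J =", J, "-> Cramer-Rao bound on std:", 1 / np.sqrt(J), "deg")
```

Narrower tuning curves or higher peak rates increase J and thus tighten the bound, which is how these parameters set the precision of the population code.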