University of Warsaw - Central Authentication System

Information theory for cognitive sciences

General data

Course ID: 2500-EN-COG-OB2Z-C-1
Erasmus code / ISCED: 14.4 / (0313) Psychology
Course title: Information theory for cognitive sciences
Name in Polish: Information theory for cognitive sciences
Organizational unit: Faculty of Psychology
Course groups:
ECTS credit allocation (and other scores): 3.00
Language: English
Type of course:

obligatory courses

Mode:

Classroom

Short description:

The course introduces the basic concepts and measures of information theory. It provides an overview of its practical applications within the cognitive sciences and related fields such as biology, linguistics and the social sciences. Limitations of information theory as applied to the cognitive sciences will also be addressed, drawing on current discussions of the semantic aspects of information and the link between information-theoretic and thermodynamic entropy.

Full description:

The course introduces the basic concepts and measures of information theory. It provides an overview of its practical applications within the cognitive sciences and related fields such as biology, linguistics and the social sciences.

The first part of the class will be devoted to the most important concepts of the theory (entropy, information measures, coding and compression of information, complexity). Next, we will present applications of information-theoretic measures in physics, biology, neuropsychology and linguistics. Finally, the limitations of information theory as applied to the cognitive sciences will be addressed, drawing on current discussions of the semantic or pragmatic aspects of information and the link between information-theoretic and thermodynamic entropy.
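
As a point of reference for these concepts, a minimal Python sketch (not part of the course materials; the distributions are made up for illustration) of the central quantity, Shannon entropy of a discrete distribution:

    import math

    def entropy(probs):
        # Shannon entropy H(X) = -sum p(x) * log2 p(x), measured in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally unpredictable (1 bit); a biased coin carries less information.
    print(entropy([0.5, 0.5]))  # 1.0
    print(entropy([0.9, 0.1]))  # approx. 0.47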

Bibliography:

1. Introduction. Range of problems for information theory. Communication and Information.

a. Gleick, J., The Information: Prologue, Chapters 1, 6, 7; optional: Chapters 8, 9, 13.

b. Additional readings/tutorials on probability theory and logarithms.

2. Mathematical bases of information theory: entropy, conditional entropy, mutual information

Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, New York: Wiley. Chapter 2.
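
For readers previewing session 2, a minimal Python sketch (illustrative only, with a made-up joint distribution) of the three central quantities introduced there: entropy, conditional entropy and mutual information.

    import math

    def H(dist):
        # Entropy of a distribution given as {outcome: probability}
        return -sum(p * math.log2(p) for p in dist.values() if p > 0)

    # Illustrative joint distribution p(x, y) over two binary variables
    joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
    px = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}
    py = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)}

    Hx, Hy, Hxy = H(px), H(py), H(joint)
    print("H(X|Y) =", Hxy - Hy)       # conditional entropy, via the chain rule
    print("I(X;Y) =", Hx + Hy - Hxy)  # mutual information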

3. Mathematical bases of information theory: relative entropy, divergences & data processing inequality

a. Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, New York: Wiley; Chapters 1 & 2.

b. Kullback-Leibler divergence: formulation & applications (presenter, discussion leader)

c. Data Processing Inequality: principle & applications (presenter, discussion leader)

d. Sufficient statistics and Maximum Entropy Principle
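
As a preview of session 3, a minimal Python sketch (made-up distributions, illustration only) of relative entropy (Kullback-Leibler divergence), showing in passing that the measure is not symmetric.

    import math

    def kl(p, q):
        # Relative entropy D(p || q) = sum p(x) * log2(p(x) / q(x)), in bits
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.5, 0.5]
    q = [0.9, 0.1]
    print(kl(p, q))  # approx. 0.74
    print(kl(q, p))  # approx. 0.53 -- D(p||q) != D(q||p)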

4. Foundations of Information Theory

a. Weaver, W., Recent contributions to the mathematical theory of communication, & Shannon, C., The Mathematical Theory of Communication (selected fragments). In: Shannon, C. & Weaver, W., The Mathematical Theory of Communication. The University of Illinois Press: Urbana. (presenters, discussion leaders)

b. Brillouin, L. (1969). Nauka a Teoria Informacji [Science and Information Theory]. Chapter 1.

5. Applications: Neurobiology

a. Tononi, G., Edelman, G.M., & Sporns, O. (1998). Complexity and coherency: Integrating information in the brain. Trends in Cognitive Sciences, 2, 474–484 (fragments, up to p. 480, omitting “Reconciling information processing and information storage: matching complexity”)

(presenters, discussion leaders)

6. Applications: Language

a. Zipf, G. K. (1964). “The Psychobiology of Language: An Introduction to Dynamic Philology.” Chapter II, “The Form and Behavior of Words” (recommended)

b. Piantadosi, S.T. (2014). Zipf’s word frequency law in natural language: A critical review and future directions. Psychonomic Bulletin & Review. (presenter, discussion leader)

c. Coupé et al. (2019). Different languages, similar encoding efficiency: Comparable information rates across the human communicative niche. Science Advances, Vol. 5, no. 9, eaaw2594, DOI: 10.1126/sciadv.aaw2594 (presenter, discussion leader)
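
As a preview of session 6, the sketch below (not part of the readings) estimates a Zipf exponent by fitting a line to log-rank vs. log-frequency word counts. The embedded sample text is only a stand-in; a much longer corpus is needed before the estimate stabilizes near 1.

    import math
    from collections import Counter

    # Stand-in sample; substitute a longer corpus for a meaningful estimate.
    sample_text = (
        "the theory of information studies the transmission of messages "
        "and the amount of information carried by the messages"
    )
    freqs = sorted(Counter(sample_text.split()).values(), reverse=True)

    # Zipf's law predicts log(frequency) falls roughly linearly with log(rank),
    # with a slope near -1. Least-squares fit on the log-log points:
    xs = [math.log(rank) for rank in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    print("estimated Zipf exponent:", -slope)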

7. Applications: Networks & structure

a. Introduction to Networks

b. Klein, B., & Hoel, E. (2020). The Emergence of Informative Higher Scales in Complex Networks. Complexity, 2020, 1–12.

(presenter, discussion leader)

8. Algorithmic information theory

a. Cover, T. M., Thomas, J. A. (2006). “Elements of Information Theory”, Chapter 14 (fragments).

b. Intro to algorithmic information theory
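
As a rough preview of sessions 8 and 9, a minimal Python sketch using compressed length (here, zlib) as a crude, computable upper-bound proxy for Kolmogorov complexity, which is itself uncomputable. This is only one estimation strategy; the Soler-Toscano et al. reading in session 9 uses a different, coding-theorem-based method.

    import random
    import zlib

    def compressed_size(data: bytes) -> int:
        # Length of the zlib-compressed string: a crude, computable proxy
        # (upper bound) for the Kolmogorov complexity of the string.
        return len(zlib.compress(data, 9))

    regular = b"ab" * 500  # highly regular, compresses very well
    random.seed(0)
    noisy = bytes(random.getrandbits(8) for _ in range(1000))  # looks incompressible

    print(compressed_size(regular), compressed_size(noisy))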

9. Complexity Measures I

a. Soler-Toscano, F., Zenil, H., Delahaye, J.-P., & Gauvrit, N. (2014). Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines. PLoS ONE, 9(5), e96223. https://doi.org/10.1371/journal.pone.0096223

(presenter, discussion leader)

b. Gauvrit, N., Zenil, H., Soler-Toscano, F., Delahaye, J.-P., & Brugger, P. (2017). Human behavioral complexity peaks at age 25. PLOS Computational Biology, 13(4), e1005408. https://doi.org/10.1371/journal.pcbi.1005408

(1 presenter, 1 discussion leader)

10. Complexity measures II: Information, energy, cognition

a. Lloyd & Pagels (1988). Complexity as Thermodynamic Depth. Annals of Physics, 188, pp. 186-191 (recommended).

b. Klamut, Kutner & Struzik (2020). Towards a universal measure of complexity. Entropy (recommended).

c. Deacon, T. & Koutroufinis, S. (2014). Complexity and Dynamical Depth. Information, 5, 404-423. (presenters, discussion leader)

11. Can Shannon information be a basis for semantic information?

a. Hasselman, F. (2022). Radical embodied computation: Emergence of meaning through the reproduction of similarity by analogy (...) (recommended)

b. Isaac, A. (2019) The Semantics Latent in Shannon Information.

(presenters, discussion leader)

12 – 14. Project presentations @ mini conference

Learning outcomes:

After completing the course, students will be able to:

Describe the history of the notion of the quantity of information and efforts to formalize it in relation to research on and modeling of cognitive systems. (K_W01, K_W02)

Define the most important concepts of the theory (information quantity, entropy, mutual and joint information) and use mathematical formulas to compute them. (K_W01, K_W08)

Indicate the main areas of application for the measures of information within cognitive sciences and related disciplines. (K_W02, K_U01, K_U03, K_K02)

Use terminology pertaining to information theory and its application. (K_W08)

Discuss information-theoretic problems with specialists from other fields. (K_U07, K_K07)

Formulate questions within cognitive science that can be answered using information-theoretic measures. (K_U02; K_K06)

Assessment methods and assessment criteria:

40% Project and its presentation

30% Short paper presentation and guiding the discussion

20% Homework(s)

10% Class presence and active participation

Attendance at the seminar is obligatory; two unexcused absences are allowed.

Students must respect the principles of academic integrity. Cheating and plagiarism (including copying work from other students, the internet or other sources) are serious, punishable violations, and instructors are required to report all cases to the administration.

Prerequisites: CMP I, a philosophy of mind course, basic skills in mathematics and probability.

Classes in period "Winter semester 2023/24" (past)

Time span: 2023-10-01 - 2024-01-28
Type of class:
Seminar, 30 hours
Coordinators: (unknown)
Group instructors: Joanna Rączaszek-Leonardi, Szymon Talaga
Examination: Course - Grading
Seminar - Grading

Classes in period "Winter semester 2024/25" (future)

Time span: 2024-10-01 - 2025-01-26
Type of class:
Seminar, 30 hours
Coordinators: (unknown)
Group instructors: Szymon Talaga, Julian Zubek
Examination: Course - Grading
Seminar - Grading