Telecom Paris
Dep. Informatique & Réseaux

J-L. Dessalles

March 2023


IA225/325/703: Algorithmic Information and A.I.

Lecturer:     Jean-Louis Dessalles



Can we characterize intelligent behavior?
Are there theoretical foundations on which Artificial Intelligence can be grounded?

Artificial Intelligence is more than just a collection of brilliant, innovative methods to solve problems.
If you are interested in machine learning or are planning to explore it, the course will make you see artificial learning in an entirely new way. You will know how to formulate optimal hypotheses for a learning task. And you will be able to analyze learning techniques such as clustering or neural networks as just ways of compressing information.
If you are interested in reasoning, you will understand that reasoning by analogy, reasoning by induction, explaining, proving, etc. are all alike; they all amount to providing more compact descriptions of situations.
If you are interested in mathematics, you will be amazed at the fact that crucial notions such as probability and randomness can be redefined in terms of algorithmic information. You will also understand that there are theoretical limits to what artificial intelligence can do.
If you are interested in human intelligence, you will find some intriguing results in this course. Thanks to algorithmic information, notions such as unexpectedness, interest and, to a certain extent, aesthetics, can be formally defined and computed, and this may change your views on what artificial intelligence can achieve in the future.
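The idea that complexity can be "measured by code length" (Chapter 1 below) can be previewed with a real compressor. This is only an illustrative sketch: the compressed size produced by `zlib` is a computable upper bound on Kolmogorov complexity, not the complexity itself, and the variable names are ours.

```python
import random
import zlib

def approx_complexity(s: str) -> int:
    """Compressed length in bytes: an upper bound (up to an additive
    constant) on the Kolmogorov complexity of s."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A highly regular string has a short description ("repeat 'ab' 500 times")...
regular = "ab" * 500
# ...while a random string of the same length over the same alphabet does not.
random.seed(0)
noisy = "".join(random.choice("ab") for _ in range(1000))

print(approx_complexity(regular))  # much smaller than...
print(approx_complexity(noisy))    # ...the random string's value
```

The gap between the two values is the intuition behind seeing learning and reasoning as compression: structure is whatever lets a description be shortened.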


With this course...


  1. This course does NOT address the notion of "computational complexity", which measures the resources (such as running time) required by algorithms.



Chapter 1.         Introduction to
    Algorithmic Information Theory (AIT)
Complexity measured by code length.
Complexity of integers.
Conditional Complexity.
Chapter 2.         AIT and data:
    Measuring Information through compression
Language recognition through compression.
Huffman codes - Complexity and frequency.
Zipf’s law.
"Google" distance - Meaning distance.
Chapter 3.         Algorithmic information applied to mathematics
Incomputability of C.
Algorithmic probability - Algorithmic Information.
Gödel’s theorem revisited.
Chapter 4.         Machine Learning and Algorithmic Information
Induction - Minimum Description Length (MDL).
Analogy as complexity minimization.
Machine Learning and compression.
Chapter 5.         AIT and subjective information
Cognitive complexity.
Simplicity and coincidences.
Subjective probability & subjective information.
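The compression-based distances of Chapter 2 ("Google" distance, meaning distance, language recognition) can be previewed with the Normalized Compression Distance, which replaces incomputable Kolmogorov complexity with a real compressor. A toy sketch, with made-up sample strings:

```python
import zlib

def clen(b: bytes) -> int:
    """Compressed length, used as a computable stand-in for complexity."""
    return len(zlib.compress(b, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance: near 0 for similar objects,
    near 1 for unrelated ones. Compressing x and y together saves
    space exactly insofar as they share information."""
    cx, cy, cxy = clen(x), clen(y), clen(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

english_a = b"the quick brown fox jumps over the lazy dog " * 20
english_b = b"the lazy dog watches the quick brown fox jump " * 20
french    = b"le renard brun saute par dessus le chien paresseux " * 20

print(ncd(english_a, english_b))  # smaller: shared vocabulary
print(ncd(english_a, french))     # larger: little shared material
```

The same trick, with hit counts from a search engine in place of a compressor, yields the "Google" distance mentioned in the outline.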


  1. Answers to questions during the lab sessions are recorded and evaluated.
  2. You will have to answer a short quiz on the last day.
  3. You will make a small original contribution (typically, as a continuation of a lab-work question). This micro-study should emphasize the link with Kolmogorov complexity and Algorithmic Information: you are expected to choose a topic related to K-complexity and to produce something concrete for it (typically, a small program). You will write a short report. All reports will be bundled into proceedings made available to all registered students.
  4. You will have the opportunity to give a 4-minute oral presentation of your work.