Telecom Paris
Computer Science & Networks Department (Informatique & Réseaux)

J-L. Dessalles

November 2021

IA325/IA703: Algorithmic Information and Artificial Intelligence

Lecturers:     Jean-Louis Dessalles,
               Pierre-Alexandre Murena,
               and Etienne Houzé



Algorithmic Information Theory (AIT) is based on the mathematical notion of complexity, which was invented some 50 years ago to solve issues related to machine learning, randomness and proof theory. It derives from a fundamental intuition: complex objects cannot be described by short algorithms. Complexity corresponds to the size of algorithms, not to their speed (see the caveat below).
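As a concrete, if rough, illustration of "complexity as the size of a description": the true (Kolmogorov) complexity is not computable, but the output length of a general-purpose compressor gives a computable upper bound. The sketch below (all names are illustrative, not from the course material) compares a string with a short description to one with no obvious pattern.

```python
import random
import zlib

def compression_length(s: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude,
    computable stand-in for description complexity."""
    return len(zlib.compress(s.encode("utf-8"), 9))

# A string with a very short description: "repeat 'ab' 500 times"...
repetitive = "ab" * 500
# ...versus a string of the same length with no obvious regularity.
random.seed(0)
haphazard = "".join(random.choice("abcdefgh") for _ in range(1000))

# The repetitive string is expected to compress far better.
print(compression_length(repetitive), compression_length(haphazard))
```

Both strings have 1000 characters, yet the compressor needs many more bytes for the patternless one; in AIT terms, it has higher description complexity.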

Creating artificial intelligence is one of the greatest challenges in the history of humankind. Programs are said to be "intelligent" because they solve difficult problems, such as playing the game of Go. Unfortunately, artificial intelligence is often perceived as no more than that: a collection of brilliant, innovative methods for solving problems. Most people don’t imagine that intelligent behaviour can be universally described in terms of algorithmic information.

There is currently growing interest in complexity and AIT for their role in the theoretical foundations of Artificial Intelligence. Moreover, practical approaches to complexity, based on compression techniques or minimum description length, offer efficient tools in machine learning. AIT also plays an important role in mathematics, for instance by setting limits to what a formal theory or an intelligent system can do. More recently, AIT has been shown to be essential for addressing aspects of human intelligence, such as perception, relevance, decision making and emotional intensity.
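One well-known way compression is used as a practical surrogate for complexity is the normalized compression distance (NCD), which approximates the information distance between two objects by replacing Kolmogorov complexity with a real compressor. The following is a minimal sketch (the helper names are mine, and zlib is just one possible compressor):

```python
import zlib

def clen(b: bytes) -> int:
    """Compressed length in bytes: a computable proxy for complexity."""
    return len(zlib.compress(b, 9))

def ncd(x: str, y: str) -> float:
    """Normalized compression distance between two strings.
    Small when one string contains most of the information in the other."""
    bx, by = x.encode("utf-8"), y.encode("utf-8")
    cx, cy, cxy = clen(bx), clen(by), clen(bx + by)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = "the cat sat on the mat and the cat slept on the mat"
b = "a cat sat on a mat while another cat slept on the mat"
c = "stock prices rallied sharply as interest rates fell today"

# Texts sharing vocabulary are expected to be "closer" than unrelated ones.
print(ncd(a, b), ncd(a, c))
```

On short strings the compressor's overhead makes the values noisy, but the idea scales to language recognition and "meaning distance", topics taken up in Chapter 2.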

  1. This course does not address the notion of "computational complexity", which measures the speed of algorithms.



Chapter 1.     Description complexity

Complexity measured by code length.
Complexity of integers.
Conditional Complexity.
Chapter 2.     Measuring Information through compression

Language recognition through compression.
Huffman codes - Complexity and frequency.
Zipf’s law.
"Google" distance - Meaning distance.
Chapter 3.     Algorithmic information applied to mathematics 

Incomputability of C.
Algorithmic probability - Algorithmic Information.
Gödel’s theorem revisited.
Chapter 4.     Machine Learning and Algorithmic Information

Induction - Minimum Description Length (MDL).
Analogy as complexity minimization.
Machine Learning and compression.
Chapter 5.     Subjective information and simplicity

        (Soon available)     
Cognitive complexity.
Simplicity and coincidences.
Subjective probability & subjective information.


Chapter 1 (coding & description complexity)
Chapter 2 (complexity, frequency & compression)
Chapter 3 (complexity & maths)
        Video of lecture 3
Chapter 4 (complexity & ML)


  1. Answers to questions during the lab sessions are recorded and evaluated.
  2. You will have to answer a short quiz on the last day.
  3. You will make a small original contribution (typically, as a continuation of a lab work question). You are expected to choose a topic related to Kolmogorov complexity and Algorithmic Information, and to do something concrete for this micro-study (typically, write a small program).
  4. You will prepare a 3 minute video to present your work.

Before the end of the course

Your small report will describe your project and what you found (typically 2 or 3 pages, more if there are images). All contributions that pass will be grouped together into a document made accessible to all.

Short bibliography

In French: