Telecom Paris
Dep. Informatique & Réseaux

J-L. Dessalles - Home page

March 2023


IA225/325/703: Algorithmic Information and A.I.

Lecturer:     Jean-Louis Dessalles







Objectives

Algorithmic Information Theory (AIT) is based on the mathematical notion of complexity, which was invented some fifty years ago to solve issues related to machine learning, randomness and proof theory. It derives from a fundamental intuition: complex objects cannot be described by short algorithms. Complexity corresponds to the size of algorithms (and not to their speed; see the caveat below).
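As a rough illustration (a sketch, not part of the course material): any general-purpose compressor gives a computable upper bound on description complexity. In the short Python example below, which simply relies on the standard zlib module, a highly regular string compresses to far fewer bytes than a patternless string of the same length; this is the intuition that complex objects admit no short description.

    import random
    import zlib

    def approx_complexity(s: str) -> int:
        # Compressed size in bytes: a crude, computable upper bound standing in
        # for the length of the shortest program that outputs s.
        return len(zlib.compress(s.encode("utf-8")))

    regular = "ab" * 500                      # highly regular: a short description exists
    random.seed(0)
    patternless = "".join(random.choice("ab") for _ in range(1000))

    print(approx_complexity(regular))         # small: the repetition is exploited
    print(approx_complexity(patternless))     # much larger: no obvious pattern to exploit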

Creating Artificial Intelligence is one of the greatest challenges in the history of humankind. Programs are said to be "intelligent" because they solve difficult problems, such as playing the game of Go. Unfortunately, Artificial Intelligence is often perceived as no more than that: a collection of brilliant, innovative methods for solving problems. Most people don't imagine that intelligent behaviour can be universally described in terms of algorithmic information.

There is currently growing interest in complexity and AIT for their role in the theoretical foundations of Artificial Intelligence. Moreover, practical approaches to complexity, based on compression techniques or minimum description length, offer efficient tools for machine learning. AIT also plays an important role in mathematics, for instance in setting limits on what a formal theory or an intelligent system can do. More recently, AIT has been shown to be essential for addressing aspects of human intelligence, such as perception, relevance, decision making and emotional intensity.
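As one illustration of such compression-based approaches (given here only as a hedged sketch; the actual lab material may differ), the Normalized Compression Distance of Cilibrasi and Vitányi replaces the incomputable Kolmogorov complexity C by the output length of a real compressor, here Python's zlib:

    import zlib

    def c(data: bytes) -> int:
        # Compressed size as a computable stand-in for Kolmogorov complexity.
        return len(zlib.compress(data))

    def ncd(x: str, y: str) -> float:
        # Normalized Compression Distance:
        #   NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
        bx, by = x.encode("utf-8"), y.encode("utf-8")
        cx, cy = c(bx), c(by)
        return (c(bx + by) - min(cx, cy)) / max(cx, cy)

    # Related texts typically come out closer (smaller NCD) than unrelated ones,
    # which is the principle behind compression-based similarity and clustering.
    print(ncd("the cat sat on the mat", "the cat lay on the mat"))
    print(ncd("the cat sat on the mat", "quarks bind through gluon exchange"))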

Caveat:

  1. This course does not address the notion of "computational complexity", which measures the speed of algorithms.






Content

Topics

Chapter 1.     Introduction to Algorithmic Information Theory (AIT)

    Complexity measured by code length.
    Complexity of integers.
    Conditional complexity.

Chapter 2.     AIT and data: Measuring information through compression

    Compressibility.
    Language recognition through compression.
    Huffman codes - Complexity and frequency.
    Zipf's law.
    "Google" distance - Meaning distance.

Chapter 3.     Algorithmic information applied to mathematics

    Incomputability of C.
    Algorithmic probability - Algorithmic information.
    Randomness.
    Gödel's theorem revisited.

Chapter 4.     Machine Learning and Algorithmic Information

    Induction - Minimum Description Length (MDL).
    Analogy as complexity minimization.
    Machine Learning and compression.

Chapter 5.     Subjective information and simplicity

    Watch the lecture recorded in 2023

    Cognitive complexity.
    Simplicity and coincidences.
    Subjective probability & subjective information.
    Relevance.

Slides

    Chapter 1 (coding & description complexity)
    Chapter 2 (complexity, frequency & compression)
    Chapter 3 (complexity & maths)
    Chapter 4 (complexity & ML)
    Chapter 5 (complexity, AI & cognition)

Read    ➜    Interesting paper in The New Yorker on ChatGPT and compression

To read    ➜    Recent chapter by J.-P. Delahaye on simplicity, abundance and evolution
                Recent chapter by J.-P. Delahaye discussing crypto in connection with Kolmogorov complexity






Validation

  1. Answers to questions during the lab sessions are recorded and evaluated.
  2. You will have to answer a short quiz on the last day.
  3. You will make a small original contribution (typically, as a continuation of a lab-work question). This micro-study should emphasize the link with Kolmogorov complexity and Algorithmic Information. You are expected to choose a topic of study and to do something concrete for this project (typically, write a small program); the topic must be related to K-complexity. You will then write a short report. All reports will be bundled into proceedings made available to all registered students.
  4. You will have the opportunity to give a 4-minute oral presentation of your work.

Your Project


Your project will typically build on some topic studied during the Lab Work sessions. You should pick a problem that you want to investigate further. A few suggestions are made. Initiative is welcome in any case. Note: you have to "do" something (typically write or extend a program).
  1. You should seek simple and clear-cut results from which we can learn something (even if your study is inconclusive, we want to know clearly why). Initiative and logical clarity will be appreciated.
  2. Try to be realistic about what you can do. Your study should not be trivial, and it should lead somewhere.
  3. Please favour Python when writing code.

Before March 16th: ➜    Please indicate here what you intend to do as a project.


    If you change your mind, redo the registration.
➜    You may consult the other students' projects. Try to play a minority game when making your choice!

Before the 23rd of March:

On March 23 and 30:

  1. You will be asked to talk for 4 minutes about your small study (from your seat). Your audience is not the teachers, but the other students.
  2. Be interesting.
  3. Be scientifically sound.
  4. On March 30, you will also be asked to answer a short quiz in English (~30 min.; no documents allowed).

Before April 2:

Report

  1. Write your report    ➜    Please use this template: MSWord or LibreOffice or LaTeX or Pdf
  2. Upload your program and any relevant material    ➜    Uploading page

All contributions that pass will be grouped together into a document made accessible to all.





