Dep. Informatique & Réseaux — J-L. Dessalles — March 2023
Artificial Intelligence is more than just a collection of brilliant, innovative methods to solve problems.
If you are interested in machine learning or are planning to explore it, the course will make you see artificial learning in an entirely new way. You will know how to formulate optimal hypotheses for a learning task. And you will be able to analyze learning techniques such as clustering or neural networks as just ways of compressing information.
If you are interested in reasoning, you will understand that reasoning by analogy, reasoning by induction, explaining, proving, etc. are all alike; they all amount to providing more compact descriptions of situations.
If you are interested in mathematics, you will be amazed that crucial notions such as probability and randomness can be redefined in terms of algorithmic information. You will also understand that there are theoretical limits to what artificial intelligence can do.
If you are interested in human intelligence, you will find some intriguing results in this course. Thanks to algorithmic information, notions such as unexpectedness, interest and, to a certain extent, aesthetics, can be formally defined and computed, and this may change your views on what artificial intelligence can achieve in the future.
|Chapter 1.||Algorithmic Information Theory (AIT)||
Complexity measured by code length.
Complexity of integers.
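To give a concrete taste of the first chapter: the plain binary code for an integer n takes about log2(n) bits, so describing most integers costs logarithmic space, while a few special ones (powers of two, repeated digits) admit much shorter descriptions. A minimal sketch of this counting argument in Python (the function name `code_length` is ours, not part of the course material):

```python
from math import floor, log2

def code_length(n: int) -> int:
    """Number of bits in the plain binary code for n (n >= 1)."""
    return floor(log2(n)) + 1

# Most integers are incompressible: naming n costs ~log2(n) bits.
# A few, such as 2**20, have descriptions far shorter than their
# binary expansion -- this gap is what 'complexity of integers' studies.
for n in (5, 1000, 2**20):
    print(n, code_length(n))
```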
|Chapter 2.||AIT and data||
Measuring information through compression.
Language recognition through compression.
Huffman codes - Complexity and frequency.
"Google" distance - Meaning distance.
|Chapter 3.||Algorithmic information applied to mathematics||
Incomputability of C.
Algorithmic probability - Algorithmic Information.
Gödel’s theorem revisited.
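Two of the central definitions behind this chapter can be stated compactly. In the standard notation (not necessarily the course's), with U a universal prefix machine and |p| the length of a program p, the algorithmic probability of a string x and its link to complexity (Levin's coding theorem) read:

```latex
m(x) \;=\; \sum_{p \,:\, U(p) = x} 2^{-|p|},
\qquad
K(x) \;=\; -\log_2 m(x) + O(1).
```

Incomputability enters because no algorithm can compute K (or m) exactly; it can only be approximated from above by compression.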
|Chapter 4.||Machine Learning and Algorithmic Information||
Induction - Minimum Description Length (MDL).
Analogy as complexity minimization.
Machine Learning and compression.
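The MDL principle picks the hypothesis that minimizes the total description length: the cost of stating the hypothesis plus the cost of the data given the hypothesis. A toy sketch of this two-part code, comparing a "repeated pattern" hypothesis against quoting a bit string literally (the cost model and function names are our simplification, not the course's):

```python
from math import ceil, log2

def literal_cost(s: str) -> int:
    """Description length (in bits) of quoting the bit string as-is."""
    return len(s)

def repeat_cost(s: str):
    """Description length under the hypothesis 'a pattern repeated k
    times': pattern bits plus roughly log2(k) bits for the count.
    Returns None when no repetition explains the string."""
    n = len(s)
    best = None
    for p in range(1, n // 2 + 1):
        if n % p == 0 and s == s[:p] * (n // p):
            cost = p + ceil(log2(n // p)) + 1
            if best is None or cost < best:
                best = cost
    return best

def mdl_choice(s: str) -> str:
    """Pick the hypothesis with the shorter total description."""
    r = repeat_cost(s)
    if r is not None and r < literal_cost(s):
        return "repetition hypothesis"
    return "literal (no structure found)"

print(mdl_choice("01" * 32))            # 8 bits beat 64: structure found
print(mdl_choice("0110100110010110"))   # no exact repetition: quote it
```

Induction, in this view, is just the search for the hypothesis that compresses the data best.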
|Chapter 5.||AIT and subjective information||
Simplicity and coincidences.
Subjective probability & subjective information.
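One way these notions are made formal in simplicity theory is to measure unexpectedness as a complexity drop: a situation is surprising when it is much easier to describe than to generate. In the usual notation (a sketch, with C_w the generation or "world" complexity and C the description complexity):

```latex
U(s) \;=\; C_w(s) - C(s),
\qquad
p(s) \;=\; 2^{-U(s)}.
```

Subjective probability thus decreases exponentially with unexpectedness, which is why a lottery draw like 1-2-3-4-5-6, though no less likely than any other, feels so improbable: it is exceptionally simple to describe.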