The principles of Information Theory. The concept of probability. Shannon's information measure and the concept of entropy. Conditional, joint, and mutual information measures. Axiomatic foundations. The communication model. The discrete memoryless source. Source coding. Coding methods: Fano's method, Shannon's method, Huffman's method, the Gilbert-Moore method. Most probable messages. Shannon's first coding theorem. The discrete source with memory. Markov processes. The information content of a discrete source with memory. Coding issues. Error-correcting codes. The Hamming, Plotkin, and Singleton bounds. Hadamard codes. Codes generated from block designs. Reed-Muller codes. Golay codes. Codes and Latin squares. Code equivalence. Linear codes. Equivalence of linear codes. Dual codes. Hamming codes. Perfect codes. Cyclic codes. Weight enumerators.
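As an illustrative sketch only (not part of the official course materials), the link between Huffman's method and Shannon's first coding theorem can be seen on a small hand-picked source: for a dyadic probability distribution, the average Huffman codeword length L equals the entropy H, and in general H ≤ L < H + 1.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code(probs):
    """Binary Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    # Heap items: (probability, unique tiebreaker, {symbol: codeword suffix so far}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing their codewords.
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# A dyadic source: every probability is a power of 2, so L hits H exactly.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
H = entropy(probs.values())
L = sum(p * len(code[s]) for s, p in probs.items())
print(code, H, L)  # H = L = 1.75 bits/symbol
```

The resulting code is prefix-free by construction (each merge only prepends a bit), which is what makes the encoded stream uniquely decodable without separators.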
- Teacher: Χρήστος Κουκουβίνος
- ECTS: 5
- Language: el