Yury Polyanskiy (Massachusetts Institute of Technology), Yihong Wu (Yale University, Connecticut)
Information Theory
- Hardcover
This enthusiastic introduction to the fundamentals of information theory builds from classical Shannon theory through to modern applications in statistical learning. Includes over 210 student exercises, emphasising practical applications in statistics, machine learning and modern communication theory. Accompanied by online instructor solutions.
Other customers were also interested in
- Khalid Sayood, Introduction to Data Compression, €95.99
- Itzhak Gilboa (Tel-Aviv University), Theory of Decision Under Uncertainty, €40.99
- Itzhak Gilboa (Tel-Aviv University), Theory of Decision Under Uncertainty, €114.99
- Himanshu Tyagi (Indian Institute of Science, Bangalore), Information-theoretic Cryptography, €89.99
- Jon Barwise (Indiana University), Information Flow, €67.99
- Jon Barwise (Indiana University), Information Flow, €82.99
- Michael Drmota (Technische Universität Wien, Austria), Analytic Information Theory, €143.99
Product details
- Publisher: Cambridge University Press
- Number of pages: 748
- Publication date: 3 July 2025
- Language: English
- Dimensions: 258 mm x 181 mm x 40 mm
- Weight: 1656 g
- ISBN-13: 9781108832908
- ISBN-10: 1108832903
- Item no.: 70725904
- Manufacturer information
- Libri GmbH
- Europaallee 1
- 36244 Bad Hersfeld
- gpsr@libri.de
Yury Polyanskiy is a Professor of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology, with a focus on information theory, statistical machine learning, error-correcting codes, wireless communication, and fault tolerance. He is the recipient of the 2020 IEEE Information Theory Society James Massey Award for outstanding achievement in research and teaching in Information Theory.
Part I. Information measures: 1. Entropy
2. Divergence
3. Mutual information
4. Variational characterizations and continuity of information measures
5. Extremization of mutual information: capacity saddle point
6. Tensorization and information rates
7. f-divergences
8. Entropy method in combinatorics and geometry
9. Random number generators
Part II. Lossless Data Compression: 10. Variable-length compression
11. Fixed-length compression and Slepian-Wolf theorem
12. Entropy of ergodic processes
13. Universal compression
Part III. Hypothesis Testing and Large Deviations: 14. Neyman-Pearson lemma
15. Information projection and large deviations
16. Hypothesis testing: error exponents
Part IV. Channel Coding: 17. Error correcting codes
18. Random and maximal coding
19. Channel capacity
20. Channels with input constraints. Gaussian channels
21. Capacity per unit cost
22. Strong converse. Channel dispersion. Error exponents. Finite blocklength
23. Channel coding with feedback
Part V. Rate-distortion Theory and Metric Entropy: 24. Rate-distortion theory
25. Rate distortion: achievability bounds
26. Evaluating rate-distortion function. Lossy Source-Channel separation
27. Metric entropy
Part VI: 28. Basics of statistical decision theory
29. Classical large-sample asymptotics
30. Mutual information method
31. Lower bounds via reduction to hypothesis testing
32. Entropic bounds for statistical estimation
33. Strong data processing inequality.
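For a quick flavor of the information measures covered in Part I (entropy, divergence, mutual information), here is a minimal Python sketch; the toy joint distribution and the NumPy-based helper are purely illustrative and are not taken from the book.

```python
import numpy as np

# Toy joint distribution p(x, y) over two binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

def entropy(p):
    """Shannon entropy in bits; zero-probability entries are skipped."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)  # marginal distribution of X
p_y = p_xy.sum(axis=0)  # marginal distribution of Y

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y).
h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
print(f"H(X)   = {h_x:.4f} bits")               # 1.0000
print(f"H(X,Y) = {h_xy:.4f} bits")              # 1.7219
print(f"I(X;Y) = {h_x + h_y - h_xy:.4f} bits")  # 0.2781
```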