Information-Theoretic Methods in Data Science
Editors: Eldar, Yonina C.; Rodrigues, Miguel R. D.
- Hardcover
The first unified treatment of the interface between information theory and emerging topics in data science, written in a clear, tutorial style. Covering topics such as data acquisition, representation, analysis, and communication, it is ideal for graduate students and researchers in information theory, signal processing, and machine learning.
Other customers were also interested in
- Jose C. Principe, Information Theoretic Learning, €147.99
- Serkan Günel, An Information Theoretic Approach to Nonlinear Systems, €38.99
- Badong Chen, Kalman Filtering Under Information Theoretic Criteria, €85.99
- Emrah Asan, Video Shot Boundary Detection by Graph Theoretic Approaches, €38.99
- Rodney A. Kennedy, Hilbert Space Methods in Signal Processing, €136.99
- Juntao Chen, A Game- and Decision-Theoretic Approach to Resilient Interdependent Network Analysis and Design, €37.99
Product details
- Publisher: Cambridge University Press
- Pages: 562
- Publication date: 8 April 2021
- Language: English
- Dimensions: 250 mm x 175 mm x 35 mm
- Weight: 1139 g
- ISBN-13: 9781108427135
- ISBN-10: 1108427138
- Item no.: 57017817
1. Introduction (Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar)
2. An information theoretic approach to analog-to-digital compression (Alon Kipnis, Yonina Eldar and Andrea Goldsmith)
3. Compressed sensing via compression codes (Shirin Jalali and Vincent Poor)
4. Information-theoretic bounds on sketching (Mert Pilanci)
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data (Zahra Shakeri, Anand Sarwate and Waheed Bajwa)
6. Uncertainty relations and sparse signal recovery (Erwin Riegler and Helmut Bölcskei)
7. Understanding phase transitions via mutual information and MMSE (Galen Reeves and Henry Pfister)
8. Computing choice: learning distributions over permutations (Devavrat Shah)
9. Universal clustering (Ravi Raman and Lav Varshney)
10. Information-theoretic stability and generalization (Maxim Raginsky, Alexander Rakhlin and Aolin Xu)
11. Information bottleneck and representation learning (Pablo Piantanida and Leonardo Rey Vega)
12. Fundamental limits in model selection for modern data analysis (Jie Ding, Yuhong Yang and Vahid Tarokh)
13. Statistical problems with planted structures: information-theoretical and computational limits (Yihong Wu and Jiaming Xu)
14. Distributed statistical inference with compressed data (Wenwen Zhao and Lifeng Lai)
15. Network functional compression (Soheil Feizi and Muriel Médard)
16. An introductory guide to Fano's inequality with applications in statistical estimation (Jonathan Scarlett and Volkan Cevher)