€122.99
incl. VAT
Free shipping*
Ready to ship in over 4 weeks
  • Hardcover

Product Description
An up-to-date, self-contained introduction to a state-of-the-art machine learning approach, Ensemble Methods: Foundations and Algorithms shows how these accurate methods are used in real-world tasks. It gives you the necessary groundwork to carry out further research in this evolving field. After presenting background and terminology, the book covers the main algorithms and theories, including Boosting, Bagging, Random Forest, averaging and voting schemes, the Stacking method, mixture of experts, and diversity measures. It also discusses multiclass extension, noise tolerance, error-ambiguity and bias-variance decompositions, and recent progress in information-theoretic diversity. Moving on to more advanced topics, the author explains how to achieve better performance through ensemble pruning and how to generate better clustering results by combining multiple clusterings. In addition, he describes developments of ensemble methods in semi-supervised learning, active learning, cost-sensitive learning, class-imbalance learning, and comprehensibility enhancement.
This self-contained introduction shows how ensemble methods are used in real-world tasks. It first presents background and terminology for readers unfamiliar with machine learning and pattern recognition. The book then covers the main algorithms and theories, including Boosting, Bagging, Random Forest, averaging and voting schemes, and diversity measures. Moving on to more advanced topics, the author explains details behind ensemble pruning and clustering ensembles. He also describes developments in semi-supervised learning, active learning, cost-sensitive learning, class-imbalance learning, and comprehensibility enhancement.
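To make the core ensemble idea described above concrete, here is a minimal bagging-with-majority-voting sketch. It is not taken from the book; it assumes scikit-learn and NumPy are available, and the dataset, ensemble size, and random seeds are purely illustrative.

```python
# Minimal sketch: bagging with majority voting (illustrative only, not the book's code).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

n_estimators = 25
learners = []
for _ in range(n_estimators):
    # Bootstrap sample: draw |training set| indices with replacement.
    idx = rng.integers(0, len(X_tr), size=len(X_tr))
    learners.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

# Combine the base learners by majority vote over their predictions.
votes = np.stack([tree.predict(X_te) for tree in learners])   # shape: (n_estimators, n_test)
majority = (votes.mean(axis=0) >= 0.5).astype(int)            # binary labels are 0/1

single = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
print("single tree accuracy:   ", (single.predict(X_te) == y_te).mean())
print("bagged ensemble accuracy:", (majority == y_te).mean())
```

On data like this, the vote of many bootstrap-trained trees typically outperforms a single tree, which is the basic effect that Bagging-style ensembles covered in the book exploit.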
About the Author
Zhi-Hua Zhou is a professor in the Department of Computer Science and Technology and the National Key Laboratory for Novel Software Technology at Nanjing University. Dr. Zhou is the founding steering committee co-chair of ACML and serves as associate editor-in-chief, associate editor, and editorial board member of numerous journals. He has published extensively in top-tier journals, chaired many conferences, and won six international journal, conference, and competition awards. His research interests include machine learning, data mining, pattern recognition, and multimedia information retrieval.