  • Format: ePub
  • Devices: eReader
  • With copy protection
  • Size: 15.4 MB
  • Family Sharing (5)
Product Description
An effective guide to using ensemble techniques to enhance machine learning models

About This Book
  • Learn how to get the most out of popular machine learning algorithms such as random forests, decision trees, AdaBoost, k-nearest neighbors, and more
  • Take a practical approach to building efficient machine learning models with ensemble techniques, grounded in real-world use cases
  • Implement boosting, bagging, and stacking ensemble methods to improve your models' prediction accuracy
Who This Book Is For

This book is for data scientists, machine learning practitioners, and deep learning enthusiasts who want to implement ensemble techniques and take a deep dive into the world of machine learning algorithms. You are expected to understand Python code and have a basic knowledge of probability theory, statistics, and linear algebra.

What You Will Learn
  • Understand why bagging improves classification and regression performance
  • Get to grips with implementing AdaBoost and different variants of this algorithm
  • Understand the bootstrap method and its application to bagging
  • Perform regression on Boston housing data using scikit-learn and NumPy
  • Learn how to use random forests for Iris data classification (see the sketch after this list)
  • Get to grips with classifying the sonar dataset using KNN, the perceptron, and logistic regression
  • Discover how to improve prediction accuracy by fine-tuning the model parameters
  • Master the analysis of a trained predictive model for overfitting and underfitting cases
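
As a taste of the workflows listed above, here is a minimal sketch of random forest classification on the Iris data with scikit-learn. It is not code from the book; the split ratio and hyperparameter values are illustrative choices.

# Minimal sketch (illustrative values, not the book's listings): random forest on Iris.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# An ensemble of 100 decision trees, each trained on a bootstrap sample
forest = RandomForestClassifier(n_estimators=100, max_depth=4, random_state=42)
forest.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, forest.predict(X_test)))
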
In Detail

Ensembling is the technique of combining two or more machine learning algorithms, similar or dissimilar, to create a model that delivers superior predictive power. This book will show you how you can use many weak learners to build a strong predictive model. It contains Python code for the different machine learning algorithms so that you can easily understand and implement them in your own systems.
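
As a rough illustration of that idea (a sketch assuming scikit-learn and its bundled breast cancer dataset, not an example from the book), dissimilar learners can be combined with a simple majority-vote ensemble:

# Illustrative sketch: combining dissimilar learners by majority vote.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
ensemble = VotingClassifier(
    estimators=[
        ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
        ("tree", DecisionTreeClassifier(max_depth=3)),
    ],
    voting="hard",  # each base model casts one vote per sample
)
print("Mean CV accuracy:", cross_val_score(ensemble, X, y, cv=5).mean().round(3))
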

This book covers machine learning algorithms that are widely used in practice to make predictions and classifications. It addresses the different stages of a prediction framework, such as data pre-processing, model training, model validation, and more. You will gain knowledge of ensemble approaches such as bagging (decision trees and random forests), boosting (AdaBoost), and stacking (combining the outputs of multiple models, such as bagged and boosted learners, with a meta-learner).
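
By way of a hedged sketch (again using scikit-learn's ready-made estimators and its Iris data, not the book's own listings), the three ensemble families map onto code roughly like this:

# Illustrative sketch: bagging, boosting, and stacking in scikit-learn.
from sklearn.datasets import load_iris
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
models = {
    # Bagging: independent trees fit on bootstrap samples, predictions aggregated
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50),
    # Boosting: trees fit sequentially, each focusing on the previous errors
    "boosting": AdaBoostClassifier(n_estimators=50),
    # Stacking: a meta-learner combines the base models' predictions
    "stacking": StackingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=50)),
                    ("tree", DecisionTreeClassifier(max_depth=3))],
        final_estimator=LogisticRegression(max_iter=1000),
    ),
}
for name, model in models.items():
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
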

Then you'll learn how to implement these algorithms by building ensemble models with TensorFlow and Python libraries such as scikit-learn and NumPy. As machine learning touches almost every field of the digital world, you'll also see how these algorithms can be used in applications such as computer vision, speech recognition, recommendation systems, clustering and document classification, regression on data, and more.
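
For the regression side, here is a minimal sketch using scikit-learn and NumPy. It is not taken from the book; note that the Boston housing set has been removed from recent scikit-learn releases, so the California housing data stands in for it here.

# Illustrative sketch: ensemble regression on housing data (California housing
# substitutes for the withdrawn Boston set).
import numpy as np
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = fetch_california_housing(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Bagged regression trees; 100 estimators is an illustrative default
reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(X_train, y_train)
rmse = np.sqrt(mean_squared_error(y_test, reg.predict(X_test)))
print("Test RMSE:", round(float(rmse), 3))
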

By the end of this book, you'll understand how to combine machine learning algorithms into ensembles that work behind the scenes to overcome common prediction problems.

Style and approach

This comprehensive guide offers the perfect blend of theory, examples, and implementations of real-world use cases.

