Ensemble Methods: Foundations and Algorithms (Hardcover)
Provisional Chinese title: 集成方法:基礎與演算法 (Hardcover)

Zhi-Hua Zhou

Product Description

An up-to-date, self-contained introduction to a state-of-the-art machine learning approach, Ensemble Methods: Foundations and Algorithms shows how these accurate methods are used in real-world tasks. It gives you the necessary groundwork to carry out further research in this evolving field.

 

After presenting background and terminology, the book covers the main algorithms and theories, including Boosting, Bagging, Random Forest, averaging and voting schemes, the Stacking method, mixture of experts, and diversity measures. It also discusses multiclass extensions, noise tolerance, error-ambiguity and bias-variance decompositions, and recent progress in information-theoretic diversity.
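As a rough illustration of the combination schemes listed above (this sketch is not from the book), the following Python snippet trains a Boosting ensemble, a Random Forest, and a hard-voting combination of heterogeneous base learners with scikit-learn on a synthetic dataset; the dataset, learner choices, and parameter values are arbitrary assumptions chosen only for demonstration.

```python
# Minimal sketch: Boosting, a bagging-style Random Forest, and majority voting,
# shown with scikit-learn on synthetic data (all settings are arbitrary).
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    # Boosting: later learners concentrate on examples the earlier ones got wrong.
    "AdaBoost": AdaBoostClassifier(n_estimators=50, random_state=0),
    # Random Forest: trees grown on bootstrap samples with randomized feature splits.
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    # Voting: heterogeneous learners combined by majority (hard) vote.
    "Voting": VotingClassifier(
        estimators=[
            ("tree", DecisionTreeClassifier(random_state=0)),
            ("logreg", LogisticRegression(max_iter=1000)),
        ],
        voting="hard",
    ),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {model.score(X_te, y_te):.3f}")
```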

 

Moving on to more advanced topics, the author explains how to achieve better performance through ensemble pruning and how to generate better clustering results by combining multiple clusterings. In addition, he describes developments of ensemble methods in semi-supervised learning, active learning, cost-sensitive learning, class-imbalance learning, and comprehensibility enhancement.
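As a hedged sketch of the ensemble pruning idea (my own illustration under simple assumptions, not the book's algorithms), the snippet below trains a pool of bagged decision trees and then greedily keeps only those members that improve the validation accuracy of the majority vote.

```python
# Minimal sketch of greedy forward-selection ensemble pruning (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Train a pool of base learners on bootstrap samples (bagging-style).
rng = np.random.RandomState(0)
pool = []
for _ in range(30):
    idx = rng.randint(0, len(X_tr), len(X_tr))
    pool.append(DecisionTreeClassifier(random_state=0).fit(X_tr[idx], y_tr[idx]))

# Cache each learner's validation predictions: shape (n_learners, n_val_samples).
val_preds = np.array([clf.predict(X_val) for clf in pool])

def vote_accuracy(members):
    """Validation accuracy of the majority vote over the selected pool members."""
    votes = (val_preds[members].mean(axis=0) >= 0.5).astype(int)
    return (votes == y_val).mean()

selected, best_acc = [], 0.0
while True:
    candidates = [i for i in range(len(pool)) if i not in selected]
    if not candidates:
        break
    acc, best_i = max((vote_accuracy(selected + [i]), i) for i in candidates)
    if acc <= best_acc:  # stop once no remaining learner improves the vote
        break
    selected.append(best_i)
    best_acc = acc

print(f"kept {len(selected)} of {len(pool)} learners, validation accuracy {best_acc:.3f}")
```

A pruned subset selected this way can match or exceed the full pool's accuracy while being cheaper to store and evaluate, which is the kind of effect the book analyzes in depth.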
