Large-Scale Kernel Machines
Tentative Chinese title: 大規模核機器

Léon Bottou, Olivier Chapelle, Dennis DeCoste, Jason Weston


Product Description

Description

Pervasive and networked computers have dramatically reduced the cost of collecting and distributing large datasets. In this context, machine learning algorithms that scale poorly could simply become irrelevant. We need learning algorithms that scale linearly with the volume of the data while maintaining enough statistical efficiency to outperform algorithms that simply process a random subset of the data. This volume offers researchers and engineers practical solutions for learning from large-scale datasets, with detailed descriptions of algorithms and experiments carried out on realistically large datasets. At the same time it offers researchers information that can address the relative lack of theoretical grounding for many useful algorithms.

After a detailed description of state-of-the-art support vector machine technology, an introduction to the essential concepts discussed in the volume, and a comparison of primal and dual optimization techniques, the book progresses from well-understood techniques to more novel and controversial approaches. Many contributors have made their code and data available online for further experimentation. Topics covered include fast implementations of known algorithms, approximations that are amenable to theoretical guarantees, and algorithms that perform well in practice but are difficult to analyze theoretically.
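To give a flavor of the kind of method the blurb alludes to (primal optimization of support vector machines at linear cost per example), here is a minimal illustrative sketch, not taken from the book, of Pegasos-style stochastic subgradient descent on the primal hinge-loss objective of a linear SVM; all function and variable names here are our own illustration:

```python
import numpy as np

def train_linear_svm_sgd(X, y, lam=0.01, epochs=10, seed=0):
    """Primal SGD for a linear SVM with hinge loss (Pegasos-style sketch).

    Minimizes (lam/2)*||w||^2 + (1/n)*sum_i max(0, 1 - y_i * w.x_i).
    Each update costs O(d), so one pass over the data is linear in
    the number of examples -- the scaling regime the volume targets.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)        # decaying step size, eta*lam = 1/t
            margin = y[i] * X[i].dot(w)
            w *= (1.0 - eta * lam)       # gradient step on the regularizer
            if margin < 1:               # subgradient of the hinge loss
                w += eta * y[i] * X[i]
    return w

# Toy usage: two linearly separable Gaussian clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(2, 0.5, (50, 2)), rng.normal(-2, 0.5, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])
w = train_linear_svm_sgd(X, y)
acc = np.mean(np.sign(X.dot(w)) == y)
```

The dual formulation of the same problem works with kernel expansions instead of an explicit weight vector; the book's opening chapters compare the two viewpoints.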

