Natural Language Processing with Transformers: Building Language Applications with Hugging Face (Paperback)
Tunstall, Lewis, von Werra, Leandro, Wolf, Thomas
- Publisher: O'Reilly
- Publication date: 2022-03-01
- List price: $2,170
- VIP price: 5% off, $2,062
- Language: English
- Pages: 410
- Binding: Quality Paper (also called trade paper)
- ISBN: 1098103246
- ISBN-13: 9781098103248
Other editions:
Natural Language Processing with Transformers, Revised Edition (Paperback)
Customers who bought this item also bought...
- $2,457 Practical Natural Language Processing: A Comprehensive Guide to Building Real-World NLP Systems (Paperback)
- $1,188 Mastering Transformers: Build state-of-the-art models from scratch with advanced natural language processing techniques (Paperback)
- $332 了不起的 Markdown
- $403 Python 深度強化學習 : 基於 Chainer 和 OpenAI Gym
- $458 動手學強化學習
Description
Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you're a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.
Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them into your applications. You'll quickly learn a variety of tasks they can help you solve.
- Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering (see the sketch after this list)
- Learn how transformers can be used for cross-lingual transfer learning
- Apply transformers in real-world scenarios where labeled data is scarce
- Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
- Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
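To make the task list above concrete, here is a minimal sketch using the Hugging Face Transformers pipeline API for two of those tasks, text classification and question answering. The checkpoint name and example strings are illustrative assumptions, not examples taken from the book.

```python
# Minimal sketch of the Hugging Face Transformers pipeline API for two core
# NLP tasks. The checkpoint below is an illustrative assumption; leaving
# `model` unset makes the library fall back to a default for the task.
from transformers import pipeline

# Text classification (binary sentiment in this example)
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Transformers are remarkably versatile!"))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]

# Extractive question answering over a short context
qa = pipeline("question-answering")
print(qa(
    question="Which library does the book use?",
    context="The book teaches NLP with the Hugging Face Transformers library.",
))
# -> {'score': ..., 'start': ..., 'end': ..., 'answer': '...'}
```

Running this requires `pip install transformers` plus a backend such as PyTorch; the first call downloads the model weights from the Hugging Face Hub.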
About the Authors
Lewis Tunstall is a data scientist in Switzerland, focused on building machine learning-powered applications for startups and enterprises in the domains of natural language processing and time series. A former theoretical physicist, he has over 10 years' experience translating complex subject matter for lay audiences and has taught machine learning to university students at both the graduate and undergraduate levels.
Leandro von Werra is a data scientist at Swiss Mobiliar, where he leads the company's natural language processing efforts to streamline and simplify processes for customers and employees. He has experience working across the whole machine learning stack and is the creator of a popular Python library that combines Transformers with reinforcement learning. He also teaches data science and visualization at the Bern University of Applied Sciences.
Thomas Wolf is Chief Science Officer and co-founder of Hugging Face. His team is on a mission to catalyze and democratize NLP research. Prior to Hugging Face, Thomas gained a Ph.D. in physics and later a law degree. He worked as a physics researcher and a European Patent Attorney.