Transformers for Natural Language Processing : Build, train, and fine-tune deep neural network architectures for NLP with Python, PyTorch, 2/e (Paperback)
Chinese title (tentative): 自然語言處理的變壓器:使用 Python 和 PyTorch 構建、訓練及微調深度神經網絡架構,第二版(平裝本)
Rothman, Denis
- Publisher: Packt Publishing
- Publication date: 2022-03-25
- List price: $3,390
- Member price: $3,221 (5% off)
- Language: English
- Pages: 564
- Binding: Quality Paper (trade paperback)
- ISBN: 1803247339
- ISBN-13: 9781803247335
Related categories:
Python, Programming Languages, Text Mining

Related translation:
基於 GPT-3、ChatGPT、GPT-4 等 Transformer 架構的自然語言處理 (Simplified Chinese edition)

Other editions:
Transformers for Natural Language Processing and Computer Vision, 3/e (Paperback)
Description
A look under the hood of transformers: fine-tuning GPT-3 models, DeBERTa, vision models, and the start of the Metaverse, using a variety of NLP platforms: Hugging Face, OpenAI API, Trax, and AllenNLP
Key Features
- Implement models, such as BERT, Reformer, and T5, that outperform classical language models
- Compare NLP applications using GPT-3, GPT-2, and other transformers
- Analyze advanced use cases, including polysemy, cross-lingual learning, and computer vision
Book Description
Transformers are a game-changer for natural language understanding (NLU) and have become one of the pillars of artificial intelligence.
Transformers for Natural Language Processing, 2nd Edition, investigates deep learning for machine translations, speech-to-text, text-to-speech, language modeling, question-answering, and many more NLP domains with transformers.
An Industry 4.0 AI specialist needs to be adaptable; knowing just one NLP platform is not enough anymore. Different platforms have different benefits depending on the application, whether it's cost, flexibility, ease of implementation, results, or performance. In this book, we analyze numerous use cases with Hugging Face, Google Trax, OpenAI, and AllenNLP.
This book takes transformers' capabilities further by combining multiple NLP techniques, such as sentiment analysis, named entity recognition, and semantic role labeling, to analyze complex use cases, such as dissecting fake news on Twitter. Also, see how transformers can create code using just a brief description.
By the end of this NLP book, you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets.
What you will learn
- Discover new ways of performing NLP techniques with the latest pretrained transformers
- Grasp the workings of the original Transformer, GPT-3, BERT, T5, DeBERTa, and Reformer
- Find out how ViT and CLIP label images (including blurry ones!) and reconstruct images using DALL-E
- Carry out sentiment analysis, text summarization, causal language analysis, machine translations, and more using TensorFlow, PyTorch, and GPT-3
- Measure the productivity of key transformers to define their scope, potential, and limits in production
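The "workings of the original Transformer" mentioned in the bullets above center on scaled dot-product attention. A minimal NumPy sketch for orientation (the function name, toy shapes, and random inputs are illustrative, not taken from the book):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V -- the core operation of the
    original Transformer (Vaswani et al., 2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V, weights                     # context vectors, attention map

# Toy example: a sequence of 3 tokens with model dimension 4
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((3, 4)) for _ in range(3))
out, attn = scaled_dot_product_attention(Q, K, V)
print(out.shape)   # (3, 4): one context vector per token
```

Each row of `attn` sums to 1, so every output row is a convex combination of the value vectors; multi-head attention in the book's architecture chapter repeats this operation across several learned projections.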
Who this book is for
If you want to learn about and apply transformers to your natural language (and image) data, this book is for you.
A good understanding of NLP, Python, and deep learning is required to benefit most from this book. Many platforms covered in this book provide interactive user interfaces, which allow readers with a general interest in NLP and AI to follow several chapters of this book.
Table of Contents
1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Fine-Tuning BERT Models
4. Pretraining a RoBERTa Model from Scratch
5. Downstream NLP Tasks with Transformers
6. Machine Translation with the Transformer
7. The Rise of Suprahuman Transformers with GPT-3 Engines
8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
9. Matching Tokenizers and Datasets
10. Semantic Role Labeling with BERT-Based Transformers
11. Let Your Data Do the Talking: Story, Questions, and Answers
12. Detecting Customer Emotions to Make Predictions
13. Analyzing Fake News with Transformers
14. Interpreting Black Box Transformer Models
15. From NLP to Task-Agnostic Transformer Models
16. The Emergence of Transformer-Driven Copilots
17. Appendix I ― Terminology of Transformer Models