Mastering Transformers: The Journey from BERT to Large Language Models and Stable Diffusion, 2/e (Paperback)
Savaş Yıldırım, Meysam Asgari-Chenaghlu
- Publisher: Packt Publishing
- Publication date: 2024-06-03
- List price: $1,600
- Member price: 5% off, $1,520
- Language: English
- Pages: 462
- Binding: Quality Paper - also called trade paper
- ISBN: 1837633789
- ISBN-13: 9781837633784
Related categories:
LangChain
In stock, ships immediately (stock = 1)
Customers who bought this item also bought:
- $2,223 Natural Language Processing with Transformers, Revised Edition (Paperback)
- $2,679 Generative Deep Learning: Teaching Machines to Paint, Write, Compose, and Play, 2/e (Paperback)
- $2,660 Deep Learning at Scale: At the Intersection of Hardware, Software, and Data (Paperback)
- $2,660 Hands-On Large Language Models: Language Understanding and Generation (Paperback)
Product Description
Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively
Key Features:
- Understand the complexity of deep learning architecture and transformers architecture
- Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
- Explore challenges in the preparation process, such as problem and language-specific dataset transformation
- Purchase of the print or Kindle book includes a free PDF eBook
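The transformer architecture highlighted above is built around scaled dot-product attention. As a rough orientation (a minimal NumPy sketch for illustration, not code from the book), the core operation softmax(QK^T / sqrt(d_k))V can be written as:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # subtract row max for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row is a probability distribution
    return weights @ V, weights                   # weighted mix of values, plus the weights

# Toy example: 3 tokens, embedding dimension 4 (arbitrary illustrative sizes)
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with weights determined by how strongly each query matches each key.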
Book Description:
Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches for many challenging natural language understanding (NLU) problems.
Aside from NLP, multimodal learning and generative AI have recently emerged as fast-growing areas showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image generation. Computer vision solutions based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions.
By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.
What You Will Learn:
- Focus on solving simple-to-complex NLP problems with Python
- Discover how to solve classification/regression problems with traditional NLP approaches
- Train a language model and explore how to fine-tune models to the downstream tasks
- Understand how to use transformers for generative AI and computer vision tasks
- Build transformer-based NLP apps with the Python transformers library
- Focus on language generation such as machine translation and conversational AI in any language
- Speed up transformer model inference to reduce latency
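The language-generation skills listed above rest on autoregressive decoding: the model predicts one token at a time and feeds each prediction back as input. A toy sketch of greedy decoding (the bigram table is a purely hypothetical stand-in for a trained model, not anything from the book):

```python
# Hypothetical bigram table standing in for a trained language model's
# next-token prediction (illustrative only).
BIGRAMS = {
    "<s>": "the", "the": "model", "model": "generates",
    "generates": "text", "text": "</s>",
}

def greedy_generate(start="<s>", max_tokens=10):
    """Autoregressive decoding: repeatedly pick the most likely next token
    and append it to the running sequence."""
    tokens = [start]
    for _ in range(max_tokens):
        nxt = BIGRAMS.get(tokens[-1], "</s>")
        tokens.append(nxt)
        if nxt == "</s>":  # stop at the end-of-sequence marker
            break
    return [t for t in tokens if t not in ("<s>", "</s>")]  # strip sentinels

sentence = " ".join(greedy_generate())
```

Real models replace the lookup table with a neural network over the full prefix, and often sample from the distribution instead of always taking the argmax, but the feed-the-output-back-in loop is the same.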
Who this book is for:
This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators and students with a good command of programming, knowledge of machine learning and artificial intelligence, and an interest in developing NLP and multimodal applications will also benefit from the book's hands-on approach. Knowledge of Python (or another programming language), familiarity with the machine learning literature, and a basic understanding of computer science are required.