E-book, English, 384 pages
Rothman: Transformers for Natural Language Processing
1st edition, 2021
ISBN: 978-1-80056-863-1
Publisher: De Gruyter
Format: EPUB
Copy protection: none
Build innovative deep neural network architectures for NLP with Python, PyTorch, TensorFlow, BERT, RoBERTa, and more
Further Information & Material
Table of Contents
- Getting Started with the Model Architecture of the Transformer
- Fine-Tuning BERT Models
- Pretraining a RoBERTa Model from Scratch
- Downstream NLP Tasks with Transformers
- Machine Translation with the Transformer
- Text Generation with OpenAI GPT-2 and GPT-3 Models
- Applying Transformers to Legal and Financial Documents for AI Text Summarization
- Matching Tokenizers and Datasets
- Semantic Role Labeling with BERT-Based Transformers
- Let Your Data Do the Talking: Story, Questions, and Answers
- Detecting Customer Emotions to Make Predictions
- Analyzing Fake News with Transformers
- Appendix: Answers to the Questions