Mastering transformers : build SOTA models from scratch with advanced natural language processing techniques /
Main Author: Yildırım, Savas
Other Authors: Asgari-Chenaghlu, Meysam
Format: Book
Language: English
Published: Birmingham, UK : Packt Publishing, 2021.
Subjects:
Similar Items
- Transformers for natural language processing : build innovative deep neural network architectures for NLP with Python, Pytorch, TensorFlow, BERT, RoBERTa, and more /
  by: Rothman, Denis
  Published: (2021)
- Advanced natural language processing with TensorFlow 2 : build effective real-world NLP applications using NER, RNNs, seq2seq models, transformers, and more /
  by: Bansal, Ashish
  Published: (2021)
- Getting started with Google BERT : build and train state-of-the-art natural language processing models using BERT /
  by: Ravichandiran, Sudharsan
  Published: (2021)
- Introduction to natural language processing /
  by: Eisenstein, Jacob
  Published: (2019)
- Exploration and mitigation of gender bias in word embeddings from transformer-based language models
  by: Hossain, Ariyan, et al.
  Published: (2024)