Fine Tuning DistilBERT for Multiclass Text Classification | TensorFlow | NLP | Machine Learning

Rohan-Paul-AI

1 year ago

26,427 views



Comments:

Dmitriy Neledva - 09.11.2023 00:56

This is a great video 😊

Lam Le - 12.10.2023 16:46

Thank you! Very helpful. If possible, can you please do another video for fine-tuning XLNet, and also for multi-class text classification?

Ozodbek Ozodov - 04.09.2023 19:22

Very useful and easy to follow!

Varghese K A - 14.08.2023 07:28

Which TensorFlow version did you use?

Kamna Narang - 09.07.2023 22:38

I'm getting an error while running model=...; it says this version of TF was not found on my system. Kindly help.

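For version questions like the two above, a quick first step is to check what is actually installed in the environment. This is a generic stdlib sketch (not from the video); the package names "tensorflow" and "transformers" are the usual PyPI distribution names:

```python
from importlib import metadata

def installed_version(pkg):
    """Return the installed version string of a package, or None if absent."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

# Report the versions the notebook would be running against.
for pkg in ("tensorflow", "transformers"):
    print(pkg, installed_version(pkg) or "not installed")
```

If a package shows "not installed", a plain pip install inside the same environment (the same kernel the notebook uses) usually resolves "not found" errors like the one reported here.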
tofufa - 12.06.2023 11:14

Is it possible to get the top 5 prediction results instead of just one?

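Yes: a sequence-classification head returns one logit per class, so instead of taking the single argmax you can softmax the logits and keep the k largest. A minimal NumPy sketch (not the video's code; the logits, label map, and helper name are illustrative):

```python
import numpy as np

def top_k_predictions(logits, id2label, k=5):
    """Return the k highest-probability labels with their softmax scores."""
    # Numerically stable softmax over the class logits.
    exp = np.exp(logits - np.max(logits))
    probs = exp / exp.sum()
    # Indices of the k largest probabilities, highest first.
    top = np.argsort(probs)[::-1][:k]
    return [(id2label[i], float(probs[i])) for i in top]

# Example: logits for a hypothetical 6-class problem.
logits = np.array([2.0, 0.5, 1.2, -0.3, 3.1, 0.0])
id2label = {0: "sport", 1: "tech", 2: "politics",
            3: "travel", 4: "business", 5: "health"}
for label, p in top_k_predictions(logits, id2label, k=5):
    print(label, round(p, 3))
```

With TensorFlow, `tf.math.top_k` applied to the model's output logits does the same selection in one call.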
Cai Yu - 08.06.2023 07:04

Great

Hemal Shah - 03.05.2023 02:26

I got an error on import TFDistilBertForSequenceClassification: "AssertionError: Duplicate registrations for type 'experimentalOptimizer'". Any solution?

Debojit Mandal - 26.03.2023 01:36

Hi, can you please tell me what result you got when you ran trainer.evaluate()? I am getting a loss of 2.81, so I don't know whether that is good or bad.

Luis Valencia - 17.03.2023 13:33

Why are you downloading stopwords if they aren't even used in the rest of the code?

Nazreen N - 08.03.2023 11:05

Hi, I'm not able to train the model using the trainer.train() function. Any solution for it?

Venkatesan R - 23.09.2022 18:12

Your videos are great and useful. Can you discuss handling unstructured NLP tasks, like processing unlabeled document text?

Ziad Ullah - 23.09.2022 17:52

Great
