Comments:
dude you are awesome
Well explained. Thank you for this wonderful explanation 👏
One question: we had 747 data points for each class, so why are the values in the confusion matrix as low as 187?
When running this code:
return bert_encoder(preprocessor)['pooled_output']
I'm getting an error:
TypeError: Exception encountered when calling layer 'keras_layer_7' (type KerasLayer).
pruned(input_ids, input_mask, segment_ids) missing required arguments: input_ids, segment_ids.
Call arguments received by layer 'keras_layer_7' (type KerasLayer):
• inputs={'input_type_ids': 'tf.Tensor(shape=(2, 128), dtype=int32)', 'input_word_ids': 'tf.Tensor(shape=(2, 128), dtype=int32)', 'input_mask': 'tf.Tensor(shape=(2, 128), dtype=int32)'}
• training=None
ModuleNotFoundError: No module named 'tensorflow_text'
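This error usually means the preprocessing model and the encoder are a mismatched pair (the keys the preprocessor emits don't line up with what the encoder's signature expects), and the tensorflow_text import has to happen before the preprocessor is loaded, since it registers the ops the preprocessor uses. A minimal sketch of the standard pairing, assuming the TF-Hub handles used in the video and a matching tensorflow/tensorflow-text install:

    import tensorflow as tf
    import tensorflow_hub as hub
    import tensorflow_text  # registers the custom ops the preprocessor needs

    # The preprocessor and encoder must be a matching pair from tfhub.dev.
    bert_preprocess = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
    bert_encoder = hub.KerasLayer(
        "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4")

    def get_sentence_embedding(sentences):
        # Raw strings go through the preprocessor first; the encoder then
        # consumes the dict of input_word_ids / input_mask / input_type_ids.
        preprocessed = bert_preprocess(tf.constant(sentences))
        return bert_encoder(preprocessed)["pooled_output"]

    print(get_sentence_embedding(["nice video"]).shape)  # (1, 768)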
Can you do spoiler detection with BERT? I have been trying for some time but I am not able to.
Can we use the hidden states (only the CLS token) generated by the BERT model as features to train TFDistilBERT for a binary classification task?
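Using the pooled (CLS-based) vector as fixed features is a common pattern. A rough sketch with a plain dense head standing in for the downstream classifier, reusing the bert_preprocess/bert_encoder layers from the sketch above (hub layers are frozen by default, so BERT acts as a pure feature extractor here):

    text_input = tf.keras.layers.Input(shape=(), dtype=tf.string)
    features = bert_encoder(bert_preprocess(text_input))["pooled_output"]  # (batch, 768)
    output = tf.keras.layers.Dense(1, activation="sigmoid")(features)
    model = tf.keras.Model(inputs=text_input, outputs=output)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])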
Thank you so much for your videos! But I have a doubt: since balancing the data in multi-label classification doesn't help because words have similar meanings, what can be done?
Sir, I need the classifier code without Sequential layers.
very interested
Musk melon for Elon Musk, hence the 0.84 cosine similarity.
Thank you so much for this video; it is very helpful for my master's project. The model you built in the video: is it fine-tuning only the last layer of BERT, or completely retraining all BERT layers?
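For reference, which of the two happens is controlled by the trainable flag on the hub encoder layer: with trainable=False only the new classification head learns, while trainable=True fine-tunes all BERT layers. A sketch of the two options, with encoder_url standing for the encoder handle above:

    # Feature extraction: BERT stays frozen, only the head is trained.
    bert_encoder = hub.KerasLayer(encoder_url, trainable=False)

    # Full fine-tuning: all BERT layers are updated; a small learning rate
    # is the usual choice so the pretrained weights aren't destroyed.
    bert_encoder = hub.KerasLayer(encoder_url, trainable=True)
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-5),
                  loss="binary_crossentropy", metrics=["accuracy"])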
Hi, thank you for the good video. Have you talked about ELMo before?
Hello, can you please confirm whether removal of stopwords, numbers, stemming, etc. is required in this case?
Perfectly explained! Thanks a lot.
I am getting a "No matching distribution found for tensorflow_text==2.12" error while installing tensorflow_text using pip. Could you please help with this? Thank you.
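That message usually means pip can't find a prebuilt tensorflow-text wheel for your Python version or platform, or the pin doesn't match the installed TensorFlow. Pinning both packages to the same release often resolves it; a sketch for a notebook cell, with 2.12 as the example version:

    !pip install "tensorflow==2.12.*" "tensorflow-text==2.12.*"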
Thanks for sharing this nicely explained concept.
If my task is classification with BERT and an RNN (specifically a BiGRU), which BERT output do I need to use: pooled or sequence? If I use the pooled output, I get a dimension error, since an RNN requires a 3D tensor. Please help with this.
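The pooled output is one 768-dim vector per sentence, so it's only 2D with the batch axis; the sequence output is (batch, seq_len, 768), which is the 3D tensor an RNN expects. A sketch under the same hub-layer assumptions as the snippets above:

    seq = bert_encoder(bert_preprocess(text_input))["sequence_output"]  # (batch, 128, 768)
    x = tf.keras.layers.Bidirectional(tf.keras.layers.GRU(64))(seq)
    output = tf.keras.layers.Dense(1, activation="sigmoid")(x)
    model = tf.keras.Model(inputs=text_input, outputs=output)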
Underrated channel, tbh. He needs more recognition. Thanks a lot for supporting us.
Hello sir, I want to know one thing about code-mixed language, like Hindi written in English letters: does MuRIL BERT understand this Hinglish, i.e. Hindi pronunciation written in English script?
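MuRIL's training data reportedly includes transliterated (romanized) Indian-language text, so Hinglish is within what it was built for. Trying it is mostly a matter of swapping hub handles; a sketch (handles as listed on tfhub.dev, worth verifying for your setup):

    muril_preprocess = hub.KerasLayer("https://tfhub.dev/google/MuRIL_preprocess/1")
    muril_encoder = hub.KerasLayer("https://tfhub.dev/google/MuRIL/1", trainable=True)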
Oh thank god, finally someone who explains well AND covers the topic in enough depth to be useful.
Man, you are a legend!
Great video. I am facing an issue installing tensorflow_hub: cannot import name 'deserialize_keras_object' from partially initialized module 'keras.saving.legacy.serialization'. Any thoughts?
Thank you so much for your videos! You don't know how much you have helped me. I was really scared to dive into transformers but you have made it very easy to understand.
Thank you for explaining the BERT model. I am not sure why the model is taking 2 hours per epoch; has anyone experienced the same?
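Two hours per epoch usually means training is running on the CPU; BERT fine-tuning is dramatically faster on a GPU. A quick check in the same TF setup:

    print(tf.config.list_physical_devices("GPU"))  # empty list means CPU-only training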
Thanks for this nice tutorial!
Which algorithm are we using for the text classification here? Can anyone tell me, please?
Wonderful!! But... what if we have 3 or more categories instead of just 2? Thanks a lot.
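For three or more mutually exclusive categories, the head changes from a single sigmoid unit to a softmax over N classes with a categorical loss. A sketch with N=3, reusing the text_input/features tensors from the sketches above and assuming integer class labels:

    output = tf.keras.layers.Dense(3, activation="softmax")(features)
    model = tf.keras.Model(inputs=text_input, outputs=output)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])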
I am a little bit confused: each sentence length should be 128, and each word should be of 768 dimensions?
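Both statements are true at different levels: the preprocessor pads or truncates every sentence to 128 tokens, and the encoder maps each token to a 768-dim vector. Printing the shapes makes it concrete (same layers as above):

    outputs = bert_encoder(bert_preprocess(tf.constant(["hello world"])))
    print(outputs["sequence_output"].shape)  # (1, 128, 768): 128 tokens x 768 dims each
    print(outputs["pooled_output"].shape)    # (1, 768): one vector for the whole sentence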
I can't find the dataset; can someone help me?
Thank you so much, sir. Please, where can I find the code?
Thanks for the video. I am getting an error saying "failed to convert NumPy array to Tensor".
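That error often means the labels (or inputs) are an object-dtype array, e.g. a pandas column with mixed types; casting to a concrete dtype before fit() is a common fix. A sketch with a hypothetical y_train name:

    import numpy as np

    # Object-dtype arrays can't be converted to tensors; cast explicitly.
    y_train = np.asarray(y_train).astype("float32")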
Can you share the dataset link?
Thanks for the great explanation. Really helpful.
Thank you very much! You really helped me!
Sir, if it is a multi-class classification, where should I change the code?
I knew Jeff Bezos and banana have a lot in common 🤣 Great video, btw.
You didn't show how to save a trained model.
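For completeness, a sketch of saving and reloading; the SavedModel format keeps the TF-Hub layers intact (the path is just an example name):

    model.save("bert_text_classifier")  # writes a SavedModel directory
    reloaded = tf.keras.models.load_model(
        "bert_text_classifier",
        custom_objects={"KerasLayer": hub.KerasLayer})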
Thank you, sir, for your valuable lectures. Can you direct me to any of your content about the XLNet model?
Hi, thanks for the vid.
Is it possible to make this code run with an AMD GPU?
Is there code for an NLP model without labels (I mean unsupervised ML)? I am struggling to find any ;)
Very useful video, thanks a lot!
Can you please show how to plot the loss graph?
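fit() returns a History object whose .history dict holds the per-epoch metrics, so plotting it gives the loss curve. A sketch assuming matplotlib and hypothetical X_train/y_train arrays:

    import matplotlib.pyplot as plt

    history = model.fit(X_train, y_train, validation_split=0.2, epochs=5)
    plt.plot(history.history["loss"], label="train loss")
    plt.plot(history.history["val_loss"], label="val loss")
    plt.xlabel("epoch")
    plt.ylabel("loss")
    plt.legend()
    plt.show()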
Please do more in-depth stuff on NLP!!
Please help: how can I apply SMOTE for oversampling in this model?
You are awesome; this is the first BERT starter video I've seen that is actually slow and easy enough, and it suited me very well!
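SMOTE interpolates numeric feature vectors, so it can't be applied to raw strings directly. One workaround is to embed the texts with BERT first and oversample the embeddings, then train a classifier on those vectors (imbalanced-learn assumed installed; X_train_texts/y_train are hypothetical names, and get_sentence_embedding is from the first sketch above):

    from imblearn.over_sampling import SMOTE

    X_emb = get_sentence_embedding(list(X_train_texts)).numpy()  # (n, 768)
    X_res, y_res = SMOTE(random_state=42).fit_resample(X_emb, y_train)
    # X_res / y_res can now train a head that takes embeddings as input.

Note the trade-off: this trains a classifier on frozen embeddings rather than fine-tuning BERT end to end.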
Sir, what if I have a multi-label dataset, for example with 6 labels?
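Multi-label (several of the 6 labels can be active at once) differs from multi-class: keep one sigmoid unit per label with binary cross-entropy, and encode the targets as multi-hot vectors. A sketch reusing the text_input/features tensors from above:

    output = tf.keras.layers.Dense(6, activation="sigmoid")(features)
    model = tf.keras.Model(inputs=text_input, outputs=output)
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])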
Hi sir, thank you for this amazing video. I have followed your video and used the BERT model for text classification, but the accuracy of my model is very low. Can you help me?