What does BERT stand for?
Answers
BERT, which stands for Bidirectional Encoder Representations from Transformers, is a neural network-based technique for natural language processing pre-training.
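The "bidirectional" part refers to BERT's masked-language-modeling pre-training objective: a token is hidden, and the model predicts it using context from both the left and the right. A minimal toy sketch of the masking step (illustrative only, not the real BERT implementation; the function and token names are made up here):

```python
# Toy illustration of BERT's masked-language-modeling (MLM) objective:
# hide one token so a model must use context on BOTH sides of the gap
# ("bidirectional") to predict it.

def mask_token(tokens, index, mask="[MASK]"):
    """Return a copy of `tokens` with the token at `index` replaced by `mask`."""
    masked = list(tokens)
    masked[index] = mask
    return masked

sentence = ["the", "cat", "sat", "on", "the", "mat"]
masked = mask_token(sentence, 2)
print(masked)  # ['the', 'cat', '[MASK]', 'on', 'the', 'mat']

# Unlike a left-to-right language model, the predictor for "[MASK]" sees
# both the left context and the right context:
left_context, right_context = masked[:2], masked[3:]
```

During actual pre-training, BERT masks a fraction of tokens (around 15% in the original setup) and is trained to recover them from the full surrounding sentence.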
Answer:
Bidirectional Encoder Representations from Transformers.