Which algorithm is the precursor to BERT?
Answer:
BERT builds on earlier approaches to pre-training contextual representations, including Semi-supervised Sequence Learning, Generative Pre-Training (GPT), ELMo, and ULMFiT. Unlike these earlier models, which were unidirectional or only shallowly bidirectional, BERT is a deeply bidirectional, unsupervised language representation, pre-trained using only a plain-text corpus.
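The "unsupervised pre-training on plain text" part refers to BERT's masked-language-model objective: some input tokens are hidden and the model must predict them from context on both sides. A minimal sketch of that input-masking step (simplified; real BERT also sometimes substitutes a random token or leaves the chosen token unchanged, and the 15% rate and function below are illustrative assumptions, not the actual implementation):

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Toy sketch of BERT-style masked-LM input preparation.

    Each token is independently masked with probability `mask_rate`;
    the original token is kept as the prediction target (label).
    """
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)  # None = position not predicted
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            labels[i] = tok          # model must recover this token
            masked[i] = mask_token   # hide it in the input
    return masked, labels

sentence = "the quick brown fox jumps over the lazy dog".split()
masked, labels = mask_tokens(sentence, mask_rate=0.3)
print(masked)
print(labels)
```

Because the model sees the full sentence (minus the masks) at once, it can use both left and right context to fill in each blank, which is what makes the representation deeply bidirectional, in contrast to a left-to-right language model like GPT.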