Extracting Word Embedding & Sentence Embedding From BERT
Natural Language Processing (NLP) has been booming and evolving rapidly for several years now. Picking an NLP model is not as hard as it was a few years ago, since many models have been developed by the NLP community and can be freely downloaded and used in your own projects. That is why a conceptual understanding of the best way to represent words and sentences matters more than ever. Google’s BERT (Bidirectional Encoder Representations from Transformers) is one of the best-known models and has been used in a great deal of research and projects.