Identifying the right meaning of words using BERT

An important motivation for contextualised word embeddings is that standard (static) embeddings assign a single vector to each word form, even though many words have multiple meanings. Collapsing homonyms and homographs into one vector erases the distinction between their senses; the hypothesis is that taking the surrounding context into account avoids this problem. In this story, we analyse whether BERT embeddings can be used to classify the different meanings of a word, as evidence that contextualised word embeddings solve the problem. See the full article here.
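
To make the idea concrete, here is a minimal sketch (an illustrative example, not the article's actual code) that assumes the Hugging Face `transformers` library and the `bert-base-uncased` checkpoint. It extracts the contextual vector of the word "bank" in three sentences and compares them with cosine similarity: the two occurrences sharing the financial sense should be closer to each other than to the river-bank occurrence.

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the last-layer BERT embedding of `word` in `sentence`.

    Assumes `word` is a single wordpiece in BERT's vocabulary and
    takes its first occurrence in the sentence.
    """
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        # last_hidden_state: (batch, seq_len, hidden_size); take batch 0
        hidden = model(**enc).last_hidden_state[0]
    word_id = tokenizer.convert_tokens_to_ids(word)
    position = (enc["input_ids"][0] == word_id).nonzero()[0].item()
    return hidden[position]

# "bank" as a financial institution vs. the side of a river
v_money1 = word_vector("she deposited the cheque at the bank .", "bank")
v_money2 = word_vector("the bank approved my loan application .", "bank")
v_river = word_vector("they had a picnic on the bank of the river .", "bank")

cos = torch.nn.functional.cosine_similarity
print("same sense:     ", cos(v_money1, v_money2, dim=0).item())
print("different sense:", cos(v_money1, v_river, dim=0).item())
```

With a static embedding, all three vectors would be identical by construction; with BERT, the two financial uses should score a noticeably higher similarity with each other than either does with the river sense, which is exactly the property the classification experiment in the full article relies on.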