Asked by forgetso on Data Science, February 27, 2021
Let's say I have a 300-dimensional word embedding trained with Word2Vec, containing 10,000 word vectors.
I have additional data on the 10,000 words in the form of a vector (10,000×1) containing values between 0 and 1. Can I simply append this vector to the word embedding so that I have a 301-dimensional embedding?
I am looking to calculate similarities between word vectors using cosine similarity.
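A minimal sketch of the operation described above, assuming the embeddings are held in a NumPy matrix and the extra feature is a flat array (the variable names and random data here are purely illustrative):

```python
import numpy as np

# Hypothetical inputs: a (10000, 300) embedding matrix and a (10000,)
# extra feature with values in [0, 1].
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(10_000, 300)).astype(np.float32)
extra_feature = rng.uniform(size=10_000).astype(np.float32)

# Append the feature as a 301st column.
augmented = np.hstack([embeddings, extra_feature[:, None]])  # shape (10000, 301)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Similarity between word 0 and word 1 in the augmented space.
print(cosine_similarity(augmented[0], augmented[1]))
```

Note that the appended column's magnitude relative to the other 300 dimensions determines how much it influences the cosine similarity, so its scaling is a design choice worth considering.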