Data Science: Asked on July 7, 2021
Are there any guidelines for choosing the embedding dimension size in a custom Word2Vec embedding? I know that the default is 100, and that seems as good a value as any. But I'm wondering if there is any data out there that offers heuristics for when you should deviate from this value.
I can't imagine there is much benefit in a smaller size, but surely there must be some value in a larger one? Or maybe it's related to the size of my vocabulary? I have a relatively small vocabulary for my latest project (7,000 words), so maybe there is some ratio or proportion I can apply?
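For context, the parameter in question is vector_size in gensim's Word2Vec (gensim 4.x; older releases call it size). Below is a minimal sketch of setting it, plus one commonly cited rule of thumb, dim ≈ vocab_size ** 0.25, which is an assumption borrowed from general embedding-sizing advice rather than anything Word2Vec-specific; for 7,000 words it suggests only about 9 dimensions, far below the default of 100:

    from gensim.models import Word2Vec

    # Toy corpus: a list of tokenized sentences.
    sentences = [
        ["the", "quick", "brown", "fox"],
        ["jumps", "over", "the", "lazy", "dog"],
    ]

    vocab_size = 7_000  # approximate vocabulary size mentioned above

    # Rule of thumb (an assumption, not an established Word2Vec
    # guideline): dimension ~ fourth root of the vocabulary size.
    heuristic_dim = max(1, round(vocab_size ** 0.25))  # ~9 for 7,000 words

    model = Word2Vec(
        sentences,
        vector_size=100,  # the embedding dimension being asked about (default 100)
        window=5,
        min_count=1,      # keep every token in this tiny toy corpus
        workers=4,
    )
    print(model.wv["fox"].shape)  # (100,)

Note that most widely distributed pretrained word vectors fall in the 50 to 300 dimension range, so the fourth-root heuristic above should be treated as a lower-bound starting point rather than a recommendation.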