Data Science Asked on April 29, 2021
I am unsure how to represent the relationships between words in a text.
Let's suppose I have a sentence like this:
Angela Merkel is a German politician who has been Chancellor of Germany since 2005.
What I would expect is a connection between Angela and Merkel (Angela is the first name, Merkel the surname), and connections between that person and German, politician, and Chancellor.
I read about the use of word2vec to determine the semantic structure of a sentence.
My question is therefore whether this model allows me to determine such a semantic structure, or whether another method would be better suited.
There are a few models that are trained to analyse a sentence and classify each token (or recognise dependencies between words).
Part-of-speech tagging (POS) models assign each word its grammatical function (noun, verb, ...).
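To make the POS idea concrete, here is a hand-annotated sketch of the kind of output a tagger produces for the example sentence. The tags follow the Universal POS tag set; no model is actually run here, the annotations are written out by hand for illustration.

```python
# Hand-annotated illustration of POS tagger output (Universal POS tags).
# No model is run; a real tagger (e.g. spaCy or NLTK) would produce
# one such (token, tag) pair per token.
pos_tags = [
    ("Angela", "PROPN"),     # proper noun
    ("Merkel", "PROPN"),     # proper noun
    ("is", "AUX"),           # auxiliary verb
    ("a", "DET"),            # determiner
    ("German", "ADJ"),       # adjective
    ("politician", "NOUN"),  # common noun
]

# A tagger's job is to map each token to exactly one such tag:
for token, tag in pos_tags:
    print(f"{token}\t{tag}")
```

Note that the tagger already distinguishes the proper nouns (Angela, Merkel) from the descriptive words (German, politician), which is a first step toward the structure you describe.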
Dependency parsing (DP) models will recognize which words go together (in this case Angela and Merkel, for instance).
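A dependency parse can likewise be sketched by hand. Each token points at a head token with a labelled relation; the labels below follow spaCy's convention (e.g. `compound` for the Angela–Merkel link), and no parser is actually run.

```python
# Hand-annotated illustration of a dependency parse for the shortened
# example sentence. (dependent_index, head_index, relation); labels are
# spaCy-style, no parser is run here.
tokens = ["Angela", "Merkel", "is", "a", "German", "politician"]

dependencies = [
    (0, 1, "compound"),  # "Angela" attaches to "Merkel" -> the full name
    (1, 5, "nsubj"),     # "Merkel" is the subject of "politician"
    (2, 5, "cop"),       # "is" is a copula linking subject and predicate
    (3, 5, "det"),       # "a" determines "politician"
    (4, 5, "amod"),      # "German" is an adjectival modifier of "politician"
]

for dep, head, rel in dependencies:
    print(f"{tokens[dep]} --{rel}--> {tokens[head]}")
```

This is exactly the structure you asked for: the `compound` edge joins the name parts, and the `nsubj`/`amod` edges connect the person to politician and German.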
Named entity recognition (NER) models will, for instance, recognise that "Angela Merkel" is a person and "Germany" is a country.
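And a hand-annotated sketch of NER output for the full sentence, using spaCy-style entity labels (again, no model is run; the spans are written out by hand):

```python
# Hand-annotated illustration of NER output with spaCy-style labels.
text = ("Angela Merkel is a German politician who has been "
        "Chancellor of Germany since 2005.")

# (surface form, entity label)
entities = [
    ("Angela Merkel", "PERSON"),
    ("German", "NORP"),   # nationality / religious / political group
    ("Germany", "GPE"),   # geopolitical entity
    ("2005", "DATE"),
]

# Each entity span can be located in the original text by character offset:
for surface, label in entities:
    start = text.index(surface)
    print(f"{surface!r} [{start}:{start + len(surface)}] -> {label}")
```

In practice you would get all three analyses (POS, dependencies, entities) from a single pipeline pass, e.g. spaCy's `nlp(text)`, and could then combine them into the word-relationship graph you have in mind.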
Correct answer by RonsenbergVI on April 29, 2021