Word embeddings: Beyond word2vec. Word embeddings are a convenient and efficient way to extract semantic information from large collections of textual or text-like data. We compare the performance of embedding techniques such as word2vec and GloVe, as well as fastText and StarSpace, on NLP problems such as metaphor and sarcasm detection.
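As a minimal sketch of the kind of comparison discussed, the snippet below trains word2vec and fastText embeddings on a toy corpus with gensim and queries nearest neighbours; the corpus, hyperparameters, and query word are illustrative assumptions, not the talk's actual setup.

```python
from gensim.models import Word2Vec, FastText

# Toy tokenised corpus (assumption; a real comparison would use a large text collection)
corpus = [
    ["word", "embeddings", "capture", "semantic", "information"],
    ["sarcasm", "and", "metaphor", "detection", "use", "embeddings"],
    ["fasttext", "uses", "subword", "information"],
]

# Train word2vec and fastText models with the same (illustrative) hyperparameters
w2v = Word2Vec(sentences=corpus, vector_size=50, window=3, min_count=1, epochs=50)
ft = FastText(sentences=corpus, vector_size=50, window=3, min_count=1, epochs=50)

# Compare nearest neighbours of the same word under each model
print(w2v.wv.most_similar("embeddings", topn=3))
print(ft.wv.most_similar("embeddings", topn=3))
```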
- Kostas Perifanos, Lead Machine Learning Engineer
Word embeddings: Beyond word2vec
In-person
Saturday 21st April 2018