I would not consider word embeddings to be state of the art anymore.
Word embeddings are to BERT what TF-IDF was to word embeddings when they first came out. Have a look at the BERT model, which was just recently published and is outperforming on all kinds of NLP tasks with one main architecture.
I would consider the BERT language model two levels above word embeddings, since it produces fully context-sensitive embeddings that depend on the text to both the left and right of the word simultaneously.
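To make the distinction concrete, here is a minimal toy sketch (not BERT, and not any real library's API) contrasting the two ideas: a static embedding assigns one fixed vector per word, while a contextual embedding gives the same word different vectors depending on the surrounding sentence. The 3-d vectors and the `contextualize`-by-averaging step are made up for illustration; BERT uses bidirectional attention instead of a simple neighbor average.

```python
# Illustrative sketch only: static vs. context-sensitive embeddings.
# All vectors below are hypothetical 3-d toy values.

STATIC = {
    "river": [0.9, 0.1, 0.0],
    "money": [0.0, 0.9, 0.1],
    "bank":  [0.5, 0.5, 0.0],
    "the":   [0.1, 0.1, 0.1],
}

def static_embed(word):
    """Static lookup: the same vector regardless of context."""
    return STATIC[word]

def contextual_embed(sentence, index, alpha=0.5):
    """Mix the target word's vector with the mean of its context
    (words to both the left and right), so the output depends on
    the whole sentence -- a crude stand-in for BERT's attention."""
    target = STATIC[sentence[index]]
    context = [STATIC[w] for i, w in enumerate(sentence) if i != index]
    mean = [sum(vals) / len(context) for vals in zip(*context)]
    return [alpha * t + (1 - alpha) * m for t, m in zip(target, mean)]

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]

# Static: "bank" is identical in both sentences.
print(static_embed("bank") == static_embed("bank"))   # True

# Contextual: "bank" gets different vectors in the two sentences.
print(contextual_embed(s1, 2) != contextual_embed(s2, 2))  # True
```

The point is only that the contextual function's output for "bank" changes with its neighbors, which is the property the comment above attributes to BERT.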
mlucy|7 years ago
(Note: I'm fairly biased, since I work on https://www.basilica.ai, which among other things makes sentence embeddings available over a REST interface.)
yazr|7 years ago
(I do DRL but not NLP)
I sometimes read these DL papers and the requirements are not really feasible if you have to re-implement them in a modified domain.