# Word Embedding

A Word Embedding is a mapping from words (in context) into a real-valued vector space, used in NLP.

Also see One-Hot-Encoding.

It operates on the Tokenization of the text: each token is mapped to one vector.
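A minimal sketch of this lookup, assuming a toy vocabulary, naive whitespace tokenization, and randomly initialized vectors (in practice the vectors come from a trained model such as word2vec, GloVe, or fastText):

```python
import numpy as np

# Hypothetical toy vocabulary and a randomly initialized embedding matrix.
vocab = {"the": 0, "cat": 1, "sat": 2, "on": 3, "mat": 4}
embedding_dim = 8
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(text: str) -> np.ndarray:
    """Tokenize (naively, by whitespace) and look up one vector per token."""
    tokens = text.lower().split()
    return np.stack([embeddings[vocab[t]] for t in tokens])

vectors = embed("the cat sat on the mat")
print(vectors.shape)  # (6, 8): one 8-dimensional vector per token
```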

## Semantic Word Embedding

Word embeddings that take context into account by placing vectors in the vector space in such a way that relations among words are reflected in geometric relations among their vectors. That way, word relations can even be encoded as vector differences (e.g. king - man + woman ≈ queen), as sketched below.
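A minimal sketch of such an analogy, assuming made-up 3-dimensional vectors chosen so the "male → female" relation is the same difference vector for both pairs (real embeddings from word2vec or GloVe have hundreds of dimensions and only approximate this):

```python
import numpy as np

# Hypothetical vectors; real ones would come from a trained model.
vectors = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "queen": np.array([0.8, 0.1, 0.9]),
    "man":   np.array([0.2, 0.9, 0.1]),
    "woman": np.array([0.2, 0.1, 0.9]),
    "apple": np.array([0.9, 0.5, 0.5]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The relation "male -> female" shows up as a vector difference:
# king - man + woman should land near queen.
target = vectors["king"] - vectors["man"] + vectors["woman"]
best = max((w for w in vectors if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(target, vectors[w]))
print(best)  # queen
```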