Word Embedding

Word embedding is the process of mapping words or phrases to vectors of real numbers.
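To make the idea concrete, the sketch below shows what such a mapping looks like in practice: a few words associated with small real-valued vectors, compared with cosine similarity. The vectors and words are made-up illustrative assumptions, not the output of any particular training method.

```python
import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# These numbers are invented for illustration; in practice the vectors
# are learned from large text corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(u, v):
    """Similarity of two embedding vectors (1.0 = same direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # relatively low
```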

Embedding algorithms are often grounded in distributional semantics, which estimates semantic similarity between linguistic items from their distributional properties in large samples of language data.
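One simple count-based way to realize this idea is to build a word-by-word co-occurrence matrix and reduce it with SVD, so that words appearing in similar contexts receive similar vectors. The sketch below is a minimal illustration under assumed parameters (a tiny toy corpus, a window of two words, two output dimensions), not a description of any specific library's implementation.

```python
import numpy as np

# Toy corpus; real distributional methods use much larger text samples.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a +/-2 word window around each token.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                counts[index[w], index[sent[j]]] += 1

# Reduce the count matrix with SVD to obtain dense 2-dimensional vectors.
u, s, _ = np.linalg.svd(counts, full_matrices=False)
embeddings = u[:, :2] * s[:2]

for w in ("cat", "dog", "mat"):
    print(w, embeddings[index[w]].round(2))
```

Because "cat" and "dog" occur in similar contexts in this toy corpus, their reduced vectors come out closer to each other than to those of words with different distributions.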