A personal understanding of "What is Word Embedding (詞嵌入)?"
First, here is the English definition from Wikipedia:
Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers. Conceptually it involves a mathematical embedding from a space with one dimension per word to a continuous vector space with a much lower dimension.
In other words, Word Embedding is the collective name for a family of language modeling and feature learning techniques in NLP; mathematically, it is the process of mapping each word from a high-dimensional vector space with one dimension per word into a low-dimensional continuous vector space.
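To make the contrast between "one dimension per word" and a low-dimensional continuous vector concrete, here is a minimal sketch (not from the original post). The tiny vocabulary, the embedding dimension of 3, the random embedding matrix, and the helper names one_hot and embed are all made up for illustration; a real model would learn the matrix values instead of sampling them.

```python
import numpy as np

# Toy vocabulary: in the "one dimension per word" view, each word is a
# one-hot vector whose length equals the vocabulary size.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    """High-dimensional, sparse representation: one dimension per word."""
    v = np.zeros(len(vocab))
    v[word_to_index[word]] = 1.0
    return v

# An embedding is just a lookup table: a |V| x d matrix of real numbers.
# Here d = 3 and the entries are random; a trained model would learn them.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), 3))

def embed(word):
    """Low-dimensional, dense representation: a row of the embedding matrix."""
    return embedding_matrix[word_to_index[word]]

print(one_hot("queen"))  # [0. 1. 0. 0.]  -- one dimension per word, mostly zeros
print(embed("queen"))    # three real numbers, e.g. something like [0.64 -0.32 ...]
```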
The name Embedding comes from the mathematical definition of an embedding:
In mathematics, an embedding (or imbedding) is one instance of some mathematical structure contained within another instance, such as a group that is a subgroup.
When some object X is said to be embedded in another object Y, the embedding is given by some injective and structure-preserving map f : X → Y. The precise meaning of "structure-preserving" depends on the kind of mathematical structure of which X and Y are instances. In the terminology of category theory, a structure-preserving map is called a morphism.
In essence, an embedding describes one structure being contained in another through a map. For example, we can embed the integers into the rational numbers: the integers form a set that is also a subset of the rationals, every integer corresponds to a unique element of the rationals (namely itself), and the properties that hold among the integers are preserved in the rationals. By the same reasoning, the rationals can be embedded into the reals.
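The integer-to-rational example can be written out as a map. The following is my own small formal sketch of "injective and structure-preserving", not part of the quoted text:

```latex
\[
  f : \mathbb{Z} \hookrightarrow \mathbb{Q}, \qquad f(n) = \frac{n}{1},
\]
\[
  f(m + n) = f(m) + f(n), \qquad
  f(m \cdot n) = f(m) \cdot f(n), \qquad
  m \neq n \implies f(m) \neq f(n).
\]
```

Here injectivity says distinct integers stay distinct inside the rationals, and the two equations say that addition and multiplication of integers are preserved by the map.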
References:
English Wikipedia
Source of the embedding example in the last paragraph