Word Embedding
A technique that maps words to dense numerical vectors that capture semantic relationships. Similar words have similar vectors, and relationships like analogy are encoded in vector arithmetic.
Why It Matters
Word embeddings are the foundation that enabled modern NLP. They showed that meaning could be captured mathematically, paving the way for today's language models.
Example
Word2Vec famously shows that vector('King') - vector('Man') + vector('Woman') ≈ vector('Queen'), capturing the gender relationship mathematically.
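The analogy arithmetic can be sketched with a toy example. The 2-d vectors below are hypothetical values chosen by hand so the analogy holds; real Word2Vec embeddings have hundreds of dimensions learned from large corpora, and the nearest-neighbor lookup is done over a full vocabulary.

```python
import numpy as np

# Hypothetical toy embeddings: axes roughly "royalty" and "maleness".
# Real learned vectors are high-dimensional and not interpretable per-axis.
vocab = {
    "king":  np.array([0.9, 0.9]),
    "queen": np.array([0.9, 0.1]),
    "man":   np.array([0.1, 0.9]),
    "woman": np.array([0.1, 0.1]),
}

def nearest(vec, exclude=()):
    """Return the vocabulary word closest to vec by cosine similarity."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in vocab if w not in exclude),
               key=lambda w: cos(vocab[w], vec))

# king - man + woman lands closest to queen.
target = vocab["king"] - vocab["man"] + vocab["woman"]
print(nearest(target, exclude={"king", "man", "woman"}))  # queen
```

Excluding the query words from the search mirrors standard practice: without it, the result is often one of the input words themselves, since they lie close to the target.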
Think of it like...
Like placing words on a map where distance represents meaning — 'happy' and 'joyful' are neighbors, while 'happy' and 'wrench' are on different continents.
Related Terms
Embedding
A numerical representation of data (text, images, etc.) as a vector of numbers in a high-dimensional space. Similar items are placed closer together in this space, enabling machines to understand semantic relationships.
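The "similar items are closer" property is usually measured with cosine similarity. A minimal sketch, using hypothetical 3-d vectors (real embeddings are learned, not hand-picked), echoing the map analogy above:

```python
import numpy as np

# Hypothetical embeddings: 'happy' and 'joyful' point in similar
# directions; 'wrench' points elsewhere in the space.
happy  = np.array([0.8, 0.6, 0.1])
joyful = np.array([0.7, 0.7, 0.2])
wrench = np.array([-0.5, 0.1, 0.9])

def cosine(a, b):
    """Cosine similarity: 1.0 for same direction, near 0 for unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(happy, joyful) > cosine(happy, wrench))  # True
```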
Representation Learning
The process of automatically discovering useful features or representations from raw data, rather than manually engineering them. Deep learning excels at learning hierarchical representations.