Machine Learning

Word Embedding

A technique that maps words to dense numerical vectors where semantic relationships are captured. Similar words have similar vectors, and relationships like analogy are encoded in vector arithmetic.
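The "similar words have similar vectors" property is usually measured with cosine similarity. A minimal sketch with hand-picked toy 3-dimensional vectors (illustrative values, not from a trained model):

```python
import numpy as np

# Toy 3-dimensional embeddings. The values are hand-picked for illustration;
# a real model learns hundreds of dimensions from large text corpora.
embeddings = {
    "happy":  np.array([0.90, 0.10, 0.00]),
    "joyful": np.array([0.85, 0.15, 0.05]),
    "wrench": np.array([0.00, 0.20, 0.95]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: near 1.0 means same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["happy"], embeddings["joyful"]))  # high, near 1
print(cosine_similarity(embeddings["happy"], embeddings["wrench"]))  # low, near 0
```

With trained embeddings the same computation ranks genuinely related words as nearest neighbors.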

Why It Matters

Word embeddings are the foundation that enabled modern NLP. They showed that meaning could be captured mathematically, paving the way for much of subsequent language AI.

Example

Word2Vec's famous demonstration that vector('King') - vector('Man') + vector('Woman') ≈ vector('Queen'), capturing the gender relationship mathematically.
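The analogy above can be sketched with toy 2-dimensional vectors, where axis 0 loosely encodes "royalty" and axis 1 "gender" (hand-picked for illustration; a trained Word2Vec model learns 100-300 uninterpreted dimensions):

```python
import numpy as np

# Toy vocabulary: axis 0 ~ "royalty", axis 1 ~ "gender" (illustrative only).
vocab = {
    "king":  np.array([1.0, 1.0]),
    "queen": np.array([1.0, 0.0]),
    "man":   np.array([0.0, 1.0]),
    "woman": np.array([0.0, 0.0]),
}

# king - man + woman: remove the "male" component, add the "female" one.
target = vocab["king"] - vocab["man"] + vocab["woman"]

# The answer is the nearest vocabulary word to the result,
# excluding the three query words themselves.
nearest = min(
    (w for w in vocab if w not in {"king", "man", "woman"}),
    key=lambda w: np.linalg.norm(vocab[w] - target),
)
print(nearest)  # queen
```

Libraries such as gensim expose the same operation over real trained vectors via a nearest-neighbor query on `positive` and `negative` word lists.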

Think of it like...

Like placing words on a map where distance represents meaning — 'happy' and 'joyful' are neighbors, while 'happy' and 'wrench' are on different continents.

Related Terms