Glossary · Beginner · 2013

Embedding

A way to represent a token or piece of text as a dense numerical vector that encodes its meaning.

- EN (English term): Embedding
- TR (Turkish term): Gömme (Embedding)

An embedding is a learned representation that maps a word, token, or even a whole document to a dense, high-dimensional vector. The idea, popularised by Word2Vec (2013) and GloVe (2014), is that semantically similar things end up close together in this space, which is why vector arithmetic like 'king' - 'man' + 'woman' ≈ 'queen' actually works, as the sketch below illustrates. In modern LLMs every token is mapped to an embedding before it enters the Transformer, and much of what the model knows is reflected in the geometry of this vector space. Today embeddings power not just language models but also cosine-similarity search, RAG (retrieval-augmented generation) pipelines, and recommendation systems.
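The analogy is easy to try by hand. Below is a minimal Python sketch using NumPy and a handful of invented 4-dimensional toy vectors; real embeddings are learned by a model and typically have hundreds or thousands of dimensions, so the values, the dimension labels, and the small vocabulary here are purely illustrative. It ranks the vocabulary by cosine similarity to the vector 'king' - 'man' + 'woman' and finds 'queen' closest.

```python
import numpy as np

# Hypothetical toy embeddings, hand-written for illustration only.
# Dimensions loosely stand for [royalty, masculinity, femininity, commonness];
# a real model learns these vectors and their axes are not interpretable labels.
vocab = {
    "king":  np.array([0.9, 0.8, 0.1, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8, 0.1]),
    "man":   np.array([0.1, 0.9, 0.1, 0.7]),
    "woman": np.array([0.1, 0.1, 0.9, 0.7]),
    "apple": np.array([0.0, 0.1, 0.1, 0.9]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The classic analogy: king - man + woman should land near queen.
target = vocab["king"] - vocab["man"] + vocab["woman"]

# Rank every word in the toy vocabulary by similarity to the analogy vector.
ranked = sorted(vocab.items(),
                key=lambda kv: cosine_similarity(target, kv[1]),
                reverse=True)
for word, vec in ranked:
    print(f"{word:6s} {cosine_similarity(target, vec):.3f}")
# With these toy values, 'queen' comes out on top (~0.99).
```

The same nearest-neighbour ranking, run over embeddings produced by a real model instead of toy vectors, is the core operation behind semantic search and the retrieval step of RAG pipelines.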