# Word2Vec Line by Line: Anatomy of Mikolov's 2013 Skip-Gram, CBOW, and Negative Sampling

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/word2vec-mikolov-2013-skip-gram-cbow-negative-sampling
> Updated: 2026-05-13T13:00:27.017Z
> Category: LLM Engineering
> Module: Module 7: Embedding Layer — The Vector Space of Meaning
**TLDR:** A line-by-line anatomy of the Mikolov 2013 paper: the architectural differences between Skip-Gram and CBOW, the computational bottleneck of the full softmax, hierarchical softmax (Huffman tree), negative sampling (Mikolov 2013b), subsampling of frequent words, and the dynamic context window. Includes a pure-Python implementation in 100 lines, a Gensim demo training Turkish word2vec embeddings, and a comparison with modern LLM embeddings.
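To preview the core idea covered below, here is a minimal, hedged sketch of a single Skip-Gram update with negative sampling: instead of a softmax over the whole vocabulary, the model runs binary logistic regression on the true (center, context) pair against a handful of sampled negatives. This is an illustrative NumPy sketch, not the article's 100-line implementation; all names (`sgns_step`, `W_in`, `W_out`) are hypothetical.

```python
import numpy as np

def sgns_step(W_in, W_out, center, context, neg_ids, lr=0.025):
    """One SGD step on the skip-gram negative-sampling loss.

    Maximizes log sigma(u_ctx . v_center) and log sigma(-u_neg . v_center)
    for each sampled negative. Illustrative sketch, not Mikolov's C code.
    """
    v = W_in[center]                               # center word input vector
    ids = np.array([context] + list(neg_ids))      # true context + negatives
    labels = np.zeros(len(ids))
    labels[0] = 1.0                                # 1 = real pair, 0 = noise
    U = W_out[ids]                                 # output vectors involved
    scores = 1.0 / (1.0 + np.exp(-U @ v))          # sigmoid(u . v) per pair
    g = scores - labels                            # grad of log-loss wrt u.v
    grad_v = g @ U                                 # accumulate grad for v first
    W_out[ids] -= lr * np.outer(g, v)              # update context/negative vecs
    W_in[center] -= lr * grad_v                    # update center vector
    # return the loss at the point where the step was taken
    return float(-np.log(scores[0] + 1e-9)
                 - np.sum(np.log(1.0 - scores[1:] + 1e-9)))

rng = np.random.default_rng(0)
V, D = 50, 8                                       # toy vocabulary and dim
W_in = rng.normal(0, 0.1, (V, D))
W_out = np.zeros((V, D))                           # word2vec initializes output to 0
loss_before = sgns_step(W_in, W_out, center=3, context=7, neg_ids=[11, 21, 31])
for _ in range(50):
    loss = sgns_step(W_in, W_out, center=3, context=7, neg_ids=[11, 21, 31])
print(loss_before, loss)
```

Repeating the same update drives the loss down, showing that only `1 + k` dot products per step are needed rather than a `V`-way softmax — the key cost reduction the article dissects.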

