# WordPiece (BERT): Likelihood-Based Merges and Quiet Differences from BPE

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/wordpiece-bert-likelihood-merges
> Updated: 2026-05-13T13:00:26.254Z
> Category: LLM Engineering
> Module: Module 6: Tokenization Microsurgery

**TLDR:** The WordPiece algorithm, from Schuster & Nakajima (2012) to BERT (2018): a likelihood-based merge score instead of raw frequency, the `##` continuation-prefix convention for word-internal subwords, the [UNK]/[CLS]/[SEP] special tokens, and the quiet but critical differences from BPE. Includes practical training with HuggingFace Tokenizers, a BERT-base-Turkish-cased example, and vocabulary design.
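The merge-score difference mentioned in the TLDR can be sketched with a toy example (the word counts below are hypothetical, and real WordPiece training runs on a full normalized corpus): BPE merges the most frequent adjacent symbol pair, while WordPiece divides that frequency by the frequencies of the two parts, approximating the gain in corpus likelihood from the merge.

```python
from collections import Counter

# Hypothetical toy word frequencies; symbols are single characters here.
word_freq = {"hug": 10, "pug": 5, "pun": 12, "bun": 4, "hugs": 5}

sym_freq = Counter()
pair_freq = Counter()
for word, freq in word_freq.items():
    for sym in word:
        sym_freq[sym] += freq
    for a, b in zip(word, word[1:]):
        pair_freq[(a, b)] += freq

# BPE picks the most frequent pair; WordPiece scores a pair as
#   score(A, B) = freq(AB) / (freq(A) * freq(B)),
# so merges between rare symbols that almost always co-occur win out.
def wordpiece_score(pair):
    a, b = pair
    return pair_freq[pair] / (sym_freq[a] * sym_freq[b])

bpe_pick = max(pair_freq, key=pair_freq.get)
wp_pick = max(pair_freq, key=wordpiece_score)
print(bpe_pick)  # ('u', 'g') -> highest raw frequency
print(wp_pick)   # ('g', 's') -> rare parts make the merge more informative
```

Here `('u', 'g')` appears 20 times and wins under BPE, but WordPiece prefers `('g', 's')` because `s` occurs only inside that pair, so the normalized score is higher.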

