# Encoder-Decoder RNN

> Source: https://sukruyusufkaya.com/en/glossary/encoder-decoder-rnn
> Updated: 2026-05-13T20:50:42.666Z
> Type: glossary
> Category: derin-ogrenme (deep learning)

**TLDR:** A classical sequence-to-sequence architecture that compresses an input sequence into a fixed-size context vector and generates an output sequence from it.

<p>The encoder-decoder RNN architecture was the foundational design for sequence-to-sequence (seq2seq) tasks before attention became dominant. The encoder reads the input sequence step by step and compresses it into a fixed-size context vector, typically its final hidden state; the decoder then generates the output sequence one token at a time, conditioned on that summary. While this design marked a major breakthrough in sequence transformation tasks such as machine translation, forcing all information about the input through a single fixed-length vector creates a bottleneck that degrades quality on long sequences. The rise of attention mechanisms was in large part a response to this limitation.</p>
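The two-phase structure above can be sketched in a few lines of NumPy. This is an illustrative toy, not a trained model: the vanilla-RNN cell, the zero start vector for the decoder, and all dimensions are assumptions chosen for clarity. Note how `encode` returns a single vector of size `hid` regardless of input length; that is the fixed-size bottleneck the paragraph describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_step(x, h, Wx, Wh, b):
    """One vanilla RNN step: h_t = tanh(Wx @ x_t + Wh @ h_{t-1} + b)."""
    return np.tanh(Wx @ x + Wh @ h + b)

def encode(xs, Wx, Wh, b, hid):
    """Fold a variable-length input sequence into one context vector."""
    h = np.zeros(hid)
    for x in xs:
        h = rnn_step(x, h, Wx, Wh, b)
    return h  # the bottleneck: same size no matter how long xs is

def decode(context, n_steps, Wx, Wh, b, Wo):
    """Generate n_steps outputs from the context, feeding each back in."""
    h = context
    y = np.zeros(Wx.shape[1])  # zero "start token" (an illustrative choice)
    outputs = []
    for _ in range(n_steps):
        h = rnn_step(y, h, Wx, Wh, b)
        y = Wo @ h
        outputs.append(y)
    return outputs

# Toy dimensions with untrained random weights, purely for shape checking.
in_dim, hid, out_dim = 3, 5, 3
enc = (rng.normal(0, 0.1, (hid, in_dim)), rng.normal(0, 0.1, (hid, hid)), np.zeros(hid))
dec = (rng.normal(0, 0.1, (hid, out_dim)), rng.normal(0, 0.1, (hid, hid)), np.zeros(hid))
Wo = rng.normal(0, 0.1, (out_dim, hid))

xs = [rng.normal(size=in_dim) for _ in range(7)]  # input sequence of length 7
context = encode(xs, *enc, hid)                    # one vector of size hid
ys = decode(context, 4, *dec, Wo)                  # output sequence of length 4
```

Because input and output lengths are decoupled (7 in, 4 out here), the same design handles translation-style tasks where lengths differ, which is exactly what made seq2seq attractive before attention.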