# Custom autograd.Function and PyTorch Internals: Write Your Own Gradients

> Source: https://sukruyusufkaya.com/en/learn/llm-muhendisligi/custom-autograd-function-pytorch-internals
> Updated: 2026-05-13T13:00:23.992Z
> Category: LLM Engineering
> Module: Module 2: Before PyTorch — NumPy and Autodiff from Scratch
**TLDR:** Extending PyTorch autograd with `torch.autograd.Function` subclasses: writing custom forward/backward passes, saving state via `ctx`, validating gradients with `gradcheck`, wrapping custom CUDA/Triton kernels (preview), a mini FlashAttention block-matmul implementation, and second-order gradients with `gradgradcheck`.
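As a minimal sketch of the core ideas the TLDR lists, here is a custom `torch.autograd.Function` for `exp`: the forward stashes its output through `ctx.save_for_backward`, the backward reuses it, and `torch.autograd.gradcheck` verifies the analytic gradient against finite differences (the function choice and tensor shapes are illustrative, not from the article):

```python
import torch
from torch.autograd import gradcheck

class MyExp(torch.autograd.Function):
    """Custom exp: forward saves its output so backward can reuse it."""

    @staticmethod
    def forward(ctx, x):
        y = torch.exp(x)
        ctx.save_for_backward(y)  # stash state for the backward pass via ctx
        return y

    @staticmethod
    def backward(ctx, grad_out):
        (y,) = ctx.saved_tensors
        # d/dx exp(x) = exp(x), chained with the incoming gradient
        return grad_out * y

# gradcheck compares the analytic backward against finite differences;
# double precision is required for the numerical comparison to be reliable.
x = torch.randn(4, dtype=torch.double, requires_grad=True)
assert gradcheck(MyExp.apply, (x,), eps=1e-6, atol=1e-4)
```

Note that `Function` subclasses are invoked through `.apply` rather than instantiated, and `forward`/`backward` are `@staticmethod`s that communicate only through `ctx`.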

