
Yes, perhaps the below helps. It's over my head, but...

https://courses.grainger.illinois.edu/ece448/sp2023/slides/l...

From another source:

"Backpropagation Through Time (BPTT) is an adaptation of backpropagation used for training recurrent neural networks (RNNs), which are designed to process sequences of data and have internal memory. Because the output at a given time step can depend on inputs from previous time steps, the forward pass involves unfolding the RNN through time, which essentially converts it into a deep feedforward network with shared weights across the time steps. The error at each time step is computed, and BPTT then calculates gradients across the entire unfolded sequence, propagating the error not just backward through the layers but also backward through the time steps. The network weights are then updated so as to reduce the error over all time steps. This is computationally more involved than standard backpropagation and has its own challenges, such as exploding or vanishing gradients."
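
To make the unfolding concrete, here's a minimal NumPy sketch of BPTT for a vanilla RNN on a toy next-value task. All names, sizes, the random data, and the learning rate are illustrative assumptions on my part, not taken from the linked slides:

    import numpy as np

    rng = np.random.default_rng(0)
    H, T = 8, 5                       # hidden size, sequence length
    Wxh = rng.normal(0.0, 0.1, (H, 1))  # input -> hidden
    Whh = rng.normal(0.0, 0.1, (H, H))  # hidden -> hidden (shared across time)
    Why = rng.normal(0.0, 0.1, (1, H))  # hidden -> output

    xs = rng.normal(size=(T, 1, 1))   # toy input sequence
    ys = rng.normal(size=(T, 1, 1))   # toy targets, one per time step

    # Forward pass: unfold the RNN through time, keeping every hidden state.
    hs = {-1: np.zeros((H, 1))}
    preds = {}
    loss = 0.0
    for t in range(T):
        hs[t] = np.tanh(Wxh @ xs[t] + Whh @ hs[t - 1])
        preds[t] = Why @ hs[t]
        loss += 0.5 * (preds[t] - ys[t]).item() ** 2

    # Backward pass: propagate error through layers AND back through time.
    dWxh, dWhh, dWhy = np.zeros_like(Wxh), np.zeros_like(Whh), np.zeros_like(Why)
    dh_next = np.zeros((H, 1))        # gradient flowing in from step t+1
    for t in reversed(range(T)):
        dy = preds[t] - ys[t]         # dLoss/dOutput at step t
        dWhy += dy @ hs[t].T
        dh = Why.T @ dy + dh_next     # error from this step's output
                                      # plus error arriving from step t+1
        dhraw = (1.0 - hs[t] ** 2) * dh   # back through the tanh
        dWxh += dhraw @ xs[t].T
        dWhh += dhraw @ hs[t - 1].T   # shared-weight gradients accumulate
        dh_next = Whh.T @ dhraw       # pass error one step further back

    # One gradient-descent update on the shared weights.
    for W, dW in ((Wxh, dWxh), (Whh, dWhh), (Why, dWhy)):
        W -= 0.01 * dW

The BPTT-specific part is dh_next: it carries the gradient from step t+1 back into step t, and that repeated multiplication by Whh.T in the backward loop is exactly where gradients vanish or explode over long sequences.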



