CarlosGG's Knowledge Garden 🪴


Gated Recurrent Units (GRUs)

Last updated Apr 11, 2022

A GRU (Gated Recurrent Unit) aims to mitigate the vanishing gradient problem that affects standard recurrent neural networks, and can be viewed as a simplified variation on the LSTM. A GRU drops the LSTM's separate cell state and uses the hidden state alone to carry information forward. It also has only two gates: a reset gate and an update gate.
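A minimal sketch of one GRU step in NumPy may make the two gates concrete. This follows the common formulation where the update gate z interpolates between the old hidden state and a candidate state, and the reset gate r scales the old state inside the candidate computation; all names (`gru_cell`, `W_*`, `U_*`, `b_*`) are illustrative, not from any particular library, and note that some references swap the roles of z and (1 - z) in the final interpolation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: returns the new hidden state.

    x: input vector (d_in,); h: previous hidden state (d_h,).
    params holds weights W_* (d_h, d_in), U_* (d_h, d_h), and
    biases b_* (d_h,) for the update (z), reset (r), and
    candidate (c) computations.
    """
    z = sigmoid(params["W_z"] @ x + params["U_z"] @ h + params["b_z"])  # update gate
    r = sigmoid(params["W_r"] @ x + params["U_r"] @ h + params["b_r"])  # reset gate
    # Candidate state: the reset gate decides how much of the old
    # hidden state feeds into the new candidate.
    c = np.tanh(params["W_c"] @ x + params["U_c"] @ (r * h) + params["b_c"])
    # Interpolate between the old state and the candidate; no
    # separate cell state is carried, unlike in an LSTM.
    return (1.0 - z) * h + z * c

# Toy usage: random small weights, 5-step input sequence.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
params = {}
for g in ("z", "r", "c"):
    params[f"W_{g}"] = rng.normal(size=(d_h, d_in)) * 0.1
    params[f"U_{g}"] = rng.normal(size=(d_h, d_h)) * 0.1
    params[f"b_{g}"] = np.zeros(d_h)

h = np.zeros(d_h)
for x in rng.normal(size=(5, d_in)):
    h = gru_cell(x, h, params)
```

Because the hidden state is always a convex combination of its previous value and a tanh-bounded candidate, every entry of `h` stays in (-1, 1), which is part of why gradients flow more stably than in a plain RNN.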

# Resources

# References