What’s all the hype with Transformers? Part 2: Memory for RNNs

2024-10-01

Contents

- Introduction
- Recurrent Neural Networks
- Towards a Solution: Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM)

Introduction

In the previous post, we highlighted some of the issues we are faced with when attempting to process natural language