About Tilman Sattler

The author has not provided any details yet.
So far, Tilman Sattler has written 2 blog posts.
24.09.2024

What’s all the hype with Transformers? Part 2: Memory for RNNs


Contents: Introduction · Recurrent Neural Networks · Towards a Solution: Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM)

Introduction: In the previous post, we highlighted some of the issues we are faced with when attempting to process natural language …

20.06.2024

What’s all the Hype with Transformers? The Trouble with Natural Language Processing


Contents: Introduction · The Troubles with Natural Language Processing · Traditional Neural Networks · Recurrent Neural Networks · Outlook

Introduction: This post tries to highlight the practical differences between traditional RNNs and the new Transformer Architecture. We are not going to …
