24.09.2024

What’s all the hype with Transformers? Part 2: Memory for RNNs


Contents: Introduction · Recurrent Neural Networks · Towards a Solution: Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM). Introduction: In the previous post, we highlighted some of the issues we are faced with when attempting to process natural language …

20.06.2024

What’s all the Hype with Transformers? The Trouble with Natural Language Processing


Contents: Introduction · The Troubles with Natural Language Processing · Traditional Neural Networks · Recurrent Neural Networks · Outlook. Introduction: This post tries to highlight the practical differences between traditional RNNs and the new Transformer architecture. We are not going to …

29.01.2024

Personalizing AI – How to Build Apps Based on Your Own Data


Generative AI is on everyone's lips. Both software providers and companies with in-house developers have started to implement intelligent applications and functions based on generative AI models, such as those from OpenAI. One of the best ways …

09.10.2023

Expanding Our Portfolio with AI


Expanding our portfolio with AI, plus a further Microsoft certification for AI. New offerings for increasing developer productivity with GitHub Copilot and for building intelligent applications with Azure OpenAI.
