From the course: The AI Ecosystem for Developers: Models, Datasets, and APIs
NLP architectures: RNNs and transformers
- [Instructor] Natural language processing, NLP, is the subfield of AI focused on sequences like text and speech. AI architectures in this domain are designed to model sequential dependencies, making them essential for tasks like language modeling, translation, and text generation. Some of the most popular model architectures used in NLP include recurrent neural networks, RNNs. RNNs are a class of neural networks designed for sequential data, where previous inputs influence future predictions. The components of RNNs include the input layer, which processes sequences like words or full names. The hidden state stores memory of previous inputs, allowing the model to retain context. The output layer produces predictions, such as the next word in a sentence. Recurrent connections connect the hidden layer to itself, allowing information to persist across time steps. RNNs are used for language modeling, predicting the next word in a sequence, machine translation, translating text from one language to…
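The four components described above (input layer, hidden state, output layer, and recurrent connections) can be sketched as a minimal vanilla RNN cell. This is an illustrative example, not code from the course; the class name, weight names (`W_xh`, `W_hh`, `W_hy`), and dimensions are assumptions chosen for clarity.

```python
import numpy as np

class SimpleRNN:
    """Minimal vanilla RNN cell (illustrative sketch; names and sizes are assumed)."""

    def __init__(self, input_size, hidden_size, output_size, seed=0):
        rng = np.random.default_rng(seed)
        # Input-to-hidden, hidden-to-hidden (the recurrent connection),
        # and hidden-to-output weight matrices.
        self.W_xh = rng.normal(0, 0.1, (hidden_size, input_size))
        self.W_hh = rng.normal(0, 0.1, (hidden_size, hidden_size))
        self.W_hy = rng.normal(0, 0.1, (output_size, hidden_size))
        self.b_h = np.zeros(hidden_size)
        self.b_y = np.zeros(output_size)
        self.hidden_size = hidden_size

    def forward(self, inputs):
        """Run a sequence through the cell; return per-step outputs and final state."""
        h = np.zeros(self.hidden_size)  # hidden state: memory of previous inputs
        outputs = []
        for x_t in inputs:
            # Recurrent connection: the new state depends on both the
            # current input and the previous hidden state.
            h = np.tanh(self.W_xh @ x_t + self.W_hh @ h + self.b_h)
            # Output layer: e.g. scores over the vocabulary for the next word.
            y_t = self.W_hy @ h + self.b_y
            outputs.append(y_t)
        return outputs, h

# Usage: a sequence of 4 one-hot "word" vectors over a 5-word vocabulary.
rnn = SimpleRNN(input_size=5, hidden_size=8, output_size=5)
sequence = [np.eye(5)[i] for i in [0, 2, 1, 4]]
outputs, final_h = rnn.forward(sequence)
print(len(outputs), outputs[0].shape, final_h.shape)  # 4 (5,) (8,)
```

Because the same hidden state `h` is threaded through every time step, information from early inputs can influence later predictions, which is exactly the sequential-dependency modeling the lecture describes.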
Contents
- Introduction to AI models and architecture (5m 11s)
- NLP architectures: RNNs and transformers (5m 49s)
- Computer vision architectures: CNNs and vision transformers (6m 25s)
- Generative architectures: Diffusion and GANs (6m 10s)
- Multimodal architectures: CLIP and Flamingo (5m 29s)
- Efficient architectures (7m 32s)