From the course: The AI Ecosystem for Developers: Models, Datasets, and APIs
AI computing infrastructure
- [Instructor] If you've worked only with basic software systems, you probably haven't thought much about computing infrastructure. Training complex AI models, however, involves billions of parameters and massive datasets, which demand substantial computational power. In fact, one key reason AI has become practical and scalable in recent years is the evolution of computing infrastructure that can efficiently handle the mathematical operations machine learning requires. AI models, especially deep learning models, need immense computational resources for the complex mathematics involved in tasks like training neural networks. Specialized hardware accelerators such as GPUs (graphics processing units) and TPUs (tensor processing units) are designed to accelerate these operations, making it possible to train models faster and more efficiently than on traditional CPUs. Other evolving accelerators are NPUs, neural processing units…
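To get a feel for why these workloads outgrow a CPU, it helps to count the arithmetic involved. The sketch below is a back-of-the-envelope FLOP estimate for a stack of dense layers; the layer widths and batch size are hypothetical numbers chosen for illustration, not figures from the course.

```python
# Illustrative sketch: why training demands accelerator hardware.
# A dense layer mapping n_in inputs to n_out outputs performs roughly
# 2 * n_in * n_out floating-point operations (one multiply and one add
# per weight) for a single forward pass of one sample.

def dense_layer_flops(n_in: int, n_out: int) -> int:
    """Approximate FLOPs for one forward pass through one dense layer."""
    return 2 * n_in * n_out

def model_flops(layer_sizes: list[int], batch_size: int) -> int:
    """Total forward-pass FLOPs for a stack of dense layers over a batch."""
    per_sample = sum(
        dense_layer_flops(a, b) for a, b in zip(layer_sizes, layer_sizes[1:])
    )
    return per_sample * batch_size

# Hypothetical widths loosely modeled on one transformer feed-forward block:
total = model_flops([4096, 16384, 4096], batch_size=32)
print(f"{total:,} FLOPs per forward pass")
```

Even this single small block lands in the billions of operations per forward pass, and training repeats forward and backward passes over millions of batches. Because almost all of that work is independent multiply-accumulates, massively parallel hardware like GPUs and TPUs can execute it orders of magnitude faster than a general-purpose CPU.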
Contents
- AI development tools and frameworks: IDEs (2m 53s)
- AI development tools and frameworks: ML frameworks (1m 51s)
- AI development tools and frameworks: Debugging and versioning (6m 51s)
- AI development tools and frameworks: Data annotation (5m 19s)
- AI computing infrastructure (6m 56s)
- AI research platforms (6m 9s)
- AI model rankings: Leaderboards, benchmarks, and evaluation trends (9m 31s)
- AI interoperability standards: Model Context Protocol (MCP) (5m 45s)