From the course: The AI Ecosystem for Developers: Models, Datasets, and APIs

AI computing infrastructure

- [Instructor] If you've mostly interacted with basic software systems, you probably don't think much about the computing infrastructure. However, training complex AI models involves billions of parameters and massive data sets, which demand substantial computational power. In fact, one of the key reasons AI has become practical and scalable in recent years is the evolution of computing infrastructure that can efficiently handle the mathematical operations required for machine learning. AI models, especially deep learning models, require immense computational resources for the complex mathematical operations involved in tasks like training neural networks. Specialized hardware accelerators such as GPUs (graphics processing units) and TPUs (tensor processing units) are designed specifically to accelerate these operations, making it possible to train models faster and more efficiently than on traditional CPUs. Other evolving accelerators are NPUs (neural processing units)…
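To make that concrete, here is a minimal sketch (not from the course) of how a developer might target whichever accelerator is available from code. It assumes PyTorch is installed; the framework choice and the matrix sizes are illustrative assumptions, not something the course prescribes.

```python
import torch

# Pick the fastest available accelerator: a CUDA GPU, Apple Silicon (MPS),
# or fall back to the CPU if no accelerator is present.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

# A large matrix multiplication -- the kind of operation GPUs and TPUs
# are built to run far faster than a general-purpose CPU.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # executes on the accelerator if one was found

print(f"Ran a {a.shape[0]}x{a.shape[1]} matrix multiplication on: {device}")
```

The same device-selection pattern carries over to training loops: the model and each batch of data are moved to the chosen device, and the heavy linear-algebra work runs on the accelerator.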