From the course: Computer Vision for Data Scientists
Activation functions
- [Instructor] Activation functions are critical in neural networks. They introduce non-linearity into the network, enabling it to learn complex relationships and patterns in the input data. Without activation functions, neural networks would be limited to modeling linear relationships, which is insufficient for solving complex real-world problems. Let's examine some common activation functions and their properties. First is ReLU, which stands for rectified linear unit. The formula for ReLU, as shown on the graph, is f(x) = max(0, x). So what does that mean? Any value less than zero gets clipped to zero, and we keep only the values that are greater than zero. So ReLU is a piecewise linear function: it sets all negative input values to zero and retains all positive input values. The ReLU activation is simple and computationally efficient. By keeping positive values…
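As a rough illustration (not part of the course materials), a minimal NumPy sketch of the ReLU behavior described above might look like this:

```python
import numpy as np

def relu(x):
    """Rectified linear unit: clips negative values to zero, keeps positive values unchanged."""
    return np.maximum(0, x)

# Example: negative inputs become 0, positive inputs pass through
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

In practice, deep learning frameworks provide ReLU as a built-in layer or function (for example, torch.nn.ReLU in PyTorch), so you rarely need to implement it yourself.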