From the course: Certified Analytics Professional (CAP) Cert Prep
Recalibrating the model through validation
- [Instructor] Recalibrating analytics models is a necessary and natural step after tracking them for potential deterioration. Validation is a crucial part of recalibration because it allows us to measure performance improvements after adjusting a model for better outcomes. In predictive analytics, we divide data into training and test sets. As its name suggests, the training set is used to train a machine learning (ML) algorithm, while the test set is used to validate the performance of the resulting ML model. We call this approach holdout because a subset of the data is held out for testing. There is a trade-off in this method: the more data you use for training, the less is left for testing. Also, there is only a single opportunity to test the model. Cross-validation is a good way to overcome these limitations. It partitions a dataset into K segments, also called folds. K here is a placeholder for a natural…
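The holdout and k-fold ideas above can be sketched in code. This is a minimal, self-contained illustration, not the instructor's implementation; the function names (`kfold_indices`, `cross_validate`) and the trivial mean-predictor model are hypothetical choices made for the example. In practice you would typically use a library such as scikit-learn's `KFold`.

```python
import random

def kfold_indices(n, k, seed=0):
    """Shuffle indices 0..n-1 and partition them into k roughly equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    # First (n % k) folds get one extra item so every index is used exactly once.
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(idx[start:start + size])
        start += size
    return folds

def cross_validate(xs, ys, k, fit, score):
    """Train on k-1 folds, test on the held-out fold, k times; return the mean score."""
    folds = kfold_indices(len(xs), k)
    scores = []
    for i, test_idx in enumerate(folds):
        train_idx = [j for f_i, fold in enumerate(folds) if f_i != i for j in fold]
        model = fit([xs[j] for j in train_idx], [ys[j] for j in train_idx])
        scores.append(score(model, [xs[j] for j in test_idx],
                            [ys[j] for j in test_idx]))
    return sum(scores) / k

# Toy usage: a "model" that always predicts the mean of the training targets,
# scored by mean absolute error on the held-out fold.
xs = list(range(12))
ys = [2.0 * x for x in xs]
fit = lambda x_tr, y_tr: sum(y_tr) / len(y_tr)
score = lambda m, x_te, y_te: sum(abs(m - y) for y in y_te) / len(y_te)
mae = cross_validate(xs, ys, k=4, fit=fit, score=score)
```

Unlike a single holdout split, every observation is used for testing exactly once and for training k-1 times, which is why cross-validation addresses both limitations mentioned above.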
Contents
- Understanding model lifecycle management (3m 2s)
- Tracking model quality (2m 5s)
- Recalibrating the model through validation (2m 13s)
- Maintaining the model (2m 18s)
- Supporting training activities (2m 10s)
- Evaluating the business benefit of the model over time (2m 10s)