Batch predictions and promotions¶
Batch predictions are the main way users get predictive insights back into their data warehouse. In this section we'll cover working with batch prediction jobs and promoting model versions.
Although each model will contain many model versions, only a single model version can be actively promoted at one time. Whenever a batch prediction job is run, the currently promoted model version will be used to make all predictions. Model versions can either be automatically promoted based on a policy, or manually via direct user interaction. Continual tracks all promotions for models in the system.
Continual currently supports the following promotion policies:
- Best: After a new model version is trained, Continual compares its performance metric to that of the currently promoted model version. If the new model version performs better, it is promoted.
- Latest: After a new model version is trained, Continual always promotes it.
- Manual: No action is taken after a new model version is trained. Users must manually promote model versions to update the model.
The promotion policy can be set either in the model's YAML file or in the model creation wizard when using the Web UI.
A prediction job refers to a batch scoring job; Continual uses these to keep track of the batch jobs users have run. A batch prediction job reads feature data from the feature store and makes predictions with the currently promoted model version. All predictions are stored in the project's specified feature store in your data warehouse.
A prediction job can be either full or incremental. During a full prediction, Continual performs a new prediction for each record in the model spine. During an incremental prediction, Continual performs a prediction only for records in the model spine that do not already have a prediction. A record refers to a unique entity in the model spine definition; for a temporal model, this includes the time_index. Incremental predictions are ideal when you'd like to predict only for new records created over time.
Prediction table and view¶
Continual creates a table in your feature store for every model you create in the system; it tracks all predictions made by that model's versions over time. This table lives under project_id.model_<model_id>_predictions_history (for BigQuery). This is a full historical view of your predictions for the model and serves as a complete audit trail. Continual additionally builds a view under project_id.model_<model_id>_predictions (for BigQuery), which contains the latest prediction made for each record in your model spine.
Working with promotions¶
Setting a promotion policy¶
When defining a model, you can specify the promotion policy for the model directly in the yaml:
```yaml
promote:
  policy: [latest | best | manual]
```
If not specified, Continual uses the default value of latest. To edit a promotion policy, simply change the value in your model YAML file and update the model with continual push. This is not a change to the model definition, so Continual will not immediately take any action, but it will change the promotion behavior the next time a model version is trained.
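As a concrete sketch, a model that should only promote better-performing versions would set (the choice of best here is illustrative):

```yaml
# Promote a newly trained model version only when it beats
# the currently promoted version's performance metric.
promote:
  policy: best
```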
In the Web UI, you can select the promotion policy of a model during creation or when editing a model:
Manually promoting a model version¶
If you have elected to use a manual promotion policy, or simply need to promote a model version by hand, this is easily accomplished via the CLI:
```shell
continual promote <model_version_id>
```
From the Web UI, users can deploy a model by navigating to the list of Model Versions and clicking "Promote" next to the model version they would like to promote.
It's also possible to deploy a model version directly from the model version overview page:
When using the CLI, promotions can be viewed via the following command:
```shell
continual promotions list
```
This will print a list of promotions for the current project:
Promotions can be filtered to a specific model via the --model <model_id> option. You can also view the details of any individual promotion via:

```shell
continual promotions get <promotion_id>
```
All promotions can be viewed from the model page. Simply navigate to the "Promotions" tab:
Each promotion displays when it was created, the model version it promoted, the performance of the winning experiment, and the lifetime of the promotion.
Working with batch predictions¶
Setting a batch prediction schedule¶
When defining a model, you can specify the batch prediction schedule for the model directly in the yaml:
```yaml
prediction:
  schedule: [cron_syntax]
  incremental: True
```

The schedule can be defined using standard cron syntax; shortcuts such as @hourly are also acceptable. You may also specify whether to execute an incremental or full prediction via incremental. The default is incremental: False (i.e. execute a full prediction).
If not specified, Continual will not run a batch prediction unless one is manually kicked off. To edit a prediction schedule, simply change the value in your model YAML file and update the model with continual push. This is not a change to the model definition, so Continual will not immediately take any action, but the new schedule will go into effect immediately.
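For example, the following sketch runs an incremental prediction at the top of every hour (the specific cron expression is illustrative):

```yaml
# Score only new, not-yet-predicted records every hour.
prediction:
  schedule: "0 * * * *"   # equivalent to @hourly
  incremental: True
```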
In the Web UI, you can select the prediction schedule of a model during creation or when editing a model:
Manually executing a batch prediction job¶
You may manually execute a batch prediction job outside the normal schedule via the following command:

```shell
continual batch-prediction run <model_id>
```

This kicks off a new batch prediction job for the named model using the currently promoted model version. You may optionally request an incremental prediction instead.
From the Web UI, users can execute a batch prediction by opening a model and clicking "Predict" in the top right corner of the page:
On the pop-up, you'll then be able to specify whether to do a full or incremental prediction.
Viewing batch predictions¶
When using the CLI, predictions can be viewed via the following command:
```shell
continual batch-predictions list
```
This will print a list of predictions for the current project:
You can filter on a specific model via the --model <model_id> option. You can also view the details of any individual prediction via:
```shell
continual batch-predictions get <prediction_id>
```
In the Web UI, all predictions can be viewed from the model page. Simply navigate to the "Predictions" tab:
Each prediction displays when it was created, the model and model version it is linked to, and the duration of the job.
Cancelling a batch prediction job¶
To cancel a batch prediction job via the CLI, run:

```shell
continual batch-predictions cancel <prediction_id>
```
In the Web UI, users will always have the option of cancelling a currently running prediction: