# Set up accuracy monitoring

> Set up accuracy monitoring - Configure accuracy monitoring on a deployment's Accuracy Settings tab.

This Markdown file sits beside the HTML page at the same path (with a `.md` suffix). It summarizes the topic and lists links for tools and LLM context.

Companion generated at `2026-04-24T16:03:56.553104+00:00` (UTC).

## Primary page

- [Set up accuracy monitoring](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html): Full documentation for this topic (HTML).

## Sections on this page

- [Enable target monitoring](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#enable-target-monitoring): In-page section heading.
- [Select an association ID](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#select-an-association-id): In-page section heading.
- [Association IDs for time series deployments](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#association-ids-for-time-series-deployments): In-page section heading.
- [Add actuals](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#add-actuals): In-page section heading.
- [Upload actuals with the API](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#upload-actuals-with-the-api): In-page section heading.
- [Define accuracy monitoring notifications](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#define-accuracy-monitoring-notifications): In-page section heading.
- [Examples of accuracy monitoring settings](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#examples-of-accuracy-monitoring-settings): In-page section heading.

## Related documentation

- [Classic UI documentation](https://docs.datarobot.com/en/docs/classic-ui/index.html): Linked from this page.
- [MLOps](https://docs.datarobot.com/en/docs/classic-ui/mlops/index.html): Linked from this page.
- [Deployment settings](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/index.html): Linked from this page.
- [Accuracy](https://docs.datarobot.com/en/docs/classic-ui/mlops/monitor/deploy-accuracy.html): Linked from this page.
- [Data Drift Settings](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/data-drift-settings.html): Linked from this page.
- [actuals](https://docs.datarobot.com/en/docs/reference/glossary/index.html#actuals): Linked from this page.
- [agent-monitored](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment/mlops-agent/monitoring-agent/index.html): Linked from this page.
- [report accuracy for the model and its challengers](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment/mlops-agent/monitoring-agent/agent-use.html#report-accuracy-for-challengers): Linked from this page.
- [`chat()` hook documentation](https://docs.datarobot.com/en/docs/api/code-first-tools/drum/structured-custom-models.html#association-id): Linked from this page.
- [AI Catalog](https://docs.datarobot.com/en/docs/classic-ui/data/ai-catalog/catalog.html): Linked from this page.
- [make predictions](https://docs.datarobot.com/en/docs/api/dev-learning/python/predictions/index.html): Linked from this page.
- [Make predictions](https://docs.datarobot.com/en/docs/classic-ui/predictions/index.html): Linked from this page.
- [clear the deployment statistics](https://docs.datarobot.com/en/docs/classic-ui/mlops/manage-mlops/actions-menu.html#clear-deployment-statistics): Linked from this page.
- [performance optimization metric](https://docs.datarobot.com/en/docs/reference/pred-ai-ref/opt-metric.html): Linked from this page.
- [configure the conditions under which notifications are sent to them](https://docs.datarobot.com/en/docs/classic-ui/mlops/governance/deploy-notifications.html): Linked from this page.

## Documentation content

# Set up accuracy monitoring

You can monitor a deployment for accuracy using the [Accuracy](https://docs.datarobot.com/en/docs/classic-ui/mlops/monitor/deploy-accuracy.html) tab, which lets you analyze the performance of the model deployment over time using standard statistical measures and exportable visualizations. You can enable accuracy monitoring on the Accuracy > Settings tab. To configure accuracy monitoring, you must:

- Enable target monitoring in the Data Drift Settings
- Select an association ID in the Accuracy Settings
- Add actuals in the Accuracy Settings

On a deployment's Accuracy Settings page, you can configure the Association ID and Upload Actuals settings and the accuracy monitoring Definition and Notifications settings:

| Field | Description |
| --- | --- |
| Association ID |  |
| Association ID | Defines the name of the column that contains the association ID in the prediction dataset for your model. Association IDs are required for setting up accuracy monitoring in a deployment. The association ID functions as an identifier for your prediction dataset so you can later match up outcome data (also called "actuals") with those predictions. |
| Require association ID in prediction requests | Requires your prediction dataset to have a column name that matches the name you entered in the Association ID field. When enabled, you will get an error if the column is missing. This cannot be enabled with Enable automatic association ID generation for prediction rows. |
| Enable automatic association ID generation for prediction rows | With an association ID column name defined, allows DataRobot to automatically populate the association ID values. This cannot be enabled with Require association ID in prediction requests. |
| Enable automatic actuals feedback for time series models | For time series deployments that have indicated an association ID. Enables the automatic submission of actuals, so that you do not need to submit them manually via the UI or API. Once enabled, actuals can be extracted from the data used to generate predictions. As each prediction request is sent, DataRobot can extract an actual value for a given date. This is because when you send prediction rows to forecast, historical data is included. This historical data serves as the actual values for the previous prediction request. |
| Upload Actuals |  |
| Drop file(s) here or choose file | Uploads a file with actuals to monitor accuracy by matching the model's predictions with actual values. Actuals are required to enable the Accuracy tab. |
| Assigned features | Configures the Assigned features settings after you upload actuals. |
| Definition |  |
| Set definition | Configures the metric, measurement, and threshold definitions for accuracy monitoring. |
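The Association ID settings described above can also be set programmatically. The sketch below builds a settings payload for the v2 REST API; the endpoint path (`/api/v2/deployments/{id}/settings/`) and the camelCase field names are assumptions based on the public API and may differ by release, so verify them in the in-app API documentation before relying on them. Note that, as in the table above, requiring the association ID and auto-generating it are mutually exclusive:

```python
import json

API_TOKEN = "YOUR_API_TOKEN"           # hypothetical placeholder
LOCATION = "https://app.datarobot.com"


def association_id_settings(column_name, required=True, auto_generate=False):
    """Build a deployment settings payload; 'required' and 'auto_generate'
    cannot both be enabled, mirroring the UI behavior."""
    if required and auto_generate:
        raise ValueError("Cannot require an association ID and auto-generate it at once")
    return {
        "associationId": {
            "columnNames": [column_name],                 # field names are assumptions
            "requiredInPredictionRequests": required,
            "autoGenerationEnabled": auto_generate,
        }
    }


def update_settings(deployment_id, payload):
    """PATCH the settings to the deployment (left unexecuted in this sketch)."""
    import requests
    url = "{}/api/v2/deployments/{}/settings/".format(LOCATION, deployment_id)
    headers = {"Authorization": "Token {}".format(API_TOKEN)}
    resp = requests.patch(url, json=payload, headers=headers)
    resp.raise_for_status()
    return resp


payload = association_id_settings("store_id", required=True)
print(json.dumps(payload))
```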

## Enable target monitoring

In order to enable accuracy monitoring, you must also [enable target monitoring](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/data-drift-settings.html) in the Data Drift section of the Data Drift Settings tab.

If target monitoring is turned off, a message displays on the Accuracy tab to remind you to enable target monitoring.

## Select an association ID

To activate the [Accuracy tab](https://docs.datarobot.com/en/docs/classic-ui/mlops/monitor/deploy-accuracy.html) for a deployment, first designate an association ID in the prediction dataset. The association ID is a [foreign key](https://www.tutorialspoint.com/Foreign-Key-in-RDBMS), linking predictions with future results (or [actuals](https://docs.datarobot.com/en/docs/reference/glossary/index.html#actuals)). It corresponds to an event for which you want to track the outcome. For example, you may want to track a series of loans to see whether any of them default.

> [!NOTE] Important: Association ID for monitoring agent and monitoring jobs
> You must set an association ID before making predictions to include those predictions in accuracy tracking. For [agent-monitored](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment/mlops-agent/monitoring-agent/index.html) external model deployments with challengers (and monitoring jobs for challengers), the association ID should be `__DataRobot_Internal_Association_ID__` to [report accuracy for the model and its challengers](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment/mlops-agent/monitoring-agent/agent-use.html#report-accuracy-for-challengers).

On the Accuracy > Settings tab of a deployment, the Association ID section has a field for the column name containing the association IDs. The column name you define in the Association ID field must match the name of the column containing the association IDs in the prediction dataset for your model. Each cell for this column in your prediction dataset should contain a unique ID that pairs with a corresponding unique ID that occurs in the actuals payload.

In addition, you can enable Require association ID in prediction requests to throw an error if the column is missing from your prediction dataset when you make a prediction request.

> [!NOTE] Association IDs for chat requests
> For DataRobot-deployed text generation and agentic workflow custom models that use the Bolt-on Governance API (chat completions), an association ID can be specified directly in the chat request using the `extra_body` field. Set `datarobot_association_id` in `extra_body` to use a custom association ID instead of the auto-generated one. For more information, see the [chat()hook documentation](https://docs.datarobot.com/en/docs/api/code-first-tools/drum/structured-custom-models.html#association-id).
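As a sketch of the chat-request case above, the snippet below assembles the keyword arguments for an OpenAI-compatible `chat.completions.create()` call. The `datarobot_association_id` key comes from the documentation; the model name and the commented-out endpoint shape are hypothetical placeholders:

```python
def build_chat_kwargs(association_id, user_message):
    """Assemble keyword arguments for client.chat.completions.create(),
    passing a custom association ID through extra_body."""
    return {
        # Many gateway-style deployments accept an arbitrary model name; assumption.
        "model": "datarobot-deployed-llm",
        "messages": [{"role": "user", "content": user_message}],
        "extra_body": {"datarobot_association_id": association_id},
    }


kwargs = build_chat_kwargs("loan-4711", "Summarize this application.")
# from openai import OpenAI
# client = OpenAI(base_url="...your deployment's chat endpoint...",  # shape varies; check the docs
#                 api_key="YOUR_API_TOKEN")
# client.chat.completions.create(**kwargs)
print(kwargs["extra_body"])
```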

You can set the column name containing the association IDs on a deployment at any time, whether or not predictions have been made against that deployment. Once set, however, you can only update the association ID if you have not yet made predictions that include it; after predictions have been made using that ID, you cannot change it.

Association IDs (the contents in each row for the designated column name) must be shorter than 128 characters, or they will be truncated to that size. If truncation occurs, uploaded actuals must use the truncated association IDs in order to properly generate accuracy statistics.
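The truncation rule above can be sketched as a one-liner; if your IDs can exceed the limit, apply the same truncation to the actuals before upload so the join still matches:

```python
MAX_ASSOCIATION_ID_LEN = 128  # limit stated in the docs above


def effective_association_id(raw_id):
    """Return the ID as DataRobot would store it: truncated to 128 characters."""
    return raw_id[:MAX_ASSOCIATION_ID_LEN]


long_id = "x" * 150
stored = effective_association_id(long_id)
print(len(stored))  # 128: actuals must use this truncated form to match
```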

### Association IDs for time series deployments

For time series deployments, prediction requests already contain the data needed to uniquely identify individual predictions, so it is important to choose the feature used as an association ID carefully. Depending on the deployment type, consider the following guidelines:

- Single-series deployments: DataRobot recommends using the Forecast Date column as the association ID because it is the date you are making predictions for. For example, if today is June 15th, 2022, and you are forecasting daily total sales for a store, you may wish to know what the sales will be on July 15th, 2022. You will have a single actual total sales figure for this date, so you can use “2022-07-15” as the association ID (the forecast date).
- Multiseries deployments: DataRobot recommends using a custom column containing Forecast Date + Series ID as the association ID. If a single model can predict daily total sales for a number of stores, then you can use, for example, the association ID “2022-07-15 1234” for sales on July 15th, 2022 for store #1234.
- All time series deployments: You may want to forecast the same date multiple times as the date approaches. For example, you might forecast daily sales 30 days in advance, then again 14 days in advance, and again 7 days in advance. These forecasts all have the same forecast date, and therefore the same association ID.
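The guidelines above reduce to two small helpers. The space separator in the multiseries case follows the “2022-07-15 1234” example; any consistent separator works as long as the prediction and actuals datasets use the same one:

```python
def single_series_association_id(forecast_date):
    """Single-series: the forecast date alone identifies the prediction."""
    return forecast_date


def multiseries_association_id(forecast_date, series_id, sep=" "):
    """Multiseries: combine forecast date and series ID, e.g. '2022-07-15 1234'."""
    return "{}{}{}".format(forecast_date, sep, series_id)


print(multiseries_association_id("2022-07-15", "1234"))  # 2022-07-15 1234
```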

> [!NOTE] Important
> Be aware that models may produce different forecasts when predicting closer to the forecast date. Predictions for multiple forecast distances are each tracked individually so that accuracy can be properly calculated for each forecast distance.

After you designate an association ID, you can toggle Enable automatic actuals feedback for time series models to on. This setting automatically submits actuals so that you do not need to submit them manually via the UI or API. Once enabled, actuals can be extracted from the data used to generate predictions. As each prediction request is sent, DataRobot can extract an actual value for a given date. This is because when you send prediction rows to forecast, historical data is included. This historical data serves as the actual values for the previous prediction request.
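The automatic feedback mechanism can be pictured with a minimal sketch (the `date` and `sales` column names are illustrative): the history rows that accompany a later prediction request supply the actual value for a date that was forecast earlier:

```python
def extract_actual(history_rows, forecast_date):
    """Pull the actual for a previously forecast date out of the historical rows
    sent with a later prediction request (None if the date isn't present)."""
    for row in history_rows:
        if row["date"] == forecast_date:
            return row["sales"]
    return None


# History sent with the July 16 request includes July 15, the date forecast earlier:
history = [
    {"date": "2022-07-14", "sales": 180.0},
    {"date": "2022-07-15", "sales": 205.5},
]
print(extract_actual(history, "2022-07-15"))  # 205.5
```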

## Add actuals

You can directly upload datasets containing actuals to a deployment from the Accuracy > Settings tab (described here) or through the [API](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#upload-actuals-with-the-api). The deployment's prediction data must correspond to the actuals data you upload. Review the [row limits](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#actuals-upload-limit) for uploading actuals before proceeding.

1. To use actuals with your deployment, in the Upload Actuals section, click Choose file. Either upload a file directly or select a file from the AI Catalog. If you upload a local file, it is added to the AI Catalog after a successful upload. Actuals must be snapshotted in the AI Catalog to use them with a deployment.
2. Once uploaded, complete the fields that populate in the Actuals section. Under Assigned features, each field has a dropdown menu that allows you to select any of the columns from your dataset:

    | Field | Description |
    | --- | --- |
    | Actual Response | Defines the column in your dataset that contains the actual values. |
    | Association ID | Defines the column that contains the association IDs. |
    | Timestamp (optional) | Defines the column that contains the date/time when the actual values were obtained, formatted according to RFC 3339 (for example, 2018-04-12T23:20:50.52Z). |
    | Keep actuals without matching predictions | Determines if DataRobot stores uploaded actuals that don't match any existing predictions by association ID. |

    > [!NOTE] Column name matching
    > The column names for the association ID in the prediction and the actuals datasets do not need to match. The only requirement is that each dataset contains a column of identifiers that match the other dataset. For example, if the column `store_id` contains the association ID in the prediction dataset that you will use to identify a row and match it to the actual result, enter `store_id` in the Association ID section. In the Upload Actuals section under Assigned fields, in the Association ID field, enter the name of the column in the actuals dataset that contains the identifiers associated with the identifiers in the prediction dataset.
3. After you configure the Assigned fields, click Save. When you complete this configuration process and make predictions with a dataset containing the defined Association ID, the Accuracy page is enabled for your deployment.
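An actuals file for the steps above might look like the following sketch, written with the standard `csv` module. The column names (`store_id`, `actual_sales`, `obtained_at`) are illustrative; you map them to Association ID, Actual Response, and Timestamp under Assigned features after upload:

```python
import csv
import io

# Illustrative rows: association IDs in the 'Forecast Date + Series ID' style,
# actual values, and an RFC 3339 timestamp for when each actual was obtained.
rows = [
    {"store_id": "2022-07-15 1234", "actual_sales": 205.5, "obtained_at": "2022-07-16T00:00:00Z"},
    {"store_id": "2022-07-15 5678", "actual_sales": 98.0,  "obtained_at": "2022-07-16T00:00:00Z"},
]

buf = io.StringIO()  # in-memory stand-in for a file you would upload
writer = csv.DictWriter(buf, fieldnames=["store_id", "actual_sales", "obtained_at"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```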

## Upload actuals with the API

This workflow outlines how to enable the Accuracy tab for deployments using the DataRobot API.

1. From the Accuracy > Settings tab, locate the Association ID section.
2. In the Association ID field, enter the column name containing the association IDs in your prediction dataset.
3. Enable Require association ID in prediction requests. This requires your prediction dataset to have a column name that matches the name you entered in the Association ID field. You will get an error if the column is missing.

    > [!NOTE] Note
    > You can set an association ID and not toggle on this setting if you are sending prediction requests that do not include the association ID and you do not want them to error; however, until it is enabled, you cannot monitor accuracy for your deployment.

4. Make predictions using a dataset that includes the association ID.
5. Submit the actual values via the DataRobot API (for details, refer to the API documentation by signing in to DataRobot, clicking the question mark on the upper right, and selecting API Documentation; in the API documentation, select Deployments > Submit Actuals - JSON). Review the row limits for uploading actuals before proceeding.

    > [!NOTE] Note
    > The actuals payload must contain the `associationId` and `actualValue`, with the column names for those values in the dataset defined during the upload process. If you submit multiple actuals with the same association ID value, either in the same request or a subsequent request, DataRobot updates the actuals value; however, this update doesn't recalculate the metrics previously calculated using the initial actuals value. To recalculate metrics, you can clear the deployment statistics and reupload the actuals (or create a new deployment).

    Use the following snippet to submit the actual values:

    ```python
    import requests

    API_TOKEN = ''
    USERNAME = 'johndoe@datarobot.com'
    DEPLOYMENT_ID = '5cb314xxxxxxxxxxxa755'
    LOCATION = 'https://app.datarobot.com'


    def submit_actuals(data, deployment_id):
        headers = {
            'Content-Type': 'application/json',
            'Authorization': 'Token {}'.format(API_TOKEN),
        }
        url = '{location}/api/v2/deployments/{deployment_id}/actuals/fromJSON/'.format(
            deployment_id=deployment_id, location=LOCATION)
        resp = requests.post(url, json=data, headers=headers)
        if resp.status_code >= 400:
            raise RuntimeError(resp.content)
        return resp.content


    def main():
        deployment_id = DEPLOYMENT_ID
        payload = {
            'data': [
                {
                    'actualValue': 1,
                    'associationId': '5d8138fb9600000000000000',  # str
                },
                {
                    'actualValue': 0,
                    'associationId': '5d8138fb9600000000000001',
                },
            ]
        }
        submit_actuals(payload, deployment_id)
        print('Done')


    if __name__ == "__main__":
        main()
    ```

    After submitting at least 100 actuals for a non-time series deployment (there is no minimum for time series deployments) and making predictions with corresponding association IDs, the Accuracy tab becomes available for your deployment.
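Because each submission request is subject to a row limit, a large batch of actuals is typically split into several payloads. The helper below sketches that chunking; the limit of 10 rows is illustrative only (the real per-request limit is documented in the row limits referenced above):

```python
def chunk_actuals(actuals, rows_per_request):
    """Split a large actuals list into payloads that respect a per-request row limit."""
    return [
        {"data": actuals[i:i + rows_per_request]}
        for i in range(0, len(actuals), rows_per_request)
    ]


# 25 toy actuals split into requests of at most 10 rows each:
actuals = [{"associationId": str(i), "actualValue": i % 2} for i in range(25)]
payloads = chunk_actuals(actuals, 10)
print(len(payloads))  # 3 requests: 10 + 10 + 5 rows
```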

## Define accuracy monitoring notifications

For accuracy, the notification conditions relate to a [performance optimization metric](https://docs.datarobot.com/en/docs/reference/pred-ai-ref/opt-metric.html) for the underlying model in the deployment. Select from the same set of metrics that are available on the Leaderboard. You can visualize accuracy using the [Accuracy over Time graph](https://docs.datarobot.com/en/docs/classic-ui/mlops/monitor/deploy-accuracy.html#accuracy-over-time-graph) and the [Predicted & Actual graph](https://docs.datarobot.com/en/docs/classic-ui/mlops/monitor/deploy-accuracy.html#predicted-actual-graph). Accuracy monitoring is defined by a single accuracy rule. Every 30 seconds, the rule evaluates the deployment's accuracy. Notifications trigger when this rule is violated.

Before configuring accuracy notifications and monitoring for a deployment, set an [association ID](https://docs.datarobot.com/en/docs/classic-ui/mlops/deployment-settings/accuracy-settings.html#select-an-association-id). If not set, DataRobot displays the following message when you try to modify accuracy notification settings:

> [!NOTE] Note
> Only deployment Owners can modify accuracy monitoring settings. They can set no more than one accuracy rule per deployment. Consumers cannot modify monitoring or notification settings. Users can [configure the conditions under which notifications are sent to them](https://docs.datarobot.com/en/docs/classic-ui/mlops/governance/deploy-notifications.html) and see explained status information by hovering over the accuracy status icon:
> 
> ![Accuracy status icon](https://docs.datarobot.com/en/docs/images/notify-8.png)

To set up accuracy monitoring:

1. On the Accuracy Settings page, in the Definition section, configure the settings for monitoring accuracy:

    | Element | Description |
    | --- | --- |
    | 1. Metric | Defines the metric used to evaluate the accuracy of your deployment. The metrics available from the dropdown menu are the same as those supported by the Accuracy tab. |
    | 2. Measurement | Defines the unit of measurement for the accuracy metric and its thresholds. You can select value or percent from the dropdown. The value option measures the metric and thresholds by specific values, and the percent option measures by percent changed. The percent option is unavailable for model deployments that don't have training data. |
    | 3. "At Risk" / "Failing" thresholds | Sets the values or percentages that, when exceeded, trigger notifications. Two thresholds are supported: when the deployment's accuracy is "At Risk" and when it is "Failing." DataRobot provides default values for the thresholds of the first accuracy metric provided (LogLoss for classification and RMSE for regression deployments) based on the deployment's training data. Deployments without training data populate default threshold values based on their prediction data instead. If you change metrics, default values are not provided. |

    > [!NOTE] Note
    > Changes to thresholds affect the periods in which predictions are made across the entire history of a deployment. These updated thresholds are reflected in the performance monitoring visualizations on the Accuracy tab.
2. After updating the accuracy monitoring settings, click Save.
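A value-measured rule like the one above can be sketched as a small classifier. The "At Risk" and "Failing" labels come from the documentation; "Passing" is an illustrative label for the no-violation case, and the threshold numbers are arbitrary examples:

```python
def accuracy_status(metric_value, at_risk, failing, higher_is_worse=True):
    """Classify a deployment's accuracy against 'At Risk'/'Failing' thresholds
    measured by value. For a metric like LogLoss, higher is worse; for a metric
    like AUC, pass higher_is_worse=False."""
    worse_than = (lambda v, t: v > t) if higher_is_worse else (lambda v, t: v < t)
    if worse_than(metric_value, failing):
        return "Failing"
    if worse_than(metric_value, at_risk):
        return "At Risk"
    return "Passing"


print(accuracy_status(7.2, at_risk=5, failing=10))   # At Risk
print(accuracy_status(12.0, at_risk=5, failing=10))  # Failing
```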

### Examples of accuracy monitoring settings

Each combination of metric and measurement determines the expression of the rule. For example, if you use the LogLoss metric measured by value, the rule triggers notifications when accuracy "is greater than" the values of 5 or 10:

However, if you change the metric to AUC and the measurement to percent, the rule triggers notifications when accuracy "decreases by" the values set for the threshold:
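The percent-measured AUC case can be sketched the same way: the rule compares the percent decrease from a baseline against the configured thresholds. The baseline source (training vs. prediction data) follows the Definition table above; the numbers and the "Passing" label are illustrative:

```python
def percent_rule_triggered(baseline, current, at_risk_pct, failing_pct):
    """Percent measurement for a metric like AUC, where a decrease signals trouble:
    returns the status when accuracy 'decreases by' the configured percentages."""
    decrease_pct = 100.0 * (baseline - current) / baseline
    if decrease_pct >= failing_pct:
        return "Failing"
    if decrease_pct >= at_risk_pct:
        return "At Risk"
    return "Passing"


# Baseline AUC 0.90 dropping to 0.80 is an ~11% decrease:
print(percent_rule_triggered(0.90, 0.80, at_risk_pct=5, failing_pct=15))  # At Risk
```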
