Deployment inventory¶
Once models are deployed, the deployment inventory is the central hub for deployment management activity and a coordination point for all stakeholders involved in operationalizing models. It provides an interface to all actively deployed models, letting you monitor their performance and take action as necessary.
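If you prefer to work programmatically, the same inventory can be listed with the DataRobot Python client. The following is a minimal sketch, assuming a client version that exposes `dr.Client` and `dr.Deployment.list()`; the endpoint and token values are placeholders for your own credentials.

```python
import datarobot as dr

# Connect to DataRobot; replace the endpoint and token with your own values.
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

# Each Deployment object corresponds to one row of the deployment inventory.
for deployment in dr.Deployment.list():
    print(deployment.id, deployment.label)
```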
Deployment lenses¶
There are two unique deployment lenses that modify the information displayed in the inventory:
- The Prediction Health lens summarizes prediction usage and model status for all active deployments.
- The Governance lens reports the operational and social aspects of all active deployments.
To change deployment lenses, click the active lens in the top right corner and select a lens from the dropdown.
Prediction Health lens¶
The Prediction Health lens is the default view of the deployment inventory, detailing prediction activity and model health for each deployment. Across the top of the inventory, the page summarizes the usage and status of all active deployments with color-coded health indicators.
Beneath the summary is an individual report for each deployment.
The following table describes the information available from the Prediction Health lens:
Category | Description |
---|---|
Deployment Name | Name assigned to the deployment at creation, the type of prediction server used, and the project name (DataRobot models only). |
Service | Service health of the individual deployment. The color-coded status indicates the presence or absence of errors in the last 24 hours. |
Drift | Data Drift occurring in the deployment. |
Accuracy | Model accuracy evaluated over time. |
Activity | A bar graph indicating the pattern of predictions over the past seven days. The starting point is the same for each deployment in the inventory. For example, a new deployment plots that day's activity plus the six (blank) preceding days. |
Avg. Predictions/Day | Average number of predictions per day over the last seven days. |
Last Prediction | Elapsed time since the last prediction was made against the model. |
Creation Date | Elapsed time since the deployment was created. |
Actions | Menu of additional model management activities, including adding data, replacing a model, configuring data drift, and sharing or deleting deployments. |
Click on any model entry in the table to view details about that deployment. Each model-specific page provides the above information in a status banner.
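The same per-deployment details can be pulled programmatically. Below is a minimal sketch, assuming your version of the DataRobot Python client provides `Deployment.get()` and `Deployment.get_service_stats()`; `DEPLOYMENT_ID` and the credentials are placeholders.

```python
import datarobot as dr

# Connect to DataRobot; replace the endpoint and token with your own values.
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

deployment = dr.Deployment.get("DEPLOYMENT_ID")   # placeholder deployment ID
stats = deployment.get_service_stats()            # availability depends on your client version

# stats.metrics is a dict of recent service metrics (prediction counts, error rates, etc.).
for name, value in stats.metrics.items():
    print(f"{name}: {value}")
```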
Color-coded health indicators¶
The Service Health, Data Drift, and Accuracy summaries in the top part of the display provide an at-a-glance indication of health and accuracy for all deployed models. To view more detailed information for an individual model, click the model in the inventory list. The Service Health Summary tile measures error rates over the last 24 hours; these are the Data Error Rate and System Error Rate errors recorded for an individual model on the Service Health tab.
What are 4xx and 5xx errors?
- 4xx errors indicate problems with the prediction request submission.
- 5xx errors indicate problems with the DataRobot prediction server.
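When you submit prediction requests yourself, you can tell which side a failure is on by checking the HTTP status code of the response. The sketch below uses the `requests` library; the URL, headers, and file name are placeholders for your own prediction endpoint and scoring data.

```python
import requests

# Placeholder prediction request; substitute your own prediction endpoint,
# credentials, and scoring data file.
with open("scoring_data.csv", "rb") as f:
    resp = requests.post(
        "https://example.datarobot.com/predApi/v1.0/deployments/DEPLOYMENT_ID/predictions",
        headers={"Authorization": "Bearer YOUR_API_TOKEN", "Content-Type": "text/csv"},
        data=f,
    )

if 400 <= resp.status_code < 500:
    print("4xx: problem with the prediction request submission:", resp.text)
elif resp.status_code >= 500:
    print("5xx: problem with the prediction server:", resp.text)
else:
    print("Prediction request succeeded.")
```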
If you've enabled timeliness tracking on the Usage > Settings tab, you can view timeliness indicators in the inventory. Timeliness indicators show if the prediction or actuals upload frequency meets the standards set by your organization.
Use the table below to interpret the color indicators for each deployment health category:
Color | Service Health | Data Drift | Accuracy | Timeliness | Action |
---|---|---|---|---|---|
Green / Passing | Zero 4xx or 5xx errors. | All attributes' distributions have remained similar since the model was deployed. | Accuracy is similar to when the model was deployed. | Prediction and/or actuals timeliness standards met. | No action needed. |
Yellow / At risk | At least one 4xx error and zero 5xx errors. | At least one lower-importance attribute's distribution has shifted since the model was deployed. | Accuracy has declined since the model was deployed. | N/A | Concerns found but no immediate action needed; monitor. |
Red / Failing | At least one 5xx error. | At least one higher-importance attribute's distribution has shifted since the model was deployed. | Accuracy has severely declined since the model was deployed. | Prediction and/or actuals timeliness standards not met. | Immediate action needed. |
Gray / Disabled | Unmonitored deployment. | Data drift tracking disabled. | Accuracy tracking disabled. | Timeliness tracking disabled. | Enable monitoring and make predictions. |
Gray / Not started | No service health events recorded. | Data drift tracking not started. | Accuracy tracking not started. | Timeliness tracking not started. | Make predictions. |
Gray / Unknown | No predictions made. | Insufficient predictions made (min. 100 required). | Insufficient predictions made (min. 100 required). | N/A | Make predictions. |
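As an illustration only, the Service Health rules in the table above amount to a simple threshold check on the 4xx and 5xx error counts from the last 24 hours. The sketch below restates those thresholds; it is not DataRobot's actual implementation.

```python
def service_health_color(errors_4xx: int, errors_5xx: int) -> str:
    """Map 24-hour error counts to the inventory's color-coded service health status."""
    if errors_5xx >= 1:
        return "red (failing)"      # at least one 5xx error
    if errors_4xx >= 1:
        return "yellow (at risk)"   # at least one 4xx error and zero 5xx errors
    return "green (passing)"        # zero 4xx or 5xx errors

print(service_health_color(0, 0))   # green (passing)
print(service_health_color(3, 0))   # yellow (at risk)
print(service_health_color(0, 1))   # red (failing)
```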
Live inventory updates¶
The inventory automatically refreshes every 30 seconds and updates the following information:
Active Deployments¶
The Active Deployments tile indicates the number of deployments currently in use. The legend interprets the bar below the active deployment count:
- Your active deployments (blue)
- Other active deployments (white)
- Available new deployments (gray)
Inactive deployments
Inactive deployments do not count toward the deployments limit and do not support predictions, retraining, challengers, or model replacement. Deployments are created as inactive if you have reached your deployment allotment.
For example, suppose the user's organization is allotted ten deployments. The user has seven active deployments, and there is one other active deployment in the organization. Users within the organization can create two more active deployments before reaching the limit. There are also two inactive deployments, which do not count toward the deployment limit.
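The arithmetic behind the tile is straightforward; the sketch below uses the hypothetical numbers from this example rather than values read from the API.

```python
# Active Deployments math using the example values above (hypothetical).
allotment = 10      # deployments allotted to the organization
your_active = 7     # your active deployments (blue)
other_active = 1    # other users' active deployments (white)
inactive = 2        # inactive deployments; not counted toward the limit

available = allotment - (your_active + other_active)
print(f"Available new deployments: {available}")  # Available new deployments: 2
```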
If you're active in multiple organizations, under Your active deployments, you can see how many of those active deployments are in This organization or Other organizations:
Deployments in other organizations
Your deployments in Other organizations do not count toward the allocated limit in the current organization.
Availability information
The availability information shown on the Active Deployments tile depends on the MLOps configuration for your organization.
Predictions¶
The Predictions tile indicates the number of predictions made since the last refresh.
Individual deployments show the number of predictions made on them during the last 30 seconds.
If a deployment's service health, drift, or accuracy status changes to Failing, the individual deployment will flash red to draw attention to it.
Sort deployments¶
The deployment inventory is initially sorted by the most recent creation date (reported in the Creation Date column). You can click a different column title to sort by that metric instead. A blue arrow appears next to the sort column's title, indicating whether the order is ascending or descending.
Note
When you sort the deployment inventory, your most recent sort selection persists in your local settings until you clear your browser's local storage data. As a result, when you return to the deployment inventory, it is sorted by the column you selected most recently.
You can sort in ascending or descending order by:
- Deployment Name (alphabetically)
- Service, Drift, and Accuracy (by status)
- Avg. Predictions/Day (numerically)
- Last Prediction (by date)
- Build Environment (alphabetically)
- Creation Date (by date)
Note
The list uses the time of deployment creation as a secondary sort (unless the primary sort is by Creation Date). For example, if you sort by drift status, all deployments with a passing status are ordered from most recent creation to oldest, followed by failing deployments from most recent to oldest.
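To make the primary/secondary ordering concrete, here is a small sketch over hypothetical deployment records; the field names and values are illustrative, not the API's schema.

```python
from datetime import datetime

# Hypothetical deployment records for illustration.
deployments = [
    {"name": "churn-v2", "drift_status": "failing", "created": datetime(2024, 3, 1)},
    {"name": "churn-v1", "drift_status": "passing", "created": datetime(2024, 1, 15)},
    {"name": "ltv-model", "drift_status": "passing", "created": datetime(2024, 2, 20)},
]

# Primary sort: drift status; secondary sort: creation time, newest first.
status_rank = {"passing": 0, "at risk": 1, "failing": 2}
ordered = sorted(
    deployments,
    key=lambda d: (status_rank[d["drift_status"]], -d["created"].timestamp()),
)

for d in ordered:
    print(d["name"], d["drift_status"], d["created"].date())
# ltv-model passing 2024-02-20
# churn-v1 passing 2024-01-15
# churn-v2 failing 2024-03-01
```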
Filter deployments¶
To filter the deployment inventory, select Filters at the top of the inventory page.
The filter menu opens, allowing you to select the criteria by which deployments are filtered.
Filter | Description |
---|---|
Ownership | Filters by deployment owner. Select Owned by me to display only those deployments for which you have the owner role. |
Activation status | Filters by deployment activation status. Active deployments are able to monitor and return new predictions. Inactive deployments can only show insights and statistics about past predictions. |
Fairness status | Filters by deployment fairness status. Choose to filter by passing, at risk, failing, unknown, and not started. |
Service status | Filters by deployment service health status. Choose to filter by passing, at risk, failing, unknown, and not started. If a deployment has never had service health enabled, then it will not be included when this filter is applied. |
Drift status | Filters by deployment data drift status. Choose to filter by passing, at risk, failing, unknown, and not started. If a deployment previously had data drift enabled and reported a status, then the last-reported status is used for filtering, even if you later disabled data drift for that deployment. If a deployment has never had drift enabled, then it will not be included when this filter is applied. |
Accuracy status | Filters by deployment accuracy status. Choose to filter by passing, at risk, failing, unknown, and not started. If a deployment does not have accuracy information available, it is excluded from results when you apply the filter. |
Predictions timeliness status | Filters by predictions timeliness status. Choose to filter by passing, failing, disabled, or not started. |
Actuals timeliness status | Filters by actuals timeliness status. Choose to filter by passing, failing, disabled, or not started. |
Importance | Filters by the criticality of deployments, based on prediction volume, exposure to regulatory requirements, and financial impact. Choices include Critical, High, Moderate, and Low. |
Build environment | Filters by the environment in which the model was built. |
Prediction environment platforms | Filters by the platform the prediction environment runs on. |
After selecting the desired filters, click Apply Filters to update the deployment inventory. The Filters link updates to indicate the number of filters applied. You are notified if no deployments match your filters. To remove your filters, click the Clear all X filters shortcut, or open the filter dialog again and remove them manually.
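The same kind of narrowing can be done programmatically once you have the inventory in hand. The sketch below filters hypothetical deployment records client-side; the field names are illustrative, not the API's schema.

```python
# Hypothetical deployment records for illustration.
deployments = [
    {"name": "churn-v2", "drift_status": "failing", "importance": "Critical"},
    {"name": "ltv-model", "drift_status": "passing", "importance": "Moderate"},
    {"name": "fraud-v3", "drift_status": "at risk", "importance": "Critical"},
]

# Keep critical deployments whose drift status needs attention.
needs_attention = [
    d for d in deployments
    if d["importance"] == "Critical" and d["drift_status"] in {"at risk", "failing"}
]
print([d["name"] for d in needs_attention])  # ['churn-v2', 'fraud-v3']
```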
Self-Managed AI Platform deployments with monitoring disabled¶
Availability information
This section is only applicable to the Self-Managed AI Platform. If you are a Self-Managed AI Platform administrator interested in enabling model monitoring for deployments by implementing the necessary hardware, contact DataRobot Support.
The use of DataRobot's monitoring functionality depends on having hardware with PostgreSQL and rsyslog installed. If you don't have these services, you will still be able to create, manage, and make predictions against deployments, but all monitoring-related functionality will be disabled automatically.
When Deployment Monitoring is disabled, the Deployments inventory is still accessible, but monitoring tools and statistics are disabled. A notification at the top of the page informs you of the monitoring status.
The actions menu on the Deployments inventory page still allows you to share or delete a deployment and replace a model.
When you select a deployment, you can still access the predictions code snippet from the Predictions tab.