DataRobot User Models
While DataRobot provides hundreds of built-in models, there are situations where you need preprocessing or modeling methods that are not supported out of the box. To create a custom inference model, you must provide a model artifact: either a custom.py file that defines the model in code, or a serialized model artifact whose file extension matches the language of the chosen environment, along with any additional custom code required to use the model.
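For example, a minimal custom.py for a regression model might look like the sketch below. The hook names (load_model, score) follow the conventions used by the DataRobot User Models (DRUM) tooling; the artifact name model.pkl and the pickle format are placeholder assumptions for illustration only.

```python
# custom.py -- illustrative sketch; model.pkl is a placeholder artifact name
import pickle
from pathlib import Path

import pandas as pd


def load_model(code_dir: str):
    """Load the serialized model artifact shipped alongside this file."""
    with open(Path(code_dir) / "model.pkl", "rb") as f:
        return pickle.load(f)


def score(data: pd.DataFrame, model, **kwargs) -> pd.DataFrame:
    """Return predictions; for regression, DRUM expects a 'Predictions' column."""
    return pd.DataFrame({"Predictions": model.predict(data)})
```

If the model is fully captured by a supported serialized artifact, custom hooks like these may be unnecessary; they are most useful when custom loading or preprocessing logic is required.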
Before adding custom models and environments to DataRobot, you must prepare and structure the files required to run them successfully. The tools and templates necessary to prepare custom models are hosted in the DataRobot User Models GitHub repository. (Log in to GitHub before clicking this link.) DataRobot recommends reviewing the following topics before preparing your custom model for upload to the Workshop.
| Topic | Describes |
|---|---|
| Custom model components | How to identify the components required to run custom inference models. |
| Assemble structured custom models | How to assemble and validate structured custom models compatible with DataRobot. |
| Assemble unstructured custom models | How to assemble and validate unstructured custom models compatible with DataRobot. |
| Define custom model metadata | How to use the model-metadata.yaml file to specify additional information about a custom inference model. |
| Define custom model runtime parameters | How to add runtime parameters to a custom model through the model metadata, making your custom model code easier to reuse. |
| DRUM CLI tool | How to download and install the DataRobot User Models (DRUM) CLI to work with and test custom models and custom environments locally before uploading to DataRobot. |
| Test a custom model locally | How to test custom inference models in your local environment using the DataRobot Model Runner (DRUM) tool (see the example command after this table). |
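
As an illustrative sketch, testing a custom model locally with the DRUM CLI might look like the following. The model directory, dataset name, and target type are placeholder assumptions, and the exact flags can vary by DRUM version.

```bash
# Install the DRUM CLI (distributed as the datarobot-drum package on PyPI)
pip install datarobot-drum

# Run batch scoring against a local CSV using the assembled model code in ./my_model;
# ./my_model and data.csv are placeholders for your own files.
drum score --code-dir ./my_model --input data.csv --output predictions.csv --target-type regression
```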