Define runtime parameters¶
Define environment variables to supply different values to custom model code at runtime by including them as runtime parameters, making your custom model easier to reuse. Define runtime parameters in code or through the UI:
- Code: Provide a `model-metadata.yaml` file in the model artifact. Define this file before uploading a model to Workshop, or use the template available in Workshop through a custom model's Files > Create dropdown. The YAML structure is defined on this page.
- UI: Define runtime parameters in the Runtime parameters section of the custom model in Workshop. For more information, see the Create custom models documentation.
Runtime parameters are injected into containers in two ways:
- As standard environment variables without prefixes or JSON parsing for simple types (so you can use `os.getenv` to access environment variables without the `datarobot-drum` library).
- For backward compatibility, also in the legacy prefixed (`MLOPS_RUNTIME_PARAM_*`) and JSONified format.
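As a sketch of these two formats, assuming a string parameter named `MY_PARAM` (the variable names and the legacy payload shape shown here are illustrative, not guaranteed):

```python
import json
import os

# Simulate the two injected forms of a string parameter named MY_PARAM.
# In a real container the platform sets these variables; the values and the
# legacy payload shape below are illustrative.
os.environ["MY_PARAM"] = "hello"
os.environ["MLOPS_RUNTIME_PARAM_MY_PARAM"] = json.dumps(
    {"type": "string", "payload": "hello"}
)

# Simple access, no datarobot-drum required:
print(os.getenv("MY_PARAM"))  # hello

# The legacy prefixed form requires JSON parsing:
legacy = json.loads(os.environ["MLOPS_RUNTIME_PARAM_MY_PARAM"])
print(legacy["payload"])  # hello
```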
Parameters created via the Workshop UI persist and merge when you upload new code versions, ensuring a seamless development flow.
Runtime parameter considerations
The system uses a blocklist of reserved patterns (e.g., `DRUM_*`, `MLOPS_*`, `KUBERNETES_*`), managed in the dynamic configuration. Matching supports the `*` wildcard (not full regular expression syntax). Reserved names aren't blocked entirely: if a runtime parameter uses a reserved name, the UI displays a warning. How the variable is exposed depends on the context:
- Custom models: Only in the prefixed (`MLOPS_RUNTIME_PARAM_*`) and JSONified format, not as a raw (unprefixed) environment variable, to prevent system conflicts.
- Custom apps: Prefixed, but values are not packed into a JSON payload (except for credentials).
- Custom jobs: No prefix and no JSON payload (except for credential types); the variable is available as a raw environment variable.
For credential-type runtime parameters, the system automatically unpacks JSON fields into separate environment variables rather than a single string. For example, a credential named `MAIN_AWS_CREDENTIAL` with the following JSON structure:

```json
{"awsAccessKeyId": "<your-key-id>", "awsSecretAccessKey": "<your-access-key>"}
```

is unpacked into the following environment variables, combining the parameter name and JSON key in uppercase:

```
MAIN_AWS_CREDENTIAL_AWS_ACCESS_KEY_ID="<your-key-id>"
MAIN_AWS_CREDENTIAL_AWS_SECRET_ACCESS_KEY="<your-access-key>"
```
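The naming rule can be sketched as a small helper (illustrative only; the platform performs this unpacking for you, and `unpack_credential` is not a real DataRobot API):

```python
import re


def unpack_credential(param_name, payload):
    """Mirror the platform's behavior: each camelCase JSON field becomes an
    UPPER_SNAKE_CASE environment variable prefixed with the parameter name."""
    env = {}
    for field, value in payload.items():
        # awsAccessKeyId -> AWS_ACCESS_KEY_ID
        snake = re.sub(r"(?<!^)(?=[A-Z])", "_", field).upper()
        env[f"{param_name}_{snake}"] = value
    return env


creds = {"awsAccessKeyId": "<your-key-id>", "awsSecretAccessKey": "<your-access-key>"}
print(unpack_credential("MAIN_AWS_CREDENTIAL", creds))
```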
For single-field credential types (for example, `api_token`, `bearer`, or `gcp`), the injected environment variable uses the bare runtime parameter name (`MY_CRED`), not the parameter name plus the credential field name (for example, not `MY_CRED_API_TOKEN`). Multi-field credential types (for example, `basic` or `s3`) keep the existing suffixed behavior: one variable per field, named `{PARAMETER_NAME}_{FIELD_NAME}` in uppercase snake case (for example, `MY_CRED_USERNAME` and `MY_CRED_PASSWORD`, or the AWS keys in the example above). JSON-encoded runtime parameter variables (for example, `MLOPS_RUNTIME_PARAMETERS_OPEN_AI_API`) are unchanged; only the flat variable for a single-field secret uses the bare parameter name.
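For example, with a hypothetical single-field parameter `MY_TOKEN` and a hypothetical multi-field parameter `MY_BASIC`, access looks like this (the values are simulated here; in a container the platform sets the variables for you):

```python
import os

# Simulated injected variables (illustrative; the platform sets these for you):
os.environ["MY_TOKEN"] = "tok-123"         # single-field type, e.g. api_token
os.environ["MY_BASIC_USERNAME"] = "user1"  # multi-field type, e.g. basic
os.environ["MY_BASIC_PASSWORD"] = "password1"

# A single-field secret uses the bare parameter name:
print(os.getenv("MY_TOKEN"))           # tok-123
# A multi-field secret gets one suffixed variable per field:
print(os.getenv("MY_BASIC_USERNAME"))  # user1
print(os.getenv("MY_BASIC_PASSWORD"))  # password1
```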
Access runtime parameters in containers
For programmatic access to runtime parameters in containers, use `DataRobotAppFrameworkBaseSettings` as documented in the SDK API reference.
To change runtime parameter values for an existing deployment, deactivate the deployment, update the values in the deployment's Settings > Resources tab, and then reactivate the deployment. For details, see Configure deployment resource settings.
Runtime parameter definitions¶
Add runtime parameters to a custom model through the model metadata, making your custom model code easier to reuse. To define runtime parameters, you can add the following `runtimeParameterDefinitions` keys in `model-metadata.yaml`:

| Key | Description |
|---|---|
| `fieldName` | Define the name of the runtime parameter. |
| `type` | Define the data type the runtime parameter contains: `string`, `boolean`, `numeric`, `credential`, or `deployment`. |
| `defaultValue` | (Optional) Set the default value for the runtime parameter. For `credential` type parameters, use `defaultValue` to reference an existing credential by its credential ID. For other types, set the default string, boolean, or numeric value. If you define a runtime parameter without specifying a `defaultValue`, the default value is `None`. |
| `minValue` | (Optional) For `numeric` runtime parameters, set the minimum numeric value allowed in the runtime parameter. |
| `maxValue` | (Optional) For `numeric` runtime parameters, set the maximum numeric value allowed in the runtime parameter. |
| `credentialType` | (Optional) For `credential` runtime parameters, set the type of credentials the parameter must contain. |
| `allowEmpty` | (Optional) Set the empty field policy for the runtime parameter. |
| `description` | (Optional) Provide a description of the purpose or contents of the runtime parameter. |
DataRobot reserved runtime parameters¶
The following runtime parameter is reserved by DataRobot for custom model configuration:

| Runtime parameter | Type | Description |
|---|---|---|
| `CUSTOM_MODEL_WORKERS` | numeric | Allows each replica to handle a set number of concurrent processes. This option is intended for process-safe custom models, primarily in generative AI use cases (for more information on process-safe models, see the note below). To determine the appropriate number of concurrent processes to allow per replica, monitor the number of requests and the median response time for the custom model. The median response time for the custom model should be close to the median response time from the LLM. If the response time of the custom model exceeds the LLM's response time, stop increasing the number of concurrent processes and instead increase the number of replicas. Default value: `1`. Maximum value: `40`. |
Custom model process safety
When enabling and configuring CUSTOM_MODEL_WORKERS, ensure that your model is process-safe, allowing multiple independent processes to safely interact with shared resources without causing conflicts. This configuration is not intended for general use with custom models to make them more resource efficient. Only process-safe custom models with I/O-bound tasks (like proxy models) benefit from utilizing CPU resources this way.
Define custom model metadata¶
Before you define `runtimeParameterDefinitions` in `model-metadata.yaml`, define the custom model metadata required for the target type. For binary and multiclass models, that includes an `inferenceModel` block (`targetName` and class labels; for multiclass, `classLabels`).

Binary:

```yaml
name: binary-example
targetType: binary
type: inference
inferenceModel:
  targetName: target
  positiveClassLabel: "1"
  negativeClassLabel: "0"
```

Regression:

```yaml
name: regression-example
targetType: regression
type: inference
```

Text generation:

```yaml
name: textgeneration-example
targetType: textgeneration
type: inference
```

Anomaly detection:

```yaml
name: anomaly-example
targetType: anomaly
type: inference
```

Unstructured:

```yaml
name: unstructured-example
targetType: unstructured
type: inference
```

Multiclass:

```yaml
name: multiclass-example
targetType: multiclass
type: inference
inferenceModel:
  targetName: class
  classLabels:
    - class_a
    - class_b
    - class_c
```
Then, below the model information, you can provide the `runtimeParameterDefinitions`:

```yaml
name: runtime-parameter-example
targetType: regression
type: inference
runtimeParameterDefinitions:
  - fieldName: my_first_runtime_parameter
    type: string
    description: My first runtime parameter.
  - fieldName: runtime_parameter_with_default_value
    type: string
    defaultValue: Default
    description: A string-type runtime parameter with a default value.
  - fieldName: runtime_parameter_boolean
    type: boolean
    defaultValue: true
    description: A boolean-type runtime parameter with a default value of true.
  - fieldName: runtime_parameter_numeric
    type: numeric
    defaultValue: 0
    minValue: -100
    maxValue: 100
    description: A numeric-type runtime parameter with a default value of 0, a minimum value of -100, and a maximum value of 100.
  - fieldName: runtime_parameter_for_credentials
    type: credential
    credentialType: basic
    allowEmpty: false
    description: A runtime parameter containing a dictionary of credentials; credentials must be provided before registering the custom model.
  - fieldName: runtime_parameter_for_connected_deployment
    type: deployment
    description: A runtime parameter defined to accept the deployment ID of another deployment to connect to the deployed custom model.
```
Provide credentials through runtime parameters¶
The `credential` runtime parameter type supports any `credentialType` value available in the DataRobot REST API. At runtime, credential payloads are also reflected as environment variables in the container; see the runtime parameter considerations for naming rules (single-field types such as `api_token`, `bearer`, and `gcp` use the bare parameter name; multi-field types such as `basic` and `s3` use suffixed names per field).
You can provide credentials in two ways:
- Reference existing credentials: Use the credential ID as the `defaultValue` to reference credentials defined in the DataRobot Credentials management section.
- Provide credential values directly: Include the full credential structure when defining the runtime parameter (typically used during local development with DRUM).
Credential types
For more information on the supported credential types, see the API reference documentation for credentials.
Reference existing credentials¶
To reference an existing credential, set the `defaultValue` to the credential ID:

```yaml
- fieldName: my_api_token
  type: credential
  credentialType: api_token
  allowEmpty: false
  defaultValue: <credential-id>
  description: A runtime parameter referencing an existing API token credential.
```
Credential requirements
When you reference an existing credential, the credential must exist in the credential management section before registering the custom model, must match the `credentialType` specified in the runtime parameter definition, and must match the credential ID used as the `defaultValue`.
Provide credential values directly¶
The credential information required depends on the `credentialType`, as shown in the examples below:

`basic`:

```yaml
basic:
  credentialType: basic
  description: string
  name: string
  password: string
  user: string
```

`azure`:

```yaml
azure:
  credentialType: azure
  description: string
  name: string
  azureConnectionString: string
```

`gcp`:

```yaml
gcp:
  credentialType: gcp
  description: string
  name: string
  gcpKey: string
```

`s3`:

```yaml
s3:
  credentialType: s3
  description: string
  name: string
  awsAccessKeyId: string
  awsSecretAccessKey: string
  awsSessionToken: string
```

`api_token`:

```yaml
api_token:
  credentialType: api_token
  apiToken: string
  name: string
```
Provide override values during local development¶
For local development with DRUM, you can specify a `.yaml` file containing the values of the runtime parameters. The values defined here override the `defaultValue` set in `model-metadata.yaml`:

```yaml
my_first_runtime_parameter: Hello, world.
runtime_parameter_with_default_value: Override the default value.
runtime_parameter_for_credentials:
  credentialType: basic
  name: credentials
  password: password1
  user: user1
```
When using DRUM, the `--runtime-params-file` option specifies the file containing the runtime parameter values:

```sh
drum score --runtime-params-file .runtime-parameters.yaml \
  --code-dir model_templates/python3_sklearn \
  --target-type regression \
  --input tests/testdata/juniors_3_year_stats_regression.csv
```
Import and use runtime parameters in custom code¶
To import and access runtime parameters, import the `RuntimeParameters` class from `datarobot_drum` in your `custom.py` code:

```python
from datarobot_drum import RuntimeParameters


def mask(value, visible=3):
    """Show the first `visible` characters of a secret and mask the rest."""
    return value[:visible] + ("*" * len(value[visible:]))


def transform(data, model):
    print("Loading the following Runtime Parameters:")
    parameter1 = RuntimeParameters.get("my_first_runtime_parameter")
    parameter2 = RuntimeParameters.get("runtime_parameter_with_default_value")
    print(f"\tParameter 1: {parameter1}")
    print(f"\tParameter 2: {parameter2}")

    credentials = RuntimeParameters.get("runtime_parameter_for_credentials")
    if credentials is not None:
        credential_type = credentials.pop("credentialType")
        print(
            f"\tCredentials (type={credential_type}): "
            + str({k: mask(v) for k, v in credentials.items()})
        )
    else:
        print("No credential data set")
    return data
```