Prepare DataRobot Helm chart values¶
This page describes how to prepare and customize the DataRobot Helm chart's values_dr.yaml file to define your specific installation parameters, including cluster resources, persistent storage configurations, networking details, and application-level settings.
Minimal values files¶
For platform-specific configuration examples, refer to the minimal values files located within the datarobot-prime/override Helm chart artifact directory.
```sh
tar xzf datarobot-prime-X.X.X.tgz
cd datarobot-prime/override
```
Note
Replace X.X.X with the latest release chart version.
You can use one of the following files as a reference when creating your values_dr.yaml file:

- override/minimal_datarobot-generic_s3_iam_user_values.yaml
- override/minimal_datarobot-generic-ext-pcs_minio_values.yaml
Variables¶
The following variables are used throughout the installation process. You will use these variables in the YAML templates and example commands found in the installation sections.
Kubernetes¶
DATAROBOT_NAMESPACE
: The primary namespace in which DataRobot will operate.
DR_APP_HELM_RELEASE_NAME
: The name of the Helm release used for the main application chart. The recommended value is dr.
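These two variables typically appear together when the main application chart is installed. A sketch only, assuming the chart tarball named earlier on this page and a values_dr.yaml in the current directory; the exact command for your release may differ:

```sh
# Install or upgrade the main DataRobot application chart into the target namespace.
helm upgrade --install "$DR_APP_HELM_RELEASE_NAME" datarobot-prime-X.X.X.tgz \
  --namespace "$DATAROBOT_NAMESPACE" --create-namespace \
  -f values_dr.yaml
```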
OCI Registry¶
DOCKER_REGISTRY_URL
: Fully qualified domain name (FQDN) or IP address of the Docker registry.
DOCKER_REGISTRY_USERNAME
: Username for authenticating to the Docker registry.
DOCKER_REGISTRY_PASSWORD
: Password for authenticating to the Docker registry.
DR_IMAGE_PULL_SECRET_NAME
: Name of the Kubernetes Secret object that will be used to store authorization information for the docker registry. Can be any name, but using datarobot-image-pull-secret is recommended.
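The pull secret can be created from the registry variables above. A sketch using the standard `kubectl create secret docker-registry` command; verify the result against your cluster:

```sh
# Create the image pull secret in the DataRobot namespace from the
# registry credentials defined above.
kubectl create secret docker-registry "$DR_IMAGE_PULL_SECRET_NAME" \
  --docker-server="$DOCKER_REGISTRY_URL" \
  --docker-username="$DOCKER_REGISTRY_USERNAME" \
  --docker-password="$DOCKER_REGISTRY_PASSWORD" \
  --namespace "$DATAROBOT_NAMESPACE"
```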
Object storage¶
DATAROBOT_S3_BUCKET
: An S3-compatible bucket used by DataRobot for object storage.
S3_HOST
: IP or hostname of the S3 appliance.
S3_REGION
: The region of the S3 service. Set this variable to explicitly specify the region you run in, or if you are using a storage provider with an S3-compatible API.
S3_IS_SECURE
: Whether the service is using HTTPS.
FILE_STORAGE_PREFIX
: Represents the prefix applied to all paths in the file storage medium after the root path.
```yaml
filestore:
  type: s3
  environment:
    S3_HOST: S3_HOST
    S3_BUCKET: DATAROBOT_S3_BUCKET
    S3_IS_SECURE: "True"
    S3_VALIDATE_CERTS: "True"
    S3_REGION: S3_REGION
    S3_PORT: "443"
    S3_SERVER_SIDE_ENCRYPTION: DISABLED
```
Note

If you encounter upload issues, you can disable multipart file uploads by setting MULTI_PART_S3_UPLOAD: false. In general, multipart uploads are well tested and support much larger file uploads, so you are unlikely to need to change the default.
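If you do need to disable multipart uploads, the setting goes in the same filestore environment map shown earlier on this page. A sketch only, assuming that values_dr.yaml layout:

```yaml
filestore:
  environment:
    # Disable multipart uploads; only needed if uploads fail with the default.
    MULTI_PART_S3_UPLOAD: "false"
```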
Disabling TLS verification¶
If your environment is configured to use self-signed or unverified TLS certificates for its internal object storage connection, the following configuration options are necessary:
```yaml
global:
  filestore:
    type: s3
    environment:
      S3_VALIDATE_CERTS: false
```
Web portal¶
DR_WEBPAGE_FQDN
: The Fully Qualified Domain Name (FQDN) of the web portal where users will log in (e.g., datarobot-app.company-name.com).
ADMIN_USER_EMAIL
: The email address for the initial administrative user in the web portal (e.g., admin@datarobot.com).
ADMIN_USER_PASSWORD
: The password for the initial administrative user.
DR_LICENSE_CONTENT
: The encrypted content of the DataRobot license file.
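As an illustration, these variables can be exported in a shell session before running templating or install commands. All values below are placeholders taken from the descriptions above, not defaults:

```shell
# Illustrative placeholder values for the web portal variables.
export DR_WEBPAGE_FQDN="datarobot-app.company-name.com"
export ADMIN_USER_EMAIL="admin@datarobot.com"
export ADMIN_USER_PASSWORD="choose-a-strong-password"
# DR_LICENSE_CONTENT holds the encrypted license file content, e.g.:
# export DR_LICENSE_CONTENT="$(cat /path/to/license.lic)"  # hypothetical path
```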
Feature flags¶
The DataRobot platform utilizes feature flags as an administrative mechanism to control the availability of various functionalities and newly-released features.
Feature enablement requires assistance. Contact your DataRobot Representative or submit a request through the DataRobot Support Portal or by emailing support@datarobot.com.
```yaml
core:
  config_env_vars:
```
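As a sketch only: feature flags are set as environment variables under core.config_env_vars in values_dr.yaml. The flag name below is hypothetical and for illustration only; obtain the actual flag names for your release from DataRobot Support:

```yaml
core:
  config_env_vars:
    ENABLE_EXAMPLE_FEATURE: "true"  # hypothetical flag name, for illustration only
```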
TLS Configuration¶
DataRobot provides specific configuration options to encrypt data-in-flight for different parts of the platform. See:
- Configuring TLS for ingress for instances where TLS is terminated at the load balancer or ingress controller.
- Configuring application-level TLS for service-to-service traffic within the platform.
- Configuring TLS for object storage for S3-compatible providers.
Configure StorageClass¶
DataRobot is deployed using the default StorageClass configured for your cluster. To specify a non-default StorageClass name globally for all persistent volumes requested by the main DataRobot platform chart, adjust the following setting in the chart's values_dr.yaml file:
```yaml
global:
  storageClassName: DESIGNATED_STORAGE_CLASS_NAME
```
Note
Replace DESIGNATED_STORAGE_CLASS_NAME with the name of your StorageClass.
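To find which StorageClass to designate, you can list the classes available in your cluster; the cluster default is marked "(default)" in the output:

```sh
# List available StorageClasses; use one of these names in values_dr.yaml.
kubectl get storageclass
```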
Persistent Critical Services (PCS)¶
This guide helps you customize the relevant PCS component for:
- MongoDB
- PostgreSQL
- RabbitMQ
- Redis
- Elasticsearch
Review notebook configuration¶
DataRobot updated the naming for notebook chart values—this change may impact your process. For detailed instructions, see the Notebooks upgrade guide.
Generative AI service¶
When you install the generative AI (GenAI) service in a restricted network environment, you must disable two migrations and run them manually. For instructions, see the following pages:
Tile server¶
For information on updating the tile server in a restricted network, see the following page:
Custom tasks¶
For information on configuring custom tasks in a restricted network, see the following pages: