


Public preview

The native S3 connector, which introduces performance improvements and additional feature support, is on by default.

Feature flag: Enable S3 Connector

Supported authentication

  • AWS credentials: Long-term credentials using aws_access_key_id and aws_secret_access_key.
  • AWS temporary credentials: Short-term credentials using aws_access_key_id, aws_secret_access_key, and aws_session_token.
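As a sketch, the two supported credential shapes can be represented as parameter sets keyed by the AWS field names listed above. The helper below is hypothetical (not part of DataRobot or AWS tooling) and only illustrates which fields distinguish long-term from temporary credentials:

```python
# Hypothetical helper: classify a credential dict by the AWS fields it carries.
# Values below are placeholders, not real keys.

LONG_TERM_KEYS = {"aws_access_key_id", "aws_secret_access_key"}
TEMPORARY_KEYS = LONG_TERM_KEYS | {"aws_session_token"}

def credential_type(params: dict) -> str:
    """Return 'long-term', 'temporary', or 'invalid' for a credential dict."""
    keys = set(params)
    if keys == TEMPORARY_KEYS:
        return "temporary"
    if keys == LONG_TERM_KEYS:
        return "long-term"
    return "invalid"

# Long-term credentials: access key ID plus secret access key.
long_term = {
    "aws_access_key_id": "AKIA...",       # placeholder
    "aws_secret_access_key": "wJalr...",  # placeholder
}

# Temporary credentials add a session token to the long-term pair.
temporary = {**long_term, "aws_session_token": "FwoGZX..."}  # placeholder

print(credential_type(long_term))   # long-term
print(credential_type(temporary))   # temporary
```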


The following is required before connecting to AWS S3 in DataRobot:

  • AWS S3 account

Self-Managed AI Platform installations

For Self-Managed AI Platform installations, an admin must add the native S3 connector.

Required parameters

The table below lists the minimum required fields to establish a connection with AWS S3:

| Required field | Description | Documentation |
| --- | --- | --- |
| bucketName | A container that stores your data in AWS S3. | AWS documentation |

You can optionally specify bucketRegion under Show advanced options; this parameter is not required.
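The required and optional parameters above can be sketched as a small builder that assembles the connection parameter set, including bucketRegion only when it is supplied. The function name is a hypothetical illustration, not a DataRobot API:

```python
from typing import Optional

def build_s3_connection_params(
    bucket_name: str, bucket_region: Optional[str] = None
) -> dict:
    """Assemble S3 connector parameters.

    bucketName is the only required field; bucketRegion is an
    advanced option and is omitted when left unset.
    """
    params = {"bucketName": bucket_name}
    if bucket_region is not None:
        params["bucketRegion"] = bucket_region
    return params

print(build_s3_connection_params("my-data-bucket"))
# {'bucketName': 'my-data-bucket'}
print(build_s3_connection_params("my-data-bucket", "us-east-1"))
# {'bucketName': 'my-data-bucket', 'bucketRegion': 'us-east-1'}
```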

Feature considerations

  • This connector does not support:

    • Feature Discovery projects
    • Data wrangling in Workbench
    • Adding dynamic datasets to a Workbench Use Case
  • This connector does support:

    • Creating and sharing AWS credentials using secure configurations.
    • Parquet file ingest, including single Parquet files, partitioned Parquet files in a folder, and zipped Parquet files.
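For illustration, the three supported Parquet layouts can be distinguished by path shape. The classifier below is a hypothetical sketch (not connector logic) that maps a path to one of the layouts named above:

```python
from pathlib import Path

def parquet_layout(path: str) -> str:
    """Hypothetical sketch: map a path to one of the supported Parquet layouts."""
    p = Path(path)
    if p.suffix == ".parquet":
        return "single parquet file"
    if p.suffix == ".zip":
        return "zipped parquet"
    if p.suffix == "":
        # A bare folder, e.g. one holding partitioned part files.
        return "partitioned parquet folder"
    return "unsupported"

print(parquet_layout("sales.parquet"))  # single parquet file
print(parquet_layout("sales/"))         # partitioned parquet folder
print(parquet_layout("sales.zip"))      # zipped parquet
```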

Updated February 1, 2024