# Snowflake integration

> Snowflake integration - How to set up an integration between DataRobot and Snowflake that allows
> joint users to both execute data science projects in DataRobot and perform computations in
> Snowflake.

This Markdown file sits beside the HTML page at the same path (with a `.md` suffix). It summarizes the topic and lists links for tools and LLM context.

Companion generated at `2026-04-24T16:03:56.542342+00:00` (UTC).

## Primary page

- [Snowflake integration](https://docs.datarobot.com/en/docs/classic-ui/data/transform-data/feature-discovery/fd-snowflake.html): Full documentation for this topic (HTML).

## Related documentation

- [Classic UI documentation](https://docs.datarobot.com/en/docs/classic-ui/index.html): Linked from this page.
- [Data](https://docs.datarobot.com/en/docs/classic-ui/data/index.html): Linked from this page.
- [Transform data](https://docs.datarobot.com/en/docs/classic-ui/data/transform-data/index.html): Linked from this page.
- [Feature Discovery](https://docs.datarobot.com/en/docs/classic-ui/data/transform-data/feature-discovery/index.html): Linked from this page.
- [data connection](https://docs.datarobot.com/en/docs/classic-ui/data/connect-data/data-conn.html#dataconn-add): Linked from this page.
- [dynamic datasets](https://docs.datarobot.com/en/docs/classic-ui/data/ai-catalog/catalog-asset.html#asset-states): Linked from this page.

## Documentation content

# Snowflake integration

An integration between DataRobot and Snowflake allows joint users to both execute data science projects in DataRobot and perform computations in Snowflake, optimizing workload performance. Feature Discovery training and prediction workflows push down relational inner joins, projections, and filter operations to the Snowflake platform (via SQL). By performing joins natively in the Snowflake database, data is filtered into smaller datasets before transfer across the network and loading into DataRobot. The smaller datasets reduce project runtimes.
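The push-down described above amounts to generating a single SQL statement that Snowflake executes before any data leaves the warehouse. A minimal sketch of that idea in Python, where the table names, join key, and filter are hypothetical examples rather than anything DataRobot emits verbatim:

```python
def build_pushdown_sql(primary, secondary, join_key, columns, filter_clause=None):
    """Build the inner-join/projection/filter SQL that would run inside
    Snowflake, so only the reduced result set crosses the network."""
    projection = ", ".join(columns)
    sql = (
        f"SELECT {projection} "
        f"FROM {primary} p "
        f"INNER JOIN {secondary} s ON p.{join_key} = s.{join_key}"
    )
    if filter_clause:
        # The filter is applied in-database, shrinking the transferred dataset.
        sql += f" WHERE {filter_clause}"
    return sql

# Hypothetical tables: a primary dataset joined to a secondary feature table.
query = build_pushdown_sql(
    primary="LOANS",
    secondary="TRANSACTIONS",
    join_key="CUSTOMER_ID",
    columns=["p.CUSTOMER_ID", "s.AMOUNT"],
    filter_clause="s.AMOUNT > 0",
)
print(query)
```

Because the join and filter run inside Snowflake, only the already-reduced rows are transferred to DataRobot, which is what shortens project runtimes.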

To enable integration with Snowflake, the following requirements must be met:

- A Snowflake data connection is set up.
- All secondary datasets are stored in Snowflake.
- All Snowflake sources are stored in the same warehouse.
- All datasets are configured as dynamic datasets in the AI Catalog.
- You have write permissions to one of the schemas in use, or to the PUBLIC schema of the database in use.
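The checklist above can be expressed as a simple programmatic gate. A minimal sketch over hypothetical dataset metadata; the dictionary fields here are illustrative and are not the DataRobot API:

```python
def snowflake_integration_ready(datasets, user_can_write_schema):
    """Return True if the requirement checklist is satisfied:
    every dataset is a dynamic Snowflake source, all sources share
    one warehouse, and the user can write to a usable schema.
    `datasets` is a list of dicts with hypothetical metadata fields."""
    if not user_can_write_schema:
        return False
    warehouses = set()
    for ds in datasets:
        if ds["source"] != "snowflake" or not ds["is_dynamic"]:
            return False
        warehouses.add(ds["warehouse"])
    # All Snowflake sources must live in the same warehouse.
    return len(warehouses) == 1

# Two dynamic Snowflake datasets in the same (hypothetical) warehouse.
datasets = [
    {"source": "snowflake", "is_dynamic": True, "warehouse": "ANALYTICS_WH"},
    {"source": "snowflake", "is_dynamic": True, "warehouse": "ANALYTICS_WH"},
]
print(snowflake_integration_ready(datasets, user_can_write_schema=True))  # → True
```

If any single requirement fails (a non-Snowflake source, a static dataset, a second warehouse, or missing write permission), the integration is not established.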

If these requirements are met, DataRobot automatically establishes the integration and displays the Snowflake icon, with "Snowflake mode enabled" shown in blue, at the top of the Define Relationships page.
