# AI integrations and platforms

> Integrations with AI platforms and services to enhance your DataRobot experience.

This Markdown file sits beside the HTML page at the same path (with a `.md` suffix). It summarizes the topic and provides links intended for tooling and LLM context.

Companion generated at `2026-05-06T18:17:09.572992+00:00` (UTC).

## Primary page

- [AI integrations and platforms](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/index.html): Full documentation for this topic (HTML).

## Related documentation

- [Developer documentation](https://docs.datarobot.com/en/docs/api/index.html): Linked from this page.
- [Developer learning](https://docs.datarobot.com/en/docs/api/dev-learning/index.html): Linked from this page.
- [AI accelerators](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/index.html): Linked from this page.
- [AWS SageMaker deployment](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/deploy-sagemaker.html): Linked from this page.
- [Feature Discovery SQL with Spark](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/fd-sql-spark.html): Linked from this page.
- [GraphQL integration](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/graphql.html): Linked from this page.
- [Amazon Athena workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-athena.html): Linked from this page.
- [AWS workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-aws.html): Linked from this page.
- [Azure workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-azure.html): Linked from this page.
- [Databricks workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-databricks.html): Linked from this page.
- [Google Cloud and BigQuery workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-gcp.html): Linked from this page.
- [SageMaker workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-sagemaker.html): Linked from this page.
- [Snowflake workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-snowflake.html): Linked from this page.
- [Performance degradation prediction](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/perform-degrade.html): Linked from this page.
- [Snowpark integration](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/snowpark-data.html): Linked from this page.
- [SAP HANA workflow](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/ml-sap.html): Linked from this page.
- [Speech recognition integration](https://docs.datarobot.com/en/docs/api/dev-learning/accelerators/ai-integrations-platforms/speech-rec.html): Linked from this page.

## Documentation content

| Topic | Description |
| --- | --- |
| AWS SageMaker deployment | Learn how to programmatically build a model with DataRobot and export and host the model in AWS SageMaker. |
| Feature Discovery SQL with Spark | Run Feature Discovery SQL on a Spark cluster in Docker: set up the cluster, register custom user-defined functions (UDFs), and execute complex SQL queries across multiple datasets. |
| GraphQL integration | Connect a GraphQL server to the DataRobot OpenAPI specification using GraphQL Mesh. |
| Amazon Athena workflow | Read in an Amazon Athena table to create a project and deploy a model to make predictions with a test dataset. |
| AWS workflow | Work with AWS and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
| Azure workflow | Work with Azure and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
| Databricks workflow | Build models in DataRobot with data acquired and prepared in a Spark-backed notebook environment provided by Databricks. |
| Google Cloud and BigQuery workflow | Use Google Colaboratory to source data from BigQuery, build and evaluate a model using DataRobot, and deploy predictions from that model back into BigQuery and GCP. |
| SageMaker workflow | Take an ML model that has been built with DataRobot and deploy it to run within AWS SageMaker. |
| Snowflake workflow | Work with Snowflake and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
| Performance degradation prediction | Use a predictive framework for managing and maintaining your machine learning models with DataRobot MLOps. |
| Snowpark integration | Leverage Snowflake for data storage and Snowpark for deployment, feature engineering, and model scoring with DataRobot. |
| SAP HANA workflow | Learn how to programmatically build a model with DataRobot using SAP HANA as the data source. |
| Speech recognition integration | Use Whisper to transcribe audio files, process them efficiently, and store the transcriptions in a structured format for further analysis or use. |
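The performance degradation accelerator's full framework lives in the linked notebook. As a generic illustration of one common ingredient of degradation monitoring — detecting drift between a model's training data and incoming scoring data — here is a from-scratch sketch of the Population Stability Index (PSI). The function name, equal-width binning scheme, and thresholds are illustrative assumptions, not DataRobot's implementation.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a baseline sample and a new sample.

    PSI near 0 means the distributions match; values above roughly 0.2 are
    commonly treated as a sign of significant drift.
    """
    lo, hi = min(expected), max(expected)
    # equal-width bin edges derived from the baseline sample's range
    edges = [lo + (hi - lo) * i / bins for i in range(1, bins)]

    def fractions(values):
        counts = [0] * bins
        for v in values:
            # bin index = number of edges strictly below v
            idx = sum(1 for e in edges if v > e)
            counts[idx] += 1
        total = len(values)
        # floor each fraction so the log term below stays defined
        return [max(c / total, 1e-6) for c in counts]

    e, a = fractions(expected), fractions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

For example, scoring a sample against itself yields a PSI of 0, while a sample shifted well outside the baseline's bins yields a large PSI, which a monitoring job could compare against an alerting threshold.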
