Ecosystem integration templates
Topic | Description |
---|---|
End-to-end ML workflow with Databricks | Build models in DataRobot with data acquired and prepared in a Spark-backed notebook environment provided by Databricks. |
End-to-end ML workflow with Google Cloud Platform and BigQuery | Use Google Colaboratory to source data from BigQuery, build and evaluate a model using DataRobot, and deploy predictions from that model back into BigQuery and GCP. |
End-to-end ML workflow with Snowflake | Work with Snowflake and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
End-to-end ML workflow with AWS | Work with AWS and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
End-to-end ML workflow with Azure | Work with Azure and DataRobot's Python client to import data, build and evaluate models, and deploy a model into production to make new predictions. |
Monitor AWS SageMaker models with MLOps | Train and host a SageMaker model that can be monitored in the DataRobot platform. |
Integrate DataRobot and Snowpark to maximize the data cloud | Leverage Snowflake for data storage and Snowpark for deployment, feature engineering, and model scoring with DataRobot. |
End-to-end workflow with SAP HANA | Learn how to programmatically build a model with DataRobot using SAP HANA as the data source. |
Deploy Scoring Code as a microservice | Follow a step-by-step procedure to embed Scoring Code in a microservice and package it as a Docker container for deployment on customer infrastructure (self-managed or hyperscaler-managed Kubernetes). |
End-to-end demand forecasting workflow with DataRobot and Databricks | Use DataRobot with Databricks to develop, evaluate, and deploy a multi-series demand forecasting model. |
Create and deploy a custom model | Create, deploy, and monitor a custom inference model with DataRobot's Python client. Use the Custom Model Workshop to upload a model artifact, then create, test, and deploy custom inference models to DataRobot's centralized deployment hub. |
Integrate GraphQL with DataRobot | Connect a GraphQL server to the DataRobot OpenAPI specification using GraphQL Mesh. |
End-to-end ML workflow with Athena | Read in an Amazon Athena table to create a project and deploy a model to make predictions with a test dataset. |
End-to-end ML workflow with SageMaker | Take an ML model built with DataRobot and deploy it to run within AWS SageMaker. |
Updated January 30, 2025