| Topic | Description |
|-------|-------------|
| Create and deploy a custom model | How to create, deploy, and monitor a custom inference model with DataRobot's Python client. Use the Custom Model Workshop to upload a model artifact, then create, test, and deploy custom inference models to DataRobot's centralized deployment hub. A minimal Python client sketch follows the table. |
| Custom blueprints with Composable ML | Customize models on the Leaderboard using the Blueprint Workshop. |
| GraphSAGE custom transformer | Convert a tabular dataset into a graph representation, train a GraphSAGE-based neural network, and package the solution as a DataRobot custom transformer. |
| Google Gemini integration | Implement a Streamlit application based on the Google Gemini LLM and host it on the DataRobot platform with Vertex AI integration (see the Streamlit sketch after the table). |
| GIN financial fraud detection | Integrate a Graph Isomorphism Network (GIN) as a custom model task in DataRobot using DRUM (see the DRUM hook sketch after the table). |
| Llama 2 on GCP | Host Llama 2 on Google Cloud Platform with cost comparisons, infrastructure details, and integration with DataRobot's custom model framework. |
| LLM custom inference template | Deploy and accelerate your own LLM with the LLM custom inference model template, along with "batteries-included" LLMs from providers such as Azure OpenAI, Google, and AWS. |
| Mistral 7B on GCP | Host Mistral 7B on Google Cloud Platform with infrastructure setup, cost considerations, and DataRobot integration for monitoring and deployment. |
| Reinforcement learning | Implement a model based on the Q-learning algorithm (see the Q-learning sketch after the table). |
| Scoring Code microservice | Follow a step-by-step procedure to embed Scoring Code in a microservice and package it as a Docker container for deployment on customer infrastructure, whether self-managed or hyperscaler-managed Kubernetes (see the microservice sketch after the table). |
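The first entry's workflow can be driven end to end from the Python client. Below is a minimal sketch, assuming the `datarobot` package's `CustomInferenceModel`, `CustomModelVersion`, and `Deployment` classes; the endpoint, token, target, and folder path are placeholders, and the accelerator itself also covers testing and monitoring.

```python
import datarobot as dr

# Connect to DataRobot (endpoint and token are placeholders).
dr.Client(endpoint="https://app.datarobot.com/api/v2", token="YOUR_API_TOKEN")

# Register a model shell in the Custom Model Workshop.
custom_model = dr.CustomInferenceModel.create(
    name="Custom regression model",
    target_type=dr.TARGET_TYPE.REGRESSION,
    target_name="target",  # placeholder target column
)

# Pick a built-in drop-in execution environment.
environment = dr.ExecutionEnvironment.list(search_for="Python")[0]

# Upload the local model artifact and custom.py as a new model version.
model_version = dr.CustomModelVersion.create_clean(
    custom_model_id=custom_model.id,
    base_environment_id=environment.id,
    folder_path="model_folder/",  # placeholder path to the artifact
)

# Deploy the version to a prediction server for serving and monitoring.
prediction_server = dr.PredictionServer.list()[0]
deployment = dr.Deployment.create_from_custom_model_version(
    custom_model_version_id=model_version.id,
    label="Custom model deployment",
    default_prediction_server_id=prediction_server.id,
)
print(deployment.id)
```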
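For the Google Gemini entry, a minimal sketch of the Streamlit front end is shown below, assuming the `vertexai` SDK's `GenerativeModel` class; the GCP project, region, and model name are placeholders, and DataRobot hosting and authentication are handled separately in the accelerator.

```python
import streamlit as st
import vertexai
from vertexai.generative_models import GenerativeModel

# Placeholders: GCP project, region, and Gemini model name.
vertexai.init(project="my-gcp-project", location="us-central1")
model = GenerativeModel("gemini-1.0-pro")

st.title("Gemini chat demo")
prompt = st.text_input("Ask a question")
if prompt:
    # Send the prompt to Gemini through Vertex AI and render the reply.
    response = model.generate_content(prompt)
    st.write(response.text)
```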
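For the GIN fraud detection entry, DRUM discovers the model through hooks defined in a `custom.py` file. The skeleton below sketches that structure for a binary target; the serialized GIN artifact and the `graph_utils.build_graph` helper are hypothetical placeholders, not the accelerator's actual code.

```python
# custom.py -- DRUM hook skeleton (illustrative only)
import os

import pandas as pd
import torch


def load_model(code_dir):
    """Called once by DRUM to load the serialized GIN model from the model directory."""
    model = torch.load(os.path.join(code_dir, "gin_fraud_model.pt"))
    model.eval()
    return model


def score(data: pd.DataFrame, model, **kwargs) -> pd.DataFrame:
    """Called by DRUM for each prediction request.

    `build_graph` is a hypothetical helper that turns tabular rows into the
    node/edge tensors the GIN expects; for a binary target, DRUM expects one
    probability column per class label.
    """
    from graph_utils import build_graph  # hypothetical helper module

    with torch.no_grad():
        fraud_proba = model(build_graph(data)).sigmoid().numpy().ravel()
    return pd.DataFrame({"1": fraud_proba, "0": 1.0 - fraud_proba})
```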
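For the reinforcement learning entry, Q-learning repeatedly applies the tabular update Q(s, a) ← Q(s, a) + α (r + γ max Q(s′, ·) − Q(s, a)). The toy chain environment below is purely illustrative and not the accelerator's environment.

```python
import numpy as np

# Toy 5-state chain: action 1 moves right, action 0 moves left,
# and the only reward is 1 for reaching the rightmost state.
n_states, n_actions = 5, 2
alpha, gamma, epsilon, episodes = 0.1, 0.9, 0.1, 500
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

for _ in range(episodes):
    state = 0
    while state != n_states - 1:
        if rng.random() < epsilon:
            action = int(rng.integers(n_actions))  # explore
        else:
            best = np.flatnonzero(Q[state] == Q[state].max())
            action = int(rng.choice(best))  # exploit, breaking ties at random
        next_state = min(state + 1, n_states - 1) if action == 1 else max(state - 1, 0)
        reward = 1.0 if next_state == n_states - 1 else 0.0
        # Q-learning update rule
        Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
        state = next_state

print(Q)  # the learned values should favor action 1 (move right) in every state
```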
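For the Scoring Code microservice entry, one way to embed an exported Scoring Code JAR in a Python service is sketched below, assuming the `datarobot-predict` package's `ScoringCodeModel` wrapper; the route, port, and `model.jar` path are placeholders. The service, a Java runtime, and the JAR can then be packaged into a Docker image for self-managed or hyperscaler-managed Kubernetes.

```python
# app.py -- minimal sketch of a Scoring Code microservice (illustrative only)
import io

import pandas as pd
from flask import Flask, request
from datarobot_predict.scoring_code import ScoringCodeModel  # assumed wrapper

app = Flask(__name__)
model = ScoringCodeModel("model.jar")  # placeholder path to the exported JAR


@app.route("/predict", methods=["POST"])
def predict():
    # Accept CSV rows in the request body and return predictions as JSON records.
    features = pd.read_csv(io.BytesIO(request.data))
    predictions = model.predict(features)
    return app.response_class(
        predictions.to_json(orient="records"), mimetype="application/json"
    )


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```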