Backup MongoDB¶
This operation can be executed from any macOS or GNU/Linux machine that has enough space to store the backup.
Warning
If the DataRobot application is configured to use managed services (external PCS), refer to the Back Up Your Database Deployment guide for MongoDB Atlas instead of this guide.
Prerequisites¶
- The mongodump utility (version 100.6.0) is installed on the host where the backup is created.
- The kubectl utility (version 1.23) is installed on the host where the backup is created.
- kubectl is configured to access the Kubernetes cluster where the DataRobot application is running. Verify this with the kubectl cluster-info command.
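You can check the installed versions against these minimums from the shell. The `version_ge` helper below is an illustration introduced here (not part of any DataRobot tooling) and assumes GNU `sort` with `-V` support:

```shell
# Succeeds (exit 0) when dotted version string $1 is >= $2, using GNU sort -V.
version_ge() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

# mongodump --version prints a line such as "mongodump version: 100.6.0".
installed=$(mongodump --version | awk '/^mongodump version:/ {print $3}')
if version_ge "$installed" "100.6.0"; then
  echo "mongodump $installed meets the 100.6.0 requirement"
else
  echo "mongodump $installed is too old" >&2
fi
```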
Considerations¶
As the database grows, the run time of mongodump grows with it; for large databases a backup can take days, which is often impractical. For this reason, we recommend using managed services (external PCS).
Create backup¶
We recommend using managed services (external PCS) and scheduling simultaneous backups of the managed PostgreSQL, Redis, and MongoDB instances.
If you are using the pcs-ha charts, you can use the steps below to create a backup.
Export the name of the DataRobot application's Kubernetes namespace in the DR_CORE_NAMESPACE variable:
export DR_CORE_NAMESPACE=<namespace>
Define where the backups are stored on the host where the backup is created. This example uses ~/datarobot-backups/, but you can choose a different location:
export BACKUP_LOCATION=~/datarobot-backups/
mkdir -p ${BACKUP_LOCATION}/mongodb
The backup process requires you to forward a local port to the remote MongoDB service. Define which local port you use. This example uses port 27018, but you can use another:
export LOCAL_MONGO_PORT=27018
Obtain the MongoDB root user password:
export PCS_MONGO_PASSWD=$(kubectl -n $DR_CORE_NAMESPACE get secret pcs-mongo -o jsonpath="{.data.mongodb-root-password}" | base64 -d)
echo ${PCS_MONGO_PASSWD}
Forward the local port to the remote MongoDB service deployed in the Kubernetes cluster:
kubectl -n $DR_CORE_NAMESPACE port-forward svc/pcs-mongo-headless --address 127.0.0.1 $LOCAL_MONGO_PORT:27017 &
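kubectl port-forward can take a moment to establish the tunnel, so mongodump may fail with a connection error if run immediately. One way to wait until the forwarded port accepts connections is a small bash helper (a sketch introduced here, not part of the product; it relies on bash's /dev/tcp pseudo-device):

```shell
# Wait until 127.0.0.1:$1 accepts TCP connections, retrying up to $2 times
# (default 10), one second apart. Relies on bash's /dev/tcp pseudo-device.
wait_for_port() {
  local port=$1 retries=${2:-10}
  while [ "$retries" -gt 0 ]; do
    if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
      return 0
    fi
    retries=$((retries - 1))
    sleep 1
  done
  return 1
}

wait_for_port "$LOCAL_MONGO_PORT" && echo "port-forward is ready"
```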
Back up the MongoDB database:
mkdir -p $BACKUP_LOCATION/mongodb
mongodump -vv -u pcs-mongodb -p $PCS_MONGO_PASSWD -h 127.0.0.1 --port $LOCAL_MONGO_PORT -o $BACKUP_LOCATION/mongodb
After the backup completes, find the process ID of the port-forwarding process:
ps aux | grep -E "port-forwar[d].*$LOCAL_MONGO_PORT"
Then stop it:
kill <pid_of_the_kubectl_port-forward>
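Alternatively, the two steps above can be combined into one command with pkill -f (assuming procps pkill is available). Like the grep pattern above, it matches against the full command line, and pkill never reports itself as a match, so no bracket trick is needed:

```shell
# Find and stop the kubectl port-forward started earlier by matching its
# command line. pkill -f applies the pattern to the full argument list.
pkill -f "port-forward.*${LOCAL_MONGO_PORT}"
```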
Create a tar archive of the backed-up database files and delete the backup files after they are archived:
cd $BACKUP_LOCATION
tar -cf datarobot-mongo-backup-$(date +%F).tar -C ${BACKUP_LOCATION} mongodb --remove-files
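Because --remove-files deletes the raw dump after archiving, it is worth sanity-checking the archive before relying on it. The check below uses the BACKUP_LOCATION and archive name defined in the steps above:

```shell
# List the archive's contents and confirm the mongodb/ directory is present.
cd "${BACKUP_LOCATION}"
archive="datarobot-mongo-backup-$(date +%F).tar"
tar -tf "$archive" | grep -q '^mongodb/' && echo "archive looks good: $archive"
```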
Critical collections for feature-specific backup¶
The mongodump command backs up all MongoDB collections by default. However, for specific DataRobot features, ensure the following collections are included in your backup:
Custom Applications¶
- custom_applications - Application metadata and configuration
- longrunningservices - Kubernetes deployment configurations (critical for functional restoration)
- custom_application_images - Application Source metadata
- custom_application_image_versions - Source version information
- execute_docker_images - Docker image metadata
- workspace_items - File storage references
Note: The longrunningservices collection is shared across multiple DataRobot features (Custom Applications, Custom Models deployments, etc.) and is critical for restoring running workloads. Without this collection, applications and deployments will appear in the UI but will not be functional.
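Before archiving the dump (the tar step above deletes the raw files), you can confirm these collections were captured. mongodump writes one `<database>/<collection>.bson` file per collection; the database name is deployment-specific, so the sketch below matches any database under the dump directory:

```shell
# Report which of the Custom Applications collections are present in the dump.
for coll in custom_applications longrunningservices custom_application_images \
            custom_application_image_versions execute_docker_images workspace_items; do
  if find "${BACKUP_LOCATION}/mongodb" -name "${coll}.bson" | grep -q .; then
    echo "found:   $coll"
  else
    echo "MISSING: $coll" >&2
  fi
done
```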