Using Azure Redis with DataRobot¶
Introduction¶
DataRobot supports the integration of external caching solutions, including Azure Redis (version 6.0), to optimize performance and enhance data retrieval. This documentation provides step-by-step instructions on configuring DataRobot to utilize Azure Redis as a caching layer.
Steps to Configure Azure Redis (version 6.0) with DataRobot¶
1. Create an Azure Redis Cache¶
- Log in to the Azure Portal.
- Navigate to the Azure Cache for Redis service.
- Click "Add" and select "Redis Cache" as the type.
- Choose "Engine Version 6" as the version.
- Follow the on-screen instructions to configure your Azure Redis Cache, specifying details such as cache name, resource group, pricing tier, and SKU.
- Set your cache name.
- Choose an appropriate cache size based on your requirements.
- Enterprise Authentication: ensure that "Microsoft Enterprise Authentication" is not enabled.
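If you prefer to script this step instead of using the portal, the cache can also be created programmatically. The following is a minimal sketch, assuming the azure-identity and azure-mgmt-redis Python packages; the subscription ID, resource group, cache name, location, and SKU values are placeholders, and parameter names may differ slightly between SDK versions.

from azure.identity import DefaultAzureCredential
from azure.mgmt.redis import RedisManagementClient
from azure.mgmt.redis.models import RedisCreateParameters, Sku

# Placeholders: replace with your subscription, resource group, and cache details.
client = RedisManagementClient(DefaultAzureCredential(), "YOUR_SUBSCRIPTION_ID")

poller = client.redis.begin_create(
    resource_group_name="YOUR_RESOURCE_GROUP",
    name="YOUR_CACHE_NAME",
    parameters=RedisCreateParameters(
        location="eastus",                                  # region for the cache
        sku=Sku(name="Standard", family="C", capacity=1),   # pricing tier and size
        redis_version="6",                                  # Engine Version 6
        minimum_tls_version="1.2",
    ),
)
cache = poller.result()  # waits for provisioning to finish
print(cache.host_name)   # e.g. YOUR_CACHE_NAME.redis.cache.windows.net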
2. Configure Private Endpoint for Azure Redis Cache (Optional)¶
- Private Endpoint for Azure Redis Cache enhances security by providing private connectivity to the cache instance within your virtual network.
- In the Azure Portal, navigate to your Redis Cache instance.
- Under Settings, select Private endpoint connections.
- Click + Add and follow the on-screen instructions to create a new private endpoint connection.
- Choose your virtual network and subnet, and enable private DNS integration if needed.
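Once the private endpoint and private DNS integration are in place, you can check from a machine inside the virtual network that the cache hostname resolves to a private address. This is an optional sanity check using only the Python standard library; the hostname below is a placeholder.

import ipaddress
import socket

# Placeholder: replace with your cache hostname, e.g. YOUR_CACHE_NAME.redis.cache.windows.net
hostname = "YOUR_AZURE_REDIS_HOSTNAME"

# Resolve the name as seen from inside the virtual network.
ip = socket.gethostbyname(hostname)

if ipaddress.ip_address(ip).is_private:
    print(f"{hostname} resolves to private IP {ip}: private endpoint DNS is in effect")
else:
    print(f"{hostname} resolves to public IP {ip}: check your private DNS zone configuration")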
3. Obtain Azure Redis Cache Connection Details¶
- Once the Redis Cache instance is created, you can obtain the connection details, including the connection string.
- Navigate to the overview page of your Redis Cache instance.
- Under the "Authentication" section, locate and copy the Primary connection string. This connection string contains the information needed to connect to your Redis Cache instance.
- Parse the connection string to extract the required details: hostname, port, and password, as shown in the example below.
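The primary connection string shown in the portal typically has the form host:port,password=...,ssl=True,abortConnect=False. The helper below is an illustrative way to split it into the values used later in this guide; the connection string shown is a placeholder.

def parse_azure_redis_connection_string(conn_str):
    """Split '<host>:<port>,password=<key>,ssl=True,...' into its parts."""
    parts = conn_str.split(",")
    host, _, port = parts[0].partition(":")
    options = dict(p.split("=", 1) for p in parts[1:] if "=" in p)
    return {
        "hostname": host,
        "port": int(port) if port else 6380,  # Azure Redis default TLS port
        "password": options.get("password", ""),
        "ssl": options.get("ssl", "True").lower() == "true",
    }

# Placeholder connection string for illustration:
example = "YOUR_CACHE_NAME.redis.cache.windows.net:6380,password=YOUR_ACCESS_KEY,ssl=True,abortConnect=False"
print(parse_azure_redis_connection_string(example))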
4. Configure DataRobot to Use Azure Redis Cache¶
When DataRobot is configured to use an external Redis service, additional YAML override values must be provided:
redis:
  auth:
    password: YOUR_AZURE_REDIS_PASSWORD
Then add the following to your values.yaml within the datarobot chart:
global:
  redis:
    internal: false
    hostname: "YOUR_AZURE_REDIS_HOSTNAME"
    port: "YOUR_AZURE_REDIS_PORT"
    tls: true
    sentinel:
      enabled: false
    auth:
      password: YOUR_AZURE_REDIS_PASSWORD
core:
  config_env_vars:
    REDISPROXY_PORT: YOUR_AZURE_REDIS_PORT  # Azure Redis default TLS port is 6380
buzok-onprem:
  buzok-worker:
    services:
      redis:
        tls: true
        cluster: false
        useSentinel: false
        host: YOUR_AZURE_REDIS_HOSTNAME
        port: YOUR_AZURE_REDIS_PORT  # Azure Redis default TLS port is 6380
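Before applying the overrides, you can optionally confirm that the hostname, port, and password work over TLS from a machine with network access to the cache. This is a minimal sketch assuming the redis Python package (redis-py); the placeholders mirror the values used in the YAML above.

import redis

# Placeholders: use the same values as in the YAML overrides above.
r = redis.Redis(
    host="YOUR_AZURE_REDIS_HOSTNAME",
    port=6380,                              # Azure Redis default TLS port
    password="YOUR_AZURE_REDIS_PASSWORD",
    ssl=True,                               # TLS is required on port 6380
)

print(r.ping())  # True if the connection details are correct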
Azure Redis Cache snapshots¶
Azure Cache for Redis offers import and export as data management operations. You can configure automatic backups and perform manual backups to ensure data durability and recovery options in case of failures. Refer to the Azure Cache for Redis documentation for more information on backup options and best practices.
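For example, an export can be triggered programmatically. The following is an illustrative sketch only, assuming the azure-identity and azure-mgmt-redis Python packages and a cache tier that supports export; the subscription, resource group, cache name, and storage SAS URL are placeholders, and operation names may vary between SDK versions.

from azure.identity import DefaultAzureCredential
from azure.mgmt.redis import RedisManagementClient
from azure.mgmt.redis.models import ExportRDBParameters

client = RedisManagementClient(DefaultAzureCredential(), "YOUR_SUBSCRIPTION_ID")

poller = client.redis.begin_export_data(
    resource_group_name="YOUR_RESOURCE_GROUP",
    name="YOUR_CACHE_NAME",
    parameters=ExportRDBParameters(
        prefix="datarobot-redis-backup",  # file name prefix for the exported data
        container="https://YOUR_STORAGE_ACCOUNT.blob.core.windows.net/backups?YOUR_SAS_TOKEN",
        format="RDB",
    ),
)
poller.result()  # blocks until the export completes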
Results¶
By following these steps, you can integrate Azure Redis (version 6.0) with DataRobot as an efficient caching layer, improving data retrieval and overall system responsiveness. If issues arise, refer to the DataRobot documentation and the Azure Cache for Redis documentation for troubleshooting guidance.
Note: Always ensure that you follow best practices for security and compliance when configuring external caching solutions with DataRobot.