Troubleshooting¶
When working with GenAI in DataRobot, you might encounter errors during the creation, management, and use of the features that make up these capabilities. The sections below describe common errors and how to handle them. When an error occurs, review the error message and take the appropriate action, such as revalidating an external vector database, creating a duplicate LLM blueprint, or addressing issues with a custom model.
Vector database error handling¶
The following issues apply to vector databases.
Creation failure¶
Errors might occur during the creation of a vector database, such as issues with data processing or model training. In this case, the system saves the execution status as Error and provides an error code and message to help identify and resolve the issue.
Empty vector database¶
The vector database might be empty if the documents in the dataset contain no text or only consist of images in PDF files. The system sends a notification of the issue. See the list of supported dataset types for more information.
Retrieval failure¶
Errors might occur while retrieving documents from the vector database, such as network glitches or internal errors in the custom model. If this occurs, follow the guidance in the error messages to diagnose and resolve the issue; transient network errors often succeed on retry.
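For transient failures such as network glitches, a simple retry with backoff in client code is often enough. The sketch below is a generic pattern, not a DataRobot API; `query_fn` is a hypothetical stand-in for whatever client performs the vector database lookup.

```python
import time

def retrieve_with_retry(query_fn, query, attempts=3, base_delay=1.0):
    """Retry a retrieval call on transient connection errors, backing off
    exponentially between attempts. `query_fn` is a hypothetical stand-in
    for the client call that queries the vector database."""
    for attempt in range(attempts):
        try:
            return query_fn(query)
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # retries exhausted: surface the error for diagnosis
            time.sleep(base_delay * (2 ** attempt))  # back off before retrying

# Usage with a flaky stand-in that fails once, then succeeds.
calls = {"n": 0}
def flaky(query):
    calls["n"] += 1
    if calls["n"] == 1:
        raise ConnectionError("network glitch")
    return ["doc matching " + query]

print(retrieve_with_retry(flaky, "refund policy", base_delay=0))
# -> ['doc matching refund policy']
```

Persistent errors (for example, an internal error in the custom model itself) will exhaust the retries and should be diagnosed from the surfaced error message instead.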
LLM blueprint error handling¶
The following issues apply to blueprints and the playground.
Vector database unlinked¶
If the connection with a vector database is broken, any LLM blueprint using it is marked as Unlinked. Either revalidate or restore your external vector database connection, or create a duplicate of the LLM blueprint and select a new database in its configuration.
Vector database deleted¶
If a vector database is deleted, any LLM blueprints using it become invalid. To proceed, create a duplicate of the LLM blueprint and select a new database.
Custom model errors¶
If the custom model used in an LLM blueprint encounters an error, such as a model replacement, deletion, or access removal, the LLM blueprint is marked with an appropriate error status. Follow the provided in-app guidance to proceed.
When adding a deployed LLM that supports the chat completion API, the playground uses the chat completion API as the preferred communication method. Requests from the playground to the deployed LLM specify datarobot-deployed-llm as the model name. To disable the chat completion API and use the predictions API instead, delete the chat function from the custom model and redeploy it.
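The sketch below is a hedged illustration (not DataRobot's actual source) of the two points above: the playground's request names datarobot-deployed-llm as the model, and a custom model advertises chat support by defining a chat function. The hook signature and echo logic shown here are illustrative assumptions.

```python
def build_playground_request(messages):
    """Assemble an OpenAI-style chat completion payload as the playground
    would send it to a deployed LLM."""
    return {
        "model": "datarobot-deployed-llm",  # fixed name the playground sends
        "messages": messages,
    }

# A custom model that defines a `chat` function is treated as supporting the
# chat completion API; deleting this function and redeploying reverts the
# deployment to the predictions API. The signature here is an assumption.
def chat(completion_create_params, model):
    user_text = completion_create_params["messages"][-1]["content"]
    return {"choices": [{"message": {"role": "assistant",
                                     "content": "echo: " + user_text}}]}

request = build_playground_request(
    [{"role": "user", "content": "hello"}]
)
response = chat(request, model=None)
print(request["model"])                              # datarobot-deployed-llm
print(response["choices"][0]["message"]["content"])  # echo: hello
```

In a real deployment the request is sent over HTTP to the deployment's chat completion endpoint rather than calling the hook directly; the local call here only demonstrates the payload shape.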