Call prompts in an agent
Prompt templates are reusable prompts created in DataRobot, often containing placeholder variables (like {{ topic }}) that are filled in by the agent code. Because the template is stored in DataRobot and can have multiple versions, you can iterate on and fine-tune prompts without changing the agent code.
Token limits
Prompt templates count towards the LLM's maximum completion token limit.
In the agent code, the framework gets the template through the DataRobot API and provides the variable values. These variables are then substituted into the template text to create the final prompt sent to the LLM. Each framework template handles initial input formatting differently, so the method for using DataRobot prompt templates varies. Before modifying the agent.py file, create a prompt template in DataRobot and note the template ID. If needed, also note the version ID of the specific prompt template version to use. Then, in the agent.py file, modify the appropriate method or property in the MyAgent class based on the framework.
The examples below show modifications to the existing framework templates in this repository. Each example assumes a prompt template exists in DataRobot containing a {{ topic }} variable. For example, the prompt template might be: Write an article about {{ topic }} in 1997. The user prompt sent to the agent is combined with this template by substituting the user input into the {{ topic }} variable.
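For illustration, the substitution step can be sketched in plain Python. The `render_template` helper below is hypothetical (it is not part of the DataRobot SDK); it only mimics how {{ variable }} placeholders are filled with user input:

```python
import re

def render_template(template: str, **variables: str) -> str:
    """Substitute {{ name }} placeholders with the given variable values."""
    def replace(match: re.Match) -> str:
        name = match.group(1)
        # Leave unknown placeholders intact rather than dropping them
        return variables.get(name, match.group(0))
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", replace, template)

template = "Write an article about {{ topic }} in 1997."
print(render_template(template, topic="space travel"))
# Write an article about space travel in 1997.
```

Here the user prompt ("space travel") becomes the value of the template's {{ topic }} variable, producing the final prompt sent to the LLM.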
LangGraph uses a prompt_template property that returns a ChatPromptTemplate. Add import datarobot as dr at the top of the file, then modify this property to use DataRobot prompt templates:
# Added to imports
import datarobot as dr

# Modified in MyAgent class
@property
def prompt_template(self) -> ChatPromptTemplate:
    prompt_template = dr.genai.PromptTemplate.get("PROMPT_TEMPLATE_ID")
    prompt_template_version = prompt_template.get_latest_version()
    # To use a specific version instead:
    # prompt_template_version = prompt_template.get_version("PROMPT_VERSION_ID")
    # Convert {{ variable }} format to {variable} format for LangGraph's ChatPromptTemplate
    # The {topic} variable is filled by the framework at runtime
    prompt_text = prompt_template_version.to_fstring()
    return ChatPromptTemplate.from_messages(
        [
            (
                "user",
                prompt_text,
            ),
        ]
    )
Replace the prompt template ID ("PROMPT_TEMPLATE_ID") with the appropriate template ID from DataRobot. The example uses get_latest_version() to automatically use the latest version without redeployment.
This example uses to_fstring() to convert the template's {{ topic }} variable to {topic} format, which LangGraph's ChatPromptTemplate fills at runtime. If the variables in the prompt template change across versions (for example, if a new version uses {{ subject }} instead of {{ topic }}), update this code to handle all variables appropriately; otherwise, the code may break when fetching a new version.
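One defensive pattern is to check the placeholder names in the fetched template text before rendering, so a variable rename in a new version fails loudly instead of producing a malformed prompt. The sketch below scans the raw template string with a regex; `template_variables` and `check_variables` are hypothetical helpers, and the DataRobot SDK may offer a more direct way to list a version's variables:

```python
import re

def template_variables(template: str) -> set[str]:
    """Return the set of {{ name }} placeholder names found in a template."""
    return set(re.findall(r"\{\{\s*(\w+)\s*\}\}", template))

def check_variables(template: str, expected: set[str]) -> None:
    """Raise early if the template's variables differ from what the agent fills in."""
    found = template_variables(template)
    if found != expected:
        raise ValueError(f"Template variables {found} do not match expected {expected}")

check_variables("Write an article about {{ topic }} in 1997.", {"topic"})  # passes
# check_variables("Write an article about {{ subject }}.", {"topic"})  # would raise ValueError
```

Calling a check like this right after get_latest_version() turns a silent prompt mismatch into an explicit error at agent startup.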
Multi-agent workflows
The prompt_template property is used for the initial user input. Each agent has its own prompt parameter in create_react_agent. To ensure all agents follow the prompt template instructions, incorporate the formatted prompt template into each agent's prompt parameter in create_react_agent.
LlamaIndex uses a make_input_message method that returns a string. Add import datarobot as dr at the top of the file, then modify this method to use DataRobot prompt templates:
# Added to imports
import datarobot as dr

# Modified in MyAgent class
def make_input_message(self, completion_create_params: Any) -> str:
    user_prompt_content = extract_user_prompt_content(completion_create_params)
    prompt_template = dr.genai.PromptTemplate.get("PROMPT_TEMPLATE_ID")
    prompt_template_version = prompt_template.get_latest_version()
    # To use a specific version instead:
    # prompt_template_version = prompt_template.get_version("PROMPT_VERSION_ID")
    # Render the prompt template with variables (assumes {{ topic }} in the template)
    prompt_text = prompt_template_version.render(topic=user_prompt_content)
    return prompt_text
Replace the prompt template ID ("PROMPT_TEMPLATE_ID") with the appropriate template ID from DataRobot. The example uses get_latest_version() to automatically use the latest version without redeployment.
This example assumes the prompt template contains a {{ topic }} variable. If the variables in the prompt template change across versions (for example, if a new version uses {{ subject }} instead of {{ topic }}), update this code to handle all variables appropriately; otherwise, the code may break when fetching a new version.
Multi-agent workflows
The make_input_message method affects only the initial input message. Each agent has its own system_prompt property. To ensure all agents follow the prompt template instructions, incorporate the formatted prompt template into each agent's system_prompt property.
CrewAI uses agent properties (goal, backstory) that can contain prompt templates. Add import datarobot as dr at the top of the file, then modify agent properties to use DataRobot prompt templates:
# Added to imports
import datarobot as dr

# Modified in MyAgent class
@property
def agent_planner(self) -> Agent:
    prompt_template = dr.genai.PromptTemplate.get("PROMPT_TEMPLATE_ID")
    prompt_template_version = prompt_template.get_latest_version()
    # To use a specific version instead:
    # prompt_template_version = prompt_template.get_version("PROMPT_VERSION_ID")
    # For properties that use {topic} (f-string format), use to_fstring()
    prompt_text = prompt_template_version.to_fstring()
    return Agent(
        role="Planner",
        goal=f"Plan engaging and factually accurate content on {{ topic }}. {prompt_text}",
        backstory=f"You're working on planning a blog article about the topic: {{ topic }}. {prompt_text} "
        "You collect information that helps the audience learn something and make informed decisions. "
        "Your work is the basis for the Content Writer to write an article on this topic.",
        # ... other properties
    )
Replace the prompt template ID ("PROMPT_TEMPLATE_ID") with the appropriate template ID from DataRobot. The example uses get_latest_version() to automatically use the latest version without redeployment.
This example modifies agent_planner to use the prompt template in its goal and backstory properties. Since these properties use {topic} (f-string format that CrewAI will fill at runtime), the example uses to_fstring() to convert {{ topic }} to {topic} format so CrewAI can replace it with the user's input.
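The effect of to_fstring() can be approximated in plain Python. The `to_fstring_format` function below is a hypothetical stand-in (the SDK method's exact behavior is not assumed) that collapses {{ topic }} into {topic} so a framework's f-string-style interpolation can fill it:

```python
import re

def to_fstring_format(template: str) -> str:
    """Convert {{ name }} placeholders to {name} for f-string-style filling."""
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", r"{\1}", template)

converted = to_fstring_format("Write an article about {{ topic }} in 1997.")
print(converted)                       # Write an article about {topic} in 1997.
print(converted.format(topic="jazz"))  # Write an article about jazz in 1997.
```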
This example assumes the prompt template contains a {{ topic }} variable. If the variables in the prompt template change across versions (for example, if a new version uses {{ subject }} instead of {{ topic }}), update this code to handle all variables appropriately; otherwise, the code may break when fetching a new version.
Multi-agent workflows
Apply prompt templates to each agent's goal or backstory properties where you want the instructions to be followed. For properties that use {topic}, use to_fstring(). For plain text properties, use render().