# Batch Prediction job definitions

> Batch Prediction job definitions - How to submit a working Batch Prediction job. You must supply a
> variety of elements to the POST request payload depending on the type of prediction.

This Markdown file sits beside the HTML page at the same path (with a `.md` suffix). It summarizes the topic and lists links for tools and LLM context.

Companion generated at `2026-05-06T18:17:09.613038+00:00` (UTC).

## Primary page

- [Batch Prediction job definitions](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/job-definitions.html): Full documentation for this topic (HTML).

## Sections on this page

- [Job Definitions API](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/job-definitions.html#job-definitions-api): In-page section heading.
- [Execute a Job Definition](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/job-definitions.html#execute-a-job-definition): In-page section heading.

## Related documentation

- [Developer documentation](https://docs.datarobot.com/en/docs/api/index.html): Linked from this page.
- [API reference](https://docs.datarobot.com/en/docs/api/reference/index.html): Linked from this page.
- [Batch Prediction API](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/index.html): Linked from this page.
- [DataRobot REST API reference documentation](https://docs.datarobot.com/en/docs/api/reference/public-api/batch_predictions.html): Linked from this page.
- [sample use cases section](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/pred-examples.html): Linked from this page.
- [here](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/job-scheduling.html): Linked from this page.

## Documentation content

To submit a working Batch Prediction job, you must supply a variety of elements to the `POST` request payload depending on what type of prediction is required. Additionally, you must consider the type of intake and output adapters used for a given job.

For more information about Batch Prediction REST API routes, view the [DataRobot REST API reference documentation](https://docs.datarobot.com/en/docs/api/reference/public-api/batch_predictions.html).

Every time you make a Batch Prediction, the prediction information is stored outside DataRobot and re-submitted for each prediction request, as described in detail in the [sample use cases section](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/pred-examples.html). One such request could be as follows:

`POST https://app.datarobot.com/api/v2/batchPredictions`

```json
{
    "deploymentId": "<deployment_id>",
    "intakeSettings": {
        "type": "dataset",
        "datasetId": "<dataset_id>"
    },
    "outputSettings": {
        "type": "jdbc",
        "statementType": "insert",
        "credentialId": "<credential_id>",
        "dataStoreId": "<data_store_id>",
        "schema": "public",
        "table": "example_table",
        "createTableIfNotExists": false
    },
    "includeProbabilities": true,
    "includePredictionStatus": true,
    "passthroughColumnsSet": "all"
}
```
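The request above can be sketched in Python using only the standard library. This is a minimal, hedged example: the `Bearer` authorization scheme, the `API_TOKEN` placeholder, and all angle-bracket IDs are assumptions you must replace with your own values, and the request is only constructed here; calling `urllib.request.urlopen(request)` would actually submit it.

```python
import json
import urllib.request

# Hypothetical token placeholder -- substitute your own DataRobot API key.
API_TOKEN = "YOUR_API_TOKEN"

# Payload mirrors the JSON body shown above; IDs are placeholders.
payload = {
    "deploymentId": "<deployment_id>",
    "intakeSettings": {
        "type": "dataset",
        "datasetId": "<dataset_id>",
    },
    "outputSettings": {
        "type": "jdbc",
        "statementType": "insert",
        "credentialId": "<credential_id>",
        "dataStoreId": "<data_store_id>",
        "schema": "public",
        "table": "example_table",
        "createTableIfNotExists": False,
    },
    "includeProbabilities": True,
    "includePredictionStatus": True,
    "passthroughColumnsSet": "all",
}

# Build (but do not send) the POST request.
request = urllib.request.Request(
    "https://app.datarobot.com/api/v2/batchPredictions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",  # auth scheme is an assumption
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would submit the job.
```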

## Job Definitions API

If your use case requires the same (or nearly the same) prediction to be run multiple times, you can create a job definition for the Batch Prediction job and store it inside DataRobot for future use.

The job definitions API accepts the same payload as the existing `/batchPredictions/` endpoint; to store a definition instead of running a job immediately, change the `POST` endpoint to `/batchPredictionJobDefinitions`:

`POST https://app.datarobot.com/api/v2/batchPredictionJobDefinitions`

```json
{
    "deploymentId": "<deployment_id>",
    "intakeSettings": {
        "type": "dataset",
        "datasetId": "<dataset_id>"
    },
    "outputSettings": {
        "type": "jdbc",
        "statementType": "insert",
        "credentialId": "<credential_id>",
        "dataStoreId": "<data_store_id>",
        "schema": "public",
        "table": "example_table",
        "createTableIfNotExists": false
    },
    "includeProbabilities": true,
    "includePredictionStatus": true,
    "passthroughColumnsSet": "all"
}
```

On success, this endpoint returns the stored job definition in the response payload, confirming that it was saved to DataRobot.

(Optional) You can supply a `name` parameter for easier identification. If you don't supply one, DataRobot will create one for you.

> [!WARNING]
> The `name` parameter must be unique across your organization. If you attempt to create multiple definitions with the same name, the request will fail. If you wish to free up a name, you must first send a `DELETE` request with the existing job definition ID you wish to delete.

## Execute a Job Definition

If you wish to submit a stored job definition for scoring, you can either do so on a scheduled basis, as described [here](https://docs.datarobot.com/en/docs/api/reference/batch-prediction-api/job-scheduling.html), or manually, by sending a `POST` request to the endpoint `/batchPredictions/fromJobDefinition` with the definition ID as the payload:

`POST https://app.datarobot.com/api/v2/batchPredictions/fromJobDefinition`

```json
{
    "jobDefinitionId": "<job_definition_id>"
}
```
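Manually executing a stored definition can be sketched the same way. As before, this is a minimal illustration: the `Bearer` scheme and the ID placeholder are assumptions, and the request is constructed without being sent.

```python
import json
import urllib.request

# The payload is just the stored definition's ID (placeholder shown).
payload = {"jobDefinitionId": "<job_definition_id>"}

# Build (but do not send) the POST request that launches the stored job.
request = urllib.request.Request(
    "https://app.datarobot.com/api/v2/batchPredictions/fromJobDefinition",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": "Bearer YOUR_API_TOKEN",  # auth scheme is an assumption
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would start scoring.
```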

The endpoint supports the regular CRUD operations: `GET`, `POST`, `DELETE`, and `PATCH`.
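For instance, a `GET` request against the `/batchPredictionJobDefinitions/` collection would list stored definitions. A minimal sketch, again building the request without sending it and assuming a `Bearer` auth scheme:

```python
import urllib.request

# Build (but do not send) a GET request that lists stored job definitions.
request = urllib.request.Request(
    "https://app.datarobot.com/api/v2/batchPredictionJobDefinitions/",
    headers={"Authorization": "Bearer YOUR_API_TOKEN"},  # auth scheme is an assumption
    method="GET",
)
# urllib.request.urlopen(request) would return the listing as JSON.
```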
