
On-premise users: click in-app to access the full platform documentation for your version of DataRobot.

Create custom jobs

When you create a custom job, you must assemble the required execution environment and files before running the job. Only the execution environment and an entry point file are required; you can designate any file as the entry point. If you add other files to the job, the entry point file should reference those files. In addition, to configure runtime parameters, create or upload a metadata.yaml file containing the runtime parameter configuration for the job.
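As a sketch, a metadata.yaml declaring a single runtime parameter might look like the following. The field names mirror the runtime-parameter schema DataRobot uses elsewhere (e.g., for custom models), and the values are placeholders to adapt:

```yaml
# Hypothetical metadata.yaml sketch: one string runtime parameter.
name: my-custom-job  # assumed: a free-form job identifier
runtimeParameterDefinitions:
  - fieldName: GREETING
    type: string
    defaultValue: Hello
    description: Message the job prints at startup
```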

To register and assemble a new custom job in the Registry:

  1. Click Registry > Jobs, and then click + Add job (or the add button when the custom job panel is open).

    The custom job opens to the Assemble tab.

  2. On the Assemble tab for the new job, click the edit icon to update the job name.

  3. In the Environment section, select a Base environment for the job.

  4. In the Files section, assemble the custom job. Drag files into the box, or use the options in this section to create or upload the files required to assemble a custom job:

    Option Description
    Choose from source / Upload: Upload existing custom job files (e.g., metadata.yaml) as Local Files or a Local Folder.
    Create Create a new file, empty or containing a template, and save it to the custom job:
    • Create entry point file: Creates a basic, editable example of an entry point shell file.
    • Create metadata.yaml: Creates a basic, editable example of a runtime parameters file.
    • Create README: Creates a basic, editable README file.
    • Create job file: Creates a basic, editable Python job file to print runtime parameters and deployments.
    • Create example job: Combines all template files to create a basic, editable custom job. You can quickly configure the runtime parameters and run this example job.
    • Create blank file: Creates an empty file. Click the edit icon next to Untitled to provide a file name and extension, then add your custom contents. In the next step, you can select a file created this way, with a custom name and contents, as the entry point. After you configure the new file, click Save.

    File replacement

    If you add a new file with the same name as an existing file, when you click Save, the old file is replaced in the Files section.
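The Python job file mentioned above (the template that prints runtime parameters) can be sketched as follows. The MLOPS_RUNTIME_PARAM_ environment-variable naming and the JSON payload shape are assumptions based on how DataRobot injects runtime parameters for custom models; treat them as illustrative:

```python
import json
import os

# Hypothetical job-file sketch. Assumption: each runtime parameter arrives
# as an environment variable named MLOPS_RUNTIME_PARAM_<fieldName>, holding
# a JSON object with a "payload" key.
PREFIX = "MLOPS_RUNTIME_PARAM_"


def read_runtime_params(environ=os.environ):
    """Collect runtime parameters from the given environment mapping."""
    params = {}
    for key, raw in environ.items():
        if not key.startswith(PREFIX):
            continue
        name = key[len(PREFIX):]
        try:
            params[name] = json.loads(raw).get("payload")
        except (json.JSONDecodeError, AttributeError):
            params[name] = raw  # fall back to the raw string
    return params


if __name__ == "__main__":
    for name, value in read_runtime_params().items():
        print(f"{name} = {value!r}")
```

Keeping the parsing in a function that accepts any mapping makes the job file easy to exercise locally before uploading it to the Registry.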

  5. In the Settings section, configure the following:

    Setting Description
    Entry point Define the entry point shell (.sh) file for the job. If you've added a file with the default entry point name, that file is the entry point; otherwise, you must select the entry point shell file from the drop-down list. The entry point file allows you to orchestrate multiple job files.
    Resources/Network access Configure the egress traffic of the custom job. Next to the Resources header, click Edit and configure Network access:
    • Public: The default setting. The custom job can access any fully qualified domain name (FQDN) in a public network to leverage third-party services.
    • None: The custom job is isolated from the public network and cannot access third-party services.
    Availability information

    For the Managed AI Platform, the Network access setting is set to Public by default and the setting is configurable. For the Self-Managed AI Platform, the Network access setting is set to None by default and the setting is restricted; however, an administrator can change this behavior during DataRobot platform configuration. Contact your DataRobot representative or administrator for more information.
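The entry point script described in the Settings table above might be sketched like this; the job-file name is an assumption, since the entry point simply orchestrates whatever files you added:

```shell
#!/usr/bin/env bash
# Hypothetical entry point sketch (the actual file name is whatever you
# select as Entry point on the Assemble tab).
set -euo pipefail

JOB_DIR="$(cd "$(dirname "$0")" && pwd)"
echo "Starting custom job from ${JOB_DIR}"

# Assumption: the job's Python code was added alongside this script.
if [ -f "${JOB_DIR}/job.py" ]; then
    python3 "${JOB_DIR}/job.py"
else
    echo "job.py not found; nothing to run"
fi
```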

  6. (Optional) In the Settings section, if you uploaded a metadata.yaml file, define the Runtime parameters by clicking the edit icon for each key-value row you want to configure.

Updated December 21, 2023