# Use Scoring Code with AWS SageMaker

> Use Scoring Code with AWS SageMaker - Using Scoring Code models with AWS SageMaker.

This Markdown file sits beside the HTML page at the same path (with a `.md` suffix). It summarizes the topic and lists links for tools and LLM context.

Companion generated at `2026-04-24T16:03:56.548583+00:00` (UTC).

## Primary page

- [Use Scoring Code with AWS SageMaker](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html): Full documentation for this topic (HTML).

## Sections on this page

- [Download Scoring Code](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#download-scoring-code): In-page section heading.
- [Upload Scoring Code to an AWS S3 bucket](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#upload-scoring-code-to-an-aws-s3-bucket): In-page section heading.
- [Publish a Docker image to Amazon ECR](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#publish-a-docker-image-to-amazon-ecr): In-page section heading.
- [Create the model](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#create-the-model): In-page section heading.
- [Create an endpoint configuration](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#create-an-endpoint-configuration): In-page section heading.
- [Make predictions](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#make-predictions): In-page section heading.
- [Considerations](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/sc-sagemaker.html#considerations): In-page section heading.

## Related documentation

- [Classic UI documentation](https://docs.datarobot.com/en/docs/classic-ui/index.html): Linked from this page.
- [Integrations](https://docs.datarobot.com/en/docs/classic-ui/integrations/index.html): Linked from this page.
- [AWS](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/index.html): Linked from this page.
- [Amazon SageMaker](https://docs.datarobot.com/en/docs/classic-ui/integrations/aws/sagemaker/index.html): Linked from this page.
- [Scoring Code](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/index.html): Linked from this page.
- [Leaderboard](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/sc-download-leaderboard.html): Linked from this page.
- [deployment](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/sc-download-deployment.html): Linked from this page.

## Documentation content

# Use Scoring Code with AWS SageMaker

This topic describes how to make predictions using DataRobot’s Scoring Code deployed on AWS SageMaker. Scoring Code allows you to download machine learning models as JAR files which can then be deployed in the environment of your choice.

AWS SageMaker allows you to bring in your own machine learning models and expose them as API endpoints. DataRobot can export models in both Java and Python; once exported, you can deploy the model on AWS SageMaker. This example focuses on the DataRobot Scoring Code export, which provides a Java JAR file.

Make sure the model you want to import supports [Scoring Code](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/index.html). Models that support Scoring Code export are indicated by the Scoring Code icon.

## Download Scoring Code

The first step to deploying a DataRobot model to AWS SageMaker is to create a TAR.GZ archive that contains your model (the Scoring Code JAR file provided by DataRobot). You can download the JAR file from the [Leaderboard](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/sc-download-leaderboard.html) or from a [deployment](https://docs.datarobot.com/en/docs/classic-ui/predictions/port-pred/scoring-code/sc-download-deployment.html).

> [!NOTE] Note
> Depending on your DataRobot license, the code may only be available through the Deployments page.

## Upload Scoring Code to an AWS S3 bucket

Once you have downloaded the Scoring Code JAR file, upload it to an AWS S3 bucket so that SageMaker can access it.

SageMaker expects the model to be uploaded to an S3 bucket as a `tar.gz` archive. Compress your model as a `tar.gz` archive using one of the following commands:

**Linux:**
```
tar -czvf 5e8471fa169e846a096d5137.jar.tar.gz 5e8471fa169e846a096d5137.jar
```

**MacOS:**
MacOS adds hidden files to the `tar.gz` package that can introduce issues during deployment. To prevent these issues, use the following command:

```
COPYFILE_DISABLE=1 tar -czvf 5e8471fa169e846a096d5137.jar.tar.gz 5e8471fa169e846a096d5137.jar
```
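To sanity-check the archive before uploading, the packaging step can be sketched end to end. The `touch` line below stands in for the JAR you actually downloaded; use your real file instead.

```
# Stand-in for the downloaded Scoring Code JAR (use your real JAR instead).
touch 5e8471fa169e846a096d5137.jar

# Package it; COPYFILE_DISABLE=1 is harmless on Linux and suppresses macOS metadata files.
COPYFILE_DISABLE=1 tar -czf 5e8471fa169e846a096d5137.jar.tar.gz 5e8471fa169e846a096d5137.jar

# List the archive contents: only the JAR should appear.
tar -tzf 5e8471fa169e846a096d5137.jar.tar.gz
```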


Once you have created the `tar.gz` archive, upload it to S3:

1. Enter the Amazon S3 console.
2. Click **Upload** and provide your `tar.gz` archive to the S3 bucket.
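The console upload can also be scripted with the AWS CLI. The sketch below prints the command it would run; the bucket name is a placeholder, and you can drop the `echo` to execute the upload for real.

```
ARCHIVE=5e8471fa169e846a096d5137.jar.tar.gz
BUCKET=my-scoring-code-models   # placeholder: substitute your own bucket name

# Print the upload command; remove 'echo' to actually upload.
echo aws s3 cp "$ARCHIVE" "s3://${BUCKET}/${ARCHIVE}"
```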

## Publish a Docker image to Amazon ECR

Next, publish a Docker image containing the inference code to Amazon ECR. For this example, download the DataRobot-provided Docker image with the following command:

```
docker pull datarobotdev/scoring-inference-code-sagemaker:latest
```

To publish the image to Amazon ECR:

1. Authenticate your Docker client to the Amazon ECR registry to which you intend to push your image. Authentication tokens must be obtained for each registry used, and the tokens are valid for 12 hours. Refer to the Amazon ECR documentation for the available authentication options.
2. Use token-based authentication:

    ```
    TOKEN=$(aws ecr get-authorization-token --output text --query 'authorizationData[].authorizationToken')
    curl -i -H "Authorization: Basic $TOKEN" https://xxxxxxx.dkr.ecr.us-east-1.amazonaws.com/v2/sagemakertest/tags/list
    ```

3. Next, create an Amazon ECR repository to which you can push your image:

    ```
    aws ecr create-repository --repository-name sagemakerdemo
    ```

    This command returns a JSON description of the new repository.

You can also create the repository from the AWS Management console:

1. Navigate to **ECR Service > Create Repository** and provide the repository name.
2. Identify the image to push. Run the `docker images` command to list the images on your system. The `xxxxxxxx` placeholder used below represents the image ID of the DataRobot-provided Docker image containing the inference code (`scoring-inference-code-sagemaker:latest`) that you downloaded from Docker Hub.
3. Tag the image with the Amazon ECR registry, repository, and optional image tag name combination to use. The registry format is `aws_account_id.dkr.ecr.region.amazonaws.com`. The repository name should match the repository that you created for the image. If you omit the image tag, Docker assumes the `latest` tag:

    ```
    docker tag xxxxxxxx "${account}.dkr.ecr.${region}.amazonaws.com/sagemakerdemo"
    ```

4. Push the image:

    ```
    docker push "${account}.dkr.ecr.${region}.amazonaws.com/sagemakerdemo"
    ```

Once pushed, you can validate the image from the AWS management console.
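The registry naming convention used above can be sketched as follows. The account ID and region are placeholders, and the final `docker` commands are printed rather than executed; remove the `echo` to run them against a real registry.

```
account=123456789012   # placeholder AWS account ID
region=us-east-1       # placeholder region
repo=sagemakerdemo

# Registry format: aws_account_id.dkr.ecr.region.amazonaws.com/repository
image_uri="${account}.dkr.ecr.${region}.amazonaws.com/${repo}"

# Print the tag/push commands; remove 'echo' to run them for real.
echo docker tag xxxxxxxx "${image_uri}:latest"
echo docker push "${image_uri}:latest"
```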

## Create the model

1. Sign in to AWS and search for **SageMaker**. Select the first search result, **Amazon SageMaker**, to enter the SageMaker console and create a model.
2. In the **IAM role** field, select **Create a new role** from the dropdown if you do not have an existing role on your account. This option creates a role with the required permissions and assigns it to your instance.
3. For the **Container input options** field (1), select **Provide model artifacts and inference image location**. Then, specify the location of the Scoring Code archive (your model) in the S3 bucket (2) and the registry path to the Docker image containing the inference code (3).
4. Click **Add container** below the fields when complete.
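The console steps above also have a CLI equivalent, `aws sagemaker create-model`. The sketch below prints the command with placeholder names, image URI, and role ARN; remove the `echo` to create the model for real.

```
# All values below are placeholders; substitute your own resources.
MODEL_NAME=scoring-code-demo
IMAGE_URI=123456789012.dkr.ecr.us-east-1.amazonaws.com/sagemakerdemo:latest
MODEL_DATA=s3://my-scoring-code-models/5e8471fa169e846a096d5137.jar.tar.gz
ROLE_ARN=arn:aws:iam::123456789012:role/SageMakerExecutionRole

# Print the command; remove 'echo' to run it for real.
echo aws sagemaker create-model \
  --model-name "$MODEL_NAME" \
  --primary-container "Image=${IMAGE_URI},ModelDataUrl=${MODEL_DATA}" \
  --execution-role-arn "$ROLE_ARN"
```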

## Create an endpoint configuration

To set up an endpoint for predictions:

1. Open the dashboard on the left side and navigate to the **Endpoint configurations** page to create a new endpoint configuration. Select the uploaded model.
2. Enter an **Endpoint configuration name** (1) and provide an **Encryption key** if desired (2). When complete, select **Create endpoint configuration** at the bottom of the page.
3. Use the dashboard to navigate to **Endpoints** and create a new endpoint: enter an **Endpoint name** (1) and select **Use an existing endpoint configuration** (2). Then, click the configuration you just created (3). When finished, click **Select endpoint configuration**. Once endpoint creation is complete, you can make prediction requests with your model; when the endpoint is ready to serve requests, its status changes to **InService**.
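These endpoint steps can also be scripted with `aws sagemaker create-endpoint-config` and `aws sagemaker create-endpoint`. All names and the instance type below are placeholders; the commands are printed rather than executed.

```
# Placeholder names; substitute your own.
CONFIG_NAME=scoring-code-endpoint-config
ENDPOINT_NAME=scoring-code-endpoint
MODEL_NAME=scoring-code-demo

# Print the commands; remove 'echo' to run them for real.
echo aws sagemaker create-endpoint-config \
  --endpoint-config-name "$CONFIG_NAME" \
  --production-variants "VariantName=AllTraffic,ModelName=${MODEL_NAME},InstanceType=ml.m5.large,InitialInstanceCount=1"
echo aws sagemaker create-endpoint \
  --endpoint-name "$ENDPOINT_NAME" \
  --endpoint-config-name "$CONFIG_NAME"
```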

## Make predictions

Once the SageMaker endpoint status changes to `InService`, you can start making predictions against the endpoint.

Test the endpoint from the command line first to make sure it is responding. Use the command below to make a test prediction, passing the scoring data as a CSV string in the request body (replace `scoring_data.csv` with your own scoring data and `output.json` with the desired response file; with AWS CLI v2, also pass `--cli-binary-format raw-in-base64-out` so the CSV is sent as raw text):

```
aws sagemaker-runtime invoke-endpoint \
    --endpoint-name mlops-dockerized-endpoint-new \
    --content-type text/csv \
    --body file://scoring_data.csv \
    output.json
```

> [!NOTE] Note
> To run the command above, ensure that you have installed the AWS CLI.
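To check whether the endpoint has reached `InService` before invoking it, you can query its status with `aws sagemaker describe-endpoint`. The sketch below prints the command; remove the `echo` to run it against your real endpoint.

```
ENDPOINT=mlops-dockerized-endpoint-new

# Print the status check; remove 'echo' to query the real endpoint.
echo aws sagemaker describe-endpoint --endpoint-name "$ENDPOINT" --query EndpointStatus
```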

## Considerations

Note the following when deploying on SageMaker:

- There is no out-of-the-box data drift and accuracy tracking unless MLOps agents are configured.
- You may experience additional time overhead as a result of deploying to AWS SageMaker.
