ContainerDefinition
Describes the container, as part of the model definition.
Properties
ContainerHostname
This parameter is ignored for models that contain only a PrimaryContainer. When a ContainerDefinition is part of an inference pipeline, the value of the parameter uniquely identifies the container for the purposes of logging and metrics.
Environment
The environment variables to set in the Docker container. Each key and value in the Environment string-to-string map can be up to 1024 characters long. We support up to 16 entries in the map.
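A minimal sketch of setting these variables through a ContainerDefinition passed to CreateModel with boto3; the model name, image URI, execution role ARN, and variable values are placeholder assumptions, not values from this reference.

```python
import boto3

sagemaker = boto3.client("sagemaker")

# Placeholder model name, image URI, and execution role ARN for illustration.
sagemaker.create_model(
    ModelName="my-model",
    ExecutionRoleArn="arn:aws:iam::111122223333:role/SageMakerExecutionRole",
    PrimaryContainer={
        "Image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference:1.0",
        # Up to 16 entries; each key and each value can be up to 1024 characters.
        "Environment": {
            "MODEL_SERVER_WORKERS": "2",
            "LOG_LEVEL": "INFO",
        },
    },
)
```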
Image
The path where inference code is stored. This can be either in Amazon Elastic Container Registry (Amazon ECR) or in a Docker registry that is accessible from the same VPC that you configure for your endpoint. If you are using your own custom algorithm instead of an algorithm provided by SageMaker, the inference code must meet SageMaker requirements. SageMaker supports both registry/repository[:tag] and registry/repository[@digest] image path formats. For more information, see Using Your Own Algorithms with Amazon SageMaker.
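Either format can be used as the Image value of a ContainerDefinition. A small sketch with placeholder registry, repository, and digest values:

```python
# Tag form: a mutable reference that follows the repository tag.
image_by_tag = "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference:1.0"

# Digest form: an immutable reference pinned to a specific image manifest.
image_by_digest = (
    "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference"
    "@sha256:0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef"
)

container = {"Image": image_by_digest}
```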
ImageConfig
Specifies whether the model container is in Amazon ECR or a private Docker registry accessible from your Amazon Virtual Private Cloud (VPC). For information about storing containers in a private Docker registry, see Use a Private Docker Registry for Real-Time Inference Containers.
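A sketch of a ContainerDefinition fragment for an image pulled from a private registry reachable through the endpoint's VPC; the registry host and credentials-provider Lambda ARN are hypothetical placeholders.

```python
# RepositoryAccessMode "Vpc" tells SageMaker to pull the image over the
# endpoint's VPC; "Platform" is the default for images stored in Amazon ECR.
private_registry_container = {
    "Image": "private-registry.example.com/team/my-inference:1.0",
    "ImageConfig": {
        "RepositoryAccessMode": "Vpc",
        # Optional: a Lambda function that returns credentials for the registry.
        "RepositoryAuthConfig": {
            "RepositoryCredentialsProviderArn": (
                "arn:aws:lambda:us-east-1:111122223333:function:registry-creds"
            )
        },
    },
}
```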
InferenceSpecificationName
The inference specification name in the model package version.
Mode
Whether the container hosts a single model or multiple models.
ModelDataSource
Specifies the location of ML model data to deploy.
ModelDataUrl
The S3 path where the model artifacts, which result from model training, are stored. This path must point to a single gzip compressed tar archive (.tar.gz suffix). The S3 path is required for SageMaker built-in algorithms, but not if you use your own algorithms. For more information on built-in algorithms, see Common Parameters.
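A sketch of the two ways of pointing a container at trained artifacts; the bucket and prefixes are placeholders. ModelDataUrl references a single gzip-compressed tar archive, while ModelDataSource describes the S3 location more explicitly.

```python
# Compressed archive produced by a training job.
container_with_url = {
    "Image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference:1.0",
    "ModelDataUrl": "s3://my-bucket/training-output/model.tar.gz",
}

# An alternative expressed through ModelDataSource, here as an
# uncompressed S3 prefix holding the artifact files.
container_with_source = {
    "Image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference:1.0",
    "ModelDataSource": {
        "S3DataSource": {
            "S3Uri": "s3://my-bucket/training-output/model/",
            "S3DataType": "S3Prefix",
            "CompressionType": "None",
        }
    },
}
```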
ModelPackageName
The name or Amazon Resource Name (ARN) of the model package to use to create the model.
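A sketch of a container that deploys a registered model package rather than an explicit image; the model package ARN and specification name are hypothetical placeholders.

```python
# The version-qualified ARN selects model package version 3 in the group.
model_package_container = {
    "ModelPackageName": (
        "arn:aws:sagemaker:us-east-1:111122223333:model-package/my-package-group/3"
    ),
    # Chooses which inference specification in that model package version to use.
    "InferenceSpecificationName": "default-inference-spec",
}
```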
MultiModelConfig
Specifies additional configuration for multi-model endpoints.
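A sketch of a multi-model container definition; the image URI and the S3 prefix holding the individual model archives are placeholders.

```python
multi_model_container = {
    "Image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/my-inference:1.0",
    # Host many models behind one endpoint.
    "Mode": "MultiModel",
    "ModelDataUrl": "s3://my-bucket/multi-model-artifacts/",
    "MultiModelConfig": {
        # Disable instance-level model caching, e.g. when models change often.
        "ModelCacheSetting": "Disabled",
    },
}
```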