
azureml-inference-base-2204

packaged by Chainguard



Chainguard Container for azureml-inference-base-2204

A minimal Wolfi-based image compatible with mcr.microsoft.com/azureml/inference-base-2204 for hosting Azure Machine Learning inference endpoints.

Chainguard Containers are regularly updated, secure-by-default container images.

Download this Container Image

For those with access, this container image is available on cgr.dev:

docker pull cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest

Be sure to replace the ORGANIZATION placeholder with the name used for your organization's private repository within the Chainguard Registry.

Compatibility Notes

The Chainguard azureml-inference-base-2204 container image is a drop-in replacement for the upstream mcr.microsoft.com/azureml/inference-base-2204 image. It bundles the azureml-inference-server-http Python package together with nginx, gunicorn, rsyslog, and runit, and it preserves the upstream directory layout, environment variables, and entrypoint hooks, so existing Azure Machine Learning scoring scripts and deployment manifests work unchanged.

Like the upstream image, this image ships with CMD ["/bin/bash"] and no entrypoint so that derived image builds keep working. To start the inference server, override the entrypoint with the bundled azureml-entrypoint.sh script (which runs runsvdir /var/runit).
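For example, a derived image can bake in the scoring script and set the entrypoint itself. The following is a minimal sketch, assuming a score.py in the build context (the file name and paths are illustrative):

```Dockerfile
# Derived image that bakes in a scoring script and starts the
# inference server by default (paths here are illustrative).
FROM cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest

# Place the scoring script where the server expects to find it.
COPY score.py /var/azureml-app/score.py
ENV AZUREML_ENTRY_SCRIPT=score.py

# Start the bundled runit-based entrypoint instead of the default shell.
ENTRYPOINT ["azureml-entrypoint.sh"]
```

Because the base image keeps CMD ["/bin/bash"] and no entrypoint, builds that do not set an ENTRYPOINT continue to behave like the upstream image.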

Getting Started

Example: Standalone Container

Mount your scoring script at /var/azureml-app and point AZUREML_ENTRY_SCRIPT at it:

mkdir -p /tmp/azureml-app
cat > /tmp/azureml-app/score.py <<'EOF'
def init():
    pass

def run(data):
    return {"result": "ok"}
EOF

docker run --rm -d \
  --name azureml-inference \
  -p 5001:5001 \
  -v /tmp/azureml-app:/var/azureml-app:ro \
  -e AZUREML_ENTRY_SCRIPT=score.py \
  --entrypoint azureml-entrypoint.sh \
  cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest

curl http://localhost:5001/
curl -X POST http://localhost:5001/score -H 'Content-Type: application/json' -d '{"data":[1,2,3]}'

Documentation and Resources

For extended configuration (worker counts, timeouts, dynamic conda/pip installs, Application Insights logging) see the upstream Azure Machine Learning inference server documentation.
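As an illustration, such settings are passed to the container as environment variables. The sketch below assumes the upstream server's WORKER_COUNT and SCORING_TIMEOUT_MS settings; verify the exact variable names against the upstream documentation for your server version:

```shell
# Run with more gunicorn workers and a longer scoring timeout.
# WORKER_COUNT and SCORING_TIMEOUT_MS are azureml-inference-server-http
# settings; confirm the names in the upstream docs for your version.
docker run --rm -d \
  --name azureml-inference \
  -p 5001:5001 \
  -v /tmp/azureml-app:/var/azureml-app:ro \
  -e AZUREML_ENTRY_SCRIPT=score.py \
  -e WORKER_COUNT=4 \
  -e SCORING_TIMEOUT_MS=120000 \
  --entrypoint azureml-entrypoint.sh \
  cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest
```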

What are Chainguard Containers?

Chainguard's free tier of Starter container images is built with Wolfi, our minimal Linux undistro.

All other Chainguard Containers are built with Chainguard OS, Chainguard's minimal Linux operating system designed to produce container images that meet the requirements of a more secure software supply chain.

Chainguard Containers are distributed in two main variants:

For cases where you need container images with shells and package managers to build or debug, most Chainguard Containers come paired with a development, or -dev, variant.

In all other cases, including Chainguard Containers tagged as :latest or with a specific version number, the container images include only an open-source application and its runtime dependencies. These minimal container images typically do not contain a shell or package manager.

Although the -dev container image variants have similar security features as their more minimal versions, they include additional software that is typically not necessary in production environments. We recommend using multi-stage builds to copy artifacts from the -dev variant into a more minimal production image.
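As a sketch of that multi-stage pattern (the package name, tag, and paths below are illustrative, not part of this image's documented contents):

```Dockerfile
# Build stage: the -dev variant includes a shell and package manager.
FROM cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest-dev AS builder
# Install model dependencies into an isolated prefix (illustrative).
RUN pip install --no-cache-dir --target=/opt/deps numpy

# Final stage: minimal runtime image; no shell or package manager needed.
FROM cgr.dev/ORGANIZATION/azureml-inference-base-2204:latest
COPY --from=builder /opt/deps /opt/deps
ENV PYTHONPATH=/opt/deps
```

Only the copied artifacts reach the production image, keeping its attack surface minimal.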

Need additional packages?

To improve security, Chainguard Containers include only essential dependencies. Need more packages? Chainguard customers can use Custom Assembly to add packages, either through the Console, chainctl, or API.

To use Custom Assembly in the Chainguard Console: navigate to the image you'd like to customize in your Organization's list of images, and click on the Customize image button at the top of the page.

Learn More

Refer to our Chainguard Containers documentation on Chainguard Academy. Chainguard also offers VMs and Libraries; contact us for access.

Trademarks

This software listing is packaged by Chainguard. The trademarks set forth in this offering are owned by their respective companies, and use of them does not imply any affiliation, sponsorship, or endorsement by such companies.

Licenses

Chainguard's container images contain software packages that are direct or transitive dependencies. The following licenses were found in the "latest" tag of this image:

  • Apache-2.0

  • BSD-1-Clause

  • BSD-2-Clause

  • BSD-3-Clause

  • BSD-4-Clause-UC

  • CC-PDDC

  • GCC-exception-3.1

For a complete list of licenses, please refer to this Image's SBOM.

Software license agreement

Compliance

Chainguard Containers are SLSA Level 3 compliant, with detailed metadata and documentation about how each image was built. We generate build provenance and a Software Bill of Materials (SBOM) for each release, providing complete visibility into the software supply chain.

SLSA compliance at Chainguard

This image helps reduce time and effort in establishing PCI DSS 4.0 compliance with low-to-no CVEs.

PCI DSS at Chainguard

A FIPS-validated version of this image is available for FedRAMP compliance; a STIG is included with the FIPS image.


Related images

azureml-inference-base-2204-fips


Category
Application

© 2026 Chainguard, Inc. All Rights Reserved.
Chainguard® and the Chainguard logo are registered trademarks of Chainguard, Inc. in the United States and/or other countries.