Tutorial: Build and test Python applications
This page describes how to configure Cloud Build to build and test your Python applications, upload your artifacts to Artifact Registry, generate provenance information, and save your test logs in Cloud Storage.
Cloud Build enables you to use any publicly available container image to execute your tasks. The public `python` image from Docker Hub comes preinstalled with the `python` and `pip` tools. You can configure Cloud Build to use these tools to install dependencies, and to build and run unit tests.
Before you begin
The instructions on this page assume that you are familiar with Python. In addition:
- Enable the Cloud Build, Artifact Registry, and Cloud Storage APIs.
- To run the `gcloud` commands on this page, install the Google Cloud CLI.
- Have your Python project handy.
- Have a Python repository in Artifact Registry. If you do not have one, create a new repository.
- If you want to store test logs in Cloud Storage, create a bucket in Cloud Storage.
Required IAM permissions
- To store test logs in Cloud Storage, grant the Storage Object Creator (`roles/storage.objectCreator`) role for the Cloud Storage bucket to the Cloud Build service account.
- To store built artifacts in Artifact Registry, grant the Artifact Registry Writer (`roles/artifactregistry.writer`) role to the Cloud Build service account.

For instructions on granting these roles, see Granting a role using the IAM page.
Configuring Python builds
This section walks through an example build config file for a Python app. It has build steps to install requirements, add unit tests, and, after the tests pass, build and deploy the app.
In your project root directory, create a Cloud Build config file named `cloudbuild.yaml`.

Install requirements: The `python` image from Docker Hub comes preinstalled with `pip`. To install your dependencies with `pip`, add a build step with the following fields:

- `name`: Set the value of this field to `python` or `python:<tag>` to use the python image from Docker Hub for this task. To see a list of available tags for other Python images, see the Docker Hub reference for the python image.
- `entrypoint`: Setting this field overrides the default entrypoint of the image referenced in `name`. Set the value of this field to `pip` to invoke `pip` as the entrypoint of the build step and run `pip` commands.
- `args`: The `args` field of a build step takes a list of arguments and passes them to the image referenced by the `name` field. Pass the arguments to run the `pip install` command in this field. The `--user` flag in the `pip install` command ensures that subsequent build steps can access the modules installed in this build step.
The following build step adds arguments to install requirements:
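As a sketch, assuming your dependencies are listed in a `requirements.txt` file at the repository root (shown with the top-level `steps:` key of the config file):

```yaml
steps:
# Install dependencies; --user makes the installed modules
# available to subsequent build steps.
- name: python
  entrypoint: pip
  args: ["install", "-r", "requirements.txt", "--user"]
```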
Add unit tests: If you've defined unit tests in your application using a testing framework such as `pytest`, you can configure Cloud Build to run the tests by adding the following fields in a build step:

- `name`: Set the value of this field to `python` to use the python image from Docker Hub for your task.
- `entrypoint`: Set the value of this field to `python` to run `python` commands.
- `args`: Add the arguments for running the `pytest` command.
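A sketch of such a test step, assuming `pytest` is listed in `requirements.txt` and was installed by the preceding step; it writes the test log to a JUnit XML file named with the `$SHORT_SHA` substitution:

```yaml
# Run unit tests and write a JUnit XML report named with the
# short commit SHA so a later step can upload it.
- name: python
  entrypoint: python
  args: ["-m", "pytest", "--junitxml=${SHORT_SHA}_test_log.xml"]
```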
The test step saves the `pytest` log output to a JUnit XML file. The name of this file is constructed using `$SHORT_SHA`, the short version of the commit ID associated with your build. A subsequent build step saves the logs in this file to Cloud Storage.

Build: In your build config file, define the builder and the `args` to build your application:

- `name`: Set the value of this field to `python` to use the python image from Docker Hub for your task.
- `entrypoint`: Set the value of this field to `python` to run `python` commands.
- `args`: Add the arguments for executing your build.
The following build step starts the build:
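A minimal sketch of such a build step, assuming the project defines a standard `setup.py` or `pyproject.toml` and the `build` package is available (for example, listed in `requirements.txt`):

```yaml
# Build source and wheel distributions into the dist/ directory.
- name: python
  entrypoint: python
  args: ["-m", "build"]
```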
Upload to Artifact Registry: Cloud Build generates Supply chain Levels for Software Artifacts (SLSA) build provenance information for standalone Python packages when you upload artifacts to Artifact Registry using the `pythonPackages` field available in the Cloud Build config file.

In your config file, add the `pythonPackages` field and specify your Python repository in Artifact Registry. Replace the following values:

- PROJECT-ID: the ID of the Google Cloud project that contains your Artifact Registry repository.
- REPOSITORY: the ID of the repository.
- LOCATION: the regional or multi-regional location for the repository.
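A sketch of the corresponding `artifacts` section; the `paths` value assumes your built packages land in `dist/`, and the placeholders match the list above:

```yaml
# Upload built packages to a Python repository in Artifact Registry.
artifacts:
  pythonPackages:
    - repository: "https://LOCATION-python.pkg.dev/PROJECT-ID/REPOSITORY"
      paths: ["dist/*"]
```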
Optional: Enable provenance for regional builds

If you are using a regional build, add the `requestedVerifyOption` field under `options` in your build config file. Set the value to `VERIFIED` to enable provenance metadata generation. If you don't add `requestedVerifyOption: VERIFIED`, Cloud Build generates provenance for global builds only.

Save test logs to Cloud Storage: You can configure Cloud Build to store any test logs in Cloud Storage by specifying an existing bucket location and a path to the test logs. The following build step stores the test logs that you saved in the JUnit XML file to a Cloud Storage bucket:
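A sketch of such a step; BUCKET_NAME is a placeholder for your existing Cloud Storage bucket, and the file name matches the `$SHORT_SHA` naming used in the test step:

```yaml
# Copy the JUnit XML test log to an existing Cloud Storage bucket.
- name: gcr.io/cloud-builders/gsutil
  args: ["cp", "${SHORT_SHA}_test_log.xml", "gs://BUCKET_NAME/${SHORT_SHA}_test_log.xml"]
```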
Start your build, either manually or using build triggers.
Once your build completes, you can view repository details in Artifact Registry.
You can also view build provenance metadata and validate provenance to help protect your software supply chain.
Build, test, and containerize Python applications
This page describes how to configure Cloud Build to build, test, containerize, and deploy Python applications.
Cloud Build enables you to use any publicly available container image to execute your development tasks, including building, testing, containerizing, uploading to Artifact Registry, deploying, and saving your build logs. The public `python` image from Docker Hub comes preinstalled with the `python` and `pip` tools. You can configure Cloud Build to use these tools to install dependencies, and to build and run unit tests.
Before you begin
The instructions on this page assume that you are familiar with Python. In addition:
- Enable the Cloud Build, Cloud Run, Cloud Storage, and Artifact Registry APIs.
- To run the `gcloud` commands on this page, install the Google Cloud CLI.
- Have your Python project handy, including the `requirements.txt` file. You need a `Dockerfile` along with your source code.
- If you want to store the built container in Artifact Registry, create a Docker repository in Artifact Registry.
- If you want to store test logs in Cloud Storage, create a bucket in Cloud Storage.
Required IAM permissions
- To store test logs in Cloud Storage, grant the Storage Object Creator (`roles/storage.objectCreator`) role for the Cloud Storage bucket to the Cloud Build service account.
- To store built images in Artifact Registry, grant the Artifact Registry Writer (`roles/artifactregistry.writer`) role to the Cloud Build service account.

For instructions on granting these roles, see Granting a role using the IAM page.
Configuring Python builds
This section walks through an example build config file for a Python app. It has build steps to install requirements, add unit tests, and, after the tests pass, build and deploy the app.
In your project root directory, create a Cloud Build config file named `cloudbuild.yaml`.

Install requirements: The `python` image from Docker Hub comes preinstalled with `pip`. To install your dependencies with `pip`, add a build step with the following fields:

- `name`: Set the value of this field to `python` to use the python image from Docker Hub for this task.
- `entrypoint`: Setting this field overrides the default entrypoint of the image referenced in `name`. Set the value of this field to `pip` to invoke `pip` as the entrypoint of the build step and run `pip` commands.
- `args`: The `args` field of a build step takes a list of arguments and passes them to the image referenced by the `name` field. Pass the arguments to run the `pip install` command in this field. The `--user` flag in the `pip install` command ensures that subsequent build steps can access the modules installed in this build step.
The install step adds arguments to install requirements from the `requirements.txt` file, as shown in the earlier tutorial on this page.

Add unit tests: If you've defined unit tests in your application using a testing framework such as `pytest`, you can configure Cloud Build to run the tests by adding the following fields in a build step:

- `name`: Set the value of this field to `python` to use the python image from Docker Hub for your task.
- `entrypoint`: Set the value of this field to `python` to run `python` commands.
- `args`: Add the arguments for running the `pytest` command.
The test step saves the `pytest` log output to a JUnit XML file. The name of this file is constructed using the short version of the commit ID associated with your build. A subsequent build step saves the logs in this file to Cloud Storage.

Containerize the app: After adding the build step that ensures the tests have passed, you can build the application. Cloud Build provides a prebuilt Docker image that you can use to containerize your Python application. To containerize your app, add the following fields in a build step:

- `name`: Set the value of this field to `gcr.io/cloud-builders/docker` to use the prebuilt docker image for your task.
- `args`: Add the arguments for the `docker build` command as values for this field.
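A sketch of such a containerize step; the `us-central1` region and image name `myimage` are illustrative, while `$PROJECT_ID`, `$REPO_NAME`, and `$SHORT_SHA` are default substitutions provided by Cloud Build:

```yaml
# Build the container image from the Dockerfile in the repository
# root and tag it with the short commit SHA.
- name: gcr.io/cloud-builders/docker
  args: ["build", "-t",
         "us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA", "."]
```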
The containerize step builds the image `myimage` and tags it with the short version of your commit ID. The step uses the default substitutions for the project ID, repository name, and short SHA values, so these values are automatically substituted at build time.

Push the container to Artifact Registry: You can store the built container in Artifact Registry, a Google Cloud service that you can use to store, manage, and secure build artifacts. To do this, you need an existing Docker repository in Artifact Registry. To configure Cloud Build to store the image in an Artifact Registry Docker repository, add a build step with the following fields:

- `name`: Set the value of this field to `gcr.io/cloud-builders/docker` to use the official `docker` builder image from Container Registry for your task.
- `args`: Add the arguments for the `docker push` command as values of this field. For the destination URL, enter the Artifact Registry Docker repository where you want to store the image.
The following build step pushes the image that you built in the previous step to Artifact Registry:
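A sketch of the push step; the `us-central1` region and image name `myimage` are illustrative, while `$PROJECT_ID`, `$REPO_NAME`, and `$SHORT_SHA` are default substitutions:

```yaml
# Push the tagged image to the Artifact Registry Docker repository.
- name: gcr.io/cloud-builders/docker
  args: ["push",
         "us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA"]
```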
Optional: If you want Cloud Build to generate build provenance information, use the `images` field in your build config file instead of a separate `docker push` build step. If you are using regional builds, you must also add the `requestedVerifyOption` field and set the value to `VERIFIED` to enable provenance generation.

Deploy the container to Cloud Run: To deploy the image on Cloud Run, add a build step with the following fields:

- `name`: Set the value of this field to `google/cloud-sdk` to use the gcloud CLI image to invoke the `gcloud` command to deploy the image on Cloud Run.
- `args`: Add the arguments for the `gcloud run deploy` command as the values of this field.
The following build step deploys the previously built image to Cloud Run:
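A sketch of such a deploy step; the service name `my-service`, region, and image path are illustrative placeholders:

```yaml
# Deploy the pushed image to Cloud Run.
- name: google/cloud-sdk
  args: ["gcloud", "run", "deploy", "my-service",
         "--image=us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA",
         "--region", "us-central1", "--platform", "managed"]
```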
Save test logs to Cloud Storage: You can configure Cloud Build to store any test logs in Cloud Storage by specifying an existing bucket location and a path to the test logs. The following build step stores the test logs that you saved in the JUnit XML file to a Cloud Storage bucket:
The following snippet shows the complete build config file for all the steps described above:
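A hedged sketch of a complete `cloudbuild.yaml` combining the steps above. The region `us-central1`, service name `my-service`, image name `myimage`, and BUCKET_NAME are illustrative placeholders; `$PROJECT_ID`, `$REPO_NAME`, and `$SHORT_SHA` are default substitutions provided by Cloud Build:

```yaml
steps:
# Install dependencies; --user makes them visible to later steps.
- name: python
  entrypoint: pip
  args: ["install", "-r", "requirements.txt", "--user"]

# Run unit tests and write a JUnit XML report named with the short commit SHA.
- name: python
  entrypoint: python
  args: ["-m", "pytest", "--junitxml=${SHORT_SHA}_test_log.xml"]

# Build the container image using the Dockerfile in the repository root.
- name: gcr.io/cloud-builders/docker
  args: ["build", "-t",
         "us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA", "."]

# Push the image to the Artifact Registry Docker repository.
- name: gcr.io/cloud-builders/docker
  args: ["push",
         "us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA"]

# Deploy the image to Cloud Run.
- name: google/cloud-sdk
  args: ["gcloud", "run", "deploy", "my-service",
         "--image=us-central1-docker.pkg.dev/$PROJECT_ID/$REPO_NAME/myimage:$SHORT_SHA",
         "--region", "us-central1", "--platform", "managed"]

# Copy the JUnit XML test log to an existing Cloud Storage bucket.
- name: gcr.io/cloud-builders/gsutil
  args: ["cp", "${SHORT_SHA}_test_log.xml", "gs://BUCKET_NAME/${SHORT_SHA}_test_log.xml"]
```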
Start your build, either manually or using build triggers.
Once your build completes, you can view repository details in Artifact Registry.
You can also view build provenance metadata and validate provenance to help protect your software supply chain.