Tutorial: Build and test Python applications

This page describes how to configure Cloud Build to build and test your Python applications, upload your artifacts to Artifact Registry, generate provenance information, and save your test logs in Cloud Storage.

Cloud Build enables you to use any publicly available container image to execute your tasks. The public python image from Docker Hub comes preinstalled with the python and pip tools. You can configure Cloud Build to use these tools to install dependencies, build your application, and run unit tests.

The instructions on this page assume that you are familiar with Python. For instructions on granting the required IAM roles, see Granting a role using the IAM page.

This section walks through an example build config file for a Python app. It includes build steps to install requirements, run unit tests, and, after the tests pass, build and deploy the app.

  1. In your project root directory, create a Cloud Build config file named cloudbuild.yaml.

  2. Install requirements: The python image from Docker Hub comes preinstalled with pip. To install dependencies with pip, add a build step with the following fields:

    • name: Set the value of this field to python or python:<tag> to use the python image from Docker Hub for this task. To see a list of available tags for other Python images, see the Docker Hub reference for the python image.
    • entrypoint: Setting this field overrides the default entrypoint of the image referenced in name. Set the value of this field to pip to invoke pip as the entrypoint of the build step and run pip commands.
    • args: The args field of a build step takes a list of arguments and passes them to the image referenced by the name field. Pass the arguments for the pip install command in this field. The --user flag in the pip install command ensures that subsequent build steps can access the modules installed in this build step.

    The following build step adds arguments to install requirements:

     steps:
     - name: 'python'
       entrypoint: 'python'
       args: ['-m', 'pip', 'install', '--upgrade', 'pip']
     - name: 'python'
       entrypoint: 'python'
       args: ['-m', 'pip', 'install', 'build', 'pytest', 'Flask', '--user']
  3. Add unit tests: If you've defined unit tests in your application using a testing framework such as pytest, you can configure Cloud Build to run the tests by adding the following fields in a build step:

    • name: Set the value of this field to python to use the python image from Docker Hub for your task.
    • entrypoint: Set the value of this field to python to run python commands.
    • args: Add the arguments for running the python pytest command.

    The following build step saves the pytest log output to a JUnit XML file. The name of this file is constructed using $SHORT_SHA, the short version of the commit ID associated with your build. A subsequent build step will save the logs in this file to Cloud Storage.

        - name: 'python'
          entrypoint: 'python'
          args: ['-m', 'pytest', '--junitxml=${SHORT_SHA}_test_log.xml']
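    For this step to do anything useful, the repository needs tests that pytest can discover. As a hypothetical illustration (the file, module, and function names are assumptions, not part of this tutorial), a minimal test_app.py might look like:

```python
# test_app.py -- a hypothetical unit test file that pytest would discover.
# The function under test is defined inline here for illustration; in a
# real project it would be imported from your application module.

def greet(name: str) -> str:
    """Return a greeting for the given name."""
    return f"Hello, {name}!"

def test_greet():
    assert greet("Cloud Build") == "Hello, Cloud Build!"

def test_greet_empty_name():
    assert greet("") == "Hello, !"
```

    pytest collects any function named test_* in files named test_*.py, so no extra configuration is needed for the build step above to run these.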
  4. Build: In your build config file, define the builder and the args to build your application:

    • name: Set the value of this field to python to use the python image from Docker Hub for your task.
    • entrypoint: Set the value of this field to python to run python commands.
    • args: Add the arguments for executing your build.

    The following build step starts the build:

        - name: 'python'
          entrypoint: 'python'
          args: ['-m', 'build']
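    The python -m build command reads packaging metadata from a pyproject.toml file in the project root and writes the built distributions to dist/, which is the path the upload step below expects. A minimal sketch of such a file, assuming a setuptools backend and a placeholder project name and version, might look like:

```toml
# Hypothetical pyproject.toml -- the project name, version, and backend
# choice are placeholders, not part of this tutorial.
[build-system]
requires = ["setuptools>=61.0"]
build-backend = "setuptools.build_meta"

[project]
name = "my-python-app"
version = "0.1.0"
dependencies = ["Flask"]
```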
  5. Upload to Artifact Registry:

    Cloud Build generates Supply chain Levels for Software Artifacts (SLSA) build provenance information for standalone Python packages when you upload artifacts to Artifact Registry using the pythonPackages field available in the Cloud Build config file.

    In your config file, add the pythonPackages field and specify your Python repository in Artifact Registry:

        artifacts:
          pythonPackages:
          - repository: 'https://LOCATION-python.pkg.dev/PROJECT-ID/REPOSITORY'
            paths: ['dist/*']

    Replace the following values:

    • PROJECT-ID is the ID of the Google Cloud project that contains your Artifact Registry repository.
    • REPOSITORY is the ID of the repository.
    • LOCATION is the regional or multi-regional location for the repository.
  6. Optional: Enable provenance for regional builds

    If you are using a regional build, add the requestedVerifyOption field under options in your build config file. Set the value to VERIFIED to enable provenance metadata generation. If you don't add requestedVerifyOption: VERIFIED, Cloud Build generates provenance for global builds only.

    options:
      requestedVerifyOption: VERIFIED
  7. Save test logs to Cloud Storage: You can configure Cloud Build to store any test logs in Cloud Storage by specifying an existing bucket location and path to the test logs. The following build step stores the test logs that you saved in the JUnit XML file to a Cloud Storage bucket:

        artifacts:
          objects:
            location: 'gs://${_BUCKET_NAME}/'
            paths:
            - '${SHORT_SHA}_test_log.xml'
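    The saved file is a standard JUnit XML report, so it can be inspected with ordinary tooling once downloaded from the bucket. As an illustration (the helper name and the sample report below are assumptions, not part of this tutorial), Python's standard library is enough to summarize it:

```python
import xml.etree.ElementTree as ET

def summarize_junit(xml_text: str) -> dict:
    """Return test counts from a JUnit XML report such as pytest produces."""
    root = ET.fromstring(xml_text)
    # pytest wraps results in <testsuites><testsuite .../></testsuites>;
    # a bare <testsuite> root is also handled here.
    suite = root if root.tag == "testsuite" else root.find("testsuite")
    return {
        "tests": int(suite.get("tests", 0)),
        "failures": int(suite.get("failures", 0)),
        "errors": int(suite.get("errors", 0)),
    }

# Illustrative report text, not real build output.
sample = """
<testsuites>
  <testsuite name="pytest" tests="3" failures="1" errors="0">
    <testcase classname="test_app" name="test_greet"/>
  </testsuite>
</testsuites>
"""
print(summarize_junit(sample))  # {'tests': 3, 'failures': 1, 'errors': 0}
```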
  8. Start your build manually or using build triggers.

    Once your build completes, you can view repository details in Artifact Registry.

    You can also view build provenance metadata and validate provenance to help protect your software supply chain.


Build, test, and containerize Python applications

 

This page describes how to configure Cloud Build to build, test, containerize, and deploy Python applications.

Cloud Build enables you to use any publicly available container image to execute your development tasks, including building, testing, containerizing, uploading to Artifact Registry, deploying, and saving your build logs. The public python image from Docker Hub comes preinstalled with the python and pip tools. You can configure Cloud Build to use these tools to install dependencies, build your application, and run unit tests.

The instructions on this page assume that you are familiar with Python. For instructions on granting the required IAM roles, see Granting a role using the IAM page.

This section walks through an example build config file for a Python app. It includes build steps to install requirements, run unit tests, and, after the tests pass, build and deploy the app.

  1. In your project root directory, create a Cloud Build config file named cloudbuild.yaml.

  2. Install requirements: The python image from Docker Hub comes preinstalled with pip. To install dependencies with pip, add a build step with the following fields:

    • name: Set the value of this field to python to use the python image from Docker Hub for this task.
    • entrypoint: Setting this field overrides the default entrypoint of the image referenced in name. Set the value of this field to pip to invoke pip as the entrypoint of the build step and run pip commands.
    • args: The args field of a build step takes a list of arguments and passes them to the image referenced by the name field. Pass the arguments for the pip install command in this field. The --user flag in the pip install command ensures that subsequent build steps can access the modules installed in this build step.

    The following build step adds arguments to install requirements from the requirements.txt file:

    steps:
    # Install dependencies
    - name: python
      entrypoint: pip
      args: ["install", "-r", "requirements.txt", "--user"]
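    This step assumes a requirements.txt file at the repository root. A minimal placeholder, with the package list as an assumption rather than part of this tutorial:

```text
# Hypothetical requirements.txt -- the package list is a placeholder;
# pin versions here for reproducible builds.
Flask
pytest
```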
  3. Add unit tests: If you've defined unit tests in your application using a testing framework such as pytest, you can configure Cloud Build to run the tests by adding the following fields in a build step:

    • name: Set the value of this field to python to use the python image from Docker Hub for your task.
    • entrypoint: Set the value of this field to python to run python commands.
    • args: Add the arguments for running the python pytest command.

    The following build step saves the pytest log output to a JUnit XML file. The name of this file is constructed using the short version of the commit ID associated with your build. A subsequent build step will save the logs in this file to Cloud Storage.

    # Run unit tests
    - name: python
      entrypoint: python
      args: ["-m", "pytest", "--junitxml=${SHORT_SHA}_test_log.xml"]
  4. Containerize the app: After adding the build step to ensure that the tests have passed, you can build the application. Cloud Build provides a pre-built Docker image that you can use to containerize your Python application. To containerize your app, add the following fields in a build step:

    • name: Set the value of this field to gcr.io/cloud-builders/docker to use the prebuilt docker image for your task.
    • args: Add the arguments for the docker build command as values for this field.

    The following build step builds the image myimage and tags it with the short version of your commit ID. The build step uses substitutions for the project ID, repository name, and short SHA values, so these values are automatically substituted at build time.

    # Docker Build
    - name: 'gcr.io/cloud-builders/docker'
      args: ['build', '-t',
             'us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}', '.']
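    The docker build step assumes a Dockerfile at the repository root ('.' is the build context). A minimal sketch, in which the base image tag, the app.py entrypoint, and the copied files are assumptions rather than part of this tutorial, might look like:

```dockerfile
# Hypothetical Dockerfile -- base image tag and entrypoint are assumptions.
FROM python:3.12-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# The app should listen on the port Cloud Run supplies via $PORT.
CMD ["python", "app.py"]
```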
  5. Push the container to Artifact Registry: You can store the built container in Artifact Registry, which is a Google Cloud service that you can use to store, manage, and secure build artifacts. To do this, you'll need to have an existing Docker repository in Artifact Registry. To configure Cloud Build to store the image in an Artifact Registry Docker repository, add a build step with the following fields:

    • name: Set the value of this field to gcr.io/cloud-builders/docker to use the official docker builder image from Container Registry for your task.
    • args: Add the arguments for the docker push command as values of this field. For the destination URL, enter the Artifact Registry Docker repository where you want to store the image.

    The following build step pushes the image that you built in the previous step to Artifact Registry:

    # Docker push to Google Artifact Registry
    - name: 'gcr.io/cloud-builders/docker'
      args: ['push', 'us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}']

    Optional: If you want Cloud Build to generate build provenance information, use the images field in your build step instead of a separate Docker push build step. If you are using regional builds, you must also add the requestedVerifyOption field and set the value to VERIFIED to enable provenance generation.

  6. Deploy the container to Cloud Run: To deploy the image on Cloud Run, add a build step with the following fields:

    • name: Set the value of this field to google/cloud-sdk to use the gcloud CLI image to invoke the gcloud command to deploy the image on Cloud Run.
    • args: Add the arguments for the gcloud run deploy command as the values of this field.

    The following build step deploys the previously built image to Cloud Run:

    # Deploy to Cloud Run
    - name: google/cloud-sdk
      args: ['gcloud', 'run', 'deploy', 'helloworld-${SHORT_SHA}',
             '--image=us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}',
             '--region', 'us-central1', '--platform', 'managed',
             '--allow-unauthenticated']
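    For the deployment to serve traffic, the container must listen on the port that Cloud Run passes in the PORT environment variable. A stdlib-only sketch of such a service (the handler and response body are illustrative, not taken from this tutorial):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """Minimal handler that answers every GET with a plain-text greeting."""

    def do_GET(self):
        body = b"Hello, World!"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep container stdout quiet; Cloud Run captures request logs itself.
        pass

def main():
    # Cloud Run injects the serving port via the PORT environment variable;
    # 8080 is a common local default.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), HelloHandler).serve_forever()

if __name__ == "__main__":
    main()
```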
  7. Save test logs to Cloud Storage: You can configure Cloud Build to store any test logs in Cloud Storage by specifying an existing bucket location and path to the test logs. The following build step stores the test logs that you saved in the JUnit XML file to a Cloud Storage bucket:

    # Save test logs to Google Cloud Storage
    artifacts:
      objects:
        location: gs://${_BUCKET_NAME}/
        paths:
        - ${SHORT_SHA}_test_log.xml

    The following snippet shows the complete build config file for all the steps described above:

    steps:
    # Install dependencies
    - name: python
      entrypoint: pip
      args: ["install", "-r", "requirements.txt", "--user"]

    # Run unit tests
    - name: python
      entrypoint: python
      args: ["-m", "pytest", "--junitxml=${SHORT_SHA}_test_log.xml"]

    # Docker Build
    - name: 'gcr.io/cloud-builders/docker'
      args: ['build', '-t',
             'us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}', '.']

    # Docker push to Google Artifact Registry
    - name: 'gcr.io/cloud-builders/docker'
      args: ['push', 'us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}']

    # Deploy to Cloud Run
    - name: google/cloud-sdk
      args: ['gcloud', 'run', 'deploy', 'helloworld-${SHORT_SHA}',
             '--image=us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}',
             '--region', 'us-central1', '--platform', 'managed',
             '--allow-unauthenticated']

    # Save test logs to Google Cloud Storage
    artifacts:
      objects:
        location: gs://${_BUCKET_NAME}/
        paths:
        - ${SHORT_SHA}_test_log.xml
    # Store images in Google Artifact Registry
    images:
    - us-central1-docker.pkg.dev/${PROJECT_ID}/${_ARTIFACT_REGISTRY_REPO}/myimage:${SHORT_SHA}
  8. Start your build manually or using build triggers.

    Once your build completes, you can view repository details in Artifact Registry.

    You can also view build provenance metadata and validate provenance to help protect your software supply chain.
