Create and run an example job

Learn how to create and run an example batch processing job that transcodes videos by using Batch for Google Cloud.


To follow step-by-step guidance for this task directly in the Google Cloud console, click Guide me :

Guide me


Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. Install the Google Cloud CLI.

  3. If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity .

  4. To initialize the gcloud CLI, run the following command:

    gcloud init
  5. Create or select a Google Cloud project .

    • Create a Google Cloud project:

      gcloud projects create PROJECT_ID 
      

      Replace PROJECT_ID with a name for the Google Cloud project you are creating.

    • Select the Google Cloud project that you created:

      gcloud config set project PROJECT_ID 
      

      Replace PROJECT_ID with your Google Cloud project name.

  6. Verify that billing is enabled for your Google Cloud project .

  7. Enable the Batch, Compute Engine, Logging, and Cloud Storage APIs:

    gcloud services enable batch.googleapis.com compute.googleapis.com logging.googleapis.com storage.googleapis.com
  8. Make sure that you and the job's service account have the required permissions to complete this tutorial. This tutorial uses the default service account for a job, which is the Compute Engine default service account. If you prefer to grant these roles with the gcloud CLI, see the sketch after this list.

    • To get the permissions that you need to complete this tutorial, ask your administrator to grant you the following IAM roles:

      • To create, view, and delete jobs: Batch Job Editor ( roles/batch.jobsEditor ) on the project
      • To create, view, and delete Cloud Storage buckets: Storage Admin ( roles/storage.admin ) on the project
      • To view logs from jobs: Logs Viewer ( roles/logging.viewer ) on the project

      For more information about granting roles, see Manage access to projects, folders, and organizations .

      You might also be able to get the required permissions through custom roles or other predefined roles .

    • To ensure that the Compute Engine default service account has the necessary permissions to complete this tutorial, ask your administrator to grant the Compute Engine default service account the following IAM roles:

      • Batch Agent Reporter ( roles/batch.agentReporter ) on the project
      • To let jobs access Cloud Storage buckets: Storage Admin ( roles/storage.admin ) on the project
      • To let jobs generate logs in Logging: Logs Writer ( roles/logging.logWriter ) on the project
  9. Clone the Batch git repository into the current directory:
    git clone https://github.com/GoogleCloudPlatform/batch-samples.git
  10. Go to the transcoding directory:
    cd batch-samples/transcoding/
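
If you prefer the gcloud CLI to the console for granting the roles listed in the permissions step, the following is a minimal sketch; USER_EMAIL and PROJECT_NUMBER are placeholders for your account email and your project's numeric ID, and you would repeat the command once for each role listed above:

    # Grant a role to your user account (repeat for each role you need).
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="user:USER_EMAIL" \
        --role="roles/batch.jobsEditor"

    # Grant a role to the Compute Engine default service account (repeat as needed).
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
        --role="roles/batch.agentReporter"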

Prepare job inputs

  1. Create a Cloud Storage bucket:

     gcloud storage buckets create gs://BUCKET_NAME
     
    

    Replace BUCKET_NAME with a globally unique name for the bucket.

    The output is similar to the following:

     Creating gs://BUCKET_NAME/...
    
  2. Copy the transcode.sh script and the folder containing the video files to your Cloud Storage bucket:

     gcloud storage cp -R transcode.sh input gs://BUCKET_NAME
     
    

    The output is similar to the following:

     Copying file://transcode.sh to gs://BUCKET_NAME/transcode.sh
     Copying file://input/video-2.mp4 to gs://BUCKET_NAME/input/video-2.mp4
     Copying file://input/video-1.mp4 to gs://BUCKET_NAME/input/video-1.mp4
     Copying file://input/video-0.mp4 to gs://BUCKET_NAME/input/video-0.mp4
      Completed files 4/4 | 37.5MiB/37.5MiB
    
    Average throughput: 48.4MiB/s 
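
    Optionally, you can confirm that the script and input videos are in the bucket by listing its contents; this check is an addition to the tutorial steps and can be skipped:

     gcloud storage ls --recursive gs://BUCKET_NAME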
    

Create a job

  1. In a text editor of your choice, open the job.json configuration file.

  2. Set the value of the remotePath field to the name of your Cloud Storage bucket:

      {
        "taskGroups": [
          {
            "taskSpec": {
              "runnables": [
                {
                  "script": {
                    "text": "bash /mnt/share/transcode.sh"
                  }
                }
              ],
              "computeResource": {
                "cpuMilli": 2000,
                "memoryMib": 2048
              },
              "volumes": [
                {
                  "gcs": {
                    "remotePath": "BUCKET_NAME"
                  },
                  "mountPath": "/mnt/share"
                }
              ],
              "maxRetryCount": 2,
              "maxRunDuration": "600s"
            },
            "taskCount": 3,
            "parallelism": 3
          }
        ],
        "allocationPolicy": {
          "instances": [
            {
              "policy": {
                "machineType": "n2d-standard-4",
                "provisioningModel": "SPOT"
              }
            }
          ]
        },
        "labels": {
          "department": "creative",
          "env": "testing"
        },
        "logsPolicy": {
          "destination": "CLOUD_LOGGING"
        }
      }
     
    
  3. Save your changes and close the text editor.

  4. Create the transcode job:

     gcloud batch jobs submit transcode \
        --config=job.json \
        --location=us-central1 
    

    The output is similar to the following:

     Job transcode-7a1654ca-211c-40e8-b0fb-8a00 was successfully submitted.
    ... 
    

    The job runs 3 tasks concurrently. Each task runs the transcode.sh script, which encodes one of the three video files and uploads it to the Cloud Storage bucket; a simplified sketch of the script's logic follows.
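
The repository's transcode.sh is the authoritative script. As a rough sketch of the idea only, assuming ffmpeg is available on the VM (the real script handles installation and other details), each task could do something like the following, using the BATCH_TASK_INDEX environment variable that Batch sets for every task:

    #!/bin/bash
    # Simplified sketch; see transcode.sh in the cloned repository for the real script.
    set -e

    # Batch sets BATCH_TASK_INDEX to 0, 1, or 2 for the three tasks in this job.
    INPUT="/mnt/share/input/video-${BATCH_TASK_INDEX}.mp4"
    OUTPUT="/mnt/share/output/video-${BATCH_TASK_INDEX}.mp4"

    # The bucket is mounted at /mnt/share, so writing to output/ stores the result in Cloud Storage.
    mkdir -p /mnt/share/output
    ffmpeg -i "${INPUT}" -c:v libx264 "${OUTPUT}"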

Monitor the job

  1. In the Google Cloud console, go to the Job list page.

    Go to Job list

  2. In the Job name column, click transcode.

    The Job details page opens.

  3. Click the Events tab.

    In the Events list section, you can monitor the status of the transcode job. The time the job spends queued, scheduled, and running varies based on multiple factors. For this example, you can expect the job to finish in approximately 5 minutes.

  4. Optional: To update the page, click Refresh.

Before proceeding to the next step, make sure that the status of the job is set to Succeeded. If your job fails, see Troubleshooting instead.
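
You can also monitor the job from the gcloud CLI instead of the console; these commands supplement the console steps above:

    # Show the job's current state (for example QUEUED, SCHEDULED, RUNNING, SUCCEEDED, or FAILED).
    gcloud batch jobs describe transcode --location=us-central1

    # List the job's individual tasks and their states.
    gcloud batch tasks list --job=transcode --location=us-central1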

View the encoded videos

  1. In the Google Cloud console, go to the Buckets page.

    Go to Buckets

  2. In the Name column, click BUCKET_NAME.

    The Bucket details page opens.

  3. In the Name column, click output/, and then click one of the encoded video files.

    The Object details page opens.

  4. To view the encoded video, click Preview, and then click Play.
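
Alternatively, you can list the encoded files and copy them to your machine with the gcloud CLI; downloading to a local folder is optional:

    # List the encoded videos that the job wrote to the bucket.
    gcloud storage ls gs://BUCKET_NAME/output/

    # Optionally download them to a local folder to play them.
    gcloud storage cp -R gs://BUCKET_NAME/output .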

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, delete the Google Cloud project with the resources.

Delete the project

The easiest way to eliminate billing is to delete the project you used in this tutorial.

Delete a Google Cloud project:

gcloud projects delete PROJECT_ID 

Delete individual resources

If you want to keep using the current project, then delete the individual resources used in this tutorial.

Delete the job

After the Batch job has finished running, delete the transcode job:

 gcloud batch jobs delete transcode \
    --location=us-central1 

The output is similar to the following:

 Job projects/example-project/locations/us-central1/jobs/transcode deletion is in progress 

Deleting a job also deletes the job's details and history. The job's logs are automatically deleted at the end of the Cloud Logging log retention period .

Delete the bucket

If you no longer need the Cloud Storage bucket you used in this tutorial and its content, delete the bucket:

 gcloud storage rm gs://BUCKET_NAME \
    --recursive

The output is similar to the following:

 Removing objects:
Removing gs://BUCKET_NAME/input/video-0.mp4#1694788495332395...
Removing gs://BUCKET_NAME/input/video-2.mp4#1694788495296173...
Removing gs://BUCKET_NAME/input/video-1.mp4#1694788495228839...
Removing gs://BUCKET_NAME/output/video-0.mp4#1694788495332395...
Removing gs://BUCKET_NAME/output/video-2.mp4#1694788495296173...
Removing gs://BUCKET_NAME/output/video-1.mp4#1694788495228839...
Removing gs://BUCKET_NAME/transcode.sh#1694788495039427...
  Completed 7/7
Removing Buckets:
Removing gs://BUCKET_NAME/...
  Completed 1/1

Delete the git repository

If you no longer need the Batch git repository that you cloned for this tutorial, you can delete it:

 cd ../../ && rm -rf batch-samples 
