This page describes how to create, view, list, cancel, and delete storage batch operations jobs. It also describes how to use Cloud Audit Logs with storage batch operations jobs.
Before you begin
To create and manage storage batch operations jobs, complete the steps in the following sections.
Configure Storage Intelligence
To create and manage storage batch operations jobs, configure Storage Intelligence on the bucket where you want to run the job.
Create a storage batch operations job
This section describes how to create a storage batch operations job.
Roles required
To get the required permissions for creating a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. This predefined role contains the following permissions required to create a storage batch operations job:
storagebatchoperations.jobs.create
storage.objects.delete (Only required if running the delete objects storage batch operations job)
storage.objects.update (Only required if running the update object metadata, update object customer-managed encryption key, or update object hold storage batch operations job)
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
gcloud storage batch-operations jobs create JOB_NAME --bucket=BUCKET_NAME OBJECT_SELECTION_FLAG JOB_TYPE_FLAG
Where:
JOB_NAME is the name of the storage batch operations job.
BUCKET_NAME is the name of the bucket that contains one or more objects you want to process.
OBJECT_SELECTION_FLAGis one of the following flags:
--included-object-prefixes: Specify one or more object prefixes. For example:
To match a single prefix, use: --included-object-prefixes='prefix1'.
To match multiple prefixes, use a comma-separated prefix list: --included-object-prefixes='prefix1,prefix2'.
To include all objects, use an empty prefix: --included-object-prefixes=''.
--manifest-location: Specify the manifest location. For example, gs://bucket_name/path/object_name.csv.
JOB_TYPE_FLAG is one of the following flags, depending on the job type:
--delete-object: Delete one or more objects.
--put-metadata: Update object metadata. Object metadata is stored as key-value pairs. Specify the key-value pair for the metadata you want to modify. You can specify one or more key-value pairs as a list.
use Google\Cloud\StorageBatchOperations\V1\Client\StorageBatchOperationsClient;
use Google\Cloud\StorageBatchOperations\V1\CreateJobRequest;
use Google\Cloud\StorageBatchOperations\V1\Job;
use Google\Cloud\StorageBatchOperations\V1\BucketList;
use Google\Cloud\StorageBatchOperations\V1\BucketList\Bucket;
use Google\Cloud\StorageBatchOperations\V1\PrefixList;
use Google\Cloud\StorageBatchOperations\V1\DeleteObject;

/**
 * Create a new batch job.
 *
 * @param string $projectId Your Google Cloud project ID.
 *        (e.g. 'my-project-id')
 * @param string $jobId A unique identifier for this job.
 *        (e.g. '94d60cc1-2d95-41c5-b6e3-ff66cd3532d5')
 * @param string $bucketName The name of your Cloud Storage bucket to operate on.
 *        (e.g. 'my-bucket')
 * @param string $objectPrefix The prefix of objects to include in the operation.
 *        (e.g. 'prefix1')
 */
function create_job(string $projectId, string $jobId, string $bucketName, string $objectPrefix): void
{
    // Create a client.
    $storageBatchOperationsClient = new StorageBatchOperationsClient();

    $parent = $storageBatchOperationsClient->locationName($projectId, 'global');

    // Select the objects to process by prefix within the bucket.
    $prefixListConfig = new PrefixList(['included_object_prefixes' => [$objectPrefix]]);
    $bucket = new Bucket(['bucket' => $bucketName, 'prefix_list' => $prefixListConfig]);
    $bucketList = new BucketList(['buckets' => [$bucket]]);

    // Configure a delete-object job; permanent deletion is disabled.
    $deleteObject = new DeleteObject(['permanent_object_deletion_enabled' => false]);
    $job = new Job(['bucket_list' => $bucketList, 'delete_object' => $deleteObject]);

    $request = new CreateJobRequest([
        'parent' => $parent,
        'job_id' => $jobId,
        'job' => $job,
    ]);

    $response = $storageBatchOperationsClient->createJob($request);
    printf('Created job: %s', $response->getName());
}
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
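The job configuration built by the PHP sample can also be pictured as a plain JSON payload. The Python sketch below assembles that structure for a delete-object job purely as an illustration; the camelCase field names are an assumption derived from the client-library message types, so verify them against the REST reference before relying on them.

```python
import json

def build_create_job_body(bucket_name: str, object_prefix: str) -> dict:
    """Sketch of a storage batch operations job configuration.

    Field names mirror the client-library message types (BucketList,
    PrefixList, DeleteObject); camelCase JSON casing is an assumption.
    """
    return {
        "bucketList": {
            "buckets": [
                {
                    "bucket": bucket_name,
                    # Objects are selected by prefix, as with
                    # --included-object-prefixes on the command line.
                    "prefixList": {"includedObjectPrefixes": [object_prefix]},
                }
            ]
        },
        # Soft-delete the matched objects rather than permanently
        # removing them, matching the PHP sample's setting.
        "deleteObject": {"permanentObjectDeletionEnabled": False},
    }

body = build_create_job_body("my-bucket", "prefix1")
print(json.dumps(body, indent=2))
```

The same dictionary shape extends to the other job types by swapping the deleteObject section for the relevant transformation.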
Get storage batch operations job details
This section describes how to get the storage batch operations job details.
Roles required
To get the required permissions for viewing a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. This predefined role contains the following permissions required to view a storage batch operations job:
storagebatchoperations.jobs.get
storagebatchoperations.operations.get
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
List storage batch operations jobs
This section describes how to list the storage batch operations jobs within a project.
Roles required
To get the required permissions for listing all storage batch operations jobs, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. This predefined role contains the following permissions required to list storage batch operations jobs:
storagebatchoperations.jobs.list
storagebatchoperations.operations.list
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
use Google\Cloud\StorageBatchOperations\V1\Client\StorageBatchOperationsClient;
use Google\Cloud\StorageBatchOperations\V1\ListJobsRequest;

/**
 * List Jobs in a given project.
 *
 * @param string $projectId Your Google Cloud project ID.
 *        (e.g. 'my-project-id')
 */
function list_jobs(string $projectId): void
{
    // Create a client.
    $storageBatchOperationsClient = new StorageBatchOperationsClient();

    $parent = $storageBatchOperationsClient->locationName($projectId, 'global');
    $request = new ListJobsRequest(['parent' => $parent]);

    $jobs = $storageBatchOperationsClient->listJobs($request);
    foreach ($jobs as $job) {
        printf('Job name: %s' . PHP_EOL, $job->getName());
    }
}
PROJECT_ID is the ID or number of the project. For example, my-project.
Cancel a storage batch operations job
This section describes how to cancel a storage batch operations job within a project.
Roles required
To get the required permissions for canceling a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. This predefined role contains the following permissions required to cancel a storage batch operations job:
storagebatchoperations.jobs.cancel
storagebatchoperations.operations.cancel
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
Delete a storage batch operations job
This section describes how to delete a storage batch operations job.
Roles required
To get the required permission for deleting a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. This predefined role contains the storagebatchoperations.jobs.delete permission required to delete a storage batch operations job.
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
Create a storage batch operations job using Storage Insights datasets
To create a storage batch operations job using Storage Insights
datasets, complete the steps in the following sections.
Roles required
To get the required permissions for creating storage batch operations jobs, ask your
administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the
project. This predefined role contains the following permissions required to
create storage batch operations jobs:
storagebatchoperations.jobs.create
storage.objects.delete (Only required if running the
delete objects storage batch operations job)
storage.objects.update (Only required if running the
update object metadata, update object customer-managed encryption
key, or update object hold storage batch operations job)
Create a manifest using Storage Insights datasets
You can create the manifest for your storage batch operations job by
extracting data from BigQuery. To do so, you'll need to query the
linked dataset, export the resulting data as a CSV file, and save it to a
Cloud Storage bucket. The storage batch operations job can
then use this CSV file as its manifest.
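A manifest with the columns selected by that query (bucket, name, generation) can also be produced with any CSV writer, which is useful for small, hand-picked object lists. This Python sketch uses hypothetical object names; treat the exact header and column requirements as an assumption to confirm against the manifest documentation.

```python
import csv
import io

# Hypothetical objects to include in the manifest; the column layout
# (bucket, name, generation) matches the SELECT clause of the
# BigQuery export query on this page.
rows = [
    {"bucket": "my-bucket", "name": "Temp_Training/file1.txt", "generation": "1700000000000000"},
    {"bucket": "my-bucket", "name": "Temp_Training/file2.txt", "generation": "1700000000000001"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["bucket", "name", "generation"])
writer.writeheader()
writer.writerows(rows)

manifest_csv = buf.getvalue()
print(manifest_csv)
```

The resulting file would then be uploaded to a Cloud Storage bucket and referenced with --manifest-location.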
Running the following SQL query in BigQuery on a Storage Insights dataset view retrieves objects larger than 1 MiB whose names begin with Temp_Training:
EXPORT DATA OPTIONS(
  uri='URI',
  format='CSV',
  overwrite=OVERWRITE_VALUE,
  field_delimiter=',') AS
SELECT bucket, name, generation
FROM DATASET_VIEW_NAME
WHERE bucket = 'BUCKET_NAME'
  AND name LIKE ('Temp_Training%')
  AND size > 1024 * 1024
  AND snapshotTime = 'SNAPSHOT_TIME'
Where:
URI is the URI to the bucket that contains the manifest. For example, gs://bucket_name/path_to_csv_file/*.csv. When you use the *.csv wildcard, BigQuery exports the result to multiple CSV files.
OVERWRITE_VALUE is a boolean value. If set to true, the export operation overwrites existing files at the specified location.
DATASET_VIEW_NAME is the fully qualified name of the Storage Insights dataset view in PROJECT_ID.DATASET_ID.VIEW_NAME format. To find the name of your dataset, view the linked dataset.
Where:
PROJECT_ID is the ID or number of the project. For example, my-project.
DATASET_ID is the name of the dataset. For example, objects-deletion-dataset.
VIEW_NAME is the name of the dataset view. For example, bucket_attributes_view.
BUCKET_NAME is the name of the bucket. For example, my-bucket.
SNAPSHOT_TIME is the snapshot time of the Storage Insights dataset view. For example, 2024-09-10T00:00:00Z.
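As a worked example, substituting the example values listed above into the query produces a complete, runnable statement. This Python sketch performs the substitution purely for illustration; all values come from the examples on this page.

```python
# Template of the EXPORT DATA query with named placeholders.
query_template = """EXPORT DATA OPTIONS(
  uri='{uri}',
  format='CSV',
  overwrite={overwrite},
  field_delimiter=',') AS
SELECT bucket, name, generation
FROM {dataset_view}
WHERE bucket = '{bucket}'
  AND name LIKE ('Temp_Training%')
  AND size > 1024 * 1024
  AND snapshotTime = '{snapshot_time}'"""

# Fill in the placeholders with the example values from this page.
query = query_template.format(
    uri="gs://bucket_name/path_to_csv_file/*.csv",
    overwrite="true",
    dataset_view="my-project.objects-deletion-dataset.bucket_attributes_view",
    bucket="my-bucket",
    snapshot_time="2024-09-10T00:00:00Z",
)
print(query)
```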
Create a storage batch operations job
To create a storage batch operations job to process objects contained in the manifest, complete the following steps:
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
gcloud storage batch-operations jobs create JOB_ID \
  --bucket=SOURCE_BUCKET_NAME \
  --manifest-location=URI \
  JOB_TYPE_FLAG
Where:
JOB_ID is the name of the storage batch operations job.
SOURCE_BUCKET_NAME is the bucket that contains one or more objects you want to process. For example, my-bucket.
URI is the URI to the bucket that contains the manifest. For example, gs://bucket_name/path_to_csv_file/*.csv. When you use the *.csv wildcard, BigQuery exports the result to multiple CSV files.
JOB_TYPE_FLAG is one of the following flags, depending on the job type:
--delete-object: Delete one or more objects.
--put-metadata: Update object metadata. Object metadata is stored as key-value pairs. Specify the key-value pair for the metadata you want to modify. You can specify one or more key-value pairs as a list.
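The pieces above combine into a single command line. The following Python sketch assembles one as an argument list using example values from this page; the KEY=VALUE,KEY=VALUE syntax shown for --put-metadata is an assumption to confirm with `gcloud storage batch-operations jobs create --help` before use.

```python
# Assemble the create command for a manifest-driven job as an argv list.
# Example values (my-bucket, the manifest URI) come from this page; the
# --put-metadata value format is a hypothetical KEY=VALUE pair list.
job_id = "my-job"
args = [
    "gcloud", "storage", "batch-operations", "jobs", "create", job_id,
    "--bucket=my-bucket",
    "--manifest-location=gs://bucket_name/path_to_csv_file/*.csv",
    "--put-metadata=Department=Finance,Status=Archived",
]

command = " ".join(args)
print(command)
```

Building the command as a list like this (rather than one string) is also how you would pass it safely to a process runner such as subprocess.run.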
You can provide an additional layer of security for
storage batch operations resources by using VPC Service Controls. When
you use VPC Service Controls, you add projects to service perimeters that protect
resources and services from requests that originate from outside of the
perimeter. To learn more about VPC Service Controls service perimeter details for
storage batch operations, see Supported products and limitations.
Use Cloud Audit Logs for storage batch operations jobs
Storage batch operations jobs record transformations on Cloud Storage objects in Cloud Audit Logs. You can use Cloud Audit Logs with Cloud Storage to track the object transformations that storage batch operations jobs perform. For information about enabling audit logs, see Enabling audit logs. In the audit log entry, the callUserAgent metadata field with the value StorageBatchOperations indicates a storage batch operations transformation.
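When scanning exported log entries, the callUserAgent field is what distinguishes batch-job activity from other object operations. A minimal Python sketch, assuming entries are available as dictionaries (the entry shape here is simplified and hypothetical; only the callUserAgent field and its StorageBatchOperations value come from this page):

```python
# Hypothetical, simplified audit-log entries; a real exported entry has
# many more fields under protoPayload.
entries = [
    {"methodName": "storage.objects.delete",
     "metadata": {"callUserAgent": "StorageBatchOperations"}},
    {"methodName": "storage.objects.get",
     "metadata": {"callUserAgent": "gsutil"}},
]

# Keep only the transformations performed by storage batch operations
# jobs, identified by the callUserAgent metadata value.
batch_ops_entries = [
    e for e in entries
    if e.get("metadata", {}).get("callUserAgent") == "StorageBatchOperations"
]

for entry in batch_ops_entries:
    print(entry["methodName"])
```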