Create and manage batch operation jobs
This page describes how to create, view, list, cancel, and delete storage batch operations jobs. It also describes how to use Cloud Audit Logs with storage batch operations jobs.
Before you begin
To create and manage storage batch operations jobs, complete the steps in the following sections.
Configure Storage Intelligence
To create and manage storage batch operations jobs, configure Storage Intelligence on the bucket where you want to run the job.
Create a storage batch operations job
This section describes how to create a storage batch operations job.
To get the permissions that you need to create a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to create a storage batch operations job. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to create a storage batch operations job:
Create a storage batch operations job: storagebatchoperations.jobs.create
Run the delete objects storage batch operations job: storage.objects.delete
Run the update object metadata, update object customer-managed encryption key, update object context, or update object hold storage batch operations job: storage.objects.update
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
(Optional) Run a dry run job. Before executing any job, we recommend that you run the job in dry run mode to verify the object selection criteria and check for any errors. The dry run does not modify any objects.
gcloud storage batch-operations jobs create JOB_NAME \
    --bucket=BUCKET_NAME \
    OBJECT_SELECTION_FLAG \
    JOB_TYPE_FLAG
Where the parameters are as follows:
DRY_RUN_JOB_NAME is the name of the storage batch operations dry run job.
JOB_NAME is the name of the storage batch operations job.
BUCKET_NAME is the name of the bucket that contains one or more objects you want to process.
OBJECT_SELECTION_FLAG is one of the following flags that you need to specify:
--included-object-prefixes: Specify one or more object prefixes. For example:
To match a single prefix, use: --included-object-prefixes='prefix1'.
To match multiple prefixes, use a comma-separated prefix list: --included-object-prefixes='prefix1,prefix2'.
To include all objects, use an empty prefix: --included-object-prefixes=''.
--manifest-location: Specify the manifest location. For example, gs://bucket_name/path/object_name.csv.
JOB_TYPE_FLAG is one of the following flags that you need to specify, depending on the job type.
--delete-object: Delete one or more objects.
If Object Versioning is enabled for the bucket, current objects transition to a noncurrent state, and noncurrent objects are skipped.
If Object Versioning is disabled for the bucket, the delete operation permanently deletes objects and skips noncurrent objects.
--enable-permanent-object-deletion: Permanently delete objects. Use this flag along with the --delete-object flag to permanently delete both live and noncurrent objects in a bucket, regardless of the bucket's object versioning configuration.
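As an illustration, a delete job that targets objects under a single prefix can be composed from the flags described above. The job name, bucket name, and prefix here are placeholders:

```shell
# Delete all objects whose names begin with "logs/2024-" in my-bucket.
# If the bucket uses Object Versioning, current objects become noncurrent
# instead of being permanently erased.
gcloud storage batch-operations jobs create my-delete-job \
    --bucket=my-bucket \
    --included-object-prefixes='logs/2024-' \
    --delete-object
```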
--put-metadata: Update object metadata. Specify the key-value pair for the object metadata you want to modify. You can specify one or more key-value pairs as a list. You can also set object retention configurations using the --put-metadata flag. To do so, specify the retention parameters using the Retain-Until and Retention-Mode fields, where:
RETAIN_UNTIL_TIME is the date and time, in RFC 3339 format, until which the object is retained. For example, 2025-10-09T10:30:00Z. To set the retention configuration on an object, you'll need to enable retention on the bucket that contains the object.
RETENTION_MODE is the retention mode, either Unlocked or Locked.
When you send a request to update the RETENTION_MODE and RETAIN_UNTIL_TIME fields, consider the following:
To update the object retention configuration, you must provide non-empty values for both the RETENTION_MODE and RETAIN_UNTIL_TIME fields; setting only one results in an INVALID_ARGUMENT error.
You can extend the RETAIN_UNTIL_TIME value for objects in both Unlocked and Locked modes.
The object retention must be in Unlocked mode if you want to do the following:
Reduce the RETAIN_UNTIL_TIME value.
Remove the retention configuration. To remove the configuration, you'll need to provide empty values for both the RETENTION_MODE and RETAIN_UNTIL_TIME fields.
If you omit both the RETENTION_MODE and RETAIN_UNTIL_TIME fields, the retention configuration remains unchanged.
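A sketch of a retention-setting job follows. The exact value syntax for --put-metadata (comma-separated KEY=VALUE pairs) is an assumption; verify it with gcloud storage batch-operations jobs create --help before running:

```shell
# Sketch: set an Unlocked retention configuration on objects under a prefix.
# Assumes --put-metadata accepts comma-separated KEY=VALUE pairs.
gcloud storage batch-operations jobs create my-retention-job \
    --bucket=my-bucket \
    --included-object-prefixes='contracts/' \
    --put-metadata=Retain-Until=2025-10-09T10:30:00Z,Retention-Mode=Unlocked
```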
--clear-all-object-custom-contexts: Delete all existing object contexts.
The following example shows how to create a job to clear all object contexts for objects listed in manifest.csv:
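A sketch of such a job, composed from the flags described on this page; the job name, bucket, and manifest path are placeholders:

```shell
# Clear every custom context on the objects listed in the manifest.
gcloud storage batch-operations jobs create my-clear-contexts-job \
    --bucket=my-bucket \
    --manifest-location=gs://my-bucket/manifest.csv \
    --clear-all-object-custom-contexts
```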
--clear-object-custom-contexts: Remove contexts with specific keys. You can also update specific contexts along with removing keys by using both the --clear-object-custom-contexts flag and one of the following flags:
--update-object-custom-contexts: Provide a map of key-value pairs.
The following example shows how to create a job to remove the context with key temp-id and update or insert contexts with keys project-id and cost-center for all objects listed in manifest.csv:
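A sketch of that job follows. The value syntax shown (a key list for the clear flag, KEY=VALUE pairs for the update flag) is an assumption; check gcloud storage batch-operations jobs create --help for the exact format:

```shell
# Remove the temp-id context and upsert project-id and cost-center contexts
# for every object listed in the manifest. Names and values are placeholders.
gcloud storage batch-operations jobs create my-update-contexts-job \
    --bucket=my-bucket \
    --manifest-location=gs://my-bucket/manifest.csv \
    --clear-object-custom-contexts=temp-id \
    --update-object-custom-contexts=project-id=my-project,cost-center=finance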
use Google\Cloud\StorageBatchOperations\V1\Client\StorageBatchOperationsClient;
use Google\Cloud\StorageBatchOperations\V1\CreateJobRequest;
use Google\Cloud\StorageBatchOperations\V1\Job;
use Google\Cloud\StorageBatchOperations\V1\BucketList;
use Google\Cloud\StorageBatchOperations\V1\BucketList\Bucket;
use Google\Cloud\StorageBatchOperations\V1\PrefixList;
use Google\Cloud\StorageBatchOperations\V1\DeleteObject;

/**
 * Create a new batch job.
 *
 * @param string $projectId Your Google Cloud project ID.
 *        (e.g. 'my-project-id')
 * @param string $jobId A unique identifier for this job.
 *        (e.g. '94d60cc1-2d95-41c5-b6e3-ff66cd3532d5')
 * @param string $bucketName The name of your Cloud Storage bucket to operate on.
 *        (e.g. 'my-bucket')
 * @param string $objectPrefix The prefix of objects to include in the operation.
 *        (e.g. 'prefix1')
 */
function create_job(string $projectId, string $jobId, string $bucketName, string $objectPrefix): void
{
    // Create a client.
    $storageBatchOperationsClient = new StorageBatchOperationsClient();

    $parent = $storageBatchOperationsClient->locationName($projectId, 'global');
    $prefixListConfig = new PrefixList(['included_object_prefixes' => [$objectPrefix]]);
    $bucket = new Bucket(['bucket' => $bucketName, 'prefix_list' => $prefixListConfig]);
    $bucketList = new BucketList(['buckets' => [$bucket]]);
    $deleteObject = new DeleteObject(['permanent_object_deletion_enabled' => false]);
    $job = new Job(['bucket_list' => $bucketList, 'delete_object' => $deleteObject]);
    $request = new CreateJobRequest([
        'parent' => $parent,
        'job_id' => $jobId,
        'job' => $job,
    ]);

    $response = $storageBatchOperationsClient->createJob($request);
    printf('Created job: %s', $response->getName());
}
METADATA_KEY/VALUE is the object's metadata key-value pair. You can specify one or more pairs.
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_NAME is the name of the storage batch operations job.
Get storage batch operations job details
This section describes how to get the storage batch operations job details.
To get the permissions that you need to view a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to view a storage batch operations job. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to view a storage batch operations job:
View a storage batch operations job: storagebatchoperations.jobs.get, storagebatchoperations.operations.get
List storage batch operations jobs
This section describes how to list the storage batch operations jobs within a project.
To get the permissions that you need to list storage batch operations jobs, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to list storage batch operations jobs. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to list storage batch operations jobs:
List storage batch operations jobs: storagebatchoperations.jobs.list, storagebatchoperations.operations.list
use Google\Cloud\StorageBatchOperations\V1\Client\StorageBatchOperationsClient;
use Google\Cloud\StorageBatchOperations\V1\ListJobsRequest;

/**
 * List Jobs in a given project.
 *
 * @param string $projectId Your Google Cloud project ID.
 *        (e.g. 'my-project-id')
 */
function list_jobs(string $projectId): void
{
    // Create a client.
    $storageBatchOperationsClient = new StorageBatchOperationsClient();

    $parent = $storageBatchOperationsClient->locationName($projectId, 'global');
    $request = new ListJobsRequest([
        'parent' => $parent,
    ]);

    $jobs = $storageBatchOperationsClient->listJobs($request);
    foreach ($jobs as $job) {
        printf('Job name: %s' . PHP_EOL, $job->getName());
    }
}
PROJECT_ID is the ID or number of the project. For example, my-project.
Cancel a storage batch operations job
This section describes how to cancel a storage batch operations job within a project.
To get the permissions that you need to cancel a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to cancel a storage batch operations job. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to cancel a storage batch operations job:
Cancel a storage batch operations job: storagebatchoperations.jobs.cancel, storagebatchoperations.operations.cancel
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
Delete a storage batch operations job
This section describes how to delete a storage batch operations job.
To get the permissions that you need to delete a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to delete a storage batch operations job. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to delete a storage batch operations job:
Delete a storage batch operations job: storagebatchoperations.jobs.delete
PROJECT_ID is the ID or number of the project. For example, my-project.
JOB_ID is the name of the storage batch operations job.
Create a storage batch operations job using Storage Insights datasets
To create a storage batch operations job using Storage Insights
datasets, complete the steps in the following sections.
To get the permissions that you need to create a storage batch operations job, ask your administrator to grant you the Storage Admin (roles/storage.admin) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to create a storage batch operations job. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to create a storage batch operations job:
Create a storage batch operations job: storagebatchoperations.jobs.create
Run the delete objects storage batch operations job: storage.objects.delete
Run the update object metadata, update object customer-managed encryption key, update object context, or update object hold storage batch operations job: storage.objects.update
You can create the manifest for your storage batch operations job by extracting data from BigQuery. To do so, you'll need to query the linked dataset, export the resulting data as a CSV file, and save it to a Cloud Storage bucket. The storage batch operations job can then use this CSV file as its manifest.
Running the following SQL query in BigQuery on a Storage Insights dataset view retrieves objects larger than 1 MiB whose names begin with Temp_Training:
EXPORT DATA OPTIONS(
  uri='URI',
  format='CSV',
  overwrite=OVERWRITE_VALUE,
  field_delimiter=',') AS
SELECT bucket, name, generation
FROM DATASET_VIEW_NAME
WHERE bucket = 'BUCKET_NAME'
  AND name LIKE ('Temp_Training%')
  AND size > 1024 * 1024
  AND snapshotTime = 'SNAPSHOT_TIME'
Where:
URI is the URI to the bucket that contains the manifest. For example, gs://bucket_name/path_to_csv_file/*.csv. When you use the *.csv wildcard, BigQuery exports the result to multiple CSV files.
OVERWRITE_VALUE is a boolean value. If set to true, the export operation overwrites existing files at the specified location.
DATASET_VIEW_NAME is the fully qualified name of the Storage Insights dataset view in PROJECT_ID.DATASET_ID.VIEW_NAME format. To find the name of your dataset, view the linked dataset.
Where:
PROJECT_ID is the ID or number of the project. For example, my-project.
DATASET_ID is the name of the dataset. For example, objects-deletion-dataset.
VIEW_NAME is the name of the dataset view. For example, bucket_attributes_view.
BUCKET_NAME is the name of the bucket. For example, my-bucket.
SNAPSHOT_TIME is the snapshot time of the Storage Insights dataset view. For example, 2024-09-10T00:00:00Z.
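One way to run the export from the command line is the bq CLI. The following is a sketch with placeholder project, dataset, and bucket names; substitute your own values:

```shell
# Runs the EXPORT DATA statement with standard SQL. The resulting CSV shards
# land at the gs:// location in uri and can serve as the job manifest.
bq query --nouse_legacy_sql \
'EXPORT DATA OPTIONS(
   uri="gs://my-bucket/manifest/*.csv",
   format="CSV",
   overwrite=true,
   field_delimiter=",") AS
 SELECT bucket, name, generation
 FROM my_project.my_dataset.bucket_attributes_view
 WHERE bucket = "my-bucket"
   AND name LIKE ("Temp_Training%")
   AND size > 1024 * 1024
   AND snapshotTime = "2024-09-10T00:00:00Z"'
```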
Create a storage batch operations job
To create a storage batch operations job to process objects contained in the manifest, complete the following steps:
Command line
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
gcloud storage batch-operations jobs create JOB_ID \
    --bucket=SOURCE_BUCKET_NAME \
    --manifest-location=URI \
    JOB_TYPE_FLAG
Where:
JOB_ID is the name of the storage batch operations job.
SOURCE_BUCKET_NAME is the bucket that contains one or more objects you want to process. For example, my-bucket.
URI is the URI to the bucket that contains the manifest. For example, gs://bucket_name/path_to_csv_file/*.csv. When you use the *.csv wildcard, BigQuery exports the result to multiple CSV files.
JOB_TYPE_FLAG is one of the following flags, depending on the job type.
--delete-object: Delete one or more objects.
--put-metadata: Update object metadata. Object metadata is stored as key-value pairs. Specify the key-value pair for the metadata you want to modify. You can specify one or more key-value pairs as a list. You can also provide object retention configurations using the --put-metadata flag.
--clear-object-custom-contexts: Remove contexts with specific keys. You can also update specific contexts along with removing keys by using both the --clear-object-custom-contexts flag and one of the following flags:
--update-object-custom-contexts: Provide a map of key-value pairs.
The following example shows how to create a job to remove the context with key temp-id and update or insert contexts with keys project-id and cost-center for all objects listed in manifest.csv:
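A sketch of that manifest-driven job follows. The value syntax shown (a key list for the clear flag, KEY=VALUE pairs for the update flag) is an assumption; check gcloud storage batch-operations jobs create --help for the exact format:

```shell
# Remove the temp-id context and upsert project-id and cost-center contexts
# for the objects listed in the exported manifest. Names are placeholders.
gcloud storage batch-operations jobs create my-contexts-job \
    --bucket=my-bucket \
    --manifest-location=gs://my-bucket/path_to_csv_file/manifest.csv \
    --clear-object-custom-contexts=temp-id \
    --update-object-custom-contexts=project-id=my-project,cost-center=finance
```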
You can provide an additional layer of security for storage batch operations resources by using VPC Service Controls. When you use VPC Service Controls, you add projects to service perimeters that protect resources and services from requests that originate from outside of the perimeter. To learn more about VPC Service Controls service perimeter details for storage batch operations, see Supported products and limitations.
Use Cloud Audit Logs for storage batch operations jobs
Storage batch operations jobs record transformations on Cloud Storage objects in Cloud Audit Logs for Cloud Storage. You can use Cloud Audit Logs with Cloud Storage to track the object transformations that storage batch operations jobs perform. For information about enabling audit logs, see Enabling audit logs. In the audit log entry, the callUserAgent metadata field with the value StorageBatchOperations indicates a storage batch operations transformation.
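In the Logs Explorer, a filter along the following lines can isolate those entries. The exact field path is an assumption inferred from the metadata field named above; confirm it against a sample log entry in your project:

```
protoPayload.metadata.callUserAgent="StorageBatchOperations"
```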