Control access to resources with IAM
This document describes how to view, grant, and revoke access controls for
BigQuery datasets and for the resources within datasets:
tables, views, and routines. Although models are also dataset-level resources,
you cannot grant access to individual models using IAM roles.
You can grant access to Google Cloud resources by using allow policies, also known as Identity and Access Management (IAM) policies, which are attached to resources. You can attach only one allow policy to each resource. The allow policy controls access to the resource itself, as well as any descendants of that resource that inherit the allow policy.
For more information on allow policies, see Policy structure in the IAM documentation.
Data Catalog doesn't support routine access controls. If a user has been granted conditional routine-level access, they won't see their routines in the BigQuery side panel. As a workaround, grant dataset-level access instead.
This predefined role contains the permissions required to modify IAM policies for resources. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to modify IAM policies for resources:
To get a dataset's access policy: bigquery.datasets.get
To set a dataset's access policy: bigquery.datasets.update
To get a dataset's access policy (Google Cloud console only): bigquery.datasets.getIamPolicy
To set a dataset's access policy (Google Cloud console only): bigquery.datasets.setIamPolicy
To get a table's or view's access policy: bigquery.tables.getIamPolicy
To set a table's or view's access policy: bigquery.tables.setIamPolicy
To get a routine's access policy: bigquery.routines.getIamPolicy
To set a routine's access policy: bigquery.routines.setIamPolicy
To create bq command-line tool or SQL BigQuery jobs (optional): bigquery.jobs.create
You can provide access to a dataset by granting an IAM principal a predefined or custom role that determines what the principal can do with the dataset. This is also known as attaching an allow policy to a resource. After granting access, you can view the dataset's access controls, and you can revoke access to the dataset.
Grant access to a dataset
You can't grant access to a dataset when you create it using the BigQuery web UI or the bq command-line tool. You must create the dataset first and then grant access to it. The API lets you grant access during dataset creation by calling the datasets.insert method with a defined dataset resource.
A project is the parent resource for a dataset, and a dataset is the parent
resource for tables and views, routines, and models. When you grant a role at
the project level, the role and its permissions are inherited by the dataset and
by the dataset's resources. Similarly, when you grant a role at the dataset
level, the role and its permissions are inherited by the resources within the
dataset.
You can provide access to a dataset by granting an IAM principal a role with permission to access the dataset, or by granting access conditionally using an IAM condition. For more information on granting conditional access, see Control access with IAM Conditions.
To grant roles on a dataset without using conditions, select one of the following options:
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
To write the existing dataset information (including access controls) to a JSON file, use the bq show command:
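bq show --format=prettyjson PROJECT_ID:DATASET > PATH_TO_FILE

Replace the following:

PROJECT_ID: your project ID

DATASET: the name of the dataset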
PATH_TO_FILE: the path to the JSON file on
your local machine
Make changes to the access section of the JSON file. You can add to any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can also add any of the following: userByEmail, groupByEmail, and domain.
For example, the access section of a dataset's JSON file would look like the following:
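A representative sketch follows; the entries and email addresses are illustrative:

"access": [
  {
    "role": "OWNER",
    "specialGroup": "projectOwners"
  },
  {
    "role": "WRITER",
    "specialGroup": "projectWriters"
  },
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  },
  {
    "role": "READER",
    "groupByEmail": "example-team@example.com"
  },
  {
    "role": "READER",
    "userByEmail": "user@example.com"
  }
]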
When your edits are complete, use the bq update command and include the JSON file using the --source flag. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: PROJECT_ID:DATASET.
bq update --source PATH_TO_FILE PROJECT_ID:DATASET
To verify your access control changes, use the bq show command again without writing the information to a file:
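bq show --format=prettyjson PROJECT_ID:DATASET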
The following example shows how to use the google_bigquery_dataset_iam_policy resource to set the IAM policy for the mydataset dataset. This replaces any existing policy already attached to the dataset:
# This file sets the IAM policy for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.admin"
    members = [
      "user:user@example.com",
    ]
  }
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "group:data.admin@example.com",
    ]
  }
  binding {
    role = "roles/bigquery.dataEditor"
    members = [
      "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com",
    ]
  }
}

resource "google_bigquery_dataset_iam_policy" "dataset_iam_policy" {
  dataset_id  = google_bigquery_dataset.default.dataset_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}
Set role membership for a dataset
The following example shows how to use the google_bigquery_dataset_iam_binding resource to set membership in a given role for the mydataset dataset. This replaces any existing membership in that role. Other roles within the IAM policy for the dataset are preserved:
# This file sets membership in an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_binding" "dataset_iam_binding" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.jobUser"
  members = [
    "user:user@example.com",
    "group:group@example.com"
  ]
}
Set role membership for a single principal
The following example shows how to use the google_bigquery_dataset_iam_member resource to update the IAM policy for the mydataset dataset to grant a role to one principal. Updating this IAM policy does not affect access for any other principals that have been granted that role for the dataset.
# This file adds a member to an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_member" "dataset_iam_member" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.user"
  member     = "user:user@example.com"
}
To apply your Terraform configuration in a Google Cloud project, complete the steps in the
following sections.
Set the default Google Cloud project
where you want to apply your Terraform configurations.
You only need to run this command once per project, and you can run it in any directory.
export GOOGLE_CLOUD_PROJECT=PROJECT_ID
Environment variables are overridden if you set explicit values in the Terraform
configuration file.
Prepare the directory
Each Terraform configuration file must have its own directory (also called a root module).
In Cloud Shell, create a directory and a new file within that directory. The filename must have the .tf extension, for example main.tf. In this tutorial, the file is referred to as main.tf.
mkdir DIRECTORY && cd DIRECTORY && touch main.tf
If you are following a tutorial, you can copy the sample code in each section or step.
Copy the sample code into the newly created main.tf.
Optionally, copy the code from GitHub. This is recommended
when the Terraform snippet is part of an end-to-end solution.
Review and modify the sample parameters to apply to your environment.
Save your changes.
Initialize Terraform. You only need to do this once per directory.
terraform init
Optionally, to use the latest Google provider version, include the -upgrade option:
terraform init -upgrade
Apply the changes
Review the configuration and verify that the resources that Terraform is going to create or
update match your expectations:
terraform plan
Make corrections to the configuration as necessary.
Apply the Terraform configuration by running the following command and entering yes at the prompt:
terraform apply
Wait until Terraform displays the "Apply complete!" message.
Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// grantAccessToDataset creates a new ACL entry granting the READER role to the group "example-analyst-group@google.com".
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func grantAccessToDataset(w io.Writer, projectID, datasetID string) error {
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "mydataset"

	ctx := context.Background()

	// Create BigQuery client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Create dataset handler.
	dataset := client.Dataset(datasetID)

	// Get metadata.
	meta, err := dataset.Metadata(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Metadata: %w", err)
	}

	// Find more details about BigQuery Entity Types here:
	// https://pkg.go.dev/cloud.google.com/go/bigquery#EntityType
	//
	// Find more details about BigQuery Access Roles here:
	// https://pkg.go.dev/cloud.google.com/go/bigquery#AccessRole
	entityType := bigquery.GroupEmailEntity
	entityID := "example-analyst-group@google.com"
	roleType := bigquery.ReaderRole

	// Append a new access control entry to the existing access list.
	update := bigquery.DatasetMetadataToUpdate{
		Access: append(meta.Access, &bigquery.AccessEntry{
			Role:       roleType,
			EntityType: entityType,
			Entity:     entityID,
		}),
	}

	// Leverage the ETag for the update to assert there have been no
	// modifications to the dataset since the metadata was originally read.
	meta, err = dataset.Update(ctx, update, meta.ETag)
	if err != nil {
		return err
	}

	fmt.Fprintf(w, "Details for Access entries in dataset %v.\n", datasetID)
	for _, access := range meta.Access {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", access.Role)
		fmt.Fprintf(w, "Entities: %v\n", access.Entity)
	}

	return nil
}
import com.google.cloud.bigquery.Acl;
import com.google.cloud.bigquery.Acl.Entity;
import com.google.cloud.bigquery.Acl.Group;
import com.google.cloud.bigquery.Acl.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import com.google.cloud.bigquery.DatasetId;
import java.util.ArrayList;
import java.util.List;

public class GrantAccessToDataset {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project and dataset from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    // Group to add to the ACL.
    String entityEmail = "group-to-add@example.com";
    grantAccessToDataset(projectId, datasetName, entityEmail);
  }

  public static void grantAccessToDataset(String projectId, String datasetName, String entityEmail) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be
      // created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create datasetId with the projectId and the datasetName.
      DatasetId datasetId = DatasetId.of(projectId, datasetName);
      Dataset dataset = bigquery.getDataset(datasetId);

      // Create a new Entity with the corresponding type and email
      // "user-or-group-to-add@example.com".
      // For more information on the types of Entities available see:
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl.Entity
      // and
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl.Entity.Type
      Entity entity = new Group(entityEmail);

      // Create a new ACL granting the READER role to the group.
      // For more information on the types of ACLs available see:
      // https://cloud.google.com/storage/docs/access-control/lists
      Acl newEntry = Acl.of(entity, Role.READER);

      // Get a copy of the ACLs list from the dataset and append the new entry.
      List<Acl> acls = new ArrayList<>(dataset.getAcl());
      acls.add(newEntry);

      // Update the ACLs by setting the new list.
      Dataset updatedDataset = bigquery.update(dataset.toBuilder().setAcl(acls).build());
      System.out.println(
          "ACLs of dataset \"" + updatedDataset.getDatasetId().getDataset() + "\" updated successfully");
    } catch (BigQueryException e) {
      System.out.println("ACLs were not updated \n" + e.toString());
    }
  }
}
Set the new access list by appending the new entry to the existing list using the Dataset#metadata method. Then call the Dataset#setMetadata() function to update the property.
/**
 * TODO(developer): Update and un-comment below lines.
 */
// const datasetId = "my_project_id.my_dataset_name";
// ID of the user or group from whom you are adding access.
// const entityId = "user-or-group-to-add@example.com";
// One of the "Basic roles for datasets" described here:
// https://cloud.google.com/bigquery/docs/access-control-basic-roles#dataset-basic-roles
// const role = "READER";

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const client = new BigQuery();

// Type of entity you are granting access to.
// Find allowed entity type names here:
// https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#resource:-dataset
const entityType = 'groupByEmail';

async function grantAccessToDataset() {
  const [dataset] = await client.dataset(datasetId).get();

  // The 'access entries' array is immutable. Create a copy for modifications.
  const entries = [...dataset.metadata.access];

  // Append an AccessEntry to grant the role to a dataset.
  // Find more details about the AccessEntry object in the BigQuery documentation:
  // https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.dataset.AccessEntry
  entries.push({
    role,
    [entityType]: entityId,
  });

  // Assign the array of AccessEntries back to the dataset.
  const metadata = {
    access: entries,
  };

  // Update will only succeed if the dataset
  // has not been modified externally since retrieval.
  //
  // See the BigQuery client library documentation for more details on metadata updates:
  // https://cloud.google.com/nodejs/docs/reference/bigquery/latest

  // Update just the 'access entries' property of the dataset.
  await client.dataset(datasetId).setMetadata(metadata);

  console.log(`Role '${role}' granted for entity '${entityId}' in '${datasetId}'.`);
}
from google.api_core.exceptions import PreconditionFailed
from google.cloud import bigquery
from google.cloud.bigquery.enums import EntityTypes

# TODO(developer): Update and uncomment the lines below.

# ID of the dataset to grant access to.
# dataset_id = "my_project_id.my_dataset"

# ID of the user or group receiving access to the dataset.
# Alternatively, the JSON REST API representation of the entity,
# such as the view's table reference.
# entity_id = "user-or-group-to-add@example.com"

# One of the "Basic roles for datasets" described here:
# https://cloud.google.com/bigquery/docs/access-control-basic-roles#dataset-basic-roles
# role = "READER"

# Type of entity you are granting access to.
# Find allowed entity type names here:
# https://cloud.google.com/python/docs/reference/bigquery/latest/enums#class-googlecloudbigqueryenumsentitytypesvalue
entity_type = EntityTypes.GROUP_BY_EMAIL

# Instantiate a client.
client = bigquery.Client()

# Get a reference to the dataset.
dataset = client.get_dataset(dataset_id)

# The `access_entries` list is immutable. Create a copy for modifications.
entries = list(dataset.access_entries)

# Append an AccessEntry to grant the role to a dataset.
# Find more details about the AccessEntry object here:
# https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.dataset.AccessEntry
entries.append(
    bigquery.AccessEntry(
        role=role,
        entity_type=entity_type,
        entity_id=entity_id,
    )
)

# Assign the list of AccessEntries back to the dataset.
dataset.access_entries = entries

# Update will only succeed if the dataset
# has not been modified externally since retrieval.
#
# See the BigQuery client library documentation for more details on `update_dataset`:
# https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.client.Client#google_cloud_bigquery_client_Client_update_dataset
try:
    # Update just the `access_entries` property of the dataset.
    dataset = client.update_dataset(dataset, ["access_entries"])

    # Show a success message.
    full_dataset_id = f"{dataset.project}.{dataset.dataset_id}"
    print(f"Role '{role}' granted for entity '{entity_id}' in dataset '{full_dataset_id}'.")
except PreconditionFailed:
    # A read-modify-write error.
    print(
        f"Dataset '{dataset.dataset_id}' was modified remotely before this update. "
        "Fetch the latest version and retry."
    )
Predefined roles that grant access to datasets
You can grant the following predefined IAM roles on a dataset.
When granted on a dataset, this role grants these permissions:
Get metadata and access controls for the dataset.
Get metadata and access controls for tables and views.
Get metadata from the dataset's models and routines.
List tables, views, models, and routines in the dataset.
Dataset permissions
Most permissions that begin with bigquery.datasets apply at the dataset level. bigquery.datasets.create doesn't. To create datasets, the bigquery.datasets.create permission must be granted to a role on the parent container: the project.
The following table lists all permissions for datasets and the lowest-level
resource the permission can be applied to.
Permission | Resource | Action
bigquery.datasets.create | Project | Create new datasets in the project.
bigquery.datasets.get | Dataset | Get metadata and access controls for the dataset. Viewing permissions in the console also requires the bigquery.datasets.getIamPolicy permission.
bigquery.datasets.getIamPolicy | Dataset | Required by the console to grant the user permission to get a dataset's access controls. Fails open. The console also requires the bigquery.datasets.get permission to view the dataset.
bigquery.datasets.update | Dataset | Update metadata and access controls for the dataset. Updating access controls in the console also requires the bigquery.datasets.setIamPolicy permission.
bigquery.datasets.setIamPolicy | Dataset | Required by the console to grant the user permission to set a dataset's access controls. Fails open. The console also requires the bigquery.datasets.update permission to update the dataset.
bigquery.datasets.delete | Dataset | Delete a dataset.
bigquery.datasets.createTagBinding | Dataset | Attach tags to the dataset.
bigquery.datasets.deleteTagBinding | Dataset | Detach tags from the dataset.
bigquery.datasets.listTagBindings | Dataset | List tags for the dataset.
bigquery.datasets.listEffectiveTags | Dataset | List effective tags (applied and inherited) for the dataset.

One additional permission lets you list shared dataset usage statistics for datasets that you have access to in the project. This permission is required to query the INFORMATION_SCHEMA.SHARED_DATASET_USAGE view.
View access controls for a dataset
You can view the explicitly set access controls for a dataset by choosing one of
the following options. Toview inherited roles, for a
dataset, use the BigQuery web UI.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
To get an existing policy and output it to a local file in JSON, use the bq show command in Cloud Shell:
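bq show --format=prettyjson PROJECT_ID:DATASET > PATH_TO_FILE

Replace the following:

PROJECT_ID: your project ID

DATASET: the name of the dataset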
PATH_TO_FILE: the path to the JSON file on your
local machine
SQL
Preview
This product or feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA products and features are available "as is" and might have limited support. For more information, see the launch stage descriptions.
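In SQL, you can inspect dataset-level grants by querying the INFORMATION_SCHEMA.OBJECT_PRIVILEGES view. The following is a minimal sketch, assuming the dataset lives in the US multi-region and that the view is available to you; DATASET is a placeholder for the dataset name:

SELECT *
FROM `region-us`.INFORMATION_SCHEMA.OBJECT_PRIVILEGES
WHERE object_name = 'DATASET';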
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// viewDatasetAccessPolicies retrieves the ACL for the given dataset.
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func viewDatasetAccessPolicies(w io.Writer, projectID, datasetID string) error {
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "mydataset"

	ctx := context.Background()

	// Create new client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Get dataset's metadata.
	meta, err := client.Dataset(datasetID).Metadata(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Client.Dataset.Metadata: %w", err)
	}

	fmt.Fprintf(w, "Details for Access entries in dataset %v.\n", datasetID)

	// Iterate over access permissions.
	for _, access := range meta.Access {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", access.Role)
		fmt.Fprintf(w, "Entity: %v\n", access.Entity)
	}

	return nil
}
import com.google.cloud.bigquery.Acl;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import com.google.cloud.bigquery.DatasetId;
import java.util.List;

public class GetDatasetAccessPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project and dataset from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    getDatasetAccessPolicy(projectId, datasetName);
  }

  public static void getDatasetAccessPolicy(String projectId, String datasetName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be
      // created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create datasetId with the projectId and the datasetName.
      DatasetId datasetId = DatasetId.of(projectId, datasetName);
      Dataset dataset = bigquery.getDataset(datasetId);

      // Show ACL details.
      // Find more information about ACL and the Acl Class here:
      // https://cloud.google.com/storage/docs/access-control/lists
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl
      List<Acl> acls = dataset.getAcl();
      System.out.println("ACLs in dataset \"" + dataset.getDatasetId().getDataset() + "\":");
      System.out.println(acls.toString());
      for (Acl acl : acls) {
        System.out.println();
        System.out.println("Role: " + acl.getRole());
        System.out.println("Entity: " + acl.getEntity());
      }
    } catch (BigQueryException e) {
      System.out.println("ACLs info not retrieved. \n" + e.toString());
    }
  }
}
Retrieve the dataset metadata using the Dataset#getMetadata() function. The access policy is available in the access property of the resulting metadata object.
/**
 * TODO(developer): Update and un-comment below lines.
 */
// const datasetId = "my_project_id.my_dataset";

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const bigquery = new BigQuery();

async function viewDatasetAccessPolicy() {
  const dataset = bigquery.dataset(datasetId);
  const [metadata] = await dataset.getMetadata();
  const accessEntries = metadata.access || [];

  // Show the list of AccessEntry objects.
  // More details about the AccessEntry object in the BigQuery documentation:
  // https://cloud.google.com/nodejs/docs/reference/bigquery/latest
  console.log(`${accessEntries.length} Access entries in dataset '${datasetId}':`);
  for (const accessEntry of accessEntries) {
    console.log(`Role: ${accessEntry.role || 'null'}`);
    console.log(`Special group: ${accessEntry.specialGroup || 'null'}`);
    console.log(`User by Email: ${accessEntry.userByEmail || 'null'}`);
  }
}
from google.cloud import bigquery

# Instantiate a client.
client = bigquery.Client()

# TODO(developer): Update and uncomment the lines below.
# Dataset from which to get the access policy.
# dataset_id = "my_dataset"

# Get a reference to the dataset.
dataset = client.get_dataset(dataset_id)

# Show the list of AccessEntry objects.
# More details about the AccessEntry object here:
# https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.dataset.AccessEntry
print(f"{len(dataset.access_entries)} Access entries found in dataset '{dataset_id}':")
for access_entry in dataset.access_entries:
    print()
    print(f"Role: {access_entry.role}")
    print(f"Special group: {access_entry.special_group}")
    print(f"User by Email: {access_entry.user_by_email}")
Revoke access to a dataset
To revoke access to a dataset, select one of the following options:
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
To write the existing dataset information (including access controls) to a JSON file, use the bq show command:
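bq show --format=prettyjson PROJECT_ID:DATASET > PATH_TO_FILE

Replace the following:

PROJECT_ID: your project ID

DATASET: the name of the dataset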
PATH_TO_FILE: the path to the JSON file on
your local machine
Make changes to the access section of the JSON file. You can remove any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can also remove any of the following: userByEmail, groupByEmail, and domain.
For example, the access section of a dataset's JSON file would look like the following:
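A sketch of the section after removing a user's entry; the remaining entries are illustrative:

"access": [
  {
    "role": "OWNER",
    "specialGroup": "projectOwners"
  },
  {
    "role": "WRITER",
    "specialGroup": "projectWriters"
  },
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  }
]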
When your edits are complete, use the bq update command and include the JSON file using the --source flag. If the dataset is in a project other than your default project, add the project ID to the dataset name in the following format: PROJECT_ID:DATASET.
bq update --source PATH_TO_FILE PROJECT_ID:DATASET
To verify your access control changes, use the bq show command without writing the information to a file:
bq show --format=prettyjson PROJECT_ID:DATASET
API
Call the datasets.patch method and use the access property in the Dataset resource to update your access controls.
Because the datasets.update method replaces the entire dataset resource, datasets.patch is the preferred method for updating access controls.
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// revokeAccessToDataset creates a new ACL removing dataset access from the given entity.
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func revokeAccessToDataset(w io.Writer, projectID, datasetID, entity string) error {
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// entity := "user@mydomain.com"

	ctx := context.Background()

	// Create BigQuery client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Get dataset handler.
	dataset := client.Dataset(datasetID)

	// Get dataset metadata.
	meta, err := dataset.Metadata(ctx)
	if err != nil {
		return err
	}

	// Create a new access entry list by copying the existing list and omitting
	// the entry whose entity matches the given value.
	var newAccessList []*bigquery.AccessEntry
	for _, entry := range meta.Access {
		if entry.Entity != entity {
			newAccessList = append(newAccessList, entry)
		}
	}

	// Only proceed with the update if something in the access list was removed.
	// Additionally, we use the ETag from the initial metadata to ensure no
	// other changes were made to the access list in the interim.
	if len(newAccessList) < len(meta.Access) {
		update := bigquery.DatasetMetadataToUpdate{
			Access: newAccessList,
		}
		meta, err = dataset.Update(ctx, update, meta.ETag)
		if err != nil {
			return err
		}
	} else {
		return fmt.Errorf("no access entry for entity %q was found", entity)
	}

	fmt.Fprintf(w, "Details for Access entries in dataset %v.\n", datasetID)
	for _, access := range meta.Access {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", access.Role)
		fmt.Fprintf(w, "Entity: %v\n", access.Entity)
	}

	return nil
}
import com.google.cloud.bigquery.Acl;
import com.google.cloud.bigquery.Acl.Entity;
import com.google.cloud.bigquery.Acl.Group;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import com.google.cloud.bigquery.DatasetId;
import java.util.List;

public class RevokeDatasetAccess {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project and dataset from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    // Group to remove from the ACL.
    String entityEmail = "group-to-remove@example.com";
    revokeDatasetAccess(projectId, datasetName, entityEmail);
  }

  public static void revokeDatasetAccess(String projectId, String datasetName, String entityEmail) {
    try {
      // Initialize client that will be used to send requests. This client only needs
      // to be created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create datasetId with the projectId and the datasetName.
      DatasetId datasetId = DatasetId.of(projectId, datasetName);
      Dataset dataset = bigquery.getDataset(datasetId);

      // Create a new Entity with the corresponding type and email
      // "user-or-group-to-remove@example.com".
      // For more information on the types of Entities available see:
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl.Entity
      // and
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl.Entity.Type
      Entity entity = new Group(entityEmail);

      // To revoke access to a dataset, remove elements from the Acl list.
      // Find more information about ACL and the Acl Class here:
      // https://cloud.google.com/storage/docs/access-control/lists
      // https://cloud.google.com/java/docs/reference/google-cloud-bigquery/latest/com.google.cloud.bigquery.Acl
      // Remove the entity from the ACLs list.
      List<Acl> acls =
          dataset.getAcl().stream().filter(acl -> !acl.getEntity().equals(entity)).toList();

      // Update the ACLs by setting the new list.
      bigquery.update(dataset.toBuilder().setAcl(acls).build());
      System.out.println("ACLs of \"" + datasetName + "\" updated successfully");
    } catch (BigQueryException e) {
      System.out.println("ACLs were not updated \n" + e.toString());
    }
  }
}
Update the dataset access list by removing the specified entry from the existing list. Use the Dataset#get() method to retrieve the current metadata, modify the access property to exclude the desired entity, and then call the Dataset#setMetadata() function to apply the updated access list.
/**
 * TODO(developer): Update and un-comment below lines.
 */
// const datasetId = "my_project_id.my_dataset"
// ID of the user or group from whom you are revoking access.
// const entityId = "user-or-group-to-remove@example.com"

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const bigquery = new BigQuery();

async function revokeDatasetAccess() {
  const [dataset] = await bigquery.dataset(datasetId).get();

  // To revoke access to a dataset, remove elements from the access list.
  //
  // See the BigQuery client library documentation for more details on access entries:
  // https://cloud.google.com/nodejs/docs/reference/bigquery/latest

  // Filter access entries to exclude entries matching the specified entity_id
  // and assign a new list back to the access list.
  dataset.metadata.access = dataset.metadata.access.filter(entry => {
    return !(
      entry.entity_id === entityId ||
      entry.userByEmail === entityId ||
      entry.groupByEmail === entityId
    );
  });

  // Update will only succeed if the dataset
  // has not been modified externally since retrieval.
  //
  // See the BigQuery client library documentation for more details on metadata updates:
  // https://cloud.google.com/bigquery/docs/updating-datasets

  // Update just the 'access entries' property of the dataset.
  await dataset.setMetadata(dataset.metadata);

  console.log(`Revoked access to '${entityId}' from '${datasetId}'.`);
}
from google.cloud import bigquery
from google.api_core.exceptions import PreconditionFailed

# TODO(developer): Update and uncomment the lines below.

# ID of the dataset to revoke access to.
# dataset_id = "my-project.my_dataset"

# ID of the user or group from whom you are revoking access.
# Alternatively, the JSON REST API representation of the entity,
# such as a view's table reference.
# entity_id = "user-or-group-to-remove@example.com"

# Instantiate a client.
client = bigquery.Client()

# Get a reference to the dataset.
dataset = client.get_dataset(dataset_id)

# To revoke access to a dataset, remove elements from the AccessEntry list.
#
# See the BigQuery client library documentation for more details on `access_entries`:
# https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.dataset.Dataset#google_cloud_bigquery_dataset_Dataset_access_entries

# Filter `access_entries` to exclude entries matching the specified entity_id
# and assign a new list back to the AccessEntry list.
dataset.access_entries = [
    entry for entry in dataset.access_entries if entry.entity_id != entity_id
]

# Update will only succeed if the dataset
# has not been modified externally since retrieval.
#
# See the BigQuery client library documentation for more details on `update_dataset`:
# https://cloud.google.com/python/docs/reference/bigquery/latest/google.cloud.bigquery.client.Client#google_cloud_bigquery_client_Client_update_dataset
try:
    # Update just the `access_entries` property of the dataset.
    dataset = client.update_dataset(dataset, ["access_entries"])

    # Notify user that the API call was successful.
    full_dataset_id = f"{dataset.project}.{dataset.dataset_id}"
    print(f"Revoked dataset access for '{entity_id}' to dataset '{full_dataset_id}'.")
except PreconditionFailed:
    # A read-modify-write error.
    print(
        f"Dataset '{dataset.dataset_id}' was modified remotely before this update. "
        "Fetch the latest version and retry."
    )
Work with table and view access controls
Views are treated as table resources in BigQuery. You can provide access to a table or view by granting an IAM principal a predefined or custom role that determines what the principal can do with the table or view. This is also known as attaching an allow policy to a resource. After granting access, you can view the access controls for the table or view, and you can revoke access to the table or view.
Grant access to a table or view
For fine-grained access control, you can grant a predefined or custom
IAM role on a specific table or view. The table or view also
inherits access controls specified at the dataset level and higher. For example,
if you grant a principal the BigQuery Data Owner role on a dataset, that
principal also has BigQuery Data Owner permissions on the tables and views in
the dataset.
To grant access to a table or view, select one of the following options:
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
The following example shows how to use the google_bigquery_table_iam_policy resource to set the IAM policy for the mytable table. This replaces any existing policy already attached to the table:
# This file sets the IAM policy for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "user:user@example.com",
    ]
  }
}

resource "google_bigquery_table_iam_policy" "table_iam_policy" {
  dataset_id  = google_bigquery_table.default.dataset_id
  table_id    = google_bigquery_table.default.table_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}
Set role membership for a table
The following example shows how to use the google_bigquery_table_iam_binding resource to set membership in a given role for the mytable table. This replaces any existing membership in that role. Other roles within the IAM policy for the table are preserved.
# This file sets membership in an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_binding" "table_iam_binding" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataOwner"
  members = [
    "group:group@example.com",
  ]
}
Set role membership for a single principal
The following example shows how to use the google_bigquery_table_iam_member resource to update the IAM policy for the mytable table to grant a role to one principal. Updating this IAM policy does not affect access for any other principals that have been granted that role for the table.
# This file adds a member to an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_member" "table_iam_member" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataEditor"
  member     = "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com"
}
To apply your Terraform configuration in a Google Cloud project, complete the steps described earlier in this document: set the default Google Cloud project, prepare the directory, initialize Terraform, and apply the changes.
Call the resource's IAM().SetPolicy() function to save changes to the access policy for a table or view.
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
	"cloud.google.com/go/iam"
)

// grantAccessToResource creates a new ACL entry granting the VIEWER role to the group "example-analyst-group@google.com".
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func grantAccessToResource(w io.Writer, projectID, datasetID, resourceID string) error {
	// Resource can be a table or a view.
	//
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// resourceID := "myresource"

	ctx := context.Background()

	// Create new client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Get resource policy.
	policy, err := client.Dataset(datasetID).Table(resourceID).IAM().Policy(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.Policy: %w", err)
	}

	// Find more details about IAM Roles here:
	// https://pkg.go.dev/cloud.google.com/go/iam#RoleName
	entityID := "example-analyst-group@google.com"
	roleType := iam.Viewer

	// Add new policy binding.
	policy.Add(fmt.Sprintf("group:%s", entityID), roleType)

	// Update resource's policy.
	err = client.Dataset(datasetID).Table(resourceID).IAM().SetPolicy(ctx, policy)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.SetPolicy: %w", err)
	}

	// Get resource policy again expecting the update.
	policy, err = client.Dataset(datasetID).Table(resourceID).IAM().Policy(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.Policy: %w", err)
	}

	fmt.Fprintf(w, "Details for Access entries in table or view %v.\n", resourceID)
	for _, role := range policy.Roles() {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", role)
		fmt.Fprintf(w, "Entities: %v\n", policy.Members(role))
	}

	return nil
}
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;

public class GrantAccessToTableOrView {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project, dataset and resource (table or view) from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String resourceName = "MY_TABLE_NAME";
    // Role to add to the policy access.
    Role role = Role.of("roles/bigquery.dataViewer");
    // Identity to add to the policy access.
    Identity identity = Identity.user("user-add@example.com");
    grantAccessToTableOrView(projectId, datasetName, resourceName, role, identity);
  }

  public static void grantAccessToTableOrView(
      String projectId, String datasetName, String resourceName, Role role, Identity identity) {
    try {
      // Initialize client that will be used to send requests. This client only needs
      // to be created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create table identity given the projectId, the datasetName and the resourceName.
      TableId tableId = TableId.of(projectId, datasetName, resourceName);

      // Add new user identity to current IAM policy.
      Policy policy = bigquery.getIamPolicy(tableId);
      policy = policy.toBuilder().addIdentity(role, identity).build();

      // Update the IAM policy by setting the new one.
      bigquery.setIamPolicy(tableId, policy);

      System.out.println("IAM policy of resource \"" + resourceName + "\" updated successfully");
    } catch (BigQueryException e) {
      System.out.println("IAM policy was not updated. \n" + e.toString());
    }
  }
}
/**
 * TODO(developer): Update and un-comment below lines.
 */
// const projectId = "YOUR_PROJECT_ID";
// const datasetId = "YOUR_DATASET_ID";
// const tableId = "YOUR_TABLE_ID";
// const principalId = "YOUR_PRINCIPAL_ID";
// const role = "YOUR_ROLE";

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const client = new BigQuery();

async function grantAccessToTableOrView() {
  const dataset = client.dataset(datasetId);
  const table = dataset.table(tableId);

  // Get the IAM access policy for the table or view.
  const [policy] = await table.getIamPolicy();

  // Initialize bindings array.
  if (!policy.bindings) {
    policy.bindings = [];
  }

  // To grant access to a table or view,
  // add bindings to the Table or View policy.
  //
  // Find more details about Policy and Binding objects here:
  // https://cloud.google.com/security-command-center/docs/reference/rest/Shared.Types/Policy
  // https://cloud.google.com/security-command-center/docs/reference/rest/Shared.Types/Binding
  const binding = {
    role,
    members: [principalId],
  };
  policy.bindings.push(binding);

  // Set the IAM access policy with updated bindings.
  await table.setIamPolicy(policy);

  // Show a success message.
  console.log(
    `Role '${role}' granted for principal '${principalId}' on resource '${datasetId}.${tableId}'.`
  );
}

grantAccessToTableOrView();
from google.cloud import bigquery

# TODO(developer): Update and uncomment the lines below.

# Google Cloud Platform project.
# project_id = "my_project_id"

# Dataset where the table or view is.
# dataset_id = "my_dataset"

# Table or view name to get the access policy.
# resource_name = "my_table"

# Principal to grant access to a table or view.
# For more information about principal identifiers see:
# https://cloud.google.com/iam/docs/principal-identifiers
# principal_id = "user:bob@example.com"

# Role to grant to the principal.
# For more information about BigQuery roles see:
# https://cloud.google.com/bigquery/docs/access-control
# role = "roles/bigquery.dataViewer"

# Instantiate a client.
client = bigquery.Client()

# Get the full table or view name.
full_resource_name = f"{project_id}.{dataset_id}.{resource_name}"

# Get the IAM access policy for the table or view.
policy = client.get_iam_policy(full_resource_name)

# To grant access to a table or view, add bindings to the IAM policy.
#
# Find more details about Policy and Binding objects here:
# https://cloud.google.com/security-command-center/docs/reference/rest/Shared.Types/Policy
# https://cloud.google.com/security-command-center/docs/reference/rest/Shared.Types/Binding
binding = {
    "role": role,
    "members": [principal_id],
}
policy.bindings.append(binding)

# Set the IAM access policy with updated bindings.
updated_policy = client.set_iam_policy(full_resource_name, policy)

# Show a success message.
print(
    f"Role '{role}' granted for principal '{principal_id}' on resource '{full_resource_name}'."
)
Predefined roles that grant access to tables and views
Views are treated as table resources in BigQuery. For
fine-grained access control, you can grant a predefined or custom
IAM role on a specific table or view. The table or view also
inherits access controls specified at the dataset level and higher. For example,
if you grant a principal the BigQuery Data Owner role on a dataset, that
principal also has Data Owner permissions on the tables and views in the
dataset.
The following predefined IAM roles have permissions on tables or
views.
When granted on a table or view, this role grants these permissions:
Get metadata and access controls for the table or view.
Permissions for tables and views
Views are treated as table resources in BigQuery. All table-level
permissions apply to views.
Most permissions that begin with bigquery.tables apply at the table level. bigquery.tables.create and bigquery.tables.list don't. To create and list tables or views, the bigquery.tables.create and bigquery.tables.list permissions must be granted to a role on a parent container: the dataset or the project.
The following table lists all permissions for tables and views and the
lowest-level resource they can be granted to.
Permission | Resource | Action
bigquery.tables.create | Dataset | Create new tables in the dataset.
bigquery.tables.createIndex | Table | Create a search index on the table.
bigquery.tables.deleteIndex | Table | Delete a search index on the table.
bigquery.tables.createSnapshot | Table | Create a snapshot of the table. Creating a snapshot requires several additional permissions at the table and dataset level. For details, see Permissions and roles for creating table snapshots.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
To get an existing access policy and output it to a local file in JSON, use the bq get-iam-policy command in Cloud Shell:
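The command takes the following general form (a sketch; additional flags may apply depending on the resource type):

bq get-iam-policy PROJECT_ID:DATASET.RESOURCE > PATH_TO_FILE

Replace the following:

PROJECT_ID: your project ID

DATASET: the name of the dataset that contains the resource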
RESOURCE: the name of the table or view whose
policy you want to view
PATH_TO_FILE: the path to the JSON file on your
local machine
SQL
Preview
This product or feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA products and features are available "as is" and might have limited support. For more information, see the launch stage descriptions.
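As with datasets, a query against INFORMATION_SCHEMA.OBJECT_PRIVILEGES can list grants on a table or view. A minimal sketch under the same assumptions as the dataset example; RESOURCE is a placeholder for the table or view name:

SELECT *
FROM `region-us`.INFORMATION_SCHEMA.OBJECT_PRIVILEGES
WHERE object_name = 'RESOURCE';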
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// viewTableOrViewAccessPolicies retrieves the ACL for the given resource.
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func viewTableOrViewAccessPolicies(w io.Writer, projectID, datasetID, resourceID string) error {
	// Resource can be a table or a view.
	//
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "my-dataset-id"
	// resourceID := "my-resource-id"

	ctx := context.Background()

	// Create new client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Get the resource's policy access.
	policy, err := client.Dataset(datasetID).Table(resourceID).IAM().Policy(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.Policy: %w", err)
	}

	fmt.Fprintf(w, "Details for Access entries in table or view %v.\n", resourceID)
	for _, role := range policy.Roles() {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", role)
		fmt.Fprintf(w, "Entities: %v\n", policy.Members(role))
	}

	return nil
}
import com.google.cloud.Policy;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;

public class GetTableOrViewAccessPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project, dataset and resource (table or view) from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String resourceName = "MY_RESOURCE_NAME";
    getTableOrViewAccessPolicy(projectId, datasetName, resourceName);
  }

  public static void getTableOrViewAccessPolicy(
      String projectId, String datasetName, String resourceName) {
    try {
      // Initialize client that will be used to send requests. This client only needs
      // to be created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create table identity given the projectId, the datasetName and the resourceName.
      TableId tableId = TableId.of(projectId, datasetName, resourceName);

      // Get the table IAM policy.
      Policy policy = bigquery.getIamPolicy(tableId);

      // Show policy details.
      // Find more information about the Policy Class here:
      // https://cloud.google.com/java/docs/reference/google-cloud-core/latest/com.google.cloud.Policy
      System.out.println(
          "IAM policy info of resource \"" + resourceName + "\" retrieved successfully");
      System.out.println();
      System.out.println("IAM policy info: " + policy.toString());
    } catch (BigQueryException e) {
      System.out.println("IAM policy info not retrieved. \n" + e.toString());
    }
  }
}
Retrieve the IAM policy for a table or view using the Table#getIamPolicy() function. The access policy details are available in the returned policy object.
/**
 * TODO(developer): Update and un-comment below lines.
 */
// const projectId = "YOUR_PROJECT_ID"
// const datasetId = "YOUR_DATASET_ID"
// const resourceName = "YOUR_RESOURCE_NAME";

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const client = new BigQuery();

async function viewTableOrViewAccessPolicy() {
  const dataset = client.dataset(datasetId);
  const table = dataset.table(resourceName);

  // Get the IAM access policy for the table or view.
  const [policy] = await table.getIamPolicy();

  // Initialize bindings if they don't exist.
  if (!policy.bindings) {
    policy.bindings = [];
  }

  // Show policy details.
  // Find more details for the Policy object here:
  // https://cloud.google.com/bigquery/docs/reference/rest/v2/Policy
  console.log(`Access Policy details for table or view '${resourceName}'.`);
  console.log(`Bindings: ${JSON.stringify(policy.bindings, null, 2)}`);
  console.log(`etag: ${policy.etag}`);
  console.log(`Version: ${policy.version}`);
}
Revoke access to a table or view
To revoke access to a table or view, select one of the following options:
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
	"cloud.google.com/go/iam"
)

// revokeTableOrViewAccessPolicies creates a new ACL removing the VIEWER role
// from the group "example-analyst-group@google.com".
// For more information on the types of ACLs available see:
// https://cloud.google.com/storage/docs/access-control/lists
func revokeTableOrViewAccessPolicies(w io.Writer, projectID, datasetID, resourceID string) error {
	// Resource can be a table or a view.
	//
	// TODO(developer): uncomment and update the following lines:
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	// resourceID := "myresource"

	ctx := context.Background()

	// Create new client.
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	// Get resource policy.
	policy, err := client.Dataset(datasetID).Table(resourceID).IAM().Policy(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.Policy: %w", err)
	}

	// Find more details about IAM Roles here:
	// https://pkg.go.dev/cloud.google.com/go/iam#RoleName
	entityID := "example-analyst-group@google.com"
	roleType := iam.Viewer

	// Revoke policy access.
	policy.Remove(fmt.Sprintf("group:%s", entityID), roleType)

	// Update resource's policy.
	err = client.Dataset(datasetID).Table(resourceID).IAM().SetPolicy(ctx, policy)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.SetPolicy: %w", err)
	}

	// Get resource policy again expecting the update.
	policy, err = client.Dataset(datasetID).Table(resourceID).IAM().Policy(ctx)
	if err != nil {
		return fmt.Errorf("bigquery.Dataset.Table.IAM.Policy: %w", err)
	}

	fmt.Fprintf(w, "Details for Access entries in table or view %v.\n", resourceID)
	for _, role := range policy.Roles() {
		fmt.Fprintln(w)
		fmt.Fprintf(w, "Role: %s\n", role)
		fmt.Fprintf(w, "Entities: %v\n", policy.Members(role))
	}

	return nil
}
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class RevokeAccessToTableOrView {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    // Project, dataset and resource (table or view) from which to get the access policy.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String resourceName = "MY_RESOURCE_NAME";
    // Role to remove from the access policy.
    Role role = Role.of("roles/bigquery.dataViewer");
    // Identity to remove from the access policy.
    Identity user = Identity.user("user-add@example.com");
    revokeAccessToTableOrView(projectId, datasetName, resourceName, role, user);
  }

  public static void revokeAccessToTableOrView(
      String projectId, String datasetName, String resourceName, Role role, Identity identity) {
    try {
      // Initialize client that will be used to send requests. This client only needs
      // to be created once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Create table identity given the projectId, the datasetName and the resourceName.
      TableId tableId = TableId.of(projectId, datasetName, resourceName);

      // Remove either identities or roles, or both, from bindings and replace them in
      // the current IAM policy.
      Policy policy = bigquery.getIamPolicy(tableId);

      // Create a copy of an immutable map.
      Map<Role, Set<Identity>> bindings = new HashMap<>(policy.getBindings());

      // Remove all identities with a specific role.
      bindings.remove(role);
      // Update bindings.
      policy = policy.toBuilder().setBindings(bindings).build();

      // Remove one identity in all the existing roles.
      // Iterate over a copy of the key set to avoid modifying the map while iterating.
      for (Role roleKey : new HashSet<>(bindings.keySet())) {
        if (bindings.get(roleKey).contains(identity)) {
          // Create a copy of an immutable set if the identity is present in the role.
          Set<Identity> identities = new HashSet<>(bindings.get(roleKey));
          // Remove identity.
          identities.remove(identity);
          bindings.put(roleKey, identities);
          if (bindings.get(roleKey).isEmpty()) {
            // Remove the role if it has no identities.
            bindings.remove(roleKey);
          }
        }
      }
      // Update bindings.
      policy = policy.toBuilder().setBindings(bindings).build();

      // Update the IAM policy by setting the new one.
      bigquery.setIamPolicy(tableId, policy);

      System.out.println("IAM policy of resource \"" + resourceName + "\" updated successfully");
    } catch (BigQueryException e) {
      System.out.println("IAM policy was not updated. \n" + e.toString());
    }
  }
}
Retrieve the current IAM policy for a table or view using the Table#getIamPolicy() method.
Modify the policy to remove the desired role or principal, and then apply the updated policy using the Table#setIamPolicy() method.
/**
 * TODO(developer): Update and un-comment below lines
 */
// const projectId = "YOUR_PROJECT_ID"
// const datasetId = "YOUR_DATASET_ID"
// const tableId = "YOUR_TABLE_ID"
// const roleToRemove = "YOUR_ROLE"
// const principalToRemove = "YOUR_PRINCIPAL_ID"

const {BigQuery} = require('@google-cloud/bigquery');

// Instantiate a client.
const client = new BigQuery();

async function revokeAccessToTableOrView() {
  const dataset = client.dataset(datasetId);
  const table = dataset.table(tableId);

  // Get the IAM access policy for the table or view.
  const [policy] = await table.getIamPolicy();

  // Initialize bindings array.
  if (!policy.bindings) {
    policy.bindings = [];
  }

  // To revoke access to a table or view,
  // remove bindings from the table or view policy.
  //
  // Find more details about Policy objects here:
  // https://cloud.google.com/security-command-center/docs/reference/rest/Shared.Types/Policy

  if (principalToRemove) {
    // Create a copy of bindings for modifications.
    const bindings = [...policy.bindings];

    // Filter out the principal from each binding.
    for (const binding of bindings) {
      if (binding.members) {
        binding.members = binding.members.filter(m => m !== principalToRemove);
      }
    }

    // Filter out bindings with empty members.
    policy.bindings = bindings.filter(
      binding => binding.members && binding.members.length > 0,
    );
  }

  if (roleToRemove) {
    // Filter out all bindings with the roleToRemove
    // and assign a new list back to the policy bindings.
    policy.bindings = policy.bindings.filter(b => b.role !== roleToRemove);
  }

  // Set the IAM access policy with updated bindings.
  await table.setIamPolicy(policy);

  // Both role and principal are removed.
  if (roleToRemove !== null && principalToRemove !== null) {
    console.log(
      `Role '${roleToRemove}' revoked for principal '${principalToRemove}' on resource '${datasetId}.${tableId}'.`,
    );
  }
  // Only role is removed.
  if (roleToRemove !== null && principalToRemove === null) {
    console.log(
      `Role '${roleToRemove}' revoked for all principals on resource '${datasetId}.${tableId}'.`,
    );
  }
  // Only principal is removed.
  if (roleToRemove === null && principalToRemove !== null) {
    console.log(
      `Access revoked for principal '${principalToRemove}' on resource '${datasetId}.${tableId}'.`,
    );
  }
  // No changes were made.
  if (roleToRemove === null && principalToRemove === null) {
    console.log(
      `No changes made to access policy for '${datasetId}.${tableId}'.`,
    );
  }
}
You can provide access to a routine by granting an IAM
principal a predefined or custom role
that determines what the principal can do with the routine. This is also known
as attaching an allow policy to a resource. After
granting access, you can view the access controls for the routine, and you
can revoke access to the routine.
Grant access to a routine
For fine-grained access control, you can grant a predefined or custom
IAM role on a specific routine. The routine also inherits access
controls specified at the dataset level and higher. For example, if you grant a
principal the BigQuery Data Owner role on a dataset, that principal also has
Data Owner permissions on the routines in the dataset.
At the bottom of the Google Cloud console, a Cloud Shell session starts and
displays a command-line prompt. Cloud Shell is a shell environment with the
Google Cloud CLI already installed and with values already set for your
current project. It can take a few seconds for the session to initialize.
To write the existing routine information (including access controls) to
a JSON file, use the bq get-iam-policy command:
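A minimal sketch of the command, assuming the DATASET.ROUTINE resource
identifier format that the bq tool uses elsewhere:

bq get-iam-policy \
    --format=prettyjson \
    DATASET.ROUTINE > PATH_TO_FILE

Replace the following: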
DATASET: the name of the dataset that
contains the routine that you want to update
ROUTINE: the name of the resource to update
PATH_TO_FILE: the path to the JSON file on
your local machine
Make changes to the bindings section of the JSON file. A binding binds
one or more principals to a single role. Principals can
be user accounts, service accounts, Google groups, and domains. For
example, the bindings section of a routine's JSON file would look like
the following:
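A representative sketch in the standard IAM policy format (the role,
members, and etag shown are hypothetical):

{
  "bindings": [
    {
      "role": "roles/bigquery.dataViewer",
      "members": [
        "group:analysts@example.com",
        "user:alice@example.com"
      ]
    }
  ],
  "etag": "BwWWja0YfJA=",
  "version": 1
}

After editing, you can apply the updated policy with the bq set-iam-policy
command, passing the resource identifier and the policy file, for example:
bq set-iam-policy DATASET.ROUTINE PATH_TO_FILE.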
The following predefined IAM roles have permissions on routines.
When granted on a routine, such a role grants permissions on that routine,
for example the ability to reference, in a query, a routine created by
someone else.
Permissions for routines
Most permissions that begin with bigquery.routines apply at the routine
level; bigquery.routines.create and bigquery.routines.list don't. To
create and list routines, the bigquery.routines.create and
bigquery.routines.list permissions must be granted to a role on the parent
container, the dataset.
The following table lists all permissions for routines and the lowest-level
resource they can be granted to.
Permission | Resource | Description
--- | --- | ---
bigquery.routines.create | Dataset | Create a routine in the dataset. This permission also requires bigquery.jobs.create to run a query job that contains a CREATE FUNCTION statement.
bigquery.routines.delete | Routine | Delete a routine.
bigquery.routines.get | Routine | Reference a routine created by someone else. This permission also requires bigquery.jobs.create to run a query job that references the routine, and you also need permission to access any resources that the routine references, such as tables or views.
bigquery.routines.list | Dataset | List routines in the dataset and show metadata for routines.
bigquery.routines.update | Routine | Update routine definitions and metadata.
bigquery.routines.getIamPolicy | Routine | Get access controls for the routine.
bigquery.routines.setIamPolicy | Routine | Set access controls for the routine.
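For example, referencing another user's routine in a query exercises
bigquery.routines.get together with bigquery.jobs.create. A sketch, with
hypothetical dataset, routine, and table names:

SELECT `myproject.mydataset.to_celsius`(temp_f) AS temp_c
FROM `myproject.mydataset.weather`;

Running this query requires bigquery.routines.get on to_celsius,
bigquery.jobs.create in the project, and read access to the weather table.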
View the access controls for a routine
To view the access controls for a routine, choose one of the following options:
At the bottom of the Google Cloud console, a Cloud Shell session starts and
displays a command-line prompt. Cloud Shell is a shell environment with the
Google Cloud CLI already installed and with values already set for your
current project. It can take a few seconds for the session to initialize.
To write the existing routine information (including access controls) to
a JSON file, use the bq get-iam-policy command:
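As in the grant procedure above, a minimal sketch of the command, assuming
the DATASET.ROUTINE identifier format:

bq get-iam-policy \
    --format=prettyjson \
    DATASET.ROUTINE > PATH_TO_FILE

Replace the following: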
DATASET: the name of the dataset that contains
the routine that you want to update
ROUTINE: the name of the resource to update
PATH_TO_FILE: the path to the JSON file on
your local machine
In the policy file, the value for version remains 1. This number
refers to the IAM policy schema version, not the
version of the policy. The etag value is the policy version number.
Make changes to the access section of the JSON file. You can remove
any of the specialGroup entries: projectOwners, projectWriters,
projectReaders, and allAuthenticatedUsers. You can also remove any
of the following: userByEmail, groupByEmail, and domain.
For example, the access section of a routine's JSON file would look
like the following:
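A representative sketch (the entries shown are hypothetical):

"access": [
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  },
  {
    "role": "WRITER",
    "userByEmail": "user@example.com"
  },
  {
    "role": "READER",
    "domain": "example.com"
  }
]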
Edit the policy to add principals or bindings, or both.
For the format required for the policy, see the Policy reference topic.
View inherited access controls for a resource
You can examine the inherited IAM roles for a resource by using
the BigQuery web UI. You'll need theappropriate permissions to view inheritancein the console. To examine inheritance for a dataset, table, view, or routine:
In the Google Cloud console, go to the BigQuery page.
In the Explorer pane, select the dataset, or expand the dataset and
select a table, view, or routine.
For a dataset, click Sharing. For a table, view, or routine, click Share.
Verify that the Show inherited roles in table option is enabled.
Expand a role in the table.
In the Inheritance column, the hexagonal icon indicates whether the role
was inherited from a parent resource.
Deny access to a resource
IAM deny policies let you set
guardrails on access to BigQuery resources. You can define deny rules
that prevent selected principals from using certain permissions, regardless of
the roles they're granted.
For information about how to create, update, and delete deny policies, see Deny access to resources.
Special cases
Consider the following scenarios when you create IAM deny policies on a few BigQuery permissions:
Blocks all BigQuery authorized resources in the specified project. PROJECT_NUMBER is an automatically generated unique identifier for your project of type INT64.
To exempt certain principals from the deny policy, specify those
principals in the exceptionPrincipals field of your deny policy. For example,
exceptionPrincipals: "principalSet://bigquery.googleapis.com/projects/1234/*".
BigQuery caches the query results of a job owner for 24 hours,
which the job owner can access without needing
the bigquery.tables.getData permission on the table containing the
data. Hence, adding an IAM deny policy to the
bigquery.tables.getData permission doesn't block the job owner's access to
cached results until the cache expires. To block the job owner's access to
cached results, create a separate deny policy on the bigquery.jobs.create
permission.
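A sketch of such a deny rule in the IAM v2 deny-policy format, blocking
bigquery.jobs.create for one principal while exempting another (the display
name and principals shown are hypothetical):

{
  "displayName": "Block query job creation",
  "rules": [
    {
      "denyRule": {
        "deniedPrincipals": [
          "principal://goog/subject/analyst@example.com"
        ],
        "exceptionPrincipals": [
          "principal://goog/subject/admin@example.com"
        ],
        "deniedPermissions": [
          "bigquery.googleapis.com/jobs.create"
        ]
      }
    }
  ]
}

A policy file like this one can be attached with the gcloud iam policies
create command, as described in Deny access to resources.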
To prevent unintended data access when using deny policies to block data read
operations, we recommend that you also review and revoke any existing
subscriptions on the dataset.
To create an IAM deny policy for
viewing dataset access controls, deny the following permissions:
bigquery.datasets.get
bigquery.datasets.getIamPolicy
To create an IAM deny policy for
updating dataset access controls, deny the following permissions:
bigquery.datasets.update
bigquery.datasets.setIamPolicy
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-03 UTC."],[[["\u003cp\u003eThis document explains how to manage access to BigQuery resources using Identity and Access Management (IAM), covering how to view, grant, and revoke access policies.\u003c/p\u003e\n"],["\u003cp\u003eModifying IAM policies for resources requires specific permissions, primarily within the \u003ccode\u003ebigquery.datasets\u003c/code\u003e and \u003ccode\u003ebigquery.tables\u003c/code\u003e categories, which can be obtained through the BigQuery Data Owner role or custom roles.\u003c/p\u003e\n"],["\u003cp\u003eYou can view the access policy of a dataset or a table/view through the console, using the \u003ccode\u003ebq\u003c/code\u003e command-line tool, or by using the BigQuery API with Python code.\u003c/p\u003e\n"],["\u003cp\u003eAccess to a dataset can be granted using the console, SQL \u003ccode\u003eGRANT\u003c/code\u003e statements, \u003ccode\u003ebq\u003c/code\u003e commands, Terraform, and API methods, allowing control over who can view or modify data.\u003c/p\u003e\n"],["\u003cp\u003eRevoking access involves removing principals or roles from the access policy, and can be done through the console, SQL \u003ccode\u003eREVOKE\u003c/code\u003e statements, \u003ccode\u003ebq\u003c/code\u003e commands, or API methods, with considerations for inherited access and special cases like cached results.\u003c/p\u003e\n"]]],[],null,[]]