This predefined role contains
the permissions required to modify IAM policies for resources. To see the exact permissions that are
required, expand the Required permissions section:
Required permissions
The following permissions are required to modify IAM policies for resources:
To get a dataset's access policy: bigquery.datasets.get
To set a dataset's access policy: bigquery.datasets.update
To get a dataset's access policy (Google Cloud console only): bigquery.datasets.getIamPolicy
To set a dataset's access policy (Google Cloud console only): bigquery.datasets.setIamPolicy
To get a table's or view's policy: bigquery.tables.getIamPolicy
To set a table's or view's policy: bigquery.tables.setIamPolicy
To create bq command-line tool or SQL BigQuery jobs (optional): bigquery.jobs.create
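For example, an administrator can grant these permissions by granting a role that contains them at the project level. The following gcloud CLI command is a sketch; ROLE_NAME is a placeholder for whichever predefined or custom role your organization uses:
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="user:USER_EMAIL" \
  --role=ROLE_NAME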
In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
To get an existing policy and output it to a local file in JSON, use the bq show command in Cloud Shell:
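bq show \
  --format=prettyjson \
  PROJECT_ID:DATASET > PATH_TO_FILE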
To get an existing access policy and output it to a local file in JSON, use the bq get-iam-policy command in Cloud Shell:
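# A sketch of the call; RESOURCE stands for the table or view name.
bq get-iam-policy \
  PROJECT_ID:DATASET.RESOURCE > PATH_TO_FILE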
To write the existing dataset information (including access controls) to a
JSON file, use the bq show command:
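bq show \
  --format=prettyjson \
  PROJECT_ID:DATASET > PATH_TO_FILE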
PATH_TO_FILE: the path to the JSON file on your local machine
Make changes to the access section of the JSON file. You can
add to any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can
also add any of the following: userByEmail, groupByEmail, and domain.
For example, the access section of a dataset's JSON file might look like
the following (the entity values shown are placeholders):
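"access": [
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  },
  {
    "role": "READER",
    "userByEmail": "sample.bigquery.dev@gmail.com"
  },
  {
    "role": "WRITER",
    "specialGroup": "projectWriters"
  },
  {
    "role": "OWNER",
    "specialGroup": "projectOwners"
  },
  {
    "role": "READER",
    "domain": "example.com"
  }
]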
When your edits are complete, use the bq update command and include the
JSON file using the --source flag. If the dataset is in a project other
than your default project, add the project ID to the dataset name in the
following format: PROJECT_ID:DATASET.
bq update \
--source PATH_TO_FILE \
PROJECT_ID:DATASET
To verify your access control changes, use the bq show command again
without writing the information to a file:
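bq show --format=prettyjson PROJECT_ID:DATASET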
The following example shows how to use the google_bigquery_dataset_iam_policy resource to set the IAM policy for the mydataset dataset. This replaces any existing policy already attached
to the dataset:
# This file sets the IAM policy for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.admin"
    members = [
      "user:hao@altostrat.com",
    ]
  }
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "group:dba@altostrat.com",
    ]
  }
  binding {
    role = "roles/bigquery.dataEditor"
    members = [
      "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com",
    ]
  }
}

resource "google_bigquery_dataset_iam_policy" "dataset_iam_policy" {
  dataset_id  = google_bigquery_dataset.default.dataset_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}
Set role membership for a dataset
The following example shows how to use the google_bigquery_dataset_iam_binding resource to set membership in a given role for the mydataset dataset. This replaces any existing membership in that role.
Other roles within the IAM policy for the dataset
are preserved:
# This file sets membership in an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_binding" "dataset_iam_binding" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.jobUser"
  members = [
    "user:raha@altostrat.com",
    "group:analysts@altostrat.com"
  ]
}
Set role membership for a single principal
The following example shows how to use the google_bigquery_dataset_iam_member resource to update the IAM policy for the mydataset dataset to grant a role to one principal. Updating this
IAM policy does not affect access for any other principals
that have been granted that role for the dataset.
# This file adds a member to an IAM role for the dataset created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_dataset/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" dataset resource with a dataset_id of "mydataset".

resource "google_bigquery_dataset_iam_member" "dataset_iam_member" {
  dataset_id = google_bigquery_dataset.default.dataset_id
  role       = "roles/bigquery.user"
  member     = "user:yuri@altostrat.com"
}
To apply your Terraform configuration in a Google Cloud project, complete the steps in the
following sections.
Set the default Google Cloud project
where you want to apply your Terraform configurations.
You only need to run this command once per project, and you can run it in any directory.
export GOOGLE_CLOUD_PROJECT=PROJECT_ID
Environment variables are overridden if you set explicit values in the Terraform
configuration file.
Prepare the directory
Each Terraform configuration file must have its own directory (also
called a root module).
In Cloud Shell, create a directory and a new
file within that directory. The filename must have the .tf extension, for example main.tf. In this
tutorial, the file is referred to as main.tf.
mkdir DIRECTORY && cd DIRECTORY && touch main.tf
If you are following a tutorial, you can copy the sample code in each section or step.
Copy the sample code into the newly created main.tf.
Optionally, copy the code from GitHub. This is recommended
when the Terraform snippet is part of an end-to-end solution.
Review and modify the sample parameters to apply to your environment.
Save your changes.
Initialize Terraform. You only need to do this once per directory.
terraform init
Optionally, to use the latest Google provider version, include the -upgrade option:
terraform init -upgrade
Apply the changes
Review the configuration and verify that the resources that Terraform is going to create or
update match your expectations:
terraform plan
Make corrections to the configuration as necessary.
Apply the Terraform configuration by running the following command and entering yes at the prompt:
terraform apply
Wait until Terraform displays the "Apply complete!" message.
Open your Google Cloud project to view
the results. In the Google Cloud console, navigate to your resources in the UI to make sure
that Terraform has created or updated them.
import("context""fmt""cloud.google.com/go/bigquery")// updateDatasetAccessControl demonstrates how the access control policy of a dataset// can be amended by adding an additional entry corresponding to a specific user identity.funcupdateDatasetAccessControl(projectID,datasetIDstring)error{// projectID := "my-project-id"// datasetID := "mydataset"ctx:=context.Background()client,err:=bigquery.NewClient(ctx,projectID)iferr!=nil{returnfmt.Errorf("bigquery.NewClient: %v",err)}deferclient.Close()ds:=client.Dataset(datasetID)meta,err:=ds.Metadata(ctx)iferr!=nil{returnerr}// Append a new access control entry to the existing access list.update:=bigquery.DatasetMetadataToUpdate{Access:append(meta.Access,&bigquery.AccessEntry{Role:bigquery.ReaderRole,EntityType:bigquery.UserEmailEntity,Entity:"sample.bigquery.dev@gmail.com"},),}// Leverage the ETag for the update to assert there's been no modifications to the// dataset since the metadata was originally read.if_,err:=ds.Update(ctx,update,meta.ETag);err!=nil{returnerr}returnnil}
import com.google.cloud.bigquery.Acl;
import com.google.cloud.bigquery.Acl.Role;
import com.google.cloud.bigquery.Acl.User;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import java.util.ArrayList;

public class UpdateDatasetAccess {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    // Create a new ACL granting the READER role to "sample.bigquery.dev@gmail.com"
    // For more information on the types of ACLs available see:
    // https://cloud.google.com/storage/docs/access-control/lists
    Acl newEntry = Acl.of(new User("sample.bigquery.dev@gmail.com"), Role.READER);
    updateDatasetAccess(datasetName, newEntry);
  }

  public static void updateDatasetAccess(String datasetName, Acl newEntry) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      Dataset dataset = bigquery.getDataset(datasetName);

      // Get a copy of the ACLs list from the dataset and append the new entry
      ArrayList<Acl> acls = new ArrayList<>(dataset.getAcl());
      acls.add(newEntry);

      bigquery.update(dataset.toBuilder().setAcl(acls).build());
      System.out.println("Dataset Access Control updated successfully");
    } catch (BigQueryException e) {
      System.out.println("Dataset Access control was not updated \n" + e.toString());
    }
  }
}
from google.cloud import bigquery
from google.cloud.bigquery.enums import EntityTypes

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
dataset_id = "your-project.your_dataset"

# TODO(developer): Set entity_id to the ID of the email or group from whom
# you are adding access. Alternatively, to the JSON REST API representation
# of the entity, such as a view's table reference.
entity_id = "user-or-group-to-add@example.com"

# TODO(developer): Set entity_type to the type of entity you are granting access to.
# Common types include:
#
# * "userByEmail" -- A single user or service account. For example "fred@example.com"
# * "groupByEmail" -- A group of users. For example "example@googlegroups.com"
# * "view" -- An authorized view. For example
#       {"projectId": "p", "datasetId": "d", "tableId": "v"}
#
# For a complete reference, see the REST API reference documentation:
# https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets#Dataset.FIELDS.access
entity_type = EntityTypes.GROUP_BY_EMAIL

# TODO(developer): Set role to one of the "Basic roles for datasets"
# described here:
# https://cloud.google.com/bigquery/docs/access-control-basic-roles#dataset-basic-roles
role = "READER"

# Construct a BigQuery client object.
client = bigquery.Client()

dataset = client.get_dataset(dataset_id)  # Make an API request.

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role=role,
        entity_type=entity_type,
        entity_id=entity_id,
    )
)
dataset.access_entries = entries

dataset = client.update_dataset(dataset, ["access_entries"])  # Make an API request.

full_dataset_id = "{}.{}".format(dataset.project, dataset.dataset_id)
print("Updated dataset '{}' with modified user permissions.".format(full_dataset_id))
The following example shows how to use the google_bigquery_table_iam_policy resource to set the IAM policy for the mytable table. This replaces any existing policy already attached
to the table:
# This file sets the IAM policy for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

data "google_iam_policy" "iam_policy" {
  binding {
    role = "roles/bigquery.dataOwner"
    members = [
      "user:raha@altostrat.com",
    ]
  }
}

resource "google_bigquery_table_iam_policy" "table_iam_policy" {
  dataset_id  = google_bigquery_table.default.dataset_id
  table_id    = google_bigquery_table.default.table_id
  policy_data = data.google_iam_policy.iam_policy.policy_data
}
Set role membership for a table
The following example shows how to use the google_bigquery_table_iam_binding resource to set membership in a given role for the mytable table. This replaces any existing membership in that role.
Other roles within the IAM policy for the table
are preserved.
# This file sets membership in an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_binding" "table_iam_binding" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataOwner"
  members = [
    "group:analysts@altostrat.com",
  ]
}
Set role membership for a single principal
The following example shows how to use the google_bigquery_table_iam_member resource to update the IAM policy for the mytable table to grant a role to one principal. Updating this
IAM policy does not affect access for any other principals
that have been granted that role for the table.
# This file adds a member to an IAM role for the table created by
# https://github.com/terraform-google-modules/terraform-docs-samples/blob/main/bigquery/bigquery_create_table/main.tf.
# You must place it in the same local directory as that main.tf file,
# and you must have already applied that main.tf file to create
# the "default" table resource with a table_id of "mytable".

resource "google_bigquery_table_iam_member" "table_iam_member" {
  dataset_id = google_bigquery_table.default.dataset_id
  table_id   = google_bigquery_table.default.table_id
  role       = "roles/bigquery.dataEditor"
  member     = "serviceAccount:bqcx-1234567891011-12a3@gcp-sa-bigquery-condel.iam.gserviceaccount.com"
}
To apply this Terraform configuration in a Google Cloud project, complete the same steps described in the preceding Terraform section: set the default Google Cloud project, prepare the directory, initialize Terraform, and apply the changes.
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;

// Sample to create an IAM policy for a table
public class CreateIamPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    createIamPolicy(datasetName, tableName);
  }

  public static void createIamPolicy(String datasetName, String tableName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);

      Policy policy = bigquery.getIamPolicy(tableId);
      // Add the new identity to a rebuilt copy of the policy. The builder
      // returns a new Policy object, so the result must be reassigned
      // before it is written back.
      policy =
          policy
              .toBuilder()
              .addIdentity(Role.of("roles/bigquery.dataViewer"), Identity.allUsers())
              .build();
      bigquery.setIamPolicy(tableId, policy);

      System.out.println("Iam policy created successfully");
    } catch (BigQueryException e) {
      System.out.println("Iam policy was not created. \n" + e.toString());
    }
  }
}
from google.cloud import bigquery

bqclient = bigquery.Client()

# TODO(developer): Set your_table_id to the full table ID.
your_table_id = "project.dataset.table"

policy = bqclient.get_iam_policy(your_table_id)

analyst_email = "example-analyst-group@google.com"
binding = {
    "role": "roles/bigquery.dataViewer",
    "members": {f"group:{analyst_email}"},
}
policy.bindings.append(binding)

updated_policy = bqclient.set_iam_policy(your_table_id, policy)

for binding in updated_policy.bindings:
    print(repr(binding))
Revoke access to a resource
The following sections describe how to revoke access to different resources.
To write the existing dataset information (including access controls) to a
JSON file, use the bq show command:
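bq show \
  --format=prettyjson \
  PROJECT_ID:DATASET > PATH_TO_FILE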
PATH_TO_FILE: the path to the JSON file on your local
machine
Make changes to the access section of the JSON file. You can
remove any of the specialGroup entries: projectOwners, projectWriters, projectReaders, and allAuthenticatedUsers. You can
also remove any of the following: userByEmail, groupByEmail, and domain.
For example, the access section of a dataset's JSON file might look like
the following (the entity values shown are placeholders):
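"access": [
  {
    "role": "READER",
    "specialGroup": "projectReaders"
  },
  {
    "role": "WRITER",
    "specialGroup": "projectWriters"
  },
  {
    "role": "OWNER",
    "specialGroup": "projectOwners"
  },
  {
    "role": "READER",
    "domain": "example.com"
  }
]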
When your edits are complete, use the bq update command and include the
JSON file using the --source flag. If the dataset is in a project other
than your default project, add the project ID to the dataset name in the
following format: PROJECT_ID:DATASET.
bq update \
--source PATH_TO_FILE \
PROJECT_ID:DATASET
To verify your access control changes, use the bq show command again
without writing the information to a file:
bq show --format=prettyjson PROJECT_ID:DATASET
API
Call datasets.patch and
use the access property in the Dataset resource to update your access controls.
Because the datasets.update method replaces the entire dataset resource, datasets.patch is the preferred method for updating access controls.
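As an illustrative sketch, a datasets.patch call can be made with curl. The endpoint follows the standard BigQuery v2 REST form; the request body here is a hypothetical minimal access list. Note that any field you supply in the patch body, including access, replaces that field's existing value entirely, so send the complete access list you want to keep:
curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  "https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/datasets/DATASET_ID" \
  --data '{
    "access": [
      {"role": "OWNER", "specialGroup": "projectOwners"},
      {"role": "READER", "userByEmail": "user@example.com"}
    ]
  }'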
import("context""fmt""cloud.google.com/go/bigquery")// revokeDatasetAccess updates the access control on a dataset to remove all// access entries that reference a specific entity.funcrevokeDatasetAccess(projectID,datasetID,entitystring)error{// projectID := "my-project-id"// datasetID := "mydataset"// entity := "user@mydomain.com"ctx:=context.Background()client,err:=bigquery.NewClient(ctx,projectID)iferr!=nil{returnfmt.Errorf("bigquery.NewClient: %v",err)}deferclient.Close()ds:=client.Dataset(datasetID)meta,err:=ds.Metadata(ctx)iferr!=nil{returnerr}varnewAccessList[]*bigquery.AccessEntryfor_,entry:=rangemeta.Access{ifentry.Entity!=entity{newAccessList=append(newAccessList,entry)}}// Only proceed with update if something in the access list was removed.// Additionally, we use the ETag from the initial metadata to ensure no// other changes were made to the access list in the interim.iflen(newAccessList)<len(meta.Access){update:=bigquery.DatasetMetadataToUpdate{Access:newAccessList,}if_,err:=ds.Update(ctx,update,meta.ETag);err!=nil{returnerr}}returnnil}
from google.cloud import bigquery

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
dataset_id = "your-project.your_dataset"

# TODO(developer): Set entity_id to the ID of the email or group from whom you are revoking access.
entity_id = "user-or-group-to-remove@example.com"

# Construct a BigQuery client object.
client = bigquery.Client()

dataset = client.get_dataset(dataset_id)  # Make an API request.

entries = list(dataset.access_entries)
dataset.access_entries = [
    entry for entry in entries if entry.entity_id != entity_id
]

dataset = client.update_dataset(
    dataset,
    # Update just the `access_entries` property of the dataset.
    ["access_entries"],
)  # Make an API request.

full_dataset_id = f"{dataset.project}.{dataset.dataset_id}"
print(f"Revoked dataset access for '{entity_id}' to dataset '{full_dataset_id}'.")
import com.google.cloud.Identity;
import com.google.cloud.Policy;
import com.google.cloud.Role;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.TableId;
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

// Sample to update the IAM policy of a table
public class UpdateIamPolicy {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String tableName = "MY_TABLE_NAME";
    updateIamPolicy(datasetName, tableName);
  }

  public static void updateIamPolicy(String datasetName, String tableName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(datasetName, tableName);

      Policy policy = bigquery.getIamPolicy(tableId);
      // Remove the dataViewer role from a mutable copy of the bindings, then
      // rebuild the policy. The builder returns a new Policy object, so the
      // result must be reassigned before it is written back.
      Map<Role, Set<Identity>> binding = new HashMap<>(policy.getBindings());
      binding.remove(Role.of("roles/bigquery.dataViewer"));
      policy = policy.toBuilder().setBindings(binding).build();
      bigquery.setIamPolicy(tableId, policy);

      System.out.println("Iam policy updated successfully");
    } catch (BigQueryException e) {
      System.out.println("Iam policy was not updated. \n" + e.toString());
    }
  }
}
Deny access to a resource
IAM deny policies let you set
guardrails on access to BigQuery resources. You can define deny rules
that prevent selected principals from using certain permissions, regardless of
the roles they're granted.
For information about how to create, update, and delete deny policies, see Deny access to resources.
Special cases
Consider the following scenarios when you create IAM deny policies on a few BigQuery permissions:
Blocks all BigQuery authorized resources in the specified project. PROJECT_NUMBER is an automatically generated unique identifier for your project, of type INT64.
BigQuery caches the query results of a job owner for 24 hours, and the job owner can access those results without needing
the bigquery.tables.getData permission on the table that contains the
data. Hence, adding an IAM deny policy on the bigquery.tables.getData permission doesn't block the job owner's access to cached results
until the cache expires. To block the job owner's access to
cached results, create a separate deny policy on the bigquery.jobs.create permission.
To prevent unintended data access when using deny policies to block data read
operations, we recommend that you also review and revoke any existing
subscriptions on the dataset.
To create an IAM deny policy for
viewing dataset access controls, deny the following permissions:
bigquery.datasets.get
bigquery.datasets.getIamPolicy
To create an IAM deny policy for
updating dataset access controls, deny the following permissions:
bigquery.datasets.update
bigquery.datasets.setIamPolicy
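As a sketch of how these denials can be expressed (the policy ID, group, and project shown here are hypothetical), a deny policy is defined in a JSON file and attached to the project with the gcloud CLI. Note that deny policies reference permissions in their service-qualified form, for example bigquery.googleapis.com/datasets.update:
{
  "displayName": "Deny dataset access control updates",
  "rules": [
    {
      "denyRule": {
        "deniedPrincipals": [
          "principalSet://goog/group/dev-group@example.com"
        ],
        "deniedPermissions": [
          "bigquery.googleapis.com/datasets.update",
          "bigquery.googleapis.com/datasets.setIamPolicy"
        ]
      }
    }
  ]
}
gcloud iam policies create deny-dataset-acl-updates \
  --attachment-point=cloudresourcemanager.googleapis.com%2Fprojects%2FPROJECT_ID \
  --kind=denypolicies \
  --policy-file=policy.json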