Using Sensitive Data Protection to scan BigQuery data
Knowing where your sensitive data exists is often the first step in ensuring
that it is properly secured and managed. This knowledge can help reduce the risk
of exposing sensitive details such as credit card numbers, medical information,
Social Security numbers, driver's license numbers, addresses, full names, and
company-specific secrets. Periodic scanning of your data can also help with
compliance requirements and ensure best practices are followed as your data
grows and changes with use. To help meet compliance requirements, use
Sensitive Data Protection to inspect your BigQuery tables and
to help protect your sensitive data.
There are two ways to scan your BigQuery data:
Sensitive data profiling. Sensitive Data Protection can generate profiles about
BigQuery data across an organization, folder, or project. Data profiles
contain metrics and metadata about your tables and help you determine where
sensitive and high-risk data reside. Sensitive Data Protection reports these
metrics at the project, table, and column levels. For more information, see
Data profiles for BigQuery data.
On-demand inspection. Sensitive Data Protection can perform a deep inspection
of a single table or a subset of columns and report its findings down to the
cell level. This kind of inspection can help you identify individual instances
of specific data types, such as the precise location of a credit card number
inside a table cell. You can do an on-demand inspection through the
Sensitive Data Protection page in the Google Cloud console, the BigQuery page
in the Google Cloud console, or programmatically through the DLP API.
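The programmatic route goes through the DLP API's projects.dlpJobs.create method. As a rough sketch (not the console flow this page covers), the JSON request body for inspecting a BigQuery table looks like the following; the project, dataset, and table IDs are placeholders:

```python
# Sketch of a projects.dlpJobs.create request body for an on-demand
# inspection of a BigQuery table. All resource names are placeholders.
inspect_job_request = {
    "inspectJob": {
        "storageConfig": {
            "bigQueryOptions": {
                "tableReference": {
                    "projectId": "my-project",  # placeholder
                    "datasetId": "my_dataset",  # placeholder
                    "tableId": "my_table",      # placeholder
                },
                # Sample a subset of rows instead of scanning the full table.
                "rowsLimit": 1000,
                "sampleMethod": "RANDOM_START",
            }
        },
        "inspectConfig": {
            "infoTypes": [{"name": "CREDIT_CARD_NUMBER"}],
            "includeQuote": True,  # include the matched text in findings
        },
    }
}
```

The console steps described on this page assemble an equivalent configuration for you.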
This page describes how to do an on-demand inspection through the BigQuery page in the Google Cloud console.
Sensitive Data Protection is a fully managed service that lets Google Cloud customers
identify and protect sensitive data at scale. Sensitive Data Protection uses more
than 150 predefined detectors to identify patterns, formats, and checksums.
Sensitive Data Protection also provides a set of tools to de-identify your data
including masking, tokenization, pseudonymization, date shifting, and more, all
without replicating customer data.
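As an illustrative sketch of one such de-identification tool, the following hypothetical DeidentifyConfig fragment masks detected credit card numbers with the built-in character-mask transformation; the specific masking character and count are assumptions for the example:

```python
# Hypothetical DeidentifyConfig: mask the first 12 characters of any
# detected credit card number, leaving the last digits visible.
deidentify_config = {
    "infoTypeTransformations": {
        "transformations": [
            {
                "infoTypes": [{"name": "CREDIT_CARD_NUMBER"}],
                "primitiveTransformation": {
                    "characterMaskConfig": {
                        "maskingCharacter": "#",  # assumed mask character
                        "numberToMask": 12,       # assumed count
                        "reverseOrder": False,    # mask from the start
                    }
                },
            }
        ]
    }
}
```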
Before you begin
Get familiar with Sensitive Data Protection pricing and how to keep
Sensitive Data Protection costs under control.
Enable the DLP API.
Ensure that the user creating your Sensitive Data Protection jobs is granted an
appropriate predefined Sensitive Data Protection IAM role or sufficient
permissions to run Sensitive Data Protection jobs.
Note: When you enable the DLP API, a service account is created with a name
similar to service-project_number@dlp-api.iam.gserviceaccount.com. This service
account is granted the DLP API Service Agent role, which lets the service
account authenticate with the BigQuery API.
Scanning BigQuery data using the Google Cloud console
To scan BigQuery data, you create a Sensitive Data Protection job
that analyzes a table. You can scan a BigQuery table quickly by using
the Scan with Sensitive Data Protection option in the BigQuery Google Cloud console.
To scan a BigQuery table using Sensitive Data Protection:
In the Google Cloud console, go to the BigQuery page.
In the Explorer panel, expand your project and dataset, then select
the table.
Click Export > Scan with Sensitive Data Protection. The Sensitive Data Protection
job creation page opens in a new tab.
For Step 1: Choose input data, enter a job ID. The values in the Location
section are automatically generated. Also, the Sampling section is automatically
configured to run a sample scan against your data, but you can adjust the
settings as needed.
Click Continue.
Optional: For Step 2: Configure detection, you can configure what types
of data to look for, called infoTypes.
Do one of the following:
To select from the list of predefined infoTypes, click Manage
infoTypes. Then, select the infoTypes you want to search for.
To use an existing inspection template, in the Template name field,
enter the template's full resource name.
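For reference, the choices in this step map onto the DLP API's InspectConfig message. A minimal sketch, with assumed infoType selections and a placeholder template path:

```python
# Sketch of an InspectConfig naming specific built-in infoTypes.
# The minLikelihood threshold here is an assumed example value.
inspect_config = {
    "infoTypes": [
        {"name": "CREDIT_CARD_NUMBER"},
        {"name": "US_SOCIAL_SECURITY_NUMBER"},
        {"name": "EMAIL_ADDRESS"},
    ],
    "minLikelihood": "POSSIBLE",
}

# Alternatively, a template's full resource name looks like this
# (placeholder project and template IDs):
template_name = "projects/my-project/inspectTemplates/my-template"
```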
Optional: For Step 3: Add actions, turn on Save to BigQuery
to publish your Sensitive Data Protection findings to a BigQuery
table. If you don't store findings, the completed job contains only
statistics about the number of findings and their infoTypes. Saving
findings to BigQuery saves details about the precise location and
confidence of each individual finding.
Optional: If you turned on Save to BigQuery, in the Save to
BigQuery section, enter the following information:
Project ID: the project ID where your results are stored.
Dataset ID: the name of the dataset that stores your results.
Optional: Table ID: the name of the table that stores your
results. If no table ID is specified, a default name is assigned to
a new table, similar to the following: dlp_googleapis_date_1234567890.
If you specify an existing table, findings are appended to it.
To include the actual content that was detected, turn on Include quote.
Click Continue.
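For reference, the Save to BigQuery toggle corresponds to the DLP API's SaveFindings action. A minimal sketch with placeholder project, dataset, and table IDs:

```python
# Sketch of a SaveFindings action that writes job findings to a
# BigQuery table. Resource names are placeholders; omitting tableId
# lets the service pick a default table name.
save_findings_action = {
    "saveFindings": {
        "outputConfig": {
            "table": {
                "projectId": "my-project",   # placeholder
                "datasetId": "dlp_results",  # placeholder
                "tableId": "findings",       # optional; placeholder
            },
            # Controls which columns are written for each finding.
            "outputSchema": "BASIC_COLUMNS",
        }
    }
}
```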
Optional: For Step 4: Schedule, configure a time span or schedule by
selecting either Specify time span or Create a trigger to run the job
on a periodic schedule.
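A periodic schedule created in this step corresponds to a DLP API job trigger. A minimal sketch, assuming a daily recurrence:

```python
# Sketch of the schedule portion of a JobTrigger. Durations use a
# seconds string with an "s" suffix; 86400s is one day (assumed
# example cadence).
job_trigger = {
    "triggers": [
        {"schedule": {"recurrencePeriodDuration": "86400s"}}
    ],
    "status": "HEALTHY",
}
```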
ClickContinue.
Optional: On the Review page, examine the details of your job. If needed,
adjust the previous settings.
Click Create.
After the Sensitive Data Protection job completes, you are redirected to the job
details page, and you're notified by email. You can view the results of the
scan on the job details page, or you can click the link to
the Sensitive Data Protection job details page in the job completion email.
If you chose to publish Sensitive Data Protection findings to
BigQuery, on the Job details page, click View Findings in
BigQuery to open the table in the Google Cloud console. You can then query the
table and analyze your findings. For more information on querying your results
in BigQuery, see Querying Sensitive Data Protection findings in BigQuery
in the Sensitive Data Protection documentation.
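As a starting point, a query like the following counts findings per infoType; the table path is a placeholder, and the column names assume the findings output schema written by the Save to BigQuery action:

```python
# Hypothetical GoogleSQL query over a findings table. The table path
# is a placeholder; info_type.name is a column in the findings output
# schema.
query = """
SELECT info_type.name AS info_type, COUNT(*) AS findings
FROM `my-project.dlp_results.findings`
GROUP BY info_type
ORDER BY findings DESC
"""
```

You could run this from the BigQuery query editor or pass the string to a BigQuery client library.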
What's next
Learn more about inspecting BigQuery and other storage repositories for
sensitive data using Sensitive Data Protection.
Learn more about profiling data in an organization, folder, or project.
Read the Identity & Security blog post Take charge of your data: using
Sensitive Data Protection to de-identify and obfuscate sensitive information.
If you want to redact or otherwise de-identify the sensitive data that the
Sensitive Data Protection scan found, see the following:
Inspect text to de-identify sensitive information
De-identifying sensitive data in the Sensitive Data Protection documentation
AEAD encryption concepts in GoogleSQL for information on encrypting individual
values within a table
Protecting data with Cloud KMS keys for information on creating and managing
your own encryption keys in Cloud KMS to encrypt BigQuery tables
Last updated 2025-09-04 UTC.