Knowledge Catalog pricing

Knowledge Catalog pricing is based on pay-as-you-go usage. Knowledge Catalog currently charges based on the following SKUs:

  • Knowledge Catalog processing (standard and premium)
  • Metadata storage

The following is a high-level overview of how each key Knowledge Catalog capability is billed:

| Capability | Knowledge Catalog processing | Metadata storage |
| --- | --- | --- |
| Data discovery | Standard | N/A |
| Data lineage | Premium | Yes |
| Data quality | Premium | Yes - if published to Catalog |
| Data profiling | Premium | Yes - if published to Catalog |
| Enrich metadata in Knowledge Catalog | N/A | Yes |
Gemini-powered features in Knowledge Catalog, including data insights and automated metadata generation, are billed as part of Gemini in BigQuery or Gemini Code Assist (see https://cloud.google.com/products/gemini/pricing#gemini-in-bigquery-pricing).

Other usage

Data organization features in Knowledge Catalog (lake, zone, or asset setup) and security policy application and propagation are provided free of charge.

In addition, some Knowledge Catalog functionality (including discovery scans, scheduled data quality and data ingestion tasks, and the Knowledge Catalog managed connectors for ingesting metadata from Cloud SQL and Looker) triggers job execution on GCS, Dataproc Serverless, BigQuery, Dataflow, and Cloud Scheduler. That usage is charged according to the GCS, Dataproc, BigQuery, Dataflow, and Cloud Scheduler pricing models respectively, and the charges appear under those services rather than under Knowledge Catalog.

Knowledge Catalog processing pricing

Knowledge Catalog standard and premium processing are metered in Data Compute Units (DCUs). The DCU-hour is an abstract billing unit for Knowledge Catalog; the actual metering depends on the individual features you use.
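As a rough illustration of how DCU-hour metering translates into a charge, here is a hedged sketch. The rate and the per-second billing with a one-minute minimum come from this page (the minimum is described for scans below); the function itself and its example inputs are illustrative assumptions, not the official meter.

```python
# Hedged sketch of DCU-hour metering; the rate and the one-minute minimum are
# taken from this page, everything else is illustrative.

def dcu_charge(dcu: float, seconds: float, rate_per_dcu_hour: float) -> float:
    """Charge for a job holding `dcu` Data Compute Units for `seconds`,
    billed per second with a one-minute minimum."""
    billable_seconds = max(seconds, 60)           # one-minute minimum
    dcu_hours = dcu * billable_seconds / 3600     # convert to DCU-hours
    return dcu_hours * rate_per_dcu_hour

# A 2-DCU scan that runs for 90 seconds at the standard rate of $0.06/DCU-hour:
print(f"${dcu_charge(2, 90, 0.06):.4f}")  # → $0.0030
```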

Knowledge Catalog standard processing pricing

The Knowledge Catalog standard tier covers the data discovery functionality, which discovers metadata across Knowledge Catalog managed data. Prices for the following regions:

  • Johannesburg (africa-south1)
  • Taiwan (asia-east1)
  • Hong Kong (asia-east2)
  • Tokyo (asia-northeast1)
  • Osaka (asia-northeast2)
  • Seoul (asia-northeast3)
  • Mumbai (asia-south1)
  • Singapore (asia-southeast1)
  • Jakarta (asia-southeast2)
  • Bangkok (asia-southeast3)
  • Sydney (australia-southeast1)
  • Melbourne (australia-southeast2)
  • Warsaw (europe-central2)
  • Finland (europe-north1)
  • Stockholm (europe-north2)
  • Madrid (europe-southwest1)
  • Belgium (europe-west1)
  • Berlin (europe-west10)
  • Turin (europe-west12)
  • London (europe-west2)
  • Frankfurt (europe-west3)
  • Netherlands (europe-west4)
  • Zurich (europe-west6)
  • Milan (europe-west8)
  • Paris (europe-west9)
  • Doha (me-central1)
  • Dammam (me-central2)
  • Tel Aviv (me-west1)
  • Montreal (northamerica-northeast1)
  • Toronto (northamerica-northeast2)
  • Mexico (northamerica-south1)
  • Sao Paulo (southamerica-east1)
  • Santiago (southamerica-west1)
  • Iowa (us-central1)
  • Oklahoma (us-central2)
  • South Carolina (us-east1)
  • Northern Virginia (us-east4)
  • Columbus (us-east5)
  • Dallas (us-south1)
  • Oregon (us-west1)
  • Los Angeles (us-west2)
  • Salt Lake City (us-west3)
  • Las Vegas (us-west4)

| Item | Meter | Default* (USD) | BigQuery CUD - 1 Year* (USD) | BigQuery CUD - 3 Year* (USD) |
| --- | --- | --- | --- | --- |
| Knowledge Catalog processing | per DCU-hour | $0.06 | $0.054 | $0.048 |
* Each consumption model has a unique ID. You may need to opt in to be eligible for consumption model discounts.

Knowledge Catalog free tier

As part of the  Google Cloud Free Tier , Knowledge Catalog offers some resources free of charge up to a specific limit. These free usage limits are available during and after the free trial period. If you go over these usage limits and are no longer in the free trial period, you will be charged according to the pricing as described in the sections above.

Note: The Knowledge Catalog free tier is only available for the Knowledge Catalog Standard Processing SKU, and is not available for the Knowledge Catalog Premium Processing SKU.

| Resource | Monthly free usage limit |
| --- | --- |
| Knowledge Catalog processing | 100 DCU-hours |
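The free tier interacts with the standard rate as follows. This is a minimal sketch assuming the 100 DCU-hour monthly allotment is consumed before billing starts, at the default rate with no CUD discount:

```python
# Sketch of the monthly standard-processing charge after the free tier.
# Assumes the 100 DCU-hour free allotment is consumed before any billing
# starts, at the default rate with no CUD discount.

STANDARD_RATE = 0.06     # USD per DCU-hour (default)
FREE_DCU_HOURS = 100     # monthly free usage limit, standard processing only

def monthly_standard_charge(dcu_hours_used: float) -> float:
    billable = max(0.0, dcu_hours_used - FREE_DCU_HOURS)
    return billable * STANDARD_RATE

print(monthly_standard_charge(80))   # entirely within the free tier
print(monthly_standard_charge(250))  # 150 billable DCU-hours
```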

Knowledge Catalog premium processing pricing

The Knowledge Catalog premium processing tier covers data lineage, data quality, and data profiling.

  • Johannesburg (africa-south1)
  • Taiwan (asia-east1)
  • Hong Kong (asia-east2)
  • Tokyo (asia-northeast1)
  • Osaka (asia-northeast2)
  • Seoul (asia-northeast3)
  • Mumbai (asia-south1)
  • Delhi (asia-south2)
  • Singapore (asia-southeast1)
  • Jakarta (asia-southeast2)
  • Bangkok (asia-southeast3)
  • Sydney (australia-southeast1)
  • Melbourne (australia-southeast2)
  • Warsaw (europe-central2)
  • Finland (europe-north1)
  • Stockholm (europe-north2)
  • Madrid (europe-southwest1)
  • Belgium (europe-west1)
  • Berlin (europe-west10)
  • Turin (europe-west12)
  • London (europe-west2)
  • Frankfurt (europe-west3)
  • Netherlands (europe-west4)
  • Zurich (europe-west6)
  • Milan (europe-west8)
  • Paris (europe-west9)
  • Doha (me-central1)
  • Dammam (me-central2)
  • Tel Aviv (me-west1)
  • Montreal (northamerica-northeast1)
  • Toronto (northamerica-northeast2)
  • Mexico (northamerica-south1)
  • Sao Paulo (southamerica-east1)
  • Santiago (southamerica-west1)
  • Iowa (us-central1)
  • Oklahoma (us-central2)
  • South Carolina (us-east1)
  • Northern Virginia (us-east4)
  • Columbus (us-east5)
  • Alabama (us-east7)
  • Dallas (us-south1)
  • Oregon (us-west1)
  • Los Angeles (us-west2)
  • Salt Lake City (us-west3)
  • Las Vegas (us-west4)
  • Phoenix (us-west8)

| Item | Meter | Default* (USD) | BigQuery CUD - 1 Year* (USD) | BigQuery CUD - 3 Year* (USD) |
| --- | --- | --- | --- | --- |
| Knowledge Catalog premium processing | per DCU-hour | $0.089 | $0.0801 | $0.0712 |
* Each consumption model has a unique ID. You may need to opt in to be eligible for consumption model discounts.
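A note on the discount columns: the listed CUD rates in both pricing tables work out to flat 10% (1-year) and 20% (3-year) reductions from the default rate, which can be checked in a few lines:

```python
# The CUD columns in the pricing tables correspond to flat discounts off the
# default rate: 10% for a 1-year commitment, 20% for a 3-year commitment.
# (Inferred from the listed numbers; the commitment terms themselves are
# defined by BigQuery CUDs, not by this sketch.)

def cud_rate(default_rate: float, years: int) -> float:
    discount = {1: 0.10, 3: 0.20}[years]
    return round(default_rate * (1 - discount), 4)

print(cud_rate(0.089, 1))  # → 0.0801
print(cud_rate(0.089, 3))  # → 0.0712
print(cud_rate(0.06, 1))   # → 0.054
```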

Calculation of DCU charges

DCU charges for each feature are calculated as follows:

1. Auto data quality scans:

  • The DCU-hour consumption is proportional to the processing involved in profiling the data and computing the data quality metrics. This is billed per second, with a minimum of one minute.
  • The charge depends on the number of rows, the number of columns, the amount of data that you've scanned, the data quality rule configuration, the partitioning and clustering settings on the table, and the frequency of the scan.
  • For data quality anomaly detection scans, DCU charges do not apply. Instead, standard BigQuery pricing applies for compute, storage, and BigQuery ML model training, processing, and deployment.
  • Using a custom execution identity changes how you are billed for the scan. When you specify a custom execution identity, the compute and storage costs associated with the scan are billed directly to your BigQuery project, bypassing the standard Knowledge Catalog Premium SKUs.

2. There are several options to reduce the cost of auto data quality scans:

  • Sampling
  • Incremental scans
  • To separate data quality charges from other charges in the Knowledge Catalog premium processing SKU, on the Cloud Billing report, use the label goog-dataplex-workload-type with value DATA_QUALITY.

3. To filter aggregate charges, use the following labels available in billing export in BigQuery:

  • goog-dataplex-datascan-data-source-dataplex-entity
  • goog-dataplex-datascan-data-source-dataplex-lake
  • goog-dataplex-datascan-data-source-dataplex-zone
  • goog-dataplex-datascan-data-source-project
  • goog-dataplex-datascan-data-source-region
  • goog-dataplex-datascan-id
  • goog-dataplex-datascan-job-id
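As an illustration of how these labels can be used, the following sketch filters billing line items shaped like the Cloud Billing BigQuery export (a `cost` value plus key/value `labels`). All record values here are made up for the example:

```python
# Illustrative filtering of billing line items by the labels above. The
# records mimic the shape of the Cloud Billing BigQuery export (`cost` plus
# key/value `labels`); all values here are made up.

line_items = [
    {"cost": 1.20, "labels": [
        {"key": "goog-dataplex-workload-type", "value": "DATA_QUALITY"},
        {"key": "goog-dataplex-datascan-id", "value": "scan-orders"}]},
    {"cost": 0.40, "labels": [
        {"key": "goog-dataplex-workload-type", "value": "DATA_PROFILE"},
        {"key": "goog-dataplex-datascan-id", "value": "scan-users"}]},
]

def label(item, key):
    """Return the value of `key` in an item's labels, or None."""
    return next((l["value"] for l in item["labels"] if l["key"] == key), None)

# Total auto data quality spend, grouped by scan ID:
totals = {}
for item in line_items:
    if label(item, "goog-dataplex-workload-type") == "DATA_QUALITY":
        scan_id = label(item, "goog-dataplex-datascan-id")
        totals[scan_id] = totals.get(scan_id, 0.0) + item["cost"]

print(totals)  # → {'scan-orders': 1.2}
```

The same grouping works with any of the other goog-dataplex-datascan-* labels listed above.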

4. Data profiling scans:

  • The DCU-hour consumption is proportional to the processing involved in profiling the data. This is billed per second, with a minimum of one minute.
  • The charge depends on the number of rows, the number of columns, the amount of data scanned, the partitioning and clustering settings on the table, and the frequency of the scan.
  • Using a custom execution identity changes how you are billed for the scan. When you specify a custom execution identity, the compute and storage costs associated with the scan are billed directly to your BigQuery project, bypassing the standard Knowledge Catalog Premium SKUs.

5. There are several options to reduce the cost of data profiling scans:

  • Sampling
  • Incremental scans
  • Column filtering
  • Row filtering
  • To separate data profiling charges from other charges in the Knowledge Catalog premium processing SKU, on the Cloud Billing report, use the label goog-dataplex-workload-type with value DATA_PROFILE.
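As a rough sketch of why sampling and filtering reduce cost, assuming (hypothetically) that DCU-hour consumption scales linearly with the amount of data scanned; the baseline numbers are illustrative:

```python
# Rough cost estimate for a profiling scan under sampling and filtering,
# assuming (hypothetically) that DCU-hour consumption scales linearly with
# the amount of data scanned. Rates and baseline numbers are illustrative.

def estimated_scan_cost(full_scan_dcu_hours: float,
                        rate: float = 0.089,       # premium, default rate
                        sampling_percent: float = 100.0,
                        row_fraction: float = 1.0,
                        column_fraction: float = 1.0) -> float:
    scanned = (full_scan_dcu_hours * sampling_percent / 100.0
               * row_fraction * column_fraction)
    return scanned * rate

full = estimated_scan_cost(40)                          # full scan
sampled = estimated_scan_cost(40, sampling_percent=10)  # 10% sample
print(round(full, 2), round(sampled, 3))  # → 3.56 0.356
```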

6. To filter aggregate charges, use the following labels available in billing export in BigQuery:

  • goog-dataplex-datascan-data-source-dataplex-entity
  • goog-dataplex-datascan-data-source-dataplex-lake
  • goog-dataplex-datascan-data-source-dataplex-zone
  • goog-dataplex-datascan-data-source-project
  • goog-dataplex-datascan-data-source-region
  • goog-dataplex-datascan-id
  • goog-dataplex-datascan-job-id

7. Data lineage:

  • The DCU-hour consumption is proportional to the processing involved in automatically parsing lineage.
  • To separate data lineage charges from other charges in the Knowledge Catalog premium processing SKU, on the Cloud Billing report, use the label goog-dataplex-workload-type with value LINEAGE.
  • Calling the Data Lineage API with an Origin sourceType value other than CUSTOM incurs additional costs.

Data lineage pricing example

User A enables data lineage to track lineage for BigQuery in their project, which is in the us-central1 location. During one month, data lineage consumes 100 DCU-hours of Knowledge Catalog premium processing and generates 1 GiB of data lineage metadata. The cost is:

  • Premium processing: 100 DCU-hours × $0.089 per DCU-hour = $8.90
  • Metadata storage: 1 GiB billed under the metadata storage SKU (rate not listed on this page)
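The processing portion of this example can be worked out from the premium pricing table above; the metadata storage rate is not listed on this page, so it is left symbolic:

```python
# Worked premium-processing cost for the example above. The metadata storage
# rate is not listed on this page, so only the processing portion is computed.

PREMIUM_RATE = 0.089  # USD per DCU-hour, us-central1 default rate
dcu_hours = 100       # consumed by data lineage during the month

processing_cost = dcu_hours * PREMIUM_RATE
print(f"Premium processing: ${processing_cost:.2f}")  # → Premium processing: $8.90
# Total monthly cost = $8.90 + (1 GiB × the metadata storage rate)
```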

What's next

Request a custom quote

With Google Cloud's pay-as-you-go pricing, you only pay for the services you use. Connect with our sales team to get a custom quote for your organization.