Create BigQuery subscriptions

This document describes how to create a BigQuery subscription. You can use the Google Cloud console, the Google Cloud CLI, the client libraries, or the Pub/Sub API to create a BigQuery subscription.

Before you begin

Before reading this document, ensure that you're familiar with how Pub/Sub and BigQuery work.

In addition to your familiarity with Pub/Sub and BigQuery, ensure that you meet the following prerequisites before you create a BigQuery subscription:

  • A BigQuery table exists. Alternatively, you can create one when you create the BigQuery subscription as described in the later sections of this document.

  • The schema of the Pub/Sub topic is compatible with the BigQuery table. If you specify an incompatible BigQuery table, you get a compatibility-related error message. For more information, see Schema compatibility.

Required roles and permissions

The following is a list of guidelines regarding roles and permissions:

  • To create a subscription, you must configure access control at the project level.

  • You also need resource-level permissions if your subscriptions and topics are in different projects, as discussed later in this section.

  • To create a BigQuery subscription, either the Pub/Sub service agent or a custom service account must have permission to write to the specific BigQuery table. For more information about how to grant these permissions, see the next section of this document.

  • You can configure a BigQuery subscription in a project to write to a BigQuery table in a different project.

To get the permissions that you need to create BigQuery subscriptions, ask your administrator to grant you the Pub/Sub Editor (roles/pubsub.editor) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.

This predefined role contains the permissions required to create BigQuery subscriptions. To see the exact permissions that are required, expand the Required permissions section:

Required permissions

The following permissions are required to create BigQuery subscriptions:

  • Pull from a subscription: pubsub.subscriptions.consume
  • Create a subscription: pubsub.subscriptions.create
  • Delete a subscription: pubsub.subscriptions.delete
  • Get a subscription: pubsub.subscriptions.get
  • List subscriptions: pubsub.subscriptions.list
  • Update a subscription: pubsub.subscriptions.update
  • Attach a subscription to a topic: pubsub.topics.attachSubscription
  • Get the IAM policy for a subscription: pubsub.subscriptions.getIamPolicy
  • Configure the IAM policy for a subscription: pubsub.subscriptions.setIamPolicy

You might also be able to get these permissions with custom roles or other predefined roles.

To let a principal in one project create a BigQuery subscription in another project, you must grant that principal the Pub/Sub Editor (roles/pubsub.editor) role in both projects. This provides the necessary permissions to create the new BigQuery subscription in the subscription project and to attach it to the topic in the topic project.
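For example, a minimal gcloud sketch of the two grants, assuming a hypothetical principal alice@example.com, a topic project named topic-project, and a subscription project named sub-project:

    gcloud projects add-iam-policy-binding topic-project \
        --member="user:alice@example.com" \
        --role="roles/pubsub.editor"

    gcloud projects add-iam-policy-binding sub-project \
        --member="user:alice@example.com" \
        --role="roles/pubsub.editor"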

Assign roles to service accounts

Some Google Cloud services have Google Cloud-managed service accounts that let the services access your resources. These service accounts are known as service agents. Pub/Sub creates and maintains a service agent for each project in the format service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com.

You can grant either the Pub/Sub service agent or a custom service account permission to write to the BigQuery table.

Granting permission to the Pub/Sub service agent means that any user who has permission to create a subscription in your project can write to the BigQuery table. If you want to provide more granular permission for writing to the BigQuery table, configure a custom service account instead.

For more information about BigQuery IAM, see BigQuery roles and permissions .

If you want to create a BigQuery subscription using the Pub/Sub service agent, then it must have permission to write to the specific BigQuery table and read the table metadata.

Grant the BigQuery Data Editor (roles/bigquery.dataEditor) role to the Pub/Sub service agent. You can grant the permission on an individual table or on the project as a whole.

Table

  1. In the Google Cloud console, go to BigQuery Studio.

    Go to BigQuery Studio

  2. In the Explorer pane, in the search box labeled Filter by name and labels, enter the name of the table and press Enter.

  3. Click the table to which you want to grant permission.

  4. For the table, select More actions > Share > Permissions.

    Alternatively, click the table and in the main page click Sharing > Permissions.

    The Share permissions window opens.

  5. Click Add principal.

  6. For Add principals, enter the name of your Pub/Sub service agent for the project containing the subscription. The format of the service agent is service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com. For example, for a project with the project number 112233445566, the service agent is service-112233445566@gcp-sa-pubsub.iam.gserviceaccount.com.

  7. In the Select a role drop-down, enter BigQuery, and select the BigQuery Data Editor role.

  8. Click Save.

Project

  1. In the Google Cloud console, go to the IAM page.

    Go to IAM

  2. Click Grant access.

  3. In the Add principals section, enter the name of your Pub/Sub service agent. The format of the service agent is service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com. For example, for a project with the project number 112233445566, the service agent is service-112233445566@gcp-sa-pubsub.iam.gserviceaccount.com.

  4. In the Assign roles section, click Add another role.

  5. In the Select a role drop-down, enter BigQuery, and select the BigQuery Data Editor role.

  6. Click Save.
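Alternatively, you can grant the same role at the project level with the gcloud CLI. A minimal sketch, with PROJECT_ID and PROJECT_NUMBER as placeholders:

    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com" \
        --role="roles/bigquery.dataEditor"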

If you want to use a custom service account for writing to a BigQuery table, then you must set the following permissions:

  • The custom service account must have permission to write to the specific BigQuery table and to read the table metadata.
  • The Pub/Sub service agent must have the iam.serviceAccounts.getAccessToken permission on the custom service account.
  • The user creating the subscription must have the iam.serviceAccounts.actAs permission on the custom service account.

Create the service account and grant permissions with the following steps:

  1. Create the custom service account. The service account must be in the same project as the subscription.

  2. Grant the BigQuery Data Editor (roles/bigquery.dataEditor) role to the custom service account.

    You can grant the service account permission on a single table in the project or on all tables in the project. To do so, follow the Table or Project steps described earlier in this section. In the procedure, replace the Pub/Sub service agent email address with the custom service account email address.

  3. Give the Pub/Sub service agent the iam.serviceAccounts.getAccessToken permission on the custom service account or on all service accounts in the project. You can grant this permission by giving the roles/iam.serviceAccountTokenCreator role to the Pub/Sub service agent.

    Choose the appropriate method based on your requirements.

Service account

  1. In the Google Cloud console, go to the Service accountspage.

    Go to Service accounts

  2. Enter the name of the custom service account in the Filter.

  3. Select the service account from the list.

  4. Click Principals with access.

  5. Click Grant access.

  6. In the Add principals section, enter the name of your Pub/Sub service agent for the project containing the subscription. The format of the service agent is service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com. For example, for a project with the project number 112233445566, the service agent is service-112233445566@gcp-sa-pubsub.iam.gserviceaccount.com.

  7. In the Select a role drop-down, enter Service Account, and select the Service Account Token Creator role.

  8. Click Save.

Project

  1. In the Google Cloud console, go to the IAM page.

    Go to IAM

  2. Click Grant access.

  3. In the Add principals section, enter the name of your custom service account.

  4. In the Assign roles section, click Add another role.

  5. In the Select a role drop-down, enter Service Account, and select the Service Account Token Creator role.

  6. Click Save.
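As a command-line alternative to either tab, the following sketch grants the Token Creator role to the Pub/Sub service agent on a single custom service account (CUSTOM_SERVICE_ACCOUNT_EMAIL and PROJECT_NUMBER are placeholders):

    gcloud iam service-accounts add-iam-policy-binding CUSTOM_SERVICE_ACCOUNT_EMAIL \
        --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-pubsub.iam.gserviceaccount.com" \
        --role="roles/iam.serviceAccountTokenCreator"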

If you created the custom service account, you should already have the necessary iam.serviceAccounts.actAs permission. If you need to grant the permission to someone else on the service account, follow these steps:

  1. In the Google Cloud console, go to the Service accountspage.

    Go to Service accounts

  2. Enter the name of the custom service account in the Filter.

  3. Select the service account from the list.

  4. Click Principals with access.

  5. Click Grant access.

  6. In the Add principals section, enter the name of the account to which you want to grant access.

  7. In the Select a role drop-down, enter Service Account, and select the Service Account User role.

  8. Additionally, if your BigQuery table is an Apache Iceberg table, grant the Pub/Sub service account the Storage Admin role (roles/storage.admin) to access the Cloud Storage bucket.

  9. Click Save.
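You can also grant the Service Account User role from the command line. A sketch with placeholder values for the service account and user emails:

    gcloud iam service-accounts add-iam-policy-binding CUSTOM_SERVICE_ACCOUNT_EMAIL \
        --member="user:USER_EMAIL" \
        --role="roles/iam.serviceAccountUser"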

BigQuery subscription properties

When you configure a BigQuery subscription, you can specify the following properties.

Common properties

Learn about the common subscription properties that you can set across all subscriptions.

Use topic schema

This option lets Pub/Sub use the schema of the Pub/Sub topic to which the subscription is attached. In addition, Pub/Sub writes the fields in messages to the corresponding columns in the BigQuery table.

When you use this option, remember to check the following additional requirements:

  • The fields in the topic schema and the BigQuery schema must have the same names and their types must be compatible with each other.

  • Any optional field in the topic schema must also be optional in the BigQuery schema.

  • Required fields in the topic schema don't need to be required in the BigQuery schema.

  • If there are BigQuery fields that are not present in the topic schema, these BigQuery fields must be in mode NULLABLE.

  • If the topic schema has additional fields that are not present in the BigQuery schema and these fields can be dropped, select the option Drop unknown fields.

  • You can select only one of the subscription properties: Use topic schema or Use table schema.

If you don't select the Use topic schema or Use table schema option, ensure that the BigQuery table has a column called data of type BYTES, STRING, or JSON. Pub/Sub writes the message to this BigQuery column.
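For example, you can create a minimal table that satisfies this requirement with the bq command-line tool. This is a sketch with placeholder IDs; the column type could also be BYTES or JSON:

    bq mk --table PROJECT_ID:DATASET_ID.TABLE_ID data:STRING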

You might not see changes to the Pub/Sub topic schema or BigQuery table schema take effect immediately with messages written to the BigQuery table. For example, if the Drop unknown fields option is enabled and a field is present in the Pub/Sub schema but not in the BigQuery schema, messages written to the BigQuery table might still not contain the field immediately after you add it to the BigQuery schema. Eventually, the schemas synchronize and subsequent messages include the field.

When you use the Use topic schema option for your BigQuery subscription, you can also take advantage of BigQuery change data capture (CDC). CDC updates your BigQuery tables by processing and applying changes to existing rows.

To learn more about this feature, see Stream table updates with change data capture .

To learn how to use this feature with BigQuery subscriptions, see BigQuery change data capture .

Use table schema

This option lets Pub/Sub use the schema of the BigQuery table to write the fields of a JSON message to the corresponding columns. When you use this option, remember to check the following additional requirements:

  • The name of each column in the BigQuery table must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_).

  • Published messages must be in JSON format.

  • The following JSON conversions are supported (see the example after this list):

    JSON type    BigQuery data type
    string       NUMERIC, BIGNUMERIC, DATE, TIME, DATETIME, or TIMESTAMP
    number       NUMERIC, BIGNUMERIC, DATE, TIME, DATETIME, or TIMESTAMP

    • When using number to DATE, DATETIME, TIME, or TIMESTAMP conversions, the number must adhere to the supported representations.
    • When using number to NUMERIC or BIGNUMERIC conversions, the precision and range of values are limited to those accepted by the IEEE 754 standard for floating-point arithmetic. If you require high precision or a wider range of values, use string to NUMERIC or BIGNUMERIC conversions instead.
    • When using string to NUMERIC or BIGNUMERIC conversions, Pub/Sub assumes that the string is a human-readable number (for example, "123.124"). If processing the string as a human-readable number fails, Pub/Sub treats the string as bytes encoded with the BigDecimalByteStringEncoder.
  • If the subscription's topic has a schema associated with it, then the message encoding property must be set to JSON.

  • If there are BigQuery fields that are not present in the messages, these BigQuery fields must be in mode NULLABLE.

  • If the messages have additional fields that are not present in the BigQuery schema and these fields can be dropped, select the option Drop unknown fields.

  • You can select only one of the subscription properties: Use topic schema or Use table schema.
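As an illustration of the conversions in the preceding table, the following sketch (with placeholder IDs and hypothetical field names) creates a table with a NUMERIC column and publishes a JSON message whose string value Pub/Sub can convert to that column:

    bq mk --table PROJECT_ID:DATASET_ID.TABLE_ID item:STRING,price:NUMERIC

    gcloud pubsub topics publish TOPIC_ID \
        --message='{"item": "widget", "price": "19.99"}'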

If you don't select the Use topic schema or Use table schema option, ensure that the BigQuery table has a column called data of type BYTES, STRING, or JSON. Pub/Sub writes the message to this BigQuery column.

You might not see changes to the BigQuery table schema take effect immediately with messages written to the BigQuery table. For example, if the Drop unknown fields option is enabled and a field is present in the messages but not in the BigQuery schema, messages written to the BigQuery table might still not contain the field immediately after you add it to the BigQuery schema. Eventually, the schema synchronizes and subsequent messages include the field.

When you use the Use table schema option for your BigQuery subscription, you can also take advantage of BigQuery change data capture (CDC). CDC updates your BigQuery tables by processing and applying changes to existing rows.

To learn more about this feature, see Stream table updates with change data capture .

To learn how to use this feature with BigQuery subscriptions, see BigQuery change data capture .

Drop unknown fields

This option is used with the Use topic schema or Use table schema option. When enabled, this option lets Pub/Sub drop any field that is present in the topic schema or message but not in the BigQuery schema. The fields that are not part of the BigQuery schema are dropped when writing the message to the BigQuery table.

Without Drop unknown fields set, messages with extra fields are not written to BigQuery and remain in the subscription backlog unless you configure a dead letter topic.
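For example, a sketch that attaches an existing dead letter topic to a BigQuery subscription with the gcloud CLI (the IDs are placeholders, and the dead letter topic must already exist):

    gcloud pubsub subscriptions update SUBSCRIPTION_ID \
        --dead-letter-topic=DEAD_LETTER_TOPIC_ID \
        --max-delivery-attempts=5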

The Drop unknown fields setting does not affect fields that are not defined in either the Pub/Sub topic schema or the BigQuery table schema. In this case, a valid Pub/Sub message is delivered to the subscription. However, because BigQuery does not have columns defined for these extra fields, the fields are dropped during the BigQuery writing process. To prevent this behavior, ensure that any field contained in the Pub/Sub message is also contained in the BigQuery table schema.

The behavior regarding extra fields can also depend on the specific schema type (Avro, Protocol Buffer) and encoding (JSON, Binary) used. For information about how these factors affect the handling of extra fields, see the documentation for your specific schema type and encoding.

Write metadata

This option lets Pub/Sub write the metadata of each message to additional columns in the BigQuery table. If you don't select this option, the metadata is not written to the BigQuery table.

If you select the Write metadata option, ensure that the BigQuery table has the fields described in the following table.

If you don't select the Write metadata option, then the destination BigQuery table only requires the data field unless use_topic_schema is true. If you select both the Write metadata and Use topic schema options, then the schema of the topic must not contain any fields with names that match those of the metadata parameters. This limitation includes camel case versions of these snake case parameters.

Parameters

subscription_name (STRING): The name of the subscription.

message_id (STRING): The ID of the message.

publish_time (TIMESTAMP): The time at which the message was published.

data (BYTES, STRING, or JSON): The message body. The data field is required for all destination BigQuery tables that don't select Use topic schema or Use table schema. If the field is of type JSON, then the message body must be valid JSON.

attributes (STRING or JSON): A JSON object containing all message attributes. It also contains additional fields that are part of the Pub/Sub message, including the ordering key, if present.
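As an illustration, the following bq sketch creates a destination table with all of the preceding metadata columns (the IDs are placeholders; data and attributes could also use other supported types):

    bq mk --table PROJECT_ID:DATASET_ID.TABLE_ID \
        subscription_name:STRING,message_id:STRING,publish_time:TIMESTAMP,data:STRING,attributes:STRING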

Create a BigQuery subscription

The following samples demonstrate how to create a subscription with BigQuery delivery.

Console

  1. In the Google Cloud console, go to the Subscriptions page.

    Go to Subscriptions

  2. Click Create subscription.

  3. For the Subscription ID field, enter a name. For information on how to name a subscription, see Guidelines to name a topic or a subscription.

  4. Choose or create a topic from the drop-down menu. The subscription receives messages from the topic.

  5. For Delivery type, select Write to BigQuery.

  6. Select the project for the BigQuery table.

  7. Select an existing dataset or create a new one. For information on how to create a dataset, see Creating datasets.

  8. Select an existing table or create a new one. For information on how to create a table, see Creating tables.

  9. We strongly recommend that you enable Dead lettering to handle message failures. For more information, see Dead letter topic.

  10. Click Create.

You can also create a subscription from the Topicspage. This shortcut is useful for associating topics with subscriptions.

  1. In the Google Cloud console, go to the Topics page.

    Go to Topics

  2. Click More actions next to the topic for which you want to create a subscription.

  3. From the context menu, select Create subscription.

  4. For Delivery type, select Write to BigQuery.

  5. Select the project for the BigQuery table.

  6. Select an existing dataset or create a new one. For information on how to create a dataset, see Creating datasets.

  7. Select an existing table or create a new one. For information on how to create a table, see Creating tables.

  8. We strongly recommend that you enable Dead lettering to handle message failures. For more information, see Dead letter topic.

  9. Click Create.

gcloud

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

  2. To create a Pub/Sub subscription, use the gcloud pubsub subscriptions create command:

    gcloud pubsub subscriptions create SUBSCRIPTION_ID \
        --topic=TOPIC_ID \
        --bigquery-table=PROJECT_ID.DATASET_ID.TABLE_ID

    If you want to use a custom service account, provide it as an additional argument:

    gcloud pubsub subscriptions create SUBSCRIPTION_ID \
        --topic=TOPIC_ID \
        --bigquery-table=PROJECT_ID.DATASET_ID.TABLE_ID \
        --bigquery-service-account-email=SERVICE_ACCOUNT_NAME

    Replace the following:

    • SUBSCRIPTION_ID: Specifies the ID of the subscription.
    • TOPIC_ID: Specifies the ID of the topic. The topic requires a schema.
    • PROJECT_ID: Specifies the ID of the project.
    • DATASET_ID: Specifies the ID of an existing dataset. To create a dataset, see Create datasets.
    • TABLE_ID: Specifies the ID of an existing table. The table requires a data field if your topic doesn't have a schema. To create a table, see Create an empty table with a schema definition.
    • SERVICE_ACCOUNT_NAME: Specifies the name of the service account to use to write to BigQuery.
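The subscription properties described earlier in this document map to optional flags on the same command. A sketch that enables the topic schema, metadata, and unknown-field options:

    gcloud pubsub subscriptions create SUBSCRIPTION_ID \
        --topic=TOPIC_ID \
        --bigquery-table=PROJECT_ID.DATASET_ID.TABLE_ID \
        --use-topic-schema \
        --write-metadata \
        --drop-unknown-fields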

C++

Before trying this sample, follow the C++ setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub C++ API reference documentation .

namespace pubsub = ::google::cloud::pubsub;
namespace pubsub_admin = ::google::cloud::pubsub_admin;
[](pubsub_admin::SubscriptionAdminClient client, std::string const& project_id,
   std::string const& topic_id, std::string const& subscription_id,
   std::string const& table_id) {
  google::pubsub::v1::Subscription request;
  request.set_name(
      pubsub::Subscription(project_id, subscription_id).FullName());
  request.set_topic(pubsub::Topic(project_id, topic_id).FullName());
  request.mutable_bigquery_config()->set_table(table_id);
  auto sub = client.CreateSubscription(request);
  if (!sub) {
    if (sub.status().code() == google::cloud::StatusCode::kAlreadyExists) {
      std::cout << "The subscription already exists\n";
      return;
    }
    throw std::move(sub).status();
  }
  std::cout << "The subscription was successfully created: "
            << sub->DebugString() << "\n";
}

C#

Before trying this sample, follow the C# setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub C# API reference documentation .

using Google.Cloud.PubSub.V1;

public class CreateBigQuerySubscriptionSample
{
    public Subscription CreateBigQuerySubscription(string projectId, string topicId, string subscriptionId, string bigqueryTableId)
    {
        SubscriberServiceApiClient subscriber = SubscriberServiceApiClient.Create();
        TopicName topicName = TopicName.FromProjectTopic(projectId, topicId);
        SubscriptionName subscriptionName = SubscriptionName.FromProjectSubscription(projectId, subscriptionId);

        var subscriptionRequest = new Subscription
        {
            SubscriptionName = subscriptionName,
            TopicAsTopicName = topicName,
            BigqueryConfig = new BigQueryConfig
            {
                Table = bigqueryTableId
            }
        };
        var subscription = subscriber.CreateSubscription(subscriptionRequest);
        return subscription;
    }
}
 

Go

The following sample uses the latest major version of the Go Pub/Sub client library (v2). If you are still using the v1 library, see the migration guide to v2. To see a list of v1 code samples, see the deprecated code samples.

Before trying this sample, follow the Go setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Go API reference documentation .

import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/pubsub/v2"
	"cloud.google.com/go/pubsub/v2/apiv1/pubsubpb"
)

// createBigQuerySubscription creates a Pub/Sub subscription that exports messages to BigQuery.
func createBigQuerySubscription(w io.Writer, projectID, topic, subscription, table string) error {
	// projectID := "my-project"
	// topic := "projects/my-project-id/topics/my-topic"
	// subscription := "projects/my-project/subscriptions/my-sub"
	// table := "my-project-id.dataset_id.table_id"
	ctx := context.Background()
	client, err := pubsub.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("pubsub.NewClient: %w", err)
	}
	defer client.Close()

	sub, err := client.SubscriptionAdminClient.CreateSubscription(ctx, &pubsubpb.Subscription{
		Name:  subscription,
		Topic: topic,
		BigqueryConfig: &pubsubpb.BigQueryConfig{
			Table:         table,
			WriteMetadata: true,
		},
	})
	if err != nil {
		return fmt.Errorf("failed to create subscription: %w", err)
	}
	fmt.Fprintf(w, "Created BigQuery subscription: %v\n", sub)
	return nil
}
 

Java

Before trying this sample, follow the Java setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Java API reference documentation .

import com.google.cloud.pubsub.v1.SubscriptionAdminClient;
import com.google.pubsub.v1.BigQueryConfig;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.ProjectTopicName;
import com.google.pubsub.v1.Subscription;
import java.io.IOException;

public class CreateBigQuerySubscriptionExample {
  public static void main(String... args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String topicId = "your-topic-id";
    String subscriptionId = "your-subscription-id";
    String bigqueryTableId = "your-project.your-dataset.your-table";

    createBigQuerySubscription(projectId, topicId, subscriptionId, bigqueryTableId);
  }

  public static void createBigQuerySubscription(
      String projectId, String topicId, String subscriptionId, String bigqueryTableId)
      throws IOException {
    try (SubscriptionAdminClient subscriptionAdminClient = SubscriptionAdminClient.create()) {
      ProjectTopicName topicName = ProjectTopicName.of(projectId, topicId);
      ProjectSubscriptionName subscriptionName =
          ProjectSubscriptionName.of(projectId, subscriptionId);
      BigQueryConfig bigqueryConfig =
          BigQueryConfig.newBuilder().setTable(bigqueryTableId).setWriteMetadata(true).build();
      Subscription subscription =
          subscriptionAdminClient.createSubscription(
              Subscription.newBuilder()
                  .setName(subscriptionName.toString())
                  .setTopic(topicName.toString())
                  .setBigqueryConfig(bigqueryConfig)
                  .build());
      System.out.println("Created a BigQuery subscription: " + subscription.getAllFields());
    }
  }
}
 

Node.js

Before trying this sample, follow the Node.js setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Node.js API reference documentation .

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const topicNameOrId = 'YOUR_TOPIC_NAME_OR_ID';
// const subscriptionNameOrId = 'YOUR_SUBSCRIPTION_NAME_OR_ID';
// const bigqueryTableId = 'YOUR_TABLE_ID';

// Imports the Google Cloud client library
const {PubSub} = require('@google-cloud/pubsub');

// Creates a client; cache this for further use
const pubSubClient = new PubSub();

async function createBigQuerySubscription(
  topicNameOrId,
  subscriptionNameOrId,
  bigqueryTableId,
) {
  const options = {
    bigqueryConfig: {
      table: bigqueryTableId,
      writeMetadata: true,
    },
  };

  await pubSubClient
    .topic(topicNameOrId)
    .createSubscription(subscriptionNameOrId, options);

  console.log(`Subscription ${subscriptionNameOrId} created.`);
}
 

Node.js (TypeScript)

Before trying this sample, follow the Node.js setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Node.js API reference documentation .

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// const topicNameOrId = 'YOUR_TOPIC_NAME_OR_ID';
// const subscriptionNameOrId = 'YOUR_SUBSCRIPTION_NAME_OR_ID';
// const bigqueryTableId = 'YOUR_TABLE_ID';

// Imports the Google Cloud client library
import {PubSub, CreateSubscriptionOptions} from '@google-cloud/pubsub';

// Creates a client; cache this for further use
const pubSubClient = new PubSub();

async function createBigQuerySubscription(
  topicNameOrId: string,
  subscriptionNameOrId: string,
  bigqueryTableId: string,
) {
  const options: CreateSubscriptionOptions = {
    bigqueryConfig: {
      table: bigqueryTableId,
      writeMetadata: true,
    },
  };

  await pubSubClient
    .topic(topicNameOrId)
    .createSubscription(subscriptionNameOrId, options);

  console.log(`Subscription ${subscriptionNameOrId} created.`);
}
 

PHP

Before trying this sample, follow the PHP setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub PHP API reference documentation .

use Google\Cloud\PubSub\PubSubClient;
use Google\Cloud\PubSub\V1\BigQueryConfig;

/**
 * Creates a Pub/Sub BigQuery subscription.
 *
 * @param string $projectId  The Google project ID.
 * @param string $topicName  The Pub/Sub topic name.
 * @param string $subscriptionName  The Pub/Sub subscription name.
 * @param string $table      The BigQuery table to which to write.
 */
function create_bigquery_subscription($projectId, $topicName, $subscriptionName, $table)
{
    $pubsub = new PubSubClient([
        'projectId' => $projectId,
    ]);
    $topic = $pubsub->topic($topicName);
    $subscription = $topic->subscription($subscriptionName);
    $config = new BigQueryConfig(['table' => $table]);
    $subscription->create([
        'bigqueryConfig' => $config
    ]);

    printf('Subscription created: %s' . PHP_EOL, $subscription->name());
}
 

Python

Before trying this sample, follow the Python setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Python API reference documentation .

from google.cloud import pubsub_v1

# TODO(developer)
# project_id = "your-project-id"
# topic_id = "your-topic-id"
# subscription_id = "your-subscription-id"
# bigquery_table_id = "your-project.your-dataset.your-table"

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic_path = publisher.topic_path(project_id, topic_id)
subscription_path = subscriber.subscription_path(project_id, subscription_id)

bigquery_config = pubsub_v1.types.BigQueryConfig(
    table=bigquery_table_id, write_metadata=True
)

# Wrap the subscriber in a 'with' block to automatically call close() to
# close the underlying gRPC channel when done.
with subscriber:
    subscription = subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "bigquery_config": bigquery_config,
        }
    )

print(f"BigQuery subscription created: {subscription}.")
print(f"Table for subscription is: {bigquery_table_id}")
 

Ruby

The following sample uses the Ruby Pub/Sub client library v3. If you are still using the v2 library, see the migration guide to v3. To see a list of Ruby v2 code samples, see the deprecated code samples.

Before trying this sample, follow the Ruby setup instructions in Quickstart: Using Client Libraries . For more information, see the Pub/Sub Ruby API reference documentation .

# project_id = "your-project-id"
# topic_id = "your-topic-id"
# subscription_id = "your-subscription-id"
# bigquery_table_id = "my-project:dataset-id.table-id"

pubsub = Google::Cloud::PubSub.new project_id: project_id

subscription_admin = pubsub.subscription_admin

subscription = subscription_admin.create_subscription \
  name: pubsub.subscription_path(subscription_id),
  topic: pubsub.topic_path(topic_id),
  bigquery_config: {
    table: bigquery_table_id,
    write_metadata: true
  }

puts "BigQuery subscription created: #{subscription_id}."
puts "Table for subscription is: #{bigquery_table_id}"
 

Monitor a BigQuery subscription

Cloud Monitoring provides a number of metrics to monitor subscriptions.

For a list of all the available metrics related to Pub/Sub and their descriptions, see the Monitoring documentation for Pub/Sub.

You can also monitor subscriptions from within Pub/Sub.

What's next
