This document helps you choose the appropriate type of Pub/Sub subscription suited to your business requirements.
Pub/Sub offers the following types of subscriptions:
- Pull subscriptions use a subscriber client to request messages from the Pub/Sub server.
- Push subscriptions use the Pub/Sub server to initiate requests to your subscriber application to deliver messages.
- Export subscriptions export your messages directly to a Google Cloud resource. These subscriptions include the following:
  - BigQuery subscriptions export data to a BigQuery table.
  - Cloud Storage subscriptions export data to a Cloud Storage bucket.
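As a quick sketch, each subscription type can be created with the gcloud CLI. All topic, subscription, endpoint, table, and bucket names below are placeholders, not values from this document:

```shell
# Create a topic (placeholder name).
gcloud pubsub topics create my-topic

# Pull subscription: a subscriber client requests messages from the server.
gcloud pubsub subscriptions create my-pull-sub \
    --topic=my-topic

# Push subscription: Pub/Sub sends messages to your HTTPS endpoint.
gcloud pubsub subscriptions create my-push-sub \
    --topic=my-topic \
    --push-endpoint=https://example.com/push

# BigQuery export subscription: messages are written to a table.
gcloud pubsub subscriptions create my-bq-sub \
    --topic=my-topic \
    --bigquery-table=my-project:my_dataset.my_table

# Cloud Storage export subscription: messages are written to a bucket.
gcloud pubsub subscriptions create my-gcs-sub \
    --topic=my-topic \
    --cloud-storage-bucket=my-bucket
```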
-
Pub/Sub subscription comparison
The following comparison offers some guidance in choosing the appropriate delivery mechanism for your application:

Pull subscriptions
- Use case:
  - Large volume of messages (GBs per second).
  - Efficiency and throughput of message processing is critical.
  - Environments where it's not feasible to set up a public HTTPS endpoint with a non-self-signed SSL certificate.

Push subscriptions
- Use case:
  - Multiple topics that must be processed by the same webhook.
  - App Engine Standard or Cloud Run functions subscribers.
  - Environments where it's not feasible to set up Google Cloud dependencies such as credentials and the client library.
- Endpoints: An HTTPS server with a non-self-signed certificate accessible on the public web. The receiving endpoint might be decoupled from the Pub/Sub subscription, so that messages from multiple subscriptions are sent to a single endpoint. Push endpoints can be load balancers.
- Configuration: No configuration is necessary for App Engine apps in the same project as the subscriber. Verification of push endpoints is not required in the Google Cloud console. Endpoints must be reachable using DNS names and have SSL certificates installed.

Export subscriptions
- Use case:
  - Large volume of messages that can scale up to multiple millions of messages per second.
  - Sending messages directly to a Google Cloud resource without any additional processing.
- Load balancing: The Pub/Sub service automatically balances the load. No configuration is necessary.
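To illustrate the push-endpoint configuration, here is a hedged gcloud sketch; the subscription, topic, endpoint, and service account names are placeholders. A subscription can also be switched between pull and push delivery by modifying its push configuration:

```shell
# Create a push subscription whose requests are authenticated with a
# service account. The endpoint must be a public HTTPS URL with a
# non-self-signed certificate.
gcloud pubsub subscriptions create my-push-sub \
    --topic=my-topic \
    --push-endpoint=https://example.com/push \
    --push-auth-service-account=pusher@my-project.iam.gserviceaccount.com

# Convert an existing pull subscription to push by setting an endpoint...
gcloud pubsub subscriptions modify-push-config my-pull-sub \
    --push-endpoint=https://example.com/push

# ...or back to pull by clearing the endpoint.
gcloud pubsub subscriptions modify-push-config my-pull-sub \
    --push-endpoint=""
```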
When to use an export subscription
Without an export subscription, you need a pull or push subscription and a subscriber (such as Dataflow) to read messages and write them to a Google Cloud resource. The overhead of running a Dataflow job is not necessary when messages don't require additional processing before being stored.
Export subscriptions have the following advantages:
- Simple deployment. You can set up an export subscription through a single workflow in the console, Google Cloud CLI, client library, or Pub/Sub API.
- Low costs. Reduces the additional cost and latency of similar Pub/Sub pipelines that include Dataflow jobs. This cost optimization is useful for messaging systems that don't require additional processing before storage.
- Minimal monitoring. Export subscriptions are part of the multi-tenant Pub/Sub service and don't require you to run separate monitoring jobs.
- Flexibility. A BigQuery subscription can use the schema of the topic to which it is attached, which is not available with the basic Dataflow template for writing from Pub/Sub to BigQuery. Similarly, a Cloud Storage subscription offers configurable file batching options based on file size and elapsed time, which are not configurable in the basic Dataflow template for writing from Pub/Sub to Cloud Storage.
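As an illustration of these options, the topic-schema and file-batching behaviors map to flags on subscription creation. The resource names below are placeholders, and the flags assume a current gcloud version:

```shell
# BigQuery subscription that writes message fields to table columns
# according to the topic's schema.
gcloud pubsub subscriptions create my-bq-sub \
    --topic=my-topic \
    --bigquery-table=my-project:my_dataset.my_table \
    --use-topic-schema

# Cloud Storage subscription with batching by file size and elapsed time:
# a new output file is started every 10 MB or every 5 minutes, whichever
# comes first.
gcloud pubsub subscriptions create my-gcs-sub \
    --topic=my-topic \
    --cloud-storage-bucket=my-bucket \
    --cloud-storage-max-bytes=10MB \
    --cloud-storage-max-duration=5m
```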
However, a Dataflow pipeline is still recommended for Pub/Sub systems where some data transformation is required before the data is stored in a Google Cloud resource such as a BigQuery table or Cloud Storage bucket.
To learn how to stream data from Pub/Sub to BigQuery with transformation by using Dataflow, see Stream from Pub/Sub to BigQuery.
To learn how to stream data from Pub/Sub to Cloud Storage with transformation by using Dataflow, see Stream messages from Pub/Sub by using Dataflow.
What's next
Understand the workflow for each subscription type:

