You can convert an import topic to a standard topic, or a standard topic to an import topic.
Convert an import topic to a standard topic
To convert an import topic to a standard topic, clear the ingestion settings. Perform the following steps:
Console

1. In the Google Cloud console, go to the Topics page.
2. Click the import topic.
3. In the topic details page, click Edit.
4. Clear the option Enable ingestion.
5. Click Update.
gcloud

1. In the Google Cloud console, activate Cloud Shell.

   At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

2. Run the gcloud pubsub topics update command:

   gcloud pubsub topics update TOPIC_ID \
       --clear-ingestion-data-source-settings

   Replace TOPIC_ID with the topic ID.
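If you automate this step, building the command as an argv list avoids shell-quoting pitfalls. A minimal sketch (the helper name is ours; actually running the command still requires an authenticated gcloud CLI installation):

```python
def build_clear_ingestion_cmd(topic_id: str) -> list[str]:
    """Assemble the argv for the gcloud command shown above.

    This only builds the argument list; invoking it (for example with
    subprocess.run) requires gcloud to be installed and authenticated.
    """
    return [
        "gcloud", "pubsub", "topics", "update", topic_id,
        "--clear-ingestion-data-source-settings",
    ]
```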
Convert a standard topic to an Amazon Kinesis Data Streams import topic
To convert a standard topic to an Amazon Kinesis Data Streams import topic, first check that you meet all the prerequisites.
Console

1. In the Google Cloud console, go to the Topics page.
2. Click the topic that you want to convert to an import topic.
3. In the topic details page, click Edit.
4. Select the option Enable ingestion.
5. For ingestion source, select Amazon Kinesis Data Streams.
6. Enter the following details:

   - Kinesis Stream ARN: The ARN for the Kinesis data stream that you plan to ingest into Pub/Sub. The ARN format is as follows:

     arn:${Partition}:kinesis:${Region}:${Account}:stream/${StreamName}

   - Kinesis Consumer ARN: The ARN of the consumer resource that is registered to the AWS Kinesis data stream. The ARN format is as follows:

     arn:${Partition}:kinesis:${Region}:${Account}:${StreamType}/${StreamName}/consumer/${ConsumerName}:${ConsumerCreationTimestamp}

   - AWS Role ARN: The ARN of the AWS role. The ARN format of the role is as follows:

     arn:aws:iam::${Account}:role/${RoleName}

   - Service account: The service account that you created.

7. Click Update.
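As a quick local sanity check before entering these values, the ARN shapes described above can be matched with simple regular expressions. This is an illustrative sketch only, not an official AWS validator (the patterns, including the assumption that the stream type is `stream`, are ours):

```python
import re

# Structural checks for the ARN formats described above. These are
# illustrative patterns, not AWS's authoritative validation rules.
STREAM_ARN = re.compile(
    r"^arn:[^:]+:kinesis:[^:]+:\d{12}:stream/[a-zA-Z0-9_.-]+$")
CONSUMER_ARN = re.compile(
    r"^arn:[^:]+:kinesis:[^:]+:\d{12}:[^:/]+/[a-zA-Z0-9_.-]+"
    r"/consumer/[a-zA-Z0-9_.-]+:\d+$")
ROLE_ARN = re.compile(r"^arn:aws:iam::\d{12}:role/[\w+=,.@/-]+$")


def looks_like_stream_arn(arn: str) -> bool:
    return STREAM_ARN.match(arn) is not None


def looks_like_consumer_arn(arn: str) -> bool:
    return CONSUMER_ARN.match(arn) is not None


def looks_like_role_arn(arn: str) -> bool:
    return ROLE_ARN.match(arn) is not None
```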
gcloud

1. In the Google Cloud console, activate Cloud Shell.

   At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

2. Run the gcloud pubsub topics update command with all the flags mentioned in the following sample:

   gcloud pubsub topics update TOPIC_ID \
       --kinesis-ingestion-stream-arn KINESIS_STREAM_ARN \
       --kinesis-ingestion-consumer-arn KINESIS_CONSUMER_ARN \
       --kinesis-ingestion-role-arn KINESIS_ROLE_ARN \
       --kinesis-ingestion-service-account PUBSUB_SERVICE_ACCOUNT

   Replace the following:

   - TOPIC_ID: The topic ID or name. This field cannot be updated.
   - KINESIS_STREAM_ARN: The ARN for the Kinesis data stream that you plan to ingest into Pub/Sub. The ARN format is as follows:

     arn:${Partition}:kinesis:${Region}:${Account}:stream/${StreamName}

   - KINESIS_CONSUMER_ARN: The ARN of the consumer resource that is registered to the AWS Kinesis data stream. The ARN format is as follows:

     arn:${Partition}:kinesis:${Region}:${Account}:${StreamType}/${StreamName}/consumer/${ConsumerName}:${ConsumerCreationTimestamp}

   - KINESIS_ROLE_ARN: The ARN of the AWS role. The ARN format of the role is as follows:

     arn:aws:iam::${Account}:role/${RoleName}

   - PUBSUB_SERVICE_ACCOUNT: The service account that you created.
Go

The following sample uses the major version of the Go Pub/Sub client library (v2). If you are still using the v1 library, see the migration guide to v2. To see a list of v1 code samples, see the deprecated code samples.

Before trying this sample, follow the Go setup instructions in Quickstart: Using Client Libraries. For more information, see the Pub/Sub Go API reference documentation.

Java

Before trying this sample, follow the Java setup instructions in Quickstart: Using Client Libraries. For more information, see the Pub/Sub Java API reference documentation.

Node.js

Before trying this sample, follow the Node.js setup instructions in Quickstart: Using Client Libraries. For more information, see the Pub/Sub Node.js API reference documentation.

Python

Before trying this sample, follow the Python setup instructions in Quickstart: Using Client Libraries. For more information, see the Pub/Sub Python API reference documentation.

C++

Before trying this sample, follow the C++ setup instructions in Quickstart: Using Client Libraries. For more information, see the Pub/Sub C++ API reference documentation.

For more information about ARNs, see Amazon Resource Names (ARNs) and IAM Identifiers.
Convert a standard topic to a Cloud Storage import topic
To convert a standard topic to a Cloud Storage import topic, first check that you meet all the prerequisites.
Console

1. In the Google Cloud console, go to the Topics page.
2. Click the topic that you want to convert to a Cloud Storage import topic.
3. In the topic details page, click Edit.
4. Select the option Enable ingestion.
5. For ingestion source, select Google Cloud Storage.
6. For the Cloud Storage bucket, click Browse.

   The Select bucket page opens. Select one of the following options:

   - Select an existing bucket from any appropriate project.
   - Click the create icon and follow the instructions on the screen to create a new bucket. After you create the bucket, select the bucket for the Cloud Storage import topic.

7. When you specify the bucket, Pub/Sub checks for the appropriate permissions on the bucket for the Pub/Sub service account. If there are permissions issues, you see an error message related to the permissions.

   If you get permissions issues, click Set permissions. For more information, see Grant Cloud Storage permissions to the Pub/Sub service account.

8. For Object format, select Text, Avro, or Pub/Sub Avro.

   If you select Text, you can optionally specify a Delimiter with which to split objects into messages.

   For more information about these options, see Input format.

9. Optional: You can specify a Minimum object creation time for your topic. If set, only objects created after the minimum object creation time are ingested.

   For more information, see Minimum object creation time.

10. You must specify a Glob pattern. To ingest all objects in the bucket, use ** as the glob pattern. Only objects that match the given pattern are ingested.

    For more information, see Match a glob pattern.

11. Retain the other default settings.
12. Click Update topic.
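The distinction between the wildcards matters when choosing a glob pattern: ** crosses / boundaries in object names, while a single * does not. The following simplified sketch illustrates only that distinction; it is not the full Cloud Storage match-glob grammar:

```python
import re


def glob_to_regex(glob: str) -> re.Pattern:
    """Translate a simplified glob into an anchored regex.

    Illustrative only: handles just '**' (matches any characters,
    including '/') and '*' (matches any characters except '/').
    The real Cloud Storage match-glob syntax supports more.
    """
    out = []
    i = 0
    while i < len(glob):
        if glob.startswith("**", i):
            out.append(".*")
            i += 2
        elif glob[i] == "*":
            out.append("[^/]*")
            i += 1
        else:
            out.append(re.escape(glob[i]))
            i += 1
    return re.compile("^" + "".join(out) + "$")
```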
gcloud

1. In the Google Cloud console, activate Cloud Shell.

   At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

2. To avoid losing your settings for the import topic, make sure to include all of them every time you update the topic. If you leave something out, Pub/Sub resets the setting to its original default value.

   Run the gcloud pubsub topics update command with all the flags mentioned in the following sample:

   gcloud pubsub topics update TOPIC_ID \
       --cloud-storage-ingestion-bucket=BUCKET_NAME \
       --cloud-storage-ingestion-input-format=INPUT_FORMAT \
       --cloud-storage-ingestion-text-delimiter=TEXT_DELIMITER \
       --cloud-storage-ingestion-minimum-object-create-time=MINIMUM_OBJECT_CREATE_TIME \
       --cloud-storage-ingestion-match-glob=MATCH_GLOB

   Replace the following:

   - TOPIC_ID: The topic ID or name. This field cannot be updated.
   - BUCKET_NAME: The name of an existing bucket, for example, prod_bucket. The bucket name must not include the project ID. To create a bucket, see Create buckets.
   - INPUT_FORMAT: The format of the objects that are ingested. This can be text, avro, or pubsub_avro. For more information about these options, see Input format.
   - TEXT_DELIMITER: The delimiter with which to split text objects into Pub/Sub messages. This must be a single character and must only be set when INPUT_FORMAT is text. It defaults to the newline character (\n).

     When using the gcloud CLI to specify the delimiter, pay close attention to the handling of special characters like the newline \n. Use the format '\n' to ensure the delimiter is correctly interpreted. Simply using \n without quotes or escaping results in a delimiter of "n".

   - MINIMUM_OBJECT_CREATE_TIME: The minimum time at which an object must have been created in order for it to be ingested. This should be in UTC in the format YYYY-MM-DDThh:mm:ssZ. For example, 2024-10-14T08:30:30Z.

     Any date, past or future, from 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z inclusive, is valid.

   - MATCH_GLOB: The glob pattern that an object must match in order to be ingested. When you use the gcloud CLI, a match glob with * characters must have the * characters escaped, as in \*\*.txt, or the whole match glob must be in quotes: "**.txt" or '**.txt'. For information about supported syntax for glob patterns, see the Cloud Storage documentation.
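To preview locally how a text object would be split into message payloads, and to validate the timestamp format before passing it to the flag, consider this small sketch. The splitting behavior shown (dropping empty chunks) is illustrative; the service's exact edge-case handling, such as trailing delimiters, may differ:

```python
from datetime import datetime, timezone


def split_text_object(data: bytes, delimiter: bytes = b"\n") -> list[bytes]:
    """Split a text object's contents into per-message payloads.

    Illustrative only: mirrors the single-character delimiter splitting
    described above, dropping empty chunks. The service's actual
    edge-case behavior may differ.
    """
    assert len(delimiter) == 1, "delimiter must be a single character"
    return [chunk for chunk in data.split(delimiter) if chunk]


def parse_min_create_time(value: str) -> datetime:
    """Validate the YYYY-MM-DDThh:mm:ssZ format the flag expects."""
    parsed = datetime.strptime(value, "%Y-%m-%dT%H:%M:%SZ")
    return parsed.replace(tzinfo=timezone.utc)
```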