Reference documentation and code samples for the Google Cloud Retail V2 Client class BigQuerySource.
The BigQuery source to import data from.
Generated from protobuf message google.cloud.retail.v2.BigQuerySource
Methods
__construct
Constructor.
data
array
Optional. Data for populating the Message object.
↳ partition_date
Google\Type\Date
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format. Only supported in ImportProductsRequest.
↳ project_id
string
The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
↳ dataset_id
string
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
↳ table_id
string
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
↳ gcs_staging_dir
string
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
↳ data_schema
string
The schema to use when parsing the data from the source. Supported values for product imports:
* product (default): One JSON Product per line. Each product must have a valid Product.id.
* product_merchant_center: See Importing catalog data from Merchant Center.

Supported values for user event imports:
* user_event (default): One JSON UserEvent per line.
* user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
* user_event_ga4: The schema is available here: https://support.google.com/analytics/answer/7029846.

Supported values for auto-completion imports:
* suggestions (default): One JSON completion suggestion per line.
* denylist: One JSON deny suggestion per line.
* allowlist: One JSON allow suggestion per line.
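A minimal sketch of building the message through the constructor's data array (the project, dataset, table, and bucket names below are hypothetical, and the snippet assumes the google/cloud-retail package is installed):

```php
use Google\Cloud\Retail\V2\BigQuerySource;
use Google\Type\Date;

// Populate every field in one pass via the data array.
$source = new BigQuerySource([
    'project_id'      => 'my-project',             // hypothetical project ID
    'dataset_id'      => 'retail_products',        // hypothetical dataset
    'table_id'        => 'products',               // hypothetical table
    'gcs_staging_dir' => 'gs://my-bucket/staging', // hypothetical bucket
    'data_schema'     => 'product',
    'partition_date'  => new Date([
        'year'  => 2023,
        'month' => 6,
        'day'   => 1,
    ]),
]);
```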
getPartitionDate
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
Only supported in ImportProductsRequest.
Generated from protobuf field .google.type.Date partition_date = 6;
hasPartitionDate
setPartitionDate
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
Only supported in ImportProductsRequest.
Generated from protobuf field .google.type.Date partition_date = 6;
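As a sketch, the partition date can be set and read back with the accessor trio above (the date values are illustrative):

```php
use Google\Cloud\Retail\V2\BigQuerySource;
use Google\Type\Date;

$source = new BigQuerySource();
$source->setPartitionDate(new Date([
    'year'  => 2023,
    'month' => 6,
    'day'   => 1,
]));

// hasPartitionDate() reports whether the field is populated.
if ($source->hasPartitionDate()) {
    $date = $source->getPartitionDate();
    printf("%04d-%02d-%02d\n", $date->getYear(), $date->getMonth(), $date->getDay());
}
```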
$this
getProjectId
The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
Generated from protobuf field string project_id = 5;
string
setProjectId
The project ID (can be project # or ID) that the BigQuery source is in with a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
Generated from protobuf field string project_id = 5;
var
string
$this
getDatasetId
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
Generated from protobuf field string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];
string
setDatasetId
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
Generated from protobuf field string dataset_id = 1 [(.google.api.field_behavior) = REQUIRED];
var
string
$this
getTableId
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
Generated from protobuf field string table_id = 2 [(.google.api.field_behavior) = REQUIRED];
string
setTableId
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
Generated from protobuf field string table_id = 2 [(.google.api.field_behavior) = REQUIRED];
var
string
$this
getGcsStagingDir
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
Generated from protobuf field string gcs_staging_dir = 3;
string
setGcsStagingDir
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this field to have BigQuery export to a specific Cloud Storage directory.
Generated from protobuf field string gcs_staging_dir = 3;
var
string
$this
getDataSchema
The schema to use when parsing the data from the source.
Supported values for product imports:
* product (default): One JSON Product per line. Each product must have a valid Product.id.
* product_merchant_center: See Importing catalog data from Merchant Center.

Supported values for user event imports:
* user_event (default): One JSON UserEvent per line.
* user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
* user_event_ga4: The schema is available here: https://support.google.com/analytics/answer/7029846.

Supported values for auto-completion imports:
* suggestions (default): One JSON completion suggestion per line.
* denylist: One JSON deny suggestion per line.
* allowlist: One JSON allow suggestion per line.
Generated from protobuf field string data_schema = 4;
string
setDataSchema
The schema to use when parsing the data from the source.
Supported values for product imports:
* product (default): One JSON Product per line. Each product must have a valid Product.id.
* product_merchant_center: See Importing catalog data from Merchant Center.

Supported values for user event imports:
* user_event (default): One JSON UserEvent per line.
* user_event_ga360: The schema is available here: https://support.google.com/analytics/answer/3437719.
* user_event_ga4: The schema is available here: https://support.google.com/analytics/answer/7029846.

Supported values for auto-completion imports:
* suggestions (default): One JSON completion suggestion per line.
* denylist: One JSON deny suggestion per line.
* allowlist: One JSON allow suggestion per line.
Generated from protobuf field string data_schema = 4;
var
string
$this
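Because every setter returns $this, calls can be chained; a short sketch with hypothetical dataset and table names:

```php
use Google\Cloud\Retail\V2\BigQuerySource;

$source = (new BigQuerySource())
    ->setDatasetId('retail_events')   // hypothetical dataset
    ->setTableId('user_events')       // hypothetical table
    ->setDataSchema('user_event');

echo $source->getDataSchema(); // "user_event"
```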
getPartition
Returns the name of the currently set field in the partition oneof, or an empty string if none is set.
string