The BigQuery source to import data from.
JSON representation

{
  "projectId": string,
  "datasetId": string,
  "tableId": string,
  "gcsStagingDir": string,
  "dataSchema": string,

  // Union field partition can be only one of the following:
  "partitionDate": {
    object (Date)
  }
}
projectId
string
The project ID or the project number that contains the BigQuery source. Has a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
datasetId
string
Required. The BigQuery dataset to copy the data from, with a length limit of 1,024 characters.
tableId
string
Required. The BigQuery table to copy the data from, with a length limit of 1,024 characters.
gcsStagingDir
string
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Can be specified if you want the BigQuery export to go to a specific Cloud Storage directory.
dataSchema
string
The schema to use when parsing the data from the source.
Supported values for user event imports:

- user_event (default): One UserEvent per row.

Supported values for document imports:

- document (default): One Document format per row. Each document must have a valid Document.id and one of Document.json_data or Document.struct_data.
- custom: One custom data per row in arbitrary format that conforms to the defined Schema of the data store. This can only be used by the GENERIC Data Store vertical.
Union field partition. BigQuery table partition info. Leave this empty if the BigQuery table is not partitioned. partition can be only one of the following:

partitionDate

object (Date)
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
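Putting the fields together, a minimal BigQuerySource payload might look like the sketch below. The project, dataset, table, and bucket names are hypothetical; the keys follow the JSON representation at the top of this page.

```python
import json

# A sketch of a BigQuerySource payload (assumed resource names).
bigquery_source = {
    "projectId": "my-gcp-project",              # optional; inherited from the parent request if omitted
    "datasetId": "my_dataset",                  # required, <= 1,024 characters
    "tableId": "my_table",                      # required, <= 1,024 characters
    "gcsStagingDir": "gs://my-bucket/staging",  # optional intermediate Cloud Storage directory
    "dataSchema": "document",                   # "document" (default) or "custom" for document imports
    # Union field partition: set only if the BigQuery table is partitioned.
    "partitionDate": {"year": 2024, "month": 1, "day": 15},
}

print(json.dumps(bigquery_source, indent=2))
```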