BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
↳ project_id
string
The project ID or the project number that contains the BigQuery source. Has a length limit of 128 characters. If not specified, inherits the project ID from the parent request.
↳ dataset_id
string
Required. The BigQuery data set to copy the data from with a length limit of 1,024 characters.
↳ table_id
string
Required. The BigQuery table to copy the data from with a length limit of 1,024 characters.
↳ gcs_staging_dir
string
Intermediate Cloud Storage directory used for the import, with a length limit of 2,000 characters. Specify this if you want the BigQuery export written to a specific Cloud Storage directory.
↳ data_schema
string
The schema to use when parsing the data from the source. Supported values for user event imports: * user_event (default): One UserEvent per row. Supported values for document imports: * document (default): One Document format per row. Each document must have a valid Document.id and one of Document.json_data or Document.struct_data. * custom: One custom data record per row in an arbitrary format that conforms to the defined Schema of the data store. This can only be used by the GENERIC Data Store vertical.
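Taken together, these fields identify one BigQuery table to import from. The following minimal sketch builds the message through the associative-array constructor that generated protobuf classes in this client library accept; every identifier in it is a placeholder, not a value taken from this page.

use Google\Cloud\DiscoveryEngine\V1\BigQuerySource;

// Minimal sketch: populate a BigQuerySource from an associative array of
// field names, the usual constructor form for generated protobuf messages.
// All identifiers below are placeholders.
$bigQuerySource = new BigQuerySource([
    'project_id'      => 'example-project',           // optional; inherited from the parent request if omitted
    'dataset_id'      => 'example_dataset',           // required, at most 1,024 characters
    'table_id'        => 'example_table',             // required, at most 1,024 characters
    'gcs_staging_dir' => 'gs://example-bucket/stage', // optional intermediate Cloud Storage directory
    'data_schema'     => 'document',                  // 'user_event', 'document' (default), or 'custom'
]);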
getPartitionDate
BigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.
getProjectId
The project ID or the project number that contains the BigQuery source. Has
a length limit of 128 characters. If not specified, inherits the project
ID from the parent request.
Returns
Type
Description
string
setProjectId
The project ID or the project number that contains the BigQuery source. Has
a length limit of 128 characters. If not specified, inherits the project
ID from the parent request.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
getDatasetId
Required. The BigQuery data set to copy the data from with a length limit
of 1,024 characters.
Returns
Type
Description
string
setDatasetId
Required. The BigQuery data set to copy the data from with a length limit
of 1,024 characters.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
getTableId
Required. The BigQuery table to copy the data from with a length limit of
1,024 characters.
Returns
Type
Description
string
setTableId
Required. The BigQuery table to copy the data from with a length limit of
1,024 characters.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
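Because every setter documented here returns $this, the two required identifiers can also be assigned fluently after construction. A small sketch with placeholder IDs:

// Fluent assignment of the required dataset and table IDs; the matching
// getters read the values back.
$bigQuerySource
    ->setDatasetId('example_dataset')  // required, at most 1,024 characters
    ->setTableId('example_table');     // required, at most 1,024 characters

echo $bigQuerySource->getDatasetId(); // "example_dataset"
echo $bigQuerySource->getTableId();   // "example_table"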
getGcsStagingDir
Intermediate Cloud Storage directory used for the import, with a length
limit of 2,000 characters. Specify this if you want the BigQuery export
written to a specific Cloud Storage directory.
Returns
Type
Description
string
setGcsStagingDir
Intermediate Cloud Storage directory used for the import, with a length
limit of 2,000 characters. Specify this if you want the BigQuery export
written to a specific Cloud Storage directory.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
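As a sketch, routing the intermediate export through an explicit staging location looks like the following; the bucket and path are placeholders:

// Write the intermediate BigQuery export to a specific Cloud Storage
// directory instead of an implicitly chosen one (placeholder path).
$bigQuerySource->setGcsStagingDir('gs://example-bucket/discoveryengine-staging');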
getDataSchema
The schema to use when parsing the data from the source.
Supported values for user event imports:
* user_event (default): One UserEvent per row.
Supported values for document imports:
* document (default): One Document format per row. Each document must have a
valid Document.id and one of Document.json_data or Document.struct_data.
* custom: One custom data record per row in an arbitrary format that conforms
to the defined Schema of the data store. This can only be used by the
GENERIC Data Store vertical.
Returns
Type
Description
string
setDataSchema
The schema to use when parsing the data from the source.
Supported values for user event imports:
* user_event (default): One UserEvent per row.
Supported values for document imports:
* document (default): One Document format per row. Each document must have a
valid Document.id and one of Document.json_data or Document.struct_data.
* custom: One custom data record per row in an arbitrary format that conforms
to the defined Schema of the data store. This can only be used by the
GENERIC Data Store vertical.
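A short sketch of selecting a parsing schema; the value must be one of the strings listed above:

// Parse each BigQuery row as arbitrary data matching the data store's
// defined Schema; valid only for the GENERIC Data Store vertical.
$bigQuerySource->setDataSchema('custom');
echo $bigQuerySource->getDataSchema(); // "custom"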
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-04 UTC."],[],[],null,["# Google Cloud Discovery Engine V1 Client - Class BigQuerySource (1.7.0)\n\nVersion latestkeyboard_arrow_down\n\n- [1.7.0 (latest)](/php/docs/reference/cloud-discoveryengine/latest/V1.BigQuerySource)\n- [1.6.1](/php/docs/reference/cloud-discoveryengine/1.6.1/V1.BigQuerySource)\n- [1.5.1](/php/docs/reference/cloud-discoveryengine/1.5.1/V1.BigQuerySource)\n- [1.4.0](/php/docs/reference/cloud-discoveryengine/1.4.0/V1.BigQuerySource)\n- [1.3.3](/php/docs/reference/cloud-discoveryengine/1.3.3/V1.BigQuerySource)\n- [1.2.0](/php/docs/reference/cloud-discoveryengine/1.2.0/V1.BigQuerySource)\n- [1.1.0](/php/docs/reference/cloud-discoveryengine/1.1.0/V1.BigQuerySource)\n- [1.0.0](/php/docs/reference/cloud-discoveryengine/1.0.0/V1.BigQuerySource)\n- [0.11.3](/php/docs/reference/cloud-discoveryengine/0.11.3/V1.BigQuerySource)\n- [0.8.0](/php/docs/reference/cloud-discoveryengine/0.8.0/V1.BigQuerySource)\n- [0.7.1](/php/docs/reference/cloud-discoveryengine/0.7.1/V1.BigQuerySource)\n- [0.6.0](/php/docs/reference/cloud-discoveryengine/0.6.0/V1.BigQuerySource)\n- [0.5.0](/php/docs/reference/cloud-discoveryengine/0.5.0/V1.BigQuerySource)\n- [0.4.0](/php/docs/reference/cloud-discoveryengine/0.4.0/V1.BigQuerySource)\n- [0.3.0](/php/docs/reference/cloud-discoveryengine/0.3.0/V1.BigQuerySource)\n- [0.2.0](/php/docs/reference/cloud-discoveryengine/0.2.0/V1.BigQuerySource)\n- [0.1.1](/php/docs/reference/cloud-discoveryengine/0.1.1/V1.BigQuerySource) \nReference documentation and code samples for the Google Cloud Discovery Engine V1 Client class BigQuerySource.\n\nBigQuery source import data from.\n\nGenerated from protobuf message `google.cloud.discoveryengine.v1.BigQuerySource`\n\nNamespace\n---------\n\nGoogle \\\\ Cloud \\\\ DiscoveryEngine \\\\ V1\n\nMethods\n-------\n\n### __construct\n\nConstructor.\n\n### getPartitionDate\n\nBigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.\n\n### hasPartitionDate\n\n### setPartitionDate\n\nBigQuery time partitioned table's _PARTITIONDATE in YYYY-MM-DD format.\n\n### getProjectId\n\nThe project ID or the project number that contains the BigQuery source. Has\na length limit of 128 characters. If not specified, inherits the project\nID from the parent request.\n\n### setProjectId\n\nThe project ID or the project number that contains the BigQuery source. Has\na length limit of 128 characters. If not specified, inherits the project\nID from the parent request.\n\n### getDatasetId\n\nRequired. The BigQuery data set to copy the data from with a length limit\nof 1,024 characters.\n\n### setDatasetId\n\nRequired. The BigQuery data set to copy the data from with a length limit\nof 1,024 characters.\n\n### getTableId\n\nRequired. The BigQuery table to copy the data from with a length limit of\n1,024 characters.\n\n### setTableId\n\nRequired. The BigQuery table to copy the data from with a length limit of\n1,024 characters.\n\n### getGcsStagingDir\n\nIntermediate Cloud Storage directory used for the import with a length\nlimit of 2,000 characters. 
Can be specified if one wants to have the\nBigQuery export to a specific Cloud Storage directory.\n\n### setGcsStagingDir\n\nIntermediate Cloud Storage directory used for the import with a length\nlimit of 2,000 characters. Can be specified if one wants to have the\nBigQuery export to a specific Cloud Storage directory.\n\n### getDataSchema\n\nThe schema to use when parsing the data from the source.\n\nSupported values for user event imports:\n\n- `user_event` (default): One [UserEvent](/php/docs/reference/cloud-discoveryengine/latest/V1.UserEvent) per row. Supported values for document imports:\n- `document` (default): One [Document](/php/docs/reference/cloud-discoveryengine/latest/V1.Document) format per row. Each document must have a valid [Document.id](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getId__) and one of [Document.json_data](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getJsonData__) or [Document.struct_data](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getStructData__).\n- `custom`: One custom data per row in arbitrary format that conforms to the defined [Schema](/php/docs/reference/cloud-discoveryengine/latest/V1.Schema) of the data store. This can only be used by the GENERIC Data Store vertical.\n\n### setDataSchema\n\nThe schema to use when parsing the data from the source.\n\nSupported values for user event imports:\n\n- `user_event` (default): One [UserEvent](/php/docs/reference/cloud-discoveryengine/latest/V1.UserEvent) per row. Supported values for document imports:\n- `document` (default): One [Document](/php/docs/reference/cloud-discoveryengine/latest/V1.Document) format per row. Each document must have a valid [Document.id](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getId__) and one of [Document.json_data](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getJsonData__) or [Document.struct_data](/php/docs/reference/cloud-discoveryengine/latest/V1.Document#_Google_Cloud_DiscoveryEngine_V1_Document__getStructData__).\n- `custom`: One custom data per row in arbitrary format that conforms to the defined [Schema](/php/docs/reference/cloud-discoveryengine/latest/V1.Schema) of the data store. This can only be used by the GENERIC Data Store vertical.\n\n### getPartition"]]