Google Cloud Dataproc V1 Client - Class SparkJob (3.14.0)
Reference documentation and code samples for the Google Cloud Dataproc V1 Client class SparkJob.
A Dataproc job for running Apache Spark (https://spark.apache.org/) applications on YARN.
Generated from protobuf message google.cloud.dataproc.v1.SparkJob
Namespace
Google \ Cloud \ Dataproc \ V1
Methods
__construct
Constructor.
Parameters
Name
Description
data
array
Optional. Data for populating the Message object.
↳ main_jar_file_uri
string
The HCFS URI of the jar file that contains the main class.
↳ main_class
string
The name of the driver's main class. The jar file that contains the class must be in the default CLASSPATH or specified in jar_file_uris.
↳ args
array
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
↳ jar_file_uris
array
Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Spark driver and tasks.
↳ file_uris
array
Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
↳ archive_uris
array
Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
↳ properties
array
Optional. A mapping of property names to values, used to configure Spark. Properties that conflict with values set by the Dataproc API may be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
↳ logging_config
Google\Cloud\Dataproc\V1\LoggingConfig
Optional. The runtime log config for job execution.
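For illustration, a minimal sketch of populating a SparkJob through the constructor's data array, assuming the google/cloud-dataproc package is installed via Composer; the class name and bucket URIs are hypothetical placeholders, and the field names follow the constructor parameters listed above:

```php
<?php
// Sketch: populate a SparkJob via the constructor's optional data array.
// Assumes the google/cloud-dataproc package is installed (composer require google/cloud-dataproc).
require 'vendor/autoload.php';

use Google\Cloud\Dataproc\V1\SparkJob;

$job = new SparkJob([
    // Set either main_jar_file_uri or main_class (not both).
    'main_class'    => 'com.example.SparkWordCount',        // hypothetical class name
    'jar_file_uris' => ['gs://my-bucket/wordcount.jar'],    // hypothetical URI
    'args'          => ['gs://my-bucket/input.txt'],        // hypothetical URI
]);

echo $job->getMainClass(); // com.example.SparkWordCount
```

The same keys accepted here correspond one-to-one with the get/set methods documented below.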
getMainJarFileUri
The HCFS URI of the jar file that contains the main class.
Returns
Type
Description
string
hasMainJarFileUri
setMainJarFileUri
The HCFS URI of the jar file that contains the main class.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
getMainClass
The name of the driver's main class. The jar file that contains the class
must be in the default CLASSPATH or specified in jar_file_uris.
Returns
Type
Description
string
hasMainClass
setMainClass
The name of the driver's main class. The jar file that contains the class
must be in the default CLASSPATH or specified in jar_file_uris.
Parameter
Name
Description
var
string
Returns
Type
Description
$this
getArgs
Optional. The arguments to pass to the driver. Do not include arguments,
such as --conf, that can be set as job properties, since a collision may
occur that causes an incorrect job submission.
setArgs
Optional. The arguments to pass to the driver. Do not include arguments,
such as --conf, that can be set as job properties, since a collision may
occur that causes an incorrect job submission.
Parameter
Name
Description
var
string[]
Returns
Type
Description
$this
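As a hedged sketch of the setter style (same assumption as above: the google/cloud-dataproc package is installed; the URI is a placeholder). Because each setter returns $this, calls can be chained:

```php
<?php
// Sketch: configure a SparkJob with chained setters.
// Assumes the google/cloud-dataproc package is installed via Composer.
require 'vendor/autoload.php';

use Google\Cloud\Dataproc\V1\SparkJob;

$job = new SparkJob();
$job->setMainJarFileUri('gs://my-bucket/app.jar')   // hypothetical URI
    ->setArgs(['--input', 'gs://my-bucket/data']);  // hypothetical URI

var_dump($job->hasMainJarFileUri()); // bool(true)
```

The has* methods (hasMainJarFileUri, hasMainClass) report which member of the driver oneof is currently set.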
getJarFileUris
Optional. HCFS URIs of jar files to add to the CLASSPATHs of the
Spark driver and tasks.
setJarFileUris
Optional. HCFS URIs of jar files to add to the CLASSPATHs of the
Spark driver and tasks.
getFileUris
Optional. HCFS URIs of files to be placed in the working directory of
each executor. Useful for naively parallel tasks.
setFileUris
Optional. HCFS URIs of files to be placed in the working directory of
each executor. Useful for naively parallel tasks.
Parameter
Name
Description
var
string[]
Returns
Type
Description
$this
getArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
setArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory
of each executor. Supported file types:
.jar, .tar, .tar.gz, .tgz, and .zip.
Parameter
Name
Description
var
string[]
Returns
Type
Description
$this
getProperties
Optional. A mapping of property names to values, used to configure Spark.
Properties that conflict with values set by the Dataproc API may be
overwritten. Can include properties set in
/etc/spark/conf/spark-defaults.conf and classes in user code.
setProperties
Optional. A mapping of property names to values, used to configure Spark.
Properties that conflict with values set by the Dataproc API may be
overwritten. Can include properties set in
/etc/spark/conf/spark-defaults.conf and classes in user code.
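A brief sketch of working with the properties map (same package assumption as above); spark.executor.memory and spark.driver.cores are standard Spark configuration keys of the kind found in spark-defaults.conf:

```php
<?php
// Sketch: set and read Spark configuration properties on a SparkJob.
// Assumes the google/cloud-dataproc package is installed via Composer.
require 'vendor/autoload.php';

use Google\Cloud\Dataproc\V1\SparkJob;

$job = new SparkJob();
// Properties are a string-to-string map of Spark configuration keys.
$job->setProperties([
    'spark.executor.memory' => '4g',
    'spark.driver.cores'    => '2',
]);

echo $job->getProperties()['spark.executor.memory']; // 4g
```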
getLoggingConfig
Optional. The runtime log config for job execution.
hasLoggingConfig
clearLoggingConfig
setLoggingConfig
Optional. The runtime log config for job execution.
getDriver
Last updated 2025-09-04 UTC.