Reference documentation and code samples for the Google Cloud Dataproc V1 Client class SparkRJob.
A Dataproc job for running Apache SparkR applications on YARN.
Generated from protobuf message google.cloud.dataproc.v1.SparkRJob
Namespace
Google \ Cloud \ Dataproc \ V1
Methods
__construct
Constructor.
data
array
Optional. Data for populating the Message object.
↳ main_r_file_uri
string
Required. The HCFS URI of the main R file to use as the driver. Must be a .R file.
↳ args
array
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
↳ file_uris
array
Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
↳ archive_uris
array
Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
↳ properties
array|Google\Protobuf\Internal\MapField
Optional. A mapping of property names to values, used to configure SparkR. Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
↳ logging_config
LoggingConfig
Optional. The runtime log config for job execution.
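A minimal construction sketch; the gs:// URIs, argument strings, and property values below are placeholders chosen for illustration, not values taken from this reference.

use Google\Cloud\Dataproc\V1\SparkRJob;

// Placeholder URIs and values, matching the option keys documented above.
$sparkRJob = new SparkRJob([
    'main_r_file_uri' => 'gs://example-bucket/scripts/analysis.R',
    'args' => ['gs://example-bucket/data/input.csv', '--iterations=10'],
    'file_uris' => ['gs://example-bucket/data/lookup.csv'],
    'archive_uris' => ['gs://example-bucket/deps/r-libs.tar.gz'],
    'properties' => ['spark.executor.memory' => '4g'],
]);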
getMainRFileUri
Required. The HCFS URI of the main R file to use as the driver.
Must be a .R file.
string
setMainRFileUri
Required. The HCFS URI of the main R file to use as the driver.
Must be a .R file.
var
string
$this
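For example (continuing with the $sparkRJob instance built above, placeholder URI), the setter returns the message itself, so the value can be read back immediately:

$sparkRJob->setMainRFileUri('gs://example-bucket/scripts/analysis.R');
echo $sparkRJob->getMainRFileUri(); // gs://example-bucket/scripts/analysis.R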
getArgs
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
setArgs
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
var
string[]
$this
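A short sketch of setting and reading driver arguments on the $sparkRJob instance above; the argument strings are placeholders, and settings such as --conf belong in properties instead:

$sparkRJob->setArgs(['gs://example-bucket/data/input.csv', '--iterations=10']);
foreach ($sparkRJob->getArgs() as $arg) {
    echo $arg, PHP_EOL;
}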
getFileUris
Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
setFileUris
Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
var
string[]
$this
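For instance, staging a small lookup file into each executor's working directory (placeholder URI, same $sparkRJob instance as above):

$sparkRJob->setFileUris(['gs://example-bucket/data/lookup.csv']);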
getArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
setArchiveUris
Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
var
string[]
$this
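For instance, shipping an archive of R dependencies to be extracted on each executor (placeholder URI, same $sparkRJob instance as above):

$sparkRJob->setArchiveUris(['gs://example-bucket/deps/r-libs.tar.gz']);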
getProperties
Optional. A mapping of property names to values, used to configure SparkR.
Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
setProperties
Optional. A mapping of property names to values, used to configure SparkR.
Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
var
array|Google\Protobuf\Internal\MapField
$this
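A sketch of configuring SparkR through a plain PHP array; the property names are standard Spark settings, and the values are illustrative only:

$sparkRJob->setProperties([
    'spark.executor.memory' => '4g',
    'spark.executor.cores' => '2',
]);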
getLoggingConfig
Optional. The runtime log config for job execution.
hasLoggingConfig
clearLoggingConfig
setLoggingConfig
Optional. The runtime log config for job execution.
var
Google\Cloud\Dataproc\V1\LoggingConfig
$this
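A hedged sketch of attaching a logging config; it assumes the companion Google\Cloud\Dataproc\V1\LoggingConfig class, its setDriverLogLevels() method, and the nested Level enum, none of which are documented in this section:

use Google\Cloud\Dataproc\V1\LoggingConfig;
use Google\Cloud\Dataproc\V1\LoggingConfig\Level;

// Assumed LoggingConfig API: a driver_log_levels map of logger name => level.
$loggingConfig = new LoggingConfig();
$loggingConfig->setDriverLogLevels(['root' => Level::INFO]);
$sparkRJob->setLoggingConfig($loggingConfig);

if ($sparkRJob->hasLoggingConfig()) {
    $sparkRJob->clearLoggingConfig(); // remove the config again if it is no longer needed
}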