Create a Dataproc cluster by using the Google Cloud console

This page shows you how to use the Google Cloud console to create a Dataproc cluster, run a basic Apache Spark job in the cluster, and then modify the number of workers in the cluster.


To follow step-by-step guidance for this task directly in the Google Cloud console, click Guide me:

Guide me


Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the Dataproc API.

    Enable the API


Create a cluster

  1. In the Google Cloud console, go to the Dataproc Clusters page.

    Go to Clusters

  2. Click Create cluster.

  3. In the Create Dataproc cluster dialog, click Create in the Cluster on Compute Engine row.

  4. In the Cluster name field, enter example-cluster.

  5. In the Region and Zone lists, select a region and zone.

    Select a region (for example, us-east1 or europe-west1) to isolate the resources that Dataproc uses, such as virtual machine (VM) instances and Cloud Storage and metadata storage locations, in that region. For more information, see Available regions and zones and Regional endpoints.

  6. For all the other options, use the default settings.

  7. To create the cluster, click Create.

    Your new cluster appears in a list on the Clusters page. The status is Provisioning until the cluster is ready to use, and then the status changes to Running. Provisioning the cluster might take a couple of minutes.
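
If you prefer to script this step instead of using the console, the following is a minimal sketch that uses the google-cloud-dataproc Python client library (pip install google-cloud-dataproc). It assumes Application Default Credentials are configured and uses a placeholder project ID, the us-east1 region, and a small two-worker machine configuration; none of these values come from the console steps above, so adjust them for your environment.

    from google.cloud import dataproc_v1

    project_id = "your-project-id"  # placeholder: replace with your project ID
    region = "us-east1"             # assumption: use the region you selected above

    # Cluster operations go through a regional Dataproc endpoint.
    cluster_client = dataproc_v1.ClusterControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    cluster = {
        "project_id": project_id,
        "cluster_name": "example-cluster",
        "config": {
            # assumption: small machine types; the console defaults may differ
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
        },
    }

    # create_cluster returns a long-running operation; result() blocks
    # until provisioning finishes.
    operation = cluster_client.create_cluster(
        request={"project_id": project_id, "region": region, "cluster": cluster}
    )
    print(f"Cluster created: {operation.result().cluster_name}")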

Submit a Spark job

Submit a Spark job that estimates a value of Pi:

  1. In the Dataproc navigation menu, click Jobs.
  2. On the Jobs page, click Submit job, and then do the following:

    1. In the Job ID field, use the default setting, or provide an ID that is unique to your Google Cloud project.
    2. In the Cluster drop-down, select example-cluster.
    3. For Job type, select Spark.
    4. In the Main class or jar field, enter org.apache.spark.examples.SparkPi.
    5. In the Jar files field, enter file:///usr/lib/spark/examples/jars/spark-examples.jar.
    6. In the Arguments field, enter 1000 to set the number of tasks.

    7. Click Submit.

      Your job is displayed on the Job details page. The job status is Starting or Running, and it changes to Succeeded after the job completes.

      To avoid scrolling in the output, click Line wrap: off. The output is similar to the following:

      Pi is roughly 3.1416759514167594

      To view job details, click the Configuration tab.
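
In a script, you could submit the same SparkPi job with the Python client library. This is a sketch under the same assumptions as the cluster-creation example (placeholder project ID, us-east1 region, Application Default Credentials):

    from google.cloud import dataproc_v1

    project_id = "your-project-id"  # placeholder
    region = "us-east1"             # assumption: the region of example-cluster

    # Job operations also use a regional endpoint.
    job_client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    job = {
        "placement": {"cluster_name": "example-cluster"},
        "spark_job": {
            "main_class": "org.apache.spark.examples.SparkPi",
            "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
            "args": ["1000"],  # number of tasks, as in step 6
        },
    }

    # submit_job_as_operation waits until the job reaches a terminal state.
    operation = job_client.submit_job_as_operation(
        request={"project_id": project_id, "region": region, "job": job}
    )
    response = operation.result()
    print(f"Job finished with state: {response.status.state.name}")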

Update a cluster

Update your cluster by changing the number of worker instances:

  1. In the Dataproc navigation menu, click Clusters.
  2. In the list of clusters, click example-cluster.
  3. On the Cluster details page, click the Configuration tab.

    Your cluster settings are displayed.

  4. Click Edit.

  5. In the Worker nodes field, enter 5.

  6. Click Save.

Your cluster is now updated. To decrease the number of worker nodes to the original value, follow the same procedure.
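
The same resize can be scripted. The sketch below reuses the assumptions from the earlier examples and sends an update that changes only the worker count, using an update mask to name the single field being modified:

    from google.cloud import dataproc_v1

    project_id = "your-project-id"  # placeholder
    region = "us-east1"             # assumption: the region of example-cluster

    cluster_client = dataproc_v1.ClusterControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    # Only the field named in the update mask is read from the cluster message.
    operation = cluster_client.update_cluster(
        request={
            "project_id": project_id,
            "region": region,
            "cluster_name": "example-cluster",
            "cluster": {"config": {"worker_config": {"num_instances": 5}}},
            "update_mask": {"paths": ["config.worker_config.num_instances"]},
        }
    )
    operation.result()  # blocks until the resize completes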

Clean up

To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.

  1. To delete the cluster, on the Cluster details page for example-cluster, click Delete.
  2. To confirm that you want to delete the cluster, click Delete.
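
Cluster deletion can also be scripted; here is a sketch under the same assumptions as the earlier examples:

    from google.cloud import dataproc_v1

    project_id = "your-project-id"  # placeholder
    region = "us-east1"             # assumption: the region of example-cluster

    cluster_client = dataproc_v1.ClusterControllerClient(
        client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
    )

    operation = cluster_client.delete_cluster(
        request={
            "project_id": project_id,
            "region": region,
            "cluster_name": "example-cluster",
        }
    )
    operation.result()  # blocks until the cluster is deleted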
