This page describes how to use the pg_dump tool to export an AlloyDB database to a custom-format archive (DMP) file that you can later import using the pg_restore tool.
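For context, a file exported this way can later be restored with a command along these lines. This is only a minimal sketch, where IP_ADDRESS, DB_NAME, and DMP_FILE_NAME are placeholders and the target database is assumed to already exist:
pg_restore -h IP_ADDRESS -U postgres -d DB_NAME DMP_FILE_NAME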
The procedure to perform the export involves these tasks:
- Create a Cloud Storage bucket to store the DMP file in.
- Prepare a client host to perform the export operation.
- Export the database to the DMP file.
- Clean up the resources created to perform the procedure.
Before you begin
- You must have the Owner (roles/owner) or Editor (roles/editor) basic IAM role in the Google Cloud project you are using, or you must have these predefined IAM roles:
  - AlloyDB Admin (roles/alloydb.admin) or AlloyDB Viewer (roles/alloydb.viewer)
  - Storage Admin (roles/storage.admin)
  - Compute Instance Admin (v1) (roles/compute.instanceAdmin.v1)
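If you need one of these roles granted, a project owner can assign it with the gcloud CLI. For example, the following grants AlloyDB Admin, where PROJECT_ID and USER_EMAIL are placeholders; adjust the role to whichever one you need:
gcloud projects add-iam-policy-binding PROJECT_ID --member="user:USER_EMAIL" --role="roles/alloydb.admin"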
Create a Cloud Storage bucket
Create a regional Cloud Storage bucket with the Standard storage class in the project and region where your AlloyDB database is located.
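For example, you can create such a bucket with the gcloud CLI. In this sketch, BUCKET_NAME, PROJECT_ID, and REGION are placeholders that you replace with your own values:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_ID --location=REGION --default-storage-class=STANDARD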
Prepare a client host
To prepare a client host to perform the export operation, you create a
Compute Engine VM that can connect to the AlloyDB primary
instance where your database is located, and install the pg_dump
tool and the
Google Cloud CLI on that VM.
- Follow the instructions in Connect a psql client to an instance to create a Compute Engine VM with the proper connectivity and the pg_dump tool installed. When following these instructions, make sure to allocate enough local storage on the Compute Engine VM to accommodate the DMP file you will create.
- Install the gcloud CLI to provide command-line access for copying the DMP file to the Cloud Storage bucket.
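Note that Compute Engine VMs created from Google-provided public images typically include the gcloud CLI already. You can confirm that it is available on the VM before proceeding:
gcloud --version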
Export the database
To export the database to a DMP file, you get the IP address of the
AlloyDB primary instance where your database is located and then
use the pg_dump
tool.
- Get the IP address of the AlloyDB primary instance where your database is located by viewing its details, or by using the gcloud command shown below.
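As an alternative to the console, you can retrieve the IP address from the command line, assuming you know the instance, cluster, and region names. INSTANCE_ID, CLUSTER_ID, and REGION are placeholders in this sketch:
gcloud alloydb instances describe INSTANCE_ID --cluster=CLUSTER_ID --region=REGION --format="value(ipAddress)"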
- SSH into the Compute Engine VM.
Console
- In the Google Cloud console, go to the VM instances page.
- In the list of virtual machine instances, click SSH in the row of the instance you created.

gcloud
Use the gcloud compute ssh command to connect to the instance you created:
gcloud compute ssh --project=PROJECT_ID --zone=ZONE VM_NAME
Replace the following:
- PROJECT_ID: The ID of the project that contains the instance.
- ZONE: The name of the zone in which the instance is located.
- VM_NAME: The name of the instance.
- Export the database to a DMP file:
pg_dump -h IP_ADDRESS -U postgres -F custom \
    DB_NAME > DMP_FILE_NAME
You will be prompted to enter the password of the postgres user.
- IP_ADDRESS: The IP address of the primary instance.
- -F custom: Sets the format of the DMP file to a custom-format archive that you can later import using the pg_restore tool.
- DB_NAME: The name of the database you want to export.
- DMP_FILE_NAME: Provide a file name for the DMP file to create on the client host's local file system.
The preceding example shows a simple pg_dump command. For information about the wide range of options that the command supports, see the PostgreSQL documentation. A more elaborate example appears after this procedure.
- Copy the DMP file to the Cloud Storage bucket you created earlier:
gcloud storage cp DMP_FILE_NAME gs://BUCKET_NAME
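If you need finer control over what gets exported, pg_dump accepts many additional options. As a hedged sketch, the following variant exports only the public schema and omits ownership and privilege statements; all uppercase values are placeholders as before:
pg_dump -h IP_ADDRESS -U postgres -F custom \
    --no-owner --no-privileges -n public \
    DB_NAME > DMP_FILE_NAME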
Clean up resources
After successfully exporting the DMP file, you can delete the Compute Engine VM you used during the export procedure.
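For example, you can delete the VM with the gcloud CLI, where VM_NAME, ZONE, and PROJECT_ID are placeholders:
gcloud compute instances delete VM_NAME --zone=ZONE --project=PROJECT_ID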
What's next
- Learn how to export a CSV file.
- Learn how to export a SQL dump file.

