A Model Context Protocol (MCP) server acts as a proxy between an external service and a Large Language Model (LLM) or AI application, providing context, data, or capabilities to the model. MCP servers connect AI applications to external systems such as databases and web services, translating their responses into a format that the AI application can understand.
Server Setup
You must enable MCP servers and set up authentication before use. The cloud-sql MCP server uses the Cloud SQL Admin API. For more information about using Google and Google Cloud remote MCP servers, see Google Cloud MCP servers overview.
Server Endpoints
An MCP service endpoint is the network address and communication interface (usually a URL) of the MCP server that an AI application (the Host for the MCP client) uses to establish a secure, standardized connection. It is the point of contact for the LLM to request context, call a tool, or access a resource. Google MCP endpoints can be global or regional.
The cloud-sql MCP server has the following MCP endpoint:
- https://sqladmin.googleapis.com/mcp
MCP Tools
An MCP tool is a function or executable capability that an MCP server exposes to an LLM or AI application to perform an action in the real world.
The cloud-sql MCP server has the following tools:
Initiates the creation of a Cloud SQL instance.
- The tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
- The instance creation operation can take several minutes. Use a command line tool to pause for 30 seconds before rechecking the status.
- After you use the create_instance tool to create an instance, you can use the create_user tool to create an IAM user account for the user currently logged in to the project.
- The value of data_api_access is set to ALLOW_DATA_API by default. This setting lets you execute SQL statements using the execute_sql tool and the executeSql API.
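The polling pattern described above can be sketched as follows. This is a minimal sketch: poll_operation and the shape of the get_operation result (a dict with a status field) are assumptions, not the server's actual client API.

```python
import time

def poll_operation(get_operation, operation_id, interval_s=30, max_attempts=20):
    """Poll a long-running Cloud SQL operation until it reaches DONE.

    get_operation is a caller-supplied callable standing in for the MCP
    get_operation tool; it is assumed to return a dict with a "status" field.
    """
    for _ in range(max_attempts):
        op = get_operation(operation_id)
        if op.get("status") == "DONE":
            return op
        # Instance creation can take several minutes; pause before rechecking.
        time.sleep(interval_s)
    raise TimeoutError(f"operation {operation_id} did not finish in time")
```

The same loop applies to any tool in this server that returns a long-running operation, such as create_user, clone, and import_data.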
Unless otherwise specified, a newly created instance uses the default instance configuration of a development environment.
The following is the default configuration for an instance in a development environment:
{
"tier": "db-perf-optimized-N-2",
"data_disk_size_gb": 100,
"region": "us-central1",
"database_version": "POSTGRES_18",
"edition": "ENTERPRISE_PLUS",
"availability_type": "ZONAL",
"tags": [{"environment": "dev"}]
}
The following configuration is recommended for an instance in a production environment:
{
"tier": "db-perf-optimized-N-8",
"data_disk_size_gb": 250,
"region": "us-central1",
"database_version": "POSTGRES_18",
"edition": "ENTERPRISE_PLUS",
"availability_type": "REGIONAL",
"tags": [{"environment": "prod"}]
}
The following instance configuration is recommended for SQL Server:
{
"tier": "db-perf-optimized-N-8",
"data_disk_size_gb": 250,
"region": "us-central1",
"database_version": "SQLSERVER_2022_STANDARD",
"edition": "ENTERPRISE_PLUS",
"availability_type": "REGIONAL",
"tags": [{"environment": "prod"}]
}
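The three sample configurations above differ in only a few fields. As a sketch (the helper and its name are hypothetical; the keys mirror the JSON samples), the recommended production settings can be derived from the development defaults:

```python
# Development defaults, copied from the sample configuration above.
DEV_DEFAULTS = {
    "tier": "db-perf-optimized-N-2",
    "data_disk_size_gb": 100,
    "region": "us-central1",
    "database_version": "POSTGRES_18",
    "edition": "ENTERPRISE_PLUS",
    "availability_type": "ZONAL",
    "tags": [{"environment": "dev"}],
}

def instance_settings(**overrides):
    """Hypothetical helper: apply overrides on top of the dev defaults."""
    return {**DEV_DEFAULTS, **overrides}

# Recommended production settings differ only in these fields.
prod = instance_settings(
    tier="db-perf-optimized-N-8",
    data_disk_size_gb=250,
    availability_type="REGIONAL",
    tags=[{"environment": "prod"}],
)
```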
Execute any valid SQL statement, including data definition language (DDL), data control language (DCL), data query language (DQL), or data manipulation language (DML) statements, on a Cloud SQL instance.
To support the execute_sql tool, a Cloud SQL instance must meet the following requirements:
- The value of data_api_access must be set to ALLOW_DATA_API.
- For a MySQL instance, the database flag cloudsql_iam_authentication must be set to on. For a PostgreSQL instance, the database flag cloudsql.iam_authentication must be set to on.
- An IAM user account or IAM service account (CLOUD_IAM_USER or CLOUD_IAM_SERVICE_ACCOUNT) is required to call the execute_sql tool. The tool executes the SQL statements using the privileges of the database user logged in with IAM database authentication.
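Once those requirements are met, invoking the tool is an ordinary MCP tools/call JSON-RPC request. The following sketch builds such a payload; the argument names project, instance, and sql are assumptions, so check the execute_sql specification returned by tools/list for the exact schema.

```python
import json

def execute_sql_request(project, instance, sql, request_id=1):
    """Build a JSON-RPC tools/call payload for the execute_sql tool.

    The tools/call envelope follows the MCP specification; the argument
    names inside "arguments" are illustrative assumptions.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "execute_sql",
            "arguments": {"project": project, "instance": instance, "sql": sql},
        },
    })
```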
After you use the create_instance tool to create an instance, you can use the create_user tool to create an IAM user account for the user currently logged in to the project.
The execute_sql tool has the following limitations:
- If a SQL statement returns a response larger than 10 MB, then the response will be truncated.
- The execute_sql tool has a default timeout of 30 seconds. If a query runs longer than 30 seconds, then the tool returns a DEADLINE_EXCEEDED error.
- The execute_sql tool isn't supported for SQL Server.
If you receive errors similar to "IAM authentication is not enabled for the instance", then you can use the get_instance tool to check the value of the IAM database authentication flag for the instance.
If you receive errors like "The instance doesn't allow using executeSql to access this instance", then you can use the get_instance tool to check the data_api_access setting.
When you receive authentication errors:
- Check if the currently logged-in user account exists as an IAM user on the instance using the list_users tool.
- If the IAM user account doesn't exist, then use the create_user tool to create the IAM user account for the logged-in user.
- If the currently logged-in user doesn't have the proper database user roles, then you can use the update_user tool to grant database roles to the user. For example, the cloudsqlsuperuser role can provide an IAM user with many required permissions.
- Check if the currently logged-in user has the correct IAM permissions assigned for the project. You can use the gcloud projects get-iam-policy [PROJECT_ID] command to check if the user has the proper IAM roles or permissions assigned for the project.
  - The user must have the cloudsql.instances.login permission to do automatic IAM database authentication.
  - The user must have the cloudsql.instances.executeSql permission to execute SQL statements using the execute_sql tool or executeSql API.
  - Common IAM roles that contain the required permissions: Cloud SQL Instance User (roles/cloudsql.instanceUser) or Cloud SQL Admin (roles/cloudsql.admin).
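The project-level permission check can be sketched against the JSON that gcloud projects get-iam-policy prints with --format=json. The helper below is hypothetical and only scans direct role bindings; it ignores group membership, inherited policies, and IAM conditions.

```python
# Roles named above as containing the required Cloud SQL permissions.
REQUIRED_ROLES = {"roles/cloudsql.instanceUser", "roles/cloudsql.admin"}

def has_cloud_sql_access(policy, member):
    """Return True if `member` holds one of the required roles directly.

    `policy` is the standard IAM policy JSON: a dict with a "bindings"
    list of {"role": ..., "members": [...]} entries.
    """
    for binding in policy.get("bindings", []):
        if binding.get("role") in REQUIRED_ROLES and member in binding.get("members", []):
            return True
    return False
```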
When receiving an ExecuteSqlResponse, always check the message and status fields within the response body. A successful HTTP status code doesn't guarantee full success of all SQL statements. The message and status fields will indicate if there were any partial errors or warnings during SQL statement execution.
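As a sketch of that check, assuming the response body is a JSON object with top-level status and message fields (the precise ExecuteSqlResponse layout may differ, so adapt this to the actual response you receive):

```python
def response_ok(response):
    """Return True only if the ExecuteSqlResponse reports no issues.

    Assumed shape: a dict with optional top-level "status" and "message"
    fields; a missing or "OK" status with an empty message counts as success.
    """
    return response.get("status", "OK") == "OK" and not response.get("message")
```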
Create a database user for a Cloud SQL instance.
- This tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
- When you use the create_user tool, specify the type of user: CLOUD_IAM_USER or CLOUD_IAM_SERVICE_ACCOUNT.
- By default, the newly created user is assigned the cloudsqlsuperuser role, unless you specify other database roles explicitly in the request.
- You can use a newly created user with the execute_sql tool if the user is the currently logged-in IAM user. The execute_sql tool executes the SQL statements using the privileges of the database user logged in using IAM database authentication.
The create_user tool has the following limitations:
- You can't create a built-in user with a password.
- The create_user tool doesn't support creating a user for SQL Server.
To create an IAM user in PostgreSQL:
- The database username must be the IAM user's email address, all lowercase. For example, to create a user for the PostgreSQL IAM user example-user@example.com, you can use the following request:
{
"name": "example-user@example.com",
"type": "CLOUD_IAM_USER",
"instance":"test-instance",
"project": "test-project"
}
The created database username for the IAM user is example-user@example.com.
To create an IAM service account in PostgreSQL:
- The database username must be created without the .gserviceaccount.com suffix, even though the full email address for the account is service-account-name@project-id.iam.gserviceaccount.com. For example, to create an IAM service account for PostgreSQL, you can use the following request format:
{
"name": "test@test-project.iam",
"type": "CLOUD_IAM_SERVICE_ACCOUNT",
"instance": "test-instance",
"project": "test-project"
}
The created database username for the IAM service account is test@test-project.iam.
To create an IAM user or IAM service account in MySQL:
- When Cloud SQL for MySQL stores a username, it truncates the @ and the domain name from the user or service account's email address. For example, example-user@example.com becomes example-user.
- For this reason, you can't add two IAM users or service accounts with the same username but different domain names to the same Cloud SQL instance.
- For example, to create a user for the MySQL IAM user example-user@example.com, use the following request:
{
"name": "example-user@example.com",
"type": "CLOUD_IAM_USER",
"instance": "test-instance",
"project": "test-project"
}
The created database username for the IAM user is example-user.
- For example, to create the MySQL IAM service account service-account-name@project-id.iam.gserviceaccount.com, use the following request:
{
"name": "service-account-name@project-id.iam.gserviceaccount.com",
"type": "CLOUD_IAM_SERVICE_ACCOUNT",
"instance": "test-instance",
"project": "test-project"
}
The created database username for the IAM service account is service-account-name.
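The naming rules above can be collected into one sketch. This is a hypothetical helper that mirrors the documented behavior, not part of the API:

```python
def database_username(email, engine, account_type="CLOUD_IAM_USER"):
    """Derive the database username Cloud SQL creates for an IAM principal.

    engine is "POSTGRES" or "MYSQL"; account_type is CLOUD_IAM_USER or
    CLOUD_IAM_SERVICE_ACCOUNT, matching the create_user request field.
    """
    email = email.lower()  # PostgreSQL usernames must be all lowercase
    if engine == "MYSQL":
        # MySQL truncates the @ and the domain for users and service accounts.
        return email.split("@", 1)[0]
    if engine == "POSTGRES":
        if account_type == "CLOUD_IAM_SERVICE_ACCOUNT":
            # Service accounts drop only the .gserviceaccount.com suffix.
            return email.removesuffix(".gserviceaccount.com")
        return email  # IAM users keep the full email address
    raise ValueError(f"unsupported engine: {engine}")
```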
Update a database user for a Cloud SQL instance. A common use case for the update_user tool is to grant a user the cloudsqlsuperuser role, which can provide a user with many required permissions.
This tool only supports updating users to assign database roles.
- This tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
- Before calling the update_user tool, always check the existing configuration of the user, such as the user type, with the list_users tool.
- As a special case for MySQL, if the list_users tool returns a full email address for the iamEmail field, for example {name=test-account, iamEmail=test-account@project-id.iam.gserviceaccount.com}, then in your update_user request, use the full email address from the iamEmail field as the name field of your tool request. For example, name=test-account@project-id.iam.gserviceaccount.com.
Key parameters for updating user roles:
- database_roles: A list of database roles to be assigned to the user.
- revokeExistingRoles: A boolean field (default: false) that controls how existing roles are handled.
How role updates work:
- If revokeExistingRoles is true:
  - Any existing roles granted to the user but NOT in the provided database_roles list will be REVOKED.
  - Revoking only applies to non-system roles. System roles, such as cloudsqliamuser, won't be revoked.
  - Any roles in the database_roles list that the user does NOT already have will be GRANTED.
  - If database_roles is empty, then ALL existing non-system roles are revoked.
- If revokeExistingRoles is false (default):
  - Any roles in the database_roles list that the user does NOT already have will be GRANTED.
  - Existing roles NOT in the database_roles list are KEPT.
  - If database_roles is empty, then there is no change to the user's roles.
Examples:
- Existing roles: [roleA, roleB]
  - Request: database_roles: [roleB, roleC], revokeExistingRoles: true
    Result: Revokes roleA, grants roleC. User roles become [roleB, roleC].
  - Request: database_roles: [roleB, roleC], revokeExistingRoles: false
    Result: Grants roleC. User roles become [roleA, roleB, roleC].
  - Request: database_roles: [], revokeExistingRoles: true
    Result: Revokes roleA and roleB. User roles become [].
  - Request: database_roles: [], revokeExistingRoles: false
    Result: No change. User roles remain [roleA, roleB].
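The grant/revoke rules can be modeled in a few lines. This is a sketch of the documented semantics; the system-role set here is illustrative, not exhaustive.

```python
# Illustrative set of system roles exempt from revocation.
SYSTEM_ROLES = {"cloudsqliamuser"}

def apply_role_update(existing, database_roles, revoke_existing_roles=False):
    """Model the update_user role semantics and return the resulting roles."""
    existing = set(existing)
    requested = set(database_roles)
    granted = requested - existing  # roles the user doesn't have yet
    if revoke_existing_roles:
        # Revoke non-system roles not present in the request.
        revoked = (existing - requested) - SYSTEM_ROLES
    else:
        revoked = set()  # existing roles are kept
    return (existing | granted) - revoked
```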
Create a Cloud SQL instance as a clone of a source instance.
- This tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
- The clone operation can take several minutes. Use a command line tool to pause for 30 seconds before rechecking the status.
Partially updates the configuration settings of a Cloud SQL instance.
- This tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
Import data into a Cloud SQL instance.
If the file doesn't start with gs://, then the assumption is that the file is stored locally. If the file is local, then the file must be uploaded to Cloud Storage before you can make the actual import_data call. To upload the file to Cloud Storage, you can use the gcloud or gsutil commands.
Before you upload the file to Cloud Storage, consider whether you want to use an existing bucket or create a new bucket in the provided project.
After the file is uploaded to Cloud Storage, the instance service account must have sufficient permissions to read the uploaded file from the Cloud Storage bucket.
This can be accomplished as follows:
- Use the get_instance tool to get the email address of the instance service account. From the output of the tool, get the value of the serviceAccountEmailAddress field.
- Grant the instance service account the storage.objectAdmin role on the provided Cloud Storage bucket. Use a command like gcloud storage buckets add-iam-policy-binding or a request to the Cloud Storage API. It can take from two to seven minutes or more for the role to be granted and the permissions to be propagated to the service account in Cloud Storage. If you encounter a permissions error after updating the IAM policy, then wait a few minutes and try again.
After permissions are granted, you can import the data. We recommend that you leave optional parameters empty and use the system defaults. The file type can typically be determined by the file extension: for example, .sql for a SQL file or .csv for a CSV file.
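Determining the file type from the extension can be sketched as follows; the mapping covers only the extensions mentioned above:

```python
import os

# Assumed mapping from file extension to importContext fileType.
EXTENSION_TO_FILE_TYPE = {".sql": "SQL", ".csv": "CSV"}

def import_file_type(uri):
    """Guess the importContext fileType from the URI's file extension."""
    _, ext = os.path.splitext(uri)
    try:
        return EXTENSION_TO_FILE_TYPE[ext.lower()]
    except KeyError:
        raise ValueError(f"cannot infer fileType for {uri}") from None
```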
The following is a sample SQL importContext for MySQL.
{
"uri": "gs://sample-gcs-bucket/sample-file.sql",
"kind": "sql#importContext",
"fileType": "SQL"
}
There is no database parameter present for MySQL since the database name is expected to be present in the SQL file. Specify only one URI. No other fields are required outside of importContext.
For PostgreSQL, the database field is required. The following is a sample PostgreSQL importContext with the database field specified.
{
"uri": "gs://sample-gcs-bucket/sample-file.sql",
"kind": "sql#importContext",
"fileType": "SQL",
"database": "sample-db"
}
The import_data tool returns a long-running operation. Use the get_operation tool to poll its status until the operation completes.
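Assembling an importContext following the samples above can be sketched as below; the engine parameter and the validation logic are illustrative, not part of the API:

```python
def build_import_context(uri, file_type, database=None, engine="POSTGRES"):
    """Assemble an importContext dict per the documented samples.

    PostgreSQL requires the database field; MySQL omits it because the
    database name is expected to be present in the SQL file itself.
    """
    ctx = {"uri": uri, "kind": "sql#importContext", "fileType": file_type}
    if engine == "POSTGRES":
        if not database:
            raise ValueError("database is required for PostgreSQL imports")
        ctx["database"] = database
    return ctx
```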
Get MCP tool specifications
To get the MCP tool specifications for all tools in an MCP server, use the tools/list method. The following example demonstrates how to use curl to list all tools and their specifications currently available within the MCP server.
curl --location 'https://sqladmin.googleapis.com/mcp' \
  --header 'content-type: application/json' \
  --header 'accept: application/json, text/event-stream' \
  --data '{
    "method": "tools/list",
    "jsonrpc": "2.0",
    "id": 1
  }'
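A Python equivalent of the curl request might look like the following sketch. The Authorization header is an assumption based on the authentication setup described earlier, and the live call is not exercised here.

```python
import json
import urllib.request

MCP_ENDPOINT = "https://sqladmin.googleapis.com/mcp"

def tools_list_request(request_id=1):
    """JSON-RPC payload equivalent to the curl example above."""
    return {"method": "tools/list", "jsonrpc": "2.0", "id": request_id}

def list_tools(token):
    """Send the tools/list request with an OAuth access token.

    The bearer-token header is an assumption; obtain a token via, e.g.,
    `gcloud auth print-access-token`.
    """
    req = urllib.request.Request(
        MCP_ENDPOINT,
        data=json.dumps(tools_list_request()).encode(),
        headers={
            "content-type": "application/json",
            "accept": "application/json, text/event-stream",
            "authorization": f"Bearer {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()
```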

