This page describes how to confirm that your migrated data is complete and accurate. At a minimum, you should run SQL statements to verify that your tables exist in the migrated AlloyDB for PostgreSQL database. For a more precise data comparison between the source and destination databases, you can use the open-source Data Validation Tool.
Verify destination data with SQL statements
You can run SQL statements to verify that your tables exist in the migrated AlloyDB for PostgreSQL database. Do the following:
- Connect to your AlloyDB for PostgreSQL instance with a tool that lets you run SQL commands against your migrated databases.
For more information on connecting to AlloyDB for PostgreSQL instances, see Connection options in the AlloyDB for PostgreSQL documentation.
- Run SQL commands to verify your migrated data. For example:
List all tables in a database
Run the following commands to check if your migrated database contains all the necessary tables:
- Connect to the database that contains your migrated data. For example, in psql:
\c DATABASE_NAME
- List all tables in the database:
SELECT * FROM information_schema.tables;
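To narrow the output to the schema that you migrated, you can filter the same view. The following is a minimal example that assumes your tables were migrated into the public schema; adjust the schema name as needed:
SELECT table_name FROM information_schema.tables WHERE table_schema = 'public' ORDER BY table_name;
Compare the returned list against the list of tables in your source database.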
Check table content and definitions
Run the following commands to verify the correctness of a specific migrated table:
- Connect to the database that contains your migrated data. For example, in psql:
\c DATABASE_NAME
- View the table definition. For example, in psql:
\d TABLE_NAME
- Verify the table contents:
SELECT * FROM TABLE_NAME;
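Beyond spot-checking contents, a quick way to catch missing rows is to compare row counts between the source and the destination. A minimal example, using TABLE_NAME as a placeholder for one of your migrated tables:
SELECT count(*) FROM TABLE_NAME;
Run the equivalent count against the source SQL Server table and confirm that the numbers match. Counts can differ temporarily while replication is still catching up.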
Verify data with the Data Validation Tool
The open-source Data Validation Tool lets you perform very precise data comparisons between two databases, but it requires creating network connections to your source and destination databases.
The following steps show a minimal example:
- Deploy or use a virtual machine with access to both the source and the destination.
- In the virtual machine, create a folder in which to install the Data Validation Tool.
- Navigate to this folder.
- Use pip to install the Data Validation Tool:
pip install google-pso-data-validator
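If you want to keep the installation isolated from other Python packages on the virtual machine, a minimal sketch (assuming Python 3 with the venv module is available) is:
python3 -m venv dvt-venv
source dvt-venv/bin/activate
pip install google-pso-data-validator
data-validation --help
The last command simply confirms that the data-validation CLI is available on your PATH.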
- Create connections to the source SQL Server database and the destination AlloyDB for PostgreSQL database:
data-validation connections add -c SOURCE_CONNECTION_NAME MSSQL --host 'ip-address' --port port --user username --password pswd --database database-name
data-validation connections add -c TARGET_CONNECTION_NAME AlloyDB --host 'ip-address' --port port --user username --password pswd --database database-name
For example:
data-validation connections add -c source_connection_name MSSQL --host '10.10.10.11' --port 1433 --user sa --password pswd --database mydb
data-validation connections add -c target_connection_name AlloyDB --host '10.10.10.12' --port 5432 --user my_user --password pswd --database mydb
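Optionally, you can confirm that both connections were saved before continuing. The Data Validation Tool provides a subcommand for listing stored connections (output format can vary between tool versions):
data-validation connections list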
- Create or generate a list of tables to compare data between the source and destination databases.
export TABLES_LIST=$(data-validation find-tables --source-conn SOURCE_CONNECTION_NAME --target-conn TARGET_CONNECTION_NAME --allowed-schemas schema-name)
For example:
export TABLES_LIST=$(data-validation find-tables --source-conn source_connection_name --target-conn target_connection_name --allowed-schemas public)
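Because the full validation can take a while, it can be worth checking that the variable was actually populated before you start:
echo "${TABLES_LIST}"
If the output is empty, re-check the connection names and the schema passed to --allowed-schemas.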
- Run full validation against all tables.
data-validation validate column --source-conn source_connection_name --target-conn target_connection_name --tables-list "${TABLES_LIST}"
We recommend running this validation while replication is still in progress so that the source and destination remain relatively consistent. Queries against large tables can take longer than a short promotion window allows. In such cases, use the Data Validation Tool's filters to reduce runtime, or prepare a table list that includes only a subset of tables for the final validation.
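For example, the following minimal sketch limits the comparison to recently changed rows. It assumes a hypothetical public.orders table with a created_at column; check the Data Validation Tool documentation for the exact filter syntax supported by your version:
data-validation validate column --source-conn source_connection_name --target-conn target_connection_name --tables-list public.orders --filters "created_at >= '2024-01-01'"
Running a filtered or reduced validation during the promotion window keeps the final check fast, while the full, unfiltered validation runs earlier, during replication.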

