View query history audits
Query history audits allow you to generate a report of all jobs run using your Ads Data Hub account. This lets you answer questions about who accessed your data and when they did it.

Query history audits are written as BigQuery tables containing log entries for all queries run using your Ads Data Hub account. To view query history audits for your account, you first need to generate the report via an API. Each audit log contains 1 day's worth of data, and you can generate an audit log for any day within the past 30 days.

Query history audits are only available to superusers. Learn more about role-based access.
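Because an audit log can only be generated for a day within the past 30 days, it can be handy to check a date's eligibility before making the request. A minimal sketch (the `is_auditable` helper is our own, not part of the API):

```python
from datetime import date, timedelta

def is_auditable(day, today=None):
    """Return True if `day` falls within the 30-day audit window ending today."""
    today = today or date.today()
    return today - timedelta(days=30) <= day <= today

# Relative to a reference date of 2024-09-18:
ref = date(2024, 9, 18)
print(is_auditable(ref - timedelta(days=10), today=ref))  # True
print(is_auditable(ref - timedelta(days=40), today=ref))  # False
```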
Query history audit format

Each query history audit uses the following schema:
| Field name | Description |
|---|---|
| customer_id | The Ads Data Hub customer ID |
| ads_customer_id | The ID of the sub-account, if used (identical to customer_id otherwise) |
| match_table_customer_id | The ID of the account containing the match table, if used (identical to customer_id otherwise) |
| user_email | Email address of the user who ran the query |
| query_start_time | The time the query began running |
| query_end_time | The time the query finished running |
| query_type | Differentiates between analysis queries and audience queries |
| query_resource_id | The ID associated with the query |
| query_text | The query's SQL |
| query_parameters | |
| query_parameters.name | The name of the query's parameter |
| query_parameters.value | The value passed via the query's parameter |
| row_merge_summary | |
| row_merge_summary.column_name | The name of the column |
| row_merge_summary.merge_type | The type of row merge summary |
| row_merge_summary.constant_value | The value of the constant set (null if no constant is used) |
| destination_table | The location (in BigQuery) that the query was written to |
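Once an audit has been exported, each row of the BigQuery table follows the schema above. As a rough illustration of working with those rows once you've fetched them (the rows below are fabricated placeholders, not real audit data), you could tally queries per user:

```python
from collections import Counter

# Hypothetical audit rows, using field names from the schema above.
rows = [
    {'user_email': 'analyst@example.com', 'query_type': 'ANALYSIS'},
    {'user_email': 'analyst@example.com', 'query_type': 'AUDIENCE'},
    {'user_email': 'admin@example.com', 'query_type': 'ANALYSIS'},
]

queries_per_user = Counter(row['user_email'] for row in rows)
print(queries_per_user)  # Counter({'analyst@example.com': 2, 'admin@example.com': 1})
```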
Accessing query history audits
To access query history audits, you'll need to call the API. Find sample code for calling the API below, or view the reference documentation and write your own query.
The results of the API request will be written to the BigQuery dataset that you specify in the body of the API request.

Note: As query history audits are currently available for a single day's worth of data, the end date needs to be 1 day after the start date.
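Since the end date must always be 1 day after the start date, the request body can be derived from the start date alone. A minimal sketch (the `build_body` helper is our own; the field names match the request body used in the sample below on this page):

```python
from datetime import date, timedelta

def build_body(project_id, dataset, start, time_zone='UTC'):
    """Build an exportJobHistory request body covering a single day.

    The end date is computed as start + 1 day, since each audit
    contains exactly 1 day's worth of data.
    """
    end = start + timedelta(days=1)
    return {
        'project_id': project_id,
        'dataset': dataset,
        'start_date': {'year': start.year, 'month': start.month, 'day': start.day},
        'end_date': {'year': end.year, 'month': end.month, 'day': end.day},
        'time_zone': time_zone,
    }

body = build_body('your-project', 'your-dataset', date(2024, 9, 17))
print(body['end_date'])  # {'year': 2024, 'month': 9, 'day': 18}
```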
"""This sample shows how to create a query history audit.For the program to execute successfully, ensure that you run it using Python 3."""from__future__importprint_functionfromjsonimportdumpsfromgoogle_auth_oauthlibimportflowfromgoogleapiclient.discoveryimportbuildappflow=flow.InstalledAppFlow.from_client_secrets_file(# Replace client_secrets.json with your own client secret file.'client_secrets.json',scopes=['https://www.googleapis.com/auth/adsdatahub'])appflow.run_local_server()credentials=appflow.credentialsdeveloper_key=input('Developer key: ').strip()service=build('adsdatahub','v1',credentials=credentials,developerKey=developer_key)defpprint(x):print(dumps(x,sort_keys=True,indent=4))customer_id=input('Customer ID (e.g. "customers/123"): ').strip()bq_project=input('Destination BigQuery project ID (e.g. "your-project"): ').strip()dataset_id=input('Destination BigQuery dataset (e.g. "your-dataset"): ').strip()start=input('The start date for your query history audit. Formatted as "mm/dd/yyyy": ').strip().split('/')end=input('The end date for your query history audit. Should be 1 day later than start_date. Formatted as "mm/dd/yyyy": ').strip().split('/')choice=input("Do you want to enter a timezone? Defaults to UTC otherwise. (y/n) ")ifchoice.lower()=='y':timezone=input("Timezone (e.g. 'UTC'): ")else:timezone='UTC'body={'project_id':bq_project,'dataset':dataset_id,'start_date':{'year':start[2],'day':start[1],'month':start[0]},'end_date':{'year':end[2],'day':end[1],'month':end[0]},'time_zone':timezone}pprint(service.customers().exportJobHistory(customer=customer_id,body=body).execute())
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-09-18 UTC."],[[["\u003cp\u003eQuery history audits provide a report of all jobs run using your Ads Data Hub account, letting you track data access.\u003c/p\u003e\n"],["\u003cp\u003eThese audits are generated as BigQuery tables containing daily log entries for the past 30 days, accessible only to superusers.\u003c/p\u003e\n"],["\u003cp\u003eEach audit log includes details such as user, query time, query type, query text, and destination table.\u003c/p\u003e\n"],["\u003cp\u003eYou need to call the API to access query history audits, specifying the BigQuery dataset for output, and the start and end dates (end date being one day after the start date).\u003c/p\u003e\n"]]],["Query history audits track all jobs run in an Ads Data Hub account, accessible via an API for superusers. To access, generate a report through an API call, which writes a log to a specified BigQuery dataset. Each audit log details one day's query data, including user email, query start/end times, SQL, parameters, and the destination table. The API request requires project ID, dataset, start/end dates (1-day range), and optionally a timezone. The provided Python code sample demonstrates creating an audit.\n"],null,["# View query history audits\n\nQuery history audits allow you to generate a report of all jobs run using your Ads Data Hub account. 
This allows you to answer questions relating to who accessed your data and when they did it.\n\nQuery history audits are written as BigQuery tables containing log entries for all queries run using your Ads Data Hub account. To view query history audits for your account, you need to first generate the report via an API. Each audit log contains 1 day's worth of data. You can generate an audit log for any day within the past 30 days.\n\nQuery history audits are only available to superusers. [Learn more about role-based access](/ads-data-hub/marketers/guides/assign-access-by-role)\n\nQuery history audit format\n--------------------------\n\nEach query history audit uses the following schema:\n\n| Field name | Description |\n|----------------------------------|--------------------------------------------------------------------------------------------------------|\n| customer_id | The Ads Data Hub customer ID |\n| ads_customer_id | The ID of the sub-account, if used (will be identical to customer_id otherwise) |\n| match_table_customer_id | The ID of the account containing the match table, if used (will be identical to customer_id otherwise) |\n| user_email | Email address of the user who ran the query |\n| query_start_time | The time the query began running |\n| query_end_time | The time the query finished running |\n| query_type | Differentiates between analysis queries and audience queries |\n| query_resource_id | The ID associated with the query |\n| query_text | The query's SQL |\n| query_parameters | |\n| query_parameters.name | The name of the query's parameter |\n| query_parameters.value | The value passed via the query's parameter row_merge_summary |\n| row_merge_summary.column_name | The name of column |\n| row_merge_summary.merge_type | The type of row merge summary |\n| row_merge_summary.constant_value | The value of the constant set (will be null if no constant is used) |\n| destination_table | The location (in BigQuery) that the query was written to 
|\n\nAccessing query history audits\n------------------------------\n\nIn order to access the query history audits, you'll need to call the API. Find sample code for calling the API below, or [view the reference documentation](/ads-data-hub/marketers/reference/rest) and write your own query.\n\nThe results of the API request will be written to the BigQuery dataset that you specify in the body of the API request.\n**Note:** As query history audits are currently available for a single day's worth of data, the end date needs to be 1 day after the start date. \n\n\n \"\"\"This sample shows how to create a query history audit.\n\n For the program to execute successfully, ensure that you run it using Python 3.\n \"\"\"\n\n from __future__ import print_function\n from json import dumps\n from google_auth_oauthlib import flow\n from googleapiclient.discovery import build\n\n appflow = flow.InstalledAppFlow.from_client_secrets_file(\n # Replace client_secrets.json with your own client secret file.\n 'client_secrets.json',\n scopes=['https://www.googleapis.com/auth/adsdatahub'])\n appflow.run_local_server()\n credentials = appflow.credentials\n developer_key = input('Developer key: ').strip()\n service = build('adsdatahub', 'v1', credentials=credentials,\n developerKey=developer_key)\n\n def pprint(x):\n print(dumps(x, sort_keys=True, indent=4))\n\n customer_id = input('Customer ID (e.g. \"customers/123\"): ').strip()\n bq_project = input('Destination BigQuery project ID (e.g. \"your-project\"): ').strip()\n dataset_id = input('Destination BigQuery dataset (e.g. \"your-dataset\"): ').strip()\n start = input('The start date for your query history audit. Formatted as \"mm/dd/yyyy\": ').strip().split('/')\n end = input('The end date for your query history audit. Should be 1 day later than start_date. Formatted as \"mm/dd/yyyy\": ').strip().split('/')\n\n choice = input(\"Do you want to enter a timezone? Defaults to UTC otherwise. 
(y/n) \")\n\n if choice.lower() == 'y':\n timezone = input(\"Timezone (e.g. 'UTC'): \")\n else:\n timezone = 'UTC'\n\n body = {\n 'project_id': bq_project,\n 'dataset': dataset_id,\n 'start_date': {\n 'year': start[2],\n 'day': start[1],\n 'month': start[0]\n },\n 'end_date': {\n 'year': end[2],\n 'day': end[1],\n 'month': end[0]\n },\n 'time_zone': timezone\n }\n\n pprint(service.customers().exportJobHistory(customer=customer_id, body=body).execute())"]]