Tool: list_data_tables
List data tables in Chronicle SIEM.
Retrieves a list of data tables available in the Chronicle SIEM instance. This is useful for discovering available tables, auditing their configuration, and managing security context data.
Agent Responsibilities:
- Parse the JSON response to extract the list from the dataTables key.
- Handle the nextPageToken for pagination to retrieve subsequent pages if they exist (a sketch follows below).
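A minimal sketch of that response handling, assuming the tool is exposed as a Python callable named list_data_tables that returns the raw JSON string described under Returns:

```python
import json

def fetch_all_data_tables(project_id: str, customer_id: str, region: str) -> list[dict]:
    """Collect every data table by following nextPageToken across pages."""
    tables: list[dict] = []
    page_token = None
    while True:
        # Assumed call signature; argument names mirror the Args section below.
        raw = list_data_tables(
            project_id=project_id,
            customer_id=customer_id,
            region=region,
            page_size=1000,
            page_token=page_token,
        )
        response = json.loads(raw)                  # the tool returns a raw JSON string
        tables.extend(response.get("dataTables", []))
        page_token = response.get("nextPageToken")  # absent on the last page
        if not page_token:
            break
    return tables
```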
Workflow Integration:
- Use to verify data table contents after creation or updates.
- Essential for auditing data quality and consistency in security context tables.
- Helps understand available data when developing or troubleshooting detection rules.
- Supports data governance by providing visibility into managed security datasets.
Use Cases:
- Review threat intelligence data before creating detection rules.
- Verify that asset inventory data is current and accurate.
- Audit user role mappings for consistency and completeness.
- Troubleshoot detection rule issues by examining referenced table data.
- Generate reports on security context data for compliance or operational reviews.
Args:
- project_id (str): Google Cloud project ID (required).
- customer_id (str): Chronicle customer ID (required).
- region (str): Chronicle region (e.g., "us", "europe") (required).
- page_size (Optional[int]): The maximum number of data tables to return per page. Defaults to 100, max 1000.
- page_token (Optional[str]): A page token from a previous call to fetch the next page.
Returns: str: Raw JSON response from the API (ListDataTablesResponse), containing a list of 'dataTables' and potentially a 'nextPageToken'.
Example Usage:

# List first page of data tables
list_data_tables(
    project_id="my-project",
    customer_id="my-customer",
    region="us",
    page_size=50
)

# List tables with full details
list_data_tables(
    project_id="my-project",
    customer_id="my-customer",
    region="us"
)
Next Steps (using MCP-enabled tools):
- Use list_data_table_rows to inspect the contents of a specific table (a chained sketch follows this list).
- Use create_data_table to add new tables.
- Use the nextPageToken to fetch more tables if available.
- Add more rows using add_rows_to_data_table if the table needs additional data.
- Delete specific rows using delete_data_table_rows if outdated or incorrect data is found.
- Reference the table data in detection rules to enhance security monitoring.
- Export the data for analysis or integration with other security tools.
- Set up regular reviews to maintain data quality and relevance.
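The chaining described above can be sketched as follows; the list_data_table_rows signature is an assumption here (its actual arguments are documented with that tool):

```python
import json

def inspect_table_rows(project_id: str, customer_id: str, region: str, display_name: str):
    """Find a data table by display name, then list its rows for review."""
    listing = json.loads(list_data_tables(
        project_id=project_id, customer_id=customer_id, region=region))
    table = next(
        (t for t in listing.get("dataTables", []) if t.get("displayName") == display_name),
        None,
    )
    if table is None:
        return None
    # Hypothetical signature for list_data_table_rows; consult that tool's
    # documentation for the exact parameter it expects (resource name vs. display name).
    return json.loads(list_data_table_rows(
        project_id=project_id,
        customer_id=customer_id,
        region=region,
        table_name=table["name"],
    ))
```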
The following sample demonstrates how to use curl to invoke the list_data_tables MCP tool.
| Curl Request |
|---|
| curl --location 'https://chronicle.googleapis.com/mcp' --header 'content-type: application/json' --header 'accept: application/json, text/event-stream' --data '{ "method": "tools/call", "params": { "name": "list_data_tables", "arguments": { // provide these details according to the MCP specification for this tool } }, "jsonrpc": "2.0", "id": 1 }' |
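The same JSON-RPC call can be issued from Python. This is an illustrative sketch only: the argument keys are taken from the Args section above, and authentication headers (omitted here) are required in practice.

```python
import requests

MCP_ENDPOINT = "https://chronicle.googleapis.com/mcp"

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "list_data_tables",
        "arguments": {
            # Argument names follow the Args section of this tool; values are placeholders.
            "project_id": "my-project",
            "customer_id": "my-customer",
            "region": "us",
            "page_size": 50,
        },
    },
}

response = requests.post(
    MCP_ENDPOINT,
    json=payload,
    headers={
        "content-type": "application/json",
        "accept": "application/json, text/event-stream",
    },
)
print(response.text)  # JSON body or event stream, depending on the accept header
```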
Input Schema
Request message for ListDataTables.
ListDataTablesRequest
| JSON representation |
|---|
| { "projectId": string, "customerId": string, "region": string, "pageSize": integer, "pageToken": string } |
| Fields | |
|---|---|
| projectId | Project ID of the customer. |
| customerId | Customer ID of the customer. |
| region | Region of the customer. |
| pageSize | Page size of the request. |
| pageToken | Page token of the request. |
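An illustrative ListDataTablesRequest instance with placeholder values, written as a Python dict to match the field names above:

```python
# Placeholder values; field names follow the Input Schema above.
list_data_tables_request = {
    "projectId": "my-project",
    "customerId": "my-customer",
    "region": "us",
    "pageSize": 100,
    "pageToken": "",  # empty or omitted on the first request
}
```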
Output Schema
Response message for listing data tables.
ListDataTablesResponse
| JSON representation |
|---|
| { "dataTables": [ { object (DataTable) } ], "nextPageToken": string } |
| Fields | |
|---|---|
| dataTables[] | The list of the data tables returned. |
| nextPageToken | A token, which can be sent as pageToken in a subsequent request to retrieve the next page. If this field is omitted, there are no subsequent pages. |
DataTable
| JSON representation |
|---|
| { "name": string, "displayName": string, "description": string, "createTime": string, "updateTime": string, "columnInfo": [ { object (DataTableColumnInfo) } ], … } |
| Fields | |
|---|---|
| name | Identifier. The resource name of the data table. Format: "{project}/locations/{region}/instances/{instance}/dataTables/{data_table}" |
| displayName | Output only. The unique display name of the data table. |
| description | Required. A user-provided description of the data table. |
| createTime | Output only. Table create time. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
| updateTime | Output only. Table update time. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
| columnInfo[] | Immutable. Details of all the columns in the table. |
| dataTableUuid | Output only. Data table unique ID. |
| rules[] | Output only. The resource names of the associated rules that use this data table. Format: projects/{project}/locations/{location}/instances/{instance}/rules/{rule}, where {rule} refers to the rule ID. |
| ruleAssociationsCount | Output only. The count of rules using the data table. |
| rowTimeToLive | Optional. User-provided TTL of the data table. |
| approximateRowCount | Output only. The count of rows in the data table. |
| scopeInfo | Optional. The scope info of the data table. During data table creation, if this field is not set, a data table without scopes (an unscoped table) will be created for a global user. For a scoped user, this field must be set. During data table update, if scope_info is requested to be updated, this field must be set. |
| updateSource | Output only. Source of the data table update. |
| rowTimeToLiveUpdateTime | Output only. Last update time of the TTL of the data table. Uses RFC 3339, where generated output will always be Z-normalized and use 0, 3, 6 or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z" or "2014-10-02T15:01:23+05:30". |
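A short sketch of pulling the audit-relevant DataTable fields above out of a parsed response (the response dict is assumed to come from json.loads on this tool's output):

```python
def summarize_tables(response: dict) -> list[dict]:
    """Extract audit-relevant fields from each DataTable in a parsed response."""
    return [
        {
            "name": table.get("name"),
            "displayName": table.get("displayName"),
            "description": table.get("description"),
            "approximateRowCount": table.get("approximateRowCount"),
            "ruleAssociationsCount": table.get("ruleAssociationsCount"),
            "rules": table.get("rules", []),
            "updateTime": table.get("updateTime"),
        }
        for table in response.get("dataTables", [])
    ]
```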
Timestamp
| JSON representation |
|---|
| { "seconds": string, "nanos": integer } |
| Fields | |
|---|---|
| seconds | Represents seconds of UTC time since Unix epoch 1970-01-01T00:00:00Z. Must be between -62135596800 and 253402300799 inclusive (which corresponds to 0001-01-01T00:00:00Z to 9999-12-31T23:59:59Z). |
| nanos | Non-negative fractions of a second at nanosecond resolution. This field is the nanosecond portion of the duration, not an alternative to seconds. Negative second values with fractions must still have non-negative nanos values that count forward in time. Must be between 0 and 999,999,999 inclusive. |
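Where a timestamp arrives in this seconds/nanos form rather than as an RFC 3339 string, it can be converted as in this sketch (note that Python's datetime only keeps microsecond precision):

```python
from datetime import datetime, timezone

def timestamp_to_datetime(ts: dict) -> datetime:
    """Convert a {"seconds": string, "nanos": integer} Timestamp to an aware datetime."""
    seconds = int(ts.get("seconds", 0))
    nanos = int(ts.get("nanos", 0))
    # nanos always count forward in time, so they are added after the whole seconds;
    # integer division by 1000 truncates nanoseconds to microseconds.
    return datetime.fromtimestamp(seconds, tz=timezone.utc).replace(microsecond=nanos // 1000)
```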
DataTableColumnInfo
| JSON representation |
|---|
| { "columnIndex": integer, "originalColumn": string, "keyColumn": boolean, "repeatedValues": boolean, // Union field path_or_type can be only one of the following: "mappedColumnPath": string, "columnType": enum (DataTableColumnType) // End of list of possible types for union field path_or_type. } |
| Fields | |
|---|---|
| columnIndex | integer. Required. Column index: 0, 1, 2, ... |
| originalColumn | string. Required. Original column name of the data table (present in the CSV header in case of creation of data tables using file uploads). It must satisfy the following requirements: starts with a letter; contains only letters, numbers, and underscores; must be unique and have length < 256. |
| keyColumn | boolean. Optional. Whether to include this column in the calculation of the row ID. If no columns have key_column = true, all columns will be included in the calculation of the row ID. |
| repeatedValues | boolean. Optional. Whether the column is a repeated values column. |
| Union field path_or_type. path_or_type can be only one of the following: | |
| mappedColumnPath | string. Entity proto field path that the column is mapped to. |
| columnType | enum (DataTableColumnType). Column type can be STRING, CIDR (e.g., 10.1.1.0/24), or REGEX. |
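As an illustration of the path_or_type union above, two hypothetical column definitions, one mapped to an entity field path and one carrying an explicit column type (all values are examples only):

```python
# Hypothetical columnInfo entries; per the path_or_type union, each column sets
# exactly one of mappedColumnPath or columnType.
mapped_column = {
    "columnIndex": 0,
    "originalColumn": "principal_ip",
    "keyColumn": True,
    "mappedColumnPath": "principal.ip",  # example entity proto field path
}

typed_column = {
    "columnIndex": 1,
    "originalColumn": "blocked_cidr",
    "repeatedValues": False,
    "columnType": "CIDR",  # STRING, CIDR, or REGEX
}
```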
DataTableScopeInfo
| JSON representation |
|---|
| { "dataAccessScopes": [ string ] } |
| Fields | |
|---|---|
| dataAccessScopes[] | Required. Contains the list of scope names of the data table. If the list is empty, the data table is treated as unscoped. The scope names should be full resource names of the format: "projects/{project}/locations/{location}/instances/{instance}/dataAccessScopes/{scope_name}" |
Tool Annotations
Destructive Hint: ❌ | Idempotent Hint: ✅ | Read Only Hint: ✅ | Open World Hint: ❌

