Collect DomainTools Iris Investigate results

This document explains how to ingest DomainTools Iris Investigate results into Google Security Operations using Amazon S3. The parser transforms raw JSON data from the DomainTools Iris API into a structured format that conforms to the Google SecOps Unified Data Model (UDM). It extracts information about domain details, contact information, security risks, SSL certificates, and other relevant attributes, and maps them to the corresponding UDM fields for consistent analysis and threat intelligence.

Before you begin

  • Google SecOps instance
  • Privileged access to DomainTools enterprise account (API access to Iris Investigate)
  • Privileged access to AWS (S3, IAM, Lambda, EventBridge)

Get DomainTools API Key and Endpoint

  1. Sign in to the DomainTools API Dashboard (only the API owner account can reset the API key).
  2. In the My Account section, select the View API Dashboard link located in the Account Summary tab.
  3. Go to the API Username section to obtain your username.
  4. In the same tab, locate your API Key.
  5. Copy and save the key in a secure location.
  6. If you need a new key, select Reset API Key.

  7. Note the Iris Investigate endpoint: https://api.domaintools.com/v1/iris-investigate/.
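
To confirm the credentials before wiring up AWS, you can query the endpoint directly. The following is a minimal sketch, assuming the same X-Api-Key/X-Api-Secret header authentication that the Lambda function later in this guide uses; if your DomainTools account uses a different authentication scheme, adjust it accordingly. The example.com lookup is only an illustrative query.

    #!/usr/bin/env python3
    """Quick connectivity check against the Iris Investigate endpoint (sketch)."""
    import json
    import urllib.parse
    import urllib.request

    API_KEY = "DT-XXXXXXXXXXXXXXXXXXXX"   # placeholder: your DomainTools API key
    API_SECRET = ""                        # placeholder: leave empty for key-only auth
    ENDPOINT = "https://api.domaintools.com/v1/iris-investigate/"

    params = {"domain": "example.com", "page_size": "1"}
    req = urllib.request.Request(ENDPOINT + "?" + urllib.parse.urlencode(params))
    req.add_header("X-Api-Key", API_KEY)
    req.add_header("Accept", "application/json")
    if API_SECRET:
        req.add_header("X-Api-Secret", API_SECRET)

    # A successful call returns HTTP 200 with a "response" object containing "results".
    with urllib.request.urlopen(req, timeout=60) as resp:
        data = json.loads(resp.read().decode("utf-8"))
    print(len(data.get("response", {}).get("results", [])), "result(s) returned")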

Configure AWS S3 bucket and IAM for Google SecOps

  1. Create an Amazon S3 bucket following this user guide: Creating a bucket
  2. Save the bucket Name and Region for future reference (for example, domaintools-iris).
  3. Create a user following this user guide: Creating an IAM user.
  4. Select the created User.
  5. Select the Security credentials tab.
  6. Click Create Access Key in the Access Keys section.
  7. Select Third-party service as the Use case.
  8. Click Next.
  9. Optional: add a description tag.
  10. Click Create access key.
  11. Click Download CSV file to save the Access Key and Secret Access Key for later use.
  12. Click Done.
  13. Select the Permissions tab.
  14. Click Add permissions in the Permissions policies section.
  15. Select Add permissions.
  16. Select Attach policies directly.
  17. Search for and select the AmazonS3FullAccess policy.
  18. Click Next.
  19. Click Add permissions.
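
Optionally, verify that the new access key can write to the bucket before continuing. The following is a minimal boto3 sketch; the bucket name, region, and credentials are placeholders for the values you saved above.

    import boto3

    # Placeholders: use the bucket, region, and access key pair you saved above.
    s3 = boto3.client(
        "s3",
        region_name="us-east-1",
        aws_access_key_id="AKIA...",
        aws_secret_access_key="...",
    )

    # Write, read back, and remove a small test object.
    s3.put_object(Bucket="domaintools-iris", Key="connectivity-test.txt", Body=b"ok")
    print(s3.get_object(Bucket="domaintools-iris", Key="connectivity-test.txt")["Body"].read())
    s3.delete_object(Bucket="domaintools-iris", Key="connectivity-test.txt")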

Configure the IAM policy and role for S3 uploads

  1. In the AWS console, go to IAM > Policies > Create policy > JSON tab.
  2. Enter the following policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Sid": "AllowPutDomainToolsIrisObjects",
          "Effect": "Allow",
          "Action": "s3:PutObject",
          "Resource": "arn:aws:s3:::domaintools-iris/*"
        }
      ]
    }
     
    
    • Replace domaintools-iris if you entered a different bucket name.
  3. Click Next > Create policy.

  4. Go to IAM > Roles > Create role > AWS service > Lambda.

  5. Attach the newly created policy.

  6. Name the role WriteDomainToolsIrisToS3Role and click Create role.
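
If you prefer to script these steps instead of using the console, the following boto3 sketch creates an equivalent policy and role. The policy name WriteDomainToolsIrisToS3Policy is an assumed label, and the optional AWSLambdaBasicExecutionRole attachment only adds CloudWatch logging permissions.

    import json
    import boto3

    iam = boto3.client("iam")
    bucket = "domaintools-iris"  # replace if you used a different bucket name

    # Create the S3 PutObject policy shown above.
    policy = iam.create_policy(
        PolicyName="WriteDomainToolsIrisToS3Policy",  # assumed name
        PolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Sid": "AllowPutDomainToolsIrisObjects",
                "Effect": "Allow",
                "Action": "s3:PutObject",
                "Resource": f"arn:aws:s3:::{bucket}/*",
            }],
        }),
    )

    # Create the Lambda execution role and attach the policy.
    iam.create_role(
        RoleName="WriteDomainToolsIrisToS3Role",
        AssumeRolePolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [{
                "Effect": "Allow",
                "Principal": {"Service": "lambda.amazonaws.com"},
                "Action": "sts:AssumeRole",
            }],
        }),
    )
    iam.attach_role_policy(
        RoleName="WriteDomainToolsIrisToS3Role",
        PolicyArn=policy["Policy"]["Arn"],
    )
    # Optional: let the function write CloudWatch logs.
    iam.attach_role_policy(
        RoleName="WriteDomainToolsIrisToS3Role",
        PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
    )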

Create the Lambda function

  1. In the AWS Console, go to Lambda > Functions > Create function.
  2. Click Author from scratch.
  3. Provide the following configuration details:

    Setting Value
    Name domaintools_iris_to_s3
    Runtime Python 3.13
    Architecture x86_64
    Execution role WriteDomainToolsIrisToS3Role
  4. After the function is created, open the Code tab, delete the stub, and enter the following code (domaintools_iris_to_s3.py):

    #!/usr/bin/env python3
    # Lambda: Pull DomainTools Iris Investigate results to S3 (no transform)
    import os, json, time, urllib.parse
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError
    import boto3

    # --- Environment ---
    S3_BUCKET = os.environ["S3_BUCKET"].strip()
    S3_PREFIX = os.environ.get("S3_PREFIX", "domaintools/iris/").strip()
    STATE_KEY = os.environ.get("STATE_KEY", "domaintools/iris/state.json").strip()
    DT_API_KEY = os.environ["DT_API_KEY"].strip()
    DT_API_SECRET = os.environ.get("DT_API_SECRET", "").strip()  # optional if your account uses key-only auth
    USE_MODE = os.environ.get("USE_MODE", "HASH").strip().upper()  # HASH | DOMAINS | QUERY
    SEARCH_HASHES = [h.strip() for h in os.environ.get("SEARCH_HASHES", "").split(";") if h.strip()]
    DOMAINS = [d.strip() for d in os.environ.get("DOMAINS", "").split(";") if d.strip()]
    QUERY_LIST = [q.strip() for q in os.environ.get("QUERY_LIST", "").split(";") if q.strip()]
    PAGE_SIZE = int(os.environ.get("PAGE_SIZE", "500"))
    MAX_PAGES = int(os.environ.get("MAX_PAGES", "20"))
    USE_NEXT = os.environ.get("USE_NEXT", "true").lower() == "true"
    HTTP_TIMEOUT = int(os.environ.get("HTTP_TIMEOUT", "60"))
    RETRIES = int(os.environ.get("HTTP_RETRIES", "2"))

    BASE_URL = "https://api.domaintools.com/v1/iris-investigate/"
    HDRS = {
        "X-Api-Key": DT_API_KEY,
        "Accept": "application/json",
    }
    if DT_API_SECRET:
        HDRS["X-Api-Secret"] = DT_API_SECRET

    s3 = boto3.client("s3")

    # --- HTTP helpers ---
    def _http_get(url: str) -> dict:
        req = Request(url, method="GET")
        for k, v in HDRS.items():
            req.add_header(k, v)
        attempt = 0
        while True:
            try:
                with urlopen(req, timeout=HTTP_TIMEOUT) as r:
                    return json.loads(r.read().decode("utf-8"))
            except HTTPError as e:
                if e.code in (429, 500, 502, 503, 504) and attempt < RETRIES:
                    delay = int(e.headers.get("Retry-After", "2"))
                    time.sleep(max(1, delay))
                    attempt += 1
                    continue
                raise

    def _build_url(params: dict) -> str:
        return BASE_URL + ("?" + urllib.parse.urlencode(params, doseq=True) if params else "")

    # --- S3 helpers ---
    def _write_page(obj: dict, label: str, page: int) -> str:
        ts = time.strftime("%Y/%m/%d/%H%M%S", time.gmtime())
        key = f"{S3_PREFIX.rstrip('/')}/{ts}-{label}-p{page:05d}.json"
        s3.put_object(
            Bucket=S3_BUCKET,
            Key=key,
            Body=json.dumps(obj, separators=(",", ":")).encode("utf-8"),
            ContentType="application/json",
        )
        return key

    # --- Iris paging ---
    def _first_page_params() -> dict:
        params: dict[str, object] = {"page_size": str(PAGE_SIZE)}
        if USE_NEXT:
            params["next"] = "true"
        return params

    def _paginate(label: str, params: dict) -> tuple[int, int]:
        pages = 0
        total = 0
        url = _build_url(params)
        while pages < MAX_PAGES:
            data = _http_get(url)
            _write_page(data, label, pages)
            resp = data.get("response") or {}
            results = resp.get("results") or []
            total += len(results)
            pages += 1
            # Prefer `next` absolute URL if present
            next_url = resp.get("next") if isinstance(resp, dict) else None
            if next_url:
                url = next_url
                continue
            # Fallback: position pager when `next=true` not used/supported
            if resp.get("has_more_results") and resp.get("position"):
                base = _first_page_params()
                base.pop("next", None)
                base["position"] = resp["position"]
                url = _build_url(base)
                continue
            break
        return pages, total

    # --- Mode runners ---
    def run_hashes(hashes: list[str]) -> dict:
        agg_pages = agg_results = 0
        for h in hashes:
            params = _first_page_params()
            params["search_hash"] = h
            p, r = _paginate(f"hash-{h}", params)
            agg_pages += p
            agg_results += r
        return {"pages": agg_pages, "results": agg_results}

    def run_domains(domains: list[str]) -> dict:
        agg_pages = agg_results = 0
        for d in domains:
            params = _first_page_params()
            # DomainTools accepts `domain` as a filter in Investigate search
            params["domain"] = d
            p, r = _paginate(f"domain-{d}", params)
            agg_pages += p
            agg_results += r
        return {"pages": agg_pages, "results": agg_results}

    def run_queries(queries: list[str]) -> dict:
        agg_pages = agg_results = 0
        for q in queries:
            # Merge arbitrary k=v pairs from the query string
            base = _first_page_params()
            for k, v in urllib.parse.parse_qsl(q, keep_blank_values=True):
                base.setdefault(k, v)
            p, r = _paginate(f"query-{q.replace('=', '-')}", base)
            agg_pages += p
            agg_results += r
        return {"pages": agg_pages, "results": agg_results}

    # --- Entry point ---
    def lambda_handler(event=None, context=None):
        if USE_MODE == "HASH" and SEARCH_HASHES:
            res = run_hashes(SEARCH_HASHES)
        elif USE_MODE == "DOMAINS" and DOMAINS:
            res = run_domains(DOMAINS)
        elif USE_MODE == "QUERY" and QUERY_LIST:
            res = run_queries(QUERY_LIST)
        else:
            raise ValueError("Invalid USE_MODE or missing parameters. Set USE_MODE to HASH | DOMAINS | QUERY and provide SEARCH_HASHES | DOMAINS | QUERY_LIST accordingly.")
        return {"ok": True, "mode": USE_MODE, **res}

    if __name__ == "__main__":
        print(json.dumps(lambda_handler(), indent=2))
     
    
  5. Go to Configuration > Environment variables > Edit > Add new environment variable.

  6. Enter the following environment variables, replacing the example values with your own:

    • S3_BUCKET (example: domaintools-iris): S3 bucket name where data will be stored.
    • S3_PREFIX (example: domaintools/iris/): Optional S3 prefix (subfolder) for objects.
    • STATE_KEY (example: domaintools/iris/state.json): Optional state/checkpoint file key.
    • DT_API_KEY (example: DT-XXXXXXXXXXXXXXXXXXXX): DomainTools API key.
    • DT_API_SECRET (example: YYYYYYYYYYYYYYYYYYYYYYYY): DomainTools API secret (if applicable).
    • USE_MODE (HASH | DOMAINS | QUERY): Selects which mode is active (only one at a time).
    • SEARCH_HASHES (example: hash1;hash2;hash3): Required if USE_MODE=HASH. Semicolon-separated list of saved search hashes from the Iris UI.
    • DOMAINS (example: example.com;domaintools.com): Required if USE_MODE=DOMAINS. Semicolon-separated list of domains.
    • QUERY_LIST (example: ip=1.1.1.1;ip=8.8.8.8;domain=example.org): Required if USE_MODE=QUERY. Semicolon-separated list of query strings (k=v&k2=v2).
    • PAGE_SIZE (example: 500): Rows per page (default 500).
    • MAX_PAGES (example: 20): Maximum pages fetched per query (default 20).
  7. After the function is created, stay on its page (or open Lambda > Functions > your-function).

  8. Select the Configuration tab.

  9. In the General configuration panel, click Edit.

  10. Change Timeout to 15 minutes (900 seconds) and click Save. A smoke-test sketch for the deployed function follows this list.
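
One way to run that smoke test is with boto3: invoke the function synchronously and confirm that JSON pages appear under the configured prefix. The function name, bucket, and prefix below are the ones used in this guide; adjust them if yours differ.

    import json
    import boto3

    # Invoke the function and print its return value
    # (expected shape: {"ok": true, "mode": ..., "pages": ..., "results": ...}).
    lam = boto3.client("lambda")
    resp = lam.invoke(FunctionName="domaintools_iris_to_s3", InvocationType="RequestResponse")
    print(json.loads(resp["Payload"].read()))

    # List objects written under the prefix; keys are time-based
    # (<prefix>/YYYY/MM/DD/HHMMSS-<label>-pNNNNN.json), so the last ones are the newest.
    s3 = boto3.client("s3")
    listing = s3.list_objects_v2(Bucket="domaintools-iris", Prefix="domaintools/iris/")
    for obj in listing.get("Contents", [])[-5:]:
        print(obj["Key"], obj["Size"])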

Create an EventBridge schedule

  1. Go to Amazon EventBridge > Scheduler > Create schedule.
  2. Provide the following configuration details:
    • Recurring schedule: Rate (1 hour).
    • Target: your Lambda function.
    • Name: domaintools-iris-1h.
  3. Click Create schedule.
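
The same schedule can also be created with the EventBridge Scheduler API, as in the boto3 sketch below. Both ARNs are placeholders: use your Lambda function ARN and an existing IAM role that EventBridge Scheduler is allowed to assume to invoke it.

    import boto3

    scheduler = boto3.client("scheduler")
    scheduler.create_schedule(
        Name="domaintools-iris-1h",
        ScheduleExpression="rate(1 hour)",
        FlexibleTimeWindow={"Mode": "OFF"},
        Target={
            # Placeholder ARNs; substitute your function and scheduler execution role.
            "Arn": "arn:aws:lambda:us-east-1:123456789012:function:domaintools_iris_to_s3",
            "RoleArn": "arn:aws:iam::123456789012:role/EventBridgeSchedulerInvokeLambda",
        },
    )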

Optional: Create read-only IAM user & keys for Google SecOps

  1. In the AWS Console, go to IAM > Users, then click Add users.
  2. Provide the following configuration details:
    • User: Enter a unique name (for example, secops-reader)
    • Access type: Select Access key - Programmatic access
    • Click Create user.
  3. Attach minimal read policy (custom): Users > select secops-reader > Permissions > Add permissions > Attach policies directly > Create policy
  4. In the JSON editor, enter the following policy:

    {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject"],
          "Resource": "arn:aws:s3:::<your-bucket>/*"
        },
        {
          "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": "arn:aws:s3:::<your-bucket>"
        }
      ]
    }
     
    
  5. Set the name to secops-reader-policy.

  6. Click Create policy, then return to the Add permissions flow, search for and select secops-reader-policy, and click Next > Add permissions.

  7. Go to Security credentials > Access keys > Create access key.

  8. Download the CSV file (these values are entered into the feed).
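
Before configuring the feed, you can check that the reader credentials can list and fetch objects. A minimal sketch, assuming the bucket and prefix used earlier and the secops-reader key pair from the downloaded CSV (shown as placeholders):

    import boto3

    # Placeholders: use the secops-reader access key pair from the downloaded CSV.
    s3 = boto3.client(
        "s3",
        aws_access_key_id="AKIA...",
        aws_secret_access_key="...",
    )

    bucket = "domaintools-iris"
    listing = s3.list_objects_v2(Bucket=bucket, Prefix="domaintools/iris/", MaxKeys=5)
    for obj in listing.get("Contents", []):
        body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
        print(obj["Key"], len(body), "bytes")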

Configure a feed in Google SecOps to ingest DomainTools Iris Investigate results

  1. Go to SIEM Settings > Feeds.
  2. Click Add New Feed.
  3. In the Feed name field, enter a name for the feed (for example, DomainTools Iris Investigate).
  4. Select Amazon S3 V2 as the Source type.
  5. Select DomainTools Threat Intelligence as the Log type.
  6. Click Next.
  7. Specify values for the following input parameters:
    • S3 URI: s3://domaintools-iris/domaintools/iris/
    • Source deletion options: Select the deletion option according to your preference.
    • Maximum File Age: Default 180 Days.
    • Access Key ID: User access key with access to the S3 bucket.
    • Secret Access Key: User secret key with access to the S3 bucket.
    • Asset namespace: domaintools.threat_intel
    • Ingestion labels: The label to be applied to the events from this feed.
  8. Click Next.
  9. Review your new feed configuration in the Finalize screen, and then click Submit.

UDM Mapping Table

Log Field UDM Mapping Logic
active
principal.domain.status Directly mapped from the active field in the raw log.
additional_whois_email.[].value
about.labels.additional_whois_email Extracted from additional_whois_email array and added as a label in the about object.
adsense.value
about.labels.adsense Extracted from adsense.value and added as a label in the about object.
admin_contact.city.value
principal.domain.admin.office_address.city Directly mapped from the admin_contact.city.value field in the raw log.
admin_contact.country.value
principal.domain.admin.office_address.country_or_region Directly mapped from the admin_contact.country.value field in the raw log.
admin_contact.email.[].value
principal.domain.admin.email_addresses Extracted from admin_contact.email array and added to the email_addresses field.
admin_contact.fax.value
principal.domain.admin.attribute.labels.fax Extracted from admin_contact.fax.value and added as a label with key "fax" in the admin attribute.
admin_contact.name.value
principal.domain.admin.user_display_name Directly mapped from the admin_contact.name.value field in the raw log.
admin_contact.org.value
principal.domain.admin.company_name Directly mapped from the admin_contact.org.value field in the raw log.
admin_contact.phone.value
principal.domain.admin.phone_numbers Directly mapped from the admin_contact.phone.value field in the raw log.
admin_contact.postal.value
principal.domain.admin.attribute.labels.postal Extracted from admin_contact.postal.value and added as a label with key "postal" in the admin attribute.
admin_contact.state.value
principal.domain.admin.office_address.state Directly mapped from the admin_contact.state.value field in the raw log.
admin_contact.street.value
principal.domain.admin.office_address.name Directly mapped from the admin_contact.street.value field in the raw log.
alexa
about.labels.alexa Directly mapped from the alexa field in the raw log and added as a label in the about object.
baidu_codes.[].value
about.labels.baidu_codes Extracted from baidu_codes array and added as a label in the about object.
billing_contact.city.value
principal.domain.billing.office_address.city Directly mapped from the billing_contact.city.value field in the raw log.
billing_contact.country.value
principal.domain.billing.office_address.country_or_region Directly mapped from the billing_contact.country.value field in the raw log.
billing_contact.email.[].value
principal.domain.billing.email_addresses Extracted from billing_contact.email array and added to the email_addresses field.
billing_contact.fax.value
principal.domain.billing.attribute.labels.fax Extracted from billing_contact.fax.value and added as a label with key "fax" in the billing attribute.
billing_contact.name.value
principal.domain.billing.user_display_name Directly mapped from the billing_contact.name.value field in the raw log.
billing_contact.org.value
principal.domain.billing.company_name Directly mapped from the billing_contact.org.value field in the raw log.
billing_contact.phone.value
principal.domain.billing.phone_numbers Directly mapped from the billing_contact.phone.value field in the raw log.
billing_contact.postal.value
principal.domain.billing.attribute.labels.postal Extracted from billing_contact.postal.value and added as a label with key "postal" in the billing attribute.
billing_contact.state.value
principal.domain.billing.office_address.state Directly mapped from the billing_contact.state.value field in the raw log.
billing_contact.street.value
principal.domain.billing.office_address.name Directly mapped from the billing_contact.street.value field in the raw log.
create_date.value
principal.domain.creation_time Converted to timestamp format from the create_date.value field in the raw log.
data_updated_timestamp
principal.domain.audit_update_time Converted to timestamp format from the data_updated_timestamp field in the raw log.
domain
principal.hostname Directly mapped from the domain field in the raw log.
domain_risk.components.[].evidence
security_result.detection_fields.evidence Extracted from domain_risk.components.[].evidence array and added as a detection field with key "evidence" in the security_result object.
domain_risk.components.[].name
security_result.category_details Directly mapped from the domain_risk.components.[].name field in the raw log.
domain_risk.components.[].risk_score
security_result.risk_score Directly mapped from the domain_risk.components.[].risk_score field in the raw log.
domain_risk.components.[].threats
security_result.threat_name The first element of the domain_risk.components.[].threats array is mapped to security_result.threat_name .
domain_risk.components.[].threats
security_result.detection_fields.threats The remaining elements of the domain_risk.components.[].threats array are added as detection fields with key "threats" in the security_result object.
domain_risk.risk_score
security_result.risk_score Directly mapped from the domain_risk.risk_score field in the raw log.
email_domain.[].value
about.labels.email_domain Extracted from email_domain array and added as a label in the about object.
expiration_date.value
principal.domain.expiration_time Converted to timestamp format from the expiration_date.value field in the raw log.
fb_codes.[].value
about.labels.fb_codes Extracted from fb_codes array and added as a label in the about object.
first_seen.value
principal.domain.first_seen_time Converted to timestamp format from the first_seen.value field in the raw log.
ga4.[].value
about.labels.ga4 Extracted from ga4 array and added as a label in the about object.
google_analytics.value
about.labels.google_analytics Extracted from google_analytics.value and added as a label in the about object.
gtm_codes.[].value
about.labels.gtm_codes Extracted from gtm_codes array and added as a label in the about object.
hotjar_codes.[].value
about.labels.hotjar_codes Extracted from hotjar_codes array and added as a label in the about object.
ip.[].address.value
principal.ip The first element of the ip array is mapped to principal.ip .
ip.[].address.value
about.labels.ip_address The remaining elements of the ip array are added as labels with key "ip_address" in the about object.
ip.[].asn.[].value
network.asn The first element of the first ip.asn array is mapped to network.asn .
ip.[].asn.[].value
about.labels.asn The remaining elements of the ip.asn arrays are added as labels with key "asn" in the about object.
ip.[].country_code.value
principal.location.country_or_region The country_code.value of the first element in the ip array is mapped to principal.location.country_or_region .
ip.[].country_code.value
about.location.country_or_region The country_code.value of the remaining elements in the ip array are mapped to about.location.country_or_region .
ip.[].isp.value
principal.labels.isp The isp.value of the first element in the ip array is mapped to principal.labels.isp .
ip.[].isp.value
about.labels.isp The isp.value of the remaining elements in the ip array are mapped to about.labels.isp .
matomo_codes.[].value
about.labels.matomo_codes Extracted from matomo_codes array and added as a label in the about object.
monitor_domain
about.labels.monitor_domain Directly mapped from the monitor_domain field in the raw log and added as a label in the about object.
monitoring_domain_list_name
about.labels.monitoring_domain_list_name Directly mapped from the monitoring_domain_list_name field in the raw log and added as a label in the about object.
mx.[].domain.value
about.domain.name Directly mapped from the mx.[].domain.value field in the raw log.
mx.[].host.value
about.hostname Directly mapped from the mx.[].host.value field in the raw log.
mx.[].ip.[].value
about.ip Extracted from mx.[].ip array and added to the ip field.
mx.[].priority
about.security_result.priority_details Directly mapped from the mx.[].priority field in the raw log.
name_server.[].domain.value
about.labels.name_server_domain Extracted from name_server.[].domain.value and added as a label with key "name_server_domain" in the about object.
name_server.[].host.value
principal.domain.name_server Extracted from name_server.[].host.value and added to the name_server field.
name_server.[].host.value
about.domain.name_server Extracted from name_server.[].host.value and added to the name_server field.
name_server.[].ip.[].value
about.labels.ip Extracted from name_server.[].ip array and added as a label with key "ip" in the about object.
popularity_rank
about.labels.popularity_rank Directly mapped from the popularity_rank field in the raw log and added as a label in the about object.
redirect.value
about.labels.redirect Extracted from redirect.value and added as a label in the about object.
redirect_domain.value
about.labels.redirect_domain Extracted from redirect_domain.value and added as a label in the about object.
registrant_contact.city.value
principal.domain.registrant.office_address.city Directly mapped from the registrant_contact.city.value field in the raw log.
registrant_contact.country.value
principal.domain.registrant.office_address.country_or_region Directly mapped from the registrant_contact.country.value field in the raw log.
registrant_contact.email.[].value
principal.domain.registrant.email_addresses Extracted from registrant_contact.email array and added to the email_addresses field.
registrant_contact.fax.value
principal.domain.registrant.attribute.labels.fax Extracted from registrant_contact.fax.value and added as a label with key "fax" in the registrant attribute.
registrant_contact.name.value
principal.domain.registrant.user_display_name Directly mapped from the registrant_contact.name.value field in the raw log.
registrant_contact.org.value
principal.domain.registrant.company_name Directly mapped from the registrant_contact.org.value field in the raw log.
registrant_contact.phone.value
principal.domain.registrant.phone_numbers Directly mapped from the registrant_contact.phone.value field in the raw log.
registrant_contact.postal.value
principal.domain.registrant.attribute.labels.postal Extracted from registrant_contact.postal.value and added as a label with key "postal" in the registrant attribute.
registrant_contact.state.value
principal.domain.registrant.office_address.state Directly mapped from the registrant_contact.state.value field in the raw log.
registrant_contact.street.value
principal.domain.registrant.office_address.name Directly mapped from the registrant_contact.street.value field in the raw log.
registrant_name.value
about.labels.registrant_name Extracted from registrant_name.value and added as a label in the about object.
registrant_org.value
about.labels.registrant_org Extracted from registrant_org.value and added as a label in the about object.
registrar.value
principal.domain.registrar Directly mapped from the registrar.value field in the raw log.
registrar_status
about.labels.registrar_status Extracted from registrar_status array and added as a label in the about object.
server_type
network.tls.client.server_name Directly mapped from the server_type field in the raw log.
soa_email.[].value
principal.user.email_addresses Extracted from soa_email array and added to the email_addresses field.
spf_info
about.labels.spf_info Directly mapped from the spf_info field in the raw log and added as a label in the about object.
ssl_email.[].value
about.labels.ssl_email Extracted from ssl_email array and added as a label in the about object.
ssl_info.[].alt_names.[].value
about.labels.alt_names Extracted from ssl_info.[].alt_names array and added as a label in the about object.
ssl_info.[].common_name.value
about.labels.common_name Extracted from ssl_info.[].common_name.value and added as a label in the about object.
ssl_info.[].duration.value
about.labels.duration Extracted from ssl_info.[].duration.value and added as a label in the about object.
ssl_info.[].email.[].value
about.labels.ssl_info_email Extracted from ssl_info.[].email array and added as a label with key "ssl_info_email" in the about object.
ssl_info.[].hash.value
network.tls.server.certificate.sha1 The hash.value of the first element in the ssl_info array is mapped to network.tls.server.certificate.sha1 .
ssl_info.[].hash.value
about.labels.hash The hash.value of the remaining elements in the ssl_info array are mapped to about.labels.hash .
ssl_info.[].issuer_common_name.value
network.tls.server.certificate.issuer The issuer_common_name.value of the first element in the ssl_info array is mapped to network.tls.server.certificate.issuer .
ssl_info.[].issuer_common_name.value
about.labels.issuer_common_name The issuer_common_name.value of the remaining elements in the ssl_info array are mapped to about.labels.issuer_common_name .
ssl_info.[].not_after.value
network.tls.server.certificate.not_after The not_after.value of the first element in the ssl_info array is converted to timestamp format and mapped to network.tls.server.certificate.not_after .
ssl_info.[].not_after.value
about.labels.not_after The not_after.value of the remaining elements in the ssl_info array are mapped to about.labels.not_after .
ssl_info.[].not_before.value
network.tls.server.certificate.not_before The not_before.value of the first element in the ssl_info array is converted to timestamp format and mapped to network.tls.server.certificate.not_before .
ssl_info.[].not_before.value
about.labels.not_before The not_before.value of the remaining elements in the ssl_info array are mapped to about.labels.not_before .
ssl_info.[].organization.value
network.organization_name The organization.value of the first element in the ssl_info array is mapped to network.organization_name .
ssl_info.[].organization.value
about.labels.organization The organization.value of the remaining elements in the ssl_info array are mapped to about.labels.organization .
ssl_info.[].subject.value
about.labels.subject Extracted from ssl_info.[].subject.value and added as a label in the about object.
statcounter_project_codes.[].value
about.labels.statcounter_project_codes Extracted from statcounter_project_codes array and added as a label in the about object.
statcounter_security_codes.[].value
about.labels.statcounter_security_codes Extracted from statcounter_security_codes array and added as a label in the about object.
tags.[].label
about.file.tags Extracted from tags.[].label and added to the tags field.
tags.[].scope
security_result.detection_fields.scope Extracted from tags.[].scope and added as a detection field with key "scope" in the security_result object.
tags.[].tagged_at
security_result.detection_fields.tagged_at Extracted from tags.[].tagged_at and added as a detection field with key "tagged_at" in the security_result object.
technical_contact.city.value
principal.domain.tech.office_address.city Directly mapped from the technical_contact.city.value field in the raw log.
technical_contact.country.value
principal.domain.tech.office_address.country_or_region Directly mapped from the technical_contact.country.value field in the raw log.
technical_contact.email.[].value
principal.domain.tech.email_addresses Extracted from technical_contact.email array and added to the email_addresses field.
technical_contact.fax.value
principal.domain.tech.attribute.labels.fax Extracted from technical_contact.fax.value and added as a label with key "fax" in the tech attribute.
technical_contact.name.value
principal.domain.tech.user_display_name Directly mapped from the technical_contact.name.value field in the raw log.
technical_contact.org.value
principal.domain.tech.company_name Directly mapped from the technical_contact.org.value field in the raw log.
technical_contact.phone.value
principal.domain.tech.phone_numbers Directly mapped from the technical_contact.phone.value field in the raw log.
technical_contact.postal.value
principal.domain.tech.attribute.labels.postal Extracted from technical_contact.postal.value and added as a label with key "postal" in the tech attribute.
technical_contact.state.value
principal.domain.tech.office_address.state Directly mapped from the technical_contact.state.value field in the raw log.
technical_contact.street.value
principal.domain.tech.office_address.name Directly mapped from the technical_contact.street.value field in the raw log.
tld
about.labels.tld Directly mapped from the tld field in the raw log and added as a label in the about object.
timestamp
about.labels.timestamp Directly mapped from the timestamp field in the raw log and added as a label in the about object.
website_response
principal.network.http.response_code Directly mapped from the website_response field in the raw log.
website_title
about.labels.website_title Directly mapped from the website_title field in the raw log and added as a label in the about object.
whois_url
principal.domain.whois_server Directly mapped from the whois_url field in the raw log.
yandex_codes.[].value
about.labels.yandex_codes Extracted from yandex_codes array and added as a label in the about object.
edr.client.hostname Set to the value of the domain field.
edr.client.ip_addresses Set to the value of the first element in the ip array, specifically ip.[0].address.value .
edr.raw_event_name Set to "STATUS_UPDATE" if principal.hostname is present, otherwise set to "GENERIC_EVENT".
metadata.event_timestamp Copied from the top-level create_time field in the raw log.
metadata.event_type Set to "STATUS_UPDATE" if principal.hostname is present, otherwise set to "GENERIC_EVENT".
metadata.log_type Set to "DOMAINTOOLS_THREATINTEL".
metadata.product_name Set to "DOMAINTOOLS".
metadata.vendor_name Set to "DOMAINTOOLS".

Need more help? Get answers from Community members and Google SecOps professionals.
