This parser extracts key information such as timestamps, user IDs, source IPs, actions, and object IDs from JSON and SYSLOG formatted logs. It uses grok patterns to match various log message formats, handling variations in structure, and populates a unified data model (UDM) with the extracted fields. The parser also categorizes events based on the presence of user or IP information.
Before you begin
Ensure that you have the following prerequisites:
- Google SecOps instance.
- Privileged access to Google Cloud IAM.
- Privileged access to Google Cloud Storage.
- Privileged access to Jenkins.
Create a Google Cloud Storage Bucket
1. Go to Cloud Storage.
2. Create a new bucket. Choose a unique name and appropriate region.
3. Ensure the bucket has proper access controls (for example, only authorized service accounts can write to it).
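If you prefer to script bucket creation, the following is a minimal sketch using the google-cloud-storage Python client; the project ID, bucket name, and region are placeholder values, so substitute your own.

```python
from google.cloud import storage

# Placeholder project, bucket, and region values; replace with your own.
client = storage.Client(project="my-project")
bucket = client.create_bucket("jenkins-logs-bucket", location="us-central1")
print(f"Created bucket {bucket.name} in {bucket.location}")
```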
Create a Google Cloud Service account
1. Go to IAM & Admin > Service Accounts.
2. Create a new service account. Give it a descriptive name (for example, jenkins-logs).
3. Grant the service account the Storage Object Creator role on the GCS bucket you created in the previous step.
4. Create a key for your service account. For details, see Create and delete service account keys (/iam/docs/keys-create-delete).
5. Download a JSON key file for the service account.

Note: Keep this file secure. You will need it for the Google OAuth Credentials plugin to create credentials.
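If you prefer to grant the bucket role programmatically instead of through the console, here is a minimal sketch using the google-cloud-storage Python client; the bucket name and service account email are placeholders, and the client relies on Application Default Credentials.

```python
from google.cloud import storage

# Placeholder names; replace with your bucket and service account email.
BUCKET_NAME = "jenkins-logs-bucket"
SERVICE_ACCOUNT = "jenkins-logs@my-project.iam.gserviceaccount.com"

# Uses the Application Default Credentials of whoever runs the script.
client = storage.Client()
bucket = client.bucket(BUCKET_NAME)

# Read the current bucket IAM policy, add a Storage Object Creator binding
# for the service account, and write the policy back.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append({
    "role": "roles/storage.objectCreator",
    "members": {f"serviceAccount:{SERVICE_ACCOUNT}"},
})
bucket.set_iam_policy(policy)
```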
Install Google Cloud Storage plugin in Jenkins

1. Go to Manage Jenkins > Plugins.
2. Select Available plugins.
3. Search for the Google Cloud Storage plugin.
4. Install the plugin and restart Jenkins if required.

Install Google OAuth Credentials plugin in Jenkins

1. Go to Manage Jenkins > Plugins.
2. Select Available plugins.
3. Search for the Google OAuth Credentials plugin.
4. Install the plugin and restart Jenkins if required.

Configure Jenkins to authenticate with Google Cloud

1. Go to Manage Jenkins > Credentials > System. You can use Global Credentials or add a new domain (recommended).
2. Click Add Credentials.
3. Kind: select Google Service Account from private key.
4. Project name: set a name for the credentials.
5. Upload the JSON key file you obtained during service account creation.
6. Click Create.

Configure Jenkins logs to upload to Google SecOps

1. In the Jenkins job configuration, add Google Storage Build Log Upload in post-build actions, with the following parameters:
   - Google Credentials: the name of the Google credentials you created in the previous step.
   - Log Name: the name of the file that stores the Jenkins build log, under the specified storage path.
   - Storage Location: the name of the bucket where you want to upload your logs. The bucket must be accessible to the service account you created.
2. Test the log upload.
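Before relying on the Jenkins post-build action, you can optionally confirm outside Jenkins that the downloaded key can write to the bucket. This is a hypothetical check; the key file name, bucket name, and object name are placeholders.

```python
from google.cloud import storage

# Authenticate with the downloaded JSON key and write a small test object.
client = storage.Client.from_service_account_json("jenkins-logs-key.json")
blob = client.bucket("jenkins-logs-bucket").blob("connectivity-check.txt")
blob.upload_from_string("ok")
print("Service account key can write to the bucket.")
```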
Set up feeds

To configure a feed, follow these steps:

1. Go to SIEM Settings > Feeds.
2. Click Add New Feed.
3. On the next page, click Configure a single feed.
4. In the Feed name field, enter a name for the feed; for example, Jenkins Logs.
5. Select Google Cloud Storage V2 as the Source type.
6. Select Jenkins as the Log type.
7. Click Get Service Account as the Chronicle Service Account.
8. Specify values for the following input parameters:
   - Storage Bucket URI: the Google Cloud Storage bucket URL, in gs://my-bucket/<value> format.
   - Source deletion options: select a deletion option according to your preference. If you select Delete transferred files or Delete transferred files and empty directories, make sure that you granted the appropriate permissions to the service account.
   - Maximum File Age: includes files modified in the last number of days. Default is 180 days.
9. Click Create Feed.

UDM Mapping Table
The parser applies the following logic when populating UDM fields:

- If data matches an IP address pattern, it maps to principal.ip. If it matches a username pattern, it maps to principal.user.userid. Otherwise, it maps to metadata.description.
- One extracted message segment is used to derive object and act. If a / is present, the segment is split into object and act. If » is present, it is likewise split into object and act. Otherwise, the segment is treated as act and potentially parsed further.
- Another extracted segment, if present, is initially mapped to metadata.description. If it contains "completed:", the value after it is extracted and mapped to security_result.action_details.

| Log field | UDM mapping | Logic |
| --- | --- | --- |
| object | target.asset.product_object_id | Extracted from msg1. Represents the object acted upon. |
| object_id | target.resource.attribute.labels.value | Extracted from object when a / is present. Represents a more specific object identifier. The key is hardcoded as "Plugin Name". |
| src_ip | principal.ip | Extracted from message or data. Represents the source IP address. |
| user | principal.user.userid | Extracted from message or data. Represents the user associated with the event. |
| | metadata.event_timestamp | Copied from the calculated @timestamp field. |
| | metadata.event_type | Determined by parser logic: set to USER_UNCATEGORIZED if user is present, STATUS_UNCATEGORIZED if src_ip is present, and GENERIC_EVENT otherwise. |
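As a reading aid, the event-type selection described in the last row can be sketched as follows; this illustrates the documented logic and is not the parser's actual code.

```python
def categorize_event(user: str | None, src_ip: str | None) -> str:
    """Illustrates the metadata.event_type logic described in the table."""
    if user:
        return "USER_UNCATEGORIZED"
    if src_ip:
        return "STATUS_UNCATEGORIZED"
    return "GENERIC_EVENT"
```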
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-09-04 UTC."],[[["\u003cp\u003eThis guide explains how to collect Jenkins logs and send them to Google SecOps for analysis, using a parser to extract key data from JSON and SYSLOG formats.\u003c/p\u003e\n"],["\u003cp\u003eThe process involves creating a Google Cloud Storage bucket and service account, installing the Google Cloud Storage and OAuth Credentials plugins in Jenkins, and configuring Jenkins to authenticate with Google Cloud.\u003c/p\u003e\n"],["\u003cp\u003eJenkins logs are uploaded to a specified storage location through the configuration of post-build actions and setting Google Cloud as a destination.\u003c/p\u003e\n"],["\u003cp\u003eA feed in Google SecOps is configured to ingest the uploaded Jenkins logs, specifying the source type, log type, and storage bucket URI, along with other parameters for data handling.\u003c/p\u003e\n"],["\u003cp\u003eThe parser will map the Jenkins logs data into the unified data model (UDM), specifying the mapping between the fields in Jenkins logs and UDM fields.\u003c/p\u003e\n"]]],[],null,["# Collect Jenkins logs\n====================\n\nSupported in: \nGoogle secops [SIEM](/chronicle/docs/secops/google-secops-siem-toc)\n| **Note:** This feature is covered by [Pre-GA Offerings Terms](https://chronicle.security/legal/service-terms/) of the Google Security Operations Service Specific Terms. Pre-GA features might have limited support, and changes to pre-GA features might not be compatible with other pre-GA versions. For more information, see the [Google SecOps Technical Support Service guidelines](https://chronicle.security/legal/technical-support-services-guidelines/) and the [Google SecOps Service Specific Terms](https://chronicle.security/legal/service-terms/).\n\nOverview\n--------\n\nThis parser extracts key information such as timestamps, user IDs, source IPs, actions, and object IDs from JSON and SYSLOG formatted logs. It uses grok patterns to match various log message formats, handling variations in structure, and populates a unified data model (UDM) with the extracted fields. The parser also categorizes events based on the presence of user or IP information.\n\nBefore you begin\n----------------\n\nEnsure that you have the following prerequisites:\n\n- Google SecOps instance.\n- Privileged access to Google Cloud IAM.\n- Privileged access to Google Cloud Storage.\n- Privileged access to Jenkins.\n\nCreate a Google Cloud Storage Bucket\n------------------------------------\n\n1. Go to **Cloud Storage**.\n2. Create a new bucket. Choose a unique name and appropriate region.\n3. Ensure the bucket has proper access controls (for example, only authorized service accounts can write to it).\n\nCreate a Google Cloud Service account\n-------------------------------------\n\n1. Go to **IAM \\& Admin** \\\u003e **Service Accounts**.\n2. Create a new service account. Give it a descriptive name (for example, **jenkins-logs**).\n3. Grant the service account the **Storage Object Creator** role on the GCS bucket you created in the previous step.\n4. 
Create an SSH key for your service account: [Create and delete service account keys](/iam/docs/keys-create-delete).\n5. Download a JSON key file for the service account.\n\n | **Note:** Keep this file secure. You will need it for the **Google OAuth Credentials** plugin to create credentials.\n\nInstall Google Cloud Storage plugin in Jenkins\n----------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Plugins**.\n2. Select **Available plugins**.\n3. Search for the **Google Cloud Storage** plugin.\n4. Install the plugin and restart Jenkins if required.\n\nInstall Google OAuth Credentials Plugin in Jenkins\n--------------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Plugins**.\n2. Select **Available plugins**\n3. Search for the **Google OAuth Credentials** plugin.\n4. Install the plugin and restart Jenkins if required.\n\nConfigure Jenkins to authenticate with Google Cloud\n---------------------------------------------------\n\n1. Go to **Manage Jenkins** \\\u003e **Credentials** \\\u003e **System**.\n\n | **Note:** You can use **Global Credentials** or add a new domain (recommended).\n2. Click add **Add Credentials**.\n\n3. **Kind** : select **Google Service Account from private key**.\n\n4. **Project name**: set a name for the credentials.\n\n5. Upload the JSON key file you obtained during the Google Cloud Service account creation.\n\n6. Click **Create**.\n\nConfigure Jenkins logs to upload Google SecOps\n----------------------------------------------\n\n1. In the Jenkins job configuration, add **Google Storage Build Log Upload** in post-build actions, with the following parameters:\n - **Google Credentials**: The name of your Google credentials you created in the previous step.\n - **Log Name**: The name of the file to store the Jenkins build log, under the specified storage path.\n - **Storage Location**: The name of the bucket where you want to upload your logs. The bucket must be accessible to the service account you created.\n2. Test the log upload.\n\nSet up feeds\n------------\n\nTo configure a feed, follow these steps:\n\n1. Go to **SIEM Settings** \\\u003e **Feeds**.\n2. Click **Add New Feed**.\n3. On the next page, click **Configure a single feed**.\n4. In the **Feed name** field, enter a name for the feed; for example, **Jenkins Logs**.\n5. Select **Google Cloud Storage V2** as the **Source type**.\n6. Select **Jenkins** as the **Log type**.\n7. Click **Get Service Account** as the **Chronicle Service Account**.\n8. Specify values for the following input parameters:\n\n - **Storage Bucket URI** : Google Cloud storage bucket URL in **`gs://my-bucket/\u003cvalue\u003e`** format.\n - **Source deletion options**: select deletion option according to your preference.\n\n | **Note:** If you select the `Delete transferred files` or `Delete transferred files and empty directories` option, make sure that you granted appropriate permissions to the service account. \\* **Maximum File Age**: Includes files modified in the last number of days. Default is 180 days.\n9. Click **Create Feed**.\n\nUDM Mapping Table\n-----------------\n\n**Need more help?** [Get answers from Community members and Google SecOps professionals.](https://security.googlecloudcommunity.com/google-security-operations-2)"]]