Collect MuleSoft Anypoint logs
This document explains how to ingest audit-trail events from the MuleSoft Anypoint Platform into Google Security Operations using an AWS S3 bucket.
Before you begin
Make sure you have the following prerequisites:
- Google SecOps instance
- Privileged access to MuleSoft
- Privileged access to AWS
Get the MuleSoft Organization ID
- Sign in to the Anypoint Platform.
- Go to Menu > Access Management.
- In the Business Groups table, click your organization's name.
- Copy the Organization ID (for example, 0a12b3c4-d5e6-789f-1021-1a2b34cd5e6f).
- Alternatively, go to MuleSoft Business Groups and copy the ID from the URL.
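If you copied the Business Groups URL, the Organization ID is the UUID embedded in it. The following is a small illustrative helper (not part of the official setup) that pulls the UUID out of a pasted URL; the URL string shown is a placeholder using the example ID above.

```python
# Illustrative helper: extract the Organization ID (a UUID) from a copied URL.
import re

def org_id_from_url(url: str) -> str | None:
    """Return the first UUID found in the URL, or None if no UUID is present."""
    match = re.search(
        r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}",
        url,
        re.IGNORECASE,
    )
    return match.group(0) if match else None

# Placeholder URL for illustration; paste the URL you actually copied.
url = "https://anypoint.mulesoft.com/.../0a12b3c4-d5e6-789f-1021-1a2b34cd5e6f"
print(org_id_from_url(url))
```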
Configure AWS S3 bucket and IAM for Google SecOps
- Create an Amazon S3 bucket following this user guide: Creating a bucket.
- Save the bucket Name and Region for future reference (for example, mulesoft-audit-logs).
- Create a User following this user guide: Creating an IAM user.
- Select the created User.
- Select the Security credentials tab.
- Click Create access key in the Access keys section.
- Select Third-party service as the Use case.
- Click Next.
- Optional: Add description tag.
- Click Create access key.
- Click Download CSV file to save the Access Key and Secret Access Key for future reference.
- Click Done.
- Select the Permissions tab.
- Click Add permissions in the Permissions policies section.
- Select Add permissions.
- Select Attach policies directly.
- Search for and select the AmazonS3FullAccess policy.
- Click Next.
- Click Add permissions.
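Optionally, you can sanity-check the new credentials before building anything else. The sketch below (assuming boto3 is installed locally and the placeholder key, region, and bucket values are replaced with your own) uploads a tiny test object to the bucket.

```python
# Minimal sketch: confirm the new access key can write to the S3 bucket.
# All values below are placeholders; substitute the ones you saved earlier.
import boto3

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    region_name="us-east-1",  # use the bucket's region
)

s3.put_object(
    Bucket="mulesoft-audit-logs",
    Key="connectivity-check/test.txt",
    Body=b"ok",
)
print("Upload succeeded")
```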
Create the MuleSoft Connected App
- Sign in to the Anypoint Platform.
- Go to Access Management > Connected Apps > Create App.
- Provide the following configuration details:
  - App name: Enter a unique name (for example, Google SecOps export).
  - Select App acts on its own behalf (client credentials).
  - Click Add scopes > Audit Log Viewer > Next.
  - Select every Business Group whose logs you need.
  - Click Next > Add scopes.
- Click Save and copy the Client ID and Client Secret.
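To confirm the Client ID and Client Secret work before deploying anything, you can request a token directly. This sketch mirrors the `get_token` call in the Lambda function further down; the credential values are placeholders.

```python
# Sketch: verify the Connected App credentials with a client-credentials token request.
import json
import urllib.parse
import urllib.request

TOKEN_URL = "https://anypoint.mulesoft.com/accounts/api/v2/oauth2/token"

# Placeholders; use the values you copied from the Connected App.
form = urllib.parse.urlencode({
    "grant_type": "client_credentials",
    "client_id": "YOUR_CLIENT_ID",
    "client_secret": "YOUR_CLIENT_SECRET",
}).encode()

with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, form), timeout=30) as resp:
    token = json.loads(resp.read())["access_token"]

print("Received access token; first characters:", token[:8], "...")
```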
Configure IAM policy & role for S3 uploads
-  Policy JSON (replace mulesoft-audit-logs with your bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPutAuditObjects",
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::mulesoft-audit-logs/*"
    }
  ]
}
```
-  Go to AWS console > IAM > Policies > Create policy > JSON tab. 
-  Copy and paste the policy. 
-  Click Next > Create policy. 
-  Go to IAM > Roles > Create role > AWS service > Lambda. 
-  Attach the newly created policy. 
-  Name the role WriteMulesoftToS3Role and click Create role.
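Before relying on the role, you can optionally dry-run the policy with the IAM policy simulator. This is a hedged sketch assuming boto3 and permission to call iam:SimulateCustomPolicy; the object key in the resource ARN is only an example.

```python
# Sketch: check that the policy allows s3:PutObject on objects in the bucket.
import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowPutAuditObjects",
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": "arn:aws:s3:::mulesoft-audit-logs/*",
    }],
}

iam = boto3.client("iam")
result = iam.simulate_custom_policy(
    PolicyInputList=[json.dumps(policy)],
    ActionNames=["s3:PutObject"],
    ResourceArns=["arn:aws:s3:::mulesoft-audit-logs/2025/01/01/example.json.gz"],
)
for evaluation in result["EvaluationResults"]:
    print(evaluation["EvalActionName"], "->", evaluation["EvalDecision"])
```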
Create the Lambda function
| Setting | Value | 
|---|---|
| Name | mulesoft_audit_to_s3 | 
| Runtime | Python 3.13 | 
| Architecture | x86_64 | 
| Execution role | Use existing > WriteMulesoftToS3Role | 
- After the function is created, open the Code tab, delete the stub, and enter the following code (mulesoft_audit_to_s3.py):

```python
#!/usr/bin/env python3
import os, json, gzip, io, uuid, datetime as dt, urllib.request, urllib.error, urllib.parse
import boto3

ORG_ID = os.environ["MULE_ORG_ID"]
CLIENT_ID = os.environ["CLIENT_ID"]
CLIENT_SECRET = os.environ["CLIENT_SECRET"]
S3_BUCKET = os.environ["S3_BUCKET_NAME"]

TOKEN_URL = "https://anypoint.mulesoft.com/accounts/api/v2/oauth2/token"
QUERY_URL = f"https://anypoint.mulesoft.com/audit/v2/organizations/{ORG_ID}/query"


def http_post(url, data, headers=None):
    # JSON body when custom headers are supplied, form-encoded otherwise.
    raw = json.dumps(data).encode() if headers else urllib.parse.urlencode(data).encode()
    req = urllib.request.Request(url, raw, headers or {})
    try:
        with urllib.request.urlopen(req, timeout=30) as r:
            return json.loads(r.read())
    except urllib.error.HTTPError as e:
        print("MuleSoft error body →", e.read().decode())
        raise


def get_token():
    # OAuth2 client-credentials flow using the Connected App.
    return http_post(TOKEN_URL, {
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    })["access_token"]


def fetch_audit(token, start, end):
    # Page through the audit query API until no more records are returned.
    headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}
    body = {
        "startDate": f"{start.isoformat(timespec='milliseconds')}Z",
        "endDate": f"{end.isoformat(timespec='milliseconds')}Z",
        "limit": 200,
        "offset": 0,
        "ascending": False,
    }
    while True:
        data = http_post(QUERY_URL, body, headers)
        if not data.get("data"):
            break
        yield from data["data"]
        body["offset"] += body["limit"]


def upload(events, ts):
    # Write newline-delimited JSON, gzip it in memory, and upload to S3.
    key = f"{ts:%Y/%m/%d}/mulesoft-audit-{uuid.uuid4()}.json.gz"
    buf = io.BytesIO()
    with gzip.GzipFile(fileobj=buf, mode="w") as gz:
        for ev in events:
            gz.write((json.dumps(ev) + "\n").encode())
    buf.seek(0)
    boto3.client("s3").upload_fileobj(buf, S3_BUCKET, key)


def lambda_handler(event=None, context=None):
    now = dt.datetime.utcnow().replace(microsecond=0)
    start = now - dt.timedelta(days=1)

    token = get_token()
    events = list(fetch_audit(token, start, now))

    if events:
        upload(events, start)
        print(f"Uploaded {len(events)} events")
    else:
        print("No events in the last 24 h")


# For local testing
if __name__ == "__main__":
    lambda_handler()
```
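For a quick local smoke test before deploying (assuming boto3 is installed and AWS credentials with write access to the bucket are configured locally), you can set the environment variables and call the handler directly. The values below are placeholders.

```python
# Local smoke test (run from the same directory as mulesoft_audit_to_s3.py).
# The script reads the variables at import time, so set them before importing.
import os

os.environ.update({
    "MULE_ORG_ID": "your_org_id",
    "CLIENT_ID": "your_client_id",
    "CLIENT_SECRET": "your_client_secret",
    "S3_BUCKET_NAME": "mulesoft-audit-logs",
})

import mulesoft_audit_to_s3  # noqa: E402  (imported after env vars are set)

mulesoft_audit_to_s3.lambda_handler()
```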
-  Go to Configuration > Environment variables > Edit > Add new environment variable. 
- Enter the following environment variables, replacing each example with your own value.

| Key | Example value |
|---|---|
| MULE_ORG_ID | your_org_id |
| CLIENT_ID | your_client_id |
| CLIENT_SECRET | your_client_secret |
| S3_BUCKET_NAME | mulesoft-audit-logs |
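If you prefer scripting the configuration instead of using the console, the same variables can be set with boto3. This is a sketch; the function name matches the one created above and the values are placeholders.

```python
# Sketch: set the Lambda environment variables programmatically.
import boto3

boto3.client("lambda").update_function_configuration(
    FunctionName="mulesoft_audit_to_s3",
    Environment={"Variables": {
        "MULE_ORG_ID": "your_org_id",
        "CLIENT_ID": "your_client_id",
        "CLIENT_SECRET": "your_client_secret",
        "S3_BUCKET_NAME": "mulesoft-audit-logs",
    }},
)
```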
Schedule the Lambda function (EventBridge Scheduler)
- Go to Configuration > Triggers > Add trigger > EventBridge Scheduler > Create rule.
- Provide the following configuration details:
  - Name: daily-mulesoft-audit-export.
  - Schedule pattern: Cron expression.
  - Expression: 0 2 * * * (runs daily at 02:00 UTC).
- Leave the rest as default and click Create.
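If you would rather create the schedule programmatically, here is a hedged sketch using the EventBridge Scheduler API. The Lambda ARN and the scheduler execution role ARN are placeholders, and note that the API expects six-field cron syntax, which differs slightly from the console expression above.

```python
# Sketch: create the same daily 02:00 UTC schedule via the EventBridge Scheduler API.
import boto3

boto3.client("scheduler").create_schedule(
    Name="daily-mulesoft-audit-export",
    ScheduleExpression="cron(0 2 * * ? *)",  # 02:00 UTC daily (six-field API syntax)
    ScheduleExpressionTimezone="UTC",
    FlexibleTimeWindow={"Mode": "OFF"},
    Target={
        # Placeholders: your Lambda ARN and a role the scheduler can assume to invoke it.
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:mulesoft_audit_to_s3",
        "RoleArn": "arn:aws:iam::123456789012:role/EventBridgeSchedulerInvokeLambda",
    },
)
```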
Configure a feed in Google SecOps to ingest the MuleSoft logs
- Go to SIEM Settings > Feeds.
- Click Add new.
- In the Feed name field, enter a name for the feed (for example, MuleSoft Logs).
- Select Amazon S3 V2 as the Source type.
- Select Mulesoft as the Log type.
- Click Next.
- Specify values for the following input parameters:
  - S3 URI: The bucket URI (for example, s3://mulesoft-audit-logs/). Replace mulesoft-audit-logs with the actual name of the bucket.
  - Source deletion options: Select the deletion option according to your preference.
  - Maximum File Age: Include files modified in the last number of days. Default is 180 days.
  - Access Key ID: User access key with access to the S3 bucket.
  - Secret Access Key: User secret key with access to the S3 bucket.
  - Asset namespace: The asset namespace.
  - Ingestion labels: The label to be applied to the events from this feed.
-  Click Next. 
- Review your new feed configuration in the Finalize screen, and then click Submit.
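As a final check, you can confirm that gzipped audit files are landing in the bucket before or after enabling the feed. A minimal sketch, assuming boto3 and read access to the bucket named in the examples above:

```python
# Sketch: list the most recent objects in the audit bucket to confirm uploads.
import boto3

s3 = boto3.client("s3")
response = s3.list_objects_v2(Bucket="mulesoft-audit-logs", MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["LastModified"], obj["Key"], obj["Size"])
```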

