Collect Microsoft Network Policy Server (NPS) logs
This document explains how to ingest Microsoft Network Policy Server (NPS) logs into Google Security Operations using Google Cloud Storage V2.
Microsoft Network Policy Server (NPS) is a Windows Server role that provides RADIUS authentication, authorization, and accounting. NPS enables centralized management of network access policies for wireless devices, VPN connections, switches, and remote access. NPS does not have native cloud export capabilities or APIs, so logs must be collected using a third-party forwarder.
Before you begin
Ensure that you have the following prerequisites:
- A Google SecOps instance
- A GCP project with Cloud Storage API enabled
- Permissions to create and manage GCS buckets
- Permissions to manage IAM policies on GCS buckets
- Privileged access to the Windows Server running NPS
- Administrator access to install NXLog on the NPS server
Configure NPS logging
Enable NPS accounting logs
- On the Windows Server running NPS, open Server Manager.
- Go to Tools > Network Policy Server.
- In the NPS console, expand Accounting in the left navigation.
- Right-click Accounting and select Configure Accounting.
- In the Accounting Configuration wizard, click Next.
- Select Log to a text file on the local computer and click Next.
- In the Configure Log File Properties page, configure the following:
  - Directory: Leave as default (%systemroot%\System32\LogFiles) or specify a custom path
  - Format: Select DTS Compliant (recommended by Microsoft for structured XML format)
  - Create a new log file: Select Daily or Weekly based on your log volume
- Click Next.
- Review the summary and click Finish.
Enable NPS audit logging
- Open Command Prompt as Administrator on the NPS server.
- Run the following command to enable NPS audit logging:

  ```
  auditpol /set /subcategory:"Network Policy Server" /success:enable /failure:enable
  ```

- Verify that the audit policy is enabled:

  ```
  auditpol /get /subcategory:"Network Policy Server"
  ```
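As an optional sanity check, you can confirm that NPS audit events are being written to the Windows Security log. The sketch below queries for NPS authentication outcomes (event ID 6272 is access granted, 6273 is access denied); adjust the count and filter as needed:

```shell
:: Query the Security log for the five most recent NPS audit events.
:: Requires an elevated Command Prompt.
wevtutil qe Security /c:5 /rd:true /f:text ^
  /q:"*[System[(EventID=6272 or EventID=6273)]]"
```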
Install and configure NXLog on the NPS server
Download and install NXLog
- On the NPS server, download NXLog Enterprise Edition from the NXLog website.
- Run the installer and follow the installation wizard.
- Accept the default installation path (C:\Program Files\nxlog) or specify a custom path.
- Complete the installation.
Configure NXLog to collect NPS logs
- On the NPS server, navigate to the NXLog configuration directory (default path: C:\Program Files\nxlog\conf).
- Open the file nxlog.conf in a text editor (for example, Notepad) with Administrator privileges.
- Replace the entire contents of nxlog.conf with the following configuration:

  ```
  ## NXLog configuration for Microsoft NPS log collection
  ## Collects NPS logs and forwards to Google Cloud Storage

  define ROOT C:\Program Files\nxlog
  define LOGDIR C:\Windows\System32\LogFiles
  define GCS_BUCKET your-gcs-bucket-name
  define GCS_PREFIX nps-logs
  define GCS_CREDENTIALS C:\nxlog-credentials\gcs-service-account.json

  Moduledir %ROOT%\modules
  CacheDir %ROOT%\data
  Pidfile %ROOT%\data\nxlog.pid
  SpoolDir %ROOT%\data
  LogFile %ROOT%\data\nxlog.log
  LogLevel INFO

  <Extension nps>
      Module xm_nps
  </Extension>

  <Extension json>
      Module xm_json
  </Extension>

  <Input nps_logs>
      Module im_file
      File '%LOGDIR%\IN*.log'
      SavePos TRUE
      ReadFromLast TRUE
      Recursive FALSE
      InputType nps
      <Exec>
          # Parse NPS logs (DTS Compliant format)
          # Add hostname for tracking
          $Hostname = hostname();
          # Convert to JSON
          to_json();
      </Exec>
  </Input>

  <Output gcs>
      Module om_file
      File '%ROOT%\data\gcs-upload\nps-' + strftime(now(), '%Y%m%d-%H%M%S') + '.json'
      <Exec>
          # Rotate the output file when it exceeds 100 MB
          if (file_size(file_name()) > 100000000)
              file_cycle();
      </Exec>
  </Output>

  <Route nps_to_gcs>
      Path nps_logs => gcs
  </Route>
  ```

- In the configuration file, replace the following placeholders:
  - your-gcs-bucket-name: Your GCS bucket name (created in the next section)
  - Verify that the LOGDIR path matches your NPS log file location.
- Save the file.
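Before starting the service, you can ask NXLog to check the configuration for syntax errors. The command below assumes the default installation path:

```shell
:: Verify the NXLog configuration file; errors are reported with the
:: offending line, and a clean run indicates the file parsed correctly.
"C:\Program Files\nxlog\nxlog.exe" -v -c "C:\Program Files\nxlog\conf\nxlog.conf"
```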
Create Google Cloud Storage bucket
- Go to the Google Cloud Console.
- Select your project or create a new one.
- In the navigation menu, go to Cloud Storage > Buckets.
- Click Create bucket.
- Provide the following configuration details:

  | Setting | Value |
  |---|---|
  | Name your bucket | Enter a globally unique name (for example, nps-logs-chronicle) |
  | Location type | Choose based on your needs (Region, Dual-region, Multi-region) |
  | Location | Select the location (for example, us-central1) |
  | Storage class | Standard (recommended for frequently accessed logs) |
  | Access control | Uniform (recommended) |
  | Protection tools | Optional: Enable object versioning or a retention policy |
Click Create.
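If you prefer the command line, the same bucket can be created with the gcloud CLI. The bucket name and location below are the example values used throughout this guide; substitute your own:

```shell
# Create the bucket with uniform bucket-level access (example values).
gcloud storage buckets create gs://nps-logs-chronicle \
  --location=us-central1 \
  --default-storage-class=STANDARD \
  --uniform-bucket-level-access
```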
Create GCS service account for NXLog
Create service account
- In the GCP Console, go to IAM & Admin > Service Accounts.
- Click Create Service Account.
- Provide the following configuration details:
  - Service account name: Enter nxlog-nps-uploader (for example)
  - Service account description: Enter Service account for NXLog to upload NPS logs to GCS
- Click Create and Continue.
- In the Grant this service account access to project section, add the following role:
- Click Select a role.
- Search for and select Storage Object Creator.
- Click Continue.
- Click Done.
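Equivalently, the service account and its role binding can be created from the CLI. PROJECT_ID is a placeholder for your project, and the bucket name is the example used above:

```shell
# Create the uploader service account (replace PROJECT_ID).
gcloud iam service-accounts create nxlog-nps-uploader \
  --project=PROJECT_ID \
  --description="Service account for NXLog to upload NPS logs to GCS" \
  --display-name="nxlog-nps-uploader"

# Grant it object-creation rights on the bucket only.
gcloud storage buckets add-iam-policy-binding gs://nps-logs-chronicle \
  --member="serviceAccount:nxlog-nps-uploader@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectCreator"
```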
Download service account key
- On the Service Accounts page, click the service account you just created (nxlog-nps-uploader).
- Go to the Keys tab.
- Click Add Key > Create new key.
- Select JSONas the key type.
- Click Create.
- The JSON key file will be downloaded to your computer.
- Copy the JSON key file to the NPS server at the following location: C:\nxlog-credentials\gcs-service-account.json
- Create the directory C:\nxlog-credentials if it does not exist.
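Alternatively, if the Google Cloud SDK is already installed on the NPS server, the key can be generated directly into the expected path instead of downloading and copying it (PROJECT_ID is a placeholder):

```shell
:: Create the credentials directory, then write a JSON key for the
:: service account straight into it.
mkdir C:\nxlog-credentials
gcloud iam service-accounts keys create C:\nxlog-credentials\gcs-service-account.json ^
  --iam-account=nxlog-nps-uploader@PROJECT_ID.iam.gserviceaccount.com
```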
Configure automated GCS upload
Create PowerShell upload script
- On the NPS server, create a new directory: C:\nxlog-scripts
- Create a new file named upload-to-gcs.ps1 in the directory.
- Open the file in a text editor and paste the following PowerShell script:

  ```powershell
  # PowerShell script to upload NXLog output files to Google Cloud Storage
  # Uses the gsutil command-line tool

  $ErrorActionPreference = "Stop"

  # Configuration
  $LocalLogDir = "C:\Program Files\nxlog\data\gcs-upload"
  $GcsBucket = "gs://nps-logs-chronicle/nps-logs/"
  $ArchiveDir = "C:\Program Files\nxlog\data\archive"
  $LogFile = "C:\nxlog-scripts\upload.log"

  # Function to write timestamped log messages
  function Write-Log {
      param([string]$Message)
      $Timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
      "$Timestamp - $Message" | Out-File -FilePath $LogFile -Append
  }

  Write-Log "Starting GCS upload process"

  # Check if gsutil is available
  try {
      $null = gsutil version
  } catch {
      Write-Log "ERROR: gsutil not found. Please install Google Cloud SDK."
      exit 1
  }

  # Activate the service account
  $ServiceAccountKey = "C:\nxlog-credentials\gcs-service-account.json"
  if (-not (Test-Path $ServiceAccountKey)) {
      Write-Log "ERROR: Service account key not found at $ServiceAccountKey"
      exit 1
  }
  try {
      gcloud auth activate-service-account --key-file=$ServiceAccountKey 2>&1 | Out-Null
      Write-Log "Service account activated successfully"
  } catch {
      Write-Log "ERROR: Failed to activate service account: $_"
      exit 1
  }

  # Create the archive directory if it doesn't exist
  if (-not (Test-Path $ArchiveDir)) {
      New-Item -ItemType Directory -Path $ArchiveDir | Out-Null
      Write-Log "Created archive directory: $ArchiveDir"
  }

  # Get all JSON files in the local log directory
  $Files = Get-ChildItem -Path $LocalLogDir -Filter "*.json" -File
  if ($Files.Count -eq 0) {
      Write-Log "No files to upload"
      exit 0
  }
  Write-Log "Found $($Files.Count) file(s) to upload"

  # Upload each file to GCS
  $SuccessCount = 0
  $FailCount = 0
  foreach ($File in $Files) {
      try {
          # Upload to GCS; gsutil is a native command, so check its exit code
          gsutil cp $File.FullName $GcsBucket 2>&1 | Out-Null
          if ($LASTEXITCODE -ne 0) { throw "gsutil exited with code $LASTEXITCODE" }

          # Move to archive on success
          Move-Item -Path $File.FullName -Destination $ArchiveDir -Force
          Write-Log "Uploaded and archived: $($File.Name)"
          $SuccessCount++
      } catch {
          Write-Log "ERROR uploading $($File.Name): $_"
          $FailCount++
      }
  }

  Write-Log "Upload complete. Success: $SuccessCount, Failed: $FailCount"
  ```

- In the script, replace the following values:
  - $GcsBucket: Your GCS bucket URI (for example, gs://nps-logs-chronicle/nps-logs/)
- Save the file.
Install Google Cloud SDK
- On the NPS server, download the Google Cloud SDK installer from Google Cloud SDK.
- Run the installer and follow the installation wizard.
- During installation, select Install bundled Python if Python is not already installed.
- Complete the installation.
- Open a new Command Prompt and verify the installation:

  ```
  gsutil version
  ```
Create scheduled task for automated upload
- On the NPS server, open Task Scheduler.
- In the right panel, click Create Task.
- In the General tab, configure the following:
  - Name: Enter NXLog GCS Upload
  - Description: Enter Upload NPS logs to Google Cloud Storage
  - Security options: Select Run whether user is logged on or not
  - Security options: Check Run with highest privileges
  - Configure for: Select Windows Server 2016 or your server version
- In the Triggers tab:
  - Click New.
  - Begin the task: Select On a schedule.
  - Settings: Select Daily.
  - Recur every: Enter 1 days.
  - Repeat task every: Select 15 minutes.
  - for a duration of: Select Indefinitely.
  - Click OK.
- In the Actions tab:
  - Click New.
  - Action: Select Start a program.
  - Program/script: Enter powershell.exe
  - Add arguments: Enter -ExecutionPolicy Bypass -File "C:\nxlog-scripts\upload-to-gcs.ps1"
  - Click OK.
- In the Conditions tab:
  - Uncheck Start the task only if the computer is on AC power.
- In the Settings tab:
  - Check Run task as soon as possible after a scheduled start is missed.
  - Check If the task fails, restart every: and enter 5 minutes.
- Click OK.
- Enter the credentials for the account that will run the task (must have Administrator privileges).
- Click OK.
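The same task can also be registered from an elevated Command Prompt with schtasks. This sketch assumes the script path used above and runs the task under the SYSTEM account instead of a named administrator:

```shell
:: Register a task that runs the upload script every 15 minutes
:: as SYSTEM with highest privileges.
schtasks /Create /TN "NXLog GCS Upload" ^
  /TR "powershell.exe -ExecutionPolicy Bypass -File C:\nxlog-scripts\upload-to-gcs.ps1" ^
  /SC MINUTE /MO 15 /RU SYSTEM /RL HIGHEST
```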
Test the upload process
- On the NPS server, open Task Scheduler.
- In the left panel, expand Task Scheduler Library.
- Right-click the NXLog GCS Upload task and select Run.
- Wait a few seconds for the task to complete.
- Open the log file at C:\nxlog-scripts\upload.log to verify the upload was successful.
- Go to the GCP Console > Cloud Storage > Buckets.
- Click your bucket name (nps-logs-chronicle).
- Navigate to the nps-logs/ folder.
- Verify that JSON files with NPS logs are present.
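You can also list the uploaded objects from the command line instead of the Console (example bucket and prefix shown):

```shell
# List uploaded NPS log files in the bucket prefix.
gsutil ls gs://nps-logs-chronicle/nps-logs/
```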
Start NXLog service
- On the NPS server, open Services (run services.msc).
- Locate the nxlog service in the list.
- Right-click the nxlog service and select Start.
- Verify the service status is Running.
- To verify that NXLog is collecting logs, check the NXLog log file: C:\Program Files\nxlog\data\nxlog.log
- Look for messages indicating successful log collection:

  ```
  INFO im_file: monitoring file 'C:\Windows\System32\LogFiles\IN250315.log'
  ```
Retrieve the Google SecOps service account
Google SecOps uses a unique service account to read data from your GCS bucket. You must grant this service account access to your bucket.
Get the service account email
- Go to SIEM Settings > Feeds.
- Click Add New Feed.
- Click Configure a single feed.
- In the Feed name field, enter a name for the feed (for example, Microsoft NPS Logs).
- Select Google Cloud Storage V2 as the Source type.
- Select MICROSOFT_NPS as the Log type.
- Click Get Service Account. A unique service account email is displayed, for example:

  ```
  chronicle-12345678@chronicle-gcp-prod.iam.gserviceaccount.com
  ```

- Copy this email address for use in the next step.
- Click Next.
- Specify values for the following input parameters:
  - Storage bucket URL: Enter the GCS bucket URI with the prefix path: gs://nps-logs-chronicle/nps-logs/
    Replace:
    - nps-logs-chronicle: Your GCS bucket name.
    - nps-logs/: The prefix/folder path where logs are stored.
  - Source deletion option: Select the deletion option according to your preference:
    - Never: Never deletes any files after transfers (recommended for testing).
    - Delete transferred files: Deletes files after successful transfer.
    - Delete transferred files and empty directories: Deletes files and empty directories after successful transfer.
  - Maximum File Age: Include files modified in the last number of days (default is 180 days).
  - Asset namespace: The asset namespace.
  - Ingestion labels: The label to be applied to the events from this feed.
- Click Next.
- Review your new feed configuration in the Finalize screen, and then click Submit.
Grant IAM permissions to the Google SecOps service account
The Google SecOps service account needs the Storage Object Viewer role on your GCS bucket.

- Go to Cloud Storage > Buckets.
- Click your bucket name (nps-logs-chronicle).
- Go to the Permissions tab.
- Click Grant access.
- Provide the following configuration details:
  - Add principals: Paste the Google SecOps service account email
  - Assign roles: Select Storage Object Viewer
- Click Save.
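The same binding can be applied from the CLI. SERVICE_ACCOUNT_EMAIL is a placeholder for the address shown by the Get Service Account step:

```shell
# Grant the Google SecOps service account read access to the bucket.
gcloud storage buckets add-iam-policy-binding gs://nps-logs-chronicle \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/storage.objectViewer"
```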
UDM mapping table
| Log Field | UDM Mapping | Logic |
|---|---|---|
| Version | additional.fields | Merged with corresponding objects if not empty |
| Channel | additional.fields | |
| Keywords | additional.fields | |
| Opcode | additional.fields | |
| Task | additional.fields | |
| ThreadID | additional.fields | |
| EventData.%1 | additional.fields | |
| EventData.%2 | additional.fields | |
| Channel | channel.value.string_value | Value copied directly |
| EventData.%1 | event_data_p1.value.string_value | Value copied directly |
| EventData.%2 | event_data_p2.value.string_value | Value copied directly |
| Keywords | keywords.value.string_value | Value copied directly |
| TimeCreated | metadata.event_timestamp | Converted from UNIX_MS to timestamp |
| EventId | metadata.event_type | Set to STATUS_UNCATEGORIZED if EventId == '4400', STATUS_UPDATE if EventId == '13', else GENERIC_EVENT |
| EventId | metadata.product_event_type | Value copied directly |
| Opcode | opcode.value.string_value | Value copied directly |
| Computer | principal.asset.hostname | Value copied from principal.hostname |
| Computer | principal.hostname | Value copied directly |
| ProcessID | principal.process.pid | Value copied if not empty |
| UserId | principal.user.windows_sid | Extracted using regex to get the Windows SID |
| EventId | security_result.rule_name | Set to "EventID: " + EventId |
| Level | security_result.severity | Transformed: INFO/Informational/Information/Normal/NOTICE to INFORMATIONAL, ERROR/Error to ERROR, WARNING/Warning to INFORMATIONAL, DEBUG to INFORMATIONAL, Critical to CRITICAL |
| Task | task.value.string_value | Value copied directly |
| ThreadID | thread_id.value.string_value | Value copied directly |
| Version | version.value.string_value | Value copied directly |
| metadata.vendor_name | metadata.vendor_name | Set to "Microsoft" |
| ProviderName | metadata.product_name | Value copied directly |
Need more help? Get answers from Community members and Google SecOps professionals.