-
DEVICE INTEGRATION
- Palo Alto (Device Integration)
- Dell Cylance Endpoint
- McAfee Web Gateway
- Imperva WAF
- Darktrace
- Forescout CounterACT
- Juniper Cortex Threat
- Zscaler
- Sophos
- Sophos Endpoint
- Trend Micro
- Sophos Cyberoam Firewall
- Radware-WAF
- NetScaler WAF
- Ubuntu
- Juniper SRX
- Forcepoint Websense
- FireEye
- Forcepoint DLP
- F5 BIG-IP ASM
- CyberArk PIM
- CheckPoint
- Bluecoat Proxy
- Accops Hyworks
- Barracuda WAF Syslog
- Forwarding F5 Distributed Cloud Services Logs to DNIF over TLS
- JIRA CLOUD
- Aruba ClearPass
-
CONNECTORS
-
- 1Password Connector
- Abnormal Security
- Akamai Netstorage
- Atlassian
- Auth0 Connector
- AWS CloudTrail
- AWS Kinesis
- AWS S3
- AWS S3 (Optimized)
- AWS S3 Optimized Cross Account Connector
- Azure Blob Storage Connector
- Azure Event Hub
- Azure NSG
- Beats
- Box
- Cisco Duo
- Cloudflare Logpull Connector Setup Guide
- CloudWatch Connector
- Cortex XDR
- CrowdStrike
- Cyble Vision
- Device42
- Dropbox Connector
- GCP
- GCP PUB/SUB
- GitHub
- Google Workspace
- Haltdos
- HTTP Connector
- HubSpot Connector
- Indusface
- Jira Connector
- Microsoft Graph Security API
- Microsoft Intune
- Mimecast
- Netflow
- Netskope Connector
- Network Traffic Analysis
- NextDLP Reveal
- Office 365
- Okta
- OneLogin
- Orca
- PICO Legacy Connector
- Prisma Alerts
- Prisma Incidents
- Salesforce
- Salesforce Pub/Sub Connector
- Shopify Connector
- Slack
- Snowflake
- Snyk Connector
- Syslog
- TCP
- Tenable Vulnerability Management Connector
- TLS
- Trend Micro Audit Logs
- Workday HCM Connector
- Zendesk
- Zoom
- Jumpcloud Connector
- Sophos connector
- Tenable Security Center Connector
- AWS GuardDuty Connector
- Trend Micro Vision One Connector
- RediffMail Pro Connector
- Microsoft Sentinel
- Microsoft Exchange Online Connector
-
-
DATA INGESTION
-
HUNTING WITH WORKBOOKS
-
- Your first FIND with the HYPERCLOUD
- Create a Search Block
- Create a Signal Block
- Create a Text Block
- Create an Outlier Block
- Create a DQL Block
- Create an SQL Block
- Create a Code Block
- Create a Visualisation Block
- Create a Call Block
- Create a Return Block
- Create a Notification Block
- Schedule a Workbook
- Native Workbook
- Workbook Functions
- How to view Workbooks?
- Add Parameters to Workbook
- Working with Pass through Content
- How to create a Workbook?
- Workbooks
-
-
DNIF Query Language (DQL)
-
SECURITY MONITORING
- Streamline Alert Analysis with Signal Tagging
- Workbook Versioning: Track, Collaborate, and Restore with Ease
- What is Security Monitoring?
- Creating Signal Suppression Rules
- Why EBA
- Signal Suppression Rule
-
- What are signals?
- View Signal Context Details
- Suspect & Target
- Source Stream
- Signal Filters
- Signal Data export
- Signal Context Details
- Signal Confidence Levels
- Raise and View Signals
- Investigate Anywhere
- How to add a signal to a case?
- Graph View for Signals
- Global Signals
- False Positives
- Add Multiple Signals to a Case
- Add comment to the signal
-
OPERATIONS
-
MANAGE DASHBOARDS
-
MANAGE REPORTS
-
USER MANAGEMENT & ACCESS CONTROL
-
BILLING
-
MANAGING YOUR COMPONENTS
-
GETTING STARTED
-
INSTALLATION
-
SOLUTION DESIGN
-
AUTOMATION
-
- Active Directory
- AlienVault
- Asset Store
- ClickSend
- Domain Tools
- Fortigate
- GreenSnow
- JiraServiceDesk
- Microsoft Teams Channel
- New Relic
- Opsgenie
- PagerDuty
- Palo Alto
- ServiceNow
- Slack Configuration
- TAXII
- Trend Micro
- URLhaus
- User Store
- Virustotal
- Webhook
-
-
TROUBLESHOOTING AND DEBUGGING
-
- TLS ( Troubleshooting Procedure)
- TCP (Troubleshooting Procedure)
- Syslog (Troubleshooting Procedure)
- Salesforce ( Troubleshooting Procedure)
- PICO
- Office 365 (Troubleshooting Procedure)
- GSuite
- GCP (Troubleshooting Procedure)
- Beats (Troubleshooting Procedure)
- Azure NSG ( Troubleshooting Procedure)
- Azure Eventhub
- AWS S3 (Troubleshooting Procedure)
-
-
LICENSE MANAGEMENT
-
RELEASE NOTES
- March 31, 2026 - Content Update
- March 16, 2026 - Application Update
- February 26, 2026 - Content Update
- January 19, 2026 - Content Update
- December 23, 2025 - Application Update
- December 4, 2025 - Content Update
- November 27, 2025 - Application Update
- October 28, 2025 - Content Update
- August 20, 2025 - Content Update
- August 5, 2025 - Application Update
- July 15, 2025 - Content Update
- June 13, 2025 - Content Update
- May 21, 2025 - Content Update
- April 17, 2025 - Content Update
- March 25, 2025 - Content Update
- March 18, 2025 - Application Update
- March 5, 2025 - Application Update
- January 27, 2025 - Application Update
- January 29, 2025 - Content Update
- December 30, 2024 - Content Update
- December 12, 2024 - Content Update
- December 3, 2024 - Application Update
- November 15, 2024 - Content Update
- October 26, 2024 - Application Update
- October 23, 2024 - Content Update
- October 16, 2024 - Application Update
- September 04, 2024 - Application Update
- September 04, 2024 - Content Update
- August 27, 2024 - Application Update
- July 30, 2024 - Application Update
- June 04, 2024 - Application Update
- April 24, 2024 - Application Update
- March 26, 2024 - Application Update
- February 19, 2024 - Application Update
- January 09, 2024 - Content Update
- January 09, 2024 - Application Update
- November 27, 2023 - Content Update
- November 27, 2023 - Application Update
- October 05, 2023 - Application Update (Release Notes v9.3.3)
- May 30, 2023 - Application Update (Release Notes v9.3.2)
- November 29, 2022 - Application Update (Release Notes v9.3.0)
-
API
-
POLICIES
-
SECURITY BULLETINS
-
BEST PRACTICES
-
DNIF AI
-
DNIF LEGAL AND SECURITY COMPLIANCE
CloudWatch Connector
Amazon CloudWatch is a monitoring service that helps you track the health of your AWS applications and resources. DNIF supports configuring a CloudWatch Connector to pull log data directly from CloudWatch Logs, allowing you to write detections and run investigations on the processed data.
Pre-requisites
- Log Group(s) names
- AWS Region
- AWS Access Key
- AWS Secret Key
Steps to derive the prerequisites:
To get your AWS Access Key and Secret Access Key, you need to create a set of credentials from the AWS Management Console. Follow these steps:
- Sign in to the AWS Management Console
- Go to the AWS Management Console at: https://aws.amazon.com/console/
- Sign in with your root account or IAM user credentials.
- Navigate to the IAM (Identity and Access Management) Dashboard
- Once logged in, from the top search bar, search for IAM and click on IAM to open the IAM dashboard.
- Create or Select an IAM User
- If you already have an IAM user that needs an access key:
- Go to Users from the left-hand menu.
- Select the IAM user to which you want to assign access keys.
- If you need to create a new IAM user:
- Click on Users in the left-hand menu.
- Click Add User at the top.
- Enter a username for the new user.
- Under Access type, check Programmatic access to generate an access key and secret key.
- Click Next: Permissions to proceed.
- Attach Permissions to the User
- If you’re creating a new IAM user or editing an existing user:
- You need to assign permissions to the user.
- Create a least-privileged IAM policy that grants only the permissions the DNIF Connector needs to pull logs from CloudWatch Logs. At minimum, the user needs permissions for:
- Reading logs from CloudWatch Logs.
- Optionally, listing log groups (if you want the user to be able to see available log groups).
- Describing log streams to navigate the logs within the log groups.
- Here is an example of a minimal IAM policy (Minimal IAM Policy for Accessing CloudWatch Logs) that covers these actions:
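A minimal policy sketch along these lines, assuming a single log group; replace your-log-group-name with your own log group name, and narrow the Region and account wildcards as needed:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListLogGroups",
      "Effect": "Allow",
      "Action": "logs:DescribeLogGroups",
      "Resource": "*"
    },
    {
      "Sid": "ReadLogGroup",
      "Effect": "Allow",
      "Action": [
        "logs:DescribeLogStreams",
        "logs:FilterLogEvents"
      ],
      "Resource": [
        "arn:aws:logs:*:*:log-group:your-log-group-name",
        "arn:aws:logs:*:*:log-group:your-log-group-name:*"
      ]
    }
  ]
}
```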

- Explanation of the Policy:
- logs:DescribeLogGroups: Allows the user to list available CloudWatch log groups. This is optional and can be removed if you already know the log group names and don’t need the ability to list them.
- logs:DescribeLogStreams: Allows the user to list the log streams within the specified log group.
- logs:FilterLogEvents: Allows the user to query and filter the logs.
- Resource: The policy is scoped to specific log groups and log streams.
- Replace “your-log-group-name” with the name of the log group(s) you want to allow access to. You can add multiple log groups as needed.
- If you want to allow access to all log groups, you can use a wildcard (“arn:aws:logs:*:*:log-group:*”).
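For reference when filling in the Resource entries, CloudWatch Logs log-group ARNs follow the pattern arn:aws:logs:&lt;region&gt;:&lt;account-id&gt;:log-group:&lt;name&gt;. A small hypothetical helper (`log_group_arn` is our own name, not an AWS API) to build one:

```python
def log_group_arn(region: str, account_id: str, log_group: str) -> str:
    """Build a CloudWatch Logs log-group ARN.

    Append ":*" to the result in a policy Resource entry to also
    cover the log streams inside the group.
    """
    return f"arn:aws:logs:{region}:{account_id}:log-group:{log_group}"
```

For example, `log_group_arn("us-east-1", "123456789012", "my-app-logs")` returns `arn:aws:logs:us-east-1:123456789012:log-group:my-app-logs`.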
- Click Next: Review and complete the user creation.
- Generate the Access Key and Secret Access Key
Once the user is created, or if you are editing an existing user:
- Go to the user’s Security credentials tab.
- Scroll down to the Access keys section.
- Click on Create access key.
- Download or Save the Access Key
- After you generate the access key, AWS will display both the AWS Access Key ID and the AWS Secret Access Key.
- Important: This is the only time the secret key will be shown. Make sure to download the .csv file with both the access key and secret key, or copy and save it securely.
Configurations
Use the following configuration fields to forward AWS CloudWatch logs to DNIF.

| Field Name | Description |
| --- | --- |
| Connector Name | Enter a name for the connector |
| Log Group(s) | Enter the CloudWatch Log Group name(s) |
| AWS Region | Enter the CloudWatch AWS Region |
| AWS Access Key | Enter the AWS Access Key |
| AWS Secret Key | Enter the AWS Secret Key |
| Scan Interval | Enter the Scan Interval in minutes |
- Click Save after entering all the required details, then click Test Connection to test the configuration.
- A Connection successful message will be displayed on the screen along with a timestamp.
- If the connection is not successful, an error message will be displayed. Refer to Troubleshooting Connector Validations for more details on the error message.
Once the connector is configured, validate that it is listed on the Collection Status screen with the status Active. This signifies that the connector is configured successfully and data is ready for ingestion.
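The Scan Interval setting controls how often the connector polls CloudWatch for new events. As an illustrative sketch (not DNIF's actual implementation), the epoch-millisecond window a FilterLogEvents-style poll would query could be computed like this, with a small overlap so late-arriving events are not missed (the one-minute overlap is an assumed, tunable value):

```python
import time
from typing import Optional, Tuple

OVERLAP_MS = 60_000  # assumed one-minute overlap to catch late-arriving events


def next_poll_window(last_end_ms: Optional[int],
                     scan_interval_min: int,
                     now_ms: Optional[int] = None) -> Tuple[int, int]:
    """Return the (startTime, endTime) epoch-millisecond window for the next poll.

    CloudWatch Logs FilterLogEvents takes startTime/endTime in milliseconds
    since the epoch. On the first poll we look back one scan interval; on
    subsequent polls we resume slightly before the previous window's end.
    """
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    if last_end_ms is None:
        start = now_ms - scan_interval_min * 60_000
    else:
        start = max(0, last_end_ms - OVERLAP_MS)
    return start, now_ms
```

For example, with a 5-minute scan interval, a first poll at t = 1,000,000 ms queries the window (700,000, 1,000,000); the next poll re-reads from one minute before the previous end.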
