AWS Kinesis
Amazon Kinesis Data Streams is used to collect and process large streams of data records in real time. The AWS Kinesis Connector consumes data records from Kinesis Data Streams. For more information on AWS Kinesis, refer to the AWS documentation.
Prerequisites
- Your own application that pushes data directly into a Kinesis data stream. The data pushed can include IT infrastructure log data, application logs, and so on.
This application must be built on your end; refer to the AWS Kinesis developer documentation while creating it. Using one of the AWS-provided SDKs is recommended. A minimal producer sketch is shown after this list.
- Required credentials:
- AWS_SECRET_ACCESS_KEY
- AWS_ACCESS_KEY_ID
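As a reference for the producer prerequisite above, the following is a minimal sketch of an application that pushes a record into a Kinesis data stream using boto3 (the AWS SDK for Python). The stream name, region, partition key, and sample record are hypothetical placeholders; substitute your own values and credential handling.

import json
import boto3

# Hypothetical client configuration; credentials are resolved from the
# environment or AWS config rather than being hard-coded here.
kinesis = boto3.client("kinesis", region_name="us-east-1")

# Example log record to push into the stream (hypothetical content)
record = {"host": "app-server-01", "message": "user login succeeded"}

response = kinesis.put_record(
    StreamName="my-log-stream",              # hypothetical stream name
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["host"],             # distributes records across shards
)
print(response["ShardId"], response["SequenceNumber"])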
Creating IAM users with programmatic access
- Open the IAM console. In the navigation pane, choose Users and then choose Add user.

- Enter the user name for the new user.
- On the same screen, under Select AWS access type, select the check box to enable programmatic access.

- Choose Next: Permissions.
- On the Set permissions page, specify how you want to assign permissions to the new user. Choose Attach existing policies directly > Create policy.

- Create the policy using the JSON method.
- The following is the required policy in JSON format:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kinesis:CreateStream",
                "kinesis:DeleteStream",
                "kinesis:Put*",
                "kinesis:Get*",
                "kinesis:DescribeStreamSummary",
                "kinesis:DescribeStream"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
The CreateStream, DeleteStream, and Put* permissions are not required by the AWS Kinesis connector itself; they are needed by the application that pushes data to the stream, as described in the Prerequisites section above. A scripted alternative for creating and attaching this policy is sketched at the end of this procedure.
- You can find the newly created policy under Attach existing policies directly.
- Apply the policy to the IAM user.
- Click the Next: Tags button.

- Add user tags for identification purposes and click the Next: Review button.

- Review all the configurations and click the Create user button.

- Click the Download .csv button to download the key pair, or click the Show link under Secret access key and then copy and save both the Access key ID and Secret access key.
For an already existing IAM user, create access keys using the steps described in AWS Account and Access Keys.
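If you prefer to script the policy and key creation instead of using the console, the following is a minimal boto3 sketch. It assumes the IAM user is also created by the script; the user and policy names are hypothetical placeholders.

import json
import boto3

iam = boto3.client("iam")

# Same policy document as shown above
policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": [
            "kinesis:CreateStream",
            "kinesis:DeleteStream",
            "kinesis:Put*",
            "kinesis:Get*",
            "kinesis:DescribeStreamSummary",
            "kinesis:DescribeStream"
        ],
        "Resource": ["*"]
    }]
}

# Hypothetical user and policy names
iam.create_user(UserName="dnif-kinesis-user")
policy = iam.create_policy(
    PolicyName="dnif-kinesis-policy",
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_user_policy(
    UserName="dnif-kinesis-user",
    PolicyArn=policy["Policy"]["Arn"],
)

# Generate the access key pair used in the connector configuration below
keys = iam.create_access_key(UserName="dnif-kinesis-user")["AccessKey"]
print(keys["AccessKeyId"], keys["SecretAccessKey"])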
Configurations
The following are the configurations to forward AWS Kinesis stream records to DNIF.

| Field | Description |
| Connector Name | Enter a name for the connector. |
| Stream Name | Enter the name of the Kinesis data stream from which records will be consumed. |
| Region Name | Enter the AWS region in which the stream was created. |
| Records Limit | Enter the number of records to pull per query (maximum 10000). |
| AWS Secret Access Key | Enter the AWS_SECRET_ACCESS_KEY obtained from the AWS IAM account. |
| AWS Access Key ID | Enter the AWS_ACCESS_KEY_ID obtained from the AWS IAM account. |
- Click Save after entering all the required details, then click Test Connection to test the configuration.
- A Connection successful message will be displayed on the screen along with the timestamp.
- If the connection is not successful, an error message will be displayed. Refer to Troubleshooting Connector Validations for more details on the error message.
Once the connector is configured, validate that it is listed on the Collection Status screen with the status Active. This signifies that the connector is configured successfully and data is ready to be ingested.
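For reference, the following is a minimal sketch of how records can be pulled from a Kinesis data stream with boto3, mirroring the configuration fields above. It is only an illustration of the consumption pattern, not the connector's actual implementation; the stream name, region, and records limit are hypothetical placeholders.

import boto3

# Hypothetical values mirroring the connector configuration fields
kinesis = boto3.client("kinesis", region_name="us-east-1")
stream_name = "my-log-stream"
records_limit = 10000  # corresponds to Records Limit (maximum 10000 per call)

# Enumerate the stream's shards and read records from each one
shards = kinesis.describe_stream(StreamName=stream_name)["StreamDescription"]["Shards"]
for shard in shards:
    iterator = kinesis.get_shard_iterator(
        StreamName=stream_name,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
    )["ShardIterator"]
    batch = kinesis.get_records(ShardIterator=iterator, Limit=records_limit)
    for record in batch["Records"]:
        print(record["Data"])  # raw record payload (bytes)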
