DEVICE INTEGRATION
- Palo Alto (Device Integration)
- Dell Cylance Endpoint
- McAfee Web Gateway
- Imperva WAF
- Darktrace
- Forescout CounterACT
- Juniper Cortex Threat
- Zscaler
- Sophos
- Sophos Endpoint
- Trend Micro
- Sophos Cyberoam Firewall
- Radware WAF
- NetScaler WAF
- Ubuntu
- Juniper SRX
- Forcepoint Websense
- FireEye
- Forcepoint DLP
- F5 BIG-IP ASM
- CyberArk PIM
- CheckPoint
- Bluecoat Proxy
- Accops Hyworks
- Barracuda WAF Syslog
- Forwarding F5 Distributed Cloud Services Logs to DNIF over TLS
- Jira Cloud
- Aruba ClearPass

CONNECTORS

- 1Password Connector
- Abnormal Security
- Akamai Netstorage
- Atlassian
- Auth0 Connector
- AWS CloudTrail
- AWS Kinesis
- AWS S3
- AWS S3 (Optimized)
- AWS S3 Optimized Cross Account Connector
- Azure Blob Storage Connector
- Azure Event Hub
- Azure NSG
- Beats
- Box
- Cisco Duo
- Cloudflare Logpull Connector Setup Guide
- CloudWatch Connector
- Cortex XDR
- CrowdStrike
- Cyble Vision
- Device42
- Dropbox Connector
- GCP
- GCP Pub/Sub
- GitHub
- Google Workspace
- Haltdos
- HTTP Connector
- HubSpot Connector
- Indusface
- Jira Connector
- Microsoft Graph Security API
- Microsoft Intune
- Mimecast
- Netflow
- Netskope Connector
- Network Traffic Analysis
- NextDLP Reveal
- Office 365
- Okta
- OneLogin
- Orca
- PICO Legacy Connector
- Prisma Alerts
- Prisma Incidents
- Salesforce
- Salesforce Pub/Sub Connector
- Shopify Connector
- Slack
- Snowflake
- Snyk Connector
- Syslog
- TCP
- Tenable Vulnerability Management Connector
- TLS
- Trend Micro Audit Logs
- Workday HCM Connector
- Zendesk
- Zoom
- JumpCloud Connector
- Sophos Connector
- Tenable Security Center Connector
- AWS GuardDuty Connector
- Trend Micro Vision One Connector
- RediffMail Pro Connector
- Microsoft Sentinel
- Microsoft Exchange Online Connector

DATA INGESTION

HUNTING WITH WORKBOOKS

- Your first FIND with the HYPERCLOUD
- Create a Search Block
- Create a Signal Block
- Create a Text Block
- Create an Outlier Block
- Create a DQL Block
- Create an SQL Block
- Create a Code Block
- Create a Visual Block
- Create a Call Block
- Create a Return Block
- Create a Notification Block
- Schedule a Workbook
- Native Workbook
- Workbook Functions
- How to view Workbooks?
- Add Parameters to Workbook
- Working with Pass-through Content
- How to create a Workbook?
- Workbooks

DNIF Query Language (DQL)

SECURITY MONITORING
- Streamline Alert Analysis with Signal Tagging
- Workbook Versioning: Track, Collaborate, and Restore with Ease
- What is Security Monitoring?
- Creating Signal Suppression Rules
- Why EBA
- Signal Suppression Rule
- What are signals?
- View Signal Context Details
- Suspect & Target
- Source Stream
- Signal Filters
- Signal Data Export
- Signal Context Details
- Signal Confidence Levels
- Raise and View Signals
- Investigate Anywhere
- How to add a signal to a case?
- Graph View for Signals
- Global Signals
- False Positives
- Add Multiple Signals to a Case
- Add Comment to the Signal

OPERATIONS

MANAGE DASHBOARDS

MANAGE REPORTS

USER MANAGEMENT & ACCESS CONTROL

BILLING

MANAGING YOUR COMPONENTS

GETTING STARTED

INSTALLATION

SOLUTION DESIGN

AUTOMATION

- Active Directory
- AlienVault
- Asset Store
- ClickSend
- Domain Tools
- Fortigate
- GreenSnow
- JiraServiceDesk
- Microsoft Teams Channel
- New Relic
- Opsgenie
- PagerDuty
- Palo Alto
- ServiceNow
- Slack Configuration
- TAXII
- Trend Micro
- URLhaus
- User Store
- VirusTotal
- Webhook

TROUBLESHOOTING AND DEBUGGING

- TLS (Troubleshooting Procedure)
- TCP (Troubleshooting Procedure)
- Syslog (Troubleshooting Procedure)
- Salesforce (Troubleshooting Procedure)
- PICO
- Office 365 (Troubleshooting Procedure)
- GSuite
- GCP (Troubleshooting Procedure)
- Beats (Troubleshooting Procedure)
- Azure NSG (Troubleshooting Procedure)
- Azure Event Hub
- AWS S3 (Troubleshooting Procedure)

LICENSE MANAGEMENT

RELEASE NOTES
- December 4, 2025 - Content Update
- November 27, 2025 - Application Update
- October 28, 2025 - Content Update
- August 20, 2025 - Content Update
- August 5, 2025 - Application Update
- July 15, 2025 - Content Update
- June 13, 2025 - Content Update
- May 21, 2025 - Content Update
- April 17, 2025 - Content Update
- March 25, 2025 - Content Update
- March 18, 2025 - Application Update
- March 5, 2025 - Application Update
- January 29, 2025 - Content Update
- January 27, 2025 - Application Update
- December 30, 2024 - Content Update
- December 12, 2024 - Content Update
- December 3, 2024 - Application Update
- November 15, 2024 - Content Update
- October 26, 2024 - Application Update
- October 23, 2024 - Content Update
- October 16, 2024 - Application Update
- September 04, 2024 - Application Update
- September 04, 2024 - Content Update
- August 27, 2024 - Application Update
- July 30, 2024 - Application Update
- June 04, 2024 - Application Update
- April 24, 2024 - Application Update
- March 26, 2024 - Application Update
- February 19, 2024 - Application Update
- January 09, 2024 - Content Update
- January 09, 2024 - Application Update
- November 27, 2023 - Content Update
- November 27, 2023 - Application Update
- October 05, 2023 - Application Update (Release Notes v9.3.3)
- May 30, 2023 - Application Update (Release Notes v9.3.2)
- November 29, 2022 - Application Update (Release Notes v9.3.0)

API

POLICIES

SECURITY BULLETINS

BEST PRACTICES

DNIF AI

DNIF LEGAL AND SECURITY COMPLIANCE
Guidelines for Sanitizing Log Samples
Overview
To maintain privacy and security, it’s essential to properly sanitize log samples before sharing or using them in any context that may expose sensitive information. Follow these guidelines to ensure that no personally identifiable information (PII) or other sensitive data is included in the samples.
Definitions
Sanitization:
The process of modifying or removing sensitive information from data to prevent the exposure of PII or other confidential details. This can involve anonymization, pseudonymization, and the removal of specific data fields.
Anonymization:
Anonymization is the process of removing or modifying personal data in such a way that individuals cannot be identified, directly or indirectly. Once data is anonymized, it should be impossible to trace it back to an individual.
Anonymizing Identifiers
Anonymization should be applied to data elements that could uniquely identify an individual. This includes:
- Names: Replace with general descriptors (e.g., “User1,” “Customer A”).
- Email Addresses: Replace with non-descriptive placeholders (e.g., “email@example.com”).
- Phone Numbers: Replace with generic placeholders (e.g., “555-1234”).
- IP Addresses: Replace with non-identifiable placeholders (e.g., “192.0.2.0”).
Anonymization makes it impossible to identify individuals from the anonymized data alone.
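The snippet below is a minimal sketch of this kind of anonymization in Python, using the placeholder values suggested above. The regex patterns and the known_names parameter are illustrative assumptions: regular expressions can catch structured identifiers such as emails, IPs, and phone numbers, but names generally have to be supplied explicitly.

```python
import re

# A minimal sketch of regex-based anonymization (illustrative, not an
# exhaustive PII detector). Placeholder values follow the guidance above.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "email@example.com"),  # email addresses
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), "192.0.2.0"),      # IPv4 addresses
    (re.compile(r"\b\d{3}[-.\s]\d{4}\b"), "555-1234"),              # simple phone forms
]

def anonymize(line: str, known_names: list[str]) -> str:
    """Replace identifying values with general placeholders."""
    for pattern, placeholder in PATTERNS:
        line = pattern.sub(placeholder, line)
    # Names cannot be reliably found with regex alone; supply the names
    # you know appear in the logs and map each to a general descriptor.
    for i, name in enumerate(known_names, start=1):
        line = line.replace(name, f"User{i}")
    return line

print(anonymize("John Doe logged in from 10.1.2.3 via jdoe@corp.com",
                known_names=["John Doe"]))
# -> User1 logged in from 192.0.2.0 via email@example.com
```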
Pseudonymization:
Pseudonymization is the process of replacing private identifiers with fake identifiers or pseudonyms. While pseudonymized data can be used for analysis or sharing, it can still be re-identified if the pseudonym is matched with the original data.
Pseudonymizing Identifiers
Pseudonymization can be used for identifiers that do not need to be completely anonymized but should be altered to maintain some degree of confidentiality. This includes:
- User IDs: Replace with pseudonyms (e.g., “user123” becomes “uABC123”).
- Transaction IDs: Replace with a different but unique pseudonym (e.g., “txn5678” becomes “txnXYZ5678”).
- Custom Identifiers: Replace with systematic pseudonyms (e.g., “order789” becomes “orderABC789”).
Caution:
- When to Pseudonymize vs. Anonymize: Pseudonymize data that may still need to be re-linked for operational purposes but does not require direct identification of individuals. Anonymize data that should be completely de-identified to ensure privacy.
- Risk of Re-Identification: Pseudonymized data can still be re-identified if matched with the original dataset. Use pseudonymization carefully when there is a potential risk of unauthorized data correlation.
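One way to generate stable pseudonyms is a keyed hash, sketched below. The HMAC approach and the SECRET_KEY name are assumptions, not a documented DNIF mechanism; the guideline only requires that pseudonyms be unique and, where needed, re-linkable. Note that whoever holds the key can re-identify values, which is exactly the re-identification risk called out above.

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-and-store-securely"  # assumption: kept outside the logs

def pseudonymize(value: str, prefix: str = "u") -> str:
    """Map an identifier to a stable pseudonym via a keyed hash.

    The same input always yields the same pseudonym, so records remain
    correlatable across log lines -- which also means anyone holding
    SECRET_KEY can re-identify values (the risk noted above).
    """
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return prefix + digest[:8].upper()

print(pseudonymize("user123"))         # e.g. uA1B2C3D4 (stable per input)
print(pseudonymize("txn5678", "txn"))  # e.g. txn9F8E7D6C
```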
Recommended Best Practices for Sanitizing Log Samples
To protect sensitive information and ensure compliance with privacy standards, it is crucial to sanitize log samples before using them with our AI tools. Follow these guidelines:
- Avoid Including Sensitive Data: Do not include names, email addresses, phone numbers, or IP addresses that could identify your organization or its stakeholders. Replace them with placeholders.
- Remove Personally Identifiable Information (PII): Examples include names, addresses, phone numbers, email addresses, social security numbers, and account numbers. Replace PII with generic placeholders (e.g., John Doe becomes USER_NAME, 123 Main St becomes ADDRESS).
- Exclude Confidential Business Information: Examples include company financials, proprietary algorithms, and sensitive internal communications. Replace sensitive data with placeholders (e.g., proprietary_algorithm_v1 becomes ALGORITHM_NAME).
- Mask IP Addresses and Hostnames: Examples include 192.168.1.1 and myserver.company.com. Replace IP addresses with IP_ADDRESS and hostnames with HOSTNAME.
- Anonymize and Pseudonymize Identifiers: Examples include user IDs, session tokens, and unique identifiers. Replace them with generic terms:
  Original: User John Doe logged in from 192.168.1.1 using johndoe@example.com
  Anonymized: User [REDACTED] logged in from [REDACTED] using [REDACTED]
  Pseudonymized: User u12345 logged in from 192.0.2.0 using email@example.com
- Generalize Dates and Times: Examples include specific timestamps and dates of transactions. Use placeholders (e.g., 2023-05-01 08:00:00 becomes DATE_TIME).
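Below is a combined sketch of these masking rules, using the placeholder tokens suggested above (IP_ADDRESS, HOSTNAME, DATE_TIME, USER_NAME). The patterns are deliberately simple illustrations and assume the caller supplies any known names; they are not a complete PII scanner.

```python
import re

# A combined sketch of the masking rules above, using the suggested
# placeholder tokens. Simple illustrations, not a complete PII scanner.
RULES = [
    (re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b"), "IP_ADDRESS"),
    (re.compile(r"\b[\w-]+(?:\.[\w-]+)+\.(?:com|net|org|local)\b"), "HOSTNAME"),
    (re.compile(r"\b\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}\b"), "DATE_TIME"),
]

def sanitize(line: str, names: list[str] | None = None) -> str:
    for pattern, token in RULES:
        line = pattern.sub(token, line)
    for name in names or []:          # known names supplied by the caller
        line = line.replace(name, "USER_NAME")
    return line

print(sanitize("2023-05-01 08:00:00 John Doe on myserver.company.com from 192.168.1.1",
               names=["John Doe"]))
# -> DATE_TIME USER_NAME on HOSTNAME from IP_ADDRESS
```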
- Log Sample Entry: Paste log samples as shown below to keep the log format consistent for accurate analysis.
- Enter Each Log Separately: Enter each log sample as a separate entry on a new line.
- No Empty New Lines: Do not leave empty lines between log entries.
- Use Unique Log Events: Provide log samples for unique log events to improve the quality of the DNIF AI extractor; this also helps when building the extractor manually. A small normalization sketch follows the example below.
ENTER LOG SAMPLES AS FOLLOWS:
sample log 1
sample log 2
sample log 3
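The hypothetical helper below normalizes pasted samples to match these rules: one log per line, no empty lines, and duplicates removed so only unique events remain. It is an assumed convenience function, not part of the DNIF product.

```python
def prepare_samples(raw: str) -> str:
    """Normalize pasted samples: one log per line, no empty lines, unique events."""
    seen: set[str] = set()
    lines: list[str] = []
    for line in raw.splitlines():
        line = line.strip()
        if not line or line in seen:  # drop empty lines and repeated events
            continue
        seen.add(line)
        lines.append(line)
    return "\n".join(lines)

raw = """sample log 1

sample log 2
sample log 1
"""
print(prepare_samples(raw))
# sample log 1
# sample log 2
```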
