Uploading Custom Files in DNIF
We’ve added a new feature to DNIF - the ability to import custom files, containing log data or incident-related data, directly into the DNIF platform.
Any form of data (be it related to transactions, networking, Industrial Control Systems, sensors, etc.) can simply be uploaded to get new insights into it.
Once these files are imported, we can query the data in that specific file and perform heuristic analysis.
This new feature is what we call Creation of Custom Data Store Events.
Let’s create a custom data store event with a sample file.
You can check out the video here, or skip it for the step-by-step tutorial below.
Step 1: Getting the Sample File Ready
First, we need to make sure that the file we want to analyze meets the following requirements before uploading:
- The file size should be less than or equal to 100 MB
- The supported file formats are xls, xlsx, csv and json
- The data in the file should have appropriate column names or headers
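The checks above can be sketched in a few lines of Python before uploading. This is just an illustrative pre-flight helper; the function name and the simple header check are assumptions, not part of DNIF:

```python
import os

MAX_SIZE = 100 * 1024 * 1024  # 100 MB upload limit
ALLOWED = {".xls", ".xlsx", ".csv", ".json"}  # supported formats

def pre_upload_check(path):
    """Return a list of problems; an empty list means the file looks uploadable."""
    problems = []
    if os.path.getsize(path) > MAX_SIZE:
        problems.append("file is larger than 100 MB")
    ext = os.path.splitext(path)[1].lower()
    if ext not in ALLOWED:
        problems.append(f"unsupported format: {ext}")
    if ext == ".csv":
        # Headers: the first line should name every column, with no blanks.
        with open(path, newline="") as f:
            header = f.readline().strip()
        if not header or "" in [h.strip() for h in header.split(",")]:
            problems.append("missing or empty column headers")
    return problems
```

Running this before an upload attempt saves a round trip to the console when a file is over the limit or in the wrong format.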
Note - For files larger than 100 MB, instead of uploading via the web console, simply log into the Data Store server and place the file in the UPLOADS folder within the Docker installation directory.
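If you don’t have a suitable file handy, a few lines of Python will generate a small sample CSV that satisfies the requirements above. The column names and rows here are made up purely for the demo:

```python
import csv

# A tiny made-up log: every row supplies a value under every header.
rows = [
    {"timestamp": "2021-04-01 10:00:00", "src_ip": "10.0.0.5", "action": "login"},
    {"timestamp": "2021-04-01 10:05:12", "src_ip": "10.0.0.9", "action": "logout"},
]

with open("demo.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["timestamp", "src_ip", "action"])
    writer.writeheader()   # the first line becomes the column headers
    writer.writerows(rows)
```

The resulting demo.csv is well under 100 MB, uses a supported format, and has proper column headers, so it is ready to upload.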
Step 2: Uploading the File
Once we have logged into our console, we can go ahead and visit the “Search” page.
Once the page has finished loading, click on the Settings icon - a small gear-shaped icon.
We are now greeted with a dropdown. Select the option “Add Event Store”.
Now, we get a view titled “Create Event Store” where we can enter details such as the “Store Name”. For this example, let’s enter the name “demo” and select the radio button named “Create and Upload”. We are then presented with an option to upload a file.
All done? Click on the “Save” button and you’re set!
Step 3: Analyzing the File
Now we can go back to our “Search” page and execute the following query:
_fetch * from demo limit 10
Notice the word demo: we retrieved all events/data present in the data store event, which in our case is demo. Once the query executes, we can see the data in a more organized and enriched format.
If you notice, some fields are present here that were not in the original file.
These fields are added by the DNIF platform itself to further enrich the data.
The fields added are:
- $CNAMTime - the date and time stamp at which the event/data was received by the platform
- $EvtLen - the size of the individual event/data entry present within the file
- $ScopeID - a string shared by all entries with a common source/origin, i.e., entries present within the same file
Note - Each uploaded file becomes a different “data store event” and has a unique “$ScopeID” for its events.
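To make the idea concrete, here is a toy sketch of how per-file enrichment along the lines of $EvtLen and $ScopeID could be computed. This is an assumption for illustration only - it mirrors the field names above but is not DNIF’s actual implementation:

```python
import hashlib
import time

def enrich(events, source_file):
    """Attach toy counterparts of the enrichment fields to each raw event.

    The scope ID is derived from the source file name, so all events from
    one file share it, while events from different files get different values.
    """
    scope_id = hashlib.sha1(source_file.encode()).hexdigest()[:12]
    enriched = []
    for raw in events:
        enriched.append({
            "event": raw,
            "CNAMTime": time.strftime("%Y-%m-%d %H:%M:%S"),  # receive time
            "EvtLen": len(raw.encode()),                      # event size in bytes
            "ScopeID": scope_id,                              # shared per file
        })
    return enriched
```

Note how two events from the same file end up with the same ScopeID, while an event from a different file gets a different one - exactly the common-origin property described above.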
Well, that was the last step. Go ahead, try uploading some files with interesting data and see what happens!