Steps for Data Backup activity


All the data ingested by the platform is stored within the /dnif/<Deployment-Key>/data folder. This data folder contains all stored log data that has been processed by DNIF.

Taking a backup of the /data folder is straightforward:

  1. Stop all running services on the Datastore (DS) or A10.

  2. Take a backup of the configuration files before backing up the /data folder.

  3. Back up the /data folder using one of these methods:
    • Compress the /data folder and move the archive to a safe location.
    • Remote sync the /data folder over SSH using rsync.
  4. Start all services.
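
The four steps above can be sketched as a single shell script. This is illustrative only: the deployment-key path is the placeholder used throughout this guide, the supervisorctl calls are guarded so they only run where the tool exists (i.e. inside the container), and a scratch folder stands in for the real path when run elsewhere.

```shell
#!/bin/sh
# Sketch of the backup cycle described above. Illustrative only:
# the deployment-key path is a placeholder, and a scratch folder is
# used as a stand-in when the real path does not exist.
set -eu

DNIF_HOME="/dnif/CnxxxxxxxxxxxxV8"        # placeholder deployment-key path
if [ ! -d "$DNIF_HOME" ]; then
    DNIF_HOME=$(mktemp -d)                # demo fallback outside a DNIF host
    mkdir -p "$DNIF_HOME/data"
    echo "sample event" > "$DNIF_HOME/data/sample.log"
fi

# 1. Stop all running services (only possible inside the container).
if command -v supervisorctl >/dev/null 2>&1; then
    supervisorctl stop all
fi

# 2. Back up configuration files first (see the Configuration-Backup
#    help page), then compress the /data folder with a date stamp.
cd "$DNIF_HOME"
BACKUP_NAME="backup_data-$(date +%d_%m_%Y_%H%M%S).tar.gz"
tar -czf "$BACKUP_NAME" ./data/

# 3. Restart all services.
if command -v supervisorctl >/dev/null 2>&1; then
    supervisorctl start all
fi
echo "Backup written to $DNIF_HOME/$BACKUP_NAME"
```

Each of these steps is covered in detail below.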

Stop all running services on the dnif-container:

It’s important to stop all running services in order to halt operations on the DS before taking a backup.

  • Log in to your Datastore or A10 container. Read more on how to log in to the docker container.
  • Execute this command to stop all running services.
    $supervisorctl stop all
    

    The supervisorctl command will bring all running dnif-services to a halt.

    Output:

    [email protected]:~# supervisorctl stop all
    importstore: stopped
    dataprocessor:dataprocessor_03: stopped
    ilicrscheduler: stopped
    din: stopped
    config_reporter: stopped
    datastoreapi: stopped
    dataprocessor:dataprocessor_02: stopped
    dataprocessor:dataprocessor_01: stopped
    dataprocessor:dataprocessor_04: stopped
    syslog_listerner_udp: stopped
    events_parser_processor: stopped
    notifemail: stopped
    ilidsapi: stopped
    ilicrapi: stopped
    agent79: stopped
    syslog_listerner_tcp: stopped
    sshd: stopped
    nameserver: stopped
    celeryds: stopped
    celerycr: stopped
    [email protected]:~#
    

Take a backup of Configuration files:

To back up the dnif-container configuration files, refer to the Configuration-Backup help page. We recommend taking a fresh backup of the configuration files every time you back up the /data folder.

Backing up the /data folder:

Depending on the size of the /data folder and the space available, choose one of the backup options below:

  • Compress the /data folder and move to a safe location:

    If your /data folder is small and the host machine has enough free storage space for the backup file, you can proceed with this option.

    1. First, log in to your Datastore or A10 container. Read more on how to log in to the docker container.
    2. Next, move to the /dnif/<Deployment-Key>/ folder path.
        $cd /dnif/CnxxxxxxxxxxxxV8/
      
    3. To make a compressed backup copy of the /data folder, you can use the tar command.
        $tar -czf backup_data-$(date +%d_%m_%Y_%H%M%S).tar.gz ./data/
      

      In the tar command, $(date +%d_%m_%Y_%H%M%S) names the backup file with its creation date and time stamp; this keeps backup records organized. You can rename the file if required.

      Example:

        [email protected]:/dnif/CnxxxxxxxxxxxxV8# tar -czf backup_data-$(date +%d_%m_%Y_%H%M%S).tar.gz ./data/
        [email protected]:/dnif/CnxxxxxxxxxxxxV8# ls
        LICENSE  NAvfields   UPLOADS   backup_data-09_10_2017_120939.tar.gz  csltuconfig  data   intel    master_config.json  ssl   thresholds  updates  vStores    Metrics  SyncDevice  a10conf.json  config  csltustat   geoip  log    reports  ssl2  tool  vFields
        [email protected]:/dnif/CnxxxxxxxxxxxxV8#
      

      Now you can move the backup file (backup_data-dd_mm_yyyy_HHMMSS.tar.gz) to another location for safekeeping.
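
      Before moving the archive off the host, it is worth checking that it is readable: `tar -tzf` lists the archive members and `gzip -t` tests compression integrity. The snippet below builds a small demo archive in a scratch folder (the folder name and contents are made up) so the checks can be tried anywhere; point the same commands at your real backup_data-*.tar.gz file instead.

```shell
# Demo: verify a backup archive before moving it off the host. The
# data folder and its contents are stand-ins created for this check.
set -eu
WORKDIR=$(mktemp -d)
cd "$WORKDIR"
mkdir -p data
echo "sample event" > data/sample.log
tar -czf backup_data-demo.tar.gz ./data/

tar -tzf backup_data-demo.tar.gz   # list members; a nonzero exit means the archive is unreadable
gzip -t backup_data-demo.tar.gz    # test gzip integrity without extracting
```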

  • Remote sync the /data folder over ssh using rsync

    If the /data folder is larger than the free space available on your host machine, remotely synchronize the data to any SSH-enabled server instead. To do this, you need SSH access to the remote machine.

    For remote sync, we’re using rsync over ssh. We’ll copy all content in the /data folder to the remote backup location using these steps:

    1. First, log in to your Datastore or A10 container. Read more on how to log in to the docker container.
    2. Next, move to the /dnif/<Deployment-Key>/ folder path.
      $cd /dnif/CnxxxxxxxxxxxxV8/
      
    3. To copy all files from the /data folder to the remote machine, use this command:
      $rsync -v -r -e ssh --progress ./data/ [email protected]:/var/tmp/data/
      

      The command shown recursively copies all data from the /data folder to the /var/tmp/data path on the remote machine. Here, user is the SSH username and ssh-machine-ip is the domain name or IP address of the backup machine.

    Example with Output:

      [email protected]:/dnif/CnxxxxxxxxxxxxV8# rsync -v -r -e ssh --progress ./data/ [email protected]:/var/tmp/data
    
      [email protected]'s password:
      sending incremental file list
      ./
      DNIF/
      DNIF/nodes/
      DNIF/nodes/0/
      DNIF/nodes/0/node.lock
                	0 100%	0.00kB/s	0:00:00 (xfr#2, to-chk=151/157)
      DNIF/nodes/0/_state/
      DNIF/nodes/0/_state/global-1
               	67 100%	1.72kB/s	0:00:00 (xfr#3, to-chk=148/157)
      DNIF/nodes/0/indices/ctdp_conf/0/_state/state-2
               	39 100%	0.98kB/s	0:00:00 (xfr#4, to-chk=138/157)
      DNIF/nodes/0/indices/ctdp_conf/0/index/
      DNIF/nodes/0/indices/ctdp_conf/0/index/segments.gen
               	36 100%	0.90kB/s	0:00:00 (xfr#5, to-chk=137/157)
      ------- snipped -----
      sent 962,800 bytes  received 10,877 bytes  92,731.14 bytes/sec
      total size is 1,890,091  speedup is 1.94
      [email protected]:/dnif/CnxxxxxxxxxxxxV8#
    
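Because rsync only transfers differences, re-running the same command with `--dry-run` and `--checksum` is a cheap way to verify a completed backup: it reports any file whose content differs without copying anything. The sketch below demonstrates this with local scratch folders standing in for the real /data folder and the remote destination (no SSH involved); substitute your real source and user@ssh-machine-ip: destination in practice.

```shell
# Demo: verify a completed rsync copy with --dry-run (-n) and
# --checksum (-c). Local scratch folders stand in for the real /data
# folder and the remote destination; no SSH is involved in this sketch.
set -eu
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "sample event" > "$SRC/sample.log"

if command -v rsync >/dev/null 2>&1; then
    rsync -r "$SRC/" "$DST/"            # initial copy
    rsync -r -n -c -v "$SRC/" "$DST/"   # dry-run verify: lists no files if identical
else
    echo "rsync not installed; skipping demo"
fi
```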

Start all services

To restart all the services that we stopped at the beginning of this exercise, restart the dnif-container from the host machine or run the following command inside the container:

$supervisorctl start all
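
After starting the services, you can confirm that every one reached the RUNNING state by filtering the output of supervisorctl status. The awk filter below is a sketch; the two status lines it consumes are fabricated sample input, since supervisorctl is only available inside the container — there, you would pipe the real `supervisorctl status` output into the same filter.

```shell
# Sketch: flag any service whose state is not RUNNING. The two status
# lines below are fabricated sample input; inside the container, pipe
# `supervisorctl status` into the same awk filter instead.
printf '%s\n' \
  'importstore                      RUNNING   pid 101, uptime 0:01:02' \
  'din                              RUNNING   pid 102, uptime 0:01:02' |
awk '$2 != "RUNNING" {print "not running:", $1; bad=1} END {exit bad}' &&
echo "all services running"
```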