Create actionable data from your vulnerability scans
VulnWhisperer is a vulnerability data and report aggregator. It pulls all the reports from your scanners and writes each one to a file with a unique filename, which is then fed into Logstash. Logstash extracts data from the filename, tags all of the information inside the report (see the logstash_vulnwhisp.conf file), and ships the data to Elasticsearch to be indexed.
- Nessus (v6 & v7)
- Qualys Web Applications
- Qualys Vulnerability Management (Need license)
- OpenVAS
- Tenable.io
- Nexpose
- Insight VM
- NMAP
- More to come
- Follow the install requirements
- Fill out the section you want to process in the example.ini file
- Modify the IP settings in the Logstash files to accommodate your environment and import them into your Logstash conf directory (default is /etc/logstash/conf.d/)
- Import the Kibana visualizations
- Run VulnWhisperer (a condensed example of the Logstash and run commands follows this list)
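As a condensed sketch of the Logstash and run steps (the paths here are examples; adjust them to your environment):
sudo cp /path/to/VulnWhisperer/logstash/*.conf /etc/logstash/conf.d/
vuln_whisperer -c configs/example.ini -s nessus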
- Elastic Stack 5.x
- Python 2.7
- Vulnerability Scanner
- Optional: Message broker such as Kafka or RabbitMQ
First, install the system dependencies:
sudo apt-get install zlib1g-dev libxml2-dev libxslt1-dev
Second, install the dependent modules:
cd deps/qualysapi
python setup.py install
Third, install the Python requirements and VulnWhisperer itself:
pip install -r /path/to/VulnWhisperer/requirements.txt
cd /path/to/VulnWhisperer
python setup.py install
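If the install succeeded, the vuln_whisperer entry point should now be on your PATH; a quick sanity check:
which vuln_whisperer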
Now you're ready to pull down scans (see the Run section below).
The following instructions are intended as a sample guide in the absence of an existing ELK cluster/node. They cover a Debian example install of a stand-alone Elasticsearch and Kibana node.
While Logstash is included in this install guide, it is recommended that a separate host pulling the VulnWhisperer data run Logstash and ship the data to the Elasticsearch node.
Please note that a docker-compose.yml is available as well.
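A minimal sketch of the container route, assuming you run it from the directory that holds docker-compose.yml:
cd /path/to/VulnWhisperer
docker-compose up -d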
Debian: (https://www.elastic.co/guide/en/elasticsearch/reference/5.6/deb.html)
sudo apt-get install -y default-jre
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/5.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-5.x.list
sudo apt-get update && sudo apt-get install elasticsearch kibana logstash
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
sudo /bin/systemctl enable kibana.service
sudo /bin/systemctl enable logstash.service
Elasticsearch & Kibana Sample Config Notes
Using your favorite text editor:
- Grab your host IP and change the IP in your /etc/elasticsearch/elasticsearch.yml file (this defaults to 'localhost')
- Validate Elasticsearch is set to run on port 9200 (the default)
- Grab your host IP and change the IP in your /etc/kibana/kibana.yml file (this defaults to 'localhost'), and validate that Kibana is pointing to the correct Elasticsearch IP (set in the previous step)
- Validate Kibana is set to run on port 5601 (the default); sample settings for both files are shown after this list
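As a reference, a minimal set of edits might look like the following (10.0.0.5 is a placeholder; substitute your host's IP):
/etc/elasticsearch/elasticsearch.yml:
network.host: 10.0.0.5
http.port: 9200
/etc/kibana/kibana.yml:
server.host: "10.0.0.5"
server.port: 5601
elasticsearch.url: "http://10.0.0.5:9200"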
Start Elasticsearch and Kibana, and validate that they are running and communicating with one another:
sudo service elasticsearch start
sudo service kibana start
OR
sudo systemctl start elasticsearch.service
sudo systemctl start kibana.service
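To validate that Elasticsearch is answering, curl it on port 9200 (replace the IP with whatever you set in elasticsearch.yml); a healthy node returns a small JSON document with its name and version:
curl http://10.0.0.5:9200
Kibana should likewise respond in a browser at http://10.0.0.5:5601.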
Logstash Sample Config Notes
- Copy/move the Logstash .conf files from /VulnWhisperer/logstash/ to /etc/logstash/conf.d/
- Validate that each Logstash .conf file's input contains the correct location of the VulnWhisperer scans in the input file path identified below:
input {
  file {
    path => "/opt/vulnwhisperer/nessus/**/*"
    start_position => "beginning"
    tags => "nessus"
    type => "nessus"
  }
}
- Validate that each Logstash .conf file's output contains the correct Elasticsearch IP set during the previous step above (this defaults to localhost):
output {
  if "nessus" in [tags] or [type] == "nessus" {
    #stdout { codec => rubydebug }
    elasticsearch {
      hosts => [ "localhost:9200" ]
      index => "logstash-vulnwhisperer-%{+YYYY.MM}"
    }
  }
}
- Validate that Logstash has the correct file permissions to read the location of the VulnWhisperer scans (see the check below)
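For example, assuming the default scan path from the input section above, you can confirm the logstash user can actually read the files:
sudo -u logstash ls -lR /opt/vulnwhisperer/nessus/
If that command fails with a permission error, grant read access (for example with chmod -R o+rX /opt/vulnwhisperer) before starting Logstash.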
Once configured, run Logstash. Running Logstash as a service will pick up all of the files in /etc/logstash/conf.d/; if you would like to run only one Logstash file, use the command below:
Logstash as a service:
sudo service logstash start
OR
sudo systemctl start logstash.service
Single Logstash file:
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/1000_nessus_process_file.conf
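Once Logstash has processed a scan, you can confirm that documents are being indexed by listing the VulnWhisperer indices (adjust the host if Elasticsearch is not local):
curl 'http://localhost:9200/_cat/indices/logstash-vulnwhisperer-*?v'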
There are a few configuration steps to setting up VulnWhisperer:
- Configure the ini file
- Set up the Logstash files
- Import the Elasticsearch templates (see the example after this list)
- Import the Kibana dashboards
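One way to load an Elasticsearch template is with curl against the _template API; the template name and JSON path below are illustrative, so point them at the template file shipped in your copy of the repository:
curl -XPUT 'http://localhost:9200/_template/logstash-vulnwhisperer' -H 'Content-Type: application/json' -d @/path/to/VulnWhisperer/elasticsearch/logstash-vulnwhisperer-template.json
The Kibana dashboards and visualizations can be imported through the Kibana UI under Management > Saved Objects.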
To run, fill out the configuration file with your vulnerability scanner settings, then execute VulnWhisperer from the command line:
vuln_whisperer -c configs/example.ini -s nessus
or
vuln_whisperer -c configs/example.ini -s qualys
If you're running Linux, be sure to set up a cron job to remove old files that get stored in the database. Be sure to change .csv if you're using json (an example is shown after the crontab entries below).
Set up crontab -e with the following config (modify it for your environment); this will run VulnWhisperer each night at 01:30:
00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.csv' -ctime +3 -exec rm {} \;
30 1 * * * /usr/local/bin/vuln_whisperer -c /opt/vulnwhisp/configs/example.ini
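If your scanner output is JSON rather than CSV, the cleanup entry would instead look like this:
00 1 * * * /usr/bin/find /opt/vulnwhisp/ -type f -name '*.json' -ctime +3 -exec rm {} \;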
For Windows, you may need to type the full path of the VulnWhisperer binary located in the bin directory.
Big thank you to Justin Henderson for his contributions to VulnWhisperer!