20 March 2020

pfSense/OPNsense | Elastic Stack v7.6+ | Ubuntu 18.04+

pfELK (pfSense or OPNsense)

Visit https://pfelk.3ilson.dev for Scripted, Ansible and Docker installations

 

Prerequisites 
Ubuntu Server v18.04+
pfSense v2.4.4+ or OPNsense 20.1+

Navigate to the following within pfSense
Status>>System Logs [Settings]
1) Enable Remote Logging
2) Provide 'Server 1' address (this is the IP address of the ELK installation - ex: 192.168.1.60:5140)
3) Select "Firewall events"

Preparation


1. Create Directory Structure
sudo mkdir -p /data/pfELK
sudo mkdir /data/pfELK/configurations
sudo mkdir /data/pfELK/patterns
sudo mkdir /data/pfELK/templates
sudo mkdir /data/pfELK/GeoIP
sudo mkdir /data/pfELK/logs
sudo chmod 777 /data/pfELK/logs
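Since every later step works inside /data/pfELK, the same tree can also be created with a short loop; a sketch, shown here against a scratch root (an assumption for illustration) so it can be tried without sudo:

```shell
# Scratch root so this is safe to run as a regular user;
# set root="" (and use sudo) for a real install under /data.
root=/tmp/pfelk-demo
for d in configurations patterns templates GeoIP logs; do
  mkdir -p "$root/data/pfELK/$d"
done
chmod 777 "$root/data/pfELK/logs"
ls "$root/data/pfELK"
```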
2. Add MaxMind Repository
sudo add-apt-repository ppa:maxmind/ppa
3. Download and Install the Public GPG Signing Key
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
4. Download and Install apt-transport-https Package (Debian)
sudo apt-get install apt-transport-https
5. Add Elasticsearch|Logstash|Kibana Repositories (version 7+)
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
6. Update
sudo apt-get update
7. Install Java 11 LTS
sudo apt install openjdk-11-jre
8. Install MaxMind
sudo apt-get install geoipupdate
9. Update MaxMind with Credentials:
sudo nano /etc/GeoIP.conf
10. Modify lines 7 & 8 as Follows (without < >):
AccountID <Input Your Account ID>
LicenseKey <Input Your LicenseKey>
11. Modify line 13 as Follows:
EditionIDs GeoLite2-City GeoLite2-Country GeoLite2-ASN
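Taken together, the relevant portion of /etc/GeoIP.conf should then read as follows (the AccountID and LicenseKey values below are placeholders for your own MaxMind credentials):

```
AccountID 123456
LicenseKey 0123456789abcdef
EditionIDs GeoLite2-City GeoLite2-Country GeoLite2-ASN
```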
12. Download MaxMind Databases:
sudo geoipupdate -d /data/pfELK/GeoIP/
13. Add cron (automatically updates MaxMind every week on Sunday at 1700hrs; note that files in /etc/cron.weekly are plain scripts, so a crontab-format entry belongs in the root crontab instead):
sudo crontab -e
14. Add the Following and Save/Exit
00 17 * * 0 geoipupdate -d /data/pfELK/GeoIP
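For reference, the five leading fields of that crontab entry are minute, hour, day-of-month, month, and day-of-week, so the line runs at 17:00 every Sunday:

```
# minute(00) hour(17) day-of-month(*) month(*) day-of-week(0 = Sunday)
00 17 * * 0 geoipupdate -d /data/pfELK/GeoIP
```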
Install
Elasticsearch v7.6+ | Kibana v7.6+ | Logstash v7.6+
Elastic Stack

15. Install Elasticsearch|Kibana|Logstash
sudo apt-get install elasticsearch && sudo apt-get install kibana && sudo apt-get install logstash

Configure Kibana|v7.6+

16. Configure Kibana
sudo nano /etc/kibana/kibana.yml
17. Amend the Following Lines and Save (/etc/kibana/kibana.yml)
server.port: 5601
server.host: "0.0.0.0"
Configure Logstash|v7.6+

18. Change Directory (preparation for configuration files)
cd /data/pfELK/configurations
19. Download the Following Configuration Files
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/01-inputs.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/05-firewall.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/30-geoip.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/50-outputs.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/10-others.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/20-suricata.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/25-snort.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/35-rules-desc.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/40-dns.conf
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations/45-cleanup.conf
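Since all ten files live under the same repository directory, they can also be fetched in one pass; a sketch that builds the URL list first (the scratch file in /tmp is an assumption for illustration) so you can inspect it before downloading:

```shell
# Build the list of configuration URLs (the same ten files listed above).
base="https://raw.githubusercontent.com/3ilson/pfelk/master/data/configurations"
: > /tmp/pfelk-conf-urls.txt
for f in 01-inputs 05-firewall 10-others 20-suricata 25-snort \
         30-geoip 35-rules-desc 40-dns 45-cleanup 50-outputs; do
  echo "$base/$f.conf" >> /tmp/pfelk-conf-urls.txt
done
cat /tmp/pfelk-conf-urls.txt
# Then, from /data/pfELK/configurations, download them all:
#   sudo wget -qi /tmp/pfelk-conf-urls.txt
```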

20. Navigate to Patterns Folder

cd /data/pfELK/patterns/
21. Download the Following Configuration File
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/patterns/pfelk.grok
22. Navigate to Templates Folder
cd /data/pfELK/templates
23. Download the Following Template File
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/data/templates/pf-geoip-template.json

24. Navigate to Logstash Folder
cd /etc/logstash
25. Download the Following Configuration Files
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/pipelines.yml
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/logstash.yml

26. Edit (01-inputs.conf)
sudo nano /data/pfELK/configurations/01-inputs.conf
27. Revise/Update with the pfSense/OPNsense IP Address (01-inputs.conf)
# 01-inputs.conf
input {
  udp {
    port => 5140
  }
}
filter {
  #Adjust to match the IP address of pfSense or OPNsense
  if [host] =~ /172\.22\.33\.1/ {
    mutate {
      add_tag => [ "pf", "Ready" ]
      add_field => [ "[observer][type]", "firewall" ]
    }
  }
  #To enable or ingest multiple pfSense or OPNsense instances uncomment the below section
  ##############################
  #if [host] =~ /172\.2\.22\.1/ {
  #  mutate {
  #    add_tag => ["pf", "Ready"]
  #    add_field => [ "[observer][type]", "firewall" ]
  #  }
  #}
  ##############################
  if "pf" in [tags] {
    grok {
      # OPNsense - Enable/Disable the line below based on firewall platform
      match => { "message" => "%{SYSLOGTIMESTAMP:[event][created]} %{SYSLOGHOST:[observer][name]} %{DATA:labels}(?:\[%{POSINT:pf_pid}\])?: %{GREEDYDATA:pf_message}" }
      ########################################################################################################################################
      # pfSense - Enable/Disable the line below based on firewall platform
      # match => { "message" => "%{SYSLOGTIMESTAMP:[event][created]} %{DATA:labels}(?:\[%{POSINT:[event][id]}\])?: %{GREEDYDATA:pf_message}" }
    }
    mutate {
      rename => { "[message]" => "[event][original]"}
      remove_tag => "Ready"
    }
  }
}
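One caveat about the `if [host] =~ /172\.22\.33\.1/` conditional above: Logstash's `=~` performs an unanchored regex match, so that pattern also matches hosts such as 172.22.33.10 or 172.22.33.115. The same behavior can be demonstrated with grep:

```shell
# A host that is NOT 172.22.33.1 but still contains it as a prefix.
host="172.22.33.10"
echo "$host" | grep -q  '172\.22\.33\.1' && echo "unanchored: matches"
echo "$host" | grep -qx '172\.22\.33\.1' || echo "anchored: no match"
```

If you only want the exact firewall address to be tagged, anchor the pattern in the conditional, e.g. /^172\.22\.33\.1$/.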
28. Navigate to pfELK Directory
cd /data/pfELK
29. Download error-data.sh
sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/error-data.sh
30. Make error-data.sh Executable
sudo chmod +x /data/pfELK/error-data.sh
31. Disable Swap
sudo swapoff -a
32. Update Time Zone
# Update the timezone as needed - http://joda-time.sourceforge.net/timezones.html
sudo timedatectl set-timezone EST


Configure Services

Automatic Start (on boot)
33. Start Services on Boot as Services (you'll need to reboot or start manually to proceed)
sudo /bin/systemctl daemon-reload
sudo /bin/systemctl enable elasticsearch.service
sudo /bin/systemctl enable kibana.service
sudo /bin/systemctl enable logstash.service
Manual Start

34. Start Services Manually
sudo -i service elasticsearch start
sudo -i service kibana start
sudo -i service logstash start

Point your browser to the ELK host IP on port 5601 (ex: 192.168.1.1:5601)
*You may have to wait a few minutes to allow log retrieval

35. Configuring Patterns

  • Click the gear icon (Management) in the lower left
  • Click Kibana >> Index Patterns
  • Click Create Index Pattern
  • Type "pf-*" into the input box, then click Next Step
  • Select @timestamp as the Time Filter field and click Create Index Pattern
36. Import dashboards

  • In your web browser go to the ELK local IP using port 5601 (ex: 192.168.0.1:5601)
  • Click Management -> Saved Objects
  • You can import the dashboards found in the Dashboard folder via the Import button in the top-right corner.


Testing/Troubleshooting

37. Elasticsearch
curl -X GET http://localhost:9200
{
  "name" : "NYLJDFe",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "7krQg2MzR0irVJ6gNAB7fg",
  "version" : {
    "number" : "7.6.2",
    "build_hash" : "253032b",
    "build_date" : "2017-10-31T05:11:34.737Z",
    "build_snapshot" : false,
    "lucene_version" : "6.6.1"
  },
  "tagline" : "You Know, for Search"
}
38. Status (Elasticsearch)
systemctl status elasticsearch.service
elasticsearch.service - Elasticsearch
   Loaded: loaded (/usr/lib/systemd/system/elasticsearch.service; disabled; vendor preset: enabled)
   Active: active (running) since Tue 2020-03-13 20:53:51 EDT; 13h ago
     Docs: http://www.elastic.co
 Main PID: 6121 (java)
    Tasks: 74
   Memory: 2.4G
      CPU: 7min 46.327s
   CGroup: /system.slice/elasticsearch.service
           └─6121 /usr/bin/java -Xms16g -Xmx16g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=7

Mar 13 20:53:51 logs systemd[1]: Starting Elasticsearch...
Mar 13 20:53:51 logs systemd[1]: Started Elasticsearch.
39. Status (Kibana)
systemctl status kibana.service
kibana.service - Kibana
   Loaded: loaded (/etc/systemd/system/kibana.service; disabled; vendor preset: enabled)
   Active: active (running) since Tue 2020-03-13 20:54:09 EDT; 13h ago
 Main PID: 6205 (node)
    Tasks: 10
   Memory: 82.2M
      CPU: 2min 51.950s
   CGroup: /system.slice/kibana.service
           └─6205 /usr/share/kibana/bin/../node/bin/node --no-warnings /usr/share/kibana/bin/../src/cli -c

Mar 13 10:43:16 logs kibana[6205]: {"type":"response","@timestamp":"2020-03-13T14:43:16Z","tags":[],"pid":
40. Status (Logstash)
systemctl status logstash.service
logstash.service - logstash
   Loaded: loaded (/etc/systemd/system/logstash.service; disabled; vendor preset: enabled)
   Active: active (running) since Tue 2020-03-13 08:52:27 EDT; 1h 58min ago
 Main PID: 32366 (java)
    Tasks: 43
   Memory: 405.6M
      CPU: 4min 43.959s
   CGroup: /system.slice/logstash.service
           └─32366 /usr/bin/java -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFracti

Mar 13 08:52:27 logs systemd[1]: Started logstash.
41. Logstash Logs
/var/log/logstash
# cat/nano/vi the files within this location to view Logstash logs





15 comments:

  1. Having issues with both grok links that were posted:

    marvosa@pfelk:/etc/logstash/conf.d/patterns$ sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/conf.d/pf-12.2019.grok
    --2020-03-30 01:38:16-- https://raw.githubusercontent.com/3ilson/pfelk/master/conf.d/pf-12.2019.grok
    Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 199.232.28.133
    Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|199.232.28.133|:443... connected.
    HTTP request sent, awaiting response... 404 Not Found
    2020-03-30 01:38:17 ERROR 404: Not Found.

    and

    marvosa@pfelk:/etc/logstash/conf.d/patterns$ sudo wget https://raw.githubusercontent.com/3ilson/pfelk/master/conf.d/pf-patterns.grok
    --2020-03-30 01:38:19-- https://raw.githubusercontent.com/3ilson/pfelk/master/conf.d/pf-patterns.grok
    Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 199.232.28.133
    Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|199.232.28.133|:443... connected.
    HTTP request sent, awaiting response... 404 Not Found
    2020-03-30 01:38:19 ERROR 404: Not Found.

    Please advise

    1. Updated. You can also check out the latest at https://github.com/3ilson/pfelk/

  2. valid links can be found here: https://github.com/3ilson/pfelk/blob/master/install/custom.md

  3. Issue with indexes. From the web console, Elasticsearch says "no indices to show", and when trying to create a new Index Pattern in Kibana: "Couldn't find any Elasticsearch data". Adding pf-* therefore results in "The index pattern you've entered doesn't match any indices. You can match any of your 2 indices, below."

    1. Hello, what is the advice here? I'm having the same issue and can't really understand what has to be done. It looks like I'm not getting data from pfSense into Logstash. I get the messages above in Kibana when creating the index.

    2. Same for me... impossible to add one

  4. Hey bro, I followed all the steps of your tutorial, but I'm stuck on parsing: my logs are not being parsed. Everything is being put into the message field and is not being separated correctly.
    In Discover, I see:
    host:192.168.0.70 message:<134>May 14 09:33:57 filterlog: 7,,,1000000105,em0,match,block,in,6,0x00,0x00000,1,Options,0,36,fe80::a656:ccff:fee2:18f4,ff02::1,HBH,PADN,RTALERT,0x0000, @timestamp:May 14, 2020 @ 12:34:56.756 @version:1 _id:7YHTE3IBwxTsQRCtCJhr _type:_doc _index:pf-2020.05.14 _score: -

    Thanks for support.

  5. FYI: The tutorial has conflicting paths. Are we storing things under the pfELK folder (GeoIP, etc.) or should they be in the main data folder? Step one creates most folders in the data folder instead of pfELK. I'm going to assume everything is under the second mentioned path, i.e. /data/pfELK/GeoIP.

  11. This is a bit frustrating. It seems to be written by the person who owns the GitHub repo, but all of the wget links are wrong. A number of the files do not exist when I go to GitHub (snort/suricata). Trying to follow a cross between this post and
    https://snehpatel.com/index.php/2020/02/01/installation-of-pfelk-on-ubuntu-elk-for-pfsense/
    but both have issues. The contents of 01-inputs.conf do not match either post. I know the WWW is constantly changing, but as the owner of these objects, can you please keep the install docs current?

  12. sudo mkdir /data/cofigurations or sudo mkdir /data/pfELK/configurations ??
