ELK is a technology stack composed of (E)lasticsearch, (L)ogstash, and (K)ibana that provides intake, parsing, storage, search, and visualization of data. Using these tools, it becomes easy to aggregate data and make it usable in ways customized to almost any situation.
The data generated by F5 load balancers is a great example of data that can be cumbersome and require significant effort to use to its full potential. The goal in using ELK with F5 data is to gain new insight into the behavior of networks and to solve problems as they occur.
Kibana Dashboard for F5 LTM
The ELK stack consists primarily of three technologies:
Elasticsearch - provides indexing, search, and storage of data. This piece could more simply be thought of as a database for all of your logs.
Logstash - the input point and output point for data which also provides parsing.
Kibana - provides a user interface to visualize the data contained in Elasticsearch.
Additional deployment technologies that are not covered here but could be implemented include messaging queues (Kafka/RabbitMQ), Beats (log shipping clients), and Riemann (event alerting).
F5 Products Utilized:
- Local Traffic Manager (LTM)
- Application Security Module (ASM)
- Application Visibility and Reporting (AVR)
Elasticsearch and Kibana can be deployed manually, or you can use one of the hosting companies that provide 'one click' trials to get going quickly.
Manual/Self Managed -
Elastic.co Hosted 14 Day Trial: https://www.elastic.co/cloud
AWS Hosted Elasticsearch: https://aws.amazon.com/elasticsearch-service/
Version: 2.4 ¯\_(ツ)_/¯
- Example Logstash configurations located on Github for use
- If using 'apt' to install, default configuration directory will be /etc/logstash/conf.d
- Store the template file in a separate directory such as /etc/logstash/templates
- Edit the line hosts => ["x.x.x.x:9200"] in 40-f5-output.conf to point to your Elasticsearch instance
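For reference, a minimal sketch of what the output section of 40-f5-output.conf might look like. The index pattern and template path here are illustrative placeholders; use the values from the example configuration on Github:

```
output {
  elasticsearch {
    # Point this at your Elasticsearch instance
    hosts => ["x.x.x.x:9200"]
    # Index name and template path are examples only; match them to
    # the files from the Github repository
    index => "logstash-f5-%{+YYYY.MM.dd}"
    template => "/etc/logstash/templates/f5.template.json"
    template_overwrite => true
  }
}
```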
Configuring Elasticsearch (manual install):
- Elasticsearch requires little to no configuration in order to work properly. If you are working with large amounts of data, you will need to be aware of Java memory limits - https://www.elastic.co/guide/en/elasticsearch/guide/current/heap-sizing.html
- Elasticsearch listens on port 9200 by default
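If you do need to adjust heap size, a common rule of thumb is to give Elasticsearch about half of the machine's RAM, capped at roughly 30-32 GB. With an apt install of the 2.x series, this is typically set via the ES_HEAP_SIZE variable (the exact file path may vary by install method):

```
# /etc/default/elasticsearch (deb/apt installs)
# Give Elasticsearch about half of available RAM, up to ~30-32 GB
ES_HEAP_SIZE=4g
```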
Configuring Kibana (manual install):
- Kibana's default configuration will use a local Elasticsearch instance, but it will only allow localhost access to the user interface. Edit the following line in /etc/kibana/kibana.yml:
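Assuming a recent package install, the relevant setting is server.host (older Kibana releases use host instead); binding to 0.0.0.0 allows access from machines other than localhost:

```
# /etc/kibana/kibana.yml
# Listen on all interfaces instead of localhost only
server.host: "0.0.0.0"
```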
Kibana listens on port 5601 by default
Example F5 dashboards located on Github can be imported by using the utility in Kibana's Web Interface:
Management -> Saved Objects -> Dashboards -> Import
Configuring F5 LTM and AVR Logging:
In System -> Logs -> Configuration, create two log destinations. The first will be of type 'Remote HSL', configured with a pool of Logstash servers. The second will be of type 'Splunk' (this is for log formatting purposes) and is what AVR will be configured to send logs to; its destination will be the Remote HSL destination created in the first step.
Create a Log Publisher. The log destination for this will be the Splunk log destination created above.
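The two destinations and the publisher described above can also be created from the command line. Here is a sketch using tmsh, assuming a pool named logstash_pool already exists; the object names are illustrative:

```
# Remote HSL destination pointing at the Logstash pool
tmsh create sys log-config destination remote-high-speed-log logstash-hsl { pool-name logstash_pool protocol tcp }

# Splunk-formatted destination that forwards to the HSL destination
tmsh create sys log-config destination splunk logstash-splunk { forward-to logstash-hsl }

# Publisher that AVR will send logs through
tmsh create sys log-config publisher logstash-publisher { destinations add { logstash-splunk } }
```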
In Local Traffic -> Profiles -> Analytics, configure a profile pointing to the newly created log publisher above. Assign Virtual Servers that you wish to collect statistics for.
Example of statistics to collect:
Configuring F5 ASM Logging:
Note: this guide does not cover creating and tuning ASM protections; it assumes that has already been completed.
Set the logging for ASM in the virtual server's security tab. I have selected the 'all' profile, which is described below.
Application Security uses a new log server and DDoS uses the previously configured HSL Log Publisher. In the Github configuration, I have separated these two inputs.
Logging profiles for Application Security
Security -> Event Logs -> Logging Profiles -> 'all'
Give F5 and ELK a shot and let me know how it goes! If you have a Kibana dashboard to share, send me a pull request!