...

This article will introduce Metron's default dashboard that is built upon Kibana 4. It will cover the elements present in the dashboard and how you can extend the dashboard for your own purposes.  This is Part 7 of a multi-part tutorial series covering Apache Metron (incubating).

Metron's Dashboard

Metron's default dashboard is intended to allow you to easily validate the end-to-end functioning of Metron with its default sensor suite. It highlights some of the useful widgets available in Kibana 4, and serves as a starting point for you to build your own customized dashboards.

...

The previous tutorials covering Squid produced a limited data set consisting of only a few basic requests. To make this tutorial more interesting, we are going to need a bit more variety in the sample data.

1. Copy and paste the following set of links into a local file called `links.txt`.

    https://www.amazon.com/Cards-Against-Humanity-LLC-CAHUS/dp/B004S8F7QM/ref=zg_bs_toys-and-games_home_1?pf_rd_p=2140216822&pf_rd_s=center-1&pf_rd_t=2101&pf_rd_i=home&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=2231TS0FE044EZT85PQ4
    https://www.amazon.com/Brain-Game-Cube-Intelligence-Development/dp/B01CRXM1JU/ref=zg_bs_toys-and-games_home_2?pf_rd_p=2140216822&pf_rd_s=center-1&pf_rd_t=2101&pf_rd_i=home&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=MANXEWDTKDH2RD9Y3466
    https://www.amazon.com/Zuru-Balloons-different-colors-Seconds/dp/B00ZPW3U14/ref=zg_bs_toys-and-games_home_3?pf_rd_p=2140216822&pf_rd_s=center-1&pf_rd_t=2101&pf_rd_i=home&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=MANXEWDTKDH2RD9Y3466
    https://www.amazon.com/MAGINOVO-Bluetooth-Headphones-Wireless-Earphones/dp/B01EFKFQL8/ref=zg_bs_electronics_home_1?pf_rd_p=2140225402&pf_rd_s=center-2&pf_rd_t=2101&pf_rd_i=home&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=MANXEWDTKDH2RD9Y3466
    https://www.amazon.com/Amazon-Fire-TV-Stick-Streaming-Media-Player/dp/B00GDQ0RMG/ref=zg_bs_electronics_home_2?pf_rd_p=2140225402&pf_rd_s=center-2&pf_rd_t=2101&pf_rd_i=home&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=MANXEWDTKDH2RD9Y3466
    http://www.walmart.com/ip/All-the-Light-We-Cannot-See/26737727
    http://www.walmart.com/ip/Being-Mortal-Medicine-and-What-Matters-in-the-End/36958209
    http://www.walmart.com/ip/My-Brilliant-Friend-Book-One-Childhood-Adolescence/20527482
    http://www.walmart.com/ip/A-Game-of-Thrones/402949
    http://www.bbc.co.uk/capital/story/20160622-there-are-people-making-millions-from-your-pets-poo
    http://www.bbc.co.uk/earth/story/20160620-can-we-predict-the-time-of-our-death
    http://www.bbc.co.uk/news/uk-england-somerset-36596557

2. Run the following command to continually choose one of the links above at random and make a request for it through Squid. Leave this command running in a terminal so that a continual feed of data is generated as we work through the remainder of this tutorial.

    while sleep 2; do cat links.txt | shuf -n 1 | xargs -i squidclient -g 4 -v "{}"; done

3. The previous command generates log records at `/var/log/squid/access.log`. Run the following command in another terminal to extract this data and publish it to Kafka. Again, leave this command running so the continuous feed of data keeps flowing; at this point you will have two separate terminal sessions running.


    tail -F /var/log/squid/access.log | /usr/hdp/current/kafka-broker/bin/kafka-console-producer.sh --broker-list $KAFKA_BROKER_URL --topic squid
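
To sanity check that events are actually landing in the `squid` Kafka topic, you can attach a console consumer in yet another terminal. This is just a spot check using the same $KAFKA_BROKER_URL as above; depending on your Kafka version you may need the older `--zookeeper` form instead. Press Ctrl-C to stop it once you see messages scrolling by.

    /usr/hdp/current/kafka-broker/bin/kafka-console-consumer.sh --bootstrap-server $KAFKA_BROKER_URL --topic squid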

4. Ensure that the parser topology for Squid continues to run based on the steps outlined in the previous tutorials.
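
If you are unsure whether the parser topology is still up, one quick way to check (assuming the Storm command-line client is available on the host) is to list the running topologies and confirm that the Squid topology appears with a status of ACTIVE.

    storm list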

...

1. Run the following command to create an index template for Squid.

    curl -XPOST $ES_HOST:$ES_PORT/_template/squid_index -d '
    {
      "template": "squid_index*",
      "mappings": {
        "squid_doc": {
          "_timestamp": {
            "enabled": true
          },
          "properties": {
            "timestamp": {
              "type": "date",
              "format": "epoch_millis"
            },
            "source:type": {
              "type": "string",
              "index": "not_analyzed"
            },
            "action": {
              "type": "string",
              "index": "not_analyzed"
            },
            "bytes": {
              "type": "integer"
            },
            "code": {
              "type": "string",
              "index": "not_analyzed"
            },
            "domain_without_subdomains": {
              "type": "string",
              "index": "not_analyzed"
            },
            "full_hostname": {
              "type": "string",
              "index": "not_analyzed"
            },
            "elapsed": {
              "type": "integer"
            },
            "method": {
              "type": "string",
              "index": "not_analyzed"
            },
            "ip_dst_addr": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      }
    }'
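
Before moving on, it can be worth confirming that Elasticsearch accepted the template. A simple GET against the same endpoint should echo back the mappings that were just posted.

    curl -XGET $ES_HOST:$ES_PORT/_template/squid_index?pretty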

...

3. An index template only applies to indices created after the template itself. Delete the existing Squid indices so that new ones are generated with the template applied.


    curl -XDELETE $ES_HOST:$ES_PORT/squid*

4. Wait for the Squid index to be re-created. This may take a minute or two based on how fast the Squid data is being consumed in your environment.


    curl -XGET $ES_HOST:$ES_PORT/squid*
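
The `_cat/indices` endpoint gives a slightly friendlier view while you wait; once a `squid_index*` entry appears with a growing document count, the template has been applied and you can continue.

    curl -XGET "$ES_HOST:$ES_PORT/_cat/indices/squid*?v"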

Configure the Squid Index in Kibana

Now that we have a Squid index with all of the right data types, we need to tell Kibana about this index.

Image

Info

Click on the image above to see each of these steps performed.

 

1. Log in to your Kibana user interface and then click on 'Settings', then 'Indices'.

...

4. Then click the 'Create' button.

Review the Squid Data

Now that Kibana is aware of the new Squid index, let's take a look at the data.

Image

Info

Click on the image above to see each of these steps performed.

 

1. Click on `Discover` and then choose the newly created `squid*` index pattern.

...

3. Clicking on a specific record will show each field available in the data.

Save a Squid Search

Let's create a basic data table so that a user can inspect record-level details for Squid. In Kibana, this is done by creating a 'Saved Search'.

 

Image

Info

Click on the image above to see each of these steps performed.

 

1. Click on `Discover` and then choose the newly created `squid*` index pattern.

2. In the 'Fields' panel on the left, choose which fields to include in the saved search.  Click the 'Add' button next to each field.

3. Click on the 'Save' icon near the top-right to save the search.

Visualize the Squid Data

After using the `Discover` panel to better understand the Squid data, let's create a few visualizations.

Image

Info

Click on the image above to see each of these steps performed.

 

1. Click on 'Visualize' in the top level menu.

...

6. Near the top-right side of the screen click on the 'Save' icon to save the visualization. Name it something appropriate. This will allow us to use the visualization in a dashboard later.

Customize the Dashboard

Image

Info

Click on the image above to see each of these steps performed.

 

1. Open the Metron Dashboard by clicking on 'Dashboard' in the top-level menu.

...

4. Scroll to the bottom of the dashboard to find the visualization that was added. From here you can resize and move the visualization as needed.

5. Continue enhancing the dashboard by adding the 'Saved Search' that was previously created.

Summary

At this point you should be comfortable customizing a dashboard as you add new sources of telemetry to Metron. This article introduced Metron's default dashboard that is built upon Kibana 4. It covered the elements present in the dashboard and how you can extend the dashboard for your own purposes.

...