Apache Airavata
To enable the Elasticsearch-based logging feature for Airavata, you first have to set up a Kafka cluster so that all logs are pushed to the Kafka topic created from the configuration you provide. Once your Kafka cluster is running, you can start Airavata after changing a few parameters in airavata-server.properties. If you need instructions for setting up a Kafka cluster, see the related articles linked at the end of this page.
Override the following properties with values that match your Kafka setup.
```properties
# Kafka logging related configuration

# Set to true if you are running Airavata on AWS
isRunningOnAws=false

# One or more Kafka broker addresses (host:port). One is enough, because
# the KafkaProducer discovers the other brokers in the cluster.
kafka.broker.list=localhost:9092

# Topic name prefix; Airavata creates the topic names for you
kafka.topic.prefix=local

# Register the Kafka appender as a log appender
enable.kafka.logging=true
```
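With these properties in place, a Logstash pipeline can pull the log events off Kafka and index them into Elasticsearch. The sketch below is an assumption-laden example, not an official Airavata configuration: the topic name `local_gfac_logs` is a guess derived from the `local` prefix and the `gfac_logs` type seen in the sample event below, the option names follow recent versions of the Logstash Kafka input plugin, and the hosts and index name are placeholders.

```conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    # Assumed topic name; Airavata derives topic names from kafka.topic.prefix
    topics => ["local_gfac_logs"]
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "airavata-logs-%{+YYYY.MM.dd}"
  }
}
```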
A sample log event published by Airavata, as printed by Logstash's rubydebug codec:

```
{
      "serverId" => {
        "serverId" => "192.168.59.3",
        "hostName" => "192.168.59.3",
         "version" => "airavata-0.16-135-gac0cae6",
           "roles" => [
          [0] "gfac"
        ]
    },
       "message" => "Skipping Zookeeper embedded startup ...",
     "timestamp" => "2016-09-09T20:57:08.329Z",
         "level" => "INFO",
    "loggerName" => "org.apache.airavata.common.utils.AiravataZKUtils",
           "mdc" => {},
    "threadName" => "main",
      "@version" => "1",
    "@timestamp" => "2016-09-09T20:57:11.678Z",
          "type" => "gfac_logs",
          "tags" => [
        [0] "local",
        [1] "CoreOS-899.13.0"
    ],
  "timestamp_usec" => 0
}
```
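Because each event is a structured JSON document, downstream consumers can filter and summarize on its fields. A minimal sketch in Python, assuming events arrive as JSON strings; the field names are taken from the sample event above, and `summarize` is a hypothetical helper, not part of Airavata:

```python
import json

# A trimmed copy of the sample event, in the JSON form the Kafka
# appender publishes (only the fields used below are kept).
raw = """{
  "serverId": {"serverId": "192.168.59.3", "hostName": "192.168.59.3",
               "version": "airavata-0.16-135-gac0cae6", "roles": ["gfac"]},
  "message": "Skipping Zookeeper embedded startup ...",
  "level": "INFO",
  "loggerName": "org.apache.airavata.common.utils.AiravataZKUtils",
  "type": "gfac_logs",
  "tags": ["local", "CoreOS-899.13.0"]
}"""

def summarize(event_json):
    """Return a one-line summary (level, roles, logger, message) of a log event."""
    event = json.loads(event_json)
    roles = ",".join(event["serverId"]["roles"])
    return f'{event["level"]} [{roles}] {event["loggerName"]}: {event["message"]}'

print(summarize(raw))
# INFO [gfac] org.apache.airavata.common.utils.AiravataZKUtils: Skipping Zookeeper embedded startup ...
```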
http://kafka.apache.org/documentation.html
https://www.elastic.co/guide/en/logstash/current/getting-started-with-logstash.html
https://www.elastic.co/cloud/as-a-service/signup