...

configure the following files to reflect your dev environment

update service/src/main/resources/application.properties
Code Block
languagetext
spring.datasource.url = jdbc:mysql://<your IP>:3306/quartz?autoReconnect=true&useSSL=false
spring.datasource.username = <user name>
spring.datasource.password = <password>

hive.metastore.uris = thrift://<your IP>:9083
# default database name is "default"
hive.metastore.dbname = <hive database name>
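
As a quick sanity check (a minimal sketch, assuming the mysql client and netcat are available on this machine; host, user and password are the same placeholders as above), you can verify that the quartz database and the Hive metastore are reachable:
Code Block
languagebash
# confirm the quartz database exists and is reachable on the JDBC host/port
mysql -h <your IP> -P 3306 -u <user name> -p -e "SHOW DATABASES LIKE 'quartz';"

# confirm the hive metastore thrift port (9083 as configured above) is open
nc -z <your IP> 9083 && echo "metastore reachable"
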
create a griffin working directory in hdfs
Code Block
languagebash
hdfs dfs -mkdir -p <griffin working dir>
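
To confirm the directory was created (same placeholder path as above):
Code Block
languagebash
# should list the new, empty working directory
hdfs dfs -ls <griffin working dir>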

 

update measure/src/main/resources/env.json with your Elasticsearch instance,

...

and copy env.json to the griffin working directory in hdfs.
Code Block
languagejs
/* Update with your Elasticsearch instance */
"api": "http://HOSTNAME:9200/griffin/accuracy"

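One way to do the copy, as a sketch assuming env.json was edited in place under measure/src/main/resources and the working directory created earlier is used:
Code Block
languagebash
# copy the edited env.json into the griffin working directory in hdfs
hdfs dfs -put -f measure/src/main/resources/env.json <griffin working dir>/env.json

# verify it landed where sparkJob.properties will point to it
hdfs dfs -cat <griffin working dir>/env.json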


update service/src/main/resources/sparkJob.properties
Code Block
languagetext
sparkJob.file = hdfs://<griffin measure path>/griffin-measure.jar
sparkJob.args_1 = hdfs://<griffin working directory>/env.json
sparkJob.jars_1 = hdfs://<pathTo>/datanucleus-api-jdo-3.2.6.jar
sparkJob.jars_2 = hdfs://<pathTo>/datanucleus-core-3.2.10.jar
sparkJob.jars_3 = hdfs://<pathTo>/datanucleus-rdbms-3.2.9.jar
sparkJob.uri = http://<your IP>:8998/batches
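
Before moving on, it may help to confirm that the jars referenced above exist in hdfs and that the Livy endpoint in sparkJob.uri responds (the paths below are the same placeholders as in the properties file; this is only a sanity check):
Code Block
languagebash
# each of these paths must exist in hdfs, or spark job submission will fail
hdfs dfs -ls hdfs://<griffin measure path>/griffin-measure.jar
hdfs dfs -ls hdfs://<pathTo>/datanucleus-api-jdo-3.2.6.jar

# sparkJob.uri points at livy's batches endpoint; it should return a JSON list of sessions
curl http://<your IP>:8998/batches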

 

update ui/js/services/service.js
Code Block
languagetext
// make sure you can access Elasticsearch over http
ES_SERVER = "http://<your IP>:9200"
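
A quick way to confirm Elasticsearch is reachable over http at the same host and port as ES_SERVER (just a sanity check):
Code Block
languagebash
# should return the Elasticsearch cluster info as JSON
curl http://<your IP>:9200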

...