...
- Download Pentaho Report Designer from the Pentaho website.
- Overwrite report-designer.sh with the code provided below.
Code Block
#!/bin/sh
HADOOP_CORE=`ls $HADOOP_HOME/hadoop-*-core.jar`
CLASSPATH=.:$HADOOP_CORE:$HIVE_HOME/conf

for i in ${HIVE_HOME}/lib/*.jar ; do
  CLASSPATH=$CLASSPATH:$i
done

CLASSPATH=$CLASSPATH:launcher.jar
echo java -XX:MaxPermSize=512m -cp $CLASSPATH -jar launcher.jar
java -XX:MaxPermSize=512m -cp $CLASSPATH org.pentaho.commons.launcher.Launcher
- Build and start the hive server with instructions from HiveServer
- Compile and run the hive jdbc client code to load some data (I haven't figured out how to do this in report designer yet). See sample code for loading the data.
- Run the report designer (note step 2)
Code Block
$ sh report-designer.sh
- Select 'Report Design Wizard'
- Select a template, say 'fall template', and click next.
- Create a new data source: JDBC (custom), Generic database.
- Provide the Hive JDBC parameters and name the connection 'hive'.
Code Block
URL: jdbc:hive://localhost:10000/default
Driver name: org.apache.hadoop.hive.jdbc.HiveDriver
Username and password are empty
- Click on 'Test'. The test should succeed.
- Edit the query: select 'Sample Query', click 'Edit Query', and click on the connection 'hive'. Create a new query on the table testHiveDriverTable, e.g. select * from testHiveDriverTable. Click next.
- Layout Step: Add PageOfPages to Group Items By. Add key and value as Selected Items. Click next, then finish.
- Change the report header to 'hive-pentaho-report'. Change the type of the header to 'html'.
- Run the report and generate a PDF. You should get something like the report attached here.
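The data-loading step above defers to external sample code. As a rough sketch only — the class name HiveJdbcLoad, the table schema, and the file path /tmp/a.txt are illustrative assumptions, and HiveServer is assumed to be listening on localhost:10000 — loading data over the Hive JDBC driver might look like:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveJdbcLoad {
    public static void main(String[] args) throws Exception {
        // Register the Hive JDBC driver named in the steps above.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        // Username and password are empty, as in the report designer setup.
        Connection con = DriverManager.getConnection(
                "jdbc:hive://localhost:10000/default", "", "");
        Statement stmt = con.createStatement();
        // Create the table that the report queries against.
        stmt.executeQuery("CREATE TABLE testHiveDriverTable (key INT, value STRING) "
                + "ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'");
        // Hypothetical local file of tab-separated key/value pairs.
        stmt.executeQuery("LOAD DATA LOCAL INPATH '/tmp/a.txt' "
                + "INTO TABLE testHiveDriverTable");
        con.close();
    }
}
```

Run it with the same Hive and Hadoop jars on the classpath as in report-designer.sh.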
Integration with SQuirrel SQL Client
- Download, install and start the SQuirrel SQL Client from the SQuirrel SQL website.
- Select 'Drivers -> New -> New Driver...' to register the Hive JDBC driver.
- Enter the driver name and example URL:
Code Block
Name: Hive
Example URL: jdbc:hive://localhost:10000/default
- Select 'Extra Class Path -> Add' to add the following jars from your local Hive and Hadoop distribution. You will need to build Hive from the trunk after the commit of HIVE-679 (https://issues.apache.org/jira/browse/HIVE-679).
Code Block
HIVE_HOME/build/dist/lib/*.jar
HADOOP_HOME/hadoop-*-core.jar
- Select 'List Drivers'. This will cause SQuirrel to parse your jars for JDBC drivers and might take a few seconds. From the 'Class Name' input box select the Hive driver:
Code Block
org.apache.hadoop.hive.jdbc.HiveDriver
- Click 'OK' to complete the driver registration.
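Once the driver is registered, the same connection parameters can be sanity-checked outside SQuirrel with a few lines of JDBC metadata code. This is a sketch only: HiveDriverCheck is an illustrative name, and HiveServer is assumed to be running on localhost:10000.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;

public class HiveDriverCheck {
    public static void main(String[] args) throws Exception {
        // Same driver class and example URL registered in SQuirrel above.
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive://localhost:10000/default", "", "");
        // List the tables visible through the driver.
        ResultSet tables = con.getMetaData().getTables(null, null, "%", null);
        while (tables.next()) {
            System.out.println(tables.getString("TABLE_NAME"));
        }
        con.close();
    }
}
```

If this prints your tables, SQuirrel's connection with the same URL should work as well.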
...