...
- Create a directory for the Pig source code: `mkdir pig`
- cd into that directory: `cd pig`
- Check out the Pig source code: `svn checkout http://svn.apache.org/repos/asf/pig/trunk/ .`
- Build the project: `ant`
- cd into the piggybank directory: `cd contrib/piggybank/java`
- Build the piggybank: `ant`
- You should now see a `piggybank.jar` file in that directory.
Make sure your classpath includes the Hadoop jars as well. The following worked for me using the Cloudera CDH2 / Hadoop AMIs:
```
pig_version=0.4.99.0+10 ; pig_dir=/usr/lib/pig ;
hadoop_version=0.20.1+152 ; hadoop_dir=/usr/lib/hadoop ;
export CLASSPATH=$CLASSPATH:${hadoop_dir}/hadoop-${hadoop_version}-core.jar:${hadoop_dir}/hadoop-${hadoop_version}-tools.jar:${hadoop_dir}/hadoop-${hadoop_version}-ant.jar:${hadoop_dir}/lib/commons-logging-1.0.4.jar:${pig_dir}/pig-${pig_version}-core.jar
export PIG_CONF_DIR=/path/to/mapred-site/and/core-site/pointing/to/your/cluster
```
To obtain a `javadoc` description of the functions, run `ant javadoc` from the `trunk/contrib/piggybank/java` directory. The documentation is generated in the `trunk/contrib/piggybank/java/build/javadoc` directory.
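Once built, the jar can be used from a Pig script by registering it and defining an alias for a UDF. A minimal sketch (the jar path and input file are illustrative; `UPPER` is one of the piggybank string evaluation functions):

```
-- Register the piggybank jar so its classes are on Pig's classpath
REGISTER /path/to/piggybank.jar;

-- Define a short alias for a piggybank UDF
DEFINE UPPER org.apache.pig.piggybank.evaluation.string.UPPER();

-- Use the UDF in a script (input.txt is a placeholder)
A = LOAD 'input.txt' AS (name:chararray);
B = FOREACH A GENERATE UPPER(name);
DUMP B;
```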
...