...
- You also need the latest JDK installed on your system. You can either get it from the official Oracle website (http://www.oracle.com/technetwork/java/javase/downloads/jdk-6u29-download-513648.html) or follow the advice given by your Linux distribution (e.g. some Debian-based distributions package the JDK as part of their extended set of packages). If your JDK is installed in a non-standard location, make sure to add the line below to /etc/default/hadoop:

      export JAVA_HOME=XXXX
- Format the namenode:

      sudo -u hdfs hadoop namenode -format
- Start the necessary Hadoop services. E.g. for a pseudo-distributed Hadoop installation you can simply do:

      for i in hadoop-namenode hadoop-datanode hadoop-jobtracker hadoop-tasktracker ; do sudo service $i start ; done
- Once your basic cluster is up and running, it is a good idea to create your home directory on HDFS:

      sudo -u hdfs hadoop fs -mkdir /user/$USER
      sudo -u hdfs hadoop fs -chown $USER /user/$USER
- Enjoy your cluster:

      hadoop fs -lsr /
      hadoop jar /usr/lib/hadoop/hadoop-examples.jar pi 10 1000
- If you are using Amazon AWS, it is important that the IP address in /etc/hostname matches the Private IP Address shown in the AWS Management Console. If the addresses do not match, MapReduce jobs will not complete.

      ubuntu@ip-10-224-113-68:~$ cat /etc/hostname
      ip-10-224-113-68
- If the address in /etc/hostname does not match, open the hostname file in a text editor, change it to match the Private IP Address, and reboot.
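The expected hostname follows AWS's ip-a-b-c-d naming convention. As a quick sanity check, you can derive the expected name from the private IP and compare it against /etc/hostname. A minimal sketch, where the IP value is a placeholder taken from the example above:

```shell
# Derive the AWS-style hostname from a private IP (placeholder value).
ip="10.224.113.68"
expected="ip-$(echo "$ip" | tr '.' '-')"
echo "$expected"
# If this differs from the contents of /etc/hostname,
# edit /etc/hostname to match and reboot.
```
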
...
Running Hadoop Components

Step-by-step instructions on running Bigtop components: https://cwiki.apache.org/confluence/display/BIGTOP/Running+various+Bigtop+components
One of the advantages of Bigtop is the ease of installing the different Hadoop components without having to hunt for a specific component distribution and match it to a specific Hadoop version. Please visit the link above to run some easy examples from the Bigtop distribution!
Running Pig
- Install Pig:

      sudo apt-get install pig
- Create a tab-delimited file using a text editor and import it into HDFS. Start the Pig shell and verify that a load and a dump work. Make sure you have a space on both sides of the = sign. The using PigStorage('\t') clause tells Pig that the columns in the text file are delimited by tabs.

      $ pig
      grunt> A = load '/pigdata/PIGTESTA.txt' using PigStorage('\t');
      grunt> dump A
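The load above expects /pigdata/PIGTESTA.txt to already exist in HDFS. A minimal sketch of that setup step follows; the sample rows are made-up data, and the HDFS path matches the example:

```shell
# Create a sample tab-delimited input file; the rows here are hypothetical.
printf 'alice\t1\nbob\t2\n' > PIGTESTA.txt

# On the cluster, copy it into HDFS (requires the HDFS services started earlier):
# hadoop fs -mkdir /pigdata
# hadoop fs -put PIGTESTA.txt /pigdata/PIGTESTA.txt

# Quick local check that each row really has two tab-separated fields:
awk -F'\t' '{print NF}' PIGTESTA.txt
```
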
...