...
```shell
# The Apache HAWQ source code can be obtained from the following links:
# Apache Repo:   https://git.apache.org/repos/asf/hawq.git
# GitHub Mirror: https://github.com/apache/hawq.git
# Gitee Mirror:  https://gitee.com/mirrors/hawq.git
git clone https://git.apache.org/repos/asf/hawq.git
```
...
...
3. Compile and Install HAWQ
Once you have an environment with the necessary dependencies installed and Hadoop is ready, the next step is to get the code and build HAWQ.
```shell
# The code directory is hawq.
CODE_BASE=`pwd`/hawq
cd $CODE_BASE

# Run configure to generate the makefile.
./configure
# You can also run configure with --help for more configuration options.
./configure --help

# Run make to build and install.
# To build concurrently, run make with the -j option, for example: make -j8
# On a Linux system without large memory, you may encounter errors like
# "Error occurred during initialization of VM", "Could not reserve enough space for object heap",
# or "out of memory". Try setting vm.overcommit_memory = 1 temporarily,
# avoiding the "-j" build, or adding more memory, and then rebuild.
# On macOS, you may see the error: "'openssl/ssl.h' file not found".
# "brew link openssl --force" should solve the issue.
make -j8

# Install HAWQ
make install
```
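If the build hits the memory errors mentioned above, lowering the make parallelism usually helps. Below is a minimal sketch that picks a `-j` level from the machine's CPU count and RAM; the 4 GB threshold and the halving rule are illustrative assumptions, not HAWQ requirements.

```shell
# Pick a make -j level: start from the CPU count, halve it when memory is tight.
cores=$(nproc 2>/dev/null || sysctl -n hw.ncpu 2>/dev/null || echo 2)
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo 2>/dev/null)
[ -z "$mem_kb" ] && mem_kb=8388608   # assume 8 GB when undetectable (e.g. macOS)
jobs=$cores
# With less than ~4 GB of RAM, drop to half the cores to avoid JVM heap errors.
if [ "$mem_kb" -lt 4194304 ]; then jobs=$(( cores / 2 )); fi
[ "$jobs" -lt 1 ] && jobs=1
echo "make -j$jobs"
```

Running `make` with the printed level trades build speed for a lower peak memory footprint.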
...
2. Init/Start/Stop HAWQ
2.1 Install and Start Hadoop
Please follow the steps here: https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html
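The key setting from the single-cluster guide linked above is `fs.defaultFS`; a minimal sketch of `etc/hadoop/core-site.xml` for a pseudo-distributed setup (host and port are the defaults used in that guide):

```xml
<!-- etc/hadoop/core-site.xml: minimal pseudo-distributed HDFS config -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```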
...
```shell
# Start HDFS
start-dfs.sh

# Do some basic tests to make sure HDFS works
echo "test data" >> ./localfile
hadoop fs -mkdir /test
hadoop fs -put ./localfile /test
hadoop fs -ls /
hadoop fs -get /test/localfile ./hdfsfile
```
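The put/get round trip above can be turned into an explicit integrity check by comparing the original file with the retrieved copy. In this sketch, `cp` stands in for the `hadoop fs -put` / `-get` pair so the check itself can be shown; on a live cluster, replace it with the hadoop commands from the block above.

```shell
# Verify the HDFS round trip preserved the data.
echo "test data" > ./localfile
cp ./localfile ./hdfsfile   # stand-in for: hadoop fs -put ... ; hadoop fs -get ...
if cmp -s ./localfile ./hdfsfile; then
  echo "HDFS round trip OK"
else
  echo "HDFS round trip FAILED" >&2
fi
```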
2.2 Init/Start/Stop HAWQ
```shell
# Before initializing HAWQ, you need to install HDFS and make sure it works.
source /hawq/install/path/greenplum_path.sh

# Besides, you need to set up password-less ssh on the systems.
# Exchange SSH keys between the hosts host1, host2, and host3:
hawq ssh-exkeys -h host1 -h host2 -h host3

# HAWQ is started by default after initialization.
hawq init cluster

# Now you can stop/restart/start the cluster by using:
hawq stop cluster
hawq restart cluster
hawq start cluster

# The HAWQ master and segments are completely decoupled, so you can also
# init, start, or stop the master and segments separately.
# For example, to init:  hawq init master, then hawq init segment
#              to stop:  hawq stop master, then hawq stop segment
#              to start: hawq start master, then hawq start segment
```
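For larger clusters, repeating `-h` for every host gets tedious; a generic shell sketch that builds the `-h` argument list from a host list file (host1–host3 are the example hosts from the block above, and the hostfile name is arbitrary):

```shell
# Write the cluster hosts, one per line (example hosts from above).
cat > ./hostfile <<EOF
host1
host2
host3
EOF

# Build the "-h host" arguments hawq ssh-exkeys expects from the file.
args=""
while read -r h; do args="$args -h $h"; done < ./hostfile
echo "hawq ssh-exkeys$args"
# → hawq ssh-exkeys -h host1 -h host2 -h host3
```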
...
7. Build optional extension modules (optional)
| Extension | How to enable | Pre-build steps on Mac |
| --- | --- | --- |
| PL/R | `./configure --with-r` | Install R before the build: `brew tap homebrew/science`, then `brew install r` |
| PL/Python | `./configure --with-python` | |
| PL/Java | `./configure --with-java` | |
| PL/Perl | `./configure --with-perl` | |
| pgcrypto | `./configure --with-pgcrypto --with-openssl` | |
| gporca | `./configure --enable-orca` | |
| rps | `./configure --enable-rps` | `brew install tomcat@6` |
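The table's flags can be combined into a single configure invocation. A sketch enabling several extensions at once; the install prefix is a placeholder, and which flag combinations work on your platform depends on the installed dependencies:

```shell
# Flags taken from the table above; the prefix path is a placeholder.
EXT_FLAGS="--with-python --with-perl --with-pgcrypto --with-openssl"
# From $CODE_BASE, the full invocation would be:
echo "./configure $EXT_FLAGS --prefix=/hawq/install/path"
```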