Warning: This page is being obsoleted and is replaced by: http://trafodion.apache.org/build.html
This page describes the steps required to build and run Apache Trafodion.
Supported Platforms
Red Hat Enterprise Linux or CentOS (version 6.4 or later) is supported as a development and production platform.
Required Software
- Install a Cloudera or Hortonworks Hadoop distribution. If you do not already have a Hadoop distribution available, you can use the optional sandbox install steps described below in the section "Sandbox Hadoop Install".
- Java 1.7.x or greater must be installed. Ensure that the JAVA_HOME environment variable is set and points to your JDK installation.
- Download, build, and install additional development tools via Additional Build Tools.
- Install the following packages via yum install <package>.
Build prerequisites
You need to install the following packages before you can build Apache Trafodion.
sudo yum install epel-release
sudo yum install alsa-lib-devel ant ant-nodeps boost-devel cmake \
    device-mapper-multipath dhcp flex gcc-c++ gd git \
    glibc-devel glibc-devel.i686 graphviz-perl gzip \
    java-1.7.0-openjdk-devel libaio-devel \
    libibcm.i686 libibumad-devel libibumad-devel.i686 \
    libiodbc libiodbc-devel librdmacm-devel librdmacm-devel.i686 \
    libX11-devel libXau-devel libXext-devel libcurl-devel libxml2-devel \
    log4cxx log4cxx-devel lua-devel lzo-minilzo \
    net-snmp-devel net-snmp-perl openldap-clients \
    openldap-devel openldap-devel.i686 openmotif \
    openssl-devel openssl-devel.i686 openssl-static \
    perl-Config-IniFiles perl-Config-Tiny perl-DBD-SQLite perl-Expect \
    perl-IO-Tty perl-Math-Calc-Units perl-Params-Validate \
    perl-Parse-RecDescent perl-TermReadKey perl-Time-HiRes \
    protobuf-compiler protobuf-devel python-qpid python-qpid-qmf \
    qpid-cpp-client qpid-cpp-client-ssl qpid-cpp-server qpid-cpp-server-ssl \
    qpid-qmf qpid-tools readline-devel saslwrapper sqlite-devel \
    tog-pegasus unixODBC unixODBC-devel uuid-perl wget \
    xerces-c-devel xinetd
Notes:
1. The qpid-cpp-client-devel package is not in the latest CentOS distribution. You may need to enable an earlier repo using the following command:
   yum --enablerepo=C6.3-updates install qpid-cpp-client-devel
2. Not all packages come standard with RHEL/CentOS. The EPEL repo will need to be downloaded and installed using wget:
   wget http://download.fedoraproject.org/pub/epel/6/x86_64/epel-release-6-8.noarch.rpm
   sudo rpm -Uvh epel-release-6-8.noarch.rpm
Compiling and Configuring Trafodion
To build and compile Trafodion
...
Configuring DCS and starting Trafodion
- Open a new terminal via ssh.
- cd <apache-incubator-dir>/core/sqf
- . ./sqenv.sh
- cd $MY_SQROOT/sql/scripts
- sqgen
- sqstart (wait until all processes are up and running)
- sqcheck
- sqlci, then run: initialize trafodion;
Perform a quick sanity test of the install by creating a sample table and querying it:
- sqlci
>set schema trafodion.sch;
>create table t (a int not null, primary key(a));
>insert into t values (1), (3);
>select * from t;
>exit;
Edit $DCS_INSTALL_DIR/conf/dcs-site.xml and
Sandbox Hadoop install (Optional)
The instructions below describe the steps to install and start a sandboxed version of Hadoop using non-default ports.
- Create a sandboxed installation of Hadoop, HBase, Hive, and MySQL to be used for building and testing. You can install them locally to your workspace.
install_local_hadoop -p <port>
Note: This script will download Hadoop and HBase jar files from the internet. To avoid this overhead for future executions of the script, you can save the downloaded files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to that directory. The files to save are: $MY_SQROOT/sql/local_hadoop/*.tar.gz and $MY_SQROOT/sql/local_hadoop/tpcds/tpcds_kit.zip.
- Make sure you have set up passwordless authentication. Basically, you should be able to "ssh localhost" without having to enter a password.
- Bring up your Hadoop/HBase instance by using the custom script:
swstartall
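The MY_LOCAL_SW_DIST caching described in the note above can be sketched as follows. This is a minimal sketch: the cache directory name is an arbitrary choice, and the copy steps assume a previous install_local_hadoop run has already populated $MY_SQROOT/sql/local_hadoop.

```shell
# Assumed cache location; any writable directory works.
CACHE_DIR="$HOME/local_sw_dist"
mkdir -p "$CACHE_DIR"

# Save archives from a previous run, if any exist (ignore errors otherwise).
cp "$MY_SQROOT"/sql/local_hadoop/*.tar.gz "$CACHE_DIR"/ 2>/dev/null || true
cp "$MY_SQROOT"/sql/local_hadoop/tpcds/tpcds_kit.zip "$CACHE_DIR"/ 2>/dev/null || true

# Point future install_local_hadoop runs at the cache.
export MY_LOCAL_SW_DIST="$CACHE_DIR"
```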
Hadoop Components (needs more work here...)
- Install Hadoop, HBase, and Hive to your local workspace. Dependencies for release 0.9.x.
Download the files to the $HOME/tools folder. If the 'tools' folder does not exist, create it before issuing the wget commands:
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hbase-0.98.1-cdh5.1.0.tar.gz
wget http://archive.apache.org/dist/hive/hive-0.13.1/apache-hive-0.13.1-bin.tar.gz
wget http://archive.apache.org/dist/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz
Untar the files in the $HOME/tools folder:
cd $HOME/tools
tar xzf apache-hive-0.13.1-bin.tar.gz
tar xzf hbase-0.98.1-cdh5.1.0.tar.gz
tar xzf hadoop-2.4.0-src.tar.gz
Once installed, check the following.
Java Version
The Java version must be 1.7.x. Check as follows:

$ java -version
java version "1.7.0_85"
OpenJDK Runtime Environment (rhel-2.6.1.3.el6_6-x86_64 u85-b01)
OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode)
Ensure that the Java environment exists and points to your JDK installation. By default Java is located in /usr/lib/java-<version>.
$ echo $JAVA_HOME
$ export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk.x86_64
Note: You should export JAVA_HOME in your .bashrc or .profile file.
Verify Trafodion Download
Verify that the Trafodion source has been either:
- Downloaded and unpackaged.
- Cloned from github.
If not, please do so now.
To download from github, refer to Contributor Workflow - Code/Docs.
Otherwise, download and untar the source tar file from the Apache Trafodion Incubator release.
Install Required Build Tools
Trafodion requires a set of tools to be installed in order to build. Refer to Required Build Tools for instructions. One of the tools that is built, if it does not already exist, is Maven. At this point, verify that Maven is part of your path:
mvn --version
If it is not found, add it to your path:
PATH=$PATH:<tool installation directory>/apache-maven-3.3.3/bin
Note: You should add Maven to your PATH in your .bashrc or .profile file.
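The Maven check and PATH fix above can be combined into a small snippet suitable for .bashrc. The tools directory shown is an assumed default, not a fixed location; substitute your actual tool installation directory.

```shell
# Assumed tools location; replace with your actual tool installation directory.
TOOLSDIR="${TOOLSDIR:-$HOME/trafodion-tools}"

# Only extend PATH when mvn is not already resolvable.
if ! command -v mvn >/dev/null 2>&1; then
    export PATH="$PATH:$TOOLSDIR/apache-maven-3.3.3/bin"
fi
```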
Build Trafodion
Start a new ssh session. Use the following commands to set up the Trafodion environmental variables.
- <Trafodion source directory> is source tree base for Trafodion.
- <tools installation directory> is where the Trafodion required tools are located. The following example assumes that you installed all the required tools in a single location. If you installed or used pre-installed tools in different directories, then you need to export the location of each tool as described in Required Build Tools prior to sourcing env.sh.
cd <Trafodion source directory>
export TOOLSDIR=<tools installation directory>
source ./env.sh
Build a debug version of Trafodion using one of the following options:
Command | What It Builds |
---|---|
make all | Trafodion, DCS, and REST. |
make package | Trafodion, DCS, REST, and Client Drivers. |
make package-all | Trafodion, DCS, REST, Client Drivers, and tests for all components. |
If the build fails, rerun the make step. Trafodion downloads many dependencies, and occasionally one of the download operations fails; rerunning the build generally works.
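Since a failed download usually succeeds on the next attempt, the rerun advice above can be wrapped in a small retry helper. This is an illustrative sketch, not part of the Trafodion build scripts.

```shell
# Retry a command up to three times, returning success on the first
# attempt that works. Intended use: retry_build make all
retry_build() {
    local attempts=3 i
    for ((i = 1; i <= attempts; i++)); do
        "$@" && return 0
        echo "attempt $i of $attempts failed; retrying..." >&2
    done
    return 1
}
```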
Verify the build:
$ sqvers -u
MY_SQROOT=/home/centos/apache-trafodion-1.3.0-incubating/core/sqf
who@host=centos@mysystem
JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.91.x86_64
SQ_MBTYPE=64d (64-debug)
linux=2.6.32-504.1.3.el6.x86_64
redhat=6.7
NO patches
Most common Apache_Trafodion Release 1.3.0 (Build debug [centos], branch -, date 06Nov15)
UTT count is 1
[6] Release 1.3.0 (Build debug [centos], branch -, date 06Nov15)
export/lib/hbase-trx-cdh5_3-1.3.0.jar
export/lib/hbase-trx-hbase_98_4-1.3.0.jar
export/lib/hbase-trx-hdp2_2-1.3.0.jar
export/lib/sqmanvers.jar
export/lib/trafodion-dtm-1.3.0.jar
export/lib/trafodion-sql-1.3.0.jar
The output shows several jar files. The number of files differs based on the version of Trafodion you downloaded.
Setup Test Environment
You should test your installation using:
- A Trafodion installation on a system that already has a compatible version of Hadoop installed
- A local Hadoop environment created by the install_local_hadoop script
Your installation approach depends on whether you already have installed Hadoop.
Hadoop is Already Installed
Build binary tar files and then install Trafodion following instructions described in Installation.
cd <Trafodion source directory>
make package
The binary tar files will be created in the <Trafodion source directory>/distribution directory.
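To confirm that the packaging step produced output, you can list the tarballs. TRAF_SRC below is a placeholder variable standing in for your source directory, not something the build defines.

```shell
# List the binary tarballs produced by "make package".
DIST_DIR="${TRAF_SRC:-.}/distribution"
ls "$DIST_DIR"/*.tar.gz 2>/dev/null || echo "no tarballs found in $DIST_DIR"
```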
Install a Local Hadoop Environment
Local Hadoop prerequisites
Setup Passwordless SSH
Check to see if you have passwordless SSH setup.
ssh localhost
Last login: Fri Nov 6 22:44:00 2015 from 192.168.1.9
If passwordless SSH is not set up, please do so now. The following is an example of setting up passwordless SSH using id_rsa keys. You can choose the method that best fits your environment.
If you already have an existing set of ssh keys, simply copy both id_rsa.pub and id_rsa to your ~/.ssh directory. Then, do the following:
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/id_rsa
echo "NoHostAuthenticationForLocalhost=yes" >>~/.ssh/config
chmod go-w ~/.ssh/config
chmod 755 ~/.ssh; chmod 640 ~/.ssh/authorized_keys; cd ~/.ssh; chmod 700 ..
If you need to create your keys first, then do the following:
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/id_rsa.pub
echo "NoHostAuthenticationForLocalhost=yes" >>~/.ssh/config
chmod go-w ~/.ssh/config
chmod 755 ~/.ssh; chmod 640 ~/.ssh/authorized_keys; cd ~/.ssh; chmod 700 ..
Verify System Limits
Please check that the system limits in your environment are appropriate for Apache Trafodion. If they are not, you will need to increase them, or Trafodion cannot start.
$ ulimit -a
core file size          (blocks, -c) 1000000
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 515196
max locked memory       (kbytes, -l) 49595556
max memory size         (kbytes, -m) unlimited
open files                      (-n) 32000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 267263
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

For a pre-installed Hadoop/HBase version, update the HBase configuration and restart HBase.
Please refer to this article for information on how to change system limits.
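As a quick self-check, you can compare the current open-files limit against the value shown in the sample output above (32000). The limits.conf entries in the comment are a typical approach and may vary by distribution.

```shell
# Required value taken from the sample ulimit output above.
REQUIRED_NOFILE=32000
current=$(ulimit -n)

if [ "$current" != "unlimited" ] && [ "$current" -lt "$REQUIRED_NOFILE" ]; then
    echo "open files limit $current is below $REQUIRED_NOFILE"
    # A typical (assumed) fix is adding entries to /etc/security/limits.conf:
    #   *  soft  nofile  32000
    #   *  hard  nofile  32000
else
    echo "open files limit OK ($current)"
fi
```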
Run
Before building Trafodion, be sure to export the TOOLSDIR environment variable, set to the directory location of these components and the additional build tools from the section above. Then run:
install_local_hadoop
NOTE: The Hadoop release contains 32-bit libraries. You must build Hadoop from source for a 64-bit architecture, and not just download the release tar file. See: http://wiki.apache.org/hadoop/HowToContribute
Notes
The install_local_hadoop script downloads compatible versions of Hadoop, HBase, Hive, and MySQL. Then, it starts Trafodion.
Tip: install_local_hadoop downloads Hadoop, HBase, Hive, and MySQL jar files from the internet. To avoid this overhead, you can download the required files into a separate directory and set the environment variable MY_LOCAL_SW_DIST to point to this directory.
Command | What It Does |
---|---|
install_local_hadoop | Uses default ports for all services. |
install_local_hadoop -p fromDisplay | Start Hadoop with a port number range determined from the DISPLAY environment variable. |
install_local_hadoop -p rand | Start with any random port number range between 9000 and 49000. |
install_local_hadoop -p <port> | Start with the specified port number. |
For a list of ports that get configured and their default values, see Configure Ports for a Firewall.
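The "-p rand" row above picks a random base port between 9000 and 49000. A sketch of that selection (the actual script's algorithm may differ):

```shell
# Pick a random base port in [9000, 49000] using bash's RANDOM.
rand_port() {
    echo $(( 9000 + RANDOM % 40001 ))
}
```

A hypothetical invocation using it would be: install_local_hadoop -p $(rand_port)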
Sample Procedure
# Ensure that the Trafodion environmental variables have been loaded.
cd <Trafodion source directory>
source ./env.sh
Install the Hadoop software.
cd $MY_SQROOT/sql/scripts
install_local_hadoop
./install_traf_components
Verify installation.
$ swstatus
6 java servers and 2 mysqld processes are running
713 NameNode
19513 HMaster
1003 SecondaryNameNode
838 DataNode
1173 ResourceManager
1298 NodeManager
Six java servers as shown above and two mysqld processes should be running.
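A scripted version of this check could count the process lines in captured swstatus output. The sample below embeds the output shown above rather than calling swstatus live, so the counting logic is what is being illustrated.

```shell
# Captured swstatus process listing (from the sample output above).
swstatus_output='713 NameNode
19513 HMaster
1003 SecondaryNameNode
838 DataNode
1173 ResourceManager
1298 NodeManager'

# Count non-empty process lines; all six java servers should be present.
count=$(printf '%s\n' "$swstatus_output" | grep -c '[A-Za-z]')
echo "java servers running: $count"
```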
Manage Hadoop Environment
Use the following commands to manage the Hadoop environment.
Command | Usage |
---|---|
swstartall | Starts the complete Hadoop environment. |
swstopall | Stops the complete Hadoop environment. |
swstatus | Checks the status of the Hadoop environment. |
swuninstall_local_hadoop | Removes the Hadoop installation. |
Run Trafodion
This section describes how to start Trafodion and run operations.
Each Time New Source is Downloaded
You need to do the following each time you download new source code.
cd <Trafodion source directory>
source ./env.sh
cd $MY_SQROOT/etc
# delete ms.env, if it exists
rm -f ms.env
cd $MY_SQROOT/sql/scripts
sqgen
Start Trafodion
Do the following to start the Trafodion environment.
cd $MY_SQROOT/sql/scripts
sqstart
sqcheck
Management Scripts
Component | Start | Stop | Status |
---|---|---|---|
All of Trafodion | sqstart | sqstop | sqcheck |
DCS (Database Connectivity Services) | dcstart | dcsstop | dcscheck |
REST Server | reststart | reststop | |
LOB Server | lobstart | lobstop | |
RMS Server | rmsstart | rmsstop | rmscheck |
Create Trafodion Metadata
# Ensure that the Trafodion environmental variables have been loaded.
cd <Trafodion source directory>
source ./env.sh
Assumption: Trafodion is up and running.
Use sqlci to create the Trafodion metadata.
$ sqlci
>> initialize trafodion;
.
.
.
>> exit;
$
Validate Your Installation
You can use sqlci or trafci (connects via DCS) to validate your installation.
get schemas;
create table table1 (a int);
invoke table1;
insert into table1 values (1), (2), (3), (4);
select * from table1;
exit;
Assuming no errors, your installation has been successful. You can start working on your modifications.
Troubleshooting Notes
If you are not able to start up the environment, or if there are problems running sqlci or trafci, verify that all the processes are up and running.
- swstatus should show 6 java servers and 2 mysqld processes.
- sqcheck should indicate all processes are running.
If processes are not running as expected, then:
- sqstop to shut down Trafodion. If some Trafodion processes do not terminate cleanly, then run ckillall.
- swstopall to shut down the Hadoop ecosystem.
- swstartall to restart the Hadoop ecosystem.
- sqstart to restart Trafodion.
If problems persist, please review the logs:
- $MY_SQROOT/sql/local_hadoop/*/log: Hadoop, HBase, and Hive logs.
- $MY_SQROOT/logs: Trafodion logs.
To rebuild after changing source files, use the following commands:
sqstop
# edit source files
cd $MY_SQROOT/..
make all
sqstart
To shut down the environment, use the following commands:
sqstop
swstopall
To start it up later, use the following commands:
swstartall
sqstart
To check on the status, use these commands:
sqcheck
swstatus