...
JDK installation on Apple ARM:

Code Block (bash): JDK 8 on arm

brew install homebrew/cask-versions/adoptopenjdk8 --cask
brew untap adoptopenjdk/openjdk
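As a quick sanity check you can point JAVA_HOME at the freshly installed JDK 8 and confirm the JVM is picked up. A minimal sketch; /usr/libexec/java_home is the standard macOS helper for locating installed JDKs, and the 1.8 selector assumes the cask above is the JDK 8 you want:

Code Block (bash): set JAVA_HOME (optional sanity check)

# locate the installed JDK 8 and export it for Maven and the Hadoop/Hive builds
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
# print the active Java version to confirm the export worked
java -version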
...
Notes for ARM: after a proper configuration, you should see something like this:

Code Block: mvn configured on arm

mvn -version
Apache Maven 3.6.3 (cecedd343002696d0abb50b32b541b8a6ba2883f)
Maven home: /Users/yourusername/programs/apache-maven-3.6.3
Java version: 1.8.0_292, vendor: AdoptOpenJDK, runtime: /Library/Java/JavaVirtualMachines/adoptopenjdk-8.jdk/Contents/Home/jre
Default locale: en_HU, platform encoding: UTF-8
OS name: "mac os x", version: "10.16", arch: "x86_64", family: "mac"
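Note that the arch shows x86_64 even on Apple Silicon: AdoptOpenJDK 8 has no native Apple Silicon build for macOS, so the JDK runs through Rosetta 2. If you want to double-check which architecture your JDK binary targets, a small sketch (assuming the JDK was installed via the cask above):

Code Block (bash): check JDK architecture (optional)

# inspect the java binary of the located JDK 8; on Apple Silicon the
# AdoptOpenJDK 8 build reports x86_64, i.e. it runs via Rosetta 2
file "$(/usr/libexec/java_home -v 1.8)/bin/java"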
...
You have to download and compile Protobuf, and also install it into the local Maven repository. Protobuf 2.5.0 is not ready for ARM, so on this chipset you will need a few extra steps.
Code Block (bash): download and configure protobuf

wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.bz2
tar -xvf protobuf-2.5.0.tar.bz2
cd protobuf-2.5.0
./configure
On ARM, edit src/google/protobuf/stubs/platform_macros.h and add ARM to the processor architecture detection section, after the last #elif branch:
Code Block (c): platform_macros.h addition for ARM

#elif defined(__arm64__)
#define GOOGLE_PROTOBUF_ARCH_ARM 1
#define GOOGLE_PROTOBUF_ARCH_64_BIT 1
Now, you can compile and install protobuf:
Code Block (bash): compile and install protobuf

make
make check
sudo make install
You can validate your install:
Code Block (bash): validate protoc

protoc --version
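The local Maven repository step mentioned above can be done from the java/ directory of the protobuf source tree, which ships its own pom.xml and uses the protoc you just built. A minimal sketch, run from inside protobuf-2.5.0 after the native build has finished:

Code Block (bash): install protobuf-java into the local Maven repository (sketch)

# build the Java bindings and install protobuf-java 2.5.0 into ~/.m2,
# skipping tests to keep the build short
cd java
mvn install -DskipTests
cd ..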
Hadoop
First, work through the official documentation for the single-node, pseudo-distributed configuration: https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-common/SingleCluster.html#Pseudo-Distributed_Operation.
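For reference, the pseudo-distributed setup from that page boils down to formatting the NameNode and starting HDFS. The commands below are a condensed sketch of that guide; they are run from the extracted Hadoop directory and assume the core-site.xml and hdfs-site.xml edits described there have already been made and that ssh to localhost works:

Code Block (bash): pseudo-distributed HDFS quick start (sketch)

# format the HDFS filesystem (one-time step)
bin/hdfs namenode -format
# start the NameNode and DataNode daemons
sbin/start-dfs.sh
# verify that HDFS answers
bin/hdfs dfs -ls /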
After that, set up HADOOP_HOME:
Code Block (bash):

export HADOOP_HOME=/yourpathtohadoop/hadoop-3.3.6
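Optionally, you can also put the Hadoop launchers on your PATH so the hadoop and hdfs commands work without the full $HADOOP_HOME/bin prefix used below:

Code Block (bash): add Hadoop to PATH (optional)

export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"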
Tez
Tez requires some additional steps. Hadoop consumes a Tez tarball, but it expects a different directory structure inside the archive than the one Tez is released with, so we extract the tarball and compress it again. We also put the extracted jars into HDFS, and finally set the necessary environment variables.
Download Tez, then extract and re-compress the tarball:
Code Block (bash):

wget https://dlcdn.apache.org/tez/0.10.2/apache-tez-0.10.2-bin.tar.gz
tar -xzvf apache-tez-0.10.2-bin.tar.gz
cd apache-tez-0.10.2-bin
tar zcvf ../apache-tez-0.10.2-bin.tar.gz * && cd ..
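If you want to confirm the re-packaging worked, the jars should now sit at the root of the archive rather than under an apache-tez-0.10.2-bin/ prefix. A quick check:

Code Block (bash): inspect the re-packed tarball (optional)

# list the first few entries; they should not start with apache-tez-0.10.2-bin/
tar -tzf apache-tez-0.10.2-bin.tar.gz | head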
Add the necessary Tez files to HDFS:
Code Block (bash):

$HADOOP_HOME/bin/hadoop fs -mkdir -p /apps/tez
$HADOOP_HOME/bin/hadoop fs -put apache-tez-0.10.2-bin.tar.gz /apps/tez # copy the tarball
$HADOOP_HOME/bin/hadoop fs -put apache-tez-0.10.2-bin /apps/tez # copy the whole folder
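You can list the directory to make sure both the tarball and the extracted folder landed in HDFS:

Code Block (bash): verify the upload (optional)

$HADOOP_HOME/bin/hadoop fs -ls /apps/tez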
Set up the TEZ_HOME environment variable:
Code Block (bash):

export TEZ_HOME=/yourpathtotez/apache-tez-0.10.2-bin
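Depending on how you run Hive, you may also need Tez on the Hadoop classpath and a tez-site.xml whose tez.lib.uris points at the tarball uploaded above. The Apache Tez install guide uses environment variables along these lines; the TEZ_CONF_DIR location is an assumption, adjust it to wherever your tez-site.xml lives:

Code Block (bash): Tez classpath environment variables (sketch)

# directory that contains tez-site.xml (assumed location)
export TEZ_CONF_DIR=$TEZ_HOME/conf
# put the Tez configuration and jars on the Hadoop classpath
export HADOOP_CLASSPATH=$TEZ_CONF_DIR:$TEZ_HOME/*:$TEZ_HOME/lib/*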
Installing Hive from a Tarball
Start by downloading the most recent stable release of Hive from one of the Apache download mirrors (see Hive Releases).
...