
Spark Installation

Follow the instructions here: https://spark.apache.org/docs/latest/spark-standalone.html.  Make sure the following steps are completed:

  1. Install Spark (either download a pre-built Spark package, or build the assembly from source).  Note that Spark has different distributions for different versions of Hadoop.  Keep note of the spark-assembly-*.jar location on the node that Hive will run from.
  2. Start the Spark cluster (master and workers).  Keep note of the Spark master URL.  A quick way to confirm the cluster is reachable is sketched after this list.
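
Before moving on to the Hive configuration, it can help to confirm that the standalone cluster accepts jobs. The following is a minimal Scala sketch (not part of the official setup steps); the application name and the master host/port are placeholders, so substitute the master URL you noted in step 2.

    // Minimal sanity check for the standalone cluster (a sketch, not part of the Hive setup).
    // "spark://master-host:7077" is a placeholder; use the master URL noted in step 2.
    import org.apache.spark.{SparkConf, SparkContext}

    object SparkClusterCheck {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("spark-cluster-check")
          .setMaster("spark://master-host:7077")
        val sc = new SparkContext(conf)
        // Runs a trivial job on the workers; prints the sum if the cluster responds.
        println(sc.parallelize(1 to 100).map(_.toLong).reduce(_ + _))
        sc.stop()
      }
    }

If this job completes and the sum is printed, the master and at least one worker are up and able to schedule work.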

Configuring Hive

Known Issues
