...

First, check if there are outstanding blockers for your target version on JIRA. If there are none, make sure the unit tests pass. Note that the Maven tests are highly dependent on the run environment. It’s a good idea to verify that they have been passing in Jenkins before spending hours trying to fix them yourself.
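
For example, a quick way to list open blockers is to query the ASF JIRA REST API; the JQL below is only a sketch, and the target version string is a placeholder to adjust for your release:

Code Block
languagebash
# Sketch: list unresolved blockers targeting the release (version is a placeholder)
$ curl -s -G "https://issues.apache.org/jira/rest/api/2/search" \
    --data-urlencode 'jql=project = SPARK AND priority = Blocker AND resolution = Unresolved AND "Target Version/s" = 1.1.1' \
    --data-urlencode 'fields=key,summary'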

Code Block
languagebash
$ git clone https://git-wip-us.apache.org/repos/asf/spark.git -b branch-1.1
$ cd spark
$ sbt/sbt clean assembly test

# Maven needs at least 3G of JVM memory; for example (exact flags vary by JDK):
$ export MAVEN_OPTS="-Xmx3g -XX:MaxPermSize=512m"
$ mvn -DskipTests clean package
$ mvn test
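
If a particular suite keeps failing locally, it can be faster to re-run just that suite than the whole build; for example (the suite name below is only illustrative):

Code Block
languagebash
# Re-run a single ScalaTest suite via sbt; substitute the suite you are debugging
$ sbt/sbt "test-only org.apache.spark.rdd.RDDSuite"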

Additionally, check for dead links in the documentation.

Code Block
languagebash
$ cd spark/docs
# Serve the docs locally; leave this running and continue in a second terminal
$ jekyll serve --watch
$ sudo apt-get install linkchecker
# Crawl the locally served site two levels deep, reporting only broken links
$ linkchecker -r 2 http://localhost:4000 --no-status --no-warnings

Next, ensure that all Spark versions are correct in the code base (see this example commit). You should grep through the codebase to find all instances of the version string; a sketch of such a sweep follows the list below. Some known places to change are:

  • SparkContext. Search for VERSION (only for branch 1.x)
  • Maven build. Ensure that the version in all the pom.xml files is <SPARK-VERSION>-SNAPSHOT (e.g. 1.1.1-SNAPSHOT). This will be changed to <SPARK-VERSION> (e.g. 1.1.1) automatically by Maven when cutting the release. Note that there are a few exceptions that should just use <SPARK-VERSION>, namely yarn/alpha/pom.xml and extras/java8-tests/pom.xml. These modules are not published as artifacts.
  • Spark REPLs. Look for the Spark ASCII art in SparkILoopInit.scala for the Scala shell and in shell.py for the Python REPL.
  • Docs. Search for VERSION in docs/_config.yml
  • Spark EC2 scripts. Update the default Spark version and the mapping between Spark and Shark versions.
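
As a rough version of the grep sweep mentioned above, something like the following can surface lingering version strings; the 1.1.0 string and the paths are illustrative and should be adjusted to your previous and target versions:

Code Block
languagebash
# Illustrative sweep for stale version strings; substitute your own versions
$ grep -rn "1\.1\.0" --include=pom.xml .
$ grep -rn "1\.1\.0" docs/_config.yml python/pyspark/shell.py repl/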

Finally, update CHANGES.txt with this script in the Spark repository. CHANGES.txt captures all the patches that have made it into this release candidate since the last release.

Code Block
languagebash
$ export SPARK_HOME=<your Spark home>
$ cd spark
# Update the release versions defined in the script, then run it
$ vim dev/create-release/generate-changelist.py
$ dev/create-release/generate-changelist.py

Ensure Spark is Ready for a Release

...