...

  • Check JIRA for remaining issues tied to the release
    • Review and merge any blocking features
    • Bump other remaining features to subsequent releases
  • Ensure Spark versions are correct in the codebase
    • See this example commit
    • The places to change are (see the grep sketch after this list):
      • SBT build: Change version in file 'project/SparkBuild.scala'
      • Maven build: Change version in ALL the pom.xml files in repo. Note that the version should be SPARK-VERSION_SNAPSHOT and it will be changed to SPARK-VERSION automatically by Maven when cutting the release.
        • Exception: Change 'yarn/alpha/pom.xml' to SPARK-VERSION. Note that this is different from the main 'pom.xml' because the YARN alpha module does not get published as an artifact through Maven when cutting the release and so does not get version bumped from SPARK-VERSION_SNAPSHOT to SPARK-VERSION.
      • Spark REPLs
        • Scala REPL: Check inside 'repl/src/main/scala/org/apache/spark/repl/'
        • Python REPL: Check inside 'python/pyspark'
      • Docs: Change in file 'docs/_config.yml'
      • Spark EC2 scripts: Update the mapping between Spark and Shark versions and the default Spark version used in the cluster
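
One way to double-check that nothing was missed is a repository-wide search for the old version string. This is only a sketch: the version string is illustrative, the paths come from the list above, and 'ec2/' is assumed to hold the EC2 scripts.

Code Block
languagebash
# Illustrative check: look for a stale SNAPSHOT version in the files listed above
# (replace 0.9.1-SNAPSHOT with the version actually being released)
$ grep -rn "0.9.1-SNAPSHOT" project/SparkBuild.scala pom.xml docs/_config.yml \
    repl/ python/pyspark ec2/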

Check out and run tests

Code Block
languagebash
$ git clone https://git-wip-us.apache.org/repos/asf/spark.git -b branch-0.9
$ cd incubator-spark
$ sbt/sbt assembly
$ export MAVEN_OPTS="-Xmx3g -XX:MaxPermSize=1g -XX:ReservedCodeCacheSize=1g"
$ mvn test


Check for dead links in the docs

...

Code Block
languagebash
$ cd $SPARK_HOME/docs
$ jekyll serve --watch
$ sudo apt-get install linkchecker
$ linkchecker -r 2 http://localhost:4000 --no-status --no-warnings

 

Run License Audit Tool

Check whether all the source files have Apache license headers. Note that the Spark REPL files and some PySpark files carry other license headers (they are not Apache-licensed) and should be ignored.
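
One minimal way to run the audit is the standalone Apache RAT (Release Audit Tool) jar; this is only a sketch, and the jar version, exclude file, and grep pattern below are assumptions rather than the project's canonical invocation.

Code Block
languagebash
# Run Apache RAT over the checkout and keep the report
# (jar version and .rat-excludes file are placeholders)
$ java -jar apache-rat-0.10.jar --dir . -E .rat-excludes > rat-report.txt
# Files RAT could not classify are marked with '??' in the text report
$ grep -e "??" rat-report.txt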

...

Cutting a release candidate involves two steps. First, we use the Maven release plug-in to create a release commit (a single commit in which all of the version files carry the correct number) and publish the code associated with that release to a staging repository in Maven. Second, we check out that release commit and package the binary releases and documentation.
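
For reference, the Maven half of the first step boils down to the standard maven-release-plugin goals. The sketch below is illustrative only: the release script wraps these goals with project-specific profiles and arguments, and the version numbers and tag shown are placeholders.

Code Block
languagebash
# Sketch of the Maven release plug-in flow behind step one (illustrative versions)
$ mvn release:clean
$ mvn release:prepare -DreleaseVersion=0.9.1-incubating \
      -DdevelopmentVersion=0.9.2-incubating-SNAPSHOT -Dtag=v0.9.1-incubating
$ mvn release:perform   # builds from the tag and publishes artifacts to the staging repository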

Creating Release Candidates

  • The process of creating releases has been automated via this release script
  • Read and understand the script fully before you execute it. It will cut a Maven release, build binary releases and documentation, then copy the binary artifacts to a staging location on people.apache.org.
  • NOTE: You must use git 1.7.X for this or else you'll hit this horrible bug
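
A quick way to confirm that prerequisite before running the script:

Code Block
languagebash
# The release script requires git 1.7.x (see the note above)
$ git --version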

...

Code Block
languagebash
$ git fetch apache
$ git checkout apache/branch-0.8
$ git tag -d v0.8.1-incubating          # delete the RC tag locally
$ git push origin :v0.8.1-incubating    # delete the RC tag on the remote
$ git revert HEAD --no-edit    # revert dev version commit
$ git revert HEAD~2 --no-edit  # revert release commit
$ git push apache HEAD:branch-0.8       # push the reverts back to the release branch

Auditing a Staged Release Candidate

  • The process of auditing a release has been automated via this release audit script
  • The release auditor will test example builds against the staged artifacts, verify signatures, and check for common mistakes made when cutting a release
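
To spot-check a staged artifact by hand as well, a minimal sketch (the file names are illustrative):

Code Block
languagebash
# Verify the GPG signature and print the MD5 of a staged artifact
# for comparison against the posted checksum (file names are placeholders)
$ gpg --verify spark-0.9.1-incubating.tgz.asc spark-0.9.1-incubating.tgz
$ gpg --print-md MD5 spark-0.9.1-incubating.tgz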

...