Out of date

Many of the details on this page are out of date. For current steps on producing/distributing the release artifacts, see the RELEASE.md file in the git repo.

Producing Source Artifacts

The actual exercise of producing the release tarballs is currently quite simple: run bin/release.sh -v <VERSION>. The script is little more than an svn export; however, there are a few key things it does beyond this that are worth understanding should it ever be necessary to modify it. It starts by querying the head revision and storing it in a variable so that all subsequent exports can be made from that specific revision (see the sketch after the list below). This serves two purposes:

  1. It guarantees that separate exports will be of the same revision. Just doing svn export proton-c; svn export proton-j could result in different svn revisions if there were intervening commits.
  2. It allows each release artifact to record the specific svn revision (along with repo and branch info) from which the export was made. This guarantees that any release artifact can be mapped back to a precise point in the history of our tree, even if it was made from just a random snapshot of an arbitrary feature branch.
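
For illustration only, the pinned-revision export might look roughly like the sketch below. The repository URL layout is an assumption, and the variable names are simply borrowed from the mvn command further down; the real logic lives in bin/release.sh.

SVN_URL=https://svn.apache.org/repos/asf/qpid/proton/trunk      # assumed URL layout
REV=$(svn info "$SVN_URL" | awk '/^Revision:/ {print $2}')      # query head once

# every export uses the same -r, so intervening commits cannot skew the artifacts
svn export -r "$REV" "$SVN_URL/proton-c" "$WORKDIR/$rootname/proton-c"
svn export -r "$REV" "$SVN_URL/proton-j" "$WORKDIR/$rootname/proton-j"

# record the repo/branch/revision so the artifact maps back to this exact point
svn info -r "$REV" "$SVN_URL" > "$WORKDIR/$rootname/SVN_INFO"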

Another extremely important factor here is that the release script itself should never have local changes when producing a release. If it does, the logic that generates the release tarballs is not captured in the svn tree that the release artifacts refer to. The only time the release script should ever have local changes is when testing changes to the script itself; any artifacts generated by those test runs should be thrown away, and the changes checked in before producing proper artifacts.
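
A quick pre-flight check for this is simply svn status; it should print nothing for bin/release.sh (or, ideally, for the whole working copy) before a release is cut:

# any 'M' lines here mean the release logic is not what is recorded in svn
svn status bin/release.sh
svn status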

The above measures are important in ensuring that releases are repeatable. Beyond this there is one structural change to the source that the release script makes: the top-level tests directory is merged into both the proton-c and proton-j tarballs. Also, for the Java release, this magic incantation is run:

mvn org.codehaus.mojo:versions-maven-plugin:1.2:set org.codehaus.mojo:versions-maven-plugin:1.2:commit -DnewVersion="${VERSION}" -f ${WORKDIR}/${rootname}/pom.xml
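
The tests merge itself is conceptually just a copy into each per-language export before it is tarred up. The destination paths below are assumptions for illustration, not taken from release.sh:

# merge the shared top-level tests directory into each per-language export
cp -r "$WORKDIR/$rootname/tests" "$WORKDIR/proton-c-$VERSION/tests"
cp -r "$WORKDIR/$rootname/tests" "$WORKDIR/proton-j-$VERSION/tests"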

Beyond this there is signing, checksums and whatnot, but that is all documented as part of the standard Apache release process.
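
For completeness, the signing and checksum steps usually reduce to standard GnuPG and coreutils commands along the lines below; the exact files to produce and the checksum algorithms required are whatever the current Apache release policy says:

# detached, ASCII-armored signature plus a checksum for each tarball
gpg --armor --detach-sign qpid-proton-${VERSION}.tar.gz          # produces .asc
sha512sum qpid-proton-${VERSION}.tar.gz > qpid-proton-${VERSION}.tar.gz.sha512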

Note that it really doesn't matter what branching policy is used on the svn tree: release candidates could be made from a feature branch, from trunk, or from a release branch. Because the artifacts capture the precise point in svn history they were made from, we can always retroactively branch and tag from that point should the release candidate prove worthy.
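
Retroactively tagging the exact revision an artifact was built from is then a single server-side copy; the repository URL layout below is an assumption:

# tag the precise revision recorded in the artifact's SVN_INFO
svn copy -r "$REV" \
    https://svn.apache.org/repos/asf/qpid/proton/trunk \
    https://svn.apache.org/repos/asf/qpid/proton/tags/${VERSION} \
    -m "Tag ${VERSION} from r${REV}"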

Testing the Source Artifacts

Testing Checklist

The following checklist can be used to sanity check the output of the steps above. Ideally we will automate some or all of these checks.

Structure

Untar qpid-proton-<VERSION>.tar.gz and check that the following files and folders exist, performing the checks indicated (a rough automation sketch follows the tree).

.
|-- CMakeLists.txt
|-- LICENSE
|-- README
|-- SVN_INFO           ** check contents **
|-- TODO
|-- bin
|   `-- release.sh
|-- config.sh
|-- pom.xml            ** check version number **
|-- proton-c
|   |-- CMakeLists.txt
|   |-- ...
|-- proton-j
|   |-- ...
`-- tests
    |-- ...
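
As a small step toward automating this structure check, a sketch along the following lines would flag missing top-level entries (the expected names are taken from the tree above; the pom.xml grep is only a crude version check):

cd qpid-proton-${VERSION}
for f in CMakeLists.txt LICENSE README SVN_INFO TODO bin/release.sh config.sh \
         pom.xml proton-c proton-j tests; do
    [ -e "$f" ] || echo "MISSING: $f"
done
grep -q "<version>${VERSION}</version>" pom.xml || echo "pom.xml version looks wrong"
cat SVN_INFO    # eyeball the repo/branch/revision the tarball was cut from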

Build and install

Follow the instructions in README to install proton, preferably using a temporary install prefix, e.g.:

mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/myprefix ... 
make DESTDIR=/tmp/test-install-proton install

Check that the following files exist under the install directory.

.
|-- myprefix
|   |-- bin
|   |   |-- proton
|   |   `-- proton-dump
|   |-- include
|   |   `-- proton
|   |       |-- buffer.h
|   |       |-- ...
|   |-- lib
|   |   |-- libqpid-proton.so -> libqpid-proton.so.1
|   |   |-- ...
|   `-- share
|       |-- man
|       |   `-- man1
|       |       `-- proton.1
|       `-- proton-0.3
|           |-- LICENSE
|           |-- README
|           |-- TODO
|           `-- docs
|               `-- api-c
|                   |-- ...
|
`-- <installed bindings - details depend on your environment >

TODO: Java installation is not yet implemented (as of Jan 2013), so the best you can do is check that the proton-api, proton-j-impl and proton-jni jars have been built under ./build/.
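
A quick way to do that check from the top of the source tree (jar names per the note above; the exact locations under build/ may vary):

# list any proton jars produced by the Java build
find build -name 'proton*.jar' -print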

Test

Follow the instructions in README to run the tests.

Testing Notes

The challenge here is the variety of environmental factors to account for. For proton-c itself we have different architectures to contend with, as well as different cmake versions and different versions of the few key libraries we depend on (e.g. openssl). On top of this, for each binding we have not only the language version but also the version of swig available, and even if those are the same, the configuration of the interpreter may vary depending on which OS we're building on.

All of this makes for a myriad of combinations, and we don't yet have much in the way of CI coverage to help us. Even if we did, many of the factors are related to install testing and won't show up unless you do a full make install as root and try to access the proton library from each binding as an end user would. While not impossible, this level of testing is much more difficult to automate, so for the short to medium term at least, testing the C artifacts requires a reasonable amount of manual checking.

The full check for any given configuration/environment involves installing all the dependencies required for the optional pieces (omitting any that are unavailable), doing a full build and a make install (as root), running the python test suite (which checks both the C library and the python binding), and then, for each of the non-python bindings, doing at least a minimal check that the binding works and running any binding tests available. A rough outline follows.
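
As a rough outline only (the exact test and binding commands are in the README and vary by environment):

# outline of a full check for one configuration; flags and paths are illustrative
mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=/usr ..     # plus whatever options enable the optional bindings
make all
sudo make install                        # install testing needs a real, rooted install
# next, run the python test suite as described in the README (it exercises both the
# C library and the python binding), then for each non-python binding do at least a
# minimal check that it works and run any binding tests that are available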

During this process some steps may fail, e.g. the specific version of a given language may not build, or the openssl library might not be recent enough. Some level of that is OK, as long as we document what did and did not work for the given configuration/environment.

Producing Binary Artifacts

For the C implementation we simply don't do this. For the Java implementation there are a bunch of steps that I don't really understand but have learned to monkey through:

  1. untar and cd into the proton source tarball
  2. run "mvn -P apache-release deploy" (note that for this step to succeed you need to do mysterious and annoying things like putting your apache password in cleartext inside your .m2/settings.xml file)
  3. go to https://repository.apache.org/index.html, login and click on one of the Staging links (Repositories I think) and find the thing that maven just uploaded (it should say open or opened or something like that and have a recent date)
  4. poke at the clumsy ui until you figure out how to "close" it, and then record the URL it gives you for the repo
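
For step 2, the credentials live in the <servers> section of ~/.m2/settings.xml. The server id shown below comes from the Apache parent pom at the time of writing, so verify it against the current Apache release documentation before relying on it:

# the <servers> entry that "mvn -P apache-release deploy" authenticates with lives in
# ~/.m2/settings.xml; illustrative content only:
#
#   <server>
#     <id>apache.releases.https</id>
#     <username>YOUR_APACHE_USERNAME</username>
#     <password>YOUR_APACHE_PASSWORD</password>
#   </server>
#
# once that is in place, from the untarred source:
mvn -P apache-release deploy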

Notes

The only stage here that really needs to be done by one person is producing the release tarballs. I expect that as we wish to offer more binary artifacts, and producing them becomes more complex (e.g. requiring a windows box), we will need to distribute the responsibility for producing binaries, e.g. have a java developer produce the java binaries, a windows developer produce windows binaries, a ruby developer produce gems, and so forth.
