The core of the release process is the build-vote-fix cycle. Each cycle produces one release candidate. The Release Manager repeats this cycle until the community approves one release candidate, which is then finalized.

Build and stage Java and Python artifacts

Set up a few environment variables to simplify the Maven commands that follow. These identify the release candidate being built. Start with RC_NUM equal to 1 and increment it for each candidate:

```
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
```
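As a concrete sketch of how the variables compose (the version number below is hypothetical; in the real process RELEASE_VERSION is already set from the earlier steps):

```shell
# Hypothetical values for illustration only
RELEASE_VERSION="1.17.0"
RC_NUM="1"
# The tag name used throughout the rest of the process
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
echo "${TAG}"   # release-1.17.0-rc1
```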
Now, create a release branch:

```
$ cd tools
tools $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
```
Tag the release commit:

```
git tag -s ${TAG} -m "${TAG}"
```
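Before building artifacts from the tag, it is worth confirming that the tag exists and resolves (and, for the real signed tag, that `git tag -v ${TAG}` verifies the GPG signature). The sketch below is illustrative only: it uses a scratch repository and an unsigned annotated tag so the commands are self-contained; against the Flink checkout you would run only the last command.

```shell
set -e
TAG="release-1.17.0-rc1"   # hypothetical version
# Scratch repo so this sketch is self-contained
repo=$(mktemp -d)
cd "$repo"
git init -q
git -c user.name=rm -c user.email=rm@example.com commit -q --allow-empty -m "release commit"
# The real release tag is created with -s (signed); -a is used here only
# because the scratch repo has no GPG key
git -c user.name=rm -c user.email=rm@example.com tag -a -m "${TAG}" "${TAG}"
# The actual check: the tag must resolve to a commit
git rev-parse --verify -q "refs/tags/${TAG}" >/dev/null && echo "tag present: ${TAG}"
```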
We now need to do several things:

- Create the source release archive
- Deploy jar artifacts to the Apache Nexus Repository, which is the staging area for deploying the jars to Maven Central
- Build PyFlink wheel packages (since 1.11)
- Create binary convenience releases for different Hadoop versions

You might want to create a directory on your local machine for collecting the various source and binary releases before uploading them. Creating the binary releases is a lengthy process, but you can do it on another machine (for example, in the "cloud"). When doing this, you can skip signing the release files on the remote machine, download them to your local machine, and sign them there.

First, we build the source release:

```
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
```
Next, we stage the Maven artifacts:

```
tools $ releasing/deploy_staging_jars.sh
```
Review all staged artifacts (https://repository.apache.org/). They should contain all relevant parts for each module, including pom.xml, jar, test jar, source, test source, javadoc, etc. Carefully review any new artifacts. Close the staging repository on Apache Nexus. When prompted for a description, enter "Apache Flink, version X, release candidate Y".

Then, you need to build the PyFlink wheel packages (since 1.11):

- Set up an Azure Pipeline in your own Azure account. You can refer to Azure Pipelines for more details on how to set up a pipeline for a fork of the Flink repository. Note that a Google Cloud mirror in Europe is used for downloading Maven artifacts, so it is recommended to set your Azure organization region to Europe to speed up the downloads.
- Push the release candidate branch to your forked personal Flink repository, e.g.

```
tools $ git push <remote> refs/heads/release-${RELEASE_VERSION}-rc${RC_NUM}:release-${RELEASE_VERSION}-rc${RC_NUM}
```
- Trigger the Azure Pipelines manually to build the PyFlink wheel packages:
  - Go to your Azure Pipelines Flink project → Pipelines
  - Click the "Run pipeline" button on the top right
  - Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines YAML file"
  - Select your branch / commit → set the path to ".azure-pipelines.yaml" → click on "Variables"
  - Click the "Add Variable" button, fill in the name with "MODE" and the value with "release". Click "Create" to set the variable, then go back to the "Run pipeline" screen and trigger the build.
  - You should now see a build where only the "CI build (release)" is running
- Download the PyFlink wheel packages from the build result page after the "build_wheels mac" and "build_wheels linux" jobs have finished:
  - Open the build result page of the pipeline
  - Go to the `Artifacts` page (build_wheels linux -> 1 artifact)
  - Click `wheel_Darwin_build_wheels mac` and `wheel_Linux_build_wheels linux` separately to download the zip files
Unzip these two zip files:

```
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip
```
Create a directory `dist` under the flink-python directory:

```
$ cd <flink-dir>
$ mkdir flink-python/dist
```
Move the unzipped wheel packages to the flink-python/dist directory:

```
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools
```
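Before continuing, it can help to sanity-check that everything staged in flink-python/dist is actually a wheel file. A minimal sketch, using a stand-in file name so the commands are self-contained:

```shell
set -e
dist="flink-python/dist"
mkdir -p "$dist"
# Stand-in wheel; in the real process the dist/ directory holds the wheels
# downloaded from Azure Pipelines
touch "$dist/apache_flink-1.17.0-cp39-cp39-manylinux1_x86_64.whl"
# The actual check: fail fast if anything other than .whl files ended up in dist/
for f in "$dist"/*; do
  case "$f" in
    *.whl) echo "ok: $f" ;;
    *)     echo "unexpected file: $f" >&2; exit 1 ;;
  esac
done
```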
Finally, we create the binary convenience release files:

```
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
```
If you want to run this step in parallel on a remote machine, you have to make the release commit available there (for example by pushing to a repository). This is important: the commit inside the binary builds has to match the commit of the source builds and the tagged release commit. When building remotely, you can skip GPG signing by setting SKIP_GPG=true. You would then sign the files manually after downloading them to your machine:

```
for f in flink-*-bin*.tgz; do gpg --armor --detach-sig $f; done
gpg --armor --detach-sig apache-flink-*.tar.gz
```
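After signing, a quick check that every binary archive has a matching detached signature can catch a forgotten file. A sketch with stand-in file names (in the real release the archives come from create_binary_release.sh and the .asc files from gpg):

```shell
set -e
# Stand-in archive and signature so the check below is self-contained
touch flink-1.17.0-bin-scala_2.12.tgz flink-1.17.0-bin-scala_2.12.tgz.asc
# The actual check: every .tgz must have a .asc next to it
for f in flink-*-bin*.tgz; do
  if [ -f "$f.asc" ]; then
    echo "signed: $f"
  else
    echo "missing signature: $f" >&2
    exit 1
  fi
done
```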
The release manager needs to make sure the PyPI projects `apache-flink` and `apache-flink-libraries` have enough available space for the Python artifacts. The remaining space must be larger than the size of `tools/releasing/release/python`. Log in with the PyPI admin account (account info is only available to PMC members) and check the remaining space in the project settings. Request an increase if there is not enough space. Note that it can take some days for PyPI to review our request.

Stage source and binary releases on dist.apache.org

Copy the source release to the dev repository of dist.apache.org. If you have not already, check out the Flink section of the dev repository on dist.apache.org via Subversion. In a fresh directory:

```
svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
```
Make a directory for the new release:

```
mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
```
Copy the Flink source/binary distributions, hashes, GPG signatures, and the python subdirectory:

```
mv <flink-dir>/tools/releasing/release/* flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
```
Add and commit all the files:

```
cd flink
svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
```

Verify that the files are present.
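One way to verify is `svn list` against the new directory on dist.apache.org, or a local check that the RC directory has the expected layout before committing. A self-contained sketch with stand-in file names (the exact archive names depend on the release scripts):

```shell
set -e
# Stand-in RC directory mirroring what the release scripts produce;
# version and file names are hypothetical
rc="flink-1.17.0-rc1"
mkdir -p "$rc/python"
touch "$rc/flink-1.17.0-src.tgz" "$rc/flink-1.17.0-src.tgz.sha512" "$rc/flink-1.17.0-src.tgz.asc"
# The actual check: archive, checksum, signature, and python/ must all exist
for f in "$rc"/*-src.tgz "$rc"/*-src.tgz.sha512 "$rc"/*-src.tgz.asc "$rc/python"; do
  [ -e "$f" ] && echo "present: $f"
done
```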
(Push the release tag) If you haven't pushed the release tag yet, here is the command:

```
git push <remote> refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
```
Propose a pull request for website updates

The final step of building the candidate is to propose a website pull request. Start by updating the variables for the latest released version in the top-level _config.yml (this should have been done when creating a new branch for the release) and list the new release in downloads.md, linking to the source code download and the Release Notes in JIRA. Also add a new blog entry announcing the release in _posts. Finally, propose a pull request with these changes. (Don't merge before finalizing the release.)

Checklist to proceed to the next step

- Maven artifacts deployed to the staging repository of repository.apache.org
- Source distribution deployed to the dev repository of dist.apache.org
- Website pull request proposed to list the release
- Check docs/config.toml to ensure that
  - the version constants refer to the new version
  - the baseurl does not point to flink-docs-master but to flink-docs-release-X.Y instead

You can (optionally) also do additional verification by:

- Checking hashes (e.g. shasum -c *.sha512)
- Checking signatures (e.g. gpg --verify flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz)
- Grepping for legal headers in each file
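As a concrete example of the hash check, using a stand-in file so the commands are self-contained (coreutils `sha512sum` shown; on macOS, `shasum -a 512` and `shasum -c` are the equivalents):

```shell
set -e
# Stand-in release archive; the real file comes from create_source_release.sh
echo "stand-in release archive" > flink-1.17.0-src.tgz
sha512sum flink-1.17.0-src.tgz > flink-1.17.0-src.tgz.sha512
# Verifies the archive against its checksum file; prints "flink-1.17.0-src.tgz: OK"
sha512sum -c flink-1.17.0-src.tgz.sha512
```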