...
Prepare for the release
Before your first release, you should perform one-time configuration steps. These set up your security keys for signing the release and your access to the various release repositories. To prepare for each release, you should audit the project status in the JIRA issue tracker and do the necessary bookkeeping. Finally, you should create a release branch from which the individual release candidates will be built.

One-time setup instructions

GPG Key

You need to have a GPG key to sign the release artifacts. Please be aware of the ASF-wide release signing guidelines. If you don’t have a GPG key associated with your Apache account, please create one according to the guidelines.

Determine your Apache GPG key and key ID, as follows:
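A minimal sketch, assuming a standard GnuPG installation:

Code Block
# list the public keys in your local keyring
gpg --list-keys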
This will list your GPG keys. One of these should reflect your Apache account, for example:
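Illustrative output only (shown in the older short key-ID format); your key type, date, and ID will differ:

Code Block
pub   4096R/845E6689 2020-01-01
uid                  <YOUR NAME> (CODE SIGNING KEY) <your-id@apache.org>
sub   4096R/0A1B2C3D 2020-01-01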
Here, the key ID is the 8-digit hex string on the pub line (845E6689 in the example above). Now, add your Apache GPG key to the Flink KEYS file in the release repository at dist.apache.org:
Code Block
svn co https://dist.apache.org/repos/dist/release/flink flink-dist-release-repo
cd flink-dist-release-repo
(gpg --list-sigs <YOUR_KEY_ID> && gpg --armor --export <YOUR_KEY_ID>) >> KEYS
svn ci -m "[flink] Add <YOUR_NAME>'s public key"
Configure git
to use this key when signing code by giving it your key ID, as follows:
Code Block
git config --global user.signingkey 845E6689
You may drop the --global
option if you’d prefer to use this key for the current repository only.
You may wish to start gpg-agent
to unlock your GPG key only once using your passphrase. Otherwise, you may need to enter this passphrase hundreds of times. The setup for gpg-agent
varies based on operating system, but may be something like this:
Code Block
eval $(gpg-agent --daemon --no-grab --write-env-file $HOME/.gpg-agent-info)
export GPG_TTY=$(tty)
export GPG_AGENT_INFO
Access to Apache Nexus repository
Configure access to the Apache Nexus repository, which enables final deployment of releases to the Maven Central Repository.
- Log in with your Apache account.
- Confirm you have appropriate access by finding org.apache.flink under Staging Profiles.
- Navigate to your Profile (top right dropdown menu of the page).
- Choose User Token from the dropdown, then click Access User Token. Copy a snippet of the Maven XML configuration block.
- Insert this snippet twice into your global Maven settings.xml file, typically ${HOME}/.m2/settings.xml. The end result should look like this, where TOKEN_NAME and TOKEN_PASSWORD are your secret tokens:

Code Block (settings.xml)
<settings>
  <servers>
    <server>
      <id>apache.releases.https</id>
      <username>TOKEN_NAME</username>
      <password>TOKEN_PASSWORD</password>
    </server>
    <server>
      <id>apache.snapshots.https</id>
      <username>TOKEN_NAME</username>
      <password>TOKEN_PASSWORD</password>
    </server>
  </servers>
</settings>
Website development setup
Get ready for updating the Flink website by following the website development instructions.
GNU Tar Setup for Mac
Skip this step if you are not using a Mac. The default tar application on Mac does not support the GNU archive format and defaults to Pax. This bloats the archive with unnecessary metadata that can result in additional files when decompressing (see the 1.15.2-RC2 vote thread). Install gnu-tar and create a symbolic link so that it is used in preference to the default tar program.
Code Block
brew install gnu-tar
ln -s /usr/local/bin/gtar /usr/local/bin/tar
which tar
Create a new version in JIRA
When contributors resolve an issue in JIRA, they are tagging it with a release that will contain their changes. With the release currently underway, new issues should be resolved against a subsequent future release. Therefore, you should create a release item for this subsequent release, as follows:
- In JIRA, navigate to Flink > Administration > Versions.
- Add a new release: choose the next minor version number compared to the one currently underway, select today’s date as the Start Date, and choose Add.
(Note: Only PMC members have access to the project administration. If you do not have access, ask on the mailing list for assistance.)
Triage release-blocking issues in JIRA
There could be outstanding release-blocking issues, which should be triaged before proceeding to build a release candidate. We track them by assigning a specific Fix Version field even before the issue is resolved.
The list of release-blocking issues is available at the version status page. Triage each unresolved issue with one of the following resolutions:
- If the issue has been resolved and JIRA was not updated, resolve it accordingly.
- If the issue has not been resolved and it is acceptable to defer this until the next release, update the Fix Version field to the new version you just created. Please consider discussing this with stakeholders and the dev@ mailing list, as appropriate.
  - When using the "Bulk Change" functionality of Jira:
    - First, add the newly created version to the Fix Version field of all unresolved tickets that have the old version among their Fix Versions.
    - Afterwards, remove the old version from the Fix Version field.
- If the issue has not been resolved and it is not acceptable to release until it is fixed, the release cannot proceed. Instead, work with the Flink community to resolve the issue.
Review and update documentation
There are a few pages in the documentation that need to be reviewed and updated for each release.
- Ensure that there exists a release notes page for each non-bugfix release (e.g., 1.5.0) in ./docs/release-notes/, that it is up-to-date, and that it is linked from the start page of the documentation.
- Upgrading Applications and Flink Versions: https://ci.apache.org/projects/flink/flink-docs-master/ops/upgrading.html
- please extend this list.
Cross team testing
For user facing features that go into the release we'd like to ensure they can actually be used by Flink users. To achieve this the release managers ensure that an issue for cross team testing is created in the Apache Flink Jira. This can and should be picked up by other community members to verify the functionality and usability of the feature.
The issue should contain some entry points which enables other community members to test it. It should not contain documentation on how to use the feature as this should be part of the actual documentation. The cross team tests are performed after the feature freeze. Documentation should be in place before that. Those tests are manual tests, so do not confuse them with automated tests.
To sum that up:
- User facing features should be tested by other contributors
- The scope is usability and sanity of the feature
- The feature needs to be already documented
- The contributor creates an issue containing some pointers on how to get started (e.g. link to the documentation, suggested targets of verification)
- Other community members pick those issues up and provide feedback
- Cross team testing happens right after the feature freeze
If these pages have not been updated yet, please create a JIRA ticket and mark it as a release blocker.
Setup environment variables
Set up a few environment variables to simplify commands that follow. (We use bash
Unix syntax in this guide.)
Code Block
RELEASE_VERSION="1.5.0"
SHORT_RELEASE_VERSION="1.5"
CURRENT_SNAPSHOT_VERSION="$SHORT_RELEASE_VERSION-SNAPSHOT"
NEXT_SNAPSHOT_VERSION="1.6-SNAPSHOT"
SHORT_NEXT_SNAPSHOT_VERSION="1.6"
Review Release Notes in JIRA
JIRA automatically generates Release Notes based on the Fix Version
field applied to issues. Release Notes are intended for Flink users (not Flink committers/contributors). You should ensure that Release Notes are informative and useful.
Open the release notes from the version status page by choosing the release underway and clicking Release Notes.
You should verify that the issues listed automatically by JIRA are appropriate to appear in the Release Notes. Specifically, issues should:
- Be appropriately classified as Bug, New Feature, Improvement, etc.
- Represent noteworthy user-facing changes, such as new functionality, backward-incompatible API changes, or performance improvements.
- Have occurred since the previous release; an issue that was introduced and fixed between releases should not appear in the Release Notes.
- Have an issue title that makes sense when read on its own.
Adjust any of the above properties to improve the clarity and presentation of the Release Notes.
Ensure that the JIRA release notes are also included in the release notes of the documentation (see section "Review and update documentation").
Content of Release Notes field from JIRA tickets
To get the contents of the "Release Note" field from JIRA, you can run the following query against the JIRA REST API (note that maxResults limits the number of entries):
Code Block
curl -s "https://issues.apache.org/jira/rest/api/2/search?maxResults=200&jql=project%20%3D%20FLINK%20AND%20%22Release%20Note%22%20is%20not%20EMPTY%20and%20fixVersion%20%3D%20${RELEASE_VERSION}" | jq '.issues[]|.key,.fields.summary,.fields.customfield_12310192' | paste - - -
jq is present in most Linux distributions; on macOS it can be installed via brew.
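For example:

Code Block
brew install jq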
Verify Java and Maven Version
All of the following steps require the use of Maven 3.8.6 and Java 8. Modify your PATH environment variable accordingly if needed.
Info: Please make sure you are using Maven 3.8.6 and Java 8.
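A quick way to check both (mvn -version also prints the Java version it runs with):

Code Block
mvn -version
java -version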
Clone flink into a fresh workspace
Create a new directory for this release and clone the Flink repo from github to ensure you have a clean workspace. This step is optional.
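For example (a sketch; you may clone your own fork instead and add the Apache repository as a remote):

Code Block
git clone https://github.com/apache/flink.git
cd flink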
Verify that a Release Build Works
Run mvn -Prelease clean install
to ensure that the build processes that are specific to that profile are in good shape. This step is optional.
Verify that no exclusions were erroneously added to the japicmp plugin that break compatibility guarantees
Check the exclusions for the japicmp-maven-plugin in the root pom for exclusions that:
- for minor releases, break source compatibility for
@Public
APIs - for patch releases, break source/binary compatibility for
@Public
/@PublicEvolving
APIs
Any such exclusion must be properly justified, in advance.
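One way to spot recently added exclusions is to inspect the history of the root pom and its japicmp configuration (a sketch; the exact element names may vary slightly between plugin versions):

Code Block
# recent commits that touched the root pom
git log --oneline -20 -- pom.xml
# list the currently configured japicmp exclusions
grep -n -A 3 "<exclude>" pom.xml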
Create a release branch
Release candidates are built from a release branch. As a final step in preparation for the release, you should create the release branch, push it to the code repository (you should probably do this once the whole process is done), and update version information on the original branch.
Most of the following commands have to be executed in the tools
directory; we will prefix the command prompt to make this explicit.
Major release

If you are doing a new major release, you need to update the Flink version in the following repositories.

Flink repository

Create a branch for the new version that we want to release before updating the master branch to the next development version:

Code Block
tools $ releasing/create_snapshot_branch.sh
tools $ git checkout master
tools $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$NEXT_SNAPSHOT_VERSION releasing/update_branch_version.sh

In the master branch, add a new value (i.e. v1_16("1.16")) to the FlinkVersion enum, as the last entry. The enum resides in the flink-annotations module, in the org.apache.flink package:

Code Block
v1_12("1.12"),
v1_13("1.13"),
v1_14("1.14"),
v1_15("1.15"),
v1_16("1.16");

The newly created branch and the updated master branch need to be pushed to the official repository.

Flink Docker repository

Afterwards fork off from dev-master a dev-x.y branch in the https://github.com/apache/flink-docker repository. Make sure that flink-docker/testing/run_travis_tests.sh points to the correct snapshot version; for dev-x.y it should point to x.y-SNAPSHOT, while for dev-master it should point to the most recent snapshot version ($NEXT_SNAPSHOT_VERSION).

After pushing the new major release branch, as the last step you should also update the documentation workflow to also build the documentation for the new release branch. Check Managing Documentation for details on how to do that. You may also want to manually trigger a build to make the changes visible as soon as possible.

Flink Benchmarks repository

Inside the https://github.com/apache/flink-benchmarks repository you need to manually update the flink.version property in the main pom.xml file. It should point to the most recent snapshot version ($NEXT_SNAPSHOT_VERSION). For example:

Code Block
<flink.version>1.13-SNAPSHOT</flink.version>
Minor release

If you're creating a new minor release you do not need to modify Flink Benchmarks. You will skip the "Major release" step and simply check out the already existing branch for that version:

Code Block
tools $ git checkout release-$SHORT_RELEASE_VERSION
The rest of this guide assumes that commands are run in the root (or tools directory) of a repository on the branch of the release version with the above environment variables set.
Checklist to proceed to the next step
- Release Manager’s GPG key is published to
dist.apache.org
- Release Manager’s GPG key is configured in
git
configuration - Release Manager's GPG key is configured as the default gpg key.
- Release Manager has
org.apache.flink
listed underStaging Profiles
in Nexus - Release Manager’s Nexus User Token is configured in
settings.xml
- There are no release blocking JIRA issues
- Release Notes in JIRA have been audited and adjusted
- Update upgrade compatibility table (docs/ops/upgrading.md).
- Update Release Overview in Confluence
- (major only) Release branch has been created and pushed
- Cron job has been added on the release branch in (tools/azure-pipelines/build-apache-repo.yml)
- (major only) Originating branch has the version information updated to the new version
- (major only) New version is added to the
FlinkVersion
Enum. - (major only) Make sure flink-docker has dev-x.y branch and docker e2e tests run against this branch
- (major only) docs/config.toml has been updated appropriately.
- (major only) The documentation for the new major release is visible under https://nightlies.apache.org/flink/flink-docs-release-$SHORT_RELEASE_VERSION (after at least one doc build finishes).
- (major only) The documentation for the new major release does not contain "-SNAPSHOT" in its version title, and all links refer to the corresponding version docs instead of master.
- (major only) The
flink.version
property of Flink Benchmark repo has been updated to the latest snapshot version.
Build a release candidate
The core of the release process is the build-vote-fix cycle. Each cycle produces one release candidate. The Release Manager repeats this cycle until the community approves one release candidate, which is then finalized.

Build and stage Java and Python artifacts

Set up a few environment variables to simplify the Maven commands that follow. These identify the release candidate being built. Start with RC_NUM equal to 1 and increment it for each new candidate:

Code Block
RC_NUM="1"
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
Now, create a release branch (this step cannot be skipped for minor releases):
Code Block
$ cd tools
tools $ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION RELEASE_CANDIDATE=$RC_NUM releasing/create_release_branch.sh
Tag the release commit:
Code Block
git tag -s ${TAG} -m "${TAG}"
We now need to do several things:
- Create the source release archive
- Deploy jar artefacts to the Apache Nexus Repository, which is the staging area for deploying the jars to Maven Central
- Build PyFlink wheel packages (since 1.11)
You might want to create a directory on your local machine for collecting the various source and binary releases before uploading them. Creating the binary releases is a lengthy process, but you can do this on another machine (for example, in the "cloud"). When doing this, you can skip signing the release files on the remote machine, download them to your local machine, and sign them there.
First, we build the source release:
Code Block
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_source_release.sh
Next, we stage the maven artifacts:
Code Block
tools $ releasing/deploy_staging_jars.sh
Review all staged artifacts (https://repository.apache.org/). They should contain all relevant parts for each module, including pom.xml
, jar, test jar, source, test source, javadoc, etc. Carefully review any new artifacts.
Close the staging repository on Apache Nexus. When prompted for a description, enter “Apache Flink, version X, release candidate Y”.
Then, you need to build the PyFlink wheel packages (since 1.11):
- Trigger the Azure Pipelines manually to build the PyFlink wheel packages
  - Go to your Azure Pipelines Flink project → Pipelines
  - Click the "Run pipeline" button on the top right
  - Select "GitHub" → your GitHub Flink repository → "Existing Azure Pipelines YAML file"
  - Select your branch → Set path to ".azure-pipelines.yaml" → click on "Variables"
  - Then click the "Add Variable" button, fill the name with "MODE" and the value with "release". Click "Create" to set the variable, then go back to the "Run pipeline" screen and trigger the build.
  - You should now see a build where only the "CI build (release)" is running
- Download the PyFlink wheel packages after the jobs "build_wheels mac" and "build_wheels linux" have finished
  - Open the build result page of the pipeline
  - Go to the `Artifacts` page (build_wheels linux -> 1 artifact)
  - Click `wheel_Darwin_build_wheels mac` and `wheel_Linux_build_wheels linux` separately to download the zip files
- Unzip these two zip files

Code Block
$ cd /path/to/downloaded_wheel_packages
$ unzip wheel_Linux_build_wheels\ linux.zip
$ unzip wheel_Darwin_build_wheels\ mac.zip

- Create the directory `dist` under the directory of flink-python

Code Block
$ cd <flink-dir>
$ mkdir flink-python/dist

- Move the unzipped wheel packages to the directory flink-python/dist

Code Block
$ mv /path/to/wheel_Darwin_build_wheels\ mac/* flink-python/dist/
$ mv /path/to/wheel_Linux_build_wheels\ linux/* flink-python/dist/
$ cd tools

Finally, we create the binary convenience release files:

Code Block
tools $ RELEASE_VERSION=$RELEASE_VERSION releasing/create_binary_release.sh
If you want to run this step in parallel on a remote machine you have to make the release commit available there (for example by pushing to a repository). This is important: the commit inside the binary builds has to match the commit of the source builds and the tagged release commit. When building remotely, you can skip gpg signing by setting SKIP_GPG=true
. You would then sign the files manually after downloading them to your machine:
Code Block
for f in flink-*-bin*.tgz; do gpg --armor --detach-sig $f; done
gpg --armor --detach-sig apache-flink-*.tar.gz
The release manager needs to make sure the PyPI projects `apache-flink` and `apache-flink-libraries` have enough available space for the python artifacts. The remaining space must be larger than the size of `tools/releasing/release/python`. Log in with the PyPI admin account (account info is only available to PMC members) and check the remaining space in the project settings.
Request an increase if there's not enough space. Note, it could take some days for PyPI to review our request.
Stage source and binary releases on dist.apache.org
Copy the source release to the dev repository of dist.apache.org
.
If you have not already, check out the Flink section of the dev
repository on dist.apache.org
via Subversion. In a fresh directory:
Code Block
svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
Make a directory for the new release:
Code Block
mkdir flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
Copy the Flink source/binary distributions, hashes, GPG signatures, and the python subdirectory:
Code Block
mv <flink-dir>/tools/releasing/release/* flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
Add and commit all the files.
Code Block
cd flink
svn add flink-${RELEASE_VERSION}-rc${RC_NUM}
svn commit -m "Add flink-${RELEASE_VERSION}-rc${RC_NUM}"
Verify that files are present
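For example, via svn (a sketch; you can also browse the URL in a web browser):

Code Block
svn list https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}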
(Push the release tag)
If you haven't pushed the release tag yet, here's the command:
Code Block
git push <remote> refs/tags/release-${RELEASE_VERSION}-rc${RC_NUM}
Propose a pull request for website updates
The final step of building the candidate is to propose a website pull request containing the following changes:
- update docs/data/flink.yml
- Add a new major version or update minor version as required
- update docs/data/release_archive.yml
- update version references in quickstarts (
q/
directory) as required (outdated?) - add a blog post announcing the release in
_posts
- (major only) add an organized release notes page under docs/content/release-notes and docs/content.zh/release-notes (like https://nightlies.apache.org/flink/flink-docs-release-1.15/release-notes/flink-1.15/). The page is based on the non-empty release notes collected from the issues; only issues that affect existing users should be included (e.g., behavior changes rather than new functionality). It should be in a separate PR since it will be merged to the flink project.
Don’t merge the PRs before finalizing the release.
Checklist to proceed to the next step
- Maven artifacts deployed to the staging repository of repository.apache.org
- Source distribution deployed to the dev repository of dist.apache.org
- Website pull request proposed to list the release
- (major only) Check
docs/config.toml
to ensure that- the version constants refer to the new version
- the
baseurl
does not point toflink-docs-master
butflink-docs-release-X.Y
instead
You can (optionally) also do additional verification by:
- Check hashes (e.g. shasum -c *.sha512)
- Check signatures (e.g. gpg --verify flink-1.2.3-source-release.tar.gz.asc flink-1.2.3-source-release.tar.gz)
- grep for legal headers in each file (see the sketch below)
- If time allows, check in advance the NOTICE files of the modules whose dependencies have been changed in this release, since license issues pop up from time to time during voting. See the "Checking License" section of Verifying a Flink Release.
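A possible spot check for the license headers (a sketch; the ASF header is matched by its first line only):

Code Block
# list Java sources in the extracted source release that lack the ASF license header
grep -rL "Licensed to the Apache Software Foundation" --include="*.java" .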
Vote on the release candidate
Once you have built and individually reviewed the release candidate, please share it for the community-wide review. Please review foundation-wide voting guidelines for more information. Start the review-and-vote thread on the dev@ mailing list. Here’s an email template; please adjust as you see fit.
If there are any issues found in the release candidate, reply on the vote thread to cancel the vote. There’s no need to wait 72 hours. Proceed to the "Fix any issues" step below and address the problem. For cancelling a release, the release manager needs to send an email to the release candidate thread, stating that the release candidate is officially cancelled. Next, all artifacts created specifically for the RC in the previous steps need to be removed (the Maven artifacts staged on Apache Nexus and the files staged under the dev repository on dist.apache.org).
If there are no issues, reply on the vote thread to close the voting. Then, tally the votes in a separate email. Here’s an email template; please adjust as you see fit.
Checklist to proceed to the finalization step
Fix any issues
Any issues identified during the community review and vote should be fixed in this step. Code changes should be proposed as standard pull requests to the master branch and reviewed using the normal contributing process; the relevant changes are then cherry-picked into the release branch. Once all issues have been resolved, you should go back and build a new release candidate with these changes.

Checklist to proceed to the next step
Finalize the release
Once the release candidate has been reviewed and approved by the community, the release should be finalized. This involves the final deployment of the release candidate to the release repositories, merging of the website changes, etc.

Deploy Python artifacts to PyPI (Since 1.9)

The release manager should create a PyPI account and ask the PMC to add this account to the pyflink collaborator list with the Maintainer role (the PyPI admin account info can be found here; NOTE: only visible to PMC members) in order to deploy the Python artifacts to PyPI. The artifacts can be uploaded using twine (https://pypi.org/project/twine/). To install twine, just run:
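A minimal sketch (the guide may pin a specific twine version; any reasonably recent version should work):

Code Block
pip install --upgrade twine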
Note: Please ensure that the version of twine you use is recent enough; uploads with very old versions may fail.
Download the python artifacts from dist.apache.org and upload them to pypi.org:
Code Block
# check out the staged release candidate (it contains the python subdirectory)
svn checkout https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM}
cd flink-${RELEASE_VERSION}-rc${RC_NUM}
cd python
# upload the wheels
for f in *.whl; do twine upload --repository-url https://upload.pypi.org/legacy/ $f $f.asc; done
# upload the source packages
twine upload --repository-url https://upload.pypi.org/legacy/ apache-flink-libraries-${RELEASE_VERSION}.tar.gz apache-flink-libraries-${RELEASE_VERSION}.tar.gz.asc
twine upload --repository-url https://upload.pypi.org/legacy/ apache-flink-${RELEASE_VERSION}.tar.gz apache-flink-${RELEASE_VERSION}.tar.gz.asc

If the upload failed or was incorrect for some reason (e.g. a network transmission problem), you need to delete the uploaded release package of the same version (if it exists), rename the artifact to apache-flink-${RELEASE_VERSION}.post0.tar.gz, and then re-upload. Note: re-uploading to pypi.org must be avoided as much as possible because it causes irreparable problems: users can no longer install the apache-flink package by explicitly specifying the package version, i.e. "pip install apache-flink==${RELEASE_VERSION}" will fail. Instead they have to run "pip install apache-flink" or "pip install apache-flink==${RELEASE_VERSION}.post0" to install the apache-flink package.
Git tag
Create and push a new Git tag for the released version by copying the tag for the final release candidate, as follows:
Code Block
git tag -s "release-${RELEASE_VERSION}" refs/tags/${TAG}^{} -m "Release Flink ${RELEASE_VERSION}"
git push <remote> refs/tags/release-${RELEASE_VERSION}
Deploy artifacts to Maven Central Repository
Use the Apache Nexus repository to release the staged binary artifacts to the Maven Central repository. In the Staging Repositories
section, find the relevant release candidate orgapacheflink-XXX
entry and click Release
. Drop all other release candidates that are not being released.
Deploy source and binary releases to dist.apache.org
Copy the source and binary releases from the dev
repository to the release
repository at dist.apache.org
using Subversion.
Code Block
svn move -m "Release Flink ${RELEASE_VERSION}" https://dist.apache.org/repos/dist/dev/flink/flink-${RELEASE_VERSION}-rc${RC_NUM} https://dist.apache.org/repos/dist/release/flink/flink-${RELEASE_VERSION}
(Note: Only PMC members have access to the release repository. If you do not have access, ask on the mailing list for assistance.)
Remove old release candidates from dist.apache.org
Remove the old release candidates from https://dist.apache.org/repos/dist/dev/flink using Subversion.
Code Block
svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
cd flink
svn remove flink-${RELEASE_VERSION}-rc*
svn commit -m "Remove old release candidates for Apache Flink ${RELEASE_VERSION}"
Mark the version as released in JIRA
In JIRA, inside version management, hover over the current release and a settings menu will appear. Click Release
, and select today’s date.
(Note: Only PMC members have access to the project administration. If you do not have access, ask on the mailing list for assistance.)
If PRs have been merged to the release branch after the last release candidate was tagged, make sure that the corresponding Jira tickets have the correct Fix Version set.
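One way to spot such commits (a sketch, using the RC tag and release branch names from this guide):

Code Block
# commits on the release branch that are not contained in the last release candidate tag
git log --oneline ${TAG}..release-${SHORT_RELEASE_VERSION}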
Publish the Dockerfiles for the new release
Note: the official Dockerfiles fetch the binary distribution of the target Flink version from an Apache mirror. After publishing the binary release artifacts, mirrors can take some hours to start serving the new artifacts, so you may want to wait to do this step until you are ready to continue with the "Promote the release" steps below.
Follow the instructions in the flink-docker repo to build the new Dockerfiles and send an updated manifest to Docker Hub so the new images are built and published.
Info: Please make sure "publishing to DockerHub: apache/flink" is finished before the announce mail or announcement blog post is sent.
Checklist to proceed to the next step
- Python artifacts released and indexed in the PyPI Repository (https://pypi.org/project/apache-flink/#history)
- Maven artifacts released and indexed in the Maven Central Repository (usually takes about a day to show up)
- Source & binary distributions available in the release repository of https://dist.apache.org/repos/dist/release/flink/
- Dev repository https://dist.apache.org/repos/dist/dev/flink/ is empty
- Release tagged in the source code repository
- Release version finalized in JIRA. (Note: Not all committers have administrator access to JIRA. If you end up getting permissions errors ask on the mailing list for assistance)
- Website contains links to new release binaries and sources in download page
- For major releases, the front page references the correct new major release version and directs to the correct link
- Dockerfiles in flink-docker updated for the new Flink release and pull request opened on the Docker official-images with an updated manifest
Promote the release
Once the release has been finalized, the last step of the process is to promote the release within the project and beyond. Please wait for 24h after finalizing the release, in accordance with the ASF release policy.

Update japicmp configuration

Update the japicmp reference version and wipe exclusions / enable API compatibility checks for @PublicEvolving APIs on the release branch. For a new major release (x.y.0), run the same command also on the master branch to update the japicmp reference version and remove outdated exclusions in the japicmp configuration. Make sure that all Maven artifacts are already pushed to Maven Central; otherwise, there is a risk that CI fails due to missing reference artifacts.

Merge website pull request

Merge the website pull request to list the release. Make sure to regenerate the website as well, as it is not built automatically.

Remove outdated versions

dist.apache.org

For a new major release, remove all release files older than 2 versions, e.g., when releasing 1.7, remove all releases <= 1.5. For a new bugfix version, remove all release files for previous bugfix releases in the same series, e.g., when releasing 1.7.1, remove the 1.7.0 release. If you have not already, check out the Flink section of the release repository on dist.apache.org via Subversion. Remove the files for the outdated releases and commit the changes, then verify that the files are removed.
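A sketch of the removal, reusing the release repository checkout from the GPG key setup earlier in this guide (replace <OLD_VERSION> with the version to drop):

Code Block
cd flink-dist-release-repo
svn update
svn remove flink-<OLD_VERSION>
svn commit -m "Remove outdated release flink-<OLD_VERSION>"
# verify the listing afterwards
svn list https://dist.apache.org/repos/dist/release/flink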
CI

Disable the cron job for the now-unsupported version in tools/azure-pipelines/build-apache-repo.yml in the respective branch.

Apache mailing lists

Announce on the dev@ mailing list that the release has been finished. Announce the release on the user@ mailing list, listing major improvements and contributions. Announce the release on the announce@apache.org mailing list.
Recordkeeping

Use reporter.apache.org to seed the information about the release into future project reports. (Note: Only PMC members have access to report releases. If you do not have access, ask on the mailing list for assistance.)

Flink blog

Major or otherwise important releases should have a blog post. Write one if needed for this particular release. Minor releases that don’t introduce new major functionality don’t necessarily need to be blogged (see flink-web PR #581 for Flink 1.15.3 as an example of a minor release blog post). Please make sure that the release notes of the documentation (see section "Review and update documentation") are linked from the blog post of a major release.
Social media

Tweet, post on Facebook, LinkedIn, and other platforms. Ask other contributors to do the same.

Flink Release Wiki page

Add a summary of things that went well or not so well during the release process. This can include feedback from contributors, but also more generic points, such as the release having taken longer than initially anticipated (and why), to give some context to the release process.

End of Life for Unsupported versions

For major versions only. As per our support policy for old Flink versions, when we release a new 1.x version we should start a discussion thread to end support for old versions.
Checklist to declare the process completed
Improve the process
It is important that we improve the release process over time. Once you’ve finished the release, please take a step back and look at which areas of this process could be improved. Perhaps some parts of the process can be simplified. Perhaps parts of this guide can be clarified.
...