...
Now, add your Apache GPG key to the Flink KEYS file in the release repository at dist.apache.org. Follow the instructions listed at the top of that file. (Note: only PMC members have write access to the release repository. If you get 403 errors, ask on the mailing list for assistance.) PMC members can use the following script to add their Apache GPG key to the KEYS file in the release repository.
Code Block
svn co https://dist.apache.org/repos/dist/release/flink flink-dist-release-repo
cd flink-dist-release-repo
(gpg --list-sigs <YOUR_KEY_ID> && gpg --armor --export <YOUR_KEY_ID>) >> KEYS
svn ci -m "[flink] Add <YOUR_NAME>'s public key"
Configure git to use this key when signing code by giving it your key ID, as follows:
Code Block
git config --global user.signingkey 845E6689
You may drop the --global option if you’d prefer to use this key for the current repository only.
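The key ID (845E6689 above is only an example) is the long hex identifier of your GPG key. A minimal sketch of finding it and confirming that git picked it up, assuming gpg and git are installed:

```shell
# List secret keys with long key IDs; the ID follows "sec <algo>/".
# "|| true" keeps this step non-fatal on a machine with an empty keyring.
gpg --list-secret-keys --keyid-format=long || true

# Point git at the key (845E6689 is the example ID used above).
git config --global user.signingkey 845E6689
git config --global user.signingkey   # prints the configured key ID
```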
You may wish to start gpg-agent to unlock your GPG key only once using your passphrase. Otherwise, you may need to enter this passphrase hundreds of times. The setup for gpg-agent varies based on operating system, but may be something like this:
...
The core of the release process is the build-vote-fix cycle. Each cycle produces one release candidate. The Release Manager repeats this cycle until the community approves one release candidate, which is then finalized.
Build and
...
stage artifacts
Set up a few environment variables to simplify the Maven commands that follow. These identify the release candidate being built. Start with RC_NUM equal to 1 and increment it for each candidate.
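For example, the variables used by the scripts below might be set like this (the version numbers are placeholders, not a recommendation):

```shell
# Hypothetical values -- substitute the actual versions for your release.
RELEASE_VERSION="0.2.0"                  # version being released
CURRENT_SNAPSHOT_VERSION="0.2-SNAPSHOT"  # snapshot version on the branch
RC_NUM="1"                               # start at 1, increment per candidate
TAG="release-${RELEASE_VERSION}-rc${RC_NUM}"
echo "${TAG}"   # prints: release-0.2.0-rc1
```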
...
Code Block
RELEASE_CANDIDATE=$RC_NUM tools/releasing/create_release_branch.sh
Update branch version:
Code Block
$ OLD_VERSION=$CURRENT_SNAPSHOT_VERSION NEW_VERSION=$RELEASE_VERSION tools/releasing/update_branch_version.sh
Tag the release commit:
Code Block
git tag -s ${TAG} -m "${TAG}"
See Telling Git about your signing key
We now need to do several things:
- Create the source release archive
- Create the Python artifacts
- Deploy jar artifacts to the Apache Nexus Repository, which is the staging area for deploying the jars to Maven Central
First, we build the source release:
Code Block
$ RELEASE_VERSION=$RELEASE_VERSION tools/releasing/create_source_release.sh
Next, we build the Python artifacts:
Code Block
$ RELEASE_VERSION=$RELEASE_VERSION tools/releasing/create_python_sdk_release.sh
You will be able to find the built source and Python release artifacts under the "release/" folder created under the project root directory.
Finally, we stage the Maven artifacts:
Code Block
$ tools/releasing/deploy_staging_jars.sh
Review all staged artifacts (https://repository.apache.org/). They should contain all relevant parts for each module, including pom.xml, jar, test jar, source, test source, javadoc, etc. Carefully review any new artifacts.
Close the staging repository on Apache Nexus. When prompted for a description, enter “Apache Flink Table Store, version X, release candidate Y”.
Finally, we create the binary convenience release files:
Code Block
$ RELEASE_VERSION=$RELEASE_VERSION tools/releasing/create_binary_release.sh
Stage source and binary releases on dist.apache.org
...
If you have not already, check out the Flink section of the dev repository on dist.apache.org via Subversion. In a fresh directory:
Code Block language bash
svn checkout https://dist.apache.org/repos/dist/dev/flink --depth=immediates
Make a directory for the new release:
Code Block language bash
mkdir flink/flink-table-store-${RELEASE_VERSION}-rc${RC_NUM}
Copy source distributions, hashes, and GPG signature:
Code Block
mv <flink-table-store-dir>/release/* flink/flink-table-store-${RELEASE_VERSION}-rc${RC_NUM}
Copy binary distributions, hashes, and GPG signature:
Code Block
mv <flink-table-store-dir>/tools/releasing/release/* flink/flink-table-store-${RELEASE_VERSION}-rc${RC_NUM}
Add and commit all the files.
Code Block language bash
cd flink
svn add flink-table-store-${RELEASE_VERSION}-rc${RC_NUM}
svn commit -m "Apache Flink Table Store, version ${RELEASE_VERSION}, release candidate ${RC_NUM}"
Verify that files are present
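Besides checking that the files are present, it is worth verifying the hashes and signatures. A self-contained sketch of the sha512 check (the file name here is a stand-in for a real distribution archive; for the signatures the analogous command is gpg --verify <artifact>.asc <artifact>):

```shell
# Create a stand-in artifact and its checksum file, as the release
# tooling would, then verify it the way a reviewer would.
printf 'demo artifact' > flink-table-store-demo-src.tgz
sha512sum flink-table-store-demo-src.tgz > flink-table-store-demo-src.tgz.sha512
sha512sum -c flink-table-store-demo-src.tgz.sha512   # prints: ...: OK
```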
...
Code Block
From: Release Manager
To: dev@flink.apache.org
Subject: [VOTE] Apache Flink Table Store Release 1.2.3, release candidate #3
Hi everyone,
Please review and vote on the release candidate #3 for the version 1.2.3 of Apache Flink Table Store,
as follows:
[ ] +1, Approve the release
[ ] -1, Do not approve the release (please provide specific comments)
**Release Overview**
As an overview, the release consists of the following:
a) Flink Table Store source release to be deployed to dist.apache.org
b) Flink Table Store Python source distributions to be deployed to PyPI
c) Maven artifacts to be deployed to the Maven Central Repository
**Staging Areas to Review**
The staging areas containing the above mentioned artifacts are as follows, for your review:
* All artifacts for a) and b) can be found in the corresponding dev repository at dist.apache.org [2], which are signed with the key with fingerprint FFFFFFFF [3]
* All artifacts for b) can be found at PyPI [4]
* All artifacts for c) can be found at the Apache Nexus Repository [5]
Other links for your review:
* JIRA release notes [6]
* Source code tag "release-1.2.3-rc3" [7]
* PR to update the website Downloads page to include Flink Table Store links [8]
**Vote Duration**
The voting time will run for at least 72 hours.
It is adopted by majority approval, with at least 3 PMC affirmative votes.
Thanks,
Release manager
[1] link
[2] link
[3] https://dist.apache.org/repos/dist/release/flink/KEYS
[4] link
[5] link
[6] link
[7] link
[8] link
...
- Issues identified during vote have been resolved, with fixes committed to the release branch.
Finalize the release
Once the release candidate has been reviewed and approved by the community, the release should be finalized. This involves the final deployment of the release candidate to the release repositories, merging of the website changes, etc.
Deploy Python artifacts to PyPI (Since 1.9)
The release manager should create a PyPI account and ask the PMC to add this account to the pyflink collaborator list with the Maintainer role (the PyPI admin account info can be found here; note: only visible to PMC members) in order to deploy the Python artifacts to PyPI. The artifacts can be uploaded using twine (https://pypi.org/project/twine/). To install twine, run:
Code Block
$ pip install --upgrade twine==1.12.0
Download the Python artifacts from dist.apache.org and upload them to pypi.org:
...
...
Deploy artifacts to Maven Central Repository
Use the Apache Nexus repository to release the staged binary artifacts to the Maven Central repository. In the Staging Repositories section, find the relevant release candidate orgapacheflink-XXX entry and click Release. Drop all other release candidates that are not being released.
...
Code Block
$ svn move -m "Release Flink Table Store ${RELEASE_VERSION}" \
      https://dist.apache.org/repos/dist/dev/flink/flink-table-store-${RELEASE_VERSION}-rc${RC_NUM} \
      https://dist.apache.org/repos/dist/release/flink/flink-table-store-${RELEASE_VERSION}
...
If you have not already, check out the Flink section of the release repository on dist.apache.org via Subversion. In a fresh directory:
Code Block language bash
svn checkout https://dist.apache.org/repos/dist/release/flink --depth=immediates
cd flink
Remove files for outdated releases and commit the changes.
Code Block language bash
svn remove flink-table-store-<version_to_remove>
svn commit -m "Remove old release for Apache Flink Table Store ${RELEASE_VERSION_MAJOR}"
Verify that files are removed
...
- add a new entry to release_archive.flink_table_store
- for major releases, add a new entry to flink_table_store_releases for the release binaries and sources
- for minor releases, update the entry for the previous release in the series in flink_table_store_releases

Please pay attention to the ids assigned to the download entries. They should be unique and reflect their corresponding version number.
Checklist to proceed to the next step
- Python artifacts released and indexed in the PyPI Repository
- Maven artifacts released and indexed in the Maven Central Repository (usually takes about a day to show up)
- Source & binary distributions available in the release repository of https://dist.apache.org/repos/dist/release/flink/
- Dev repository https://dist.apache.org/repos/dist/dev/flink/ is empty
- Release tagged in the source code repository
- Release version finalized in JIRA. (Note: Not all committers have administrator access to JIRA. If you end up getting permissions errors ask on the mailing list for assistance)
- Website contains links to new release binaries and sources in download page
- For major releases, the front page references the correct new major release version and directs to the correct link
...
Code Block
From: Release Manager
To: dev@flink.apache.org, user@flink.apache.org, user-zh@flink.apache.org, announce@apache.org
Subject: [ANNOUNCE] Apache Flink Table Store 1.2.3 released

The Apache Flink community is very happy to announce the release of Apache Flink Table Store 1.2.3.

Apache Flink Table Store provides storage for building dynamic tables for both stream and batch processing in Flink, supporting high-speed data ingestion and timely data query.

Please check out the release blog post for an overview of the release:
https://flink.apache.org/news/2020/04/07/release-table-store-0.1.0.html

The release is available for download at:
https://flink.apache.org/downloads.html

Maven artifacts for Flink Table Store can be found at:
https://search.maven.org/search?q=g:org.apache.flink%20table-store

The full release notes are available in Jira:
https://issues.apache.org/jira/secure/ReleaseNote.jspa?projectId=12315522&version=12346878

We would like to thank all contributors of the Apache Flink community who made this release possible!

Regards,
Release Manager
...