
How do I configure SLF4J?

Add a log4j-test.properties file under your module's Java test directory, then add the following snippet to your build.gradle file; a minimal sketch of the properties file is shown after the snippet.

test {
  systemProperty "log4j.configuration", "log4j-test.properties"
}

dependencies {
  shadow library.java.slf4j_api
  shadow library.java.slf4j_log4j12
  // or shadow library.java.slf4j_jdk14
}
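
For reference, a minimal log4j-test.properties might look like the following (the log level and pattern here are only an illustration; adjust them to your needs):

# Send everything at INFO and above to the console
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{HH:mm:ss} %-5p %c{1} - %m%n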

The second dependency, shadow library.java.slf4j_log4j12, is not necessary if it is already provided by another library. You can check the dependency tree by running ./gradlew dependencies to see whether it has been included.
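
For example, to inspect a single module's test classpath (the project path and configuration name here are only illustrative; pick the ones relevant to your module):

./gradlew :examples:java:dependencies --configuration testRuntimeClasspath | grep slf4j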

If you encounter an error message like this:

SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation

This means that no SLF4J binding was found, so you need to add library.java.slf4j_log4j12 or library.java.slf4j_jdk14 to your build.gradle file.


How do I automatically format code and avoid spotless errors?

You can set up a git pre-commit hook to always autoformat code by putting the following in .git/hooks/pre-commit and setting the executable bit.

    #!/bin/sh
    set -e
    ./gradlew spotlessApply
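
To set the executable bit on the hook:

    chmod +x .git/hooks/pre-commit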

If you haven't used git hooks, the docs are here: https://git-scm.com/docs/githooks.

Using git commit --no-verify will skip the hook, and `chmod u-x .git/hooks/pre-commit` will disable it.


How do I run a single test?

./gradlew :examples:java:test --tests org.apache.beam.examples.subprocess.ExampleEchoPipelineTest --info
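
The --tests filter also accepts wildcard patterns and individual test methods, for example (the method name here is only illustrative):

./gradlew :examples:java:test --tests '*ExampleEchoPipelineTest.testNameOfYourTestMethod' --info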

Running the Java Dataflow Hello World pipeline with a compiled Dataflow Java worker

The commands below pass the GCP project name and temp location under several different property names. This is intentional, since different targets read different property names.

Before running the commands, remember to configure your gcloud credentials and add GOOGLE_APPLICATION_CREDENTIALS to your environment variables.
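
For example (the key path below is just a placeholder for your own service account key):

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your-service-account-key.json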

./gradlew :runners:google-cloud-dataflow-java:examples:preCommitLegacyWorker -PdataflowProject=<GcpProjectName> -Pproject=<GcpProjectName> -PgcpProject=<GcpProjectName> -PgcsTempRoot=<Gcs location in format: gs://..., no trailing slash> -PdataflowTempRoot=<Gcs location in format: gs://...>

./gradlew :runners:google-cloud-dataflow-java:examples:preCommitFnApiWorker -PdataflowProject=<GcpProjectName> -Pproject=<GcpProjectName> -PgcpProject=<GcpProjectName>  -PgcsTempRoot=<Gcs location in format: gs://..., no trailing slash> -PdataflowTempRoot=<Gcs location in format: gs://..., no trailing slash> -PdockerImageRoot=<docker image store location in format gcr.io/...>

Running a user-defined pipeline (the example uses the Java Direct Runner)

If you want to run your own pipeline while also changing Beam repo code for development or testing purposes, here is an example.

If it is just a simple runner like the direct runner, all you need to do is put your pipeline code under the examples folder and then add the following build target to the related build.gradle:

task execute(type: JavaExec) {
  main = "org.apache.beam.examples.SideInputWordCount"
  classpath = configurations."directRunnerPreCommit"
}
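
You can then run the pipeline through that task, for example (the exact project path depends on which build.gradle you added the task to):

./gradlew :examples:java:execute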


There are also alternative approaches, with slight differences:

1) Create a Maven project, and use the following command to publish the changed code to your local repository.

 ./gradlew -Ppublishing -PnoSigning publishMavenJavaPublicationToMavenLocal
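
Your Maven project can then depend on the locally published snapshot. A minimal sketch, assuming you want the core SDK and that the version matches the snapshot you built (2.20.0-SNAPSHOT in the linkage checker example further down this page):

<dependency>
  <groupId>org.apache.beam</groupId>
  <artifactId>beam-sdks-java-core</artifactId>
  <version>2.20.0-SNAPSHOT</version>
</dependency>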

2) Make use of integration tests, and make your user-defined pipeline part of an integration test.

Continue on error

--continue – this flag makes tasks such as compileJava report all errors they find instead of stopping at the first one.
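
For example:

./gradlew --continue compileJava compileTestJava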

IntelliJ Proto Intellisense doesn't work.

This can happen when you start IntelliJ or (in my case) after modifying protos.

This is not a solved problem yet.

Currently tried approaches:

  1. Clean build from console
  2. Build from IntelliJ
  3. Refresh Gradle Project in IntelliJ
  4. Restart IntelliJ
  5. Another option that should help if the index is not updated by steps 3 and 4: https://stackoverflow.com/questions/6652540/rebuild-intellij-project-indexes

Workaround that did the trick (since many things were tried in the process and there is no clear way to reproduce the error, these might not be the correct or minimal steps; update them if you find a shorter or cleaner way):

  1. Refresh the Gradle project in IntelliJ
  2. Close IntelliJ
  3. Clean build the project from the console (./gradlew clean cleanTest build -x testWebsite -x :rat -x test)
  4. Open IntelliJ


What command should I run locally before creating a pull request?

We recommend running the following command to catch common style issues, potential bugs (using static code analysis), and Javadoc issues before creating a pull request. Running it takes 5-10 minutes.

./gradlew spotlessApply && ./gradlew checkstyleMain checkstyleTest javadoc spotbugsMain compileJava compileTestJava

If you don't run these checks locally, they will be run during presubmit by Jenkins. However, if they fail during presubmit, you may not see the output of the test failures, so running them first is recommended to make your development process a bit smoother as you iterate on your PR until it passes presubmit.

How do I perform a dependency upgrade?

To perform a dependency upgrade we want to ensure that the PR is not introducing any new linkage errors. We do this by combining successful Jenkins test runs with analysis performed using a linkage checker. This allows us to gain confidence that we are minimizing the number of linkage issues that will arise for users.

The overall process is:

  1. Find all gradle subprojects that are impacted by the dependency change
  2. For each gradle subproject, perform the before and after linkage checker analysis and provide the results as part of your PR
  3. For each gradle subproject, find and run relevant Jenkins test suites

Find all gradle subprojects that are impacted by the dependency change

The command below will print out a dependency report in a text file for each project:

./gradlew dependencyReport

You may then grep for a specific maven artifact identifier such as guava in all the dependency reports with:

grep -l "guava" `find ./ -name dependencies.txt`

Linkage checker analysis

You can use the following shell script to do this on your behalf (note that it will run the manual command below on your current workspace and also on HEAD):

/bin/bash sdks/java/build-tools/beam-linkage-check.sh "artifactId1,artifactId2,..."

You should copy and paste the output into the PR. If it is large, you may want to use a GitHub gist. Some example PRs (1, 2, 3, 4, 5).


Note that you can manually run the linkage checker on your current workspace by invoking:

./gradlew -Ppublishing -PjavaLinkageArtifactIds=artifactId1,artifactId2,... :checkJavaLinkage

Example output is:

Class org.brotli.dec.BrotliInputStream is not found;
  referenced by 1 class file
    org.apache.beam.repackaged.core.org.apache.commons.compress.compressors.brotli.BrotliCompressorInputStream (beam-sdks-java-core-2.20.0-SNAPSHOT.jar)
Class com.github.luben.zstd.ZstdInputStream is not found;
  referenced by 1 class file
    org.apache.beam.repackaged.core.org.apache.commons.compress.compressors.zstandard.ZstdCompressorInputStream (beam-sdks-java-core-2.20.0-SNAPSHOT.jar)
Class com.github.luben.zstd.ZstdOutputStream is not found;
  referenced by 1 class file
    org.apache.beam.repackaged.core.org.apache.commons.compress.compressors.zstandard.ZstdCompressorOutputStream (beam-sdks-java-core-2.20.0-SNAPSHOT.jar)
Class org.apache.beam.vendor.bytebuddy.v1_9_3.net.bytebuddy.jar.asm.commons.ModuleHashesAttribute is not found;
  referenced by 1 class file
    org.apache.beam.vendor.bytebuddy.v1_9_3.net.bytebuddy.jar.asm.commons.ClassRemapper (beam-vendor-bytebuddy-1_9_3-0.1.jar)

Run relevant Jenkins test suites

You can find all Jenkins job configurations within https://github.com/apache/beam/tree/master/.test-infra/jenkins and request that the reviewer run the relevant test suites by providing them with a list of all the relevant trigger phrases. You can make this request directly on your PR or on the dev mailing list (example).

