...

Consideration #1: How Much Precision?

The first question to answer is: how much precision do we need?

Since ProducerPerformance currently uses millisecond precision, we will determine the spread using latency values close to 1.0 ms, where the relative effect of rounding is greatest.

The examples below use two recorded values that differ by only 1 in the least significant (rightmost) digit. A square bracket denotes an inclusive bound, and a parenthesis denotes an exclusive bound.

NOTE: When calculating the spread for two numbers with a larger difference, the minimum spread will be higher (it will not be zero).

Using scale = 1 (1 decimal place)

If we record a latency of 1.1 ms for the first run and 1.2 ms for the second run, then the true values lie in the following ranges:

  • 1.1 = [1.05, 1.15)
  • 1.2 = [1.15, 1.25)

Min spread: 1.15 - (1.15) = ~0

Max spread: (1.25) - 1.05 = ~0.2

Conclusion: Benchmark 2 is 0% to ~20% slower than benchmark 1.

Using scale = 2 (2 decimal places)

If we record a latency of 1.01 ms for the first run and 1.02 ms for the second run, then the true values lie in the following ranges:

  • 1.01 = [1.005, 1.015)
  • 1.02 = [1.015, 1.025)

Min spread: 1.015 - (1.015) = ~0

Max spread: (1.025) - 1.005 = ~0.02

Conclusion: Benchmark 2 is 0% to ~2% slower than benchmark 1.

Using scale = 3 (3 decimal places)

If we record a latency of 1.001 ms for the first run and 1.002 ms for the second run, then the true values lie in the following ranges:

  • 1.001 = [1.0005, 1.0015)
  • 1.002 = [1.0015, 1.0025)

Min spread: 1.0015 - (1.0015) = ~0

Max spread: (1.0025) - 1.0005 = ~0.002

Conclusion: Benchmark 2 is 0% to ~0.2% slower than benchmark 1.
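
The three calculations above can be reproduced programmatically. The sketch below assumes recorded values were rounded half-up to the given scale, as in the examples; the class and method names are illustrative only and are not part of ProducerPerformance.

import java.math.BigDecimal;

// Illustrative sketch: reproduces the min/max spread calculations above,
// assuming recorded values were rounded half-up to the given scale.
public class SpreadCalculator {

    static void printSpread(double recorded1, double recorded2, int scale) {
        // Half of one unit in the last place, e.g. 0.05 for scale = 1.
        BigDecimal half = BigDecimal.ONE.movePointLeft(scale).divide(BigDecimal.valueOf(2));

        BigDecimal r1 = BigDecimal.valueOf(recorded1);
        BigDecimal r2 = BigDecimal.valueOf(recorded2);

        // True value of run 1 lies in [r1 - half, r1 + half); run 2 in [r2 - half, r2 + half).
        // Smallest possible difference: run 2 at its lower bound, run 1 just below its (exclusive) upper bound.
        BigDecimal minSpread = r2.subtract(half).subtract(r1.add(half));
        // Largest possible difference: run 2 just below its (exclusive) upper bound, run 1 at its lower bound.
        BigDecimal maxSpread = r2.add(half).subtract(r1.subtract(half));

        System.out.printf("scale=%d: min spread ~%s ms, max spread ~%s ms%n",
                scale, minSpread.toPlainString(), maxSpread.toPlainString());
    }

    public static void main(String[] args) {
        printSpread(1.1, 1.2, 1);     // ~0 to ~0.2 ms   -> 0% to ~20% slower
        printSpread(1.01, 1.02, 2);   // ~0 to ~0.02 ms  -> 0% to ~2% slower
        printSpread(1.001, 1.002, 3); // ~0 to ~0.002 ms -> 0% to ~0.2% slower
    }
}

Running this prints spreads of ~0 to ~0.2 ms, ~0 to ~0.02 ms, and ~0 to ~0.002 ms for scales 1, 2, and 3 respectively, matching the conclusions above.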


By using 3 decimal places in millisecond latency measurements, we can keep the spread introduced by rounding to ~0.2% in a low-latency environment dealing with latencies around 1 ms. Three decimal places of a millisecond is equivalent to microsecond precision. A 0.2% spread is quite small, especially since other factors are likely to generate a larger variance.
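
For reference, one minimal way to capture latency at that resolution in Java is to time with System.nanoTime() and keep three decimal places when converting to milliseconds. This is only a sketch of the idea, not ProducerPerformance's actual timing code.

import java.util.concurrent.TimeUnit;

// Minimal sketch: record a duration with the nanosecond-resolution clock and report it
// as milliseconds with 3 decimal places (i.e. microsecond precision).
public class MicrosecondLatencyExample {
    public static void main(String[] args) throws InterruptedException {
        long startNs = System.nanoTime();
        Thread.sleep(1); // stand-in for sending a record and waiting for the ack
        long elapsedNs = System.nanoTime() - startNs;

        // Truncate to whole microseconds, then express as milliseconds with 3 decimal places.
        double latencyMs = TimeUnit.NANOSECONDS.toMicros(elapsedNs) / 1000.0;
        System.out.printf("latency = %.3f ms%n", latencyMs);
    }
}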


Compatibility, Deprecation, and Migration Plan

...