...

We recently created a first prototype for E2E tests, with the goal of introducing more automated tests into StreamPipes. On this page we collect structured test procedures, which we will automate in the future. The test descriptions should be detailed enough that we can implement the tests directly (e.g. necessary dependencies, data, expected results ...).

...

Sinks - Adapter

e.g.

  • Kafka: Create Adapter → Pipeline (Source → Kafka Sink) → Create Kafka Adapter (validate event schema)
  • MQTT: Create Adapter → Pipeline (Source → MQTT Sink) → Create MQTT Adapter (validate event schema)
  • MySQL: Create Adapter → Pipeline (Source → MySQL Sink) → Create MySQL Adapter (validate event schema)
  • InfluxDB: Create Adapter → Pipeline (Source → InfluxDB Sink) → Create InfluxDB Adapter (validate event schema)
  • Open question: how do we deal with data set adapters (e.g. for databases)?
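Each of these round trips has the same shape — configure a sink, then configure the matching read-back adapter and validate the event schema — so the procedures below could be expressed as data. A minimal sketch in TypeScript; the `SinkRoundTrip` type and its field names are illustrative assumptions, not the actual StreamPipes test API:

```typescript
// Illustrative model of one sink round-trip test (assumed shape,
// not the real StreamPipes E2E utilities).
interface FieldInput {
  type: 'input' | 'radio' | 'drop-down'; // widget type in the UI
  selector: string;                      // test selector of the field
  value: string;                         // value to enter or choose
}

interface SinkRoundTrip {
  sinkName: string;                // sink under test, e.g. 'MQTT Sink'
  sinkFields: FieldInput[];        // configuration of the sink
  adapterName: string;             // adapter that reads the sink back
  adapterFields: FieldInput[];     // configuration of that adapter
  expectedEventProperties: number; // properties expected in live preview
}

// Example instance built from the MQTT procedure described on this page;
// '#testname' is the placeholder used throughout these tables.
const mqttRoundTrip: SinkRoundTrip = {
  sinkName: 'MQTT Sink',
  sinkFields: [
    { type: 'input', selector: 'topic', value: '#testname' },
    { type: 'input', selector: 'host', value: 'activemq' },
    { type: 'input', selector: 'port', value: '1883' },
  ],
  adapterName: 'MQTT',
  adapterFields: [
    { type: 'input', selector: 'broker_url', value: 'tcp://activemq:1883' },
    { type: 'input', selector: 'topic', value: '#testname' },
    { type: 'radio', selector: 'access-mode', value: 'Unauthenticated' },
  ],
  expectedEventProperties: 7,
};

console.log(mqttRoundTrip.sinkFields.length, mqttRoundTrip.adapterFields.length);
```

A generic test runner would then only need one implementation of the round trip, parameterized by such an object per sink technology.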

...

  • Precondition:
    • None (use the StreamPipes-internal ActiveMQ)
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → MQTT Sink)
      • Configure MQTT Sink (Selector: mqtt_publisher)
        • Type: input Selector: topic Value: #testname
        • Type: input Selector: host Value: activemq
        • Type: input Selector: port Value: 1883
  • Test:
    • Create MQTT Adapter
      • (Selector: MQTT)
        • Type: input Selector: broker_url Value: tcp://activemq:1883
        • Type: input Selector: topic Value: #testname
        • Type: radio Selector: access-mode Value: Unauthenticated
      • Format: json
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: Contain 7 event properties
  • Clean Up
    • Delete pipeline
    • Delete the two adapters:
      • MQTT
      • Machine Data Simulator
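The #testname placeholder in the steps above has to be replaced with the concrete name of the running test before the values are entered into the UI. A minimal sketch of such a substitution helper; the function name is an assumption, not part of StreamPipes:

```typescript
// Replace every occurrence of the '#testname' placeholder used in
// these test descriptions with the name of the current test.
function resolvePlaceholders(value: string, testName: string): string {
  return value.split('#testname').join(testName);
}

// Example: values from the MQTT adapter configuration above.
const topic = resolvePlaceholders('#testname', 'mqtt-roundtrip');
const brokerUrl = resolvePlaceholders('tcp://activemq:1883', 'mqtt-roundtrip');

console.log(topic);     // mqtt-roundtrip
console.log(brokerUrl); // tcp://activemq:1883
```

Deriving the topic/measurement name from the test name keeps parallel test runs from writing into each other's topics.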
InfluxDB
  • Precondition:
    • None (use the StreamPipes-internal InfluxDB)
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → InfluxDB Sink)
      • Configure InfluxDB Sink (Selector: InfluxDB Stream Adapter)
        • Type: input Selector: db_host Value: http://influxdb
        • Type: input Selector: db_port Value: 8086
        • Type: input Selector: db_name Value: sp
        • Type: input Selector: db_measurement Value: #testname
        • Type: input Selector: db_user Value: sp
        • Type: input Selector: db_password Value: default
        • Type: input Selector: batch_interval_actions Value: 2
        • Type: input Selector: max_flush_duration Value: 500
        • Type: drop-down Selector: timestamp_mapping Value: timestamp
  • Test:
    • Create InfluxDB Adapter
      • (Selector: InfluxDB Stream Adapter)
        • Type: input Selector: influxDbHost Value: http://influxdb
        • Type: input Selector: influxDbPort Value: 8086
        • Type: input Selector: influxDbDatabase Value: sp
        • Type: input Selector: influxDbMeasurement Value: #testname
        • Type: input Selector: influxDbUsername Value: sp
        • Type: input Selector: influxDbPassword Value: default
        • Type: input Selector: pollingInterval Value: 200
        • Type: radio Selector: replaceNullValues Value: Yes
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: Contain 7 event properties
  • Clean Up
    • Delete pipeline
    • Delete the two adapters:
      • InfluxDB
      • Machine Data Simulator
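Both procedures follow the same Before → Test → Clean Up structure, and the clean-up (deleting the pipeline and both adapters) must run even when the validation step fails, so leftover pipelines do not break subsequent tests. A minimal sketch of such a phase runner; the function and its name are illustrative, not the actual test infrastructure:

```typescript
// Run one procedure in Before → Test → Clean Up order; the finally
// block guarantees clean-up even if the test step throws.
function runProcedure(
  before: () => void,
  test: () => void,
  cleanUp: () => void,
): void {
  before();
  try {
    test();
  } finally {
    cleanUp();
  }
}

// Example with logging stand-ins for the InfluxDB steps above.
const phases: string[] = [];
runProcedure(
  () => phases.push('create adapter and pipeline'),
  () => phases.push('validate 7 event properties in live preview'),
  () => phases.push('delete pipeline and both adapters'),
);
console.log(phases.join(' | '));
```

In a real framework the three callbacks would map onto its lifecycle hooks (e.g. before/after hooks), with the clean-up registered so that it always runs.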

Connect

...