
Automated E2E Tests

We recently created a first prototype for E2E tests, with the goal of introducing more automated tests into StreamPipes. On this page we collect structured test procedures that we will automate in the future. Each description should be detailed enough to implement the test directly (e.g. necessary dependencies, data, expected results, ...).

Sinks → Adapters

e.g.

  • Kafka: Create Adapter → Pipeline (Source → Kafka Sink) → Create Kafka Adapter (validate event schema)
Kafka
  • Precondition:
    • None
    • Use StreamPipes internal Kafka
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → Kafka Sink)
      • Configure Sink (Selector: kafka_publisher ) 
        • Type: input Selector: topic Value: #testname
        • Type: input Selector: host Value: kafka
        • Type: input Selector: port Value: 9092
        • Type: radio Selector: access-mode Value: Unauthenticated
  • Test:
    • Create Kafka Adapter
      • (Selector: Apache_Kafka ) 
        • Type: input Selector: host Value: kafka
        • Type: input Selector: port Value: 9092
        • Type: radio Selector: access-mode Value: Unauthenticated
        • Type: radio Selector: topic Value: #testname
      • Format: json
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: event contains 7 properties
  • Clean Up
    • Delete pipeline
    • Delete both adapters:
      • Kafka
      • Machine Data Simulator
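The Kafka round trip above can be sketched as a data-driven configuration, so the sink and adapter settings used in the test cannot drift apart. All identifiers below (`KafkaConfig`, `testName`, `isRoundTrip`) are hypothetical test helpers, not StreamPipes APIs:

```typescript
// Sketch only: models the Kafka round-trip configuration as data.
interface KafkaConfig {
  host: string;
  port: number;
  topic: string;
  accessMode: "Unauthenticated" | "Authenticated";
}

const testName = "kafkaE2eTest"; // stands in for #testname

// Configuration for the Kafka sink (selector: kafka_publisher)
const sinkConfig: KafkaConfig = {
  host: "kafka",
  port: 9092,
  topic: testName,
  accessMode: "Unauthenticated",
};

// Configuration for the Kafka adapter (selector: Apache_Kafka)
const adapterConfig: KafkaConfig = { ...sinkConfig };

// The round trip only succeeds if sink and adapter use the same broker/topic.
function isRoundTrip(sink: KafkaConfig, adapter: KafkaConfig): boolean {
  return (
    sink.host === adapter.host &&
    sink.port === adapter.port &&
    sink.topic === adapter.topic
  );
}

console.log(isRoundTrip(sinkConfig, adapterConfig)); // prints "true"
```

Deriving the adapter configuration from the sink configuration (rather than typing both) is one way to keep the two test halves consistent.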
MQTT
  • Precondition:
    • None
    • Use StreamPipes internal ActiveMQ
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → MQTT Sink)
      • Configure Sink (Selector: mqtt_publisher ) 
        • Type: input Selector: topic Value: #testname
        • Type: input Selector: host Value: activemq
        • Type: input Selector: port Value: 1883
  • Test:
    • Create MQTT Adapter
      • (Selector: MQTT ) 
        • Type: input Selector: broker_url Value: tcp://activemq:1883
        • Type: input Selector: topic Value: #testname
        • Type: radio Selector: access-mode Value: Unauthenticated
      • Format: json
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: event contains 7 properties
  • Clean Up
    • Delete pipeline
    • Delete both adapters:
      • MQTT
      • Machine Data Simulator
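Note that the MQTT sink is configured with separate host/port values, while the MQTT adapter expects a single broker URL. A small helper can keep the two representations in sync; `toBrokerUrl` is a hypothetical test utility, not a StreamPipes API:

```typescript
// Sketch: derive the adapter's broker_url from the sink's host/port settings.
function toBrokerUrl(host: string, port: number): string {
  return `tcp://${host}:${port}`;
}

// Sink side (selector: mqtt_publisher): host "activemq", port 1883
// Adapter side (selector: MQTT): broker_url derived from the same values
const brokerUrl = toBrokerUrl("activemq", 1883);
console.log(brokerUrl); // prints "tcp://activemq:1883"
```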
InfluxDB
  • Precondition:
    • None
    • Use StreamPipes internal InfluxDB
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → Influx Sink)
      • Configure Sink (Selector: InfluxDB ) 
        • Type: input Selector: db_host Value: http://influxdb
        • Type: input Selector: db_port Value: 8086
        • Type: input Selector: db_name Value: sp
        • Type: input Selector: db_measurement Value: #testname
        • Type: input Selector: db_user Value: sp
        • Type: input Selector: db_password Value: default
        • Type: input Selector: batch_interval_actions Value: 2
        • Type: input Selector: max_flush_duration Value: 500
        • Type: drop-down Selector: timestamp_mapping Value: timestamp
  • Test:
    • Create InfluxDB Adapter
      • (Selector: InfluxDB_Stream_Adapter ) 
        • Type: input Selector: influxDbHost Value: http://influxdb
        • Type: input Selector: influxDbPort Value: 8086
        • Type: input Selector: influxDbDatabase Value: sp
        • Type: input Selector: influxDbMeasurement Value: #testname
        • Type: input Selector: influxDbUsername Value: sp
        • Type: input Selector: influxDbPassword Value: default
        • Type: input Selector: pollingInterval Value: 200
        • Type: radio Selector: replaceNullValues Value: Yes
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: event contains 7 properties
  • Clean Up
    • Delete pipeline
    • Delete both adapters:
      • InfluxDB
      • Machine Data Simulator
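The live-preview check "event contains 7 properties" could be sketched as below. The property names are assumptions about the Machine Data Simulator's flow-rate stream and may differ in an actual installation; only the property count is asserted:

```typescript
// Sketch of the live-preview validation: count the properties of one event.
// The field names below are assumed for the flow-rate simulator stream.
const previewEvent: Record<string, unknown> = {
  timestamp: 1654086000000,
  sensorId: "flowrate01",
  mass_flow: 3.1,
  volume_flow: 2.7,
  density: 47.2,
  temperature: 44.5,
  sensor_fault_flags: false,
};

function countEventProperties(event: Record<string, unknown>): number {
  return Object.keys(event).length;
}

console.log(countEventProperties(previewEvent)); // prints 7
```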
MySQL
  • Precondition:
    • Install a MySQL database
  • Before:
    • Create Adapter (Machine Data Simulator / Flow Rate)
    • Create Pipeline (Machine Data Simulator → MySQL Sink)
      • Configure Sink (Selector: mysql_database ) 
        • Type: input Selector: host Value: mysql
        • Type: input Selector: port Value: 3306
        • Type: input Selector: user Value: user1
        • Type: input Selector: password Value: uYaF8c5bxqaHpEEi
        • Type: input Selector: db Value: sp
        • Type: input Selector: table Value: #testname
  • Test:
    • Create MySQL Adapter (TBD)
      • (Selector: MySQL ) 
        • Type: input Selector: mysqlHost Value: mysql
        • Type: input Selector: mysqlPort Value: 3306
        • Type: input Selector: mysqlUser Value: user1
        • Type: input Selector: mysqlPassword Value: uYaF8c5bxqaHpEEi
        • Type: input Selector: mysqlDatabase Value: sp
        • Type: input Selector: mySQLTable Value: #testname
        • Type: radio Selector: replaceNullValues Value: Yes
      • Mark as timestamp: timestamp
      • Name: #testname
    • Validate in live preview: event contains 7 properties
  • Clean Up
    • Delete pipeline
    • Delete both adapters:
      • MySQL
      • Machine Data Simulator
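All four round trips above share the same skeleton (create simulator adapter → pipeline with sink → create adapter reading the sink's output → validate → clean up), so a parameterized runner could avoid duplicating the steps. Everything below is a hypothetical harness sketch, not a StreamPipes API:

```typescript
// Sketch: one spec per sink/adapter round trip, executed by a shared runner.
interface RoundTripSpec {
  name: string;                         // used as #testname
  sinkSelector: string;                 // e.g. "kafka_publisher"
  adapterSelector: string;              // e.g. "Apache_Kafka"
  sinkSettings: Record<string, string>;
  adapterSettings: Record<string, string>;
  expectedProperties: number;           // live-preview check
}

const specs: RoundTripSpec[] = [
  {
    name: "kafkaRoundTrip",
    sinkSelector: "kafka_publisher",
    adapterSelector: "Apache_Kafka",
    sinkSettings: { host: "kafka", port: "9092" },
    adapterSettings: { host: "kafka", port: "9092" },
    expectedProperties: 7,
  },
  {
    name: "mqttRoundTrip",
    sinkSelector: "mqtt_publisher",
    adapterSelector: "MQTT",
    sinkSettings: { host: "activemq", port: "1883" },
    adapterSettings: { broker_url: "tcp://activemq:1883" },
    expectedProperties: 7,
  },
];

// Placeholder for the real E2E steps (create adapter, pipeline, validate,
// clean up); here it only checks the expected live-preview property count.
function runRoundTrip(spec: RoundTripSpec): boolean {
  return spec.expectedProperties === 7;
}

console.log(specs.every(runRoundTrip)); // prints "true"
```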

Connect

  • File Set / Stream
  • HTTP Set / Stream
  • Image Set / Stream
  • ISS Location (Can be used to test Dashboard map)
  • Machine Data Simulator
  • OPC UA
  • PLC adapters?
  • Random Data Set / Stream
  • ROS
  • Slack?
  • HTTP Server
Formats
  • XML
  • JSON
    • Array With Key
    • Array No Key
    • Object
  • GeoJson
  • CSV
  • Image
Preprocessing Rules
  • Schema Rules
    • Add fixed property
    • Add timestamp
    • Rename
    • Add nested
    • Move
    • Delete
  • Value Rules
    • Number transformation
    • Unit transformation
    • Privacy transformation
    • Timestamp transformation
      • UNIX timestamp sec
      • REGEX
  • Stream Rules
    • Aggregation
    • Remove Duplicates
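To make the expected effect of the schema rules concrete, three of them (rename, delete, add fixed property) can be sketched as functions over a sample event. The rule shapes and field names below are illustrative, not the StreamPipes rule model:

```typescript
// Sketch of three schema rules applied to a sample event.
type SpEvent = Record<string, unknown>;

function renameProperty(event: SpEvent, from: string, to: string): SpEvent {
  const { [from]: value, ...rest } = event;
  return { ...rest, [to]: value };
}

function deleteProperty(event: SpEvent, key: string): SpEvent {
  const { [key]: _removed, ...rest } = event;
  return rest;
}

function addFixedProperty(event: SpEvent, key: string, value: unknown): SpEvent {
  return { ...event, [key]: value };
}

let event: SpEvent = { temp: 21.5, ts: 1654086000000 };
event = renameProperty(event, "temp", "temperature"); // Rename rule
event = deleteProperty(event, "ts");                  // Delete rule
event = addFixedProperty(event, "sensorId", "s01");   // Add fixed property rule

console.log(event); // { temperature: 21.5, sensorId: "s01" }
```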

Pipeline Elements

Dashboard

Data Explorer

  • Wait until the refactoring is finished

Notifications

File Management
