...
- In Memory Data Stream
Since SEP-8, users can produce to and consume from in-memory system partitions. Because the in-memory data system does not persist data, it removes the need to serialize/deserialize messages. We take advantage of this and provide succinct Stream classes to serve as input data sources:
- Collection Stream
Users can plug in a collection (either a List or a Map) to create an in-memory input stream.
e.g: CollectionStream.of(...,{1,2,3,4})
- Event Builder Stream
The event builder helps users mimic the runtime Samza processing environment in their tests, for example by injecting an exception into the stream or advancing time for window functions.
- File Stream
Users can create an input stream that reads from and writes to a local file.
e.g: FileStream.of("/path/to/file")
- Local Kafka Stream
Users can also consume bounded streams from a Kafka topic, which serves as initial input. Samza already provides APIs to consume from and produce to Kafka. Kafka streams require serde configuration.
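The serde-free property described above can be illustrated with a minimal sketch. This is a hypothetical stand-in, not the Samza API: an in-memory partition that hands consumers the same object references producers wrote, so no serialization or deserialization step is involved.

```java
import java.util.ArrayDeque;
import java.util.Queue;

// Hypothetical sketch (not the Samza API): an in-memory partition that
// buffers message objects directly, so producing and consuming never
// passes through a serde.
class InMemoryPartition<T> {
    private final Queue<T> buffer = new ArrayDeque<>();

    // Producer side: enqueue the message object as-is, no serialization.
    void produce(T message) {
        buffer.add(message);
    }

    // Consumer side: dequeue the same object, no deserialization;
    // returns null when the partition is drained.
    T consume() {
        return buffer.poll();
    }
}
```

Because messages never leave the JVM, tests can stream arbitrary objects without declaring serde classes, which is what keeps stream setup as terse as `CollectionStream.of(...)`.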
...
Traditionally, users must set up configuration for any Samza job. For test purposes, we set up the basic config boilerplate for them, while still providing a flexible option to add custom config (rarely needed). The API exposes functions to configure single-container or multi-container mode (using ZooKeeper), as well as APIs to configure the concurrency semantics of the job.
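The boilerplate-plus-overrides approach above can be sketched as follows. The class and method names are illustrative assumptions, not the actual test framework API, though `job.name` and `job.coordinator.factory` are real Samza config keys.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch (names are illustrative, not the Samza API):
// the framework pre-populates boilerplate config for a test job and
// lets the user layer custom overrides on top.
class TestJobConfig {
    static Map<String, String> defaults(String jobName) {
        Map<String, String> config = new HashMap<>();
        config.put("job.name", jobName);
        // Single-container mode by default; multi-container mode would
        // switch this to a ZooKeeper-based coordinator instead.
        config.put("job.coordinator.factory", "PassthroughJobCoordinatorFactory");
        return config;
    }

    // Custom entries (rarely needed) override the boilerplate defaults.
    static Map<String, String> withOverrides(String jobName, Map<String, String> custom) {
        Map<String, String> config = defaults(jobName);
        config.putAll(custom);
        return config;
    }
}
```

A test would typically call `withOverrides` only when it needs to deviate from the defaults, e.g. to tune concurrency for a specific scenario.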
Public Interfaces
...