This repository contains examples of use cases built with the Decodable streaming platform.
| Example | Description |
|---|---|
| AsyncAPI | Publishing data products with AsyncAPI. |
| Opinionated Data Pipelines | Building data pipelines with schema-on-write streams. |
| Postman | Building data pipelines with Postman. |
| Change Streams | Using change streams to build materialized views in Postgres. |
| XML Processing | Parsing XML and transforming it to JSON. |
| OSQuery Routing | Routing osquery logs with SQL. |
| Masking | Techniques for masking data. |
| Apache Pinot | Transforming osquery logs and sending them to Apache Pinot and Superset. |
| Apache Druid | Sending COVID-19 data to Decodable via its REST API, cleansing it with Decodable SQL, and writing the results to a Kafka sink. |
| Rockset | Capturing and streaming data with a cloud MQTT broker and AWS Kinesis; Decodable prepares and aggregates the data before it reaches Rockset, a real-time analytical database. |
| Tinybird | Writing data to Tinybird and building a simple real-time web application. |
| Apache Kafka | Installing Apache Kafka on EC2 and writing to S3 with Decodable. |
| Apache Kafka mTLS | Installing Apache Kafka on EC2, configuring it with mTLS, and configuring Decodable to read from it. |