This is a curated list of demos that showcase Apache Kafka® stream processing on the Confluent Platform. Some demos run on local Confluent Platform installs (download Confluent Platform) and others run on Docker (install Docker and Docker Compose).
The best demo to start with is cp-demo, which spins up a Kafka event streaming application using KSQL for stream processing. cp-demo also comes with a playbook and video series, and it is a great configuration reference for Confluent Platform.
## Confluent Cloud

Demo | Local | Docker | Description |
---|---|---|---|
ACLs in Cloud | Y | N | ACLs in Confluent Cloud |
Clients to Cloud | Y | N | Client applications in different programming languages connecting to Confluent Cloud |
GCP pipeline | N | Y | Work with Confluent Cloud to build cool pipelines into Google Cloud Platform (GCP) |
Kinesis to Cloud | Y | N | AWS Kinesis -> Confluent Cloud -> Google Cloud Storage pipeline |
On-Prem Kafka to Cloud | Y | Y | On-prem Kafka cluster and Confluent Cloud cluster, and data copied between them with Confluent Replicator |
## Stream Processing

Demo | Local | Docker | Description |
---|---|---|---|
Clickstream | Y | Y | Automated version of the KSQL clickstream demo |
Kafka Tutorials | Y | Y | Collection of common event streaming use cases, with each tutorial featuring an example scenario and several complete code solutions |
KSQL UDF | Y | N | Advanced KSQL User-Defined Function (UDF) use case for connected cars |
KSQL workshop | N | Y | Showcases Kafka event stream processing using KSQL; can be run self-guided as a KSQL workshop |
Microservices ecosystem | Y | N | Microservices orders Demo Application integrated into the Confluent Platform |
Music demo | Y | Y | KSQL version of the Kafka Streams Demo Application |
## Data Pipelines

Demo | Local | Docker | Description |
---|---|---|---|
CDC with MySQL | N | Y | Self-paced steps to set up a change data capture (CDC) pipeline |
CDC with Postgres | N | Y | Enrich event stream data with CDC data from Postgres and then stream into Elasticsearch |
Connect and Kafka Streams | Y | N | Demonstrate various ways, with and without Kafka Connect, to get data into Kafka topics and then loaded for use by the Kafka Streams API |
MQTT | Y | N | Internet of Things (IoT) integration example using Apache Kafka + Kafka Connect + MQTT Connector + Sensor Data |
MySQL and Debezium | Y | Y | End-to-end streaming ETL with KSQL for stream processing using the Debezium Connector for MySQL |
Syslog | N | Y | Real-time syslog processing with Apache Kafka and KSQL: filtering logs, event-driven alerting, and enriching events |
## Confluent Platform

Demo | Local | Docker | Description |
---|---|---|---|
Avro | Y | N | Client applications using Avro and Confluent Schema Registry |
CP Demo | Y | Y | Confluent Platform demo with a playbook for Kafka event streaming ETL deployments |
Kubernetes | N | Y | Demonstrations of Confluent Platform deployments using the Confluent Operator |
Multi Datacenter | N | Y | Active-active multi-datacenter design with two instances of Confluent Replicator copying data bidirectionally between the datacenters |
Quickstart | Y | Y | Automated version of the Confluent Platform Quickstart |
Role-Based Access Control | Y | Y | Role-based Access Control (RBAC) provides granular privileges for users and service accounts |
Secret Protection | Y | Y | Secret Protection feature encrypts secrets in configuration files |
Replicator Security | N | Y | Demos of various security configurations supported by Confluent Replicator and examples of how to implement them |
As a next step, you may want to build your own custom demo or test environment. We have several resources that launch just the services in Confluent Platform with no pre-configured connectors, data sources, topics, schemas, etc. Using these as a foundation, you can then add any connectors or applications.
- cp-all-in-one: This Docker Compose file launches all services in Confluent Platform and runs them in containers on your local host.
- cp-all-in-one-cloud: Use this with your pre-configured Confluent Cloud instance. This Docker Compose file launches all services in Confluent Platform (except for the Kafka brokers), runs them in containers on your local host, and automatically configures them to connect to Confluent Cloud.
- Confluent CLI: For local, non-Docker installs of Confluent Platform. Using this CLI, you can launch all services in Confluent Platform with a single command, `confluent local start`, and they will all run on your local host.
- Generate test data: "Hello, World!" for launching Confluent Platform, plus different ways to generate more interesting test data for your topics.
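As one illustration of generating "more interesting" test data, here is a minimal Python sketch that produces synthetic JSON clickstream-style records. The field names and value ranges are illustrative assumptions, not taken from any of the demos; in practice you would hand these records to a Kafka producer or use a Kafka-native data generator instead.

```python
# Sketch only: synthetic clickstream-style test records.
# Field names (user_id, page, latency_ms) are assumptions for illustration.
import json
import random

def make_click(user_ids=("alice", "bob", "carol"),
               pages=("/", "/cart", "/checkout")):
    """Return one synthetic clickstream event as a dict."""
    return {
        "user_id": random.choice(user_ids),
        "page": random.choice(pages),
        "latency_ms": random.randint(5, 500),
    }

def make_batch(n=10):
    """Serialize n events as JSON strings, ready to produce to a topic."""
    return [json.dumps(make_click()) for _ in range(n)]

if __name__ == "__main__":
    for record in make_batch(3):
        print(record)
```

Each emitted line is a standalone JSON document, which maps naturally onto one Kafka message value per record.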
Additional documentation: Getting Started
For local installs:
- Confluent Platform 5.3
- Env var `CONFLUENT_HOME=/path/to/confluentplatform`
- Env var `PATH` includes `$CONFLUENT_HOME/bin`
- Each demo has its own set of prerequisites as well, documented in each demo's README
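The environment-variable prerequisites above can be set in a POSIX shell roughly like so (the Confluent Platform path is the same placeholder used above; substitute your actual install directory):

```shell
# Sketch: shell setup assumed by the local demos.
# /path/to/confluentplatform is a placeholder, not a real path.
export CONFLUENT_HOME=/path/to/confluentplatform
export PATH="$PATH:$CONFLUENT_HOME/bin"

# Quick sanity check that the bin directory is now on PATH:
echo "$PATH" | grep -q "confluentplatform/bin" && echo "PATH OK"
```

Adding these lines to your shell profile (e.g. `~/.bashrc`) makes the setting persistent across sessions.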
For Docker, the demos have been validated with:
- Docker version 17.06.1-ce
- Docker Compose version 1.14.0 with Docker Compose file format 2.1