An end-to-end workflow for processing streaming data on Azure.

# stream-iot

Welcome to stream-iot! This project contains an example of an end-to-end workflow for processing streaming data. The workflow consists of mocking sensor data, channeling it through Kafka, and then storing the parsed data in a MongoDB database.
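The first step of the workflow, mocking sensor data, could be sketched roughly as below. The payload fields, topic name, and broker address are illustrative assumptions rather than the project's actual schema, and the kafka-python producer call is commented out so the snippet runs without a broker:

```python
import json
import random
from datetime import datetime, timezone

def mock_sensor_reading(sensor_id: str) -> dict:
    """Generate one fake sensor payload (fields are assumptions, not the project's schema)."""
    return {
        "sensor_id": sensor_id,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "temperature": round(random.uniform(15.0, 30.0), 2),
        "humidity": round(random.uniform(30.0, 70.0), 2),
    }

# Serialize a reading the way it would be sent over the wire
reading = mock_sensor_reading("sensor-001")
message = json.dumps(reading).encode("utf-8")

# With a broker available, the message could be published with kafka-python, e.g.:
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092")
# producer.send("sensor-data", message)  # "sensor-data" is a hypothetical topic name
print(message.decode("utf-8"))
```

In a real run, a loop would emit readings at a fixed interval; the shape of each message is the part that matters for the downstream parser.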

I started this project as a way to learn and get some initial hands-on experience with Kafka. I hope it can help others get started on similar projects. At the same time, I am sure I can learn from others who view this project. If you have feedback or improvement suggestions, please create an issue in this repository.
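The parse-and-store half of the workflow, turning raw Kafka messages into documents in MongoDB, might be sketched like this. The field names and the database/collection names are assumptions, and the pymongo insert is commented out so the snippet runs without a database:

```python
import json
from datetime import datetime

def parse_message(raw: bytes) -> dict:
    """Turn a raw Kafka message into a MongoDB-ready document.
    Field names are illustrative assumptions, not the project's schema."""
    payload = json.loads(raw.decode("utf-8"))
    return {
        "sensor_id": payload["sensor_id"],
        # Store the timestamp as a native datetime so MongoDB can index and range-query it
        "timestamp": datetime.fromisoformat(payload["timestamp"]),
        "temperature": float(payload["temperature"]),
        "humidity": float(payload["humidity"]),
    }

raw = b'{"sensor_id": "sensor-001", "timestamp": "2024-01-01T12:00:00+00:00", "temperature": 21.5, "humidity": 55.0}'
doc = parse_message(raw)

# With a MongoDB instance available, the document could be stored with pymongo, e.g.:
# from pymongo import MongoClient
# client = MongoClient("mongodb://localhost:27017")
# client["iot"]["readings"].insert_one(doc)  # database/collection names are hypothetical
print(doc["sensor_id"], doc["temperature"])
```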

## Tools & Technologies

## Architecture

*(Architecture diagram)*

## Prerequisites

## Installation

Follow the installation instructions in the READMEs of these directories, in order:

When this is done, open the Airflow UI with

```shell
kubectl port-forward svc/airflow-webserver 8080:8080 -n airflow
```

and trigger the DAGs manually.
