Topic
Confluent Cloud on Azure
Abstract/Learning Objectives
Apache Kafka has a diverse ecosystem that simplifies the storage, analysis, and processing of data in motion. Confluent Cloud on Azure is a cloud-native data streaming service designed to act as the connective tissue that lets real-time data from multiple sources flow continuously across your organization.
The objective of the hack is to demonstrate how to build event-driven services and applications at any scale with the fully integrated Confluent Cloud on Azure offering.
The hack provides an overview of core Kafka ecosystem components, including brokers, producers, consumers, connectors, and the schema registry, and uses self-paced challenges to reinforce understanding of these concepts.
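For orientation, the snippet below is a minimal sketch of one of those ecosystem pieces: a Kafka producer publishing a single event to a Confluent Cloud topic over SASL_SSL. The cluster endpoint, API key/secret, and the `orders` topic are placeholders, not values from this hack; substitute the settings from your own Confluent Cloud on Azure environment.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class QuickstartProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder Confluent Cloud bootstrap endpoint and API credentials.
        props.put("bootstrap.servers", "pkc-xxxxx.westus2.azure.confluent.cloud:9092");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"<API_KEY>\" password=\"<API_SECRET>\";");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Send one keyed event to the "orders" topic; the broker persists and
        // replicates it, and any subscribed consumer group can then process it.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-1001", "{\"status\":\"created\"}"));
            producer.flush();
        }
    }
}
```

The hack's challenges build on this same producer/consumer pattern, layering in connectors and the schema registry as the scenarios progress.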
Delivery Date
2023/04/27
Authors
JakeBogie
Other
No response
Code of Conduct
I agree to follow this project's Code of Conduct