Merge branch '2.5.0' into sink-connector-client-config-create
subkanthi authored Oct 25, 2024
2 parents 0f353c2 + 57c59fe commit 902ba0b
Showing 168 changed files with 9,498 additions and 1,317 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/docker-build.yml
@@ -99,7 +99,7 @@ jobs:

- name: Build Docker image (Lightweight)
run: |
docker build . --file sink-connector-lightweight/Dockerfile --tag altinityinfra/clickhouse-sink-connector:${{ env.IMAGE_TAG }}-lt
docker build . --file sink-connector-lightweight/Dockerfile --build-arg DOCKER_TAG=${{ env.IMAGE_TAG }} --tag altinityinfra/clickhouse-sink-connector:${{ env.IMAGE_TAG }}-lt
docker save altinityinfra/clickhouse-sink-connector:${{ env.IMAGE_TAG }}-lt | gzip > clickhouse-sink-connector_${{ env.IMAGE_TAG }}-lt.tar.gz
- name: Upload Docker tar (Lightweight)
43 changes: 41 additions & 2 deletions .github/workflows/testflows-sink-connector-kafka.yml
@@ -1,4 +1,5 @@
name: Kafka - TestFlows Tests
run-name: ${{ inputs.custom_run_name || 'Kafka - TestFlows Tests' }}

on:
workflow_call:
@@ -7,6 +8,14 @@ on:
description: "Kafka connector docker image"
required: true
type: string
package:
description: "Package either 'docker://' or 'https://'. Example: 'https://s3.amazonaws.com/clickhouse-builds/23.3/.../package_release/clickhouse-common-static_23.3.1.64_amd64.deb', or 'docker://altinity/clickhouse-server:23.8.8'"
type: string
default: docker://clickhouse/clickhouse-server:23.3
output_format:
description: "Testflows output style."
type: string
default: new-fails
secrets:
DOCKERHUB_USERNAME:
required: false
@@ -22,7 +31,37 @@ on:
description: "Kafka connector docker image"
required: true
type: string

package:
description: "Package either 'docker://' or 'https://'. Example: 'https://s3.amazonaws.com/clickhouse-builds/23.3/.../package_release/clickhouse-common-static_23.3.1.64_amd64.deb', or 'docker://altinity/clickhouse-server:23.8.8'"
type: string
default: docker://clickhouse/clickhouse-server:23.3
extra_args:
description: "Specific Suite To Run (Default * to run everything)."
required: false
type: string
custom_run_name:
description: 'Custom run name (optional)'
required: false
output_format:
description: "Testflows output style."
type: choice
options:
- new-fails
- nice-new-fails
- brisk-new-fails
- plain-new-fails
- pnice-new-fails
- classic
- nice
- fails
- slick
- brisk
- quiet
- short
- manual
- dots
- progress
- raw
env:
SINK_CONNECTOR_IMAGE: ${{ inputs.SINK_CONNECTOR_IMAGE }}

@@ -75,7 +114,7 @@ jobs:

- name: Run testflows tests
working-directory: sink-connector/tests/integration
run: python3 -u regression.py --only "/mysql to clickhouse replication/*" --clickhouse-binary-path=docker://clickhouse/clickhouse-server:22.8 --test-to-end -o classic --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="altinity/clickhouse-sink-connector:${SINK_CONNECTOR_VERSION}" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log
run: python3 -u regression.py --only "/regression/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}" --clickhouse-binary-path="${{inputs.package}}" --test-to-end --output ${{ inputs.output_format }} --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="altinity/clickhouse-sink-connector:${SINK_CONNECTOR_VERSION}" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log

- name: Create tfs results report
if: always()
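For reference, a caller workflow could forward the new `package` and `output_format` inputs when reusing this workflow via `workflow_call`. The caller name, trigger, and image tag below are illustrative only and are not part of this commit:

```yaml
# Hypothetical caller -- a minimal sketch, assuming the Kafka test workflow
# is referenced from the same repository.
name: Nightly Kafka sink connector tests
on:
  schedule:
    - cron: "0 2 * * *"

jobs:
  kafka-testflows:
    uses: ./.github/workflows/testflows-sink-connector-kafka.yml
    with:
      SINK_CONNECTOR_IMAGE: altinityinfra/clickhouse-sink-connector:latest  # illustrative tag
      package: docker://clickhouse/clickhouse-server:23.3
      output_format: new-fails
    secrets: inherit
```

Leaving `output_format` unset falls back to the `new-fails` default declared above.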
28 changes: 26 additions & 2 deletions .github/workflows/testflows-sink-connector-lightweight.yml
@@ -12,6 +12,10 @@ on:
description: "Package either 'docker://' or 'https://'. Example: 'https://s3.amazonaws.com/clickhouse-builds/23.3/.../package_release/clickhouse-common-static_23.3.1.64_amd64.deb', or 'docker://altinity/clickhouse-server:23.8.8'"
type: string
default: docker://clickhouse/clickhouse-server:23.3
output_format:
description: "Testflows output style."
type: string
default: new-fails
secrets:
DOCKERHUB_USERNAME:
required: false
@@ -38,6 +42,26 @@ on:
custom_run_name:
description: 'Custom run name (optional)'
required: false
output_format:
description: "Testflows output style."
type: choice
options:
- new-fails
- nice-new-fails
- brisk-new-fails
- plain-new-fails
- pnice-new-fails
- classic
- nice
- fails
- slick
- brisk
- quiet
- short
- manual
- dots
- progress
- raw

env:
SINK_CONNECTOR_IMAGE: ${{ inputs.SINK_CONNECTOR_IMAGE }}
@@ -91,7 +115,7 @@ jobs:

- name: Run testflows tests
working-directory: sink-connector-lightweight/tests/integration
run: python3 -u regression.py --only "/mysql to clickhouse replication/auto table creation/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}" --clickhouse-binary-path="${{inputs.package}}" --test-to-end -o classic --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="registry.gitlab.com/altinity-public/container-images/clickhouse_debezium_embedded:latest" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log
run: python3 -u regression.py --only "/mysql to clickhouse replication/auto table creation/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}" --clickhouse-binary-path="${{inputs.package}}" --test-to-end --output ${{ inputs.output_format }} --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="registry.gitlab.com/altinity-public/container-images/clickhouse_debezium_embedded:latest" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log

- name: Create tfs results report
if: always()
@@ -169,7 +193,7 @@ jobs:

- name: Run testflows tests
working-directory: sink-connector-lightweight/tests/integration
run: python3 -u regression.py --only "/mysql to clickhouse replication/auto replicated table creation/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}" --clickhouse-binary-path="${{inputs.package}}" --test-to-end -o classic --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="registry.gitlab.com/altinity-public/container-images/clickhouse_debezium_embedded:latest" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log
run: python3 -u regression.py --only "/mysql to clickhouse replication/auto replicated table creation/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}" --clickhouse-binary-path="${{inputs.package}}" --test-to-end --output ${{ inputs.output_format }} --collect-service-logs --attr project="${GITHUB_REPOSITORY}" project.id="$GITHUB_RUN_NUMBER" user.name="$GITHUB_ACTOR" github_actions_run="$GITHUB_SERVER_URL/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID" sink_version="registry.gitlab.com/altinity-public/container-images/clickhouse_debezium_embedded:latest" s3_url="https://altinity-test-reports.s3.amazonaws.com/index.html#altinity-sink-connector/testflows/${{ steps.date.outputs.date }}_${{github.run.number}}/" --log logs/raw.log

- name: Create tfs results report
if: always()
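The `${{ inputs.extra_args != '' && inputs.extra_args || '*' }}` expression in the run steps above falls back to `*` (all suites) when no suite is given. A minimal, self-contained sketch of that fallback pattern; the workflow name and echo step are hypothetical:

```yaml
# Illustration only: resolves the --only selector the same way the
# lightweight test jobs do, defaulting to '*' when extra_args is empty.
name: extra-args-fallback-sketch
on:
  workflow_dispatch:
    inputs:
      extra_args:
        description: "Specific suite to run (empty = run everything)"
        required: false
        type: string

jobs:
  show-selector:
    runs-on: ubuntu-latest
    steps:
      - name: Print resolved --only selector
        run: |
          echo "/mysql to clickhouse replication/auto table creation/${{ inputs.extra_args != '' && inputs.extra_args || '*' }}"
```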
9 changes: 6 additions & 3 deletions README.md
@@ -1,7 +1,7 @@
[![License](http://img.shields.io/:license-apache%202.0-brightgreen.svg)](http://www.apache.org/licenses/LICENSE-2.0.html)
[![Sink Connector(Kafka version) tests](https://github.com/Altinity/clickhouse-sink-connector/actions/workflows/sink-connector-kafka-tests.yml/badge.svg)](https://github.com/Altinity/clickhouse-sink-connector/actions/workflows/sink-connector-kafka-tests.yml)
[![Sink Connector(Light-weight) Tests](https://github.com/Altinity/clickhouse-sink-connector/actions/workflows/sink-connector-lightweight-tests.yml/badge.svg)](https://github.com/Altinity/clickhouse-sink-connector/actions/workflows/sink-connector-lightweight-tests.yml)
<a href="https://join.slack.com/t/altinitydbworkspace/shared_invite/zt-w6mpotc1-fTz9oYp0VM719DNye9UvrQ">
<a href="https://altinity.com/slack">
<img src="https://img.shields.io/static/v1?logo=slack&logoColor=959DA5&label=Slack&labelColor=333a41&message=join%20conversation&color=3AC358" alt="AltinityDB Slack" />
</a>
<img alt="Docker Pulls" src="https://img.shields.io/docker/pulls/altinityinfra/clickhouse-sink-connector">
@@ -16,7 +16,7 @@ for analysis.

## Features

* Initial data dump and load
* [Initial data dump and load (MySQL)](sink-connector/python/README.md)
* Change data capture of new transactions using [Debezium](https://debezium.io/)
* Automatic loading into ClickHouse
* Sources: Support for MySQL, PostgreSQL (other databases experimental)
@@ -55,9 +55,12 @@ First two are good tutorials on MySQL and PostgreSQL respectively.
* [ClickHouse Table Engine Types](doc/clickhouse_engines.md)
* [Troubleshooting](doc/Troubleshooting.md)
* [TimeZone and DATETIME/TIMESTAMP](doc/timezone.md)
* [Replication Start Position](doc/replication_start_position.md)
* [Logging](doc/logging.md)
* [Production Setup](doc/production_setup.md)
* [Adding new tables (Incremental Snapshot)](doc/incremental_snapshot.md)
* [Configuration](doc/configuration.md)
* [State Storage](doc/state_storage.md)

### Operations

@@ -100,6 +103,6 @@ to ClickHouse and analytic applications built on ClickHouse.
- [Official website](https://altinity.com/) - Get a high level overview of Altinity and our offerings.
- [Altinity.Cloud](https://altinity.com/cloud-database/) - Run ClickHouse in our cloud or yours.
- [Altinity Support](https://altinity.com/support/) - Get Enterprise-class support for ClickHouse and Sink Connector.
- [Slack](https://altinitydbworkspace.slack.com/join/shared_invite/zt-1togw9b4g-N0ZOXQyEyPCBh_7IEHUjdw#/shared-invite/email) - Talk directly with ClickHouse users and Altinity devs.
- [Slack](https://altinity.com/slack) - Talk directly with ClickHouse users and Altinity devs.
- [Contact us](https://hubs.la/Q020sH3Z0) - Contact Altinity with your questions or issues.
- [Free consultation](https://hubs.la/Q020sHkv0) - Get a free consultation with a ClickHouse expert today.
9 changes: 8 additions & 1 deletion doc/Monitoring.md
@@ -77,7 +77,14 @@ record_insert_seq:
```select event_time, database, table, rows, duration_ms,size_in_bytes from system.part_log where table='table' and event_type='NewPart' and event_time > now () - interval 30 minute and database='db' ;```

## Grafana Dashboard
JMX metrics of sink connector are exposed through the port
JMX metrics of the sink connector are exposed through the port (default: 8084).
If you need to override the port, add `metrics.port` to config.yml:

```
metrics.enable: "true"
metrics.port: 8085
```


The JMX_exporter docker image scrapes the JMX metrics from the sink connector
The metrics can be read through the following URL
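If these metrics are collected with Prometheus, a scrape job could point at the exporter endpoint. A hypothetical scrape configuration, assuming the port configured in the snippet above and a locally reachable exporter:

```yaml
# Sketch only -- the target host and port depend on the actual deployment.
scrape_configs:
  - job_name: clickhouse-sink-connector
    scrape_interval: 30s
    static_configs:
      - targets: ["localhost:8085"]
```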
3 changes: 3 additions & 0 deletions doc/Troubleshooting.md
@@ -55,3 +55,6 @@ https://stackoverflow.com/questions/63523998/multiple-debezium-connector-for-one-database

### PostgreSQL - ERROR - Error starting connector io.debezium.DebeziumException: Creation of replication slot failed; when setting up multiple connectors for the same database host, please make sure to use a distinct replication slot name for each.
Make sure to add `slot.name` to the configuration (config.yml) and change it to a unique name.
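As an illustration, a config.yml fragment with a distinct slot name might look like the following; the key comes from the note above, the value is arbitrary:

```yaml
# Hypothetical fragment -- each connector against the same PostgreSQL host
# must use a different replication slot name.
slot.name: "sink_connector_slot_1"
```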

### PostgreSQL (WAL size growing)
[Handling PostgreSQL WAL Growth with Debezium Connectors](doc/postgres_wal_growth.md)