[Kafka Connector]Fix BadRequest exception when using serverless database account #43125
Issue
When a customer uses an event stream in Fabric with a serverless Cosmos DB database account, the connector fails with:
Setting offer throughput or autopilot on container is not supported for serverless accounts
Root cause
When using an event stream in Fabric with the Kafka connector, azure.cosmos.source.metadata.storage.type defaults to Cosmos. Internally, if the metadata container does not exist, the Kafka connector tries to create one with autoscale throughput of 4000 RU. However, creating a container with throughput (whether manual or autoscale) is not supported on serverless accounts, hence the error.
Fix
When the Kafka connector tries to create the metadata container and the request fails with a 400 BadRequest, it now retries creating the metadata container without any throughput configured.
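The fallback described above can be sketched as follows. This is a minimal, self-contained illustration of the retry logic, not the connector's actual code: the names `MetadataContainerFactory`, `ContainerClient`, `CosmosRequestException`, and `createMetadataContainer` are hypothetical stand-ins for the real Cosmos SDK calls.

```java
import java.util.Optional;

public class MetadataContainerFactory {

    /** Hypothetical stand-in for a Cosmos "create container" call. */
    interface ContainerClient {
        // autoscaleMaxRu is empty when no throughput should be configured
        void createContainer(String name, Optional<Integer> autoscaleMaxRu)
                throws CosmosRequestException;
    }

    /** Hypothetical stand-in for the SDK's request exception carrying an HTTP status. */
    static class CosmosRequestException extends Exception {
        final int statusCode;
        CosmosRequestException(int statusCode, String message) {
            super(message);
            this.statusCode = statusCode;
        }
    }

    private static final int AUTOSCALE_MAX_RU = 4000;
    private static final int BAD_REQUEST = 400;

    /**
     * Try to create the metadata container with autoscale 4000 RU first.
     * If the request fails with 400 BadRequest (a serverless account rejects
     * any throughput setting), retry without throughput configured.
     */
    static String createMetadataContainer(ContainerClient client, String name)
            throws CosmosRequestException {
        try {
            client.createContainer(name, Optional.of(AUTOSCALE_MAX_RU));
            return "created-with-autoscale";
        } catch (CosmosRequestException e) {
            if (e.statusCode != BAD_REQUEST) {
                throw e; // only fall back on the serverless 400 case
            }
            client.createContainer(name, Optional.empty());
            return "created-without-throughput";
        }
    }
}
```

Note that only a 400 status triggers the fallback; any other failure (throttling, network errors, and so on) is rethrown, so the retry applies specifically to the serverless throughput rejection.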
Tests
Manually tested with a serverless database account.