We are trying to apply Confluent for Kubernetes (CFK) from a Flux Kustomization pointing to a kustomize overlay created by inflating the CFK Helm chart with kustomize. We try to avoid running Helm in our clusters for various reasons, so when software is delivered only as a Helm chart, we inflate it and commit the rendered manifests to Git. Kustomize is therefore the single source of truth for Flux.
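For context, the inflation step can be done with kustomize's built-in Helm chart inflator. A rough sketch of what such a kustomization looks like (the chart name, repo URL, and release name below are assumptions for illustration, not copied from our repository):

```yaml
# kustomization.yaml used once, offline, to inflate the chart.
# The rendered output is committed to Git; Flux only ever sees plain YAML.
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
helmCharts:
- name: confluent-for-kubernetes        # assumed chart name
  repo: https://packages.confluent.io/helm   # assumed repo URL
  releaseName: confluent-operator       # assumed release name
  namespace: confluent
```

This is rendered with `kustomize build --enable-helm` and the output committed, so the Flux Kustomization never touches Helm.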
I am sadly unable to find the source for the CFK Helm chart, but since we inflate all Helm charts before handing the resources over to Flux, I am fairly confident I know what Flux sees.
This is the problematic resource, as it appears in the kustomization given to Flux:
apiVersion: apiextensions.k8s.io/v1
kind: CustomResourceDefinition
metadata:
  annotations:
    controller-gen.kubebuilder.io/version: v0.9.2
  creationTimestamp: null
  name: schemas.platform.confluent.io
spec:
  group: platform.confluent.io
  names:
    categories:
    - all
    - confluent-platform
    - confluent
    kind: Schema
    listKind: SchemaList
    plural: schemas
    shortNames:
    - schema
    singular: schema
  scope: Namespaced
  versions:
  - additionalPrinterColumns:
    - jsonPath: .status.format
      name: Format
      type: string
    - jsonPath: .status.id
      name: ID
      type: string
    - jsonPath: .status.version
      name: Version
      type: string
    - jsonPath: .status.phase
      name: Status
      type: string
    - jsonPath: .metadata.creationTimestamp
      name: Age
      type: date
    - jsonPath: .status.schemaRegistryEndpoint
      name: SchemaRegistryEndpoint
      priority: 1
      type: string
    name: v1beta1
    schema:
      openAPIV3Schema:
        properties:
          apiVersion:
            description: 'APIVersion defines the versioned schema of this representation of an object. Servers should convert recognized schemas to the latest internal value, and may reject unrecognized values. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#resources'
            type: string
          kind:
            description: 'Kind is a string value representing the REST resource this object represents. Servers may infer this from the endpoint the client submits requests to. Cannot be updated. In CamelCase. More info: https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#types-kinds'
            type: string
          metadata:
            type: object
          spec:
            description: spec defines the desired state of the Schema.
            properties:
              compatibilityLevel:
                description: 'compatibilityLevel specifies the compatibility level requirement for the schema under the specified subject. Valid options are `BACKWARD`, `BACKWARD_TRANSITIVE`, `FORWARD`, `FORWARD_TRANSITIVE`, `FULL`, `FULL_TRANSITIVE` and `NONE`. more info: https://docs.confluent.io/platform/current/schema-registry/avro.html#schema-evolution-and-compatibility'
                enum:
                - BACKWARD
                - BACKWARD_TRANSITIVE
                - FORWARD
                - FORWARD_TRANSITIVE
                - FULL
                - FULL_TRANSITIVE
                - NONE
                type: string
              data:
                description: data defines the data required to create the schema.
                properties:
                  configRef:
                    description: configRef is the name of the Kubernetes ConfigMap resource containing the schema.
                    minLength: 1
                    type: string
                  format:
                    description: format is the format type of the encoded schema. Valid options are `avro`, `json`, and `protobuf`.
                    enum:
                    - avro
                    - json
                    - protobuf
                    minLength: 1
                    type: string
                required:
                - configRef
                - format
                type: object
              mode:
                description: Mode specifies the schema registry mode for the schemas under the specified subject. Valid options are `IMPORT`, `READONLY`, `READWRITE`.
                enum:
                - IMPORT
                - READONLY
                - READWRITE
                type: string
              name:
                description: name specifies the subject name of schema. If not configured, the Schema CR name is used as the subject name.
                maxLength: 255
                minLength: 1
                pattern: ^[^\\]*$
                type: string
              normalize:
                description: 'Normalize specifies whether to normalize the schema at the time of registering to schema registry. more info: https://docs.confluent.io/platform/current/schema-registry/fundamentals/serdes-develop/index.html#schema-normalization'
                type: boolean
              schemaReferences:
                description: schemaReferences defines the schema references in the schema data.
                items:
                  description: SchemaReference is the schema to be used as a reference for the new schema.
                  properties:
                    avro:
                      description: avro is the data for the referenced Avro schema.
                      properties:
                        avro:
                          description: name is the fully qualified name of the referenced Avro schema.
                          minLength: 1
                          type: string
                      required:
                      - avro
                      type: object
                    format:
                      description: format is the format type of the referenced schema. Valid options are `avro`, `json`, and `protobuf`.
                      enum:
                      - avro
                      - json
                      - protobuf
                      minLength: 1
                      type: string
                    json:
                      description: json is the data for the referenced JSON schema.
                      properties:
                        url:
                          description: url is the referenced JSON schema url.
                          minLength: 1
                          type: string
                      required:
                      - url
                      type: object
                    protobuf:
                      description: protobuf is the data for the referenced Protobuf schema.
                      properties:
                        file:
                          description: file is the file name of the referenced Protobuf schema.
                          minLength: 1
                          type: string
                      required:
                      - file
                      type: object
                    subject:
                      description: subject is the subject name for the referenced schema through the configRef.
                      minLength: 1
                      type: string
                    version:
                      description: version is the version type of the referenced schema.
                      format: int32
                      type: integer
                  required:
                  - format
                  - subject
                  - version
                  type: object
                type: array
              schemaRegistryClusterRef:
                description: schemaRegistryClusterRef references the CFK-managed Schema Registry cluster.
                properties:
                  name:
                    description: name specifies the name of the Confluent Platform component cluster.
                    type: string
                  namespace:
                    description: namespace specifies the namespace where the Confluent Platform component cluster is running.
                    type: string
                required:
                - name
                type: object
              schemaRegistryRest:
                description: schemaRegistryRest specifies the Schema Registry REST API configuration.
                properties:
                  authentication:
                    description: authentication specifies the REST API authentication mechanism.
                    properties:
                      basic:
                        description: basic specifies the basic authentication settings for the REST API client.
                        properties:
                          debug:
                            description: debug enables the basic authentication debug logs for JaaS configuration.
                            type: boolean
                          directoryPathInContainer:
                            description: 'directoryPathInContainer allows to pass the basic credential through a directory path in the container. More info: https://docs.confluent.io/operator/current/co-authenticate.html#basic-authentication'
                            minLength: 1
                            type: string
                          restrictedRoles:
                            description: restrictedRoles specify the restricted roles on the server side only. Changes will be only reflected in Control Center. This configuration is ignored on the client side configuration.
                            items:
                              type: string
                            minItems: 1
                            type: array
                          roles:
                            description: roles specify the roles on the server side only. This configuration is ignored on the client side configuration.
                            items:
                              type: string
                            type: array
                          secretRef:
                            description: 'secretRef defines secret reference to pass the required credentials. More info: https://docs.confluent.io/operator/current/co-authenticate.html#basic-authentication'
                            maxLength: 30
                            minLength: 1
                            pattern: ^[a-z0-9]([-a-z0-9]*[a-z0-9])?$
                            type: string
                        type: object
                      bearer:
                        description: bearer specifies the bearer authentication settings for the REST API client.
                        properties:
                          directoryPathInContainer:
                            description: directoryPathInContainer specifies the directory path in the container where the credential is mounted.
                            minLength: 1
                            type: string
                          secretRef:
                            description: 'secretRef specifies the name of the secret that contains the credential. More info: https://docs.confluent.io/operator/current/co-authenticate.html#bearer-authentication'
                            maxLength: 30
                            minLength: 1
                            pattern: ^[a-z0-9]([-a-z0-9]*[a-z0-9])?$
                            type: string
                        type: object
                      type:
                        description: type specifies the REST API authentication type. Valid options are `basic`, `bearer`, and `mtls`.
                        enum:
                        - basic
                        - bearer
                        - mtls
                        type: string
                    required:
                    - type
                    type: object
                  endpoint:
                    description: endpoint specifies where Confluent REST API is running.
                    minLength: 1
                    pattern: ^https?://.*
                    type: string
                  kafkaClusterID:
                    description: kafkaClusterID specifies the id of Kafka cluster. It takes precedence over using the Kafka REST API to get the cluster id.
                    minLength: 1
                    type: string
                  tls:
                    description: tls specifies the custom TLS structure for the application resources, e.g. connector, topic, schema, of the Confluent Platform components.
                    properties:
                      directoryPathInContainer:
                        description: directoryPathInContainer contains the directory path in the container where `keystore.jks`, `truststore.jks`, `jksPassword.txt` keys are mounted.
                        minLength: 1
                        type: string
                      jksPassword:
                        description: jksPassword specifies the secret name that contains the JKS password.
                        properties:
                          secretRef:
                            description: 'secretRef references the name of the secret containing the JKS password. More info: https://docs.confluent.io/operator/current/co-network-encryption.html#configure-user-provided-tls-certificates'
                            maxLength: 30
                            minLength: 1
                            pattern: ^[a-z0-9]([-a-z0-9]*[a-z0-9])?$
                            type: string
                        required:
                        - secretRef
                        type: object
                      secretRef:
                        description: 'secretRef specifies the secret name that contains the certificates. More info about certificates key/value format: https://docs.confluent.io/operator/current/co-network-encryption.html#configure-user-provided-tls-certificates'
                        maxLength: 63
                        minLength: 1
                        pattern: ^[a-z0-9]([-a-z0-9]*[a-z0-9])?$
                        type: string
                    type: object
                type: object
            required:
            - data
            type: object
          status:
            description: status defines the observed state of the Schema.
            properties:
              appState:
                default: Unknown
                description: appState is the current state of the Schema application.
                enum:
                - Unknown
                - Created
                - Failed
                - Deleted
                type: string
              compatibilityLevel:
                description: compatibilityLevel specifies the compatibility level of the schema under the subject.
                type: string
              conditions:
                description: conditions are the latest available observed state of the schema.
                items:
                  description: Condition represent the latest available observations of the current state.
                  properties:
                    lastProbeTime:
                      description: lastProbeTime shows the last time the condition was evaluated.
                      format: date-time
                      type: string
                    lastTransitionTime:
                      description: lastTransitionTime shows the last time the condition was transitioned from one status to another.
                      format: date-time
                      type: string
                    message:
                      description: message shows a human-readable message with details about the transition.
                      type: string
                    reason:
                      description: reason shows the reason for the last transition of the condition.
                      type: string
                    status:
                      description: status shows the status of the condition, one of `True`, `False`, or `Unknown`.
                      type: string
                    type:
                      description: type shows the condition type.
                      type: string
                  type: object
                type: array
              deletedVersions:
                description: deletedVersions are the successfully hard deleted versions for the subject.
                items:
                  format: int32
                  type: integer
                type: array
              format:
                description: format is the format of the latest schema for the subject.
                type: string
              id:
                description: id is the id of the latest schema for the subject.
                format: int32
                type: integer
              mode:
                description: Mode specifies the operating mode of schema under the subject.
                type: string
              normalize:
                description: Normalize specifies whether schema has been normalized at the time of registering.
                type: boolean
              observedGeneration:
                description: observedGeneration is the most recent generation observed for this Confluent component.
                format: int64
                type: integer
              schemaReferences:
                description: schemaReferences are the schema references for the subject.
                items:
                  description: SchemaReference is the schema to be used as a reference for the new schema.
                  properties:
                    avro:
                      description: avro is the data for the referenced Avro schema.
                      properties:
                        avro:
                          description: name is the fully qualified name of the referenced Avro schema.
                          minLength: 1
                          type: string
                      required:
                      - avro
                      type: object
                    format:
                      description: format is the format type of the referenced schema. Valid options are `avro`, `json`, and `protobuf`.
                      enum:
                      - avro
                      - json
                      - protobuf
                      minLength: 1
                      type: string
                    json:
                      description: json is the data for the referenced JSON schema.
                      properties:
                        url:
                          description: url is the referenced JSON schema url.
                          minLength: 1
                          type: string
                      required:
                      - url
                      type: object
                    protobuf:
                      description: protobuf is the data for the referenced Protobuf schema.
                      properties:
                        file:
                          description: file is the file name of the referenced Protobuf schema.
                          minLength: 1
                          type: string
                      required:
                      - file
                      type: object
                    subject:
                      description: subject is the subject name for the referenced schema through the configRef.
                      minLength: 1
                      type: string
                    version:
                      description: version is the version type of the referenced schema.
                      format: int32
                      type: integer
                  required:
                  - format
                  - subject
                  - version
                  type: object
                type: array
              schemaRegistryAuthenticationType:
                description: schemaRegistryAuthenticationType is the authentication method used.
                type: string
              schemaRegistryEndpoint:
                description: schemaRegistryEndpoint is the Schema Registry REST endpoint.
                type: string
              schemaRegistryTLS:
                description: schemaRegistryTLS shows whether the Schema Registry is using TLS.
                type: boolean
              softDeletedVersions:
                description: softDeletedVersions are the successfully soft deleted versions for the subject.
                items:
                  format: int32
                  type: integer
                type: array
              state:
                description: state is the state of the Schema CR.
                type: string
              subject:
                description: subject is the subject of the schema.
                type: string
              version:
                description: version is the version of the latest schema for the subject.
                format: int32
                type: integer
            type: object
        required:
        - spec
        type: object
    served: true
    storage: true
    subresources:
      status: {}
When attempting to apply this resource, Flux fails with the following error:
CustomResourceDefinition/schemas.platform.confluent.io dry-run failed (Invalid): CustomResourceDefinition.apiextensions.k8s.io "schemas.platform.confluent.io" is invalid: spec.validation.openAPIV3Schema.properties[spec].properties[name].pattern: Invalid value: "^[^\\]*$": must be a valid regular expression, but isn't: error parsing regexp: missing closing ]: `[^\]*$`
The reason I suspect this is a problem with Flux is that the same kustomization applies without issues when server-side applied (or dry-run) with kustomize + kubectl. The Flux Kustomization points at the path infrastructure/confluent/default/, and the following command runs fine:
$ kustomize build infrastructure/confluent/default/ | k apply -f - --dry-run=server --server-side --field-manager erikbo
namespace/confluent serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/clusterlinks.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/confluentrolebindings.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/connectors.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/connects.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/controlcenters.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/kafkarestclasses.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/kafkarestproxies.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/kafkas.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/kafkatopics.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/kraftcontrollers.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/ksqldbs.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/schemaexporters.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/schemaregistries.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/schemas.platform.confluent.io serverside-applied (server dry run)
customresourcedefinition.apiextensions.k8s.io/zookeepers.platform.confluent.io serverside-applied (server dry run)
clusterrole.rbac.authorization.k8s.io/confluent-operator serverside-applied (server dry run)
clusterrolebinding.rbac.authorization.k8s.io/confluent-operator serverside-applied (server dry run)
Error from server (NotFound): namespaces "confluent" not found
Error from server (NotFound): namespaces "confluent" not found
Error from server (NotFound): namespaces "confluent" not found
Error from server (NotFound): namespaces "confluent" not found
This issue was discussed in https://cloud-native.slack.com/archives/CLAJ40HV3/p1707072757221329 with @stefanprodan.