Hi, I am trying to create an S3 source connector with the below configuration. I am getting the following exception:
[2023-07-13 22:49:25,231] ERROR Error encountered in task jdbc_sink_connector_s3_src_connect-0. Executing stage 'VALUE_CONVERTER' with class 'io.confluent.connect.avro.AvroConverter', where source record is = SourceRecord{sourcePartition={uri=s3://bucket-aws-useast1-apps-dev-1-dev/bucket/test.csv}, sourceOffset={position=132, rows=2, timestamp=1689288565168}} ConnectRecord{topic='topicname', kafkaPartition=null, key=null, keySchema=null, value=Struct{empName=John}, valueSchema=Schema{STRUCT}, timestamp=1689288565168, headers=ConnectHeaders(headers=[ConnectHeader(key=connect.file.name, value=bucket/test.csv, schema=Schema{STRING}), ConnectHeader(key=connect.file.uri, value=s3://bucket-aws-useast1-apps-dev-1-dev/bucket/test.csv, schema=Schema{STRING}), ConnectHeader(key=connect.file.contentLength, value=132, schema=Schema{INT64}), ConnectHeader(key=connect.file.lastModified, value=1689275469000, schema=Schema{INT64}), ConnectHeader(key=connect.file.s3.object.summary.key, value=bucket/test.csv, schema=Schema{STRING}), ConnectHeader(key=connect.file.s3.object.summary.etag, value=df96017ba0e96ddacd2de1736ade34eb, schema=Schema{STRING}), ConnectHeader(key=connect.file.s3.object.summary.bucketName, value=bucket, schema=Schema{STRING}), ConnectHeader(key=connect.task.hostname, value=kafka-connect-s3sourceconn-79cdd7466f-fnjzw, schema=Schema{STRING})])}. (org.apache.kafka.connect.runtime.errors.LogReporter)
org.apache.kafka.connect.errors.DataException: Failed to serialize Avro data from topic topicname :
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:93)
at org.apache.kafka.connect.storage.Converter.fromConnectData(Converter.java:63)
at org.apache.kafka.connect.runtime.WorkerSourceTask.lambda$convertTransformedRecord$3(WorkerSourceTask.java:329)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndRetry(RetryWithToleranceOperator.java:156)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execAndHandleError(RetryWithToleranceOperator.java:190)
at org.apache.kafka.connect.runtime.errors.RetryWithToleranceOperator.execute(RetryWithToleranceOperator.java:132)
at org.apache.kafka.connect.runtime.WorkerSourceTask.convertTransformedRecord(WorkerSourceTask.java:329)
at org.apache.kafka.connect.runtime.WorkerSourceTask.sendRecords(WorkerSourceTask.java:355)
at org.apache.kafka.connect.runtime.WorkerSourceTask.execute(WorkerSourceTask.java:257)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:243)
at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: org.apache.kafka.common.errors.SerializationException: Error registering Avro schema{"type":"record","name":"ConnectDefault","namespace":"io.confluent.connect.avro","fields":[{"name":"empName","type":["null","string"],"default":null}]}
at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.toKafkaException(AbstractKafkaSchemaSerDe.java:259)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:156)
at io.confluent.connect.avro.AvroConverter$Serializer.serialize(AvroConverter.java:153)
at io.confluent.connect.avro.AvroConverter.fromConnectData(AvroConverter.java:86)
... 15 more
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:297)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:367)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:544)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:532)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:490)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:257)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:366)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:337)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:115)
... 17 more
I tried adding the below schema key and value converters, with the schema registry URL, keystore, and truststore, to the connector config, but I am getting the same error. Any suggestions on how to fix this?
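Converter settings of that kind typically look something like the following hypothetical sketch (placeholder host and paths, not the actual values from this setup; the `schema.registry.ssl.*` pass-through properties require a recent Schema Registry client):

```properties
# Hypothetical sketch only: placeholder host and paths, not the real config.
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=https://schema-registry.example.com:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=https://schema-registry.example.com:8081
# TLS settings are passed through the converter prefix to the Schema Registry client.
value.converter.schema.registry.ssl.truststore.location=/etc/kafka/secrets/truststore.jks
value.converter.schema.registry.ssl.truststore.password=<truststore-password>
value.converter.schema.registry.ssl.keystore.location=/etc/kafka/secrets/keystore.jks
value.converter.schema.registry.ssl.keystore.password=<keystore-password>
```

Note that the root cause in the trace is the client failing to parse a response that starts with `<` (code 60), which usually means the configured URL returned HTML rather than JSON, so it may also be worth verifying that `schema.registry.url` points at the registry's REST endpoint.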
The source file contains a field that has ">" in its name. Avro does not allow arbitrary characters in field names: per the Avro specification, a name must start with a letter or underscore and contain only letters, digits, and underscores.
You might want to rename the field via an SMT (Single Message Transform); that could resolve your issue. For example, a `ReplaceField` transform can rewrite the name before the Avro converter runs, as sketched below.
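A minimal sketch, assuming a hypothetical header named `emp>Name` (substitute the actual field name from your CSV):

```properties
# Sketch: rename a hypothetical "emp>Name" field to the Avro-safe "empName".
# SMTs run before the value converter, so the serializer never sees the invalid name.
transforms=renameBadField
transforms.renameBadField.type=org.apache.kafka.connect.transforms.ReplaceField$Value
transforms.renameBadField.renames=emp>Name:empName
```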
This doesn't seem like a connector issue. The connector could, though, offer a feature that warns when Avro naming conventions are violated, or that fixes the names itself (by stripping the offending characters, for example).
Does this help? Do you have the CSV header available to check this?