
avro deserializer error #57

Open
dhgokul opened this issue Aug 23, 2021 · 0 comments
dhgokul commented Aug 23, 2021

I am using Redpanda as the broker, with Redpanda's built-in schema registry.

    schema = """    {
        "namespace": "confluent.io.examples.serialization.avro",
        "name": "user2",
        "type": "record",
        "fields": [
            {"name": "id", "type": "string"},
            {"name": "emailid", "type": "string"},
            {"name": "ip", "type": "string"},
            {"name": "browser", "type": "string"},
            {"name": "segment", "type": "string"}
        ]
    }"""

Actual payload:
{"id":"123","emailid":"[email protected]","ip":"192.112.222.183","browser":"Firefox-57.0.4","segment":"home"}

Connect worker properties:

bootstrap.servers=redpanda_server0:9092,redpanda_server1:9092,redpanda_server3:9092
key.converter=io.confluent.connect.avro.AvroConverter
key.converter.schema.registry.url=http://redpanda_server0:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://redpanda_server0:8081
offset.storage.file.filename=/tmp/connect.offsets 
plugin.path=target/components/packages/

Sink connector properties:

name=scylladb-sink-connector2
connector.class=io.connect.scylladb.ScyllaDbSinkConnector
tasks.max=1
topics=topic1
scylladb.contact.points=scylla1,scylla2,scylla3
scylladb.port=9042
scylladb.keyspace=redpanda
key.converter=org.apache.kafka.connect.storage.StringConverter
key.converter.schema.registry.url=http://redpanda_server0:8081
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://redpanda_server0:8081
key.converter.schemas.enable=true
value.converter.schemas.enable=true
key.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
value.deserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
transforms=createKey
transforms.createKey.fields=id
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey

I am getting this deserializer error:

Caused by: org.apache.kafka.connect.errors.DataException: Failed to deserialize data for topic topic1 to Avro: 
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro value schema version for id 7
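For context (my understanding, not confirmed in this repo): the AvroConverter assumes every message is framed in the Confluent wire format, i.e. a magic byte `0x00`, a 4-byte big-endian schema ID, then the Avro-encoded body. The error above suggests the converter did read schema ID 7 from the message header, but the follow-up registry lookup for that schema's version failed, which may mean the producer did not register the schema the usual way or the registry does not serve that lookup. A minimal sketch of the framing, so you can check what your producer actually writes:

```python
import struct

# Confluent wire format: 1-byte magic (0x00), 4-byte big-endian schema ID,
# then the Avro-encoded record body.
MAGIC_BYTE = 0

def frame_avro_message(schema_id: int, avro_body: bytes) -> bytes:
    """Prefix an Avro-encoded body with the Confluent wire-format header."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_body

def parse_schema_id(message: bytes) -> int:
    """Extract the schema ID that AvroConverter will look up in the registry."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not a Confluent-framed Avro message")
    return schema_id

framed = frame_avro_message(7, b"avro-body-bytes")  # hypothetical body
print(parse_schema_id(framed))  # -> 7
```

If the first five bytes of your produced messages don't look like this header, the converter will fail regardless of the sink configuration.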

Since Avro fails with this error, I switched to the JSON method for now:

Connect worker properties:

bootstrap.servers=redpanda_server0:9092,redpanda_server1:9092,redpanda_server3:9092

key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
offset.storage.file.filename=/tmp/connect.offsets 
plugin.path=target/components/packages/

Sink connector properties:

name=scylladb-sink-connector
connector.class=io.connect.scylladb.ScyllaDbSinkConnector
tasks.max=1
topics=topic1
scylladb.contact.points=scylla1,scylla2,scylla3
scylladb.port=9042
scylladb.keyspace=redpanda
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
transforms=createKey
transforms.createKey.fields=id
transforms.createKey.type=org.apache.kafka.connect.transforms.ValueToKey

Now producing message on redpanda as:
{"schema":{"type":"struct","fields":[{"type":"string","optional":false,"field":"id"},{"type":"string","optional":false,"field":"emailid"},{"type":"string","optional":false,"field":"ip"},{"type":"string","optional":false,"field":"browser"},{"type":"string","optional":false,"field":"segment"}]},"payload":{"id":"9999993","emailid":"[email protected]","ip":"192.112.222.183","browser":"Firefox-57.0.4","segment":"purchase"}}
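With `value.converter.schemas.enable=true`, the JsonConverter requires exactly this `schema`/`payload` envelope around every record. A small sketch (field names taken from my Avro schema above) that builds it, to avoid hand-writing the envelope per message:

```python
import json

# Build the Kafka Connect JsonConverter envelope (schemas.enable=true):
# a Connect "schema" block plus the actual record under "payload".
# Field list mirrors the user2 Avro schema above.
FIELDS = ["id", "emailid", "ip", "browser", "segment"]

def envelope(record: dict) -> str:
    schema = {
        "type": "struct",
        "fields": [{"type": "string", "optional": False, "field": f} for f in FIELDS],
    }
    return json.dumps({"schema": schema, "payload": record})

msg = envelope({
    "id": "9999993",
    "emailid": "[email protected]",
    "ip": "192.112.222.183",
    "browser": "Firefox-57.0.4",
    "segment": "purchase",
})
```

The downside remains that the full schema is shipped inside every message, which is exactly the overhead Avro plus a registry is meant to avoid.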

But this method is inefficient when producing large numbers of messages, since the schema is embedded in every record.

So I would like to use the Avro format, but it fails with the deserializer error above. Are there any changes I need to make to the sink or worker properties?
