Unable to write JSON schemaless events #185
Hi @mroiter-larus, Kafka Connect is a little weird. It has its own type system that is independent of how the data is serialized. For example, a Struct is basically a row-like structure. In JSON this would be an object, like in your examples; in Avro it's a Record. So what is happening is that this connector uses Jackson to stream the file, breaking on each object boundary, and hands each object off as a string. If you were using only the string converter you would be done. In your example you're running a transformation, so it looks like this:
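A minimal sketch of the FromJson approach being suggested (the transform class comes from the kafka-connect-json-schema project; the property names and the schema URL below are assumptions, so verify them against that project's documentation):

```json
{
  "transforms": "fromJson",
  "transforms.fromJson.type": "com.github.jcustenborder.kafka.connect.json.FromJson$Value",
  "transforms.fromJson.json.schema.location": "Url",
  "transforms.fromJson.json.schema.url": "file:///path/to/schema.json"
}
```

With a schema attached this way, downstream SMTs such as ReplaceField see a Struct rather than a plain string.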
Hi @jcustenborder, thanks a lot for your answer! I tried the first approach you suggested (using FromJson). It works, but this way the JSON events, which were initially schemaless, are treated as if they had a schema (and I think that is exactly what the FromJson transformation is expected to do). I mean, the RenameField step still runs the applyWithSchema method. Is there a way to run the ReplaceField step so that the applySchemaless method is executed instead? Thanks a lot!
Off the top of my head I'm not sure. I don't know how much I would worry about it being schemaless or with schema. The converter could be the Apache Kafka JsonConverter, which would remove the schema, or you could use the JsonSchemaConverter I wrote, which will attach the schema as a header. It's in the same project. That would give you JSON in your topic.
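For example, dropping the schema at serialization time with the stock Kafka JsonConverter looks like this (these two properties are standard Kafka Connect worker/connector settings):

```json
{
  "value.converter": "org.apache.kafka.connect.json.JsonConverter",
  "value.converter.schemas.enable": "false"
}
```

With `schemas.enable` set to `false`, the converter writes plain JSON payloads to the topic instead of the `{"schema": ..., "payload": ...}` envelope.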
Hi @jcustenborder,
I'm having some trouble trying to use SMT functions with SpoolDirSchemaLessJsonSourceConnector. I would like to simply ingest some schemaless JSON events from a file into a topic, applying the ReplaceField SMT function. Here is the connector configuration:
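A representative configuration for this kind of setup (the paths, topic name, and field names below are hypothetical; the connector class and spooldir property names come from the kafka-connect-spooldir project and should be checked against its documentation) might look like:

```json
{
  "connector.class": "com.github.jcustenborder.kafka.connect.spooldir.SpoolDirSchemaLessJsonSourceConnector",
  "topic": "json-events",
  "input.path": "/data/input",
  "finished.path": "/data/finished",
  "error.path": "/data/error",
  "input.file.pattern": "^.*\\.json$",
  "transforms": "replacefield",
  "transforms.replacefield.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
  "transforms.replacefield.renames": "oldField:newField"
}
```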
my sample source file is populated as follow:
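As an illustration, a schemaless JSON source file for this connector would contain a stream of JSON objects, one per record; the field names here are hypothetical:

```json
{"id": 1, "oldField": "value-1"}
{"id": 2, "oldField": "value-2"}
```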
Although I disabled schemas for both the key and the value, the SMT still seems to interpret my JSON events as if they had a schema. I got the following exception in the connector logs:
As you can see in the stack trace, it keeps going into the applyWithSchema method, which obviously fails! As suggested here, I already tried using StringConverter instead of JsonConverter, but with no luck: same error.
Am I doing something wrong?
Thanks in advance!
Regards,
Mauro