[SUPPORT] HoodieSinkConnector not updating the Hudi Table after initial update
Description
When using the HoodieSinkConnector in Kafka Connect, the connector only synchronizes the Kafka topic to the Hudi table once.
After the initial upsert, it does not persist any further messages produced to the Kafka topic.
Restarting the connector fetches the new messages and correctly updates the Hudi table.
To Reproduce
The HoodieSinkConnector has the following configuration:
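A sketch of the config is below, following the shape of the Hudi kafka-connect quickstart config (the converter settings and the Schema Registry URLs are representative placeholders, not necessarily the exact values used):

```json
{
  "name": "test-hudi-connector-1",
  "config": {
    "connector.class": "org.apache.hudi.connect.HoodieSinkConnector",
    "tasks.max": "1",
    "topics": "input",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081",
    "hoodie.table.name": "input",
    "hoodie.table.type": "COPY_ON_WRITE",
    "hoodie.base.path": "file:///tmp/hoodie/input",
    "hoodie.datasource.write.recordkey.field": "volume",
    "hoodie.datasource.write.partitionpath.field": "date",
    "hoodie.schemaprovider.class": "org.apache.hudi.schema.SchemaRegistryProvider",
    "hoodie.deltastreamer.schemaprovider.registry.url": "http://schema-registry:8081/subjects/input-value/versions/latest",
    "hoodie.kafka.commit.interval.secs": "60"
  }
}
```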
I have created the schema in SchemaRegistry and produced messages to the Kafka topic called "input". The messages adhere to the schema in SchemaRegistry. The data contains the fields "date" and "volume", which are used in the config.
Further steps to reproduce the behavior:
Create a local environment with Kafka, Kafka Connect, SchemaRegistry, and Redpanda (optional; it just provides a nice interface for interacting with Kafka and Kafka Connect).
For the Kafka Connect instance I have the following dependencies:
Create a HoodieSinkConnector in Kafka Connect with the above config, and add a schema in SchemaRegistry containing the fields "volume" and "date" (a sketch of such a schema follows this list).
Create new records in the Kafka topic "input" with fields "volume" and "date".
Check that the Hudi table is created at "/tmp/hoodie/input/" in your Kafka Connect container.
Create additional new records in the Kafka topic "input" with fields "volume" and "date".
Verify that the Hudi table at "/tmp/hoodie/input/" is not updated with the new data.
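For reference, a minimal Avro value schema with the two fields could look like this (the field types and record name here are illustrative; only the field names "volume" and "date" are fixed by the config above):

```json
{
  "type": "record",
  "name": "Input",
  "fields": [
    {"name": "volume", "type": "long"},
    {"name": "date", "type": "string"}
  ]
}
```

Registered under the "input-value" subject (assuming the default TopicNameStrategy), this lets the connector resolve the record key ("volume") and partition path ("date") referenced in the config.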
Expected behavior
We expect the new records from the Kafka topic to be upserted into the Hudi table.
Environment Description
Hudi version : 0.15.0
Spark version : N/A
Hive version : N/A
Hadoop version : 2.10.2
Storage (HDFS/S3/GCS..) : HDFS
Running on Docker? (yes/no) : yes
Additional context
A restart of the connector adds new data to the Hudi table.
Stacktrace
There are no errors in the logs.
I have enabled DEBUG logging in log4j.properties, and my best guess is that these lines are the most relevant:
```
DEBUG org.apache.kafka.connect.runtime.WorkerSinkTask |ConnectorName: | WorkerSinkTask{id=test-hudi-connector-1-0} Skipping offset commit, no change since last commit [org.apache.kafka.connect.runtime.WorkerSinkTask]
DEBUG org.apache.kafka.connect.runtime.WorkerSinkTask |ConnectorName: | WorkerSinkTask{id=test-hudi-connector-1-0} Finished offset commit successfully in 0 ms for sequence number 39: null [org.apache.kafka.connect.runtime.WorkerSinkTask]
```
Thank you in advance for your help!