I have a StitchData integration tracking the opens and clicks tables in my Campaign Monitor data, and it pushes CSV files to an S3 bucket at hourly intervals.
I assumed that only new data would be pushed in these CSVs (I understand the replication method is key-based incremental), but clearly there's loads of duplication going on. As an example: over a 24-hour period, Campaign Monitor shows ~800 clicks in total. However, if I merge the (69!) CSV files generated in that same 24-hour window, I end up with over 133,000 rows of data.
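For reference, here's roughly how I merged the files and counted rows (a minimal sketch using boto3/pandas; the bucket, prefix, and key columns below are placeholders, not my real values):

```python
import io

import boto3
import pandas as pd

# Placeholders -- substitute the real bucket/prefix and whatever
# columns uniquely identify a click event in the Stitch output.
BUCKET = "my-stitch-bucket"
PREFIX = "campaign-monitor/clicks/"
KEY_COLUMNS = ["email_address", "url", "date"]

s3 = boto3.client("s3")
frames = []

# Collect every CSV pushed under the prefix.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        if obj["Key"].endswith(".csv"):
            body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
            frames.append(pd.read_csv(io.BytesIO(body)))

merged = pd.concat(frames, ignore_index=True)
print("raw rows:   ", len(merged))                               # ~133,000
print("unique rows:", len(merged.drop_duplicates(KEY_COLUMNS)))  # ~800 expected
```

Deduplicating on the key columns gets me back to roughly the count Campaign Monitor reports, so the extra rows really are repeats of the same records.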
Even during periods of clearly zero Campaign Monitor activity (3am-4am on a Sunday night/Monday morning, with no mailings having been sent), I get thousands and thousands of rows of data pushed to S3.
Clearly I'm doing something wrong, so I would appreciate some help.