fix: join multiparts on file beginning (#3784)
* fix: join multiparts on file beginning

Signed-off-by: Dominik Rosiek <[email protected]>

* chore: changelog

Signed-off-by: Dominik Rosiek <[email protected]>

* Apply suggestions from code review

* Apply suggestions from code review

---------

Signed-off-by: Dominik Rosiek <[email protected]>
(cherry picked from commit 58f19f8)
sumo-drosiek authored and github-actions[bot] committed Jun 28, 2024
1 parent 1d42a8b commit 67f2e76
Showing 9 changed files with 19 additions and 15 deletions.
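
In short, the `recombine` operators that stitch multiline container logs back together (`merge-docker-lines`, `merge-cri-lines`, and the CloudWatch log-stream variant) now use `max_unmatched_batch_size: 0` instead of `1`, so partial entries seen at the very beginning of a file are held and joined up to the `is_last_entry` match instead of being flushed prematurely. Below is a minimal sketch of one affected operator after this change, assembled from the golden files in this commit; the surrounding `receivers`/`operators` nesting is an assumption and is not part of this diff:

```yaml
# Sketch only: operator fields are copied from the golden files in this commit;
# the enclosing receivers/operators structure is assumed.
receivers:
  filelog/containers:
    operators:
      - id: merge-cri-lines
        type: recombine
        combine_field: body.log
        combine_with: ""
        is_last_entry: body.logtag == "F"
        overwrite_with: newest
        source_identifier: attributes["log.file.path"]
        output: extract-metadata-from-filepath
        ## Ensure we combine everything up to `is_last_entry` even on the file beginning
        ## (previously 1, which capped how many not-yet-matched entries could be batched)
        max_unmatched_batch_size: 0
```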
1 change: 1 addition & 0 deletions .changelog/3784.fixed.txt
@@ -0,0 +1 @@
+fix: join multiparts on file beginning
@@ -99,7 +99,8 @@ filelog/containers:
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
-max_unmatched_batch_size: 1
+## Ensure we combine everything up to `is_last_entry` even on the file beginning
+max_unmatched_batch_size: 0
 
 ## merge-cri-lines stitches back together log lines split by CRI logging drivers.
 ## Input Body (JSON): { "log": "2001-02-03 04:05:06 very long li", "logtag": "P" }
@@ -113,7 +114,8 @@ filelog/containers:
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
 type: recombine
-max_unmatched_batch_size: 1
+## Ensure we combine everything up to `is_last_entry` even on the file beginning
+max_unmatched_batch_size: 0
 
 ## strip-trailing-newline removes the trailing "\n" from the `log` key. This is required for logs coming from Docker container runtime.
 ## Input Body (JSON): { "log": "2001-02-03 04:05:06 very long line that was split by the logging driver\n", "stream": "stdout" }
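
For a concrete picture of what "join multiparts on file beginning" means, consider a container log file whose first entries are CRI partial ("P") lines followed by a final ("F") line, as in the operator comments above. The sample data below is an assumed illustration of the intended behavior, not test data from this commit:

```yaml
# Assumed illustration (hypothetical entries, mirroring the Input Body examples above).
# Entries parsed from the very beginning of a container log file:
input_entries:
  - { log: "2001-02-03 04:05:06 very long li", logtag: "P" }
  - { log: "ne that was split by the logging dri", logtag: "P" }
  - { log: "ver", logtag: "F" }

# With max_unmatched_batch_size: 1 the leading "P" entries could be flushed
# individually before the "F" entry arrives; with 0 they are batched until
# `is_last_entry` (body.logtag == "F") matches and emitted as one record:
expected_output:
  - { log: "2001-02-03 04:05:06 very long line that was split by the logging driver", logtag: "F" }
```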
@@ -61,7 +61,8 @@ processors:
 overwrite_with: newest
 source_identifier: resource["cloudwatch.log.stream"]
 type: recombine
-max_unmatched_batch_size: 1
+## Ensure we combine everything up to `is_last_entry` even on the file beginning
+max_unmatched_batch_size: 0
 - id: merge-multiline-logs
 combine_field: attributes.log
 combine_with: "\n"
4 changes: 2 additions & 2 deletions tests/helm/testdata/goldenfile/logs_otc/basic.output.yaml
@@ -99,15 +99,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
4 changes: 2 additions & 2 deletions tests/helm/testdata/goldenfile/logs_otc/debug.output.yaml
@@ -109,15 +109,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
@@ -99,15 +99,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
@@ -78,15 +78,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
@@ -89,15 +89,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]
@@ -78,15 +78,15 @@ data:
 combine_with: ""
 id: merge-docker-lines
 is_last_entry: body.log matches "\n$"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: strip-trailing-newline
 source_identifier: attributes["log.file.path"]
 type: recombine
 - combine_field: body.log
 combine_with: ""
 id: merge-cri-lines
 is_last_entry: body.logtag == "F"
-max_unmatched_batch_size: 1
+max_unmatched_batch_size: 0
 output: extract-metadata-from-filepath
 overwrite_with: newest
 source_identifier: attributes["log.file.path"]