
ensure jackson overrides are available to static initializers #16719

Merged
merged 1 commit into from
Dec 4, 2024

Conversation

yaauie
Member

@yaauie yaauie commented Nov 22, 2024

Release notes

Fixes an issue where Logstash could fail to read an event containing a very large string from the PQ.

What does this PR do?

Moves the application of Jackson defaults overrides into pure Java, and applies them statically before org.logstash.ObjectMappers has a chance to start initializing the object mappers that rely on those defaults.

We replace the runner's invocation (which came too late to be fully applied) with a verification that the configured defaults have been applied.

Why is it important/What is the impact to the user?

Ensures that the configured constraints are applied to all object mappers, including the CBOR_MAPPER that is used to decode events from the PQ.
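The class-initialization ordering at the heart of this fix can be sketched in plain Java. The class and field names below are hypothetical, for illustration only (the real code applies Jackson's StreamReadConstraints overrides, not an int): a `static final` field captures whatever defaults exist when its class is first initialized, so overrides must be applied before the consuming class is touched.

```java
// Hypothetical sketch of the class-initialization ordering issue.
// "Overrides" stands in for the Jackson defaults-override logic;
// "Mappers" stands in for org.logstash.ObjectMappers.
final class Overrides {
    static int maxStringLength = 20_000_000; // baked-in default

    static void apply(int overrideValue) {
        maxStringLength = overrideValue;
    }
}

final class Mappers {
    // Captured exactly once, when this class is first initialized --
    // analogous to ObjectMappers' static CBOR_MAPPER.
    static final int CAPTURED_MAX_STRING_LENGTH = Overrides.maxStringLength;
}

public class StaticInitOrderDemo {
    public static void main(String[] args) {
        // Applying the override BEFORE Mappers is touched makes it visible:
        Overrides.apply(30_000_000);
        System.out.println(Mappers.CAPTURED_MAX_STRING_LENGTH); // prints 30000000
        // Had anything referenced Mappers earlier, the captured value would
        // have been frozen at 20_000_000 -- the failure mode this PR avoids by
        // applying overrides in a static initializer ahead of ObjectMappers.
    }
}
```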

Checklist

  • My code follows the style guidelines of this project
  • I have commented my code, particularly in hard-to-understand areas
  • [ ] I have made corresponding changes to the documentation
  • [ ] I have made corresponding changes to the default configuration files (and/or docker env variables)
  • I have added tests that prove my fix is effective or that my feature works

How to test this PR locally

  1. enable the PQ
    echo 'queue.type: persisted' > config/logstash.yml
    
  2. Configure Logstash's Jackson to handle 30MB strings
    gsed -i '/logstash.jackson.stream-read-constraints.max-string-length/d' config/jvm.options
    (echo; echo '-Dlogstash.jackson.stream-read-constraints.max-string-length=30000000') >> config/jvm.options
    
  3. generate a 20MiB+ newline-terminated string
    (dd if=<(base64 < /dev/urandom) bs=1K count="$(expr 20 '*' 1024)"; echo) > big.txt
    
  4. run logstash in a way where the input plugin will generate an event containing the aforementioned 20MB+ string:
    bin/logstash -e 'input { stdin { codec => plain } } output { stdout { codec => dots } }' < big.txt 
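As a side note on step 3 (my own suggestion, not part of the PR): dd reading from a pipe can return short reads on some platforms, so the generated file may come up short of 20 MiB. A byte-exact variant uses head -c instead:

```shell
# Generate a 20 MiB, newline-terminated base64 string (byte-exact variant of step 3)
(base64 < /dev/urandom | head -c 20971520; echo) > big.txt
wc -c < big.txt   # 20971521: 20 MiB of payload plus the trailing newline
```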
    

Related issues

@yaauie yaauie requested a review from donoghuc November 22, 2024 19:36
@donoghuc donoghuc self-assigned this Nov 22, 2024
@donoghuc
Copy link
Member

I'm having a hard time reproducing the original error. Here is what I've tried:

  1. I checked out head of logstash#main (aff8d1cce)
  2. Build ./gradlew clean bootstrap assemble installDefaultGems
  3. Configure: my config/logstash.yml has a single kv pair queue.type: persisted
  4. Generate test large input data
logstash git:(aff8d1cce) ✗ (dd if=<(base64 < /dev/urandom) bs=1K count="$(expr 25 '*' 1024)"; echo) > big.txt
25600+0 records in
25600+0 records out
26214400 bytes transferred in 0.057778 secs (453709024 bytes/sec)
logstash git:(aff8d1cce) ✗ ls -lh big.txt
-rw-r--r--@ 1 cas  staff    25M Nov 22 13:56 big.txt
  5. Run pipeline
logstash git:(aff8d1cce) ✗ bin/logstash -e 'input { stdin { codec => plain } } output { stdout { codec => dots } }' < big.txt
Using system java: /Users/cas/.jenv/shims/java
Sending Logstash logs to /Users/cas/elastic-repos/logstash/logs which is now configured via log4j2.properties
[2024-11-22T13:57:07,543][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/cas/elastic-repos/logstash/config/log4j2.properties
[2024-11-22T13:57:07,546][WARN ][logstash.runner          ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2024-11-22T13:57:07,546][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"9.0.0", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5 on 21.0.5 +indy +jit [arm64-darwin]"}
[2024-11-22T13:57:07,547][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-11-22T13:57:07,548][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-11-22T13:57:07,548][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-11-22T13:57:07,561][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-11-22T13:57:07,724][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-11-22T13:57:07,778][INFO ][org.reflections.Reflections] Reflections took 39 ms to scan 1 urls, producing 149 keys and 522 values
[2024-11-22T13:57:07,858][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-11-22T13:57:07,868][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["config string"], :thread=>"#<Thread:0x4792dd2 /Users/cas/elastic-repos/logstash/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2024-11-22T13:57:08,062][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.19}
[2024-11-22T13:57:08,071][INFO ][logstash.inputs.stdin    ][main] Automatically switching from plain to line codec {:plugin=>"stdin"}
[2024-11-22T13:57:08,074][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-11-22T13:57:08,091][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
.[2024-11-22T13:57:09,917][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-11-22T13:57:10,117][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-11-22T13:57:10,125][INFO ][logstash.runner          ] Logstash shut down.

It seems like it handles the input without error. I did a run with debug enabled and it seems to be using a persisted queue, but maybe the "default: memory" annotation is not correct? (I would include the full debug run but it prints the huge file 😅.)

[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] *queue.type: persisted (default: memory)
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.drain: false
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.max_events: 0
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2024-11-22T14:04:31,964][DEBUG][logstash.runner          ] queue.checkpoint.retry: true

I'm sure I'm missing something silly; I'll keep looking, but if you can spot it let me know!

@yaauie
Member Author

yaauie commented Nov 25, 2024

Logging isn't deterministic in tests; depending on which tests are run first, the root logger can be left at the OFF level. I'll need to chase this down.

@yaauie yaauie force-pushed the jackson-static-init-defaults branch 2 times, most recently from d1d3dc5 to 4f09282 on December 3, 2024 16:41
@yaauie
Member Author

yaauie commented Dec 4, 2024

🤦 The last tests were validating the settings before they were applied, because they no longer triggered the static load of ObjectMappers (and therefore the application of the defaults). I've moved all of it to run through ObjectMappers in 37bf0a4, and am hoping that resolves the issue.

@donoghuc donoghuc self-requested a review December 4, 2024 17:41
Member

@robbavey robbavey left a comment


Superficial pass through

eachOverride((override, specifiedValue) -> {
    final Integer effectiveValue = override.observer.apply(streamReadConstraints);
    if (Objects.equals(specifiedValue, effectiveValue)) {
        logger.info("Jackson default value override `{}` configured to `{}`", override.propertyName, specifiedValue);
Member


Currently, this will log at info for the default Logstash Jackson config set in config/jvm.options:

[2024-12-04T12:53:30,180][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-12-04T12:53:30,180][INFO ][org.logstash.jackson.StreamReadConstraintsUtil] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`

What do you think about logging at debug, or verifying against the Logstash defaults?

Member Author


It replaces identical info-level logging from before:

https://github.com/elastic/logstash/pull/16719/files#diff-8779c222532118bcfe1d6ccd7f5152238150615ec0b0d394d44e4fbcf334f903L66

I hesitate to add functionality to compare it against logstash defaults, when those defaults are currently held in and provided by a user-overridable config file.

@yaauie
Member Author

yaauie commented Dec 4, 2024

I'm still chasing down the ability to replicate the issue.

So far, if anything triggers org.logstash.ObjectMappers to load before the defaults have been applied, it can be replicated. I can of course do that manually with:

diff --git a/logstash-core/lib/logstash/util/jackson.rb b/logstash-core/lib/logstash/util/jackson.rb
index 63f072a81..10617f899 100644
--- a/logstash-core/lib/logstash/util/jackson.rb
+++ b/logstash-core/lib/logstash/util/jackson.rb
@@ -19,6 +19,7 @@ module LogStash
   module Util
     module Jackson
       def self.set_jackson_defaults(logger)
+        org.logstash.ObjectMappers::CBOR_MAPPER # trigger eager
         JacksonStreamReadConstraintsDefaults.new(logger).configure
       end

And then the replication works reliably:

echo 'queue.type: persisted' > config/logstash.yml; (dd if=<(base64 < /dev/urandom) bs=1K count="$(expr 32 '*' 1024)"; echo) > big_hex.txt; bin/logstash --pipeline.ecs_compatibility=disabled -e 'input { stdin { codec => plain } } filter { ruby { code => "event.set(%q(message), event.get(%q(message))&.size)" } }' < big_hex.txt
╭─{ rye@perhaps:~/src/elastic/logstash@main (main ✘) }
╰─● echo 'queue.type: persisted' > config/logstash.yml; (dd if=<(base64 < /dev/urandom) bs=1K count="$(expr 32 '*' 1024)"; echo) > big_hex.txt; bin/logstash --pipeline.ecs_compatibility=disabled -e 'input { stdin { codec => plain } } filter { ruby { code => "event.set(%q(message), event.get(%q(message))&.size)" } }' < big_hex.txt
32768+0 records in
32768+0 records out
33554432 bytes transferred in 0.064686 secs (518727885 bytes/sec)
Using system java: /Users/rye/.jenv/shims/java
WARN: Unresolved or ambiguous specs during Gem::Specification.reset:
      date (>= 0)
      Available/installed versions of this gem:
      - 3.4.1
      - 3.3.3
WARN: Clearing out unresolved specs. Try 'gem cleanup <gem>'
Please report a bug if this causes problems.
Sending Logstash logs to /Users/rye/src/elastic/logstash@main/logs which is now configured via log4j2.properties
[2024-12-04T19:46:00,216][INFO ][logstash.runner          ] Log4j configuration path used is: /Users/rye/src/elastic/logstash@main/config/log4j2.properties
[2024-12-04T19:46:00,218][WARN ][logstash.runner          ] The use of JAVA_HOME has been deprecated. Logstash 8.0 and later ignores JAVA_HOME and uses the bundled JDK. Running Logstash with the bundled JDK is recommended. The bundled JDK has been verified to work with each specific version of Logstash, and generally provides best performance and reliability. If you have compelling reasons for using your own JDK (organizational-specific compliance requirements, for example), you can configure LS_JAVA_HOME to use that version instead.
[2024-12-04T19:46:00,218][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"9.0.0", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 17.0.12+0 on 17.0.12+0 +indy +jit [arm64-darwin]"}
[2024-12-04T19:46:00,220][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-12-04T19:46:00,233][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-12-04T19:46:00,233][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-12-04T19:46:00,247][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-12-04T19:46:00,486][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-12-04T19:46:00,611][INFO ][org.reflections.Reflections] Reflections took 38 ms to scan 1 urls, producing 149 keys and 522 values
[2024-12-04T19:46:00,730][INFO ][logstash.javapipeline    ] Pipeline `main` is configured with `pipeline.ecs_compatibility: disabled` setting. All plugins in this pipeline will default to `ecs_compatibility => disabled` unless explicitly configured otherwise.
[2024-12-04T19:46:00,739][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["config string"], :thread=>"#<Thread:0x57d5555e /Users/rye/src/elastic/logstash@main/logstash-core/lib/logstash/java_pipeline.rb:138 run>"}
[2024-12-04T19:46:00,997][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.26}
[2024-12-04T19:46:01,010][INFO ][logstash.inputs.stdin    ][main] Automatically switching from plain to line codec {:plugin=>"stdin"}
[2024-12-04T19:46:01,015][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-12-04T19:46:01,027][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-12-04T19:46:01,314][ERROR][logstash.javapipeline    ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"deserialize invocation error", :exception=>Java::OrgLogstashAckedqueue::QueueRuntimeException, :backtrace=>["org.logstash.ackedqueue.Queue.deserialize(Queue.java:752)", "org.logstash.ackedqueue.Batch.deserializeElements(Batch.java:89)", "org.logstash.ackedqueue.Batch.<init>(Batch.java:49)", "org.logstash.ackedqueue.Queue.readPageBatch(Queue.java:681)", "org.logstash.ackedqueue.Queue.readBatch(Queue.java:614)", "org.logstash.ackedqueue.ext.JRubyAckedQueueExt.readBatch(JRubyAckedQueueExt.java:158)", "org.logstash.ackedqueue.AckedReadBatch.create(AckedReadBatch.java:49)", "org.logstash.ext.JrubyAckedReadClientExt.readBatch(JrubyAckedReadClientExt.java:87)", "org.logstash.execution.WorkerLoop.run(WorkerLoop.java:82)", "java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)", "java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)", "java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)", "java.base/java.lang.reflect.Method.invoke(Method.java:569)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:300)", "org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:164)", "org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:32)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:193)", "org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:346)", "org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:66)", "org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:118)", "org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:136)", "org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:66)", 
"org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:58)", "org.jruby.runtime.Block.call(Block.java:144)", "org.jruby.RubyProc.call(RubyProc.java:354)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:111)", "java.base/java.lang.Thread.run(Thread.java:840)"], :thread=>"#<Thread:0x57d5555e /Users/rye/src/elastic/logstash@main/logstash-core/lib/logstash/java_pipeline.rb:138 sleep>"}
[2024-12-04T19:46:02,501][INFO ][logstash.javapipeline    ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-12-04T19:46:02,543][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-12-04T19:46:02,544][INFO ][logstash.runner          ] Logstash shut down.
[success (00:00:07)]

But on main I haven't found a set of other circumstances to trigger ObjectMappers to be loaded before the runner applies those defaults.

One candidate is logging configuration: some of our logging can use object mappers defined in ObjectMappers, which means that certain emitted logs could cause it to be statically loaded before the runner has configured the defaults.

Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.
@yaauie yaauie force-pushed the jackson-static-init-defaults branch from 16403d0 to 5f60662 on December 4, 2024 20:00

@elasticmachine
Collaborator

💚 Build Succeeded


cc @donoghuc

Member

@donoghuc donoghuc left a comment


Did some manual testing comparing the two implementations. Everything seems to maintain parity with respect to logging and surfacing errors. 👍

@donoghuc donoghuc removed their assignment Dec 4, 2024
Contributor

@mashhurs mashhurs left a comment


lgtm~

I spent a lot of time trying to reproduce the original issue, unfortunately without success. However, manually placing org.logstash.ObjectMappers::CBOR_MAPPER before the constraints are applied shows that in certain situations/environments CBOR_MAPPER (being static) is initialized before the Jackson constraints logic runs. From that perspective the intention here is clear: make sure every object mapper applies the constraints.

@yaauie yaauie merged commit 202d07c into elastic:main Dec 4, 2024
6 checks passed
@yaauie yaauie deleted the jackson-static-init-defaults branch December 4, 2024 22:27
@yaauie
Member Author

yaauie commented Dec 4, 2024

@logstashmachine backport 8.x

github-actions bot pushed a commit that referenced this pull request Dec 4, 2024
Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)
@yaauie
Member Author

yaauie commented Dec 4, 2024

@logstashmachine backport 8.17

@yaauie
Member Author

yaauie commented Dec 4, 2024

@logstashmachine backport 8.16

github-actions bot pushed a commit that referenced this pull request Dec 4, 2024
Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)
github-actions bot pushed a commit that referenced this pull request Dec 4, 2024
Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)
yaauie added a commit that referenced this pull request Dec 5, 2024
#16757)

Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)

Co-authored-by: Ry Biesemeyer <[email protected]>
yaauie added a commit that referenced this pull request Dec 5, 2024
#16758)

Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)

Co-authored-by: Ry Biesemeyer <[email protected]>
robbavey pushed a commit that referenced this pull request Dec 6, 2024
Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)
robbavey pushed a commit that referenced this pull request Dec 6, 2024
#16756)

Moves the application of jackson defaults overrides into pure java, and
applies them statically _before_ the `org.logstash.ObjectMappers` has a chance
to start initializing object mappers that rely on the defaults.

We replace the runner's invocation (which was too late to be fully applied) with
a _verification_ that the configured defaults have been applied.

(cherry picked from commit 202d07c)

Co-authored-by: Ry Biesemeyer <[email protected]>
Successfully merging this pull request may close these issues.

PQ: Event containing 20+MB String cannot be deserialized for processing