correctly handle stack overflow errors during pipeline compilation (#16323) (#16336)

This commit improves error handling for pipelines that are so large that compiling them exceeds the thread stack size (the -Xss limit) and throws a StackOverflowError. Currently the exception is printed outside of the logger and doesn't show up at all when log.format is json, leaving the user to wonder what happened.
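For reference, the stack size in question is the JVM per-thread stack set with -Xss. A user who hits the new error message could raise it, for example through Logstash's config/jvm.options file or the LS_JAVA_OPTS environment variable; the value below is purely illustrative, not a recommendation from this change:

# config/jvm.options (illustrative value only)
-Xss4M

(or, equivalently for a single run, LS_JAVA_OPTS="-Xss4M" bin/logstash)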

A couple of thoughts on the way this is implemented:

* There should be a first barrier that rejects pipelines that are too large, based on the PipelineIR compilation. That barrier would use the detected -Xss value to determine how big a pipeline can be. This, however, doesn't remove the need to still handle a StackOverflowError if it happens.
* Catching the StackOverflowError could also be done in the WorkerLoop. However, I'd argue this failure is unrelated to the worker initialization itself; it just so happens that compiledPipeline.buildExecution is computed inside the WorkerLoop class for performance reasons. So I'd prefer the logging to come not from the existing catch, but from a dedicated rescue clause, as sketched below.
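As a rough illustration of that last point, here is a minimal JRuby sketch of the dedicated-rescue pattern; build_execution, LOGGER and the messages are stand-ins, not Logstash internals. A plain "rescue => e" only matches StandardError, so the JVM's StackOverflowError gets past it unless a clause rescues the Java class explicitly:

require 'java'    # JRuby only
require 'logger'

LOGGER = Logger.new($stdout)

# Stand-in for the expensive compilation step (compiledPipeline.buildExecution).
def build_execution
  :compiled
end

def init_worker_loop
  build_execution
rescue => e
  # Generic rescue: matches StandardError, never java.lang.StackOverflowError.
  LOGGER.error("Worker loop initialization error: #{e.class}: #{e.message}")
  nil
rescue Java::java.lang.StackOverflowError
  # Dedicated clause, mirroring the one added to java_pipeline.rb below.
  LOGGER.error("Stack overflow error while compiling Pipeline. Please increase thread stack size using -Xss")
  nil
end

init_worker_loop

Keeping this in its own clause leaves the existing generic error-reporting path untouched while still routing the failure through the structured logger.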

Solves #16320

(cherry picked from commit 8f2dae6)

Co-authored-by: João Duarte <[email protected]>
github-actions[bot] and jsvd authored Jul 18, 2024
1 parent 3624a82 commit 4f34601
Showing 2 changed files with 38 additions and 0 deletions.
5 changes: 5 additions & 0 deletions logstash-core/lib/logstash/java_pipeline.rb
@@ -603,6 +603,11 @@ def init_worker_loop
"Worker loop initialization error",
default_logging_keys(:error => e.message, :exception => e.class, :stacktrace => e.backtrace.join("\n")))
nil
rescue Java::java.lang.StackOverflowError => se
@logger.error(
"Stack overflow error while compiling Pipeline. Please increase thread stack size using -Xss",
default_logging_keys())
nil
end
end

33 changes: 33 additions & 0 deletions logstash-core/spec/logstash/java_pipeline_spec.rb
@@ -1016,6 +1016,39 @@ def flush(options)
it_behaves_like 'it flushes correctly'
end
end
context "Pipeline created with too many filters" do
# create pipeline with 2000 filters
# 2000 filters is more than a thread stack of size 2MB can handle
let(:config) do
<<-EOS
input { dummy_input {} }
filter {
#{" nil_flushing_filter {}\n" * 2000}
}
output { dummy_output {} }
EOS
end
let(:output) { ::LogStash::Outputs::DummyOutput.new }

before do
allow(::LogStash::Outputs::DummyOutput).to receive(:new).with(any_args).and_return(output)
allow(LogStash::Plugin).to receive(:lookup).with("input", "dummy_input").and_return(LogStash::Inputs::DummyBlockingInput)
allow(LogStash::Plugin).to receive(:lookup).with("filter", "nil_flushing_filter").and_return(NilFlushingFilterPeriodic)
allow(LogStash::Plugin).to receive(:lookup).with("output", "dummy_output").and_return(::LogStash::Outputs::DummyOutput)
allow(LogStash::Plugin).to receive(:lookup).with("codec", "plain").and_return(LogStash::Codecs::Plain)
end

let(:pipeline) { mock_java_pipeline_from_string(config, pipeline_settings_obj) }

it "informs the user that a stack overflow occurred" do
allow(pipeline.logger).to receive(:error)

pipeline.start
pipeline.shutdown

expect(pipeline.logger).to have_received(:error).with(/Stack overflow/, anything).at_least(:once)
end
end
context "Periodic Flush that intermittently returns nil" do
let(:config) do
<<-EOS
