TRACE log level causes an application to fail #3524
hi @progxaker! the error you mention is only logged and shouldn't cause any other issues. can you provide the full …
Hi @trask. My bad, the error confused me and it's not related to the main problem.
That's what I thought too, but the main code doesn't execute.
No exceptions at all :/ Latest logs:
I've prepared a Dockerfile to reproduce the problem more easily:
**SimpleApp.java**

```java
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.Dataset;

public class SimpleApp {
  public static void main(String[] args) {
    String logFile = "/tmp/spark/README.md"; // Should be some file on your system
    SparkSession spark = SparkSession.builder().appName("Simple Application").getOrCreate();
    Dataset<String> logData = spark.read().textFile(logFile).cache();
    long length = logData.count();
    System.out.println("Lines: " + length);
    spark.stop();
  }
}
```

**pom.xml**

```xml
<project>
  <groupId>edu.berkeley</groupId>
  <artifactId>simple-project</artifactId>
  <modelVersion>4.0.0</modelVersion>
  <name>Simple Project</name>
  <packaging>jar</packaging>
  <version>1.0</version>
  <properties>
    <maven.compiler.source>17</maven.compiler.source>
    <maven.compiler.target>17</maven.compiler.target>
  </properties>
  <dependencies>
    <dependency> <!-- Spark dependency -->
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.12</artifactId>
      <version>3.5.0</version>
      <scope>provided</scope>
    </dependency>
  </dependencies>
</project>
```

**Dockerfile**

```dockerfile
FROM registry.access.redhat.com/ubi8/openjdk-17:1.18-2.1705573234
USER 0
WORKDIR /tmp/
RUN microdnf -y install gzip procps
RUN curl -fsSLo /tmp/spark.tgz https://dlcdn.apache.org/spark/spark-3.5.0/spark-3.5.0-bin-hadoop3.tgz
RUN tar -C /tmp/ -xzf spark.tgz
RUN mv /tmp/spark-3.5.0-bin-hadoop3 /tmp/spark/
RUN curl -fsSLo /tmp/spark/jars/applicationinsights-agent.jar https://repo1.maven.org/maven2/com/microsoft/azure/applicationinsights-agent/3.4.19/applicationinsights-agent-3.4.19.jar
WORKDIR /tmp/project/
COPY SimpleApp.java ./src/main/java/SimpleApp.java
COPY pom.xml ./pom.xml
RUN mvn package
ENV APPLICATIONINSIGHTS_SELF_DIAGNOSTICS_LEVEL="TRACE"
ENV JAVA_TOOL_OPTIONS="-javaagent:/tmp/spark/jars/applicationinsights-agent.jar"
ENV CLASSPATH="/tmp/spark/jars/"
ENV APPLICATIONINSIGHTS_CONNECTION_STRING="InstrumentationKey=00000000-0000-0000-0000-0FEEDDADBEEF;IngestionEndpoint=http://host.testcontainers.internal:6060/;LiveEndpoint=http://host.testcontainers.internal:6060/"
CMD ["/tmp/spark/bin/spark-submit", "--class", "SimpleApp", "--master", "local[4]", "--conf", "spark.jars.ivy=/tmp/.ivy", "target/simple-project-1.0.jar"]
```
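For a quick check outside Docker, the same agent wiring can be reproduced in a local shell before launching spark-submit. This is only a sketch: the paths are the ones the Dockerfile above uses, so adjust them for your machine.

```shell
# Mirror the Dockerfile's ENV lines in a local shell (paths taken from the
# Dockerfile above; adjust for your environment).
export JAVA_TOOL_OPTIONS="-javaagent:/tmp/spark/jars/applicationinsights-agent.jar"
export APPLICATIONINSIGHTS_SELF_DIAGNOSTICS_LEVEL="TRACE"

# Confirm what the JVM will pick up before running spark-submit.
echo "$JAVA_TOOL_OPTIONS"
echo "$APPLICATIONINSIGHTS_SELF_DIAGNOSTICS_LEVEL"
```

Because the agent reads `JAVA_TOOL_OPTIONS` and `APPLICATIONINSIGHTS_SELF_DIAGNOSTICS_LEVEL` from the environment, flipping the level between `TRACE` and `INFO` here is enough to toggle the failure without rebuilding the image.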
While I'm trying to solve a higher-priority problem via the CET, I tried to print the OpenTelemetry logs bypassing your configuration (Lines 72 to 76 in 80f6eaf) and found that if the loggerLevel value is changed to Level.TRACE the problem is reproduced:

```diff
- loggerLevel = getAtLeastInfoLevel(level);
+ loggerLevel = Level.TRACE;
```

and if I set Level.INFO the problem is solved:

```diff
- loggerLevel = getAtLeastInfoLevel(level);
+ loggerLevel = Level.INFO;
```

I hope this may be helpful.

UPD: If add …
Keep digging deeper into it. TL;DR: the problem occurs when the logger name starts with … I get a list of configured loggers with this helper:

```java
// Originally published at https://mailman.qos.ch/pipermail/logback-user/2008-November/000751.html
// and modified by me to return a string.
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import org.slf4j.LoggerFactory;

String findNamesOfConfiguredAppenders() {
  LoggerContext lc = (LoggerContext) LoggerFactory.getILoggerFactory();
  String strList = "";
  for (Logger log : lc.getLoggerList()) {
    strList = strList + ";" + log.getName();
  }
  return strList;
}
```

added a call to the function before Line 75 in 80f6eaf:

```diff
+ System.out.println("Loggers" + findNamesOfConfiguredAppenders());
  loggerLevel = getDefaultLibraryLevel(level);
```

and got all the loggers from a log file.
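As a side note, the same "dump every registered logger" idea can be sketched with only the JDK, for readers who don't have logback on the classpath. This is an illustrative analogue, not the agent's actual code: `java.util.logging`'s `LogManager` keeps a registry of logger names that can be joined the same way.

```java
import java.util.Enumeration;
import java.util.logging.LogManager;
import java.util.logging.Logger;

public class ListLoggers {
    // Stdlib analogue of the logback snippet above: joins the names of all
    // loggers currently registered with the LogManager, separated by ';'.
    static String findNamesOfConfiguredLoggers() {
        StringBuilder sb = new StringBuilder();
        Enumeration<String> names = LogManager.getLogManager().getLoggerNames();
        while (names.hasMoreElements()) {
            sb.append(';').append(names.nextElement());
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Register a logger (and hold a reference, since the registry is weak),
        // then confirm it shows up in the dump.
        Logger keep = Logger.getLogger("io.opentelemetry.sdk.metrics");
        String all = findNamesOfConfiguredLoggers();
        System.out.println(all.contains(keep.getName()));
    }
}
```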
and started experimenting with LoggingLevelConfigurator.java:

```diff
@@ -33,6 +36,13 @@ public class LoggingLevelConfigurator {
     updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.exporter.logging"));
     updateLoggerLevel(
         loggerContext.getLogger("io.opentelemetry.sdk.metrics.internal.state.DeltaMetricStorage"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.autoconfigure"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.resources"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.extension"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.metrics"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.metrics.internal"));
+    //updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk.metrics.internal.state"));
+    updateLoggerLevel(loggerContext.getLogger("io.opentelemetry.sdk"));
     updateLoggerLevel(loggerContext.getLogger("io.opentelemetry"));
     updateLoggerLevel(loggerContext.getLogger("muzzleMatcher"));
     updateLoggerLevel(
@@ -69,10 +80,26 @@ public class LoggingLevelConfigurator {
       loggerLevel = level;
     } else if (name.equals("io.opentelemetry.sdk.metrics.internal.state.DeltaMetricStorage")) {
       loggerLevel = getDeltaMetricStorageLoggerLevel(level);
+    } else if (name.startsWith("io.opentelemetry.sdk.autoconfigure")) {
+      loggerLevel = Level.ALL;
+    } else if (name.startsWith("io.opentelemetry.sdk.resources")) {
+      loggerLevel = Level.ALL;
+    } else if (name.startsWith("io.opentelemetry.sdk.extension")) {
+      loggerLevel = Level.ALL;
+    //} else if (name.startsWith("io.opentelemetry.sdk.metrics.internal.state")) {
+    //  loggerLevel = Level.INFO;
+    } else if (name.startsWith("io.opentelemetry.sdk.metrics.internal")) {
+      loggerLevel = Level.ERROR;
+    } else if (name.startsWith("io.opentelemetry.sdk.metrics")) {
+      loggerLevel = Level.ALL;
+    } else if (name.startsWith("io.opentelemetry.sdk")) {
+      loggerLevel = Level.ALL;
     } else if (name.startsWith("io.opentelemetry")) {
       // OpenTelemetry instrumentation debug log has lots of things that look like errors
       // which has been confusing customers, so only enable it when user configures "trace" level
-      loggerLevel = getDefaultLibraryLevel(level);
+      //loggerLevel = getDefaultLibraryLevel(level);
+      loggerLevel = Level.ALL;
     } else if (name.equals("com.azure.core.implementation.jackson.MemberNameConverterImpl")) {
       // never want to log at trace or debug, as it logs confusing stack trace that
       // looks like error but isn't
```
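The experiment above hinges on logger-name prefixes: a level configured on a parent logger flows down to every descendant that has no explicit level of its own, which is why ordering the `startsWith` checks most-specific-first matters. A minimal sketch of that inheritance, using `java.util.logging` rather than the agent's logback purely because it needs no extra dependency:

```java
import java.util.logging.Level;
import java.util.logging.Logger;

public class LoggerHierarchyDemo {
    public static void main(String[] args) {
        // Loggers form a dot-separated hierarchy: a child with no explicit
        // level resolves its effective level through the nearest configured
        // ancestor.
        Logger parent = Logger.getLogger("io.opentelemetry.sdk");
        Logger child = Logger.getLogger("io.opentelemetry.sdk.metrics.internal.state");

        parent.setLevel(Level.ALL);
        // The child has no level of its own, so it inherits ALL from the parent.
        System.out.println("child explicit level: " + child.getLevel());
        System.out.println("child loggable at FINEST: " + child.isLoggable(Level.FINEST));

        // Giving the child its own level overrides the inherited one, mirroring
        // the most-specific-first startsWith branches in the patch above.
        child.setLevel(Level.SEVERE);
        System.out.println("child loggable at FINEST now: " + child.isLoggable(Level.FINEST));
    }
}
```

In logback the mechanics are analogous (`TRACE` playing the role of `FINEST`), so forcing `io.opentelemetry.sdk` to a verbose level silently turns up every `io.opentelemetry.sdk.*` logger that was never configured individually.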
hi @progxaker, can you provide a runnable repro? (a github repository would be easiest for us to consume and try out)
Hi @trask. I've consolidated the shared files (…).

P.S. I tested the workaround for v3.4.19, but it doesn't work for v3.5.0.
As for v3.5.0, the problem occurs not only with the TRACE log level, but also with DEBUG.
hi @progxaker, I tried to run your repro locally but got an error because I'm running on Windows. is there any chance you could simplify the repro, maybe along the lines of https://github.com/trask/apache-spark-extension-test? that would also make it easier for us to run in IntelliJ under a debugger (instead of having to remote debug inside of a docker image). thanks!
Hi @trask. I hope you're doing well. Sorry for the long delay, I've been busy implementing OpenTelemetry (Python) into a project.

TL;DR:
- … an InvalidPathException exception
- As a result, the problem is reproducible on …

I'll prepare a Windows version by the end of the day (~5 p.m. UTC).
Hello @trask. Have you had time to look at this? |
Hello @trask. Are there any updates? |
Hello team. Could you please tell if there are any updates? |
Expected behavior

Don't fail the main application. Probably use the WARN log level.

Actual behavior

The enabled JVM agent with the TRACE log level causes a Java application to fail.

To Reproduce

1. Attach the agent (-javaagent:/path/to/application-insights-3.4.17.jar).
2. Set APPLICATIONINSIGHTS_SELF_DIAGNOSTICS_LEVEL to TRACE.

System information
Please provide the following information:
Logs