Improve spec compliance #632
Conversation
Codecov Report: All modified and coverable lines are covered by tests ✅

Additional details and impacted files:

@@ Coverage Diff @@
##             main     #632      +/-   ##
==========================================
- Coverage   73.02%   72.94%   -0.09%
==========================================
  Files          61       61
  Lines        1924     1918       -6
==========================================
- Hits         1405     1399       -6
  Misses        519      519

Flags with carried forward coverage won't be shown.
The checking of positive numbers is actually a mistake in the matrix. I see you put it in the SDK, but I think the matrix is referring to the API. We already drop negative values in aggregators when the aggregation is monotonic.
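The existing behaviour described here (negatives dropped by the aggregator when it is monotonic) can be sketched as follows; the module and function names are hypothetical, not the actual SDK API:

```erlang
%% Sketch of a monotonic sum aggregation: it silently drops negative
%% measurements, since a monotonic sum must never decrease.
%% (monotonic_sum_sketch and aggregate/3 are illustrative names.)
-module(monotonic_sum_sketch).
-export([aggregate/3]).

%% Drop the measurement when the sum is monotonic and the value is negative.
aggregate(Value, Sum, true = _Monotonic) when Value < 0 ->
    Sum;
aggregate(Value, Sum, _Monotonic) ->
    Sum + Value.
```

Folding `[5, -3, 2]` through this with `Monotonic = true` yields 7: the `-3` is ignored rather than subtracted.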
I think the matrix is referring to this part of the spec: I'm not sure this is the same as dropping negative values on monotonic aggregations
Right, that part of the spec says:
Yes, my interpretation of
was that the validation should be done in the SDK. It's not clear to me why this validation is done in aggregations. Monotonicity is a property of the instrument, not of the aggregation; in addition, at the moment the check for positive values is done only on the sum aggregation (checked from my smartphone, so I may have missed something)
Ah yea, it's only done on the sum aggregation right now.
@tsloughter changelog updated. I also removed the check for positive values in the sum aggregation module since it is now already done elsewhere. If you prefer I can revert that last commit, but to me it seems clearer this way
The branch was force-pushed from 9926c65 to 44bc2a1.
Well that is concerning... tests pass on 24 but not 26, and the failure is unexpected metric results.
So the test expects 3.3 and 10.0 but gets 5.4 and 15.4. I don't even see how those numbers are possible from the recordings made.
(_) ->
    ok
end, ViewAggregations).

maybe_init_aggregate(Value, #instrument{kind=Kind} = Instrument, _MetricsTab, _ViewAggregation, _Attributes)
  when Value < 0, Kind == ?KIND_COUNTER orelse Kind == ?KIND_HISTOGRAM ->
    ?LOG_INFO("Discarding negative value for instrument ~s of type ~s", [Instrument#instrument.name, Kind]),
Wonder if this shouldn't be a debug log instead? To guard against a messed-up dependency flooding info logs with messages not about actual functionality *shrug*
Yea, I'm always dubious about the level. I get your point, but on the other hand a debug log is hardly ever seen by the user. So I'm OK with both debug and info; we should also align this across the codebase
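For context on the trade-off: with OTP's `logger`, debug-level messages are hidden by default (the primary level is `notice`), so a user would only see them after explicitly opting in:

```erlang
%% A user must raise the primary log level to see debug messages,
%% which is why a debug log is "hardly ever seen by the user".
logger:set_primary_config(level, debug).
```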
Really strange indeed, I'll take a look
@tsloughter the tests have been running in a while loop for a while now with the same OTP version as CI (26.1.2) and they are consistently passing. Can you try to re-trigger the CI and run them locally?
Yea, they pass, which makes me fear a race condition.
Merged, but opened #661
For the second point I only log and discard the value; should we also return an error? If so, it will happen only when the SDK is present, so I think we are spec compliant. The following is the only part of the spec I found mentioning this
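The log-and-discard behaviour being discussed can be sketched like this; the module and function names are hypothetical and not the actual SDK API:

```erlang
%% Sketch: a negative value for a counter or histogram is logged and
%% dropped, and the call still returns ok, so no error is surfaced to
%% the caller of the measurement API.
%% (negative_drop_sketch and record/2 are illustrative names.)
-module(negative_drop_sketch).
-export([record/2]).

record(Value, Kind) when Value < 0,
                         Kind =:= counter orelse Kind =:= histogram ->
    logger:info("discarding negative value ~p for instrument kind ~p",
                [Value, Kind]),
    ok;
record(Value, _Kind) ->
    %% valid measurement: hand off to the aggregation
    {recorded, Value}.
```

Keeping the API fire-and-forget like this (no error return) is what makes the "only log and discard" choice defensible from a compliance standpoint.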