feat(hesai): add filtered pointcloud counter function #247
base: main
Conversation
Codecov Report
Attention: Patch coverage is
Additional details and impacted files
@@ Coverage Diff @@
## main #247 +/- ##
==========================================
+ Coverage 26.07% 26.13% +0.05%
==========================================
Files 101 104 +3
Lines 9232 9429 +197
Branches 2213 2242 +29
==========================================
+ Hits 2407 2464 +57
- Misses 6436 6578 +142
+ Partials 389 387 -2
Flags with carried forward coverage won't be shown. ☔ View full report in Codecov by Sentry.
Thanks for the PR! Here is the review so far. Performance looks good but there are some more counters and naming changes I'd like to request 🙇
@@ -36,6 +37,52 @@
namespace nebula::drivers
{

struct HesaiDecodeFilteredInfo
Please also add counters for
- invalid points (distance == 0), line 198
- points filtered due to dual_return_threshold (line 238)
- points filtered due to identical return type (line 217)
- points kept in total (= not filtered) (shall be equal to the final pointcloud size)
NebulaPointCloud point_timestamp_start;
NebulaPointCloud point_timestamp_end;

void clear()
Please make sure that all fields are reset (e.g. timestamp_counter is missing)
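For reference, a minimal sketch of a fully-resetting `clear()`, assuming only the field names visible in the surrounding diff (anything else would be hypothetical):

```cpp
// Sketch only: reset every counter and bound so the struct can be reused per scan.
void clear()
{
  distance_counter = 0;
  fov_counter = 0;
  timestamp_counter = 0;  // the field noted as missing from the original clear()
  distance_start = 0;
  distance_end = 0;
  raw_azimuth_start = 0;
  raw_azimuth_end = 0;
  packet_timestamp_start = 0;
  packet_timestamp_end = 0;
  point_azimuth_start = {};
  point_azimuth_end = {};
  point_timestamp_start = {};
  point_timestamp_end = {};
}
```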
float distance_start = 0;
float distance_end = 0;
float raw_azimuth_start = 0;
float raw_azimuth_end = 0;
std::uint32_t packet_timestamp_start = 0;
std::uint32_t packet_timestamp_end = 0;
NebulaPointCloud point_azimuth_start;
NebulaPointCloud point_azimuth_end;
NebulaPointCloud point_timestamp_start;
NebulaPointCloud point_timestamp_end;
- Please rename from `start`/`end` to `min`/`max`.
- Please also add unit suffixes like `_ns` for nanoseconds, `_rad` for radians, `_m` for meters etc. `packet_timestamp_min`/`max` should probably have type `uint64_t` (`uint32_t` cannot represent absolute timestamps in nanoseconds).

Instead of `point_`, please rename to `cloud_` so that it is clear that those values are among the points that were not filtered.
I would suggest replacing `raw_` with `packet_` as well, so we have `packet_` (before filtering) vs. `cloud_` (after filtering).
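Taken together, the renamed fields might look roughly like this (a sketch of the suggested naming, not the final code):

```cpp
// Illustrative only: min/max instead of start/end, unit suffixes, and
// packet_ (before filtering) vs. cloud_ (after filtering) prefixes.
float cloud_distance_min_m = 0;
float cloud_distance_max_m = 0;
float cloud_azimuth_min_rad = 0;
float cloud_azimuth_max_rad = 0;
uint64_t packet_timestamp_min_ns = 0;  // uint64_t: absolute nanosecond timestamps exceed uint32_t
uint64_t packet_timestamp_max_ns = 0;
```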
j["distance_counter"] = distance_counter; | ||
j["fov_counter"] = fov_counter; | ||
j["timestamp_counter"] = timestamp_counter; |
For filters, let's output them as a JSON array instead:

```
"filter_pipeline": [
  { "filter": "distance", "count": 50 },
  { "filter": "fov", "count": 120 }
]
```
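With nlohmann::json this could be built roughly as follows (a sketch; the counter names are taken from the diff above):

```cpp
// Sketch: emit one object per filter into a "filter_pipeline" array.
nlohmann::json j;
nlohmann::json pipeline = nlohmann::json::array();
pipeline.push_back({{"filter", "distance"}, {"count", distance_counter}});
pipeline.push_back({{"filter", "fov"}, {"count", fov_counter}});
j["filter_pipeline"] = pipeline;
```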
@ike-kazu Thanks for the changes, I have a few more small requests to polish everything!
After implementing the changes, could you also provide a self-evaluation (running Nebula with different parameters for `min_range`/`max_range`, `cloud_min_angle`, `cloud_max_angle`, `dual_return_distance_threshold` and showing the output JSON diagnostics)?
Thank you!
@@ -26,6 +26,8 @@
#include <rclcpp/logging.hpp>
#include <rclcpp/rclcpp.hpp>

#include <sys/types.h>
This should probably be `#include <cstdint>`.
uint16_t timestamp_filtered_count = 0;
uint16_t invalid_point_count = 0;
uint16_t multiple_return_point_count = 0;
uint16_t mutliple_return_point_count = 0;
Please remove this misspelled line
Ah, I guess this should be the counter for filtering identical points? Please rename to `identical_filtered_count`.
Please remove this submodule
Please remove this submodule
uint16_t distance_filtered_count = 0;
uint16_t fov_filtered_count = 0;
uint16_t timestamp_filtered_count = 0;
uint16_t invalid_point_count = 0;
uint16_t multiple_return_point_count = 0;
uint16_t mutliple_return_point_count = 0;
uint16_t total_kept_point_count = 0;
uint16_t invalid_packet_count = 0;
Given that there can be a few 100k points in a pointcloud, `uint16_t` (only going up to 65535) could easily overflow.
Generally try to use `uint64_t` for counters.
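To illustrate the concern, unsigned overflow wraps around silently:

```cpp
#include <cstdint>

uint16_t count = 65535;  // maximum value representable by uint16_t
++count;                 // wraps to 0; a cloud with a few 100k points would wrap several times
```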
{
uint16_t distance_filtered_count = 0;
uint16_t fov_filtered_count = 0;
uint16_t timestamp_filtered_count = 0;
Nebula does not filter by timestamp, so this counter can be removed.
uint16_t fov_filtered_count = 0;
uint16_t timestamp_filtered_count = 0;
uint16_t invalid_point_count = 0;
uint16_t multiple_return_point_count = 0;
Please rename to `multiple_return_filtered_count`.
nlohmann::json distance_j;
distance_j["filter"] = "distance";
distance_j["distance_filtered_count"] = distance_filtered_count;
distance_j["cloud_distance_min_m"] = cloud_distance_min_m;
distance_j["cloud_distance_max_m"] = cloud_distance_max_m;
nlohmann::json fov_j;
fov_j["filter"] = "fov";
fov_j["fov_filtered_count"] = fov_filtered_count;
fov_j["cloud_azimuth_min_rad"] = cloud_azimuth_min_rad;
fov_j["cloud_azimuth_max_rad"] = cloud_azimuth_max_rad;
nlohmann::json timestamp_j;
timestamp_j["filter"] = "timestamp";
timestamp_j["timestamp_filtered_count"] = timestamp_filtered_count;
timestamp_j["packet_timestamp_min_ns"] = packet_timestamp_min_ns;
timestamp_j["packet_timestamp_max_ns"] = packet_timestamp_max_ns;
nlohmann::json invalid_j;
invalid_j["filter"] = "invalid";
invalid_j["invalid_point_count"] = invalid_point_count;
invalid_j["invalid_packet_count"] = invalid_packet_count;
nlohmann::json identical_j;
identical_j["filter"] = "identical";
identical_j["multiple_return_point_count"] = multiple_return_point_count;
nlohmann::json multiple_j;
multiple_j["filter"] = "multiple";
multiple_j["mutliple_return_point_count"] = mutliple_return_point_count;
Please separate the measured pointcloud bounds (distance/azimuth/time) from the filter pipeline information.
Also make sure that the keys for each filter are called `"name"` and `"filtered_count"`, regardless of filter type:

```
"filter_pipeline": [
  {"name": "distance", "filtered_count": 300},
  ...
],
"pointcloud_bounds": {
  "azimuth_deg": {
    "min": 0.0,
    "max": 270.0
  },
  // same with timestamp and distance
}
```

`invalid_packet_count` has very different causes than `invalid_point_count`, so I'd like to keep them separate.
Make it a top-level JSON property instead (same level as `total_kept_point_count`).
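A sketch of how the serialization could be laid out under this suggestion, using nlohmann::json (member names follow the diff above; the degree-based bound fields are assumed to exist per the example):

```cpp
// Sketch only: shared "name"/"filtered_count" keys per filter, measured bounds in a
// separate object, and invalid_packet_count / total_kept_point_count kept top-level.
nlohmann::json to_json() const
{
  nlohmann::json j;
  j["filter_pipeline"] = nlohmann::json::array();
  j["filter_pipeline"].push_back({{"name", "distance"}, {"filtered_count", distance_filtered_count}});
  j["filter_pipeline"].push_back({{"name", "fov"}, {"filtered_count", fov_filtered_count}});
  j["pointcloud_bounds"] = {
    {"azimuth_deg", {{"min", cloud_azimuth_min_deg}, {"max", cloud_azimuth_max_deg}}},
    {"distance_m", {{"min", cloud_distance_min_m}, {"max", cloud_distance_max_m}}},
    {"timestamp_ns", {{"min", packet_timestamp_min_ns}, {"max", packet_timestamp_max_ns}}}};
  j["invalid_packet_count"] = invalid_packet_count;
  j["total_kept_point_count"] = total_kept_point_count;
  return j;
}
```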
return j;
}

void get_minmax_info(const NebulaPoint & point)
Please rename to `update_pointcloud_bounds` for more clarity.
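A minimal sketch of the renamed function, assuming `NebulaPoint` exposes `distance` and `azimuth` fields and that the `*_min_*` bounds are initialized large enough to be overwritten by the first point:

```cpp
// Sketch: track the bounds over points that survive filtering.
// Requires <algorithm> for std::min / std::max.
void update_pointcloud_bounds(const NebulaPoint & point)
{
  cloud_distance_min_m = std::min(cloud_distance_min_m, point.distance);
  cloud_distance_max_m = std::max(cloud_distance_max_m, point.distance);
  cloud_azimuth_min_rad = std::min(cloud_azimuth_min_rad, point.azimuth);
  cloud_azimuth_max_rad = std::max(cloud_azimuth_max_rad, point.azimuth);
}
```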
float cloud_distance_max_m = 0;
float cloud_azimuth_min_rad = 0;
float cloud_azimuth_max_rad = 0;
uint64_t packet_timestamp_min_ns = 0;
For easier readability, please make these `deg` instead of `rad` and convert accordingly in the `get_minmax_info` function below.
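The conversion itself is small; a sketch (not an existing Nebula helper):

```cpp
#include <cmath>  // M_PI (a common, non-standard extension)

// Sketch: convert a radian value to degrees before storing the bound.
constexpr float rad2deg(float rad)
{
  return rad * 180.0f / static_cast<float>(M_PI);
}
```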
PR Type
Description
This PR makes it possible to monitor the number of filtered points per filtering stage (distance, FoV, and so on). It is useful for finding filtering errors while investigating the causes of LiDAR pointcloud problems.