
Question about SimpleGPsOdometry #50

Closed
juliangaal opened this issue Jul 10, 2023 · 11 comments

Comments

@juliangaal

If I understand your code correctly, you convert LLA -> ECEF -> ENU, and add this value to the factor graph.

I don't quite understand how this is possible, since the keyframes in addOdomFactor are in the lidar frame, not ENU. How do you convert the ENU pose into the robot's frame?

Also, could you explain this comment? To my understanding, the IMU is in its own frame in normal systems:

    // maybe you need to get the extrinsics between your gnss and imu
    // most of the time, they are in the same frame
    Eigen::Vector3d calib_enu = enu;

Thanks a lot for your help!

@JokerJohn
Owner

> If I understand your code correctly, you convert LLA -> ECEF -> ENU, and add this value to the factor graph.
>
> I don't quite understand how this is possible, since the keyframes in addOdomFactor are in the lidar frame, not ENU. How do you convert the ENU pose into the robot's frame?
>
> Also, could you explain this comment? To my understanding, the IMU is in its own frame in normal systems:
>
>     // maybe you need to get the extrinsics between your gnss and imu
>     // most of the time, they are in the same frame
>     Eigen::Vector3d calib_enu = enu;
>
> Thanks a lot for your help!

We don't need to transform the pose from the GPS ENU (East, North, Up) system to the Lidar system. The ENU system is absolute, with fixed east and north directions, and can be represented by either an ellipsoid model (with the current origin as the origin of the ENU system) or a plane model (the UTM system, where altitude is assumed to equal z). The LIO system, on the other hand, is relative: it can be initialized at any angle from the starting point, typically using the identity matrix as the initial pose. The difference between these two systems is merely a rotation.

In this project, we've adopted a batch processing method to align these two coordinate systems effectively. This method makes sure that the segment of GNSS observations used for alignment is of good quality and covers sufficient displacement. We then use a factor graph to introduce the GPS factor and the odom factor for batch optimization. Following the implementation of the GPS factor, the final fused pose trajectory is unified into the ENU coordinate system of the GNSS receiver.

The reason for unifying everything into the GNSS ENU coordinate system is its absolute nature. If the point cloud map is to be used for positioning based on a prior map and needs to be combined with GNSS for positioning, the coordinate system of the prior map must be unified to GNSS. This requires the WGS 84 coordinates of the point cloud map's origin. Without this unification, it would be impossible to use GNSS for positioning, or additional initialization steps would be required.

Regarding the external parameters of the Lidar and GNSS, there is a rigid body transformation between these two sensors, making direct calibration challenging. Accurate external parameters are beneficial when there's a need to obtain the pose trajectory in the optimized Lidar coordinate system. However, for the map, these parameters are not crucial.
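
To make the fusion concrete, here is a minimal, self-contained GTSAM sketch (illustrative only, not this repository's code; keys, noise values, and measurements are all made up). Unary GPS factors constrain absolute ENU positions while between factors constrain relative LIO motion, and the optimizer resolves the rotation between the two frames on its own:

    #include <gtsam/geometry/Pose3.h>
    #include <gtsam/navigation/GPSFactor.h>
    #include <gtsam/nonlinear/LevenbergMarquardtOptimizer.h>
    #include <gtsam/nonlinear/NonlinearFactorGraph.h>
    #include <gtsam/slam/BetweenFactor.h>
    #include <gtsam/slam/PriorFactor.h>

    int main() {
      using namespace gtsam;
      NonlinearFactorGraph graph;

      // Loose prior on the first LIO pose (identity); rotation is left free enough
      // for the GPS factors to pull the trajectory into the ENU frame.
      auto priorNoise = noiseModel::Diagonal::Sigmas(
          (Vector(6) << 1.0, 1.0, 1.0, 0.1, 0.1, 0.1).finished());
      graph.add(PriorFactor<Pose3>(0, Pose3(), priorNoise));

      // Relative odometry constraints from the LIO front end.
      auto odomNoise = noiseModel::Diagonal::Sigmas(
          (Vector(6) << 0.01, 0.01, 0.01, 0.05, 0.05, 0.05).finished());
      graph.add(BetweenFactor<Pose3>(0, 1, Pose3(Rot3(), Point3(1.0, 0.0, 0.0)), odomNoise));
      graph.add(BetweenFactor<Pose3>(1, 2, Pose3(Rot3(), Point3(1.0, 0.0, 0.0)), odomNoise));

      // Unary GPS factors: absolute positions in ENU (here the trajectory heads north-east).
      auto gpsNoise = noiseModel::Isotropic::Sigma(3, 0.5);
      graph.add(GPSFactor(0, Point3(0.0, 0.0, 0.0), gpsNoise));
      graph.add(GPSFactor(1, Point3(0.7, 0.7, 0.0), gpsNoise));
      graph.add(GPSFactor(2, Point3(1.4, 1.4, 0.0), gpsNoise));

      // Initial guesses expressed in the LIO frame.
      Values initial;
      initial.insert(0, Pose3());
      initial.insert(1, Pose3(Rot3(), Point3(1.0, 0.0, 0.0)));
      initial.insert(2, Pose3(Rot3(), Point3(2.0, 0.0, 0.0)));

      Values result = LevenbergMarquardtOptimizer(graph, initial).optimize();
      result.print("fused poses (now expressed in ENU):\n");
      return 0;
    }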

@juliangaal
Author

> The ENU system is absolute, with fixed east and north directions, and can be represented by either an ellipsoid model (with the current origin as the origin of the ENU system) or a plane model (UTM system, where altitude is assumed to equal z)

Right, calculated in gpsTools.h

        Eigen::Vector3d t = -LLA2ECEF(lla_origin_);
        Eigen::Matrix3d r = ...

        Eigen::Vector3d enu;
        enu = ecef + t;
        enu = r * enu;
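
For completeness, the elided rotation is presumably the standard ECEF-to-ENU rotation built from the latitude and longitude of the ENU origin; a sketch of that well-known formula (my own code, not copied from gpsTools.h):

    #include <Eigen/Dense>
    #include <cmath>

    // Standard ECEF -> ENU rotation for an origin at geodetic latitude/longitude
    // (both in radians). enu = R * (ecef - ecef_origin).
    Eigen::Matrix3d ecefToEnuRotation(double lat, double lon) {
      Eigen::Matrix3d r;
      r << -std::sin(lon),                  std::cos(lon),                 0.0,
           -std::sin(lat) * std::cos(lon), -std::sin(lat) * std::sin(lon), std::cos(lat),
            std::cos(lat) * std::cos(lon),  std::cos(lat) * std::sin(lon), std::sin(lat);
      return r;
    }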

I am mainly confused about this statement.

> In this project, we've adopted a batch processing method to align these two coordinate systems effectively

Where in the code does this happen? To my understanding, the unaligned poses from GPS (in ENU, relative) and Lidar (in the local ROS right-handed xyz frame, relative) are put into the factor graph by addOdomFactor() and addGPSFactor(). Both poses are "absolute" from 0,0,0 in their respective coordinate systems. Doesn't the factor graph need poses from the same coordinate system, in the spirit of aligning these two coordinate systems? Or does the factor graph itself do the alignment (which I couldn't wrap my head around)?

@JokerJohn JokerJohn reopened this Jul 12, 2023
@JokerJohn
Owner

The following code demonstrates the batch optimization process: once a sufficient number of GPS factors has been accumulated, they are added to the graph and iSAM updates incrementally, achieving trajectory fusion (i.e., migrating the whole trajectory into the GPS coordinate system).

      // Wait until enough GPS factors have accumulated before doing the batch update.
      if (keyframeGPSfactor.size() < 20) {
        ROS_INFO("Accumulated gps factor: %d", (int)keyframeGPSfactor.size());
        return;
      }

      // First time only: add all buffered GPS factors to the graph in one batch.
      if (!gpsTransfromInit) {
        ROS_INFO("Initialize GNSS transform!");
        for (size_t i = 0; i < keyframeGPSfactor.size(); ++i) {
          gtsam::GPSFactor gpsFactor = keyframeGPSfactor.at(i);
          gtSAMgraph.add(gpsFactor);
          gpsIndexContainer[gpsFactor.key()] = i;
        }
        gpsTransfromInit = true;
      }

You can understand the GPS factor as a unary 3-DOF prior constraint on position, while the odom factor is a relative constraint between two consecutive poses. Essentially, it's a least-squares problem: given the weights (noise models), the goal is to minimize the sum of the prior position errors and the relative pose errors. The prior position error accounts for the majority of the cost, so the final fused pose trajectory is pulled into the GNSS coordinate system. For more detail on the GPS factor, you can refer to the implementation in GTSAM, which is very straightforward.
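
As a rough sketch of that least-squares view (my notation, not the repository's): with LIO relative pose measurements $\Delta x_i$, GNSS ENU positions $z_j$, and $p(\cdot)$ extracting the translation of a pose, the graph minimizes

$$
\min_{\{x_k\}} \; \sum_i \big\| \mathrm{Log}\big((x_{i-1} \oplus \Delta x_i)^{-1} x_i\big) \big\|^2_{\Sigma_{\mathrm{odom}}} \; + \; \sum_j \big\| p(x_j) - z_j \big\|^2_{\Sigma_{\mathrm{gps}}},
$$

so with a reasonably tight GNSS noise model the second term dominates and the optimized trajectory is pulled into the ENU frame.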

@juliangaal
Author

juliangaal commented Jul 12, 2023

Ah, ok, thank you very much! I saw that line but had no idea all the magic is hidden in gtsam::GPSFactor! I guess this would be equivalent to transforming the GPS pose into the body frame (e.g. via some EKF fusion, like robot_localization in LIO-SAM, which transforms GPS data into a frame consistent with the robot's starting pose (position and orientation) in its world frame) and then adding the GPS pose in the body/world frame as a regular odometry factor.

@JokerJohn
Owner

> Ah, ok, thank you very much! I saw that line but had no idea all the magic is hidden in gtsam::GPSFactor! I guess this would be equivalent to transforming the GPS pose into the body frame (e.g. via some EKF fusion, like robot_localization in LIO-SAM, which transforms GPS data into a frame consistent with the robot's starting pose (position and orientation) in its world frame) and then adding the GPS pose in the body/world frame as a regular odometry factor.

You can think of it as the Umeyama algorithm in the EVO tool, which aligns two pose trajectories by solving a least-squares problem. Thanks to the global optimization capability of the factor graph, we don't need to complete the coordinate system transformation first, as the original version of LIO-SAM does by relying on robot_localization. That approach can fail to initialize in areas where the GNSS quality is poor at the starting position (such as the data in hkust-full-xx.bag); the batch processing method in this project doesn't need to worry about the initial GNSS quality. Fusion can be completed as long as there are enough GNSS observations in the bag as a whole.
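
For intuition, that EVO-style alignment can be reproduced offline with Eigen's umeyama function (illustrative sketch; the trajectory values below are placeholders):

    #include <Eigen/Dense>
    #include <Eigen/Geometry>
    #include <iostream>

    int main() {
      // Matched positions, one 3D point per column: LIO trajectory (source)
      // and GNSS ENU trajectory (target). Values are placeholders.
      Eigen::Matrix3Xd lio(3, 4), enu(3, 4);
      lio << 0.0, 1.0, 2.0, 3.0,
             0.0, 0.0, 0.0, 0.0,
             0.0, 0.0, 0.0, 0.0;
      enu << 0.0, 0.7, 1.4, 2.1,
             0.0, 0.7, 1.4, 2.1,
             0.0, 0.0, 0.0, 0.0;

      // Closed-form least-squares rigid alignment (rotation + translation, no scale),
      // the same idea as EVO's trajectory alignment option.
      Eigen::Matrix4d T_enu_lio = Eigen::umeyama(lio, enu, /*with_scaling=*/false);
      std::cout << "T_enu_lio =\n" << T_enu_lio << std::endl;
      return 0;
    }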

@juliangaal
Author

Nice, yes, umeyama in EVO is a nice way to think about it.

Unrelated: Is there any way I can contact you via email?

@JokerJohn
Owner

> Nice, yes, umeyama in EVO is a nice way to think about it.
>
> Unrelated: Is there any way I can contact you via email?

[email protected]

@juliangaal
Author

At the moment, addLoopFactor only adds loop factors based on lidar odometry with a simple radius search. I understand that gtsam::GPSFactor is coordinate-system agnostic, as long as the GPS pose is relative to the origin (thanks for your help!).

Now, if I want to add a GPS loop factor based on a radius search as well, I don't think I can get around transforming GPS coordinates into the lidar frame manually (no GTSAM Umeyama-style alignment), because the GTSAM BetweenFactors are all added in the lidar frame. Am I correct in my assumption?

@JokerJohn
Owner

JokerJohn commented Jul 21, 2023

@juliangaal I think maybe you need to reconsider this issue. The GNSS coordinate system is an absolute ENU (East, North, Up) system, and the fusion of GNSS and LIO generally requires calibration. There are two types of calibration:

  1. GNSS alone: a GNSS receiver measures only a point, so the extrinsics from GNSS to LIDAR involve only a translation, which is typically measured manually.

  2. Calibration in the projected coordinate system (ENU): This calibrates GNSS against LIDAR assuming the initial gravity directions are aligned. What needs to be calibrated here is just the translation (3 dimensions) and the yaw (1 dimension).

In this repo, the calibration between GNSS and LIDAR falls under the second category. We assume that GNSS and IMU are installed together (the external parameters from LIDAR to IMU include this translation). However, there's an issue with the second calibration method regarding the yaw: if you change the starting point, the observed GNSS coordinate values change (meaning a different origin of the ENU system), which implies a different yaw value. Even the number of GNSS observations used for fusion can affect the estimation of the yaw angle. As a result, it's challenging to calibrate a fixed yaw offline, so using a fixed yaw to transform GNSS observations into the LIO system is not a mature approach. Many papers have shown that fusing GNSS in the LIO/VIO system this way can leave 4 degrees of freedom unobservable, resulting in larger errors. Fusion in the ENU system, on the other hand, keeps all 6 degrees of freedom observable (as discussed in the VIO literature).
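
In other words (my notation, not the repo's), the second type of calibration amounts to a 4-DOF transform between the ENU frame and the LIO frame,

$$
p^{\mathrm{LIO}} = R_z(\psi)\, p^{\mathrm{ENU}} + t, \qquad t \in \mathbb{R}^3,
$$

and because both $\psi$ and $t$ depend on where the ENU origin is placed and on how the LIO system happens to be initialized for a given run, they cannot simply be calibrated once offline and reused.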

@zhao-zhibo

zhao-zhibo commented Nov 6, 2023

> If you change the starting point, the observed coordinate values of GNSS will change (meaning different origins of the ENU system), which implies different yaw values.

Regarding this sentence above: when calculating the heading angle, I think the ENU coordinate system can be treated as unchanged; over a small area the change is negligible. (I'm a graduate student in surveying and mapping.)

@JokerJohn
Owner

This is just a simple position differencing. It assumes instantaneous planar motion with roll and pitch equal to zero, so the attitude is only the yaw. It is merely a reference pose computed for plotting and has nothing to do with the optimization.
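
A minimal sketch of that reference-yaw computation (illustrative only, assuming planar motion as described above):

    #include <cmath>

    // Reference yaw from two consecutive ENU positions, assuming instantaneous
    // planar motion with roll = pitch = 0 (used only for plotting, not optimization).
    double yawFromPositions(double e0, double n0, double e1, double n1) {
      return std::atan2(n1 - n0, e1 - e0);
    }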
