-
Thank you for this excellent library and the well-written paper. I have been exploring the library and working on an SE(3) example for a mixed left- and right-invariant EKF for localization. Here is the setup: the robot operates in 3D space and receives control inputs as linear and angular velocities. It is equipped with a GNSS sensor, and the latitude, longitude, and altitude readings are converted into local Cartesian coordinates. The robot's initial position in the local Cartesian frame is (0, 0, 0). For the ESKF prediction step, I use the linear and angular velocities to propagate the robot's state; for the ESKF update step, I use the GNSS measurements to correct it. Since the linear and angular velocities are measured in the body frame, the ESKF prediction step is performed using the right-plus operation. Conversely, since the GNSS data is measured in the fixed frame, the ESKF correction step is performed using the left-plus operation. (Please correct me if my thinking is wrong.) Here is my first attempt at a code snippet to implement this approach:

// State & Error covariance
manif::SE3d X_;
X_.setIdentity();
Matrix6d P_;
P_.setZero();
// Control
manif::SE3Tangentd u_;
Vector6d u_noisy_;
u_noisy_ << 0.0, 0.0, 0.0, 0.0, 0.0, 0.0;
Array6d u_sigmas_;
u_sigmas_ << 0.1, 0.1, 0.1, 0.1, 0.1, 0.1;
Matrix6d U_;
U_ = (u_sigmas_ * u_sigmas_).matrix().asDiagonal();
// Declare the Jacobians of the motion wrt robot and control
manif::SE3d::Jacobian J_x_, J_u_;
// ...
// IEKF callback
u_noisy_ << msg->twist.linear.x, msg->twist.linear.y, msg->twist.linear.z,
msg->twist.angular.x, msg->twist.angular.y, msg->twist.angular.z;
u_ = (u_noisy_ * dt_); // dt_ = 0.025 is a constant
// ESKF prediction.
X_ = X_.rplus(u_, J_x_, J_u_);
P_ = J_x_ * P_ * J_x_.transpose() + J_u_ * U_ * J_u_.transpose();
// ESKF correction
// measurement
Eigen::Vector3d y(msg->latitude , msg->longitude, msg->altitude); // lat, long, alt are in local Cartesian
Eigen::Matrix3d R;
R.setZero();
R.diagonal() << msg->position_covariance.at(0), msg->position_covariance.at(4), msg->position_covariance.at(8);
// expectation
Eigen::Matrix<double, 3, 6> J_e_x;
Eigen::Vector3d e = X_.act(Eigen::Vector3d::Zero(), J_e_x); // y = x⋅b, b = 0 to get the translation
Eigen::Matrix<double, 3, 6> H = J_e_x;
// switch the covariance from right to left
Matrix6d LP = X_.adj() * P_ * X_.adj().transpose();
Eigen::Matrix3d E = H * LP * H.transpose();
// innovation
Eigen::Vector3d z = y - e; // innovation
Eigen::Matrix3d Z = E + R; // innovation cov
// Kalman gain
Eigen::Matrix<double, 6, 3> K = LP * H.transpose() * Z.inverse();
// Correction step
manif::SE3Tangentd dx = K * z;
// Update
X_ = X_.lplus(dx);
LP = LP - K * Z * K.transpose();
// switch the covariance from the left to right
P_ = X_.adj().inverse() * LP * X_.adj().inverse().transpose();

The ESKF prediction seems reasonable, but the ESKF correction gives me completely wrong updates. I then changed all of the ESKF correction steps to use the right-plus operation, as shown below. With this adjustment, my second attempt produces correct results.

// ESKF correction
// ...
// expectation
Eigen::Matrix<double, 3, 6> J_e_x;
Eigen::Vector3d e = X_.act(Eigen::Vector3d::Zero(), J_e_x); // y = x⋅b, b = 0 to get the translation
Eigen::Matrix<double, 3, 6> H = J_e_x;
Eigen::Matrix3d E = H * P_ * H.transpose();
// innovation
Eigen::Vector3d z = y - e; // innovation
Eigen::Matrix3d Z = E + R; // innovation cov
// Kalman gain
Eigen::Matrix<double, 6, 3> K = P_ * H.transpose() * Z.inverse();
// Correction step
manif::SE3Tangentd dx = K * z;
// Update
X_ = X_.rplus(dx);
P_ = P_ - K * Z * K.transpose();

So, here are my questions:
Thank you for any insights, and I am happy to provide additional details if needed.
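For context, the "converted into local Cartesian coordinates" step mentioned above looks roughly like the sketch below. This is only an illustration of one common way to do the conversion, using GeographicLib's LocalCartesian; the library choice and the origin variables lat0/lon0/alt0 are assumptions made for the sketch, not something the filter code above depends on.

#include <GeographicLib/Geocentric.hpp>
#include <GeographicLib/LocalCartesian.hpp>
#include <Eigen/Core>

// Anchor the local Cartesian frame at the first GNSS fix (degrees, degrees, meters).
// lat0, lon0, alt0 are placeholder names for that first fix.
GeographicLib::LocalCartesian local_frame(lat0, lon0, alt0,
                                          GeographicLib::Geocentric::WGS84());

// Convert a new fix to local Cartesian coordinates in meters.
// GeographicLib returns ENU components: x = east, y = north, z = up.
double px, py, pz;
local_frame.Forward(msg->latitude, msg->longitude, msg->altitude, px, py, pz);
Eigen::Vector3d y(px, py, pz); // measurement fed to the correction step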
-
Hi Chris, I have never implemented left-invariant filters, so I am a bit unsure. But before that, I noticed you express the measurement in lat-lon-height. Are these numbers in meters? Or are lat and lon in degrees or radians? Because then the comparison you do with the expectation, which seems to be a Cartesian vector, would be wrong. Even if lat and lon are in meters, are you sure they are in the same reference frame as your expectation? I also do not get why you act on the zero vector to compute your expectation. Can you please clarify this point before we dig further into the invariance?
-
Thanks for the clarification. What about lat-lon-altitude (north-east-up) being a left-handed reference? Is this intended? Usually all the references we use in robotics are right-handed, so you should use a right-handed reference such as ENU (east-north-up), which would be lon-lat-altitude.
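To make the suggestion concrete, here is a minimal sketch of the reordering (illustrative only; north, east and up are placeholder names for your local-Cartesian readings in meters):

// Hypothetical local-Cartesian readings in meters.
double north = 0.0, east = 0.0, up = 0.0;

Eigen::Vector3d y_neu(north, east, up); // north-east-up: left-handed ordering
Eigen::Vector3d y_enu(east, north, up); // east-north-up: right-handed ordering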
-
Just found an example of GPS-like measurements for fusion in the kalmanif examples here: https://github.com/artivis/kalmanif/blob/devel/examples/demo_se2.cpp. This is exactly what I was looking for. Thank you for your help!