Timestamp jitter #52
Hi, I observed a similar behaviour (continual deviation) a while ago on a serial-connected IMU (third generation). It was configured to yield more data than the baudrate of the line could carry and, as a consequence, the time at which each message was received drifted later and later until the IMU dropped a message (as shown by the sample counter). I'm not sure what we did about it: either take it into account, somehow reduce the number of messages sent, or change the configuration so that it fits. The warning that changing the baudrate might not work dates back to this issue, however.
For the record, what was the exact configuration of the IMU for which you observed this behaviour? Did you try reducing the frequencies to see whether it makes a difference?
You are right to suggest that, when available, we should use the timestamp of the IMU rather than the timestamp of the computer on the receiving end. The slight issue with that is two-fold:
I hope this helps answer some of your questions.
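To make the bandwidth mismatch above concrete, here is a back-of-the-envelope check. The message size and baudrate are hypothetical examples, not the reporter's actual configuration:

```python
# Rough arithmetic for the serial-bandwidth problem described above.
# The numbers below are illustrative, not a measured configuration.
msg_bytes = 100      # bytes per data packet (configuration-dependent)
rate_hz = 200        # output data rate
bits_per_byte = 10   # 8 data bits + start + stop bit
baudrate = 115200

required = msg_bytes * rate_hz * bits_per_byte
print('required: %d baud, available: %d baud' % (required, baudrate))
# required (200000) > available (115200): messages queue up in the
# IMU, receive times drift later and later, and eventually a sample
# is dropped, exactly the pattern visible in the sample counter.
```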
@jpapon, I believe you are actually looking (partially) for one-way timestamp translation, which is typically done by estimating the offset and slope of the hardware timestamps based on the associated receive timestamps. It is not a very hard problem, but it is still some work to implement and test. We are currently working on a library solely for that purpose, designed to be used within (C++) sensor drivers: https://github.com/ethz-asl/cuckoo_time_translator. It also has Python bindings for the underlying algorithms (currently the "convex hull" algorithm and a Kalman filter). @fcolas, I hope you don't mind the little advertisement :).
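To illustrate the idea (this is only a sketch of the principle, not the cuckoo_time_translator API): fit receive_time ≈ skew * device_time + offset over a sliding window, then stamp each sample at the translated device time. A plain least-squares fit is shown for brevity; it is cruder than the convex-hull or Kalman-filter approaches the library implements, since a delayed packet drags a least-squares fit late while a lower-bound fit ignores it.

```python
# Minimal one-way timestamp translation sketch: estimate offset and
# skew between the device clock and the receive clock, then translate
# device timestamps into the receiver's timebase.
from collections import deque

import numpy as np


class SimpleTimeTranslator(object):
    def __init__(self, window=500):
        # (device_time, receive_time) pairs, in seconds
        self.pairs = deque(maxlen=window)

    def update(self, device_time, receive_time):
        self.pairs.append((device_time, receive_time))
        if len(self.pairs) < 2:
            return receive_time  # not enough data yet, fall back
        d, r = np.array(self.pairs).T
        # Least-squares fit r = skew * d + offset over the window.
        skew, offset = np.polyfit(d, r, 1)
        return skew * device_time + offset


# Usage in a driver loop (receive_time would be rospy.Time.now()):
# stamp = translator.update(imu_hw_time, receive_time)
```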
@jpapon I've pushed a working branch with additional tools to help look into this issue. Namely, the node publishes the time it took to get the measurement from the IMU through the driver. (Note that I'm doing these tests in a virtual machine, but it is still using NTP.)
@fcolas Interesting, I'll try it out when I have some time. I actually wound up throwing together my own C++ node using the xcommunication library, and that eliminated the problem for the most part. There was still some small amount of jitter, but I suspect that was due to us also reading from five RealSense cameras over USB.
@jpapon Is the minimal C++ driver you mentioned available somewhere? I'm running ethzasl_xsens_driver on a very low-power Atom and it's taking ~40% CPU, which is why I'd be interested in a more lightweight option.
So the new 2019 SDK by Xsens includes a ROS C++ driver implementation. It has a few peculiarities (such as looking up ROS params on the parameter server in sensor data callbacks, leading to huge CPU consumption by the ROS master). After fixing those, I integrated https://github.com/ethz-asl/cuckoo_time_translator as suggested by @HannesSommer. It seems to work very well: the timestamp jitter is eliminated completely.
[plot: jitter with data stamped via ros::Time::now() on receive, IMU transmitting at 50 Hz (blue)]
[plot: jitter with the cuckoo time translator (blue)]
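For anyone hitting the same CPU issue: the costly pattern is the per-sample parameter lookup, since every lookup is an XML-RPC round trip to the ROS master. A sketch in rospy for brevity (the actual 2019 Xsens driver is C++, and the node, topic, and parameter names here are made up):

```python
import rospy
from sensor_msgs.msg import Imu

# Anti-pattern: parameter lookup inside the sensor callback. At IMU
# rates this hammers the ROS master with one round trip per sample.
def callback_slow(msg):
    msg.header.frame_id = rospy.get_param('~frame_id', 'imu')

# Fix: read the parameter once at startup and close over the value.
def make_callback(frame_id):
    def callback_fast(msg):
        msg.header.frame_id = frame_id  # no master round trip
    return callback_fast

if __name__ == '__main__':
    rospy.init_node('param_lookup_example')
    cb = make_callback(rospy.get_param('~frame_id', 'imu'))
    rospy.Subscriber('imu/data_raw', Imu, cb)
    rospy.spin()
```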
Hey,
I've been working with an XSens IMU connected via USB recently, and I'm running into some timestamp jitter issues. Looking through the code, it appears you are using rospy.Time.now() as the message timestamp, i.e. the time the IMU data is received.
On my machine, there is considerable jitter in the received messages. For instance, with the IMU set to report at 200 Hz, if I run:
rostopic hz /imu/data -w 2
I see rates between 190 and 210 Hz reported. Furthermore, if I record a bunch of data and dump out the timestamps to a file, then look at the differences, there is considerable jitter, as well as occasional delays where the next message might not be published for 0.1 seconds.
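For anyone who wants to reproduce this measurement, here is a small sketch that reads the stamps out of a recorded bag and summarizes the inter-message intervals (the bag filename is a placeholder):

```python
import numpy as np
import rosbag

# Collect the message stamps of /imu/data from a recorded bag.
with rosbag.Bag('jitter.bag') as bag:  # placeholder filename
    stamps = np.array([msg.header.stamp.to_sec()
                       for _, msg, _ in bag.read_messages(topics=['/imu/data'])])

dt = np.diff(stamps)
print('mean dt: %.4f s  std: %.4f s  max: %.4f s'
      % (dt.mean(), dt.std(), dt.max()))
# At 200 Hz the mean dt should be ~0.005 s; a large std or max values
# near 0.1 s reproduce the jitter and delays described above.
```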
To try to fix this, I've begun using the timestamps coming out of the IMU and simply calculating an offset from the ROS clock so I can publish in sync with ROS time. I've pushed this to my fork here (this is just test code and would need to be cleaned up for a PR): https://github.com/jpapon/ethzasl_xsens_driver.git
The problem is that, even with this, I'm seeing a continual deviation between the clocks that builds up. I got around this by allowing up to 0.1 seconds of slop, after which the offset resets. In my tests, the clocks drifted apart by about 0.01 seconds per second, which seems quite large to me.
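For reference, a minimal sketch of that workaround (not the exact code in the fork above; the class name and threshold are mine): keep a fixed offset from IMU time to ROS time, and re-anchor it whenever the translated stamp has drifted more than the slop threshold from the wall clock.

```python
import rospy

MAX_SLOP = 0.1  # seconds of accumulated drift before re-anchoring


class OffsetStamper(object):
    def __init__(self):
        self.offset = None  # ros_time - imu_time at the last reset

    def stamp(self, imu_time):
        """Translate an IMU hardware timestamp (seconds) to ROS time."""
        now = rospy.Time.now().to_sec()
        if self.offset is None or abs(imu_time + self.offset - now) > MAX_SLOP:
            self.offset = now - imu_time  # clocks drifted too far: reset
        return rospy.Time.from_sec(imu_time + self.offset)
```

Note that this keeps the stamps smooth between resets but does not model the skew itself; estimating the slope as well (as cuckoo_time_translator does) removes the need for periodic resets.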
My main question is: is this jitter normal, or is it a problem with my USB bus?
Doesn't it make sense to use the XSens time reference, rather than rospy.Time.now()?
Also, any idea whether this is caused by using Python? Has there been any work on a C++ version of an XSens ROS node?
Cheers, and thanks!!