[1] KLEIN G, MURRAY D W. Parallel tracking and mapping for small AR workspaces[C]//Proceedings of the 6th IEEE/ACM International Symposium on Mixed and Augmented Reality, Nara, Nov 13-16, 2007. Washington: IEEE Computer Society, 2007: 225-234.
[2] MOURIKIS A I, ROUMELIOTIS S I. A multi-state cons-traint Kalman filter for vision-aided inertial navigation[C]// Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Apr 10-14, 2007. Pisca-taway: IEEE, 2007: 3565-3572.
[3] MUR-ARTAL R, MONTIEL J M M, TARDOS J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J]. IEEE Transactions on Robotics, 2015, 31(5): 1147-1163.
[4] QIN T, SHEN S J. Robust initialization of monocular visual-inertial estimation on aerial robots[C]//Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Ro-bots & Systems, Vancouver, Sep 24-28, 2017. Piscataway: IEEE, 2017: 4225-4232.
[5] LEUTENEGGER S, FURGALE P T, RABAUD V, et al. Keyframe-based visual-inertial SLAM using nonlinear opti-mization[C]//Proceedings of the Robotics: Science and Systems IX, Berlin, Jun 24-28, 2013: 1-8.
[6] JONES E S, SOATTO S. Visual-inertial navigation, mapping and localization: a scalable real-time causal approach[J]. The International Journal of Robotics Research, 2011, 30(4): 407-430.
[7] LEUTENEGGER S, LYNEN S, FURGALE P, et al. Keyframe-based visual-inertial odometry using nonlinear optimization[J]. International Journal of Robotics Research, 2015, 34:314-334.
[8] GEORGE M, TARDIF J P, KELLY A. Visual and inertial odometry for a disaster recovery humanoid[M]//Field and Service Robotics. Berlin, Heidelberg: Springer, 2015: 501-514.
[9] WU K J, AHMED A, GEORGIOU G A, et al. A square root inverse filter for efficient vision-aided inertial navigation on mobile devices[C]//Proceedings of the Robotics: Science and Systems XI, Rome, Jul 13-17, 2015. Cambridge: MIT Press, 2015: 1-9.
[10] BL?SCH M, OMARI S, HUTTER M, et al. Robust visual inertial odometry using a direct EKF-based approach[C]// Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots & Systems, Hamburg, Sep 28-Oct 2, 2015. Piscataway: IEEE, 2015: 298-304.
[11] STRASDAT H, MONTIEL J M M, DAVISON A J. Real-time monocular SLAM: why filter?[C]//Proceedings of the 2010 International Conference on Robotics & Automation, Anchorage, May 3-7, 2010. Piscataway: IEEE, 2010: 2657-2664.
[12] HESCH J A, KOTTAS D G, BOWMAN S L, et al. Con-sistency analysis and improvement of vision-aided inertial navigation[J]. IEEE Transactions on Robotics, 2017, 30(1): 158-176.
[13] SHEN S J, MICHAEL N, KUMAR V. Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs[C]//Proceedings of the 2015 IEEE Inter-national Conference on Robotics & Automation, Seattle, May 26-30, 2015. Piscataway: IEEE, 2015: 5303-5310.
[14] YANG Z F, SHEN S J. Monocular visual-inertial state esti-mation with online initialization and camera-IMU extrinsic calibration[J]. IEEE Transactions on Automation Science & Engineering, 2017, 14(1): 39-51.
[15] GUERRA E, MUNGUIA R, GRAU A. Monocular SLAM for autonomous robots with enhanced features Initialization[J]. Sensors, 2014, 14(4): 6317-6337.
[16] KNEIP L, WEISS S, SIEGWART R. Deterministic initiali-zation of metric state estimation filters for loosely-coupled monocular vision-inertial systems[C]//Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots & Systems, San Francisco, Sep 25-30, 2011. Pisca-taway: IEEE, 2011: 2235-2241.
[17] WEISS S, BROCKERS R, ALBREKTSEN S, et al. Inertial optical flow for throw-and-go micro air vehicles[C]//Proc-eedings of the 2015 IEEE Winter Conference on Applications of Computer Vision, Waikoloa, Jan 5-9, 2015. Washington: IEEE Computer Society, 2015: 262-269.
[18] FAESSLER M, FONTANA F, FORSTER C, et al. Automatic re-initialization and failure recovery for aggressive flight with a monocular vision-based quadrotor[C]//Proceedings of the 2015 IEEE International Conference on Robotics & Automation, Seattle, May 26-30, 2015. Piscataway: IEEE, 2015: 1722-1729.
[19] YANG Z, SHEN S. Monocular visual-inertial fusion with online initialization and camera-IMU calibration[C]//Pro-ceedings of the 2015 IEEE International Symposium on Safety, Security, and Rescue Robotics. Piscataway: IEEE, 2015.
[20] MARTINELLI A. Closed-form solution of visual-inertial structure from motion[J]. International Journal of Computer Vision, 2014, 106(2): 138-152.
[21] LI P L, QIN T, HU B T, et al. Monocular visual-inertial state estimation for mobile augmented reality[C]//Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality, Nantes, Oct 9-13, 2017. Washington:IEEE Computer Society, 2017: 11-21.
[22] MUR-ARTAL R, TARDóS J D. Visual-inertial monocular SLAM with map reuse[J]. IEEE Robotics & Automation Letters, 2016, 2(2): 796-803.
[23] FORSTER C, CARLONE L, DELLAERT F, et al. On-manifold preintegration for real-time visual-inertial odo-metry[J]. IEEE Transactions on Robotics, 2017, 33(1): 1-21.
[24] BERCZI L P, BARFOOT T D. State estimation for robotics[M]. New York: Cambridge University Press, 2017.
[25] CHOU J C K. Quaternion kinematic and dynamic differen-tial equations[J]. IEEE Transactions on Robotics and Auto-mation, 1992, 8(1): 53-64.
[26] LUPTON T, SUKKARIEH S. Visual-inertial-aided naviga-tion for high-dynamic motion in built environments without initial conditions[J]. IEEE Transactions on Robotics, 2012, 28(1): 61-76.
[27] CARLONE L, KARAMAN S. Attention and anticipation in fast visual-inertial navigation[C]//Proceedings of the 2017 IEEE International Conference on Robotics and Automa-tion, Singapore, May 29-Jun 3, 2017. Piscataway: IEEE, 2017: 3886-3893.
[28] JOSHI S, BOYD S P. Sensor selection via convex optimi-zation[J]. IEEE Transactions on Signal Processing, 2009, 57(2): 451-462.
[29] BURRI M, NIKOLIC J, GOHL P, et al. The EuRoC micro aerial vehicle datasets[J]. International Journal of Robotics Research, 2016, 35(10): 1157-1163.
[30] AGARWAL S, MIERLE K. Ceres solver[EB/OL]. (2017-11-14) [2020-03-10]. http://ceres-solver.org. |