GPS and IMU Replacement: Using Stereovision in Small Unmanned Aerial Vehicles
GPS is not effective indoors, and measuring systems like LiDAR are too big to be used on some small Unmanned Aerial Systems (sUAS), according to the developers of the Centeye (Barrows, 2016, 0:01-0:28). The idea behind the Centeye is to use two lightweight cameras for stereo vision on each of the four sides of a quadcopter, creating a full 360º field of view within which measurements of objects can be made (Centeye, n.d.a). The cameras used in the vision system are less than 1 cm (0.4 in) long and weigh less than 0.2 grams (0.00044 lb), so they can be installed on nano quadcopters weighing in at 38 grams (0.084 lb) (Centeye, n.d.a). While the small-format CMOS sensor (the Centeye RockCreek™ vision chip) is the heart of the system, the complete sensor is made up of two of the trademark light sensors fitted with 150º field-of-view lenses, a laser, and two infrared LEDs, providing stereovision, optical flow, pulsed light, and laser ranging (Barrows, 2016, 0:30-1:00; Centeye, n.d.a). The Centeye sensor can be mounted in four locations to offer full surround vision.
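Stereo ranging of this kind rests on a simple geometric relationship: the distance to an object is inversely proportional to its disparity, the pixel shift between the two camera views. A minimal sketch of that relation in Python (the focal length, baseline, and disparity below are made-up illustrative numbers, not Centeye specifications):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Classic pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical numbers for illustration only (not Centeye specs):
# 160 px focal length, 6 cm baseline, object shifted 8 px between views.
print(depth_from_disparity(focal_px=160, baseline_m=0.06, disparity_px=8))  # 1.2 m
```

The short baseline available on a tiny airframe is what limits range resolution, which may be why Centeye pairs the stereo cameras with a laser ranger.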
GPS uses multiple satellite signals to triangulate a location by solving for the time it takes to receive the different signals, while Inertial Measurement Units (IMUs) track the accelerations and rotations of a vehicle so that, starting from a known initial location, a final position can be determined (Liu et al., 2017). These methods have their flaws: GPS does not work indoors or in small spaces, and an IMU's dead-reckoned position drifts over time, as small sensor errors and unmeasured effects such as air motion accumulate into growing position errors.
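To see why dead reckoning drifts, consider double-integrating accelerometer readings that carry a small constant bias: the position error grows with the square of time. A toy simulation (the sample rate, bias, and duration are assumed numbers for illustration):

```python
import numpy as np

dt = 0.01                          # assumed 100 Hz IMU sample rate
t = np.arange(0, 60, dt)           # one minute of flight
true_accel = np.zeros_like(t)      # vehicle actually hovering (zero accel)
bias = 0.02                        # assumed 0.02 m/s^2 accelerometer bias
measured = true_accel + bias

# Dead reckoning: integrate acceleration twice to get position.
velocity = np.cumsum(measured) * dt
position = np.cumsum(velocity) * dt

# Error grows roughly as 0.5 * bias * t^2: about 36 m after 60 s.
print(f"position error after 60 s: {position[-1]:.1f} m")
```

A few minutes of that growth renders a raw IMU estimate useless without an external fix such as GPS or vision.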
Instead of relying on satellite signals to triangulate a location, or on tracked movements of the aircraft, Visual Odometry, often extended into Visual SLAM (V-SLAM), uses live changes in the pictures obtained by the cameras to calculate where the aircraft has travelled (Liu et al., 2017). This method compares the current picture with the previous one, locating reference points with which to track the position of the vehicle. The obstacle is manufacturing a sensor that is small enough to fit in a small UAV yet good enough to operate on low power while still obtaining enough visual information. Dr. Lee of the University of Maryland presented in his dissertation a CMOS sensor capable of enabling this type of measurement, adding visual positioning to sUAS (Lee, 2016, pp. 84-100).
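The frame-to-frame comparison described by Liu et al. (2017) is commonly implemented by detecting features in consecutive images, matching them, and recovering the camera motion with a RANSAC-based estimator. A minimal sketch using OpenCV (the library choice and the camera matrix K are my assumptions for illustration, not what Centeye runs on its chips):

```python
import cv2
import numpy as np

def relative_pose(prev_gray, curr_gray, K):
    """Estimate camera rotation and translation between two grayscale frames."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)

    # Match descriptors and keep the strongest correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # RANSAC rejects mismatched reference points, the step Liu et al. improve on.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t is only known up to scale from a single camera
```

With a stereo pair, the known baseline between the two cameras resolves the scale ambiguity in t, which is precisely what the two-camera layout buys over a single lens.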
There is not a lot of information available about the technology of the Centeye sensor, or a price, as the company's model seems to be geared toward fitting custom solutions. The Centeye sensor combines IR LEDs for low-light conditions, a laser to measure distance, and stereo vision for V-SLAM. The sensor continuously compares the current picture with the previous one to obtain the information that allows it to hover in place, keep track of its location, and identify obstacles (Centeye, n.d.b).
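Hover-in-place from that picture-to-picture comparison is essentially optical flow: if the average image motion across frames is non-zero, the craft is drifting and should command a velocity that cancels it. A rough sketch of the idea (OpenCV's Farneback flow, the gain, and the sign convention are illustrative assumptions on my part):

```python
import cv2

def hover_correction(prev_gray, curr_gray, gain=0.5):
    """Map average optical flow to a drift-cancelling velocity command."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    mean_flow = flow.reshape(-1, 2).mean(axis=0)  # average (x, y) shift, px/frame
    # Non-zero average flow means the craft is drifting; command the opposite.
    # The sign depends on how the camera is mounted; assumed here.
    return -gain * mean_flow
```

Centeye's on-chip implementation is surely far leaner than this, but the control idea, driving the observed flow to zero, is the same.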
The Centeye solution enables the user to give high-level instructions, such as a general direction of travel, while the system uses its cameras to determine a path clear of objects and avoid hitting blank walls. There is no mention of how it fares with clear windows; perhaps the IR LEDs can help with this. As this type of technology matures it will make its way into commercial sUAS, and because the Centeye sensor is so small it can be fitted to tiny vehicles, it will expand to other unmanned platforms, enabling them to operate indoors and in narrow spaces.
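One way such clear-path guidance could work (my illustrative reading, not a documented Centeye algorithm) is to score candidate headings by the free distance the depth data reports and steer toward the clearest heading nearest the commanded direction:

```python
import numpy as np

def pick_heading(desired_deg, headings_deg, free_dist_m, min_clear_m=2.0):
    """Choose the clear heading closest to the commanded direction."""
    headings_deg = np.asarray(headings_deg, dtype=float)
    free_dist_m = np.asarray(free_dist_m, dtype=float)
    clear = free_dist_m >= min_clear_m   # headings with enough open space
    if not clear.any():
        return None                      # boxed in: stop and hover
    candidates = headings_deg[clear]
    return candidates[np.argmin(np.abs(candidates - desired_deg))]

# Hypothetical sweep from the four stereo sensors: commanded heading 0º,
# but the path straight ahead is blocked, so the craft detours to 45º.
print(pick_heading(0, [-90, -45, 0, 45, 90], [5.0, 1.5, 1.2, 4.0, 6.0]))  # 45.0
```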
Figure: Centeye fitted on a Crazyflie platform, total weight 38 grams.
Figure: Complete Centeye sensor.
References
Barrows, G. (2016, November 25). Centeye nano drone with obstacle avoidance November 2016 [Video file]. Retrieved from https://www.youtube.com/watch?v=YTi8bjbZJ4s
Centeye. (n.d.a). Solution for GPS-denied near-Earth autonomy. Retrieved from http://www.centeye.com/small-nano-uas-autonomy/
Centeye. (n.d.b). Vision-based hover in place. Retrieved from http://www.centeye.com/technology/vision-based-hover-in-place/
Lee, T.-H. (2016). Enabling hardware technologies for autonomy in tiny robots: Control, integration, actuation (Doctoral dissertation). University of Maryland, College Park. Retrieved from https://search-proquest-com.ezproxy.libproxy.db.erau.edu/docview/1814236732/abstract/268A16FED1D044A7PQ/1?accountid=27203
Liu, Y., Gu, Y., Li, J., & Zhang, X. (2017, October 13). Robust stereo visual odometry using improved RANSAC-based methods for mobile robot localization. Sensors, 17(10). Retrieved from https://search-proquest-com.ezproxy.libproxy.db.erau.edu/docview/1965671001?pq-origsite=summon
