Abstract
When collecting environmental data with a mobile platform such as a Remotely Operated Vehicle (ROV) or an Autonomous Underwater Vehicle (AUV), the typical end user is not concerned with how accurate the navigation system is, as long as it is "good enough" not to distort the data. How accurate the localization needs to be to accomplish this depends strongly on the resolution of the payload sensor used. For multibeam echosounders, with a grid size measured in meters, this is quite easily achievable, whereas a photomosaic camera, with pixels measured in millimeters, places extreme demands on navigation accuracy. At the same time, the photomosaic camera itself provides highly accurate measurements that can be utilized by the navigation system. If a distinct feature on the sea floor is detected in two separate images taken some time apart, the navigation system can accurately calculate the difference in the localization of the platform between the two points in time. This payload-aided navigation technique is known as Simultaneous Localization and Mapping (SLAM). While a global position offset may still be present in the data, it is now shared by the entire data set, and the all-important relative position between the data measurements becomes extremely accurate. This paper uses a SUB-Fighter 7500 ROV as a demonstration platform for photomosaic-aided navigation, both through simulations and through field experiments in Trondheim Harbour. The simulations examine the potential benefit of optically aided navigation of underwater vehicles with various inertial sensor accuracies and external position update frequencies.
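The relative-displacement idea described above can be sketched as a simple geometric calculation. The following is a minimal illustration, not the paper's actual method: it assumes a flat seabed, a nadir-pointing pinhole camera, known altitude, and no rotation between exposures; the function name and parameters are hypothetical.

```python
# Hypothetical sketch: estimating platform displacement from one seabed
# feature observed in two downward-looking images (flat-seabed pinhole model).

def displacement_from_feature(px1, px2, altitude_m, focal_px):
    """Given pixel coordinates (u, v) of the same seabed feature in two
    images, the camera altitude above the seabed in meters, and the focal
    length in pixels, return the (dx, dy) platform displacement in meters.

    Assumes a flat seabed, a nadir-pointing camera, and no rotation
    between the two exposures -- a full SLAM pipeline would estimate
    rotation and use many features with outlier rejection.
    """
    scale = altitude_m / focal_px        # meters per pixel at the seabed
    du = px2[0] - px1[0]
    dv = px2[1] - px1[1]
    # The feature moves opposite to the platform in the image frame.
    return (-du * scale, -dv * scale)

# Example: the feature shifts 120 px to the right between images,
# at 2 m altitude with an 800 px focal length.
dx, dy = displacement_from_feature((400, 300), (520, 300), 2.0, 800.0)
print(dx, dy)  # platform moved 0.3 m in the negative-u direction
```

In a real photomosaic SLAM system, many such feature correspondences would be combined in a filter or graph optimizer, so a shared global offset remains while relative positions become highly consistent, exactly as the abstract describes.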