Passive Acoustic Localization and Tracking with Mobile Robots


Date

2021


Abstract

Acoustic sensing has received considerable attention in the underwater domain, where it is often the only form of sensing available. As the computational capabilities of robotic platforms have grown, these platforms can now make decisions and navigate autonomously, without human intervention. This dissertation proposes and demonstrates acoustic sensing onboard mobile robotic platforms, both for passive bearing-only tracking of surface vessels on the water and for detecting nearby obstacles with aerial systems.

First, we consider the problem of target tracking with a bearing-only sensor in the presence of merged measurements. Assuming the number of targets in the domain is known, we incorporate a merged measurement model into a nonlinear joint probabilistic data association filter (JPDAF) and demonstrate the ability to track multiple targets through merging events. Furthermore, we propose a novel planning algorithm that incorporates the merged measurement model into the planning process. The result is a planned trajectory biased away from regions where targets would merge in the measurement space, since merging leads to higher uncertainty in the target state estimates. We present experimental results with unmanned ground vehicles equipped with camera sensors acting as surrogates for a bearing-only passive sonar sensor.
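The merging behavior described above can be illustrated with a toy bearing sensor: when two targets fall within the sensor's angular resolution, their returns collapse into a single detection. The sketch below (the function names, the resolution value, and the simple averaging rule are illustrative assumptions, not the dissertation's model) shows the idea:

```python
import numpy as np

def bearings(sensor, targets):
    """Bearing (radians) from the sensor position to each target."""
    return np.array([np.arctan2(ty - sensor[1], tx - sensor[0])
                     for tx, ty in targets])

def merge_measurements(thetas, resolution):
    """Toy merged-measurement model: bearings closer than the sensor's
    angular resolution collapse into one averaged detection."""
    thetas = np.sort(thetas)
    merged = [thetas[0]]
    for th in thetas[1:]:
        if th - merged[-1] < resolution:
            merged[-1] = (merged[-1] + th) / 2.0  # single combined return
        else:
            merged.append(th)
    return merged
```

With two targets at nearly the same bearing and one well separated, the model yields two detections instead of three, which is exactly the situation the planner is biased to avoid.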

Next, we consider the problem of bearing-only tracking of multiple targets using a port-starboard ambiguous sensor, the type of sensor used onboard our autonomous underwater vehicles (AUVs). We address the problem of resolving the ambiguity with a likelihood ratio detection and tracking (LRDT) method, which serves as a front-end detector that initializes tracks and hands them off to a tracking algorithm. We show that as long as the ambiguity is resolved, the JPDAF algorithm can track targets even with ambiguous measurements. We run our detector-tracker system on a dataset collected in Boston Harbor in August 2018, show that it functions effectively, and discuss improvements that were in progress at the time of writing.
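The port-starboard ambiguity can be stated compactly: a line array senses only the bearing relative to its own axis, so each contact produces two world-frame hypotheses mirrored about the vehicle heading, and a heading change moves the mirrored "ghost" while the true bearing stays fixed. A minimal sketch of that geometry (an illustrative planar model, not the LRDT formulation itself):

```python
def ambiguous_pair(true_bearing, heading):
    """Return the two world-frame bearing hypotheses produced by a
    port-starboard ambiguous line array (illustrative planar model):
    the true bearing and its mirror image about the vehicle heading."""
    rel = true_bearing - heading        # bearing relative to the array axis
    return true_bearing, heading - rel  # (true hypothesis, ghost hypothesis)

# After a maneuver the ghost hypothesis moves while the true one does not,
# which is the cue a detector can exploit to resolve the ambiguity.
```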

We also explore acoustic sensing on aerial vehicles using the self-generated noise of the vehicle's normal operation. We first propose an algorithm to actively control the distance between a motor propeller system (MPS) and a large obstacle using data from a single microphone. By first recording and storing the free-field response of the MPS, we show that subtracting the power spectrum of the free-field response from the power spectrum recorded when a wall is present reveals a broadband interference pattern. The dominant oscillation frequency of this interference pattern is linearly related to the distance from the microphone to the wall. By performing a fast Fourier transform on the difference between the spectra, we can extract this distance and actively control it in real time. We present a test rig demonstrating the algorithm experimentally.
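As a concrete illustration of the single-microphone idea: a reflector at distance d imposes a ripple of period c/(2d) on the spectrum, so a Fourier transform over the frequency axis of the spectral difference peaks at the round-trip delay 2d/c. A minimal sketch (the function name, bin spacing, and synthetic spectra are illustrative assumptions, not the dissertation's implementation):

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def estimate_wall_distance(free_field_psd, wall_psd, freq_resolution):
    """Estimate microphone-to-wall distance from the ripple in the
    spectral difference (illustrative sketch of the single-mic method)."""
    diff = wall_psd - free_field_psd      # broadband interference pattern
    diff = diff - diff.mean()             # remove DC so the ripple dominates
    cep = np.abs(np.fft.rfft(diff))       # FFT over the frequency axis
    quefrency = np.fft.rfftfreq(len(diff), d=freq_resolution)  # seconds
    k = np.argmax(cep[1:]) + 1            # skip the zero bin
    return quefrency[k] * C / 2.0         # round-trip delay -> distance

# synthetic check: a wall 1 m away imposes a cos(2*pi*f*2d/C) ripple
freqs = np.arange(0, 8000, 10.0)          # spectrum bins, 10 Hz apart
free = np.ones_like(freqs)
wall = free + 0.5 * np.cos(2 * np.pi * freqs * 2 * 1.0 / C)
```

On this synthetic pair, `estimate_wall_distance(free, wall, 10.0)` recovers approximately 1 m, limited by the quefrency bin spacing.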

Finally, we improve the aerial acoustic sensing system by adding a second microphone. We develop a novel cross-correlation processing algorithm that extracts the distance from the microphones to the wall without relying on the free-field response of the MPS. We demonstrate the algorithm experimentally by controlling the altitude of a blimp-like vehicle using only its self-generated noise and two microphones placed on the bottom of the vehicle.
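The two-microphone method depends on the vehicle's geometry, but the core operation — locating an echo as a secondary correlation peak and converting its delay to a range — can be sketched in a simplified single-channel form (the reflection coefficient, sample rate, and minimum-delay guard below are illustrative assumptions, not the dissertation's parameters):

```python
import numpy as np

C = 343.0  # speed of sound in air, m/s

def echo_distance(signal, fs, min_delay_s=1e-3):
    """Estimate the distance to a reflector from the echo delay, found as
    a secondary autocorrelation peak. A simplified single-channel analogue
    of the two-microphone cross-correlation method."""
    ac = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    start = int(min_delay_s * fs)         # skip the dominant zero-lag peak
    lag = start + np.argmax(ac[start:])
    return (lag / fs) * C / 2.0           # round-trip delay -> one-way range

# synthetic check: broadband "rotor" noise plus an attenuated ground echo
rng = np.random.default_rng(0)
fs = 8000
noise = rng.standard_normal(4000)         # 0.5 s of broadband noise
delay = int(round(2 * 2.0 / C * fs))      # echo of a reflector 2 m away
mic = noise.copy()
mic[delay:] += 0.4 * noise[:-delay]       # delayed, attenuated reflection
```

On this synthetic signal, `echo_distance(mic, fs)` recovers approximately 2 m, limited by the one-sample lag resolution.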


Citation

Calkins, William Lucas (2021). Passive Acoustic Localization and Tracking with Mobile Robots. Dissertation, Duke University. Retrieved from https://hdl.handle.net/10161/23080.

Duke's student scholarship is made available to the public under a Creative Commons Attribution-NonCommercial-NoDerivatives (CC BY-NC-ND) license.