
Aaron Marburg

Principal Electrical/Computer Engineer

Email

amarburg@uw.edu

Phone

206-685-8461

Biosketch

Dr. Marburg's research centers on the development of robotic platforms for ocean exploration and science, with an emphasis on perception, situational awareness, and mission planning. He also has a background in remote sensing, photogrammetry, and precision navigation, and a strong interest in human–machine interfaces and in data and metadata management. He has more than 15 years of experience in electrical and software design for robotics, scientific instrumentation, and high-performance computing. Dr. Marburg joined APL-UW as a SEED postdoctoral researcher in 2015 after completing his Ph.D. at the University of Canterbury in Christchurch, New Zealand.

Department Affiliation

Ocean Engineering

Education

B.S. Engineering, Swarthmore College, 1998

M.S. Aeronautical & Astronautical Engineering, Stanford University, 2004

Ph.D. Electrical & Computer Engineering, University of Canterbury, 2015

Publications

2000–present and while at APL-UW

Extrinsic calibration between an optical camera and an imaging sonar

Lindzey, L., and A. Marburg, "Extrinsic calibration between an optical camera and an imaging sonar," in Proc., OCEANS 2021, San Diego, CA, 20–23 September 2021, doi:10.23919/OCEANS44145.2021.9705956 (IEEE, 2022).


In this paper, we present an open-source tool for calculating the extrinsic calibration between an optical camera and an imaging sonar. Precise determination of the relative 3D location of the two sensors is a prerequisite to combining their data into a unified representation of the scene. Optical cameras are commonly used in many robotic domains due to their rich data, low price point, and small form factor; however, optical imagery and the resulting reconstructions degrade quickly in the presence of turbidity and marine snow. Conversely, imaging sonars resolve feature locations in range and azimuth and are largely unaffected by turbidity, but they have an inherent elevation ambiguity due to their wide vertical beam pattern. The complementary strengths and weaknesses of these sensors make it appealing to combine them when developing a perception system for underwater reconstruction and manipulation tasks. As a first step, an extrinsic calibration describing the relative position of the two sensors must be determined before any data integration is possible. This paper shows how a readily available calibration target and simple hardware can be combined into a target with both optical and acoustic features, and introduces an open-source toolbox for computing the extrinsic calibration.
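Once computed, such an extrinsic is typically applied as a rigid transform when fusing the two sensor streams. The minimal Python sketch below is an illustration of that step, not code from the paper or its toolbox: the names T_cam_sonar and K are assumptions, and the sonar point's elevation ambiguity is taken as already resolved.

# Hypothetical sketch: applying a camera-from-sonar extrinsic to project
# a sonar detection into the optical image. T_cam_sonar (4x4) and the
# camera intrinsic matrix K are assumed inputs, not names from the paper.
import numpy as np

def project_sonar_point(p_sonar, T_cam_sonar, K):
    """Transform a 3D point from the sonar frame to camera pixel coords.

    p_sonar     : (3,) point in the sonar frame, metres
                  (elevation ambiguity assumed resolved)
    T_cam_sonar : (4,4) homogeneous extrinsic -- the calibration output
    K           : (3,3) camera intrinsic matrix
    """
    p_h = np.append(p_sonar, 1.0)       # homogeneous coordinates
    p_cam = (T_cam_sonar @ p_h)[:3]     # point expressed in the camera frame
    uv = K @ (p_cam / p_cam[2])         # pinhole perspective projection
    return uv[:2]

# Example: a target 2 m in front of the sensors, with an identity extrinsic
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
print(project_sonar_point(np.array([0.0, 0.0, 2.0]), np.eye(4), K))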

Report of the Resident AUV Workshop, 9–11 May 2018

Delaney, J.B., D.A. Manalang, A. Marburg, A. Nawaz, and K. Daly, "Report of the Resident AUV Workshop, 9–11 May 2018," Technical Report APL-UW TR 1901, Applied Physics Laboratory, University of Washington, Seattle, 84 pp.


Workshop participants divided into focus groups to consider resident autonomous undersea vehicle (R-AUV) use cases in four application areas: mid-ocean ridges and the overlying water column; gas hydrates and coastal oceans; polar, under-ice, and off-planet oceans; and maintenance and operation of installations.

The following technical elements emerged as clear common themes across R-AUV deployment scenarios: power and data management sub-systems, communications, navigation, capable sensor and payload systems, and advanced autonomy functions. The single most important conclusion of the workshop is that incremental technological steps toward realizing routine R-AUV operations could yield revolutionary scientific and operational value.

Cloud-accelerated analysis of subsea high-definition camera data

Marburg, A., T.J. Crone, and F. Knuth, "Cloud-accelerated analysis of subsea high-definition camera data," in Proc., OCEANS 2017, Anchorage, AK, 18–21 September 2017, 6 pp. (IEEE, 2017).


The seafloor high-definition camera (CamHD) installed on the Ocean Observatories Initiative (OOI) Cabled Array (CA) provides real-time video of the Mushroom vent at the ASHES hydrothermal field in the Axial Volcano caldera on the Juan de Fuca spreading zone. CamHD performs a pre-programmed 13-minute motion sequence every 3 hours. The video captured during each sequence is stored as a 13 GB HD video file in the OOI Cyber-Infrastructure (CI) at Rutgers University. As of July 2017 there are approximately 6700 videos in the CI, all publicly accessible through a conventional HTTP interface. Unfortunately, it is impractical for a researcher (and taxing on the CI bandwidth) to download, store, and process the full extent of the video archive for analysis. We describe two elements of our efforts to accelerate CamHD video analysis: a cloud-hosted application that provides a simplified interface for extracting individual frames from CamHD videos in a time- and bandwidth-efficient manner, and a tool for the automatic isolation and identification of video subsets showing a sequence of known camera positions. Automatic identification of these video segments enables rapid and automatic development of, e.g., time-lapse videos.
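As a rough illustration of the bandwidth-efficient idea: when a server supports HTTP byte-range requests, a tool such as ffmpeg can seek within a remote file and decode only the frames it needs, rather than downloading the full 13 GB video. The Python sketch below is an assumption-laden illustration, not the paper's cloud application; the URL and timestamp are invented.

# Minimal sketch, assuming the hosting server honors HTTP byte-range
# requests and the video's index is reachable by ffmpeg. The URL is
# hypothetical, not an actual OOI CI endpoint.
import subprocess

VIDEO_URL = "https://example.org/camhd/CAMHDA301-20170701T000000.mov"  # hypothetical

def grab_frame(url, timestamp_s, out_png):
    """Extract the frame at timestamp_s (seconds) to a PNG file."""
    subprocess.run(
        ["ffmpeg", "-y",
         "-ss", str(timestamp_s),  # seek before decoding; avoids reading the whole file
         "-i", url,
         "-frames:v", "1",         # decode a single frame
         out_png],
        check=True)

grab_frame(VIDEO_URL, 300.0, "frame_0300.png")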

