HydrOffice represents a collaborative effort, led by the Center, to develop a research software environment with applications that facilitate all phases of the ping-to-chart process. By minimizing the effort needed to kick-start development and by easing configuration management, this environment facilitates the creation of new tools for Center students and for use in the field; potentially, it also eases the industrialization of some of these tools. The overall goal is thus to speed up both algorithm testing and R2O (Research-to-Operation).
HydrOffice’s wide scope is structured in three research themes:
- Facilitate Data Acquisition
- Automate and Enhance Data Processing
- Improve Hydrographic Products
These themes drive the creation of a collection of hydro-packages, each of which deals with a specific issue in the field.
HydrOffice is released under open licenses and encourages free contribution, and it facilitates development by providing an existing infrastructure and interface. The individual tools within HydrOffice are built as contained, modularized structures, so that they can be easily updated and maintained.
One of the main HydrOffice requirements is ease of extension. This goal is achieved by natively supporting a plugin architecture:
- A base package that provides common boilerplate code.
- Several hydro-packages, each of which ships a few task-specific algorithms and can access common code from the base package. Furthermore, a skeleton package with a base GUI is provided to speed up development and to help keep the focus on the targeted weakness.
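The base-package/hydro-package split described above can be sketched in a few lines. This is only an illustrative outline; the class names, the logging helper, and the placeholder spike criterion are all hypothetical and do not reflect the actual HydrOffice namespaces or algorithms.

```python
class BaseProject:
    """Stands in for the base package: boilerplate shared by all hydro-packages."""

    def __init__(self, name):
        self.name = name
        self.log = []

    def record(self, msg):
        # shared logging lives once, in the base package
        self.log.append("%s: %s" % (self.name, msg))


class FlierFinder(BaseProject):
    """Stands in for a hydro-package: it ships only its task-specific algorithm."""

    def run(self, depths, threshold=5.0):
        # placeholder criterion, purely for illustration
        spikes = [d for d in depths if abs(d) > threshold]
        self.record("flagged %d candidate(s)" % len(spikes))
        return spikes


tool = FlierFinder("flier_finder")
spikes = tool.run([-2.0, -3.0, -40.0])
```

Because the common code is inherited rather than copied, a new hydro-package only needs to implement its own algorithm, which is what makes the architecture easy to extend.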
The current hydro-package structure is shown in Figure 2.
Finally, the individual tools in HydrOffice are usually also provided as “frozen”, standalone, click-and-play solutions that require no installation by the user.
All of the HydrOffice applications are made available within Pydro (an in-house hydrographic environment developed and maintained by NOAA Office of Coast Survey's Hydrographic Systems and Technology Branch), to facilitate delivery to (and feedback from) NOAA users.
Giuseppe Masetti, Brian Calder, Matt Wilson
SSP Manager is an application, based on the HydrOffice SSP library, that bridges the gap between sound speed profilers and multibeam echo-sounders (MBES).
The application supports several data formats as file and network inputs (Castaway, Digibar, Idronaut, Seabird, Sippican, Turo, UNB, MVP). Once the data are successfully imported, the application provides tools and functionality to edit, improve (e.g., by using oceanographic atlases), and extend the collected raw samples. The resulting SSP can then be exported to files or sent directly to hydrographic data acquisition software (e.g., Hypack, Kongsberg SIS, QPS Qinsy, Reson).
SSP Manager provides a mechanism to store the data samples (raw, processed, and transmitted) in a database, so that additional analysis can be applied to the SSPs collected over the whole survey, or so that they can be exported to other well-known geographic data formats for further analysis and visualization.
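The store-then-analyze pattern described above can be sketched with the standard library's `sqlite3` module. The schema here (one row per sample, tagged by profile and processing stage) is an illustrative assumption, not the actual SSP Manager database layout.

```python
import sqlite3

# hypothetical schema: one row per SSP sample, tagged by profile id
# and by processing stage ("raw", "processed", "transmitted")
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE ssp (
    profile_id INTEGER, stage TEXT, depth_m REAL, speed_mps REAL)""")

samples = [(1, "raw", 0.0, 1500.2), (1, "raw", 10.0, 1498.7),
           (1, "processed", 0.0, 1500.2), (1, "processed", 10.0, 1498.9)]
con.executemany("INSERT INTO ssp VALUES (?, ?, ?, ?)", samples)

# survey-wide analysis then becomes a simple query, e.g. the mean
# sound speed across all processed samples
(mean_speed,) = con.execute(
    "SELECT AVG(speed_mps) FROM ssp WHERE stage = 'processed'").fetchone()
```

Keeping raw, processed, and transmitted samples side by side in one table is what makes whole-survey comparisons a single query rather than a file-by-file re-read.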
The initial code was based on SVP Editor, an application developed by Dr. Jonathan Beaudoin while he was at the Center.
SSP Manager is written in Python 2.7, and the current release is 2.1.
➽ SSPManager.2.1.rar [32bit]
Brian Calder, Giuseppe Masetti
The Bathymetric Attributed Grid (BAG) is a hydrographic exchange data format developed and maintained by the ONS-WG (Open Navigation Surface Working Group).
HydrOffice BAG library provides access to BAG-specific features, as well as a collection of tools to verify and manipulate BAG data files.
BAG Explorer is a light application, based on HDF Compass and the HydrOffice BAG library tools, to explore BAG data files.
It provides a mechanism to explore the tree-like structure of a BAG file, to visualize and validate the XML metadata content, to inspect the tracking list, and to plot the elevation and the uncertainty layers.
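Since a BAG file is an HDF5 container, the tree exploration above amounts to walking a hierarchy of groups and datasets. The sketch below mimics that traversal with a plain nested dict standing in for the HDF5 tree; the node names follow the BAG layout, but the real tool accesses them through the HydrOffice BAG library rather than a dict.

```python
# toy stand-in for the HDF5 tree inside a BAG file
bag_tree = {
    "BAG_root": {
        "metadata": "<xml .../>",        # XML metadata dataset
        "tracking_list": [],             # record of edits applied to the grid
        "elevation": [[-10.0, -10.2], [-9.8, -10.1]],
        "uncertainty": [[0.5, 0.6], [0.4, 0.5]],
    }
}

def walk(node, prefix=""):
    """Yield every path in the tree, group and dataset alike."""
    for name, child in node.items():
        path = prefix + name
        yield path
        if isinstance(child, dict):
            # recurse into sub-groups
            yield from walk(child, path + "/")

paths = list(walk(bag_tree))
```

A viewer like BAG Explorer performs essentially this walk, then dispatches each path to a metadata validator, a tracking-list table, or a layer plot as appropriate.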
BAG Explorer is written in Python 2.7, and the current release is 0.1.1.
Advancing multibeam technology allows us to map the seafloor better than ever before, and we have plenty of stunning bathymetry to show for it. Yet the ever-increasing data volume presents challenges both during quality review and during generalization to nautical chart scale. The tools under this research theme aim to identify and address specific inefficiencies in these processes by shifting manually-intensive efforts toward automation, for a faster, easier review. With more timely feedback to field parties, these tools were a catalyst for the newly-implemented Rapid Survey Assessment (for more info, watch this seminar).
Each of the algorithms presented in the following sub-sections has a wide margin for improvement as development continues, and new algorithms will be added to the two existing tools as needed. Ultimately, the goal is to apply high-quality surveys to the chart in a more timely fashion, and how well the SARScan and HCellScan algorithms improve the current workflow is the metric by which these tools will be judged.
Matt Wilson, Brian Calder, Giuseppe Masetti
The overall aim of this tool is to address data quality issues, to reduce review and acceptance times, and ultimately to reduce ping-to-chart times. Furthermore, once one of the developed algorithms is sufficiently mature and effective, existing commercial software vendors might adopt it, with a relatively easy transition based on the existing working implementation. The speed of prototyping, a characteristic of the adopted Python language, also eases the decision to abandon a developed algorithm if it proves ineffective or if a commercially supported implementation becomes available.
The “Flier finder” algorithm is dedicated to one of the major identified problems: the quite common presence of anomalous data (aka “fliers”) in the finalized gridded bathymetry delivered to the hydrographic branches. This is a major concern since, when fliers are found, considerable time and effort are required to remove them: the removal involves re-computation and re-finalization of the grids, which can take several days (or longer) to accomplish, with the additional disadvantage that the output is no longer the authentic field submission. This algorithm helps to detect fliers as early as possible in the quality control process. Its initial implementation scans gridded bathymetry and flags abrupt depth changes per user-set criteria, as shown in Figure 9 (white “lassos” encircle the anomalous grid data). Several algorithm modifications have also been tested (e.g., including the evaluation of additional statistical layers provided by a CUBE DTM).
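The “abrupt depth change” criterion can be sketched with NumPy as a neighbor-difference test on the grid. The 4-connected neighborhood and the 5 m threshold are illustrative assumptions for this sketch, not the actual Flier finder criteria, which are user-set.

```python
import numpy as np

def flag_fliers(grid, threshold=5.0):
    """Return a boolean mask of cells whose depth differs from any
    4-connected neighbor by more than `threshold` meters."""
    flags = np.zeros(grid.shape, dtype=bool)
    # vertical neighbor differences: flag both cells of each bad pair
    dv = np.abs(np.diff(grid, axis=0)) > threshold
    flags[:-1, :] |= dv
    flags[1:, :] |= dv
    # horizontal neighbor differences
    dh = np.abs(np.diff(grid, axis=1)) > threshold
    flags[:, :-1] |= dh
    flags[:, 1:] |= dh
    return flags

# toy grid: a flat -20 m seafloor with one 18 m spike (a likely flier)
grid = np.full((5, 5), -20.0)
grid[2, 2] = -2.0
mask = flag_fliers(grid)
```

Flagging both cells of every offending pair mirrors the “lasso” behavior: the spike and its immediate neighbors are all brought to the reviewer's attention.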
The “VALSOU to grid check” and “feature scan” algorithms focus on the required agreement between gridded bathymetry and submitted feature files, as well as the adherence of those feature files to current specifications. Wrecks, rocks, and obstructions should be appropriately represented in the gridded bathymetry with regard to position and least depth. It is common for the HBs to receive surveys with hundreds (or even thousands) of features that need to be manually checked against the grid to ensure agreement, and also to ensure proper attribution. This process can be a massive time sink and, given its monotonous nature, is perfectly suited for automation. The developed algorithms scan the gridded bathymetry and feature files to ensure this agreement, and to verify that the feature attributes are set per the current version of the NOAA Hydrographic Survey Specifications and Deliverables (HSSD) manual (SARScan tool) or the current NOAA HCell Specifications (HCellScan tool), depending on which phase of the ping-to-chart process the survey is in. An example of the agreement we wish to observe is shown in Figure 10.
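The core of the VALSOU-to-grid agreement check can be sketched as a per-feature depth comparison. The tolerance, the feature layout, and the grid lookup below are illustrative assumptions for this sketch, not the actual SARScan/HCellScan logic.

```python
def check_valsou(features, grid_depth_at, tol=0.1):
    """Return the names of features whose VALSOU (least depth) disagrees
    with the gridded bathymetry at their position by more than `tol` m."""
    mismatches = []
    for feat in features:
        grid_depth = grid_depth_at(feat["x"], feat["y"])
        if abs(feat["valsou"] - grid_depth) > tol:
            mismatches.append(feat["name"])
    return mismatches

# toy data: a wreck whose least depth matches the grid, and a rock
# whose attributed depth disagrees with the grid by almost a meter
features = [
    {"name": "wreck_01", "x": 10, "y": 20, "valsou": 7.3},
    {"name": "rock_07", "x": 30, "y": 40, "valsou": 4.1},
]
fake_grid = {(10, 20): 7.3, (30, 40): 5.0}
bad = check_valsou(features, lambda x, y: fake_grid[(x, y)])
```

Running this over hundreds of features replaces the manual feature-by-feature grid comparison that makes the review such a time sink.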
SARScan is written in Python (both 2.7+ and 3.4+ supported), and the current release is 0.2.1.
Matt Wilson, Brian Calder, Giuseppe Masetti
Even at this early phase, several advantages of adopting the described HydrOffice framework are quite evident: almost complete freedom in development (whereas commercial “off-the-shelf” software has limits stemming from the need to meet several diverse customer requirements); direct customization to NOAA specifications, and even to in-house best practices; and quick delivery to the branches and to the field for testing and for evaluating effectiveness.
The “Sounding scan” algorithm is concerned with the chart compilation process, which involves reducing survey data volumes of hundreds of gigabytes (or even terabytes) to a final product that is generally only a couple of megabytes, applied to a nautical chart such that the display is optimal for the mariner and safety of navigation is upheld to the fullest extent possible. The algorithm automates a long-time best practice at AHB known as the “triangle rule”, in which a TIN is created from the potential chart soundings, and then the dense, survey-scale soundings (from which the chart soundings are a subset) are analyzed and flagged if they are shoaler than the three vertices of the surrounding triangle (Figure 12). The flagged soundings should be considered for selection, or for representation in some other way (perhaps as a feature, or by contour). The same algorithm, created primarily for chart sounding evaluation, can also be useful in the field when performing a comparison between survey soundings and the existing charted soundings, a requirement for the Descriptive Report of Survey. Furthermore, the procedure can highlight potential dangers to navigation that would otherwise be overlooked. The initial implementation of this algorithm will be improved both in computational performance (e.g., in the TIN computation) and in the feature inputs used (e.g., contours and depth areas).
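The triangle rule can be sketched with SciPy's Delaunay triangulation: build a TIN over the candidate chart soundings, locate each dense survey sounding in its containing triangle, and flag it if it is shoaler than all three vertex depths. The data, the threshold logic, and the depths-positive-down convention are illustrative assumptions for this sketch, not the actual Sounding scan implementation.

```python
import numpy as np
from scipy.spatial import Delaunay

# candidate chart soundings: positions and depths (meters, positive down)
chart_xy = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)
chart_z = np.array([20.0, 22.0, 21.0, 23.0])
tin = Delaunay(chart_xy)  # the TIN over the chart soundings

# dense survey-scale soundings to test against the TIN
survey_xy = np.array([[4.0, 3.0], [6.0, 6.0]])
survey_z = np.array([15.0, 25.0])  # the first is shoaler than its triangle

tri = tin.find_simplex(survey_xy)        # containing triangle per sounding
vertex_z = chart_z[tin.simplices[tri]]   # depths at each triangle's 3 vertices
flagged = survey_z < vertex_z.min(axis=1)  # shoaler than all three vertices
```

A sounding that survives this test is already represented by a shoaler neighbor in the TIN; the ones that do not are exactly the candidates for selection, feature capture, or contouring.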
HCellScan is written in Python (both 2.7+ and 3.4+ supported), and the current release is 0.2.1.