Session Overview
Wednesday, May 29
15:30
Demystifying Uncertainty - TPU and You
* Jonathan Beaudoin, HydroOctave Consulting Inc., Canada
Burns Foster, Spatialnetics, Canada

As hydrographic surveyors, it is incredibly important to understand whether our measurements are fit for purpose. One key determinant of that fitness is their accuracy, or their uncertainty, as we now more often hear. It seems, however, that there is nothing more uncertain in surveying than the uncertainty of our measurements. We have mathematical tools that tell us the uncertainty levels we should, in theory, be able to achieve. But, as is often said, “In theory there is no difference between theory and practice. In practice, there is.” Just how do surveyors go about ensuring they achieve the desired accuracy levels in practice? Some of this is learned in the early education phase of a hydrographer’s career, but the “book smarts” learned in school do not instantly become “field smarts”. As a result, much of what we do to manage uncertainty is learned on the job, often in an incoherent manner. Uncertainty is rarely synthesized and discussed as a single topic. How DO you master uncertainty management in hydrographic surveying? It is a hard question to answer, and the answer is not a short one. We cannot possibly answer the entire question in a single paper. Our aim instead is to make surveyors aware that they may need to work harder on this topic to master it, and to point them towards ideas and resources to help them do so. We seek to demystify the difference between achieving accuracy in theory and achieving it in practice. We will lightly cover core concepts such as hydrographic survey standards and their role in managing uncertainty. We will also explore which tools and techniques are at our disposal and how best to use them during the planning, acquisition and processing stages. Lastly, we will aim to dispel some common misconceptions on these various topics.
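The abstract's reference to survey standards can be made concrete with a small worked example. The sketch below is purely illustrative and is not part of the authors' paper: it evaluates the IHO S-44 maximum allowable Total Vertical Uncertainty, TVU_max(d) = sqrt(a^2 + (b*d)^2), at a few depths, using the commonly cited Special Order and Order 1a coefficients.

```python
# Illustrative only (not from the paper): IHO S-44 expresses the maximum
# allowable Total Vertical Uncertainty (TVU) at depth d as
#   TVU_max(d) = sqrt(a^2 + (b * d)^2)
# where a is a fixed component (metres) and b a depth-dependent factor.
# The coefficients below are the commonly cited Special Order / Order 1a values.
import math

S44_ORDERS = {
    "Special Order": (0.25, 0.0075),
    "Order 1a":      (0.50, 0.0130),
}

def tvu_max(depth_m: float, order: str = "Order 1a") -> float:
    """Maximum allowable TVU (95% confidence level) at a given depth, in metres."""
    a, b = S44_ORDERS[order]
    return math.sqrt(a ** 2 + (b * depth_m) ** 2)

if __name__ == "__main__":
    for d in (10.0, 40.0, 100.0):
        print(f"{d:5.0f} m depth: Special {tvu_max(d, 'Special Order'):.2f} m, "
              f"Order 1a {tvu_max(d):.2f} m")
```

A surveyor's achieved depth uncertainty at a given depth can then be compared against this allowable limit to judge whether the survey meets the chosen order.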
15:45
Graph Bathymetry Network (GraBN): A Graph Approach to Deep Learning in Bathymetry Data
* Adriano Fonseca, CCOM/JHC, United States of America
Brian Calder, CCOM/JHC, United States of America

The costs of producing a chart or map are associated not only with the vessel and data acquisition, but also with the work required of the operators; generating new methods that reduce the amount of manual inspection and/or remediation has therefore always been a priority for hydrographic research. The techniques currently used to process Multibeam Echosounder (MBES) data typically involve the application of a computer-assisted hydrographic algorithm, such as CUBE or CHRT, followed by manual remediation of any data that the operator believes has not been correctly addressed by the algorithm. These techniques have been shown to work well with typical MBES data, but they induce structure in the data (e.g., through the use of a grid, potentially of variable resolution) that is not inherent to the dataset. Recent advances in deep learning have produced Graph Neural Networks (GNNs), which describe points in a data cloud as nodes in a graph and thus avoid imposing any gridding or interpolation on the data: points are interconnected in the unstructured 3D space in which they are acquired. This work applies GNN innovations to an autoencoder network, the Graph Bathymetry Network (GraBN). The network attempts to reconstruct the input soundings and their neighbors; the reconstruction loss provides a score for the probability that these soundings are true bathymetry. Results show that, when trained sufficiently well, the network is able to discriminate between bathymetry (low reconstruction loss) and noise (high reconstruction loss). On a shallow-water survey conducted off the coast of Hampton Beach, New Hampshire, we were able to optimize our decision score threshold to achieve an accuracy of 96.48% on our test dataset, with a precision of 93.62% and a recall of 99.75%.
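As a purely illustrative sketch (not the authors' GraBN architecture, training procedure, or parameters), the core idea of scoring soundings by graph-autoencoder reconstruction loss can be outlined as follows: build a k-nearest-neighbor graph over the sounding cloud, train a small autoencoder that reconstructs each sounding from its neighborhood, and flag points whose reconstruction loss exceeds a decision threshold as likely noise. The layer sizes, the value of k, and the threshold below are all assumptions chosen for demonstration.

```python
# Minimal sketch of the idea described in the abstract: soundings as nodes in a
# k-nearest-neighbor graph, a small graph autoencoder that reconstructs each
# sounding from its neighborhood, and reconstruction loss as a noise score.
# Not the authors' network; every hyperparameter here is an assumption.
import torch
import torch.nn as nn

def knn_adjacency(xyz: torch.Tensor, k: int = 8) -> torch.Tensor:
    """Row-normalised (N, N) adjacency linking each sounding to its k nearest neighbors."""
    d = torch.cdist(xyz, xyz)                            # pairwise distances
    idx = d.topk(k + 1, largest=False).indices[:, 1:]    # k nearest, skipping self
    adj = torch.zeros_like(d)
    adj.scatter_(1, idx, 1.0)
    return adj / k

class GraphAutoencoder(nn.Module):
    """Encode each sounding from its neighborhood-averaged coordinates, then reconstruct it."""
    def __init__(self, in_dim: int = 3, hidden: int = 16, latent: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, latent))
        self.decoder = nn.Sequential(nn.Linear(latent, hidden), nn.ReLU(),
                                     nn.Linear(hidden, in_dim))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        neigh = adj @ x              # simple message passing: mean of neighbor coordinates
        return self.decoder(self.encoder(neigh))

def reconstruction_scores(xyz: torch.Tensor, epochs: int = 200, k: int = 8) -> torch.Tensor:
    """Train the autoencoder on the point cloud and return a per-sounding loss score."""
    x = (xyz - xyz.mean(dim=0)) / xyz.std(dim=0)         # standardise coordinates
    adj = knn_adjacency(x, k)
    model = GraphAutoencoder()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        loss = ((model(x, adj) - x) ** 2).mean()
        loss.backward()
        opt.step()
    with torch.no_grad():
        return ((model(x, adj) - x) ** 2).mean(dim=1)    # high score -> likely noise

if __name__ == "__main__":
    # Synthetic example: a smooth sloping seafloor plus a few spurious soundings.
    torch.manual_seed(0)
    xy = torch.rand(500, 2) * 100.0
    z = -20.0 - 0.05 * xy[:, 0:1] + 0.1 * torch.randn(500, 1)
    pts = torch.cat([xy, z], dim=1)
    pts[:10, 2] += 15.0                                  # inject "noise" outliers
    scores = reconstruction_scores(pts)
    threshold = scores.quantile(0.95)                    # assumed decision threshold
    print("flagged as noise:", int((scores > threshold).sum()))
```

In the setting described by the abstract, the decision threshold would be tuned against labelled data (as the reported accuracy, precision, and recall figures imply) rather than taken as a fixed quantile as in this toy example.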