ROV localization is essential for subsea operations and is commonly achieved with acoustic positioning systems such as ultra-short baseline (USBL) and long baseline (LBL). These systems are costly and prone to errors in specific scenarios, for example operations in shallow water or in proximity to subsea structures. In these scenarios, the operator may encounter shadow areas where accuracy is compromised or position estimation is not possible at all.
Since a significant number of subsea inspection operations are performed under these circumstances, we propose a solution that estimates the movement of the ROV from its live video feed, providing real-time ROV position estimates in any scenario. A neural network returns motion and orientation estimates, acting as an inertial navigation system whose output can be combined with, used to correct, or substituted for the acoustic sensors' position estimates.
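The pipeline described above amounts to dead reckoning: per-frame motion and heading estimates from the network are integrated into a pose, which can then be corrected whenever an acoustic fix is available. The sketch below is a minimal illustration of that idea, not the paper's actual implementation; the function names, the motion format (forward and lateral displacement plus heading change), and the simple blend-based correction are all assumptions, and a Kalman-style filter would be more typical in practice.

```python
import math

def dead_reckon(pose, motion):
    """Integrate one per-frame motion estimate into the pose (x, y, heading).
    motion = (forward_m, lateral_m, heading_change_rad) -- assumed format."""
    x, y, heading = pose
    forward, lateral, dheading = motion
    heading += dheading
    # Rotate the body-frame displacement into the world frame.
    x += forward * math.cos(heading) - lateral * math.sin(heading)
    y += forward * math.sin(heading) + lateral * math.cos(heading)
    return (x, y, heading)

def fuse_acoustic(pose, fix, weight=0.2):
    """Blend an (x, y) acoustic fix into the dead-reckoned pose
    (a simple complementary filter, used here only for illustration)."""
    x, y, heading = pose
    fx, fy = fix
    return (x + weight * (fx - x), y + weight * (fy - y), heading)

# Example: five 1 m forward steps, then one acoustic correction.
pose = (0.0, 0.0, 0.0)
for motion in [(1.0, 0.0, 0.0)] * 5:
    pose = dead_reckon(pose, motion)
pose = fuse_acoustic(pose, (5.5, 0.0))
```

When acoustic fixes are unavailable (shadow areas), the loop simply keeps integrating the visual estimates, which is what lets the method stand in for the acoustic system entirely.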
The solution proposed in this paper was tested in a real subsea operation where the acoustic sensors were not accurate. We describe the use case and how, with our solution and an in-house simulator, we were able to monitor the operation and replay the entire mission in a simulated environment. We show that our solution effectively estimates the ROV trajectory throughout the operation. Our method can be used to improve the accuracy of acoustic sensors or to replace them in situations where they cannot operate, and the entire operation can later be reviewed in our digital twin.