The pressing need for sustainable and cost-effective inspection of live assets installed on the seabed, together with the increasing complexity of next-generation components in the domain of subsea processing (e.g., boosting, separation, water treatment, H2S removal), have been the key drivers for the Hydrone Program, Saipem's 5-year innovation and industrialization plan aimed at developing subsea drones for maintenance and inspection. In moving from aerial applications to the underwater environment, several technological challenges had to be addressed, including: i) interoperability of multi-physics wireless communication systems, ii) A.I.-based tools for autonomous navigation in an unknown subsea environment, iii) on-board real-time big-data management, iv) energy storage, v) data fusion, vi) remote control from onshore and vii) closed-loop latency management of manipulation tasks.

One of the most demanding requirements is the capability to switch autonomously from a flowline inspection to a riser inspection: from a topology standpoint, the navigation system must move seamlessly from a 2D target to a 3D body with no human intervention in the loop. Saipem's subsea drones, Hydrone-R and FlatFish, can both automatically track a pipeline, a flowline, a cable or an umbilical (either exposed or partially buried) and detect when these assets turn into risers connecting the seafloor with the topside facilities. This capability applies to simple catenaries as well as to lazy-wave catenary configurations. Hydrone-R and FlatFish can track the riser throughout the water column following predefined missions, namely a one-side (180° domain) or a two-side (360° domain) inspection.

This paper presents how information coming from different sources is managed and processed to safely guide the vehicle during an autonomous subsea riser inspection; relevant feedback from a deepwater pilot project offshore Brazil is also presented. Data from stereo cameras and laser scanners are digitized, merged and processed by the navigation algorithms, supported by A.I.-based computing tools, to ensure the predetermined inspection pattern is complied with. The transition from DVL-based tracking to purely video-referenced navigation has been one of the most demanding features to deal with. The vehicles hold position automatically when anomalies such as corrosion, torn parts or scratches are detected, increasing the resolution of the data acquired on the identified anomalies. 4K video and high-resolution image sets are captured during the inspection and exported into a Survey Report for the Client's records and use within its Life-of-Field master plan.
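The hand-over from DVL-based tracking on the seabed to purely video-referenced navigation along the riser can be pictured, in simplified form, as a supervisory switch between navigation sources. The sketch below is purely illustrative: the names (NavSource, SensorStatus, select_nav_source) and the quality threshold are assumptions, not the software actually running on Hydrone-R or FlatFish.

```python
# Minimal sketch of a DVL-to-visual navigation hand-over.
# All names and thresholds are hypothetical illustrations.

from dataclasses import dataclass
from enum import Enum, auto


class NavSource(Enum):
    DVL_BOTTOM_TRACK = auto()   # seabed-referenced velocity aiding (flowline phase)
    VISUAL_REFERENCE = auto()   # stereo/laser video-referenced tracking (riser phase)


@dataclass
class SensorStatus:
    dvl_bottom_lock: bool        # True while the DVL still sees the seabed
    visual_track_quality: float  # 0.0 to 1.0 confidence of the visual riser tracker


def select_nav_source(status: SensorStatus,
                      current: NavSource,
                      min_visual_quality: float = 0.7) -> NavSource:
    """Switch to video-referenced navigation when the asset leaves the seabed
    (DVL bottom lock is lost) and the visual tracker is reliable; fall back to
    DVL-aided tracking when the seabed is visible and the visual track degrades."""
    if current is NavSource.DVL_BOTTOM_TRACK:
        if not status.dvl_bottom_lock and status.visual_track_quality >= min_visual_quality:
            return NavSource.VISUAL_REFERENCE
    else:
        if status.dvl_bottom_lock and status.visual_track_quality < min_visual_quality:
            return NavSource.DVL_BOTTOM_TRACK
    return current
```

A real navigation stack would more likely blend both estimates in a filter rather than switch hard between them; the hard switch above only illustrates the trigger condition, i.e. loss of DVL bottom lock as the asset lifts off the seabed and becomes a riser.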
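Likewise, the anomaly-triggered position hold can be sketched as a supervisory step that pauses the 180°/360° inspection pattern while higher-resolution imagery of the finding is acquired. The detector, camera and controller interfaces below are placeholders assumed for illustration only.

```python
# Hedged sketch of anomaly-triggered station keeping: when the on-board
# detector flags corrosion, torn parts or scratches, the drone holds position
# and captures additional high-resolution imagery before resuming the pattern.

def run_inspection_step(detector, camera, controller, frame, pose):
    findings = detector.classify(frame)          # e.g. ["corrosion"], or [] if clean
    if findings:
        controller.hold_position(pose)           # station-keep at the anomaly
        images = [camera.capture_high_res() for _ in range(5)]
        return {"pose": pose, "findings": findings, "images": images}
    controller.follow_inspection_pattern()       # continue the 180°/360° riser pattern
    return None
```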
Autonomous Deepwater Inspections of Risers via Subsea Drones and A.I. Based Data Fusion Tools
Paper presented at the OMC Med Energy Conference and Exhibition, Ravenna, Italy, October 2023.
Paper Number: OMC-2023-469
Published: October 24, 2023
Citation
Nevoso, Cristian, Bertagnoli, Marco, Watanabe, Thomio, Masiero, Michael, Junior, Jessivaldo, and Camila Lima. "Autonomous Deepwater Inspections of Risers via Subsea Drones and A.I. Based Data Fusion Tools." Paper presented at the OMC Med Energy Conference and Exhibition, Ravenna, Italy, October 2023.