DeepOcean and Aker BP have successfully completed underwater trials using an autonomous inspection drone (AID) in the Alvheim area, located in the heart of the Norwegian North Sea.
Jarle Marius Solland, operations manager at Aker BP, said: “Our primary goal was to discover whether underwater inspection can be performed more cost-effectively and with better, more accurate data quality with this new technology and associated methods. The conclusion is definitely yes”.
Autonomous inspection drone: the companies behind its development
The AID is the result of a collaboration between DeepOcean, Argus Remote Systems and Vaarst, with development financially supported by Aker BP. During a 10-day underwater campaign, Aker BP and DeepOcean carried out inspections of subsea trees and other infrastructure in the Alvheim area.
The autonomous inspection drone (AID) measures 1.25 x 0.85 x 0.77 meters, weighs 320 kg and is rated for water depths of up to 3,000 meters. It can fly in DP (dynamic positioning) mode with station-keeping and remote-control functions.
“Based on this experience, we believe that next year we will be able to perform inspections of Alvheim-specific subsea infrastructure considerably more efficiently,” said Kristoffer Johansen, DeepOcean’s chief technology officer. Inspection personnel prepared missions in a digital mission planner, and the mission plans were then transferred to the AID via an API (application programming interface).
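Neither company has published the details of that interface, so the snippet below is only a minimal sketch of the kind of handover the article describes: a mission plan assembled as structured data and pushed to the vehicle over a hypothetical REST endpoint. The URL, payload fields and endpoint path are illustrative assumptions, not DeepOcean’s actual API.

```python
# Illustrative sketch only: the real AID mission-planning API is not public,
# so the endpoint, payload fields and address below are hypothetical.
import json
import urllib.request

# A mission plan as it might leave a digital mission planner: an ordered list
# of waypoints and inspection tasks for the drone to execute autonomously.
mission_plan = {
    "mission_id": "alvheim-xt-inspection-001",
    "waypoints": [
        {"north_m": 120.5, "east_m": -34.2, "depth_m": 122.0, "task": "transit"},
        {"north_m": 121.1, "east_m": -33.8, "depth_m": 123.5, "task": "visual_inspection"},
    ],
    "station_keeping": True,  # hold position while the camera captures data
}

# Hand the plan over to the vehicle via a (hypothetical) REST endpoint.
request = urllib.request.Request(
    url="https://aid.example.local/api/v1/missions",  # placeholder address
    data=json.dumps(mission_plan).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    print("Mission accepted:", response.status)
```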
The drone was deployed from the IMR and ROV support vessel Edda Fauna, replacing the vessel’s existing observation-class ROV. Mission monitoring was performed both locally from the vessel and remotely from Remota’s remote operations center in Haugesund. The AID is based on an Argus Remote Systems Rover MK2 ROV with hardware and software upgrades.
The role of each company
Argus is in charge of developing the AID platform and the navigation algorithm. DeepOcean is responsible for the digital twin platform and mission planning software, and provides real-time visualization of the AID in operation.
For its part, Vaarst supplies the SubSLAM X2 machine vision camera for autonomous navigation and data collection. Inspection data is transmitted topside, and the vehicle’s position is continuously communicated to the digital twin to ensure data quality and improve situational awareness.
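The article does not describe the telemetry format used between the vehicle and the digital twin, so the following is a minimal sketch assuming a simple JSON position message published at a fixed rate; the field names and the publishing function are hypothetical stand-ins for the real link.

```python
# Illustrative sketch only: the actual message format and transport between
# the AID and the digital twin are not public; these fields are assumptions.
import json
import time
from dataclasses import dataclass, asdict


@dataclass
class PositionUpdate:
    mission_id: str
    timestamp: float
    north_m: float      # position in a local subsea reference frame
    east_m: float
    depth_m: float
    heading_deg: float


def publish_to_digital_twin(update: PositionUpdate) -> None:
    """Stand-in for the real telemetry link: serialize the update as JSON,
    which is the sort of message a twin ingest endpoint might accept."""
    print(json.dumps(asdict(update)))


# Simulated 1 Hz telemetry loop; in operation these values would come from
# the AID's navigation solution (e.g. the machine-vision-based positioning).
for step in range(3):
    publish_to_digital_twin(PositionUpdate(
        mission_id="alvheim-xt-inspection-001",
        timestamp=time.time(),
        north_m=120.5 + 0.2 * step,
        east_m=-34.2,
        depth_m=122.0,
        heading_deg=87.0,
    ))
    time.sleep(1.0)
```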
“For this first test,” Johansen continued, “we experienced more stable flight with the AID, including very stable navigation during inspections. As a result, the collected data used for post-processing of 3D models shows extraordinarily high quality.”
Source: offshore-mag.com