The UK Hydrographic Office (UKHO), located in Taunton, and Marine AI, a Plymouth-based leader in maritime autonomy software, have launched a research programme to enable Maritime Autonomous Surface Ships (MASS) to read, interpret, and act upon official navigational information hitherto interpretable only by humans.
An eight-month project will see Marine AI fine-tune its baseline large language model (LLM) to process ADMIRALTY Sailing Directions (SDs) and Radio Navigation Warnings (RNWs). These are currently written in “natural” language for human interpretation. The processed information will be fed into the GuardianAI software suite for autonomous control.
At present, MASS rely on human operators to interpret text-heavy navigation data, which is often written in non-standard nautical language and distributed through legacy systems. The project aims to address the challenges posed by unstructured text, legacy broadcast formats, and a lack of machine-readability.
That will be achieved by retraining a bespoke LLM and developing supporting AI agents to structure the data before it is fed into GuardianAI’s tactical engine and human-machine interface.
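Neither the fine-tuned model nor GuardianAI’s interfaces are public, so the sketch below is purely illustrative of the kind of structuring step described above: a free-text Radio Navigation Warning is handed to an LLM with an extraction prompt, and the JSON reply is parsed into a machine-readable record an autonomy stack could act on. All names here (StructuredWarning, call_llm, the prompt and its keys) are hypothetical stand-ins, not Marine AI’s actual design.

```python
# Illustrative sketch only: every identifier here is a hypothetical
# stand-in for the structuring step described in the article.
import json
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class StructuredWarning:
    """Machine-readable form of a free-text Radio Navigation Warning."""
    warning_id: str
    hazard: str                     # e.g. "unlit buoy adrift"
    positions: List[Tuple[float, float]]  # (lat, lon) in decimal degrees
    effective_until: str            # ISO 8601, or "" if open-ended


PROMPT = (
    "Extract the warning ID, hazard type, all positions (decimal degrees), "
    "and expiry time from this Radio Navigation Warning. Reply with JSON "
    "only, using keys: warning_id, hazard, positions, effective_until.\n\n"
    "Warning:\n{text}"
)


def structure_warning(raw_text: str,
                      call_llm: Callable[[str], str]) -> StructuredWarning:
    """Ask an LLM to convert one raw warning into structured data.

    `call_llm` stands in for whatever interface the deployed model
    exposes; here it is simply a function from prompt to response.
    """
    reply = call_llm(PROMPT.format(text=raw_text))
    fields = json.loads(reply)  # in practice: validate against a schema
    return StructuredWarning(
        warning_id=fields["warning_id"],
        hazard=fields["hazard"],
        positions=[tuple(p) for p in fields["positions"]],
        effective_until=fields.get("effective_until", ""),
    )
```

In a pipeline of this shape, a downstream tactical engine could treat the extracted positions as exclusion zones and the expiry time as a trigger to discard the warning, without any human reading the original broadcast.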
The intention is that autonomous vessels will be able to operate more independently, responding immediately to navigational warnings and Sailing Directions without waiting for human interpretation.
The partners hope to stage a live on-water demonstration on Plymouth’s waterways in spring 2026, using ZeroUSV’s Oceanus12 vessel equipped with Marine AI’s GuardianAI suite.
Oliver Thompson, technical director at Marine AI, said: “This is the first time anyone has attempted to process ADMIRALTY Sailing Directions and Radio Navigation Warnings in a way that an autonomous control system can act upon. By proving this capability on the water, we are closing one of the biggest gaps in MASS autonomy and taking a major step toward safe, fully automated operations.”