
Sensor Fusion Makes Situational Awareness Data More Certain, Actionable

November 9, 2021

From the wheelhouse of nearly any commercial surface vessel, mariners must interpret multiple sources of data about the surrounding marine domain. In addition to using their natural senses, they understand their at-sea position by referencing physical landmarks, plotting the vessel’s course on ECDIS charts and referencing GPS. They become aware of nearby vessel traffic and patterns via AIS data and radar. Radar also allows them to anticipate weather, such as squalls, that may impact the voyage. If a vessel has been outfitted with cameras for a bird’s-eye view or other difficult-to-see angles, they reference video screens to take in those perspectives. For mariners responsible for the safe transit of marine assets and valuable cargo, these many data sources are not merely helpful to operations; they are vitally important.

As the scenario above suggests, the mariner’s role depends on a human’s ability to aggregate data from numerous dynamic sources, interpret it quickly and make critical decisions, often in rapidly changing environments. Crews are highly trained to perform this task continuously as vessels operate around the world 24/7, but the job carries inherent risk: human vision is fallible, and distraction and fatigue are root causes of the majority of at-sea incidents.

The automotive industry faces a similar scenario: drivers have historically relied on human vision, physical road signs, maps and GPS technology, odometers, traffic aggregators and other data sources to operate vehicles manually. With today’s Advanced Driver Assistance Systems (ADAS), however, drivers face reduced risk on the road. Modern systems fuse and interpret data from multiple sources – including GPS, LiDAR, ultrasonic sensors, cameras and radar – to deliver useful information to the driver in real time, such as lane-departure warnings, or automated behaviors, like collision avoidance. These sensor-fusion-based technologies have been trialed since the early 1980s and have proven their worth by helping drivers make better, safer decisions on the road.

The marine and maritime industries are quickly catching up to these automotive innovations. Companies like Sea Machines now offer vessel technologies that fuse critical marine data sources into a single view, giving mariners immediate and accurate operational information.

“Each sensor, whether it be radar, AIS or LiDAR, has strengths and weaknesses. For example, radar is frequently updated and is very good at accurately determining the distance and speed of other boats on the water, but it can fail to pick up small objects in the water or can be jammed. Cameras are better suited for identifying and classifying objects, but can be compromised by fog or sun. LiDAR can detect even very small objects, but it doesn’t have the range of cameras or radar. Sensor fusion combines them into one view to offer redundancy, mitigated risk and a more certain view of the environment,” said Nikhil Krishnan, perception software engineer at Sea Machines.
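To make the idea concrete, here is a minimal Python sketch of one classic fusion technique, inverse-variance weighting, in which more accurate sensors contribute more to the combined estimate. The sensor names, readings and variances below are hypothetical illustrations, not Sea Machines’ actual algorithm or data:

```python
# Minimal sketch: inverse-variance weighting, a classic way to fuse
# independent estimates from sensors with different accuracies.
# All sensor names, readings and variances here are hypothetical.

def fuse_estimates(estimates):
    """Fuse (value, variance) pairs into a single weighted estimate.

    Sensors with lower variance (higher confidence) contribute more.
    Returns the fused value and its (smaller) fused variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * val for w, (val, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # fused result is more certain than any single input
    return fused_value, fused_variance

# Hypothetical range-to-target readings (meters) with variances:
# radar is precise on range, the camera less so, LiDAR in between.
readings = [
    (812.0, 4.0),   # radar
    (805.0, 25.0),  # camera
    (810.0, 9.0),   # LiDAR
]

value, variance = fuse_estimates(readings)
print(f"fused range: {value:.1f} m, variance: {variance:.2f}")
```

Note how the fused variance is lower than that of even the best single sensor: combining independent measurements does not just average them, it genuinely increases certainty.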

Available now on the company’s SM300 autonomous-command and remote-helm control system, this sensor fusion technology brings together ECDIS chart data, GPS, radar, AIS data and live camera feeds into one shared view, shown below:

Fusing these data streams into one intuitive, consumable interface produces less uncertainty than relying on each stream independently. What the radar fails to pick up, the camera catches. When AIS is slow to update, radar ensures fresh data. Software algorithms build the most comprehensive and accurate situational model possible from all of these sources combined. In short, this redundancy mitigates not only human error, but also individual sensor error.
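As a simple illustration of that fallback behavior, the sketch below picks the freshest non-stale track from several feeds, so a slow AIS update is covered by a recent radar return. The `Track` structure, update ages and staleness threshold are invented for the example and do not represent the SM300’s real interface:

```python
# Minimal sketch of the redundancy idea described above: when one feed is
# stale or missing, fall back to another. Track fields, update rates and
# the staleness threshold are hypothetical, not the SM300's actual logic.

import time
from dataclasses import dataclass

@dataclass
class Track:
    source: str       # e.g. "AIS", "radar", "camera"
    position: tuple   # (latitude, longitude)
    timestamp: float  # seconds since epoch

def freshest_track(tracks, max_age_s=10.0, now=None):
    """Return the most recently updated track that is not stale."""
    now = time.time() if now is None else now
    valid = [t for t in tracks if now - t.timestamp <= max_age_s]
    return max(valid, key=lambda t: t.timestamp) if valid else None

now = time.time()
tracks = [
    Track("AIS", (42.35, -70.98), now - 45.0),   # AIS slow to update: stale
    Track("radar", (42.36, -70.97), now - 1.5),  # radar refreshes every sweep
]
best = freshest_track(tracks, now=now)
print(best.source if best else "no fresh data")  # -> "radar"
```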


What’s next

The next generation of Sea Machines technology will additionally incorporate camera vision targets, artificial intelligence (AI) and other proprietary advancements. This product will depend on a robust sensor fusion approach that allows the “brain” of the vessel to process huge data sets and swiftly transmit information to and from the environment. Use of AI enables machine learning, meaning the technology will become “smarter” and more useful over time.

The following view of Sea Machines’ fused data is what mariners aboard a containership in Europe use to support operations:

Whether operators choose to use intelligent technologies to support on-board mariners or to execute remote and/or unmanned autonomous missions, it’s clear that better data leads to better decision making and greater predictability. For situations when speed matters, having accurate, redundant data immediately accessible could be the difference between a near-miss and an incident.

“Sensor fusion offers a more precise and safer way of operating vessels,” said Krishnan. “We are consolidating vast amounts of data and making it more reliable and actionable, with less room for error. It’s truly groundbreaking.”
