Techniques are disclosed for systems and methods to provide graphical user interfaces for assisted and/or autonomous navigation for mobile structures. A navigation assist system includes a user interface for a mobile structure comprising a display and a logic device configured to communicate with the user interface and render a docking user interface on the display. The logic device is configured to monitor control signals for a navigation control system for the mobile structure and render the docking user interface based, at least in part, on the monitored control signals. The docking user interface includes a maneuvering guide with a mobile structure perimeter indicator, an obstruction map, and a translational thrust indicator configured to indicate a translational maneuvering thrust magnitude and direction relative to an orientation of the mobile structure perimeter indicator.
B63B 49/00 - Arrangements of nautical instruments or navigational aids
B63B 79/10 - Monitoring properties or operating parameters of vessels in operation using sensors, e.g. pressure sensors, strain gauges or accelerometers
B63B 79/40 - Monitoring properties or operating parameters of vessels in operation for controlling the operation of vessels, e.g. monitoring their speed, routing or maintenance schedules
B63H 21/21 - Control means for engine or transmission, specially adapted for use on marine vessels
Bird's eye view (BEV) semantic mapping systems and methods are provided. A method includes receiving an image captured by a monocular camera having a first point of view (POV) of an environment including a plurality of features. The method further includes processing, by an artificial neural network (ANN), the captured image to generate a semantic map for the captured image, the semantic map associated with a second POV different from the first POV. The features exhibit a uniform scale in the semantic map. Additional methods and associated systems are also provided.
Techniques are disclosed for systems and methods to provide assisted navigation based on surrounding threats. In one example, an assisted navigation system receives data from a plurality of sensors associated with a mobile structure. The assisted navigation system determines a plurality of navigational hazards disposed within a monitored area associated with the mobile structure. The assisted navigation system processes the data and/or the navigational hazards to determine an operational context of the mobile structure. The assisted navigation system generates a context-dependent navigational chart for the mobile structure, wherein the navigational chart comprises greater or fewer of the navigational hazards in response to the determined operational context. The assisted navigation system updates the navigational chart in response to changes in the data. Additional systems and methods are provided.
Bird's eye view (BEV) semantic mapping systems and methods are provided. A method includes receiving an image captured by a monocular camera having a first point of view (POV) of an environment including a plurality of features. The method further includes processing, by an artificial neural network (ANN), the captured image to generate a semantic map for the captured image, the semantic map associated with a second POV different from the first POV. The features exhibit a uniform scale in the semantic map. Additional methods and associated systems are also provided.
Bird's eye view (BEV) semantic mapping systems and methods are provided. A method includes receiving a plurality of images captured by a plurality of monocular cameras having different points of view (POVs) of an environment. The method further includes processing, by an artificial neural network (ANN), the images to generate a plurality of semantic maps of the environment associated with the images, the semantic maps having a shared POV. The method further includes processing the semantic maps to generate a combined semantic map of the environment having the shared POV. Additional methods and associated systems are also provided.
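The multi-camera BEV method above can be illustrated with a minimal sketch of its final combination step, assuming each camera's semantic map has already been warped into the same shared-POV grid; the class labels and the per-cell priority rule are illustrative assumptions, not taken from the abstract.

```python
# Hypothetical class labels; the abstract does not enumerate them.
UNKNOWN, WATER, OBSTACLE = 0, 1, 2

def combine_semantic_maps(maps):
    """Combine per-camera BEV semantic maps (same grid, shared POV) into
    one map. A per-cell max implements a simple priority rule,
    OBSTACLE > WATER > UNKNOWN, so any single camera's detection survives
    where the cameras' fields of view overlap."""
    rows, cols = len(maps[0]), len(maps[0][0])
    return [[max(m[r][c] for m in maps) for c in range(cols)]
            for r in range(rows)]

# Two cameras observing the same 2x2 BEV grid from different POVs.
cam_a = [[WATER, UNKNOWN], [WATER, WATER]]
cam_b = [[WATER, OBSTACLE], [UNKNOWN, WATER]]
combined = combine_semantic_maps([cam_a, cam_b])
```

A production system would combine class probabilities rather than hard labels, but the priority-merge shows why a shared POV with uniform scale makes the fusion a per-cell operation.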
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The radar assembly includes an imaging system coupled to or within the radar assembly and configured to provide image data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly and image data corresponding to the radar returns from the imaging system, and then generate radar image data based on the radar returns and the image data. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The radar assembly includes an orientation and position sensor (OPS) coupled to or within the radar assembly and configured to provide orientation and position data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly and orientation and/or position data corresponding to the radar returns from the OPS, determine a target radial speed corresponding to the detected target, and then generate remote sensor image data based on the remote sensor returns and the target radial speed. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
Techniques are disclosed for systems and methods to provide assisted navigation based on surrounding threats. In one example, an assisted navigation system receives data from a plurality of sensors associated with a mobile structure. The assisted navigation system determines a plurality of navigational hazards disposed within a monitored area associated with the mobile structure. The assisted navigation system processes the data and/or the navigational hazards to determine an operational context of the mobile structure. The assisted navigation system generates a context-dependent navigational chart for the mobile structure, wherein the navigational chart comprises greater or fewer of the navigational hazards in response to the determined operational context. The assisted navigation system updates the navigational chart in response to changes in the data. Additional systems and methods are provided.
Techniques are disclosed for systems and methods to provide graphical user interfaces for assisted and/or autonomous navigation for mobile structures. A navigation assist system includes a user interface with a display for a mobile structure and a logic device configured to render a docking user interface on the display. The logic device determines a direction and magnitude of a navigational bias associated with navigation of the mobile structure and determines a spatially biased safety perimeter and hazard monitoring area within a monitoring perimeter of a perimeter ranging system mounted to the mobile structure, based on the direction and magnitude of the navigational bias. The docking user interface includes a maneuvering guide with a virtual bumper perimeter intrusion indicator configured to indicate a relative position and/or proximity of a navigation hazard within the spatially biased hazard monitoring area.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B63B 79/10 - Monitoring properties or operating parameters of vessels in operation using sensors, e.g. pressure sensors, strain gauges or accelerometers
B63B 79/40 - Monitoring properties or operating parameters of vessels in operation for controlling the operation of vessels, e.g. monitoring their speed, routing or maintenance schedules
Techniques are disclosed for systems and methods to disengage an autopilot drive of a mobile structure based on a steering wheel torque applied manually by a user. A system includes a logic device in communication with a torque sensor unit (TSU), such as a strain gauge, load pin, or load cell. Sensor data and/or signals provided by the TSU are used to determine a force applied to a steering mechanism of the mobile structure while the mobile structure is on a heading provided by an autopilot drive of the mobile structure. The force may be a torque applied to the steering mechanism corresponding to manual control of the mobile structure. The system disengages the autopilot drive of the mobile structure based, at least in part, on the determined force.
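The torque-based disengagement described above can be sketched as a threshold with a debounce interval (so wave-induced transients on the wheel do not trip it); the threshold and hold count are tunable assumptions, not values from the abstract.

```python
class AutopilotTorqueMonitor:
    """Disengage the autopilot when manually applied steering torque
    stays above a threshold for a sustained number of samples."""

    def __init__(self, threshold_nm=5.0, hold_samples=3):
        self.threshold_nm = threshold_nm   # assumed tuning value
        self.hold_samples = hold_samples   # debounce length, assumed
        self._count = 0
        self.engaged = True

    def update(self, torque_nm):
        """Feed one torque sample; returns current engagement state."""
        if abs(torque_nm) > self.threshold_nm:
            self._count += 1
        else:
            self._count = 0                # transient: reset debounce
        if self.engaged and self._count >= self.hold_samples:
            self.engaged = False           # hand control back to the user
        return self.engaged

# A brief spike does not disengage; sustained torque does.
monitor = AutopilotTorqueMonitor(threshold_nm=5.0, hold_samples=3)
states = [monitor.update(t) for t in [8.0, 1.0, 8.0, 8.0, 8.0]]
```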
Techniques are disclosed for systems and methods to provide reliable and relatively quick bottom reacquisition in sonar systems for mobile structures, including three dimensional (3D) capable and/or multichannel sonar systems. A sonar system includes a sonar transducer and associated processing and control electronics and optionally orientation and/or position sensors disposed substantially within the housing of a sonar transducer assembly. A logic device of the sonar system is configured to detect bottom lock loss based, at least in part, on sonar data provided by the sonar transducer, determine an expected bottom depth associated with the detected bottom lock loss, and generate updated sonar data based, at least in part, on the expected bottom depth. Resulting sonar data and/or imagery may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
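The bottom reacquisition step above can be sketched as a windowed search, assuming the expected bottom depth constrains where in the water column the bottom return is sought; the bin layout and window size are illustrative assumptions.

```python
def reacquire_bottom(depth_bins, intensities, expected_depth_m,
                     window_m=5.0):
    """After bottom lock loss, search for the strongest sonar return
    within +/- window_m of the expected bottom depth instead of the
    full water column, so spurious mid-column echoes (fish, thermoclines)
    cannot capture the bottom tracker."""
    lo, hi = expected_depth_m - window_m, expected_depth_m + window_m
    best_depth, best_i = None, -1.0
    for depth, intensity in zip(depth_bins, intensities):
        if lo <= depth <= hi and intensity > best_i:
            best_depth, best_i = depth, intensity
    return best_depth

# Strong spurious echo at 3 m, true bottom at 20 m, expected depth 19 m:
depth_bins = [float(d) for d in range(1, 41)]
intensities = [0.1] * 40
intensities[2] = 0.9    # spurious echo at 3 m
intensities[19] = 0.8   # bottom return at 20 m
depth = reacquire_bottom(depth_bins, intensities, expected_depth_m=19.0)
```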
Techniques are disclosed for systems and methods to provide dynamic display systems for mobile structures. A dynamic marine display system includes a user interface comprising a primary display and secondary display, where the secondary display is disposed along and physically separate from an edge of the primary display, and where the secondary display comprises a touch screen display configured to render pixelated display views and receive user input as one or more user touches and/or gestures applied to a display surface of the secondary display. A logic device is configured to receive user selection of an operational mode associated with the user interface and/or the mobile structure and render a primary display view via the primary display and/or a secondary display view via the secondary display corresponding to the received user selection and/or operational mode.
G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
G06F 3/04817 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
Techniques are disclosed for systems and methods to provide sailing information to users of a mobile structure. A sailing user interface system includes a logic device configured to communicate with a compass or orientation sensor, a wind sensor, and/or a speed sensor. Sensor signals provided by the various sensors are used to determine a heading and a wind direction for the mobile structure. The wind direction and heading may be used to generate a steering guide display view. The steering guide graphically indicates the heading of the mobile structure relative to various optimum velocity made good (VMG) headings associated with the mobile structure, its heading, the wind direction, and/or a performance contour for the mobile structure. The steering guide may be displayed to a user to refine manual operation of the mobile structure, and the information rendered in the steering guide may be used to automatically pilot the mobile structure.
B63B 49/00 - Arrangements of nautical instruments or navigational aids
B63B 79/15 - Monitoring properties or operating parameters of vessels in operation using sensors, e.g. pressure sensors, strain gauges or accelerometers for monitoring environmental variables, e.g. wave height or weather data
B63B 79/40 - Monitoring properties or operating parameters of vessels in operation for controlling the operation of vessels, e.g. monitoring their speed, routing or maintenance schedules
G01C 21/20 - Instruments for performing navigational calculations
B63B 51/00 - Marking of navigational routes otherwise than with buoys
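The optimum-VMG computation underlying the steering guide above can be sketched as a scan over true wind angles, using a hypothetical performance polar; a real system would use the vessel's measured polar table rather than the assumed formula below.

```python
import math

def boat_speed_kn(twa_deg):
    """Hypothetical performance polar: fast on a beam reach, zero
    head-to-wind and dead downwind. Purely illustrative."""
    if 0 < twa_deg < 180:
        return 6.0 * math.sin(math.radians(twa_deg)) ** 0.5
    return 0.0

def best_upwind_vmg_twa():
    """Scan upwind true wind angles for the one maximizing
    VMG = speed(twa) * cos(twa), the projection of boat speed
    onto the wind direction."""
    best_twa, best_vmg = 0, 0.0
    for twa in range(1, 90):
        vmg = boat_speed_kn(twa) * math.cos(math.radians(twa))
        if vmg > best_vmg:
            best_twa, best_vmg = twa, vmg
    return best_twa, best_vmg

twa, vmg = best_upwind_vmg_twa()   # optimum is near 35 deg for this polar
```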
17. Navigational danger identification and feedback systems and methods
Techniques are disclosed for systems and methods for navigational danger identification and feedback. A navigation system may include one or more navigation sensors coupled to and/or associated with a mobile structure and a logic device. The one or more navigation sensors are configured to provide navigational data associated with the mobile structure. The logic device is configured to receive navigational data from the one or more navigation sensors; determine a virtual model comprising at least one navigational hazard based, at least in part, on the received navigational data; and generate a navigation display view comprising a virtual model view based, at least in part, on the determined virtual model, wherein the virtual model view comprises at least one navigation threat indicator corresponding to the at least one navigational hazard.
Techniques are disclosed for systems and methods for water/non-water segmentation of navigational imagery to assist in the autonomous navigation of mobile structures. An imagery based navigation system includes a logic device configured to communicate with an imaging module coupled to a mobile structure and/or configured to capture images of an environment about the mobile structure. The logic device may be configured to receive at least one image from the imaging module; determine a water/non-water segmented image based, at least in part, on the received at least one image; and generate a range chart corresponding to the environment about the mobile structure based, at least in part, on the determined water/non-water segmented image and/or the received at least one image.
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
B60R 1/22 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
B63B 49/00 - Arrangements of nautical instruments or navigational aids
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
G06V 10/26 - Segmentation of patterns in the image fieldCutting or merging of image elements to establish the pattern region, e.g. clustering-based techniquesDetection of occlusion
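The segmentation-to-range-chart step above can be sketched under a flat-water assumption, where the image row of the lowest non-water pixel in each column maps to range through simple camera geometry; the mask encoding, camera height, focal length, and horizon row below are all illustrative assumptions.

```python
import math

def column_ranges(mask, cam_height_m, focal_px, horizon_row):
    """For each column of a water(0)/non-water(1) mask, find the lowest
    (nearest) non-water pixel and convert its row to a flat-water range:
    range = cam_height * focal / (row - horizon_row). Columns with no
    obstruction below the horizon map to infinity (clear water)."""
    rows, cols = len(mask), len(mask[0])
    ranges = []
    for c in range(cols):
        r_obst = None
        for r in range(rows - 1, -1, -1):   # scan from image bottom up
            if mask[r][c] == 1:
                r_obst = r
                break
        if r_obst is None or r_obst <= horizon_row:
            ranges.append(math.inf)
        else:
            ranges.append(cam_height_m * focal_px / (r_obst - horizon_row))
    return ranges

# Tiny 6-row, 2-column mask: obstruction in column 0 only.
mask = [[0, 0], [0, 0], [0, 0], [0, 0], [1, 0], [0, 0]]
ranges = column_ranges(mask, cam_height_m=3.0, focal_px=100.0, horizon_row=2)
```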
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly (160,300,302,304) mounted to a mobile structure (101) and a coupled logic device (130). The radar assembly includes an imaging system (282) coupled to or within the radar assembly and configured to provide image data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target (464) from the radar assembly and image data corresponding to the radar returns from the imaging system, and then generate radar image data based on the radar returns and the image data. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The radar assembly includes an orientation and position sensor (OPS) coupled to or within the radar assembly and configured to provide orientation and position data associated with the radar assembly. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly and orientation and/or position data corresponding to the radar returns from the OPS, determine a target radial speed corresponding to the detected target, and then generate remote sensor image data based on the remote sensor returns and the target radial speed. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G01S 7/295 - Means for transforming co-ordinates or for evaluating data, e.g. using computers
G01S 13/524 - Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves based upon the phase or frequency shift resulting from movement of objects, with reference to the transmitted signals, e.g. coherent MTI
G01S 13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01S 13/60 - Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/89 - Radar or analogous systems, specially adapted for specific applications for mapping or imaging
G01S 13/937 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes of marine craft
Techniques are disclosed for systems and methods to provide wildlife feeding flock detection using a remote sensing imagery system. A remote sensing imagery system includes a radar assembly mounted to a mobile structure and a coupled logic device. The logic device is configured to receive radar returns corresponding to a detected target, determine a radial velocity spectrum associated with the detected target based, at least in part, on the received radar returns, and determine a probability the detected target includes a feeding flock based, at least in part, on the determined radial velocity spectrum. The logic device may generate radar image data based on the received radar returns, the determined radial velocity spectrum, and/or the probability the detected target includes the feeding flock. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G01S 7/41 - Details of systems according to groups , , of systems according to group using analysis of echo signal for target characterisation; Target signature; Target cross-section
G01S 7/10 - Providing two-dimensional co-ordinated display of distance and direction
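The feeding-flock classification above can be sketched as a statistic on the radial velocity spectrum, under the assumption that wheeling birds produce a broader Doppler spread than a single coherent target; the logistic mapping and its scale are illustrative, not from the abstract.

```python
import math

def flock_probability(radial_velocities_mps, spread_scale=2.0):
    """Map the spread (population standard deviation) of per-return
    radial velocities to a feeding-flock probability via a logistic
    curve: a wheeling flock yields a broad Doppler spectrum, a single
    vessel a narrow one. spread_scale is an assumed tuning value."""
    n = len(radial_velocities_mps)
    mean = sum(radial_velocities_mps) / n
    var = sum((v - mean) ** 2 for v in radial_velocities_mps) / n
    spread = math.sqrt(var)
    return 1.0 / (1.0 + math.exp(-(spread - spread_scale)))

# Broad spectrum (flock-like) vs narrow spectrum (single target):
broad = flock_probability([-4.0, -1.0, 0.5, 2.0, 5.0])
narrow = flock_probability([1.9, 2.0, 2.1, 2.0, 1.95])
```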
22. Modular electrical power distribution system with module detection systems and methods
Techniques are disclosed for systems and methods associated with a modular electrical power distribution system with module detection. A modular electrical power distribution system may include a plurality of controllers, a shared serial communication bus between the plurality of controllers, and a module detection signal line coupled through the plurality of controllers. The plurality of controllers may include a master controller, a power input controller, and one or more load controllers disposed between the master controller and the power input controller.
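The module detection signal line above can be illustrated with an assumed daisy-chain enumeration model, in which each controller exposes its downstream neighbor only after it has been addressed; this sketches one plausible protocol, not the patented one.

```python
def enumerate_chain(n_modules):
    """Illustrative daisy-chain enumeration: the master repeatedly
    broadcasts 'claim next address' on the shared bus; only the first
    unaddressed module whose upstream detect input is asserted responds,
    then asserts its downstream detect output, exposing the next module.
    The result is addresses assigned in physical chain order."""
    detect_in = [True] + [False] * (n_modules - 1)  # master drives first input
    addresses = [None] * n_modules
    next_addr = 1
    for _ in range(n_modules):
        for i in range(n_modules):
            if addresses[i] is None and detect_in[i]:
                addresses[i] = next_addr
                next_addr += 1
                if i + 1 < n_modules:
                    detect_in[i + 1] = True  # pass detect downstream
                break
    return addresses
```

Because addressing follows the detect line, the master learns not just how many load controllers are present but their physical order between itself and the power input controller.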
Techniques are disclosed for systems and methods to provide graphical user interfaces for assisted and/or autonomous navigation for mobile structures. A navigation assist system includes a user interface with a display for a mobile structure and a logic device configured to render a docking user interface on the display. The logic device determines a direction and magnitude of a navigational bias associated with navigation of the mobile structure and determines a spatially biased safety perimeter and hazard monitoring area within a monitoring perimeter of a perimeter ranging system mounted to the mobile structure, based on the direction and magnitude of the navigational bias. The docking user interface includes a maneuvering guide with a virtual bumper perimeter intrusion indicator configured to indicate a relative position and/or proximity of a navigation hazard within the spatially biased hazard monitoring area.
G05D 1/02 - Control of position or course in two dimensions
B63H 25/00 - Steering; Slowing-down otherwise than by use of propulsive elements; Dynamic anchoring, i.e. positioning vessels by means of main or auxiliary propulsive elements
Techniques are disclosed for systems and methods for water/non-water segmentation of navigational imagery to assist in the autonomous navigation of mobile structures. An imagery based navigation system includes a logic device configured to communicate with an imaging module coupled to a mobile structure and/or configured to capture images of an environment about the mobile structure. The logic device may be configured to receive at least one image from the imaging module; determine a water/non-water segmented image based, at least in part, on the received at least one image; and generate a range chart corresponding to the environment about the mobile structure based, at least in part, on the determined water/non-water segmented image and/or the received at least one image.
Techniques are disclosed for systems and methods to provide high resolution interpolation of arrival direction of echo return signals using an active Mills Cross arrangement, such as in sonar or other ranging sensor systems. A system may include an active Mills Cross arrangement with high resolution interpolation of echo returns in two planes. The active Mills Cross arrangement may include a transmitter configured to emit one or more signals, a first line array including a first plurality of elements defining a first plane, and a second line array including a second plurality of elements defining a second plane orthogonal to the first plane. At least one of the first line array and the second line array may be configured to receive echo returns of the emitted signals from one or more objects or targets.
Marine object detection, localization and classification systems and related techniques include an imaging system configured to capture a stream of panoramic images of the water surrounding a mobile structure, including a view of the horizon. The images may include a 360-degree view from the mobile structure. The system is configured to analyze the stream of images using a marine video analytics system and/or a convolutional neural network to detect a region of interest comprising an object on the surface of the water, classify the detected object, and relay the results to the user and/or a processing system. The analysis may include determining a horizon in a captured image, defining tiles across the horizon, and detecting objects in each tile.
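The horizon-tiling step above can be sketched as generating square detection windows along the detected horizon line, so a detector runs only where small surface objects appear; the tile geometry below is an illustrative assumption.

```python
def horizon_tiles(image_width, horizon_row, tile_size):
    """Define detection tiles centered vertically on the horizon line
    and spanning the full image width. Each tile is returned as
    (x0, y0, x1, y1) in pixel coordinates; the last tile is clipped
    to the image edge."""
    half = tile_size // 2
    tiles = []
    for x in range(0, image_width, tile_size):
        tiles.append((x,
                      max(0, horizon_row - half),
                      min(x + tile_size, image_width),
                      horizon_row + half))
    return tiles

# A 640-px-wide frame with the horizon at row 200, tiled in 128-px steps:
tiles = horizon_tiles(640, 200, 128)
```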
Techniques are disclosed for systems and methods for navigational danger identification and feedback. A navigation system may include one or more navigation sensors coupled to and/or associated with a mobile structure and a logic device. The one or more navigation sensors are configured to provide navigational data associated with the mobile structure. The logic device is configured to receive navigational data from the one or more navigation sensors; determine a virtual model comprising at least one navigational hazard based, at least in part, on the received navigational data; and generate a navigation display view comprising a virtual model view based, at least in part, on the determined virtual model, wherein the virtual model view comprises at least one navigation threat indicator corresponding to the at least one navigational hazard.
Flight based marine object search, detection and identification systems and related techniques include an unmanned aerial system (UAS) having a flight platform configured to execute a search path to search for an underwater object, an imaging system comprising image capture components configured to generate a stream of images corresponding to a field of view of the UAS, and a logic device associated with the UAS and configured to analyze the stream of images using a marine video analysis (MVA) system to detect a region of interest comprising an underwater object, identify an underwater object in the detected region of interest, and notify a mobile structure of the identified object.
Techniques are disclosed for systems and methods to provide perimeter ranging for navigation of mobile structures. A navigation control system includes a logic device, a perimeter ranging system, one or more actuators/controllers, and modules to interface with users, sensors, actuators, and/or other elements of a mobile structure. The logic device is configured to receive perimeter sensor data from ultrasonic perimeter ranging sensor assemblies of the perimeter ranging system and generate an obstruction map based on the received perimeter sensor data. The logic device determines a range to and/or a relative velocity of a navigation hazard based on the received perimeter sensor data. The logic device determines navigation control signals based on the range and/or relative velocity of the navigation hazard. Control signals may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G05D 1/02 - Control of position or course in two dimensions
G01C 21/20 - Instruments for performing navigational calculations
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
B63H 25/42 - Steering or dynamic anchoring by propulsive elements; Steering or dynamic anchoring by propellers used therefor only; Steering or dynamic anchoring by rudders carrying propellers
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
B63B 43/18 - Improving safety of vessels, e.g. damage control, not otherwise provided for preventing collision; Improving safety of vessels, e.g. damage control, not otherwise provided for reducing collision damage
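The range and relative-velocity determination above can be sketched with finite differences over successive ultrasonic range samples, plus a time-to-contact helper; the sampling interval and sign convention (negative = closing) are illustrative assumptions.

```python
def relative_velocity(ranges_m, dt_s):
    """Estimate a perimeter hazard's radial velocity from successive
    ultrasonic range samples by finite differences.
    Negative values mean the hazard is closing."""
    return [(b - a) / dt_s for a, b in zip(ranges_m, ranges_m[1:])]

def time_to_contact(range_m, closing_mps):
    """Seconds until contact at the current closing rate, or None if
    the hazard is holding station or opening."""
    return range_m / -closing_mps if closing_mps < 0 else None

# A hazard closing at 1 m/s, sampled every 0.5 s:
closing = relative_velocity([4.0, 3.5, 3.0], dt_s=0.5)
ttc = time_to_contact(3.0, closing[-1])
```

A navigation control system could compare `ttc` against a stop-distance budget to scale down translational thrust toward the hazard.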
Techniques are disclosed for systems and methods to provide passage planning for a mobile structure. A passage planning system includes a logic device configured to communicate with a user interface associated with the mobile structure and at least one operational state sensor mounted to or within the mobile structure. The logic device determines an operational range map based, at least in part, on an operational state of the mobile structure, potential navigational hazards, and/or environmental conditions associated with the mobile structure. Such operational range map and other control signals may be displayed to a user and/or used to generate a planned route and/or adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
Techniques are disclosed for systems and methods to provide accurate and compact three dimensional (3D) capable multichannel sonar systems for mobile structures. A 3D capable multichannel sonar system includes a multichannel transducer and associated processing and control electronics and optionally orientation and/or position sensors disposed substantially within the housing of a sonar transducer assembly. The multichannel transducer includes multiple transmission and/or receive channels/transducer elements. The transducer assembly is configured to support and protect the multichannel transducer and associated electronics and sensors, to physically and/or adjustably couple to a mobile structure, and/or to provide a simplified interface to other systems coupled to the mobile structure. Resulting sonar data and/or imagery may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
Systems and methods for controlling traffic signaling include a wireless sensor operable to detect and receive wireless signals emitted from a vehicle and an image sensor operable to capture a stream of images of a field of view. A traffic control system is operable to extract geographic positioning information for the vehicle from the wireless signals, track the vehicle's movement using the extracted geographic positioning information, and detect and track an object in the stream of images corresponding to the vehicle. The vehicle's geographic movement is further tracked using a pixel location of the object in the stream of images, and a traffic control action is executed based on the geographic movement to facilitate passage of the vehicle through a monitored traffic control location.
Techniques are disclosed for systems and methods to provide graphical user interfaces for assisted and/or autonomous navigation for mobile structures. A navigation assist system includes a user interface for a mobile structure comprising a display and a logic device configured to communicate with the user interface and render a docking user interface on the display. The logic device is configured to monitor control signals for a navigation control system for the mobile structure and render the docking user interface based, at least in part, on the monitored control signals. The docking user interface includes a maneuvering guide with a mobile structure perimeter indicator, an obstruction map, and a translational thrust indicator configured to indicate a translational maneuvering thrust magnitude and direction relative to an orientation of the mobile structure perimeter indicator.
Techniques are disclosed for systems and methods for video based sensor fusion with respect to mobile structures. A mobile structure may include at least one imaging module and multiple navigational sensors and/or receive navigational data from various sources. A navigational database may be generated that includes data from the imaging module, navigational sensors, and/or other sources. Aspects of the navigational database may then be used to generate an integrated model, forecast weather conditions, warn of dangers, identify hard to spot items, and generally aid in the navigation of the mobile structure.
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
G01C 21/20 - Instruments for performing navigational calculations
G01C 21/36 - Input/output arrangements for on-board computers
G01C 23/00 - Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
Techniques are disclosed for systems and methods to provide remote sensing imagery for mobile structures. A remote sensing imagery system includes a remote sensing assembly with a housing mounted to a mobile structure and a coupled logic device. The logic device is configured to receive radar returns corresponding to a detected target from the radar assembly, determine a target radial speed corresponding to the detected target, determine an adaptive target speed threshold, and then generate remote sensor image data based on the radar returns, the target radial speed, and the adaptive target speed threshold. Subsequent user input and/or the sensor data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G01S 13/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
G01S 13/524 - Discriminating between fixed and moving objects or between objects moving at different speeds using transmissions of interrupted pulse modulated waves based upon the phase or frequency shift resulting from movement of objects, with reference to the transmitted signals, e.g. coherent MTI
G01S 13/60 - Velocity or trajectory determination systems; Sense-of-movement determination systems wherein the transmitter and receiver are mounted on the moving object, e.g. for determining ground speed, drift angle, ground track
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 13/89 - Radar or analogous systems, specially adapted for specific applications for mapping or imaging
G01S 13/93 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes
G01S 15/02 - Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
G01S 15/52 - Discriminating between fixed and moving objects or between objects moving at different speeds
G01S 15/58 - Velocity or trajectory determination systems; Sense-of-movement determination systems
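The adaptive target speed threshold in the remote sensing abstract above can be sketched as a threshold that grows with own-ship speed, so that small relative motions are not flagged as moving targets when the platform itself is moving fast. This is a hypothetical illustration, not the patented method; the base and scaling constants are invented for the example:

```python
def adaptive_speed_threshold(own_speed, base=0.5, k=0.1):
    """Radial-speed threshold (m/s) that scales with own-ship speed,
    raising the bar for declaring a target 'moving' at high platform
    speeds. base and k are illustrative constants."""
    return base + k * own_speed

def classify_target(radial_speed, own_speed):
    """Flag a radar return as a moving target only if its radial speed
    exceeds the adaptive threshold."""
    return abs(radial_speed) > adaptive_speed_threshold(own_speed)
```

The classification result could then drive how the return is colored or highlighted in the remote sensor image data.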
Techniques are disclosed for systems and methods to provide assisted and/or autonomous navigation for mobile structures. A navigation assist system includes a control signal coupling configured to couple to a control signal line of a manual user interface for a mobile structure and a logic device. The logic device is configured to monitor control signals communicated between the manual user interface and a navigation control system for the mobile structure, identify maneuvering signals generated by the manual user interface, determine a maneuvering protocol corresponding to the manual user interface, and selectively relay, block, or modify the monitored control signals based on a determined navigation mode for the mobile structure, the monitored control signals, and the determined maneuvering protocol. Control signals may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G05D 1/02 - Control of position or course in two dimensions
B63B 43/18 - Improving safety of vessels, e.g. damage control, not otherwise provided for preventing collision; Improving safety of vessels, e.g. damage control, not otherwise provided for reducing collision damage
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
H04L 29/06 - Communication control; Communication processing characterised by a protocol
H04L 29/08 - Transmission control procedure, e.g. data link level control procedure
37.
AUTONOMOUS AND ASSISTED DOCKING SYSTEMS AND METHODS
Techniques are disclosed for systems and methods to provide docking assist for mobile structures. A docking assist system includes a logic device, one or more sensors, one or more actuators/controllers, and modules to interface with users, sensors, actuators, and/or other modules of a mobile structure. The logic device is adapted to receive docking assist parameters from a user interface for the mobile structure and perimeter sensor data from a perimeter ranging system mounted to the mobile structure. The logic device determines docking assist control signals based, at least in part, on the docking assist parameters and perimeter sensor data, and it then provides the docking assist control signals to a navigation control system for the mobile structure. Control signals may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G05D 1/02 - Control of position or course in two dimensions
B63B 43/18 - Improving safety of vessels, e.g. damage control, not otherwise provided for preventing collision; Improving safety of vessels, e.g. damage control, not otherwise provided for reducing collision damage
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
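As a rough illustration of how the docking assist control signals in the entry above might be derived from perimeter sensor data, the sketch below uses a clamped PD-style law on range to the dock. The gains, target standoff, and overall structure are invented for the example and are not the patented controller:

```python
def docking_thrust(range_m, closing_speed, target_range=0.5,
                   kp=0.8, kd=1.5, max_thrust=1.0):
    """PD-style docking assist: thrust toward the dock proportional to
    the range error, damped by the closing speed, and clamped to the
    actuator limit. Positive output = thrust toward the dock."""
    u = kp * (range_m - target_range) - kd * closing_speed
    return max(-max_thrust, min(max_thrust, u))
```

At the target standoff with no closing speed the command is zero; closing fast near the dock produces a reversing (negative) command.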
Techniques are disclosed for systems and methods to provide perimeter ranging for navigation of mobile structures. A navigation control system includes a logic device, a perimeter ranging sensor, one or more actuators/controllers, and modules to interface with users, sensors, actuators, and/or other elements of a mobile structure. The logic device is configured to receive perimeter sensor data from the perimeter ranging system. The logic device determines a range to and/or a relative velocity of a navigation hazard disposed within a monitoring perimeter of the perimeter ranging system based on the received perimeter sensor data. The logic device then generates a display view of the perimeter sensor data or determines navigation control signals based on the range and/or relative velocity of the navigation hazard. Control signals may be displayed to a user and/or used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G05D 1/02 - Control of position or course in two dimensions
B63B 43/18 - Improving safety of vessels, e.g. damage control, not otherwise provided for preventing collision; Improving safety of vessels, e.g. damage control, not otherwise provided for reducing collision damage
G01C 21/00 - Navigation; Navigational instruments not provided for in groups
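The range and relative velocity determination in the perimeter ranging entry above can be illustrated with a finite-difference sketch over successive range samples (a hypothetical helper, not the patented method):

```python
def hazard_kinematics(ranges, dt):
    """Estimate current range, closing speed, and time to contact of a
    perimeter hazard from two successive range samples taken dt seconds
    apart. Positive closing speed means the hazard is getting closer."""
    r_prev, r_now = ranges
    closing_speed = (r_prev - r_now) / dt
    time_to_contact = r_now / closing_speed if closing_speed > 0 else float('inf')
    return r_now, closing_speed, time_to_contact
```

A real system would smooth these estimates over many samples; time to contact is the natural trigger for warnings or automatic thrust commands.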
Techniques are disclosed for systems and methods for providing a fused image that combines image data of a scene received from an imaging system with a chart to aid in the navigation of a mobile structure. Specifically, the fused image may be generated by superimposing various image elements extracted from an image onto various locations on the chart that correlate to the actual positions of the objects within the scene. In some embodiments, one or more non-water objects may be detected within the image, image representations of the non-water objects may be extracted from the image feed, and the extracted image representations may be superimposed onto locations on the chart that are determined to correlate to the actual positions of the non-water objects within the scene.
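Superimposing an extracted image element at the correlated chart location requires mapping the object's geographic position into chart pixel coordinates. A minimal sketch, assuming a linearly georeferenced chart (the function and its argument layout are illustrative, not from the patent):

```python
def geo_to_chart_px(lat, lon, chart_bounds, chart_size):
    """Map a geographic position to chart pixel coordinates, assuming
    a linearly georeferenced chart.

    chart_bounds: ((lat_min, lat_max), (lon_min, lon_max))
    chart_size:   (width_px, height_px); pixel y grows downward.
    """
    (lat_min, lat_max), (lon_min, lon_max) = chart_bounds
    w, h = chart_size
    x = (lon - lon_min) / (lon_max - lon_min) * w
    y = (lat_max - lat) / (lat_max - lat_min) * h
    return x, y
```

The extracted image representation of each non-water object would then be pasted centered on the returned pixel coordinates.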
Techniques are disclosed for systems and methods to provide visually correlated radar imagery for mobile structures. A visually correlated radar imagery system includes a radar system, an imaging device, and a logic device configured to communicate with the radar system and imaging device. The radar system is adapted to be mounted to a mobile structure, and the imaging device may include an imager position and/or orientation sensor (IPOS). The logic device is configured to determine a horizontal field of view (FOV) of image data captured by the imaging device and to render radar data that is visually or spatially correlated to the image data based, at least in part, on the determined horizontal FOV. Subsequent user input and/or the radar data may be used to adjust a steering actuator, a propulsion system thrust, and/or other operational systems of the mobile structure.
G01S 13/86 - Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
G01S 7/12 - Plan-position indicators, i.e. P.P.I.
B63B 49/00 - Arrangements of nautical instruments or navigational aids
G01S 7/16 - Signals displayed as intensity modulation with rectangular co-ordinates representing distance and bearing, e.g. type B
G01S 7/22 - Producing cursor lines and indicia by electronic means
G01S 7/24 - Cathode-ray tube displays the display being orientated or displaced in accordance with movement of object carrying the transmitting and receiving apparatus, e.g. true-motion radar
G01S 13/93 - Radar or analogous systems, specially adapted for specific applications for anti-collision purposes
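Correlating radar data with image data via the determined horizontal FOV can be sketched as mapping a target bearing to an image column. This is an illustrative reconstruction under a simple linear-projection assumption; the names and conventions are the example's, not the patent's:

```python
def bearing_to_column(bearing_deg, camera_heading_deg, hfov_deg, image_width):
    """Map a radar target bearing to an image column, given the camera
    heading and horizontal FOV. Returns None if the target lies
    outside the camera's field of view."""
    # Signed bearing relative to the camera axis, wrapped to [-180, 180)
    rel = (bearing_deg - camera_heading_deg + 180.0) % 360.0 - 180.0
    if abs(rel) > hfov_deg / 2.0:
        return None
    return (rel / hfov_deg + 0.5) * image_width
```

A radar overlay could then be drawn at the returned column, visually registering each radar contact against the camera image.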
41.
THREE DIMENSIONAL TARGET SELECTION SYSTEMS AND METHODS
Techniques are disclosed for systems and methods to provide three dimensional target selection for use when operating mobile structures. A three dimensional target selection system includes a logic device configured to communicate with a user interface and receive volume data from a volume data source. The logic device is configured to render a first perspective of a three dimensional (3D) representation of the volume data on a display of the user interface, determine a first viewpoint vector within the 3D representation based, at least in part, on a first user input received by the user interface, and identify an object or position within the volume data based, at least in part, on the first viewpoint vector and the first user input.
G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
G06T 19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
G01S 15/96 - Sonar systems specially adapted for specific applications for locating fish
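Identifying an object from a viewpoint vector within the 3D representation, as in the entry above, amounts to a ray pick. A minimal sketch using point-to-ray distance against object centers (hypothetical; the patent does not specify this algorithm, and the names are invented):

```python
import math

def pick_object(origin, direction, objects, max_miss=1.0):
    """Pick the object whose center lies nearest the viewpoint ray,
    preferring the closest candidate along the ray.

    objects: mapping of name -> (x, y, z) center.
    Returns the picked name, or None if nothing is within max_miss.
    """
    dn = math.sqrt(sum(c * c for c in direction))
    d = [c / dn for c in direction]                    # unit ray direction
    best, best_t = None, float('inf')
    for name, p in objects.items():
        v = [p[i] - origin[i] for i in range(3)]
        t = sum(v[i] * d[i] for i in range(3))         # distance along ray
        if t < 0:
            continue                                   # behind the viewer
        miss2 = sum(v[i] * v[i] for i in range(3)) - t * t
        if miss2 <= max_miss ** 2 and t < best_t:
            best, best_t = name, t
    return best
```

In a sonar-volume display, for example, the ray would come from the user's tap position unprojected through the current view perspective.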
Techniques are disclosed for systems and methods to provide three dimensional volume rendering of volume data for use when operating mobile structures. A three dimensional volume rendering system includes a logic device configured to communicate with a user interface and receive volume data from a volume data source. The logic device is configured to render a three dimensional (3D) representation of the volume data on a display of the user interface according to a view perspective, render an indicator key on a first two dimensional plane overlaid in the 3D representation, detect a change condition, and rotate the indicator key about a rotational axis in response to the detected change condition.
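Rotating the indicator key about a rotational axis in response to a change condition is, at its core, a rotation applied to the points of the key's plane. A minimal sketch for rotation about the z axis (illustrative only; the patent does not fix the axis or representation):

```python
import math

def rotate_about_z(point, angle_deg):
    """Rotate a 3D point about the z axis by angle_deg — e.g. to keep
    a 2D indicator key facing the viewer as the view perspective
    changes."""
    a = math.radians(angle_deg)
    x, y, z = point
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)
```

Applying this to each corner of the key's plane re-orients the key within the 3D representation.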
Techniques are disclosed for systems and methods for labeling objects displayed by an augmented reality display system used to assist in the operation of mobile structures. Such an augmented reality display system includes a logic device configured to communicate with navigational sensors and an imaging module coupled to a mobile structure, where the navigational sensors are configured to provide navigational data associated with the mobile structure and the imaging module is configured to image a scene from a position on the mobile structure. The logic device is configured to detect an object in the scene, determine a heading reliability associated with the detected object based, at least in part, on the navigational data, and render an integrated model of the scene on a display, where the integrated model is configured to indicate the determined heading reliability associated with the detected object.
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G01C 21/20 - Instruments for performing navigational calculations
G06K 9/00 - Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
G06T 7/70 - Determining position or orientation of objects or cameras
44.
UNMANNED AERIAL SYSTEM ASSISTED NAVIGATIONAL SYSTEMS AND METHODS
Flight based infrared imaging systems and related techniques, and in particular unmanned aerial system (UAS) based systems, are provided for aiding in operation and/or piloting of a mobile structure. Such systems and techniques may include determining environmental conditions around the mobile structure with, at least, the UAS, detecting the presence of objects and/or persons around the mobile structure, and/or determining the presence of other structures around the mobile structure. Instructions for the operation of such mobile structures may then be determined based on such data.
Administration of an incentive reward program comprised of tiers of rebates triggered by purchases of certain categories of marine and nautical electronics products