Tomahawk Robotics, Inc.

United States of America


Showing 1-24 of 24 patent records for Tomahawk Robotics, Inc.

Aggregations
Jurisdiction
        United States 20
        World 4
Date
New (last 4 weeks) 1
2025 May (MTD) 1
2025 March 1
2025 February 4
2025 January 5
IPC Class
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots 10
G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors 8
B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members 5
G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning 5
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects 5
Status
Pending 10
Registered / In Force 14

1.

INERTIALLY ISOLATED SPATIAL CONTROL

Application Number 19014558
Status Pending
Filing Date 2025-01-09
First Publication Date 2025-05-08
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, Michael E.
  • Bowman, William S.
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Falendysz, Andrew D.
  • Makovy, Kevin
  • Holt, Michael W.

Abstract

Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.
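
For illustration only (not from the filing), a minimal sketch of the idea in this abstract: subtract the operator's body motion, measured by a reference IMU, from the hand-held controller's motion, and treat only the residual as an intentional command. The threshold and function names are assumptions.

    import numpy as np

    AGREEMENT_THRESHOLD = 0.2  # rad/s; residual below this is treated as shared body motion (assumed value)

    def isolate_command(controller_gyro, body_gyro):
        """Return a command rate if the controller moved relative to the body, else None."""
        residual = controller_gyro - body_gyro
        if np.linalg.norm(residual) < AGREEMENT_THRESHOLD:
            return None  # hand and body moved together: operator motion, not a command
        return residual  # hand moved differently from the body: an intended command

    # Body and hand rotate together (e.g., the operator's vehicle turns): suppressed.
    print(isolate_command(np.array([0.5, 0.0, 0.0]), np.array([0.49, 0.0, 0.0])))  # None
    # Hand rotates while the body is still: passed through as a command.
    print(isolate_command(np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.0, 0.0])))   # [0.5 0. 0.]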

IPC Classes

  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

2.

POINT OF INTEREST TRACKING AND ESTIMATION

Application Number 18959555
Status Pending
Filing Date 2024-11-25
First Publication Date 2025-03-13
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Bowman, William S.
  • Bowman, Michael E.
  • Truesdell, Brad
  • Falendysz, Andrew D.

Abstract

Methods and systems are described herein for determining three-dimensional locations of objects within identified portions of images. An image processing system may receive an image and an identification of location within an image. The image may be input into a machine learning model to detect one or more objects within the identified location. Multiple images may then be used to generate location estimations of those objects. Based on the location estimations, an accurate three-dimensional location may be calculated.
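
As a toy illustration of the multi-image estimation step (not the patented method; the fusion rule is an assumption), noisy per-image 3D estimates can be combined into one location:

    import numpy as np

    def fuse_estimates(estimates):
        """Average noisy per-image 3D estimates; the mean minimizes squared error."""
        return np.stack(estimates).mean(axis=0)

    # Three per-image estimates of the same point of interest (metres, hypothetical).
    observations = [np.array([10.2, 4.9, 0.4]),
                    np.array([9.8, 5.1, 0.6]),
                    np.array([10.1, 5.0, 0.5])]
    print(fuse_estimates(observations))  # ~[10.03  5.    0.5 ]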

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 7/60 - Analysis of geometric attributes
  • G06V 10/774 - Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
  • G06V 20/17 - Terrestrial scenes taken from planes or by drones
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • H04N 23/661 - Transmitting camera control signals through networks, e.g. control via the Internet

3.

UNIVERSAL MULTIMODAL PAYLOAD CONTROL

Application Number 18453778
Status Pending
Filing Date 2023-08-22
First Publication Date 2025-02-27
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Holt, Michael W.
  • Bowman, William S.
  • Summer, Matthew D.
  • Moffett, Mark B.

Abstract

Methods and systems are described herein for a payload management system that may detect that a payload has been attached to an uncrewed vehicle and determine whether the payload is a restricted payload or an unrestricted payload. Based on determining that the payload is an unrestricted payload, the payload management system may establish a connection between the payload and the operator using a first communication channel that has already been established between the uncrewed vehicle and the operator. Based on determining that the payload is a restricted payload, the payload management system may establish a connection between the payload and operator using a second communication channel. The payload management system may listen for restricted payload commands over the second communication channel, and when a payload command is received via the second communication channel, the payload command may be executed using the restricted payload.
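
A minimal sketch of the channel-selection logic described above, with channels reduced to strings and restricted-payload detection reduced to a set lookup; every identifier is invented for the example:

    RESTRICTED_PAYLOADS = {"payload-r1", "payload-r2"}  # hypothetical restricted IDs

    def select_channel(payload_id):
        """Restricted payloads get a dedicated second channel; unrestricted payloads
        reuse the channel already established between vehicle and operator."""
        if payload_id in RESTRICTED_PAYLOADS:
            return "second-channel"  # listen here for restricted payload commands
        return "first-channel"

    print(select_channel("camera-gimbal"))  # first-channel
    print(select_channel("payload-r1"))     # second-channel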

IPC Classes

  • G05B 15/02 - Systems controlled by a computer electric
  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots

4.

UNIVERSAL MULTIMODAL PAYLOAD CONTROL

Application Number US2024042962
Publication Number 2025/042854
Status In Force
Filing Date 2024-08-19
Publication Date 2025-02-27
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Holt, Michael W.
  • Bowman, William S.
  • Summer, Matthew D.
  • Moffett, Mark B.

Abstract

Methods and systems are described herein for a payload management system that may detect that a payload has been attached to an uncrewed vehicle and determine whether the payload is a restricted payload or an unrestricted payload. Based on determining that the payload is an unrestricted payload, the payload management system may establish a connection between the payload and the operator using a first communication channel that has already been established between the uncrewed vehicle and the operator. Based on determining that the payload is a restricted payload, the payload management system may establish a connection between the payload and operator using a second communication channel. The payload management system may listen for restricted payload commands over the second communication channel, and when a payload command is received via the second communication channel, the payload command may be executed using the restricted payload.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B25J 19/02 - Sensing devices

5.

Computer vision classifier defined path planning for unmanned aerial vehicles

Application Number 18446450
Grant Number 12293538
Status In Force
Filing Date 2023-08-08
First Publication Date 2025-02-13
Grant Date 2025-05-06
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, William S.
  • Moffett, Mark B.
  • Falendysz, Andrew D.
  • Bowman, Michael E.
  • Holt, Michael W.
  • Williams, Timothy M.
  • Danko, Matthew R.
  • Summer, Matthew D.

Abstract

Methods and systems are described herein for enabling aerial vehicle navigation in GPS-denied areas. The system may use a camera to record images of terrain as the aerial vehicle is flying to a target location. The system may then detect (e.g., using a machine learning model) objects within those images and compare those objects with objects within an electronic map that was loaded onto the aerial vehicle. When the system finds one or more objects within the electronic map that match the objects detected within the recorded images, the system may retrieve locations (e.g., GPS coordinates) of the objects within the electronic map and calculate, based on the coordinates, the location of the aerial vehicle. Once the location of the aerial vehicle is determined, the system may navigate to a target location or otherwise adjust a flight path of the aerial vehicle.
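
A toy sketch of the map-matching step (matching by label and flat 2-D offsets are simplifying assumptions, not the claimed method): each detected object that also appears in the onboard map votes for a vehicle position.

    import numpy as np

    onboard_map = {"water_tower": np.array([1200.0, 340.0]),   # known map coordinates
                   "bridge":      np.array([1150.0, 610.0])}

    # (label, offset of the object from the vehicle in the local frame, metres)
    detections = [("water_tower", np.array([80.0, -40.0])),
                  ("bridge",      np.array([30.0, 230.0]))]

    def estimate_position(detections, onboard_map):
        """Each matched object votes: vehicle = map position - relative offset."""
        votes = [onboard_map[label] - offset
                 for label, offset in detections if label in onboard_map]
        return np.mean(votes, axis=0)

    print(estimate_position(detections, onboard_map))  # [1120.  380.]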

IPC Classes

  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06T 7/50 - Depth or shape recovery
  • G06T 7/90 - Determination of colour characteristics

6.

COMPUTER VISION CLASSIFIER DEFINED PATH PLANNING FOR UNMANNED AERIAL VEHICLES

Application Number US2024041348
Publication Number 2025/034910
Status In Force
Filing Date 2024-08-07
Publication Date 2025-02-13
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Bowman, William S.
  • Moffett, Mark B.
  • Falendysz, Andrew D.
  • Bowman, Michael E.
  • Holt, Michael W.
  • Williams, Timothy M.
  • Danko, Matthew R.
  • Summer, Matthew D.

Abstract

Methods and systems are described herein for enabling aerial vehicle navigation in GPS-denied areas. The system may use a camera to record images of terrain as the aerial vehicle is flying to a target location. The system may then detect (e.g., using a machine learning model) objects within those images and compare those objects with objects within an electronic map that was loaded onto the aerial vehicle. When the system finds one or more objects within the electronic map that match the objects detected within the recorded images, the system may retrieve locations (e.g., GPS coordinates) of the objects within the electronic map and calculate, based on the coordinates, the location of the aerial vehicle. Once the location of the aerial vehicle is determined, the system may navigate to a target location or otherwise adjust a flight path of the aerial vehicle.

IPC Classes

  • G05D 1/80 - Arrangements for reacting to or preventing system or operator failure
  • G05D 1/46 - Control of position or course in three dimensions
  • G05D 1/243 - Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G06V 20/17 - Terrestrial scenes taken from planes or by drones
  • G06T 7/90 - Determination of colour characteristics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
  • B64U 20/87 - Mounting of imaging devices, e.g. mounting of gimbals
  • G06N 20/00 - Machine learning

7.

LAYERED FAIL-SAFE REDUNDANCY ARCHITECTURE AND PROCESS FOR USE BY SINGLE DATA BUS MOBILE DEVICE

Application Number 18353866
Status Pending
Filing Date 2023-07-17
First Publication Date 2025-01-23
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Summer, Matthew D.
  • Bowman, Michael E.
  • Holt, Michael W.
  • Moffett, Mark B.

Abstract

Methods and systems are described herein for a layered fail-safe redundancy system and architecture for privileged operation execution. The system may receive vehicle maneuvering commands from a controller over a first channel. When a user input is received to initiate a privileged mode for executing privileged commands, the system may receive a privileged command over a second channel. The system may identify, based on the privileged mode of operation and the privileged command, a privileged operation to be performed by a vehicle. The system may then transmit a request to the vehicle to perform the privileged operation.
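
A minimal sketch of the two-channel gating described above; the command names and channel labels are assumptions for the example:

    PRIVILEGED_OPERATIONS = {"arm": "ARM_PAYLOAD", "release": "RELEASE_PAYLOAD"}  # hypothetical

    def handle_command(command, channel, privileged_mode):
        """Maneuvering commands arrive on the first channel; privileged operations
        are honored only in privileged mode and only over the second channel."""
        if command in PRIVILEGED_OPERATIONS:
            if privileged_mode and channel == "channel-2":
                return f"request to vehicle: {PRIVILEGED_OPERATIONS[command]}"
            return "rejected: privileged command outside privileged mode/channel"
        return f"maneuver: {command}"

    print(handle_command("forward", "channel-1", privileged_mode=False))  # maneuver: forward
    print(handle_command("arm", "channel-1", privileged_mode=True))       # rejected
    print(handle_command("arm", "channel-2", privileged_mode=True))       # request to vehicle: ARM_PAYLOAD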

IPC Classes

  • H04L 12/40 - Bus networks
  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots

8.

SYSTEMS AND METHODS OF DETECTING INTENT OF SPATIAL CONTROL

Application Number 18908639
Status Pending
Filing Date 2024-10-07
First Publication Date 2025-01-23
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Hedman, Daniel R.
  • Truesdell, Bradley D.

Abstract

Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.
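
One speculative reading of the "sliding velocity limit boundary" (the scaling law below is an assumption, not the claimed method): the allowable commanded velocity shrinks as sensor confidence degrades, so imprecise data cannot produce aggressive motion.

    V_MAX = 2.0  # m/s, illustrative hardware limit

    def limited_velocity(commanded, sensor_confidence):
        """Clamp the command to a boundary that slides with confidence in [0, 1]."""
        boundary = V_MAX * max(0.0, min(1.0, sensor_confidence))
        return max(-boundary, min(boundary, commanded))

    print(limited_velocity(1.8, sensor_confidence=1.0))  # 1.8 (full trust)
    print(limited_velocity(1.8, sensor_confidence=0.4))  # 0.8 (boundary slid down)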

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05D 1/222 - Remote-control arrangements operated by humans
  • G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
  • G05D 1/224 - Output arrangements on the remote controller, e.g. displays, haptics or speakers
  • G05D 1/24 - Arrangements for determining position or orientation
  • G05D 1/65 - Following a desired speed profile
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

9.

LAYERED FAIL-SAFE REDUNDANCY ARCHITECTURE AND PROCESS FOR USE BY SINGLE DATA BUS MOBILE DEVICE

Application Number US2024038114
Publication Number 2025/019462
Status In Force
Filing Date 2024-07-15
Publication Date 2025-01-23
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Summer, Matthew D.
  • Bowman, Michael E.
  • Holt, Michael W.
  • Moffett, Mark B.

Abstract

Methods and systems are described herein for a layered fail-safe redundancy system and architecture for privileged operation execution. The system may receive vehicle maneuvering commands from a controller over a first channel. When a user input is received to initiate a privileged mode for executing privileged commands, the system may receive a privileged command over a second channel. The system may identify, based on the privileged mode of operation and the privileged command, a privileged operation to be performed by a vehicle. The system may then transmit a request to the vehicle to perform the privileged operation.

IPC Classes

  • G05D 1/226 - Communication links with the remote-control arrangements
  • G05D 1/80 - Arrangements for reacting to or preventing system or operator failure
  • H04W 12/06 - Authentication
  • H04W 12/03 - Protecting confidentiality, e.g. by encryption

10.

POINT OF INTEREST TRACKING AND ESTIMATION

Application Number US2024037481
Publication Number 2025/015113
Status In Force
Filing Date 2024-07-10
Publication Date 2025-01-16
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Bowman, William S.
  • Bowman, Michael E.
  • Truesdell, Brad
  • Falendysz, Andrew D.

Abstract

Methods and systems are described herein for determining three-dimensional locations of objects within identified portions of images. An image processing system may receive an image and an identification of location within an image. The image may be input into a machine learning model to detect one or more objects within the identified location. Multiple images may then be used to generate location estimations of those objects. Based on the location estimations, an accurate three-dimensional location may be calculated.

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G05D 1/24 - Arrangements for determining position or orientation

11.

Point of interest tracking and estimation

Application Number 18350722
Grant Number 12198377
Status In Force
Filing Date 2023-07-11
First Publication Date 2025-01-14
Grant Date 2025-01-14
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Bowman, William S.
  • Bowman, Michael E.
  • Truesdell, Brad
  • Falendysz, Andrew D.

Abstract

Methods and systems are described herein for determining three-dimensional locations of objects within identified portions of images. An image processing system may receive an image and an identification of location within an image. The image may be input into a machine learning model to detect one or more objects within the identified location. Multiple images may then be used to generate location estimations of those objects. Based on the location estimations, an accurate three-dimensional location may be calculated.

IPC Classes

  • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
  • G06T 7/60 - Analysis of geometric attributes
  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06V 10/774 - Generating sets of training patterns; bootstrap methods, e.g. bagging or boosting
  • G06V 20/17 - Terrestrial scenes taken from planes or by drones
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • H04N 23/661 - Transmitting camera control signals through networks, e.g. control via the Internet

12.

CLASSIFICATION PARALLELIZATION ARCHITECTURE

Application Number 18800296
Status Pending
Filing Date 2024-08-12
First Publication Date 2024-12-05
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, William S.
  • Wagoner, Sean
  • Falendysz, Andrew D.
  • Summer, Matthew D.
  • Makovy, Kevin
  • Cooper, Jeffrey S.
  • Truesdell, Brad

Abstract

Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.
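
A toy version of the arbitration pipeline in this abstract; the stub detectors and registry are invented for illustration:

    # Stub detectors standing in for machine learning models: each returns
    # (label, position) indicators for a frame.
    MODEL_REGISTRY = {
        "vehicle": lambda frame: [("vehicle", (40, 60))],
        "person":  lambda frame: [("person", (10, 20))],
    }

    def build_output_stream(frames, requested_types):
        """Run each frame through every selected model and emit composite frames."""
        models = [MODEL_REGISTRY[t] for t in requested_types if t in MODEL_REGISTRY]
        composite_stream = []
        for frame in frames:                 # chronological order is preserved
            indicators = []
            for model in models:             # sequential pass through each model
                indicators.extend(model(frame))
            composite_stream.append({"frame": frame, "indicators": indicators})
        return composite_stream

    print(build_output_stream(["f0", "f1"], ["vehicle", "person"]))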

IPC Classes

  • G06V 10/96 - Management of image or video recognition tasks
  • G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

13.

Classification parallelization architecture

Application Number 18772099
Grant Number 12266164
Status In Force
Filing Date 2024-07-12
First Publication Date 2024-11-14
Grant Date 2025-04-01
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, William S.
  • Wagoner, Sean
  • Falendysz, Andrew D.
  • Summer, Matthew D.
  • Makovy, Kevin
  • Cooper, Jeffrey S.
  • Truesdell, Brad

Abstract

Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.

IPC Classes

  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
  • G06V 10/96 - Management of image or video recognition tasks
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

14.

Systems and methods of detecting intent of spatial control

Application Number 18540632
Grant Number 12189386
Status In Force
Filing Date 2023-12-14
First Publication Date 2024-04-04
Grant Date 2025-01-07
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Hedman, Daniel R.
  • Truesdell, Bradley D.

Abstract

Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05D 1/222 - Remote-control arrangements operated by humans
  • G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
  • G05D 1/224 - Output arrangements on the remote controller, e.g. displays, haptics or speakers
  • G05D 1/24 - Arrangements for determining position or orientation
  • G05D 1/65 - Following a desired speed profile
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

15.

ARCHITECTURE FOR DISTRIBUTED ARTIFICIAL INTELLIGENCE AUGMENTATION

Application Number 17930399
Status Pending
Filing Date 2022-09-07
First Publication Date 2024-03-07
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Hedman, Daniel R.
  • Bowman, William S.
  • Summer, Matthew D.
  • Korte, Bryce
  • Falendysz, Andrew D.

Abstract

Methods and systems are described herein for determining three-dimensional locations of objects within a video stream and linking those objects with known objects. An image processing system may receive an image and image metadata and detect an object and a location of the object within the image. The estimated location of each object is then determined within the three-dimensional space. In addition, the image processing system may retrieve, for a plurality of known objects, a plurality of known locations within the three-dimensional space and determine, based on estimated location and the known location data, which of the known objects matches the detected object in the image. An indicator for the object is then generated at the location of the object within the image.
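
A sketch of the linking step only (nearest-neighbor matching with a distance gate is an assumption, as are all names below): a detection's estimated 3-D location is compared against the known objects' stored locations.

    import numpy as np

    KNOWN_OBJECTS = {"truck-17": np.array([52.0, 9.0, 0.0]),    # hypothetical database
                     "antenna-3": np.array([-4.0, 30.0, 12.0])}
    MATCH_RADIUS = 5.0  # metres, assumed gate

    def link_detection(estimated_xyz):
        """Return the known object whose stored location best explains the detection."""
        name, pos = min(KNOWN_OBJECTS.items(),
                        key=lambda kv: np.linalg.norm(kv[1] - estimated_xyz))
        return name if np.linalg.norm(pos - estimated_xyz) <= MATCH_RADIUS else None

    print(link_detection(np.array([51.0, 10.5, 0.0])))  # truck-17
    print(link_detection(np.array([0.0, 0.0, 0.0])))    # None: no known object nearby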

IPC Classes

  • G06T 19/00 - Manipulating 3D models or images for computer graphics
  • G06T 7/70 - Determining position or orientation of objects or cameras
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

16.

Classification parallelization architecture

Application Number 18454055
Grant Number 12067768
Status In Force
Filing Date 2023-08-22
First Publication Date 2023-12-07
Grant Date 2024-08-20
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, William S.
  • Wagoner, Sean
  • Falendysz, Andrew D.
  • Summer, Matthew D.
  • Makovy, Kevin
  • Cooper, Jeffrey S.
  • Truesdell, Brad

Abstract

Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.

IPC Classes

  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
  • G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
  • G06V 10/96 - Management of image or video recognition tasks
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

17.

Inertially isolated spatial control

Application Number 18311370
Grant Number 12223124
Status In Force
Filing Date 2023-05-03
First Publication Date 2023-10-19
Grant Date 2025-02-11
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Bowman, Michael E.
  • Bowman, William S.
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Falendysz, Andrew D.
  • Makovy, Kevin
  • Holt, Michael W.

Abstract

Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.

IPC Classes

  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

18.

ARCHITECTURE FOR DISTRIBUTED ARTIFICIAL INTELLIGENCE AUGMENTATION

Application Number 17702669
Status Pending
Filing Date 2022-03-23
First Publication Date 2023-07-27
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Falendysz, Andrew D.
  • Bowman, William S.
  • Summer, Matthew D.
  • Hedman, Daniel R.
  • Wagoner, Sean

Abstract

Methods and systems are described herein for generating composite data streams. A data stream processing system may receive multiple data streams from, for example, multiple unmanned vehicles and determine, based on the type of data within each data stream, a machine learning model for each data stream for processing the type of data. Each machine learning model may receive the frames of a corresponding data stream and output indications and locations of objects within those data streams. The data stream processing system may then generate a composite data stream with indications of the detected objects.
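
A minimal sketch of per-stream model selection (the model table and stream format are invented for illustration): each stream's data type picks the model that annotates its frames, and the annotated frames are merged into one composite stream.

    MODELS = {"thermal": lambda f: f + "+hotspots",   # stub per-type models
              "rgb":     lambda f: f + "+objects"}

    def composite(streams):
        """streams: list of (data_type, [frames]); returns one annotated stream."""
        out = []
        for data_type, frames in streams:
            model = MODELS.get(data_type)
            if model is None:
                continue  # no model registered for this data type
            out.extend(model(frame) for frame in frames)
        return out

    print(composite([("rgb", ["a0", "a1"]), ("thermal", ["b0"])]))
    # ['a0+objects', 'a1+objects', 'b0+hotspots']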

IPC Classes

  • G06V 20/40 - Scenes; scene-specific elements in video content
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06N 20/20 - Ensemble learning
  • B64C 39/02 - Aircraft not otherwise provided for characterised by special use

19.

Classification parallelization architecture

Application Number 17571081
Grant Number 11776247
Status In Force
Filing Date 2022-01-07
First Publication Date 2023-07-13
Grant Date 2023-10-03
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Bowman, William S.
  • Wagoner, Sean
  • Falendysz, Andrew D.
  • Summer, Matthew D.
  • Makovy, Kevin
  • Cooper, Jeffrey S.
  • Truesdell, Brad

Abstract

Methods and systems are described herein for hosting and arbitrating algorithms for the generation of structured frames of data from one or more sources of unstructured input frames. A plurality of frames may be received from a recording device and a plurality of object types to be recognized in the plurality of frames may be determined. A determination may be made of multiple machine learning models for recognizing the object types. The frames may be sequentially input into the machine learning models to obtain a plurality of sets of objects from the plurality of machine learning models and object indicators may be received from those machine learning models. A set of composite frames with the plurality of indicators corresponding to the plurality of objects may be generated, and an output stream may be generated including the set of composite frames to be played back in chronological order.

IPC Classes

  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 10/96 - Management of image or video recognition tasks
  • G05B 13/02 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric
  • G06V 10/94 - Hardware or software architectures specially adapted for image or video understanding
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
  • G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects

20.

Inertially isolated spatial control

Application Number 17720130
Grant Number 11675445
Status In Force
Filing Date 2022-04-13
First Publication Date 2023-06-13
Grant Date 2023-06-13
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Bowman, Michael E.
  • Bowman, William S.
  • Hedman, Daniel R.
  • Summer, Matthew D.
  • Falendysz, Andrew D.
  • Makovy, Kevin
  • Holt, Michael W.

Abstract

Methods and systems are described herein for detecting motion-induced errors received from inertial-type input devices and for generating accurate vehicle control commands that account for operator movement. These methods and systems may determine, using motion data from inertial sensors, whether the hand/arm of the operator is moving in the same motion as the body of the operator, and if both are moving in the same way, these systems and methods may determine that the motion is not intended to be a motion-induced command. However, if the hand/arm of the operator is moving in a different motion from the body of the operator, these methods and systems may determine that the operator intended the motion to be a motion-induced command to a vehicle.

IPC Classes

  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots

21.

UNIVERSAL CONTROL ARCHITECTURE FOR CONTROL OF UNMANNED SYSTEMS

Application Number 17571305
Status Pending
Filing Date 2022-01-07
First Publication Date 2022-12-29
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Hedman, Daniel R.
  • Truesdell, Brad
  • Cooper, Jeffrey S.
  • Bowman, Michael E.
  • Wagoner, Sean
  • Makovy, Kevin

Abstract

A common command and control architecture (alternatively termed herein as a “universal control architecture”) is disclosed that allows different unmanned systems, including different types of unmanned systems (e.g., air, ground, and/or maritime unmanned systems), to be controlled simultaneously through a common control device (e.g., a controller that can be an input and/or output device). The universal control architecture brings significant efficiency gains in engineering, deployment, training, maintenance, and future upgrades of unmanned systems. In addition, the disclosed common command and control architecture breaks the traditional stovepipe development and deployment model, thus reducing hardware and software maintenance, streamlining training and proficiency efforts, reducing physical space requirements for transport, and creating a scalable, more connected, interoperable approach to controlling unmanned systems relative to existing unmanned systems technology.
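
A rough sketch of the adapter idea implied by a universal control architecture (the adapters and message formats below are invented; the filing does not disclose these specifics): one common command vocabulary, translated per vehicle type.

    COMMON_COMMANDS = {"MOVE", "HOLD", "RETURN"}

    class AirAdapter:
        def translate(self, cmd):
            return {"air_msg": cmd.lower()}  # hypothetical air-vehicle format

    class GroundAdapter:
        def translate(self, cmd):
            return {"ugv_opcode": {"MOVE": 1, "HOLD": 2, "RETURN": 3}[cmd]}

    def dispatch(cmd, adapters):
        """Send one common command simultaneously to heterogeneous vehicles."""
        assert cmd in COMMON_COMMANDS
        return [adapter.translate(cmd) for adapter in adapters]

    print(dispatch("MOVE", [AirAdapter(), GroundAdapter()]))
    # [{'air_msg': 'move'}, {'ugv_opcode': 1}]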

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots

22.

Systems and methods of detecting intent of spatial control

Application Number 17417176
Grant Number 11886182
Status In Force
Filing Date 2019-12-31
First Publication Date 2022-03-17
Grant Date 2024-01-30
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Hedman, Daniel R.
  • Truesdell, Bradley D.

Abstract

Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05D 1/02 - Control of position or course in two dimensions
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

23.

Systems and methods of remote teleoperation of robotic vehicles

Application Number 17417194
Grant Number 12124256
Status In Force
Filing Date 2019-12-31
First Publication Date 2022-03-17
Grant Date 2024-10-22
Owner Tomahawk Robotics, Inc. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Hedman, Daniel R.
  • Truesdell, Bradley D.

Abstract

Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05D 1/222 - Remote-control arrangements operated by humans
  • G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
  • G05D 1/224 - Output arrangements on the remote controller, e.g. displays, haptics or speakers
  • G05D 1/24 - Arrangements for determining position or orientation
  • G05D 1/65 - Following a desired speed profile
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

24.

SPATIAL TELEOPERATION OF LEGGED VEHICLES

Application Number 17417206
Status Pending
Filing Date 2019-12-31
First Publication Date 2022-03-10
Owner TOMAHAWK ROBOTICS, INC. (USA)
Inventor
  • Summer, Matthew D.
  • Bowman, William S.
  • Falendysz, Andrew D.
  • Makovy, Kevin M.
  • Hedman, Daniel R.
  • Truesdell, Bradley D.

Abstract

Systems and methods of manipulating/controlling robots. In many scenarios, data collected by a sensor (connected to a robot) may not have very high precision (e.g., a regular commercial/inexpensive sensor) or may be subjected to dynamic environmental changes. Thus, the data collected by the sensor may not indicate the parameter captured by the sensor with high accuracy. The present robotic control system is directed at such scenarios. In some embodiments, the disclosed embodiments can be used for computing a sliding velocity limit boundary for a spatial controller. In some embodiments, the disclosed embodiments can be used for teleoperation of a vehicle located in the field of view of a camera.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G05D 1/02 - Control of position or course in two dimensions
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members