Google Inc.

United States of America


1-100 of 6,894 results for Google Inc. and 12 subsidiaries

Aggregations
IP Type
        Patent 6,864
        Trademark 30
Jurisdiction
        World 4,410
        United States 2,478
        Canada 6
Owner / Subsidiary
[Owner] Google Inc. 4,120
Google Technology Holdings LLC 2,136
Boston Dynamics, Inc. 552
Nest Labs, Inc. 37
Eyefluence, Inc. 9
Date
New (last 4 weeks) 6
2026 February (MTD) 6
2026 January 3
2025 December 6
2025 November 2
IPC Class
G06F 17/30 - Information retrieval; Database structures therefor 758
G06Q 30/00 - Commerce 288
B25J 9/16 - Programme controls 285
G06F 15/16 - Combinations of two or more digital computers each having at least an arithmetic unit, a program unit and a register, e.g. for a simultaneous processing of several programs 274
H04L 29/06 - Communication control; Communication processing characterised by a protocol 250
NICE Class
09 - Scientific and electric apparatus and instruments 24
07 - Machines and machine tools 12
42 - Scientific, technological and industrial services, research and design 9
35 - Advertising and business services 5
12 - Land, air and water vehicles; parts of land vehicles 3
Status
Pending 115
Registered / In Force 6,779

1.

Robotic device

      
Application Number 29921273
Grant Number D1114013
Status In Force
Filing Date 2023-12-15
First Publication Date 2026-02-17
Grant Date 2026-02-17
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

2.

AUTONOMOUS AND TELEOPERATED SENSOR POINTING ON A MOBILE ROBOT

      
Application Number 19358968
Status Pending
Filing Date 2025-10-15
First Publication Date 2026-02-12
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Stathis, Christopher
  • Gonano, Dion
  • Paolini, Robert Eugene
  • Komoroski, Adam

Abstract

A computer-implemented method executed by data processing hardware of a robot causes the data processing hardware to perform operations. The operations include receiving a sensor pointing command that commands the robot to use a sensor to capture sensor data of a location in an environment of the robot. The sensor is disposed on the robot. The operations include determining, based on an orientation of the sensor relative to the location, a direction for pointing the sensor toward the location, and an alignment pose of the robot to cause the sensor to point in the direction toward the location. The operations include commanding the robot to move from a current pose to the alignment pose. After the robot moves to the alignment pose and the sensor is pointing in the direction toward the location, the operations include commanding the sensor to capture the sensor data of the location in the environment.
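The geometric core of this abstract — finding a direction for pointing the sensor toward a target location, and a yaw that aligns the robot with it — can be sketched as follows. This is a minimal illustration under simple assumptions (point positions in a shared world frame); the function names are hypothetical and not from the patent.

```python
import math

def pointing_direction(sensor_pos, target_pos):
    """Unit vector from the sensor's position toward the target location.

    sensor_pos, target_pos: (x, y, z) tuples in the same world frame.
    """
    delta = [t - s for s, t in zip(sensor_pos, target_pos)]
    norm = math.sqrt(sum(c * c for c in delta))
    if norm == 0.0:
        raise ValueError("sensor is already at the target location")
    return tuple(c / norm for c in delta)

def alignment_yaw(direction):
    """Yaw angle (radians) for a body pose that faces the given direction."""
    return math.atan2(direction[1], direction[0])
```

A controller would command the robot from its current pose to a pose with this yaw before triggering the sensor capture.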

IPC Classes

3.

Topology Processing for Waypoint-based Navigation Maps

      
Application Number 19356803
Status Pending
Filing Date 2025-10-13
First Publication Date 2026-02-05
Owner Boston Dynamics, Inc. (USA)
Inventor Klingensmith, Matthew Jacob

Abstract

The operations of a computer-implemented method include obtaining a topological map of an environment including a series of waypoints and a series of edges. Each edge topologically connects a corresponding pair of adjacent waypoints. The edges represent traversable routes for a robot. The operations include determining, using the topological map and sensor data captured by the robot, one or more candidate alternate edges. Each candidate alternate edge potentially connects a corresponding pair of waypoints that are not connected by one of the edges. For each respective candidate alternate edge, the operations include determining, using the sensor data, whether the robot can traverse the respective candidate alternate edge without colliding with an obstacle and, when the robot can traverse the respective candidate alternate edge, confirming the respective candidate alternate edge as a respective alternate edge. The operations include updating, using nonlinear optimization and the confirmed alternate edges, the topological map.
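The candidate-edge step in this abstract can be sketched as a graph operation: propose edges between waypoint pairs that are not yet connected, then confirm only the collision-free ones. A minimal sketch; the predicates `in_range` and `collision_free` stand in for the sensor-data checks and are hypothetical names, and the nonlinear-optimization map update is omitted.

```python
from itertools import combinations

def candidate_alternate_edges(waypoints, edges, in_range, collision_free):
    """Propose and confirm alternate edges for a waypoint graph.

    waypoints: iterable of hashable waypoint ids.
    edges: set of frozenset({a, b}) pairs already in the topological map.
    in_range(a, b): True if an edge between a and b is plausible from
        the sensor data (e.g. the waypoints are close enough).
    collision_free(a, b): True if the robot can traverse a-b without
        colliding with an obstacle.
    Returns the set of confirmed alternate edges.
    """
    confirmed = set()
    for a, b in combinations(waypoints, 2):
        pair = frozenset((a, b))
        if pair in edges:
            continue  # already topologically connected
        if in_range(a, b) and collision_free(a, b):
            confirmed.add(pair)
    return confirmed
```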

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G01C 22/00 - Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers or using pedometers
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 1/644 - Optimisation of travel parameters, e.g. of energy consumption, journey time or distance
  • G06F 18/2132 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods; based on discrimination criteria, e.g. discriminant analysis
  • G06F 18/2134 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods; based on separation criteria, e.g. independent component analysis
  • G06F 18/2135 - Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods; based on approximation criteria, e.g. principal component analysis

4.

SEMANTIC MODELS FOR ROBOT AUTONOMY ON DYNAMIC SITES

      
Application Number 19178191
Status Pending
Filing Date 2025-04-14
First Publication Date 2026-02-05
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Da Silva, Marco
  • Jonak, Dom
  • Klingensmith, Matthew
  • Seifert, Samuel

Abstract

A method includes receiving, while a robot traverses a building environment, sensor data captured by one or more sensors of the robot. The method includes receiving a building information model (BIM) for the environment that includes semantic information identifying one or more permanent objects within the environment. The method includes generating a plurality of localization candidates for a localization map of the environment. Each localization candidate corresponds to a feature of the environment identified by the sensor data and represents a potential localization reference point. The localization map is configured to localize the robot within the environment when the robot moves throughout the environment. For each localization candidate, the method includes determining whether the respective feature corresponding to the respective localization candidate is a permanent object in the environment and generating the respective localization candidate as a localization reference point in the localization map for the robot.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 1/617 - Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
  • G05D 111/50 - Internal signals, i.e. from sensors located in the vehicle, e.g. from compasses or angular sensors
  • G06V 20/50 - Context or environment of the image

5.

Detecting and Responding to A Difference Between Parameters of a Legged Robot

      
Application Number 19356851
Status Pending
Filing Date 2025-10-13
First Publication Date 2026-02-05
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Blankespoor, Kevin
  • Perkins, Alex
  • Da Silva, Marco

Abstract

An example method may include i) determining a first distance between a pair of feet of a robot at a first time, where the pair of feet is in contact with a ground surface; ii) determining a second distance between the pair of feet of the robot at a second time, where the pair of feet remains in contact with the ground surface from the first time to the second time; iii) comparing a difference between the determined first and second distances to a threshold difference; iv) determining that the difference between determined first and second distances exceeds the threshold difference; and v) based on the determination that the difference between the determined first and second distances exceeds the threshold difference, causing the robot to react.
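Steps i)–v) of this abstract reduce to a simple check: two feet in continuous ground contact should keep a constant separation, so a change above a threshold indicates something is wrong (e.g. slip) and the robot should react. A minimal sketch of that check, with hypothetical names:

```python
import math

def detect_foot_discrepancy(foot_a_t1, foot_b_t1, foot_a_t2, foot_b_t2,
                            threshold):
    """Flag a distance discrepancy between two stance feet.

    Both feet remain in ground contact from the first time (t1) to the
    second time (t2), so the inter-foot distance should not change; a
    change exceeding `threshold` (meters) triggers a reaction.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

    first_distance = dist(foot_a_t1, foot_b_t1)    # step i)
    second_distance = dist(foot_a_t2, foot_b_t2)   # step ii)
    difference = abs(second_distance - first_distance)  # step iii)
    return difference > threshold                  # steps iv)-v)
```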

IPC Classes

  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid
  • G05D 1/245 - Arrangements for determining position or orientation using dead reckoning
  • G05D 1/43 - Control of position or course in two dimensions

6.

DETECTING AND RESPONDING TO OBSTACLES

      
Application Number 19356902
Status Pending
Filing Date 2025-10-13
First Publication Date 2026-02-05
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Yu, Jeffrey
  • Swilling, Benjamin John
  • Whitman, Eric Cary

Abstract

A computer-implemented method when executed by data processing hardware causes the data processing hardware to perform operations. The operations include detecting a candidate support surface at an elevation less than a current surface supporting a legged robot. A determination is made on whether the candidate support surface includes an area of missing terrain data within a portion of an environment surrounding the legged robot, where the area is large enough to receive a touchdown placement for a leg of the legged robot. If missing terrain data is determined, at least a portion of the area of missing terrain data is classified as a no-step region of the candidate support surface. The no-step region indicates a region where the legged robot should avoid touching down a leg of the legged robot.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid

7.

GROUND CLUTTER AVOIDANCE FOR A MOBILE ROBOT

      
Application Number 19329229
Status Pending
Filing Date 2025-09-15
First Publication Date 2026-01-08
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Komoroski, Adam
  • Yamauchi, Brian
  • Klingensmith, Matthew

Abstract

Methods and apparatus for navigating a robot along a route through an environment, the route being associated with a mission, are provided. The method comprises identifying, based on sensor data received by one or more sensors of the robot, a set of potential obstacles in the environment, determining, based at least in part on stored data indicating a set of footfall locations of the robot during a previous execution of the mission, that at least one of the potential obstacles in the set is an obstacle, and navigating the robot to avoid stepping on the obstacle.

IPC Classes

  • G05D 1/622 - Obstacle avoidance
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid
  • G05D 1/43 - Control of position or course in two dimensions
  • G05D 1/617 - Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
  • G05D 1/628 - Obstacle avoidance following the obstacle profile, e.g. a wall or undulated terrain
  • G06T 7/10 - Segmentation; Edge detection

8.

INTERMEDIATE WAYPOINT GENERATOR

      
Application Number 19326412
Status Pending
Filing Date 2025-09-11
First Publication Date 2026-01-08
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Fay, Gina Christine
  • Rizzi, Alfred

Abstract

A method for generating intermediate waypoints for a navigation system of a robot includes receiving a navigation route. The navigation route includes a series of high-level waypoints that begin at a starting location and end at a destination location and is based on high-level navigation data. The high-level navigation data is representative of locations of static obstacles in an area the robot is to navigate. The method also includes receiving image data of an environment about the robot from an image sensor and generating at least one intermediate waypoint based on the image data. The method also includes adding the at least one intermediate waypoint to the series of high-level waypoints of the navigation route and navigating the robot from the starting location along the series of high-level waypoints and the at least one intermediate waypoint toward the destination location.
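The route-augmentation step in this abstract — inserting image-derived intermediate waypoints between consecutive high-level waypoints — can be sketched as a list operation. A minimal illustration under the assumption that waypoint generation is delegated to a callback; the names are hypothetical, not from the patent.

```python
def insert_intermediate_waypoints(route, generator):
    """Augment a high-level waypoint route with intermediate waypoints.

    route: ordered list of high-level waypoints, start to destination.
    generator(a, b): returns a (possibly empty) list of intermediate
        waypoints to insert between consecutive waypoints a and b,
        e.g. derived from live image data around the robot.
    """
    augmented = [route[0]]
    for a, b in zip(route, route[1:]):
        augmented.extend(generator(a, b))  # image-derived detours
        augmented.append(b)
    return augmented
```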

IPC Classes

  • G01C 21/20 - Instruments for performing navigational calculations

9.

METHODS AND APPARATUS FOR DETERMINING POSE AND SIZE OF OBJECTS USING THREE-DIMENSIONAL MACHINE LEARNING

      
Application Number 18758024
Status Pending
Filing Date 2024-06-28
First Publication Date 2026-01-01
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Pauwels, Karl
  • Kelly, Michael
  • Gardner, Matthew

Abstract

Methods and apparatus for controlling a mobile robot to perform an action are provided. The method includes receiving, by at least one computing device associated with a mobile robot, first sensor data and second sensor data, providing as input to at least one machine learning model, the first sensor data, the second sensor data, and camera intrinsics associated with at least one camera configured to sense the first sensor data and/or the second sensor data, wherein the at least one machine learning model is trained to output polyhedron information representing a set of objects in an environment of the mobile robot, and controlling the mobile robot to perform an action based, at least in part, on the polyhedron information output from the at least one machine learning model.

IPC Classes

10.

SYSTEMS AND METHOD FOR SAFE ACTUATION OF A MOBILE ROBOT

      
Application Number 19243861
Status Pending
Filing Date 2025-06-20
First Publication Date 2025-12-25
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Billings, Devin
  • Rogers, Kyle W.
  • Murphy, Michael

Abstract

A robot configured to operate safely in response to detecting an abnormal operating condition is provided. The robot includes an actuator coupled to a robot member and a motor controller configured to control the actuator to move the robot member about a robot joint. The motor controller includes a first set of components and a second set of components, and each of the first set of components and the second set of components is independently operable by the motor controller to control the actuator to move the robot member about the robot joint.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators

11.

Robotic device

      
Application Number 29921260
Grant Number D1106296
Status In Force
Filing Date 2023-12-15
First Publication Date 2025-12-16
Grant Date 2025-12-16
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

12.

METHODS AND APPARATUS FOR OBJECT QUALITY DETECTION

      
Application Number 18679613
Status Pending
Filing Date 2024-05-31
First Publication Date 2025-12-04
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Broad, Alexander
  • Ushani, Arash
  • Ramachandran, Karthik
  • Blank, Amy
  • Shapton, Krista
  • Gilroy, Scott
  • Rahme, Maurice
  • Shaw, Samuel
  • Aylward, Grant
  • Murphy, Michael
  • Perkins, Alexander

Abstract

Methods and apparatus for assigning a quality metric to an object to be grasped by a mobile robot are provided. The method includes receiving at least one image including a set of objects, processing the at least one image using a trained machine learning model to assign a quality metric to a first object of the set of objects in the at least one image, and controlling the mobile robot to perform an action based, at least in part, on the quality metric assigned to the first object.

IPC Classes

  • G05D 1/656 - Interaction with payloads or external entities
  • G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path
  • G05D 1/65 - Following a desired speed profile
  • G05D 101/15 - Details of software or hardware architectures used for the control of position using artificial intelligence [AI] techniques using machine learning, e.g. neural networks
  • G05D 107/70 - Industrial sites, e.g. warehouses or factories
  • G05D 111/10 - Optical signals
  • G06T 7/00 - Image analysis
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

13.

SLIP HANDLING AND GROUND FRICTION ESTIMATION FOR ROBOTS

      
Application Number 19301224
Status Pending
Filing Date 2025-08-15
First Publication Date 2025-12-04
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Graber-Tilton, Alexander Samuel
  • Swilling, Benjamin John

Abstract

Apparatus and methods for mitigating slip conditions and estimating ground friction for a robot having a plurality of feet are provided. In one aspect, a method includes estimating a coefficient of friction for a ground surface supporting the legged robot based on sensor data, odometry data, and a terrain map of an environment. The sensor data includes a set of joint angles and a set of joint torques for a set of joints of the legged robot, and the odometry data indicates a location of the legged robot in the environment. One of the plurality of feet of the robot applies a force on the ground surface based on the estimated coefficient of friction.
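The friction estimate in this abstract ultimately rests on the Coulomb model: for a stance foot that is not slipping, the tangential force cannot exceed the coefficient of friction times the normal force. A minimal sketch of that relation, assuming contact forces are already available (the patent derives them from joint angles and torques, which is omitted here); the function name is hypothetical.

```python
def friction_coefficient_lower_bound(tangential_force, normal_force):
    """Lower bound on the ground friction coefficient at a stance foot.

    Coulomb friction gives |F_tangential| <= mu * F_normal for a
    non-slipping foot, so the observed force ratio bounds mu from
    below. Forces in newtons; normal_force must be positive.
    """
    if normal_force <= 0.0:
        raise ValueError("foot is not in contact (non-positive normal force)")
    return abs(tangential_force) / normal_force
```

A controller could then cap the commanded tangential force at a foot to stay inside the estimated friction cone.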

IPC Classes

14.

METHODS AND APPARATUS FOR PLACEMENT OF AN OBJECT ON A CONVEYOR USING A ROBOTIC DEVICE

      
Application Number 18679632
Status Pending
Filing Date 2024-05-31
First Publication Date 2025-12-04
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Turpin, Matthew
  • Murphy, Michael

Abstract

Methods and apparatus for determining a velocity of a conveyor associated with a mobile robot are provided. The method includes receiving first image data, the first image data including a first representation of a first object and a conveyor, the first image data captured at a first time, and determining by at least one hardware processor, a velocity of the conveyor based, at least in part, on the first representation of the first object in the first image data and a difference between the first time and a second time different from the first time.
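The velocity computation in this abstract is displacement over elapsed time between two observations of the same object on the belt. A one-dimensional sketch under that assumption; the name and units are illustrative, not from the patent.

```python
def conveyor_velocity(x_first, x_second, t_first, t_second):
    """Conveyor speed from two observations of the same object.

    x_first, x_second: object position (m) along the belt in the first
    and second images; t_first, t_second: capture times (s).
    """
    dt = t_second - t_first
    if dt == 0:
        raise ValueError("images must be captured at different times")
    return (x_second - x_first) / dt
```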

IPC Classes

  • B65G 47/04 - Devices for feeding articles or materials to conveyors for feeding articles
  • B65G 43/08 - Control devices operated by article or material being fed, conveyed, or discharged
  • B65G 47/90 - Devices for picking-up and depositing articles or materials

15.

METHODS AND APPARATUS FOR OBJECT QUALITY DETECTION

      
Application Number US2025031387
Publication Number 2025/250771
Status In Force
Filing Date 2025-05-29
Publication Date 2025-12-04
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Broad, Alexander
  • Ushani, Arash
  • Ramachandran, Karthik
  • Blank, Amy
  • Shapton, Krista
  • Gilroy, Scott
  • Rahme, Maurice
  • Shaw, Samuel
  • Aylward, Grant
  • Murphy, Michael
  • Perkins, Alexander

Abstract

Methods and apparatus for assigning a quality metric to an object to be grasped by a mobile robot are provided. The method includes receiving at least one image including a set of objects, processing the at least one image using a trained machine learning model to assign a quality metric to a first object of the set of objects in the at least one image, and controlling the mobile robot to perform an action based, at least in part, on the quality metric assigned to the first object.

IPC Classes

  • B25J 9/16 - Programme controls
  • G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]

16.

Robotic device

      
Application Number 29921262
Grant Number D1103236
Status In Force
Filing Date 2023-12-15
First Publication Date 2025-11-25
Grant Date 2025-11-25
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Gant, Evan Isaac Timerding
  • Robert, David

17.

ROBOT MOVEMENT AND INTERACTION WITH MASSIVE BODIES

      
Application Number 19274783
Status Pending
Filing Date 2025-07-21
First Publication Date 2025-11-13
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Koolen, Frans Anton
  • Deits, Robin

Abstract

The invention includes systems and methods for determining movement of a robot. A computing system of the robot receives information comprising a reference behavior specification, a current state of the robot, and a characteristic of a massive body coupled to or expected to be coupled to the robot. The computing system determines, based on the information, a set of movement parameters for the robot, the set of movement parameters reflecting a goal trajectory for the robot. The computing system instructs the robot to move consistent with the set of movement parameters.

IPC Classes

18.

SYSTEMS AND METHODS FOR EQUALIZING AUDIO FOR PLAYBACK ON AN ELECTRONIC DEVICE

      
Application Number 19264725
Status Pending
Filing Date 2025-07-09
First Publication Date 2025-10-30
Owner GOOGLE TECHNOLOGY HOLDINGS LLC (USA)
Inventor
  • Schuster, Adrian M.
  • Annabathula, Prabhu
  • Bastyr, Kevin J.
  • Wells, Andrew K.
  • Zhang, Wen Hao

Abstract

Embodiments are provided for receiving a request to output audio at a first speaker and a second speaker of an electronic device, determining that the electronic device is oriented in a portrait orientation or a landscape orientation, identifying, based on the determined orientation, a first equalization setting for the first speaker and a second equalization setting for the second speaker, providing, for output at the first speaker, a first audio signal with the first equalization setting, and providing, for output at the second speaker, a second audio signal with the second equalization setting.
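The orientation-dependent selection this abstract describes is essentially a lookup: detect portrait vs. landscape, then pick a per-speaker equalization setting. A minimal sketch; the table contents and all names here are hypothetical placeholders, since real settings would come from acoustic tuning of the specific device.

```python
# Hypothetical equalization presets keyed by device orientation.
EQ_TABLE = {
    "portrait":  {"first_speaker": "eq_portrait_a",  "second_speaker": "eq_portrait_b"},
    "landscape": {"first_speaker": "eq_landscape_a", "second_speaker": "eq_landscape_b"},
}

def select_equalization(orientation):
    """Pick per-speaker equalization settings for the detected orientation."""
    try:
        settings = EQ_TABLE[orientation]
    except KeyError:
        raise ValueError(f"unknown orientation: {orientation!r}") from None
    return settings["first_speaker"], settings["second_speaker"]
```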

IPC Classes

  • H04R 29/00 - Monitoring arrangements; Testing arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/16 - Sound input; Sound output
  • H03G 5/16 - Automatic control
  • H04R 3/04 - Circuits for transducers for correcting frequency response

19.

Constrained Mobility Mapping

      
Application Number 19253323
Status Pending
Filing Date 2025-06-27
First Publication Date 2025-10-16
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Whitman, Eric
  • Fay, Gina Christine
  • Khripin, Alex
  • Bajracharya, Max
  • Malchano, Matthew
  • Komoroski, Adam
  • Stathis, Christopher

Abstract

A method of constrained mobility mapping includes receiving from at least one sensor of a robot at least one original set of sensor data and a current set of sensor data. Here, each of the at least one original set of sensor data and the current set of sensor data corresponds to an environment about the robot. The method further includes generating a voxel map including a plurality of voxels based on the at least one original set of sensor data. The method also includes generating a spherical depth map based on the current set of sensor data and determining that a change has occurred to an obstacle represented by the voxel map based on a comparison between the voxel map and the spherical depth map. The method additionally includes updating the voxel map to reflect the change to the obstacle.

IPC Classes

  • G05D 1/606 - Compensating for or utilising external environmental conditions, e.g. wind or water currents
  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid
  • G05D 1/243 - Means capturing signals occurring naturally from the environment, e.g. ambient optical, acoustic, gravitational or magnetic signals
  • G05D 1/622 - Obstacle avoidance
  • G05D 111/67 - Sensor fusion

20.

Alternate Route Finding for Waypoint-based Navigation Maps

      
Application Number 19186269
Status Pending
Filing Date 2025-04-22
First Publication Date 2025-10-09
Owner Boston Dynamics, Inc. (USA)
Inventor Merewether, Gene Brown

Abstract

A computer-implemented method executed by data processing hardware of a robot causes the data processing hardware to perform operations including obtaining a topological map including waypoints and edges. Each edge connects adjacent waypoints. The waypoints and edges represent a navigation route for the robot to follow. Operations include determining, that an edge that connects first and second waypoints is blocked by an obstacle. Operations include generating, using image data and the topological map, one or more alternate waypoints offset from one of the waypoints. For each alternate waypoint, operations include generating an alternate edge connecting the alternate waypoint to a waypoint. Operations include adjusting the navigation route to include at least one alternate waypoint and alternate edge that bypass the obstacle. Operations include navigating the robot from the first waypoint to an alternate waypoint along the alternate edge connecting the alternate waypoint to the first waypoint.

IPC Classes

21.

UPDATING MOVEMENT PLANS FOR A ROBOTIC DEVICE

      
Application Number 19246233
Status Pending
Filing Date 2025-06-23
First Publication Date 2025-10-09
Owner Boston Dynamics, Inc. (USA)
Inventor Stephens, Benjamin

Abstract

An example implementation for determining mechanically-timed footsteps may involve a robot having a first foot in contact with a ground surface and a second foot not in contact with the ground surface. The robot may determine the position of its center of mass and its center-of-mass velocity and, based on these, determine a capture point for the robot. The robot may also determine a threshold position for the capture point, where the threshold position is based on a target trajectory for the capture point after the second foot contacts the ground surface. The robot may determine that the capture point has reached this threshold position and, based on this determination, cause the second foot to contact the ground surface.
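The capture point mentioned in this abstract has a standard closed form under the linear inverted pendulum model: the center-of-mass position plus the center-of-mass velocity scaled by sqrt(height / g). The sketch below uses that textbook formula as an illustration; it is not necessarily the exact computation claimed in the patent, and the names are hypothetical.

```python
import math

def capture_point(com_pos, com_vel, com_height, g=9.81):
    """Instantaneous capture point under the linear inverted pendulum model.

    com_pos, com_vel: horizontal center-of-mass position (m) and
    velocity (m/s) along one axis; com_height: CoM height above the
    ground (m). The capture point is where a foot must land to bring
    the robot to rest.
    """
    return com_pos + com_vel * math.sqrt(com_height / g)
```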

IPC Classes

  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid
  • B25J 9/16 - Programme controls

22.

ROBOT WITH EXTRA-HUMAN BEHAVIOR

      
Application Number US2025019901
Publication Number 2025/212251
Status In Force
Filing Date 2025-03-14
Publication Date 2025-10-09
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Welner, Jakob
  • Izatt, Gregory
  • Deits, Robin
  • Kuindersma, Scott
  • Abe, Yeuhi
  • Kanzaki, Shigeru
  • Hepler, Leland
  • Robert, David
  • Stephens, Benjamin J.

Abstract

Methods and apparatus for controlling a robot (e.g., having a set of continuous rotation joints) to perform extra human behaviors are provided. The method includes receiving task information to perform a task, determining, using a control system of the robot, a motion plan for the robot to perform the task, wherein the motion plan includes rotation about one or more joints of the robot (e.g., about at least one of the continuous rotation joints in the set of continuous rotation joints) to efficiently perform the task using extra human behaviors.

IPC Classes

  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg or with alternately or sequentially lifted feet or skid

23.

ROBOT WITH EXTRA-HUMAN BEHAVIOR

      
Application Number 19079646
Status Pending
Filing Date 2025-03-14
First Publication Date 2025-10-02
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Welner, Jakob
  • Izatt, Gregory
  • Deits, Robin
  • Kuindersma, Scott
  • Abe, Yeuhi
  • Kanzaki, Shigeru
  • Hepler, Leland
  • Robert, David
  • Stephens, Benjamin J.

Abstract

Methods and apparatus for controlling a robot (e.g., having a set of continuous rotation joints) to perform extra-human behaviors are provided. The method includes receiving task information to perform a task and determining, using a control system of the robot, a motion plan for the robot to perform the task, wherein the motion plan includes rotation about one or more joints of the robot (e.g., about at least one of the continuous rotation joints in the set of continuous rotation joints) to efficiently perform the task using extra-human behaviors.

IPC Classes  ?

24.

ROBOTIC END EFFECTOR WITH TACTILE SENSING

      
Application Number 19086243
Status Pending
Filing Date 2025-03-21
First Publication Date 2025-09-25
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Taylor, Ian
  • Rodriguez, Alberto

Abstract

A sensor module for an end effector of a robot is described. The sensor module includes a substrate having formed thereon, a set of proximity sensors and a set of pressure sensors, the set of proximity sensors and the set of pressure sensors configured to have overlapping sensing regions, and a cover coupled to the substrate, the cover comprising a material that permits transmission of signals from the set of proximity sensors through the material.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 15/10 - Gripping heads having finger members with three or more finger members
  • B25J 19/02 - Sensing devices

25.

ONLINE AUTHORING OF ROBOT AUTONOMY APPLICATIONS

      
Application Number 19220670
Status Pending
Filing Date 2025-05-28
First Publication Date 2025-09-11
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Seifert, Samuel
  • Hepler, Leland

Abstract

A method for online authoring of robot autonomy applications includes receiving sensor data of an environment about a robot while the robot traverses through the environment. The method also includes generating an environmental map representative of the environment about the robot based on the received sensor data. While generating the environmental map, the method includes localizing a current position of the robot within the environmental map and, at each corresponding target location of one or more target locations within the environment, recording a respective action for the robot to perform. The method also includes generating a behavior tree for navigating the robot to each corresponding target location and controlling the robot to perform the respective action at each corresponding target location within the environment during a future mission when the current position of the robot within the environmental map reaches the corresponding target location.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
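The abstract above describes recording an action per target location and generating a behavior tree that navigates to each target and performs the recorded action. As an illustrative sketch only (class names and the dictionary-based robot state are hypothetical, not taken from the filing), a minimal sequence-style behavior tree for such a mission might look like:

```python
# Minimal behavior-tree sketch: a Sequence node ticks its children in order,
# failing fast if any child fails. Leaves simulate navigation and actions.
class Sequence:
    def __init__(self, children):
        self.children = children

    def tick(self, robot):
        # Run each child in order; the sequence fails if any child fails.
        for child in self.children:
            if not child.tick(robot):
                return False
        return True

class NavigateTo:
    def __init__(self, waypoint):
        self.waypoint = waypoint

    def tick(self, robot):
        robot["position"] = self.waypoint  # stand-in for real navigation
        return True

class PerformAction:
    def __init__(self, action):
        self.action = action

    def tick(self, robot):
        # Record the action as performed at the robot's current position.
        robot.setdefault("log", []).append((robot["position"], self.action))
        return True

def build_mission(targets):
    # One NavigateTo + PerformAction pair per recorded target location.
    children = []
    for waypoint, action in targets:
        children.append(NavigateTo(waypoint))
        children.append(PerformAction(action))
    return Sequence(children)
```

Ticking `build_mission([("A", "inspect"), ("B", "photo")])` against a fresh robot state visits each waypoint in order and logs the recorded action there, mirroring the navigate-then-act structure the abstract describes.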

26.

NETWORK COMMUNICATION DEVICES AND METHODS FOR ROBOTIC OPERATIONS

      
Application Number US2024060552
Publication Number 2025/183774
Status In Force
Filing Date 2024-12-17
Publication Date 2025-09-04
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Saunders, John Aaron
  • Murphy, Michael
  • Malchano, Matthew
  • Vicentini, Federico
  • Diaz-Lankenau, Guillermo
  • Lee, Christopher

Abstract

Methods and apparatus for implementing signaling in access point devices are described. An access point device may include a network interface configured to be coupled to a wired network, a radio configured to emit radio waves that enable a plurality of wireless devices in an environment of the access point device to wirelessly access the wired network, and at least one signaling component, wherein the at least one signaling component is configured to transmit and/or receive signals, wherein the signals are different from the radio waves emitted from the radio.

IPC Classes  ?

  • H04W 4/024 - Guidance services
  • H04W 4/38 - Services specially adapted for particular environments, situations or purposes for collecting sensor information
  • H04W 4/80 - Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
  • H04W 88/08 - Access point devices

27.

NETWORK COMMUNICATION DEVICES AND METHODS FOR ROBOTIC OPERATIONS

      
Application Number 18984434
Status Pending
Filing Date 2024-12-17
First Publication Date 2025-08-28
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Saunders, John Aaron
  • Murphy, Michael
  • Malchano, Matthew
  • Vicentini, Federico
  • Diaz-Lankenau, Guillermo
  • Lee, Christopher

Abstract

Methods and apparatus for implementing signaling in access point devices are described. An access point device may include a network interface configured to be coupled to a wired network, a radio configured to emit radio waves that enable a plurality of wireless devices in an environment of the access point device to wirelessly access the wired network, and at least one signaling component, wherein the at least one signaling component is configured to transmit and/or receive signals, wherein the signals are different from the radio waves emitted from the radio.

IPC Classes  ?

  • G05D 1/226 - Communication links with the remote-control arrangements
  • B25J 9/16 - Programme controls
  • G05D 1/622 - Obstacle avoidance
  • G05D 1/69 - Coordinated control of the position or course of two or more vehicles
  • G05D 105/28 - Specific applications of the controlled vehicles for transportation of freight
  • G05D 107/70 - Industrial sites, e.g. warehouses or factories
  • H04L 67/12 - Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
  • H04W 4/33 - Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
  • H04W 76/15 - Setup of multiple wireless link connections
  • H04W 84/12 - WLAN [Wireless Local Area Networks]

28.

CONSTRAINED MANIPULATION OF OBJECTS

      
Application Number 19206968
Status Pending
Filing Date 2025-05-13
First Publication Date 2025-08-28
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Aghasadeghi, Navid
  • Rizzi, Alfred Anthony
  • Fay, Gina
  • Paolini, Robert Eugene

Abstract

A computer-implemented method executed by data processing hardware of a robot causes the data processing hardware to perform operations. The robot includes an articulated arm having an end effector configured to engage with an object. The operations include receiving a measured task parameter set for the end effector, the measured task parameter set representing positions of the end effector while manipulating the object. The operations also include generating a task space model for the object based on the measured task parameter set, the task space model modelling at least one constrained axis of the object. The operations further include limiting movement of the end effector along the at least one constrained axis of the object based on the task space model.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05B 19/4155 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by programme execution, i.e. part programme or machine function execution, e.g. selection of a programme
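The abstract above describes limiting end-effector movement along a constrained axis of an object (e.g., the axis a hinge or rail forbids motion along). As an illustrative sketch only (the function name and pure-Python 3-vector representation are hypothetical, not from the filing), one simple way to enforce such a limit is to project the constrained-axis component out of a commanded velocity:

```python
import math

def project_out_axis(velocity, constrained_axis):
    """Remove the component of a commanded end-effector velocity that lies
    along a constrained axis, so the remaining motion stays on the allowed
    manifold. Vectors are plain 3-element lists; axis need not be unit length."""
    norm = math.sqrt(sum(a * a for a in constrained_axis))
    axis = [a / norm for a in constrained_axis]          # normalize the axis
    dot = sum(v * a for v, a in zip(velocity, axis))     # component along axis
    return [v - dot * a for v, a in zip(velocity, axis)]
```

For example, with the z-axis constrained, a commanded velocity of `[1.0, 2.0, 3.0]` is clipped to `[1.0, 2.0, 0.0]`: the in-plane motion passes through while the constrained component is removed.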

29.

DIRECTED EXPLORATION FOR NAVIGATION IN DYNAMIC ENVIRONMENTS

      
Application Number 19017008
Status Pending
Filing Date 2025-01-10
First Publication Date 2025-08-07
Owner Boston Dynamics, Inc. (USA)
Inventor Yamauchi, Brian Masao

Abstract

A computer-implemented method when executed by data processing hardware causes the data processing hardware to perform operations. The operations include receiving a navigation route for a mobile robot. The navigation route includes a sequence of waypoints connected by edges. Each edge corresponds to movement instructions that navigate the mobile robot between waypoints of the sequence of waypoints. While the mobile robot is traveling along the navigation route, the operations include determining that the mobile robot is unable to execute a respective movement instruction for a respective edge of the navigation route due to an obstacle obstructing the respective edge, generating an alternative path to navigate the mobile robot to an untraveled waypoint in the sequence of waypoints, and resuming travel by the mobile robot along the navigation route. The alternative path avoids the obstacle.

IPC Classes  ?

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 1/617 - Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
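The abstract above describes detecting that an edge of the navigation route is obstructed and generating an alternative path to an untraveled waypoint. As an illustrative sketch only (the graph representation and function name are hypothetical, not from the filing), a breadth-first search over the waypoint graph that skips the blocked edge captures the idea:

```python
from collections import deque

def alternative_path(edges, start, goal, blocked):
    """Breadth-first search over an undirected waypoint graph, skipping the
    blocked edge, to find an alternative path to an untraveled waypoint.
    Returns the waypoint sequence, or None if no detour exists."""
    adjacency = {}
    for a, b in edges:
        adjacency.setdefault(a, []).append(b)
        adjacency.setdefault(b, []).append(a)
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in adjacency.get(node, []):
            # Skip the obstructed edge (in either direction) and visited nodes.
            if (node, nxt) == blocked or (nxt, node) == blocked or nxt in seen:
                continue
            seen.add(nxt)
            queue.append(path + [nxt])
    return None
```

With edges A-B, B-C, A-D, D-C and the B-C edge obstructed, the search routes the robot B → A → D → C, after which travel along the original route can resume at C.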

30.

Robotic device

      
Application Number 29921268
Grant Number D1085192
Status In Force
Filing Date 2023-12-15
First Publication Date 2025-07-22
Grant Date 2025-07-22
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

31.

FOOTSTEP CONTACT DETECTION

      
Application Number 19012395
Status Pending
Filing Date 2025-01-07
First Publication Date 2025-07-10
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Whitman, Eric
  • Khripin, Alex

Abstract

A method of footstep contact detection includes receiving joint dynamics data for a swing phase of a swing leg of the robot, receiving odometry data indicative of a pose of the robot, determining whether an impact on the swing leg is indicative of a touchdown of the swing leg based on the joint dynamics data and an amount of completion of the swing phase, and determining, when the impact on the swing leg is not indicative of the touchdown of the swing leg, a cause of the impact based on the joint dynamics data and the odometry data.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • B62D 57/024 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members specially adapted for moving on inclined or vertical surfaces
  • B62D 57/028 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members having wheels and mechanical legs
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg, or with alternately or sequentially lifted feet or skid
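The abstract above classifies an impact on the swing leg as a touchdown or not based on joint dynamics and how far through the swing phase the leg is. As an illustrative sketch only (the function, thresholds, and labels are hypothetical, not from the filing), a minimal classifier along those lines might be:

```python
def classify_impact(joint_torque, swing_completion, torque_threshold=30.0,
                    min_completion=0.8):
    """Classify an impact on a swing leg. An impact with sufficient joint
    torque late in the swing phase is treated as touchdown; a comparable
    impact early in the swing phase is treated as a trip/obstacle strike.
    Torque in N*m and swing_completion in [0, 1]; thresholds illustrative."""
    if joint_torque < torque_threshold:
        return "no_contact"          # disturbance too small to be contact
    if swing_completion >= min_completion:
        return "touchdown"           # expected ground contact at end of swing
    return "trip"                    # unexpected early impact
```

A real controller would fuse this with odometry (as the abstract notes) to distinguish, say, a stumble from early ground contact on uneven terrain; this sketch shows only the swing-phase gating.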

32.

ROBOTIC MANIPULATION OF OBJECTS

      
Application Number 18541239
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Doherty, Kevin
  • Rodriguez, Alberto

Abstract

A computing system of a robot receives robot data reflecting at least a portion of the robot, and object data reflecting at least a portion of an object, the object data determined based on information from at least two sources. The computing system determines, based on the robot data and the object data, a set of states of the object, each state in the set of states associated with a distinct time at which the object is at least partially supported by the robot. The set of states includes at least three states associated with three distinct times. The computing system instructs the robot to perform a manipulation of the object based, at least in part, on at least one state in the set of states.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

33.

ROBOT LOCALIZATION USING DATA WITH VARIABLE DATA TYPES

      
Application Number 18979094
Status Pending
Filing Date 2024-12-12
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Chestnutt, Joel Elliott
  • Klingensmith, Matthew Jacob
  • Otero, Nicholas
  • Seifert, Samuel Frank
  • Stathis, Christopher

Abstract

Systems and methods are described for instructing performance of a localization and an action by a mobile robot based on composite data. A system may obtain satellite-based position data and one or more of odometry data or point cloud data. The system may generate composite data by merging the satellite-based position data and the one or more of the odometry data or the point cloud data. The system may instruct performance of a localization by the mobile robot based on the composite data. Based on the localization by the mobile robot, the system may identify an action and instruct performance of the action by the mobile robot.

IPC Classes  ?

  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg, or with alternately or sequentially lifted feet or skid
  • G01S 19/45 - Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
  • G05D 1/222 - Remote-control arrangements operated by humans
  • G05D 1/648 - Performing a task within a working area or space, e.g. cleaning
  • G05D 111/50 - Internal signals, i.e. from sensors located in the vehicle, e.g. from compasses or angular sensors
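The abstract above merges satellite-based position data with odometry (or point cloud) data into composite data for localization. As an illustrative sketch only (the per-axis inverse-variance weighting and function name are hypothetical simplifications, not the filing's method), one common way to merge two position estimates is:

```python
def fuse_positions(gps, odom, gps_var, odom_var):
    """Variance-weighted merge of a satellite-based position estimate and an
    odometry-based estimate into a composite position. Each estimate is a
    list of coordinates; the noisier source (larger variance) gets less
    weight. A simplified stand-in for a full fusion filter."""
    w_gps = 1.0 / gps_var
    w_odom = 1.0 / odom_var
    total = w_gps + w_odom
    return [(w_gps * g + w_odom * o) / total for g, o in zip(gps, odom)]
```

With equal variances the composite lands midway between the two estimates; as odometry drift grows (its variance increases), the composite tracks the satellite fix more closely.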

34.

AUTOMATED CONSTRAINED MANIPULATION

      
Application Number US2024059801
Publication Number 2025/128842
Status In Force
Filing Date 2024-12-12
Publication Date 2025-06-19
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Aghasadeghi, Navid
  • Paolini, Robert Eugene

Abstract

Techniques for automated constrained manipulation are provided. In one aspect, a method includes receiving a request for manipulating a target constrained object and receiving perception data from at least one sensor of a robot, the perception data being indicative of the target constrained object. The method also includes receiving a semantic model of the target constrained object generated based on the perception data and determining a location for a robotic arm of the robot to interact with the target constrained object based on the semantic model and the request. The method further includes controlling the robotic arm to manipulate the target constrained object based on the location for the robotic arm to interact with the target constrained object.

IPC Classes  ?

35.

POWER MANAGEMENT FOR A ROBOTIC DEVICE

      
Application Number 18541021
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Kelly, Matthew
  • Deits, Robin
  • Dicarlo, Jared

Abstract

Methods and apparatus for determining a control strategy for controlling a robot based on power limits of the robot are provided. The method includes determining, by a computing device, motor control information for a plurality of motors associated with a plurality of joints of the robot, the motor control information being determined based, at least in part, on a robot trajectory to achieve a desired behavior and power limit information associated with a power system of the robot, and controlling the plurality of motors associated with the plurality of joints of the robot based, at least in part, on the motor control information.

IPC Classes  ?

  • G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path
  • B60L 15/20 - Methods, circuits or devices for controlling the propulsion of electrically-propelled vehicles, e.g. their traction-motor speed, to achieve a desired performance; Adaptation of control equipment on electrically-propelled vehicles for remote actuation from a stationary place, from alternative parts of the vehicle or from alternative vehicles of the same vehicle train for control of the vehicle or its driving motor to achieve a desired performance, e.g. speed, torque, programmed variation of speed
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg, or with alternately or sequentially lifted feet or skid
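The abstract above determines motor control information from a desired trajectory subject to power limits of the robot's power system. As an illustrative sketch only (uniform torque scaling is a hypothetical simplification, not the filing's control strategy), one way to respect a total power budget is:

```python
def limit_joint_torques(torques, velocities, power_limit):
    """Uniformly scale commanded joint torques so the total mechanical power
    (sum over joints of |torque * joint velocity|) stays within the robot's
    power limit. Torques in N*m, velocities in rad/s, power in W."""
    power = sum(abs(t * v) for t, v in zip(torques, velocities))
    if power <= power_limit:
        return list(torques)          # within budget: pass through unchanged
    scale = power_limit / power       # shrink all torques by a common factor
    return [t * scale for t in torques]
```

Uniform scaling preserves the direction of the commanded joint effort, which degrades the motion gracefully rather than saturating individual joints; a trajectory-aware controller, as described above, would instead replan so the budget is met without sacrificing the behavior.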

36.

BODY STRUCTURE FOR A ROBOT

      
Application Number 18541213
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc (USA)
Inventor Geating, Joshua Timothy

Abstract

The invention includes systems and methods relating to improved body structures for robots. A robot assembly includes a base member (e.g., a pelvis base). The robot assembly includes a first hip member rotatably connected to the pelvis base. The first hip member is connected to a first electric actuator configured to rotate the first hip member about a first hip axis relative to the pelvis base. A first intermediate member is rotatably connected to the first hip member. The first intermediate member is connected to a second electric actuator configured to rotate the first intermediate member about a second hip axis relative to the first hip member. A first leg member is rotatably connected to the first intermediate member. The first leg member is connected to a third electric actuator configured to rotate the first leg member about a third hip axis relative to the first intermediate member.

IPC Classes  ?

  • B25J 9/12 - Programme-controlled manipulators characterised by positioning means for manipulator elements electric
  • B25J 17/00 - Joints

37.

ROBOTIC MANIPULATION OF OBJECTS

      
Application Number 18541256
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Petersen, Mark
  • Izatt, Greg
  • Thomasson, Rachel
  • Rodriguez, Alberto
  • Kuindersma, Scott

Abstract

Computer-implemented methods and apparatus for manipulating an object using a robotic device are provided. The method includes associating a first grasp region of an object with an end effector of a robotic device, wherein the first grasp region includes a set of potential grasps achievable by the end effector of the robotic device. The method further includes determining, within the first grasp region, a grasp from among the set of potential grasps, wherein the grasp is determined based, at least in part, on information associated with a capability of the robotic device to perform the grasp, and instructing the robotic device to manipulate the object based on the grasp.

IPC Classes  ?
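The abstract above determines a grasp within a grasp region based on the robot's capability to perform it. As an illustrative sketch only (the dictionary fields, reachability set, and scoring are hypothetical, not from the filing), a capability-filtered grasp selection might look like:

```python
def select_grasp(candidate_grasps, reachability, payload_limit):
    """Pick a grasp from a grasp region: discard grasps whose pose the robot
    cannot reach or whose required force exceeds its payload capability,
    then return the highest-quality remaining candidate (or None)."""
    feasible = [g for g in candidate_grasps
                if g["pose"] in reachability and g["force"] <= payload_limit]
    if not feasible:
        return None
    return max(feasible, key=lambda g: g["quality"])
```

Filtering by capability before ranking by quality mirrors the abstract's two-step structure: the grasp region supplies the candidate set, and the robot's capabilities decide which candidate is actually executed.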

38.

SCREW ACTUATOR

      
Application Number 18541267
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Katz, Benjamin G.
  • De Souza Ramos, Joao Luiz Almeida

Abstract

The invention includes a screw actuator. The screw actuator includes a screw having a screw shaft and a screw nut. The screw shaft defines a first longitudinal axis along its length. The screw nut at least partially surrounds the screw shaft. The screw actuator includes a motor having a stator and a rotor. The rotor is mechanically coupled to the screw shaft. The stator at least partially surrounds the rotor. The screw actuator includes a first rigid member having a length dimension oriented along the first longitudinal axis. The screw actuator includes a second rigid member mechanically constrained relative to the first rigid member. The second rigid member is configured to move along a direction of the first longitudinal axis.

IPC Classes  ?

  • B25J 9/12 - Programme-controlled manipulators characterised by positioning means for manipulator elements electric
  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements

39.

SYSTEMS AND METHODS FOR ALTERING OPERATION OF MACHINERY

      
Application Number 18541355
Status Pending
Filing Date 2023-12-15
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor Vicentini, Federico

Abstract

Systems and methods are disclosed for altering operation of a machine (e.g., bringing about an emergency stop). A device comprises an emitter configured to produce a first acoustic signal having a first tone and a second tone, wherein the first tone is different from the second tone. The device also includes an activation mechanism in communication with the emitter. The activation mechanism is configured to activate the emitter.

IPC Classes  ?

  • G05B 15/02 - Systems controlled by a computer electric

40.

AUTOMATED CONSTRAINED MANIPULATION

      
Application Number 18978536
Status Pending
Filing Date 2024-12-12
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Aghasadeghi, Navid
  • Paolini, Robert Eugene

Abstract

Techniques for automated constrained manipulation are provided. In one aspect, a method includes receiving a request for manipulating a target constrained object and receiving perception data from at least one sensor of a robot, the perception data being indicative of the target constrained object. The method also includes receiving a semantic model of the target constrained object generated based on the perception data and determining a location for a robotic arm of the robot to interact with the target constrained object based on the semantic model and the request. The method further includes controlling the robotic arm to manipulate the target constrained object based on the location for the robotic arm to interact with the target constrained object.

IPC Classes  ?

41.

DYNAMIC AND VARIABLE DEFINITION OF ROBOT MISSIONS

      
Application Number 18979155
Status Pending
Filing Date 2024-12-12
First Publication Date 2025-06-19
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Rice, Alexander C.
  • Ringley, Brian
  • Basmajian, Christopher Michael
  • Hurt, Colton
  • Finnie, Iii, Gordon
  • Hepler, Leland John

Abstract

Systems and methods are described for dynamic and variable definition of robot missions and performance of the robot missions. A system can obtain first robot mission data associated with a first robot mission and second robot mission data. The second robot mission data may be associated with a second robot mission or a user-defined route edge. The system can generate composite mission data based on the first robot mission data and the second robot mission data. The system can instruct navigation of a robot through an environment according to the composite mission data.

IPC Classes  ?

42.

SYSTEMS AND METHODS FOR ALTERING OPERATION OF MACHINERY

      
Application Number US2024054375
Publication Number 2025/128231
Status In Force
Filing Date 2024-11-04
Publication Date 2025-06-19
Owner BOSTON DYNAMICS, INC. (USA)
Inventor Vicentini, Federico

Abstract

Systems and methods are disclosed for altering operation of a machine (e.g., bringing about an emergency stop). A device comprises an emitter configured to produce a first acoustic signal having a first tone and a second tone, wherein the first tone is different from the second tone. The device also includes an activation mechanism in communication with the emitter. The activation mechanism is configured to activate the emitter.

IPC Classes  ?

  • B25J 9/16 - Programme controls
  • B25J 19/06 - Safety devices
  • G05B 9/02 - Safety arrangements electric
  • G05B 19/406 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form characterised by monitoring or safety
  • F16P 3/14 - Safety devices acting in conjunction with the control or operation of a machine; Control arrangements requiring the simultaneous use of two or more parts of the body with means, e.g. feelers, which in case of the presence of a body part of a person in or near the danger zone influence the control or operation of the machine the means being photocells or other devices sensitive without mechanical contact

43.

ROBOT LOCALIZATION USING DATA WITH VARIABLE DATA TYPES

      
Application Number US2024059844
Publication Number 2025/128869
Status In Force
Filing Date 2024-12-12
Publication Date 2025-06-19
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Chestnutt, Joel Elliott
  • Klingensmith, Matthew Jacob
  • Otero, Nicholas
  • Seifert, Samuel Frank
  • Stathis, Christopher

Abstract

Systems and methods are described for instructing performance of a localization and an action by a mobile robot based on composite data. A system may obtain satellite-based position data and one or more of odometry data or point cloud data. The system may generate composite data by merging the satellite-based position data and the one or more of the odometry data or the point cloud data. The system may instruct performance of a localization by the mobile robot based on the composite data. Based on the localization by the mobile robot, the system may identify an action and instruct performance of the action by the mobile robot.

IPC Classes  ?

  • G05D 1/229 - Command input data, e.g. waypoints
  • G01C 21/00 - Navigation; Navigational instruments not provided for in groups
  • G05D 1/245 - Arrangements for determining position or orientation using dead reckoning
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 1/248 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons generated by satellites, e.g. GPS
  • G05D 107/17 - Spaces with priority for humans, e.g. populated areas, pedestrian ways, parks or beaches
  • G05D 107/60 - Open buildings, e.g. offices, hospitals, shopping areas or universities
  • G05D 109/12 - Land vehicles with legs
  • G05D 111/50 - Internal signals, i.e. from sensors located in the vehicle, e.g. from compasses or angular sensors

44.

SCREW ACTUATOR, A ROBOT COMPRISING THE SCREW ACTUATOR AND A METHOD

      
Application Number US2024060032
Publication Number 2025/128996
Status In Force
Filing Date 2024-12-13
Publication Date 2025-06-19
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Katz, Benjamin G.
  • De Souza Ramos, Joao Luiz Almeida

Abstract

The invention includes a screw actuator. The screw actuator includes a screw having a screw shaft and a screw nut. The screw shaft defines a first longitudinal axis along its length. The screw nut at least partially surrounds the screw shaft. The screw actuator includes a motor having a stator and a rotor. The rotor is mechanically coupled to the screw shaft. The stator at least partially surrounds the rotor. The screw actuator includes a first rigid member having a length dimension oriented along the first longitudinal axis. The screw actuator includes a second rigid member mechanically constrained relative to the first rigid member. The second rigid member is configured to move along a direction of the first longitudinal axis.

IPC Classes  ?

  • F16H 25/20 - Screw mechanisms
  • B25J 9/12 - Programme-controlled manipulators characterised by positioning means for manipulator elements electric
  • B25J 17/02 - Wrist joints
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg, or with alternately or sequentially lifted feet or skid
  • H02K 7/06 - Means for converting reciprocating motion into rotary motion or vice versa
  • F16H 25/22 - Screw mechanisms with balls, rollers, or similar members between the co-operating partsElements essential to the use of such members

45.

INTELLIGENT GRIPPER WITH INDIVIDUAL CUP CONTROL

      
Application Number 19055622
Status Pending
Filing Date 2025-02-18
First Publication Date 2025-06-12
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Saunders, John Aaron
  • Thorne, Christopher Everett
  • Meduna, Matthew Paul
  • Geating, Joshua Timothy

Abstract

Systems and methods related to intelligent grippers with individual cup control are disclosed. One aspect of the disclosure provides a method of determining grip quality between a robotic gripper and an object. The method comprises applying a vacuum to two or more cup assemblies of the robotic gripper in contact with the object, moving the object with the robotic gripper after applying the vacuum to the two or more cup assemblies, and determining, using at least one pressure sensor associated with each of the two or more cup assemblies, a grip quality between the robotic gripper and the object.

IPC Classes  ?
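The abstract above determines grip quality from per-cup pressure sensors after applying vacuum and moving the object. As an illustrative sketch only (the threshold, units, and fraction-based score are hypothetical, not from the filing), a simple per-cup quality estimate might be:

```python
def grip_quality(cup_pressures, vacuum_threshold=-50.0):
    """Estimate grip quality as the fraction of suction-cup assemblies whose
    pressure sensor reads a sufficient vacuum (at or below the threshold,
    in kPa gauge). A cup that has lost its seal reads near ambient and is
    counted as not holding. Threshold and units are illustrative."""
    if not cup_pressures:
        return 0.0
    sealed = sum(1 for p in cup_pressures if p <= vacuum_threshold)
    return sealed / len(cup_pressures)
```

Because each cup is sensed individually (the "individual cup control" of the title), the same readings could also drive per-cup reactions, e.g. re-seating or deactivating a cup that has lost vacuum mid-move.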

46.

INTEGRATED MOBILE MANIPULATOR ROBOT

      
Application Number 19042586
Status Pending
Filing Date 2025-01-31
First Publication Date 2025-06-05
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Murphy, Michael
  • Zelnick, Benjamin
  • Hansen, Malik
  • Chernyak, Vadim
  • Thorne, Christopher Everett
  • Perkins, Alex

Abstract

A robot includes a mobile base, a turntable rotatably coupled to the mobile base, a robotic arm operatively coupled to the turntable, and at least one directional sensor. An orientation of the at least one directional sensor is independently controllable. A method of controlling a robotic arm includes controlling a state of a mobile base and controlling a state of a robotic arm coupled to the mobile base, based, at least in part, on the state of the mobile base.

IPC Classes

  • B25J 9/06 - Programme-controlled manipulators characterised by multi-articulated arms
  • B25J 5/00 - Manipulators mounted on wheels or on carriages
  • B25J 9/16 - Programme controls
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means
  • B25J 17/02 - Wrist joints

47.

ROBOT MOVEMENT AND ONLINE TRAJECTORY OPTIMIZATION

      
Application Number 18975170
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-06-05
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Deits, Robin
  • Kuindersma, Scott
  • Kelly, Matthew P.
  • Koolen, Twan
  • Abe, Yeuhi
  • Stephens, Benjamin

Abstract

Systems and methods for determining movement of a robot about an environment are provided. A computing system of the robot (i) receives information including a navigation target for the robot and a kinematic state of the robot; (ii) determines, based on the information and a trajectory target for the robot, a retargeted trajectory for the robot; (iii) determines, based on the retargeted trajectory, a centroidal trajectory for the robot and a kinematic trajectory for the robot consistent with the centroidal trajectory; and (iv) determines, based on the centroidal trajectory and the kinematic trajectory, a set of vectors having a vector for each of one or more joints of the robot.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons; from positioning sensors located off-board the vehicle, e.g. from cameras
  • G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path

48.

DYNAMIC MASS ESTIMATION METHODS FOR AN INTEGRATED MOBILE MANIPULATOR ROBOT

      
Application Number 19040953
Status Pending
Filing Date 2025-01-30
First Publication Date 2025-05-29
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Talebi, Shervin
  • Neville, Neil
  • Blankespoor, Kevin

Abstract

A method of estimating one or more mass characteristics of a payload manipulated by a robot includes moving the payload using the robot, determining one or more accelerations of the payload while the payload is in motion, sensing, using one or more sensors of the robot, a wrench applied to the payload while the payload is in motion, and estimating the one or more mass characteristics of the payload based, at least in part, on the determined accelerations and the sensed wrench.
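The estimation step lends itself to a least-squares sketch: treat the sensed wrench as force samples and fit F = m · a over the motion. The one-dimensional version below estimates only scalar mass (the patent's "mass characteristics" may include more, such as center of mass); all names are hypothetical:

```python
def estimate_mass(accelerations, forces):
    """Least-squares fit of F = m * a over samples taken while the payload moves.

    One-dimensional sketch: `accelerations` are sensed payload accelerations
    (gravity-compensated, m/s^2) and `forces` the corresponding sensed wrench
    components (N). Returns the scalar mass estimate.
    """
    numerator = sum(a * f for a, f in zip(accelerations, forces))
    denominator = sum(a * a for a in accelerations)
    return numerator / denominator

# Noisy samples from moving a roughly 2 kg payload:
mass = estimate_mass([1.0, 2.0, 4.0], [2.0, 4.1, 7.9])
```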

IPC Classes

  • B25J 19/02 - Sensing devices
  • B25J 9/16 - Programme controls
  • B65G 61/00 - Use of pick-up or transfer devices or of manipulators for stacking or de-stacking articles not otherwise provided for
  • G01G 9/00 - Methods of, or apparatus for, the determination of weight, not provided for in groups
  • G01G 19/14 - Weighing apparatus or methods adapted for special purposes not provided for in groups for weighing suspended loads

49.

Systems and methods for equalizing audio for playback on an electronic device

      
Application Number 19029119
Grant Number 12375863
Status In Force
Filing Date 2025-01-17
First Publication Date 2025-05-22
Grant Date 2025-07-29
Owner GOOGLE TECHNOLOGY HOLDINGS LLC (USA)
Inventor
  • Schuster, Adrian M.
  • Annabathula, Prabhu
  • Bastyr, Kevin J.
  • Wells, Andrew K.
  • Zhang, Wen Hao

Abstract

Embodiments are provided for receiving a request to output audio at a first speaker and a second speaker of an electronic device, determining that the electronic device is oriented in a portrait orientation or a landscape orientation, identifying, based on the determined orientation, a first equalization setting for the first speaker and a second equalization setting for the second speaker, providing, for output at the first speaker, a first audio signal with the first equalization setting, and providing, for output at the second speaker, a second audio signal with the second equalization setting.
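The selection logic amounts to a lookup keyed by the detected orientation; a minimal sketch, with the table entries and setting names invented purely for illustration:

```python
# Hypothetical per-orientation equalization table; the settings are
# illustrative placeholders, not values from the patent.
EQ_SETTINGS = {
    "portrait":  {"first_speaker": "mono_full_range", "second_speaker": "mono_full_range"},
    "landscape": {"first_speaker": "stereo_left", "second_speaker": "stereo_right"},
}

def select_equalization(orientation):
    """Return (first, second) speaker EQ settings for the detected orientation."""
    settings = EQ_SETTINGS[orientation]
    return settings["first_speaker"], settings["second_speaker"]

first_eq, second_eq = select_equalization("landscape")
```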

IPC Classes

  • H04R 29/00 - Monitoring arrangements; Testing arrangements
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
  • G06F 3/0346 - Pointing devices displaced or positioned by the user; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
  • G06F 3/16 - Sound input; Sound output
  • H03G 5/16 - Automatic control
  • H04R 3/04 - Circuits for transducers for correcting frequency response

50.

AUTO SWING-HEIGHT ADJUSTMENT

      
Application Number 18924425
Status Pending
Filing Date 2024-10-23
First Publication Date 2025-05-22
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Perkins, Alexander Douglas
  • Blankespoor, Kevin

Abstract

An example implementation includes (i) receiving sensor data that indicates topographical features of an environment in which a robotic device is operating, (ii) processing the sensor data into a topographical map that includes a two-dimensional matrix of discrete cells, the discrete cells indicating sample heights of respective portions of the environment, (iii) determining, for a first foot of the robotic device, a first step path extending from a first lift-off location to a first touch-down location, (iv) identifying, within the topographical map, a first scan patch of cells that encompass the first step path, (v) determining a first high point among the first scan patch of cells; and (vi) during the first step, directing the robotic device to lift the first foot to a first swing height that is higher than the determined first high point.
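Steps (iv)–(vi) can be sketched as a lookup over the topographical map: find the high point among the scan-patch cells and lift the foot above it. The matrix, cell coordinates, and clearance margin below are illustrative assumptions:

```python
def swing_height(height_map, scan_patch, clearance=0.05):
    """Swing height for one step: highest cell under the step path plus a margin.

    height_map -- 2-D matrix of sampled terrain heights (metres)
    scan_patch -- (row, col) cells that encompass the planned step path
    clearance  -- assumed safety margin above the detected high point
    """
    high_point = max(height_map[r][c] for r, c in scan_patch)
    return high_point + clearance

terrain = [
    [0.00, 0.02, 0.01],
    [0.03, 0.12, 0.04],   # a 12 cm obstacle lies under the step path
    [0.01, 0.02, 0.00],
]
lift = swing_height(terrain, [(0, 1), (1, 1), (2, 1)])
```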

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • B25J 9/16 - Programme controls
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G05D 1/43 - Control of position or course in two dimensions
  • G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path

51.

HANDLING GAIT DISTURBANCES WITH ASYNCHRONOUS TIMING

      
Application Number 18912229
Status Pending
Filing Date 2024-10-10
First Publication Date 2025-05-15
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Blankespoor, Kevin
  • Stephens, Benjamin
  • Da Silva, Marco

Abstract

An example method may include i) detecting a disturbance to a gait of a robot, where the gait includes a swing state and a step down state, the swing state including a target swing trajectory for a foot of the robot, and where the target swing trajectory includes a beginning and an end; and ii) based on the detected disturbance, causing the foot of the robot to enter the step down state before the foot reaches the end of the target swing trajectory.
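A minimal sketch of the transition the method describes, with hypothetical state names: a detected disturbance forces the step down state before the target swing trajectory completes.

```python
def next_gait_state(state, swing_progress, disturbance_detected):
    """Advance the swing / step-down state machine for one foot.

    Normally the foot enters step-down when the target swing trajectory
    ends (progress >= 1.0); a detected disturbance triggers it early.
    """
    if state == "swing" and (disturbance_detected or swing_progress >= 1.0):
        return "step_down"
    return state
```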

IPC Classes

  • B25J 9/16 - Programme controls
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid

52.

TERRAIN AWARE STEP PLANNING SYSTEM

      
Application Number 19020194
Status Pending
Filing Date 2025-01-14
First Publication Date 2025-05-15
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Whitman, Eric
  • Fay, Gina Christine

Abstract

A method for terrain and constraint planning a step plan includes receiving, at data processing hardware of a robot, image data of an environment about the robot from at least one image sensor. The robot includes a body and legs. The method also includes generating, by the data processing hardware, a body-obstacle map, a ground height map, and a step-obstacle map based on the image data and generating, by the data processing hardware, a body path for movement of the body of the robot while maneuvering in the environment based on the body-obstacle map. The method also includes generating, by the data processing hardware, a step path for the legs of the robot while maneuvering in the environment based on the body path, the body-obstacle map, the ground height map, and the step-obstacle map.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G06T 7/593 - Depth or shape recovery from multiple images from stereo images

53.

ROBOTIC STEP TIMING AND SEQUENCING USING REINFORCEMENT LEARNING

      
Application Number 18931680
Status Pending
Filing Date 2024-10-30
First Publication Date 2025-05-08
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Swilling, Benjamin John
  • Domanico, Paul Simon

Abstract

Techniques for determining robotic step timing and sequencing using reinforcement learning are provided. In one aspect, a method includes receiving a target trajectory for a robot and receiving a state of the robot. The method further includes generating, using a neural network, a set of gait timing parameters for the robot based, at least in part, on the state of the robot and the target trajectory and controlling movement of the robot based on the set of gait timing parameters.

IPC Classes

  • G05D 1/622 - Obstacle avoidance
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G05D 101/15 - Details of software or hardware architectures used for the control of position using artificial intelligence [AI] techniques using machine learning, e.g. neural networks
  • G05D 109/12 - Land vehicles with legs
  • G06N 3/092 - Reinforcement learning

54.

ROBOTIC STEP TIMING AND SEQUENCING USING REINFORCEMENT LEARNING

      
Application Number US2024053629
Publication Number 2025/096591
Status In Force
Filing Date 2024-10-30
Publication Date 2025-05-08
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Swilling, Benjamin John
  • Domanico, Paul Simon

Abstract

Techniques for determining robotic step timing and sequencing using reinforcement learning are provided. In one aspect, a method includes receiving a target trajectory for a robot and receiving a state of the robot. The method further includes generating, using a neural network, a set of gait timing parameters for the robot based, at least in part, on the state of the robot and the target trajectory and controlling movement of the robot based on the set of gait timing parameters.

IPC Classes

  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • B25J 9/16 - Programme controls
  • G06N 3/08 - Learning methods

55.

Multi-Processor Support for Array Imagers

      
Application Number 18948117
Status Pending
Filing Date 2024-11-14
First Publication Date 2025-05-01
Owner Google Technology Holdings LLC (USA)
Inventor
  • Bretscher, John T.
  • Vaas, Randall S.

Abstract

Using the techniques discussed herein, a set of images is captured by one or more array imagers (106). Each array imager includes multiple imagers configured in various manners. Each array imager captures multiple images of substantially a same scene at substantially a same time. The images captured by each array imager are encoded by multiple processors (112, 114). Each processor can encode sets of images captured by a different array imager, or each processor can encode different sets of images captured by the same array imager. The encoding of the images is performed using various image-compression techniques so that the information that results from the encoding is smaller, in terms of storage size, than the uncompressed images.

IPC Classes

  • H04N 13/282 - Image signal generators for generating image signals corresponding to three or more geometrical viewpoints, e.g. multi-view systems
  • G06T 1/20 - Processor architectures; Processor configuration, e.g. pipelining
  • H04N 13/161 - Encoding, multiplexing or demultiplexing different image signal components
  • H04N 19/107 - Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
  • H04N 19/42 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
  • H04N 19/436 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation using parallelised computational arrangements
  • H04N 19/503 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
  • H04N 19/593 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial prediction techniques
  • H04N 19/597 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
  • H04N 19/62 - Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding by frequency transforming in three dimensions
  • H04N 23/45 - Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
  • H04N 23/75 - Circuitry for compensating brightness variation in the scene by influencing optical camera components
  • H04N 23/80 - Camera processing pipelines; Components thereof
  • H04N 23/957 - Light-field or plenoptic cameras or camera modules

56.

SYSTEMS AND METHODS FOR GRASPING OBJECTS WITH UNKNOWN OR UNCERTAIN EXTENTS USING A ROBOTIC MANIPULATOR

      
Application Number 18926580
Status Pending
Filing Date 2024-10-25
First Publication Date 2025-05-01
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Shaw, Samuel
  • Gilroy, Scott
  • Rahme, Maurice
  • Neville, Neil M.

Abstract

Methods and apparatus for grasping an object by a suction-based gripper of a mobile robot are provided. The method comprises receiving, by a computing device, from a perception system of the mobile robot, perception information reflecting an object to be grasped by the suction-based gripper, determining, by the computing device, uncertainty information reflecting an unknown or uncertain extent and/or pose of the object, determining, by the computing device, a grasp strategy to grasp the object based, at least in part, on the uncertainty information, and controlling, by the computing device, the mobile robot to grasp the object using the grasp strategy.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means

57.

NONLINEAR TRAJECTORY OPTIMIZATION FOR ROBOTIC DEVICES

      
Application Number 18938800
Status Pending
Filing Date 2024-11-06
First Publication Date 2025-05-01
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Bellicoso, C. Dario
  • Tutt, Logan W.
  • Neville, Neil M.
  • Perkins, Alexander D.

Abstract

Systems and methods for determining movement of a robot are provided. A computing system of the robot receives information including an initial state of the robot and a goal state of the robot. The computing system determines, using nonlinear optimization, a candidate trajectory for the robot to move from the initial state to the goal state. The computing system determines whether the candidate trajectory is feasible. If the candidate trajectory is feasible, the computing system provides the candidate trajectory to a motion control module of the robot. If the candidate trajectory is not feasible, the computing system determines, using nonlinear optimization, a different candidate trajectory for the robot to move from the initial state to the goal state, the nonlinear optimization using one or more changed parameters.
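The solve/check/retry structure the abstract describes can be sketched generically; the toy solver, feasibility test, and relaxation rule below are stand-ins, not the patented nonlinear optimization:

```python
def plan_trajectory(solve, is_feasible, params, relax, max_attempts=5):
    """Retry loop: re-run the solve with changed parameters until a
    feasible candidate trajectory is found, or give up."""
    for _ in range(max_attempts):
        candidate = solve(params)
        if is_feasible(candidate):
            return candidate      # hand off to the motion control module
        params = relax(params)    # change one or more parameters, then retry
    return None                   # report failure to a higher-level planner

# Toy stand-ins: "solve" scales a straight-line trajectory; feasibility
# caps the largest value; relaxation halves the scale each attempt.
solve = lambda p: [p * t for t in (0.0, 0.5, 1.0)]
is_feasible = lambda traj: max(traj) <= 1.0
relax = lambda p: p / 2
traj = plan_trajectory(solve, is_feasible, 4.0, relax)
```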

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

58.

ROBOTICALLY NEGOTIATING STAIRS

      
Application Number 18938925
Status Pending
Filing Date 2024-11-06
First Publication Date 2025-05-01
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Whitman, Eric
  • Fay, Gina Christine
  • Swilling, Benjamin

Abstract

A method for negotiating stairs includes receiving image data about a robot maneuvering in an environment with stairs. Here, the robot includes two or more legs. Prior to the robot traversing the stairs, for each stair, the method further includes determining a corresponding step region based on the received image data. The step region identifies a safe placement area on a corresponding stair for a distal end of a corresponding swing leg of the robot. Also prior to the robot traversing the stairs, the method includes shifting a weight distribution of the robot towards a front portion of the robot. When the robot traverses the stairs, the method further includes, for each stair, moving the distal end of the corresponding swing leg of the robot to a target step location where the target step location is within the corresponding step region of the stair.

IPC Classes

  • B25J 9/16 - Programme controls
  • B62D 57/024 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members specially adapted for moving on inclined or vertical surfaces
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid

59.

WHOLE BODY MANIPULATION ON A LEGGED ROBOT USING DYNAMIC BALANCE

      
Application Number 18983928
Status Pending
Filing Date 2024-12-17
First Publication Date 2025-04-10
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Blankespoor, Kevin
  • Stephens, Benjamin
  • Hudson, Nicolas
  • Abe, Yeuhi
  • Barry, Jennifer

Abstract

A robot system includes: an upper body section including one or more end-effectors; a lower body section including one or more legs; and an intermediate body section coupling the upper and lower body sections. An upper body control system operates at least one of the end-effectors. The intermediate body section experiences a first intermediate body linear force and/or moment based on an end-effector force acting on the at least one end-effector. A lower body control system operates the one or more legs. The one or more legs experience respective surface reaction forces. The intermediate body section experiences a second intermediate body linear force and/or moment based on the surface reaction forces. The lower body control system operates the one or more legs so that the second intermediate body linear force balances the first intermediate linear force and the second intermediate body moment balances the first intermediate body moment.
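The balancing condition reduces to the legs imposing an equal and opposite wrench on the intermediate body; a sketch with hypothetical names, treating forces and moments as 3-vectors:

```python
def lower_body_wrench(upper_force, upper_moment):
    """Wrench the legs must impose on the intermediate body so that it
    balances the linear force and moment induced by the end-effector load.

    upper_force / upper_moment: 3-tuples (N, N*m) from the upper body.
    """
    balance_force = tuple(-f for f in upper_force)
    balance_moment = tuple(-m for m in upper_moment)
    return balance_force, balance_moment

# End-effector pressing down 50 N and twisting 5 N*m about x:
f2, m2 = lower_body_wrench((0.0, 0.0, -50.0), (5.0, 0.0, 0.0))
```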

IPC Classes

  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • B25J 9/16 - Programme controls
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • B62D 57/024 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members specially adapted for moving on inclined or vertical surfaces

60.

DYNAMIC PERFORMANCE OF ACTIONS BY A MOBILE ROBOT BASED ON SENSOR DATA AND A SITE MODEL

      
Application Number US2024021354
Publication Number 2025/071664
Status In Force
Filing Date 2024-03-25
Publication Date 2025-04-03
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Klingensmith, Matthew, Jacob
  • Mcdonald, Michael, James
  • Agrawal, Radhika
  • Allum, Christopher, Peter
  • Shinkle, Rosalind, Fish, Blais

Abstract

Systems and methods are described for instructing performance of an action by a mobile robot based on transformed data. A system may obtain a site model in a first data format and sensor data in a second data format. The site model and/or the sensor data may be annotated. The system may transform the site model and the sensor data to generate transformed data in a third data format. The system may provide the transformed data to a computing system. For example, the system may provide the transformed data to a machine learning model. Based on the output of the computing system, the system may identify an action and instruct performance of the action by a mobile robot.

61.

DYNAMIC PERFORMANCE OF ACTIONS BY A MOBILE ROBOT BASED ON SENSOR DATA AND A SITE MODEL

      
Application Number 18613943
Status Pending
Filing Date 2024-03-22
First Publication Date 2025-03-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Klingensmith, Matthew Jacob
  • Mcdonald, Michael James
  • Agrawal, Radhika
  • Allum, Christopher Peter
  • Shinkle, Rosalind Fish Blais

Abstract

Systems and methods are described for instructing performance of an action by a mobile robot based on transformed data. A system may obtain a site model in a first data format and sensor data in a second data format. The site model and/or the sensor data may be annotated. The system may transform the site model and the sensor data to generate transformed data in a third data format. The system may provide the transformed data to a computing system. For example, the system may provide the transformed data to a machine learning model. Based on the output of the computing system, the system may identify an action and instruct performance of the action by a mobile robot.

IPC Classes

  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 109/12 - Land vehicles with legs
  • G06F 40/30 - Semantic analysis
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog

62.

DATA TRANSFER ASSEMBLIES FOR ROBOTIC DEVICES

      
Application Number 18975515
Status Pending
Filing Date 2024-12-10
First Publication Date 2025-03-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Billings, Devin
  • Meduna, Matthew

Abstract

The invention includes systems and methods for routing data packets in a robot. The method comprises routing, using a first switching device, data packets between a first host processor and a first electronic device of the robot, and routing, using the first switching device, data packets between a second host processor and a second electronic device of the robot.

IPC Classes

  • G06F 13/42 - Bus transfer protocol, e.g. handshake; Synchronisation
  • B25J 19/02 - Sensing devices
  • H04L 45/00 - Routing or path finding of packets in data switching networks
  • H04L 45/42 - Centralised routing

63.

GRIPPER MECHANISM

      
Application Number 18822752
Status Pending
Filing Date 2024-09-03
First Publication Date 2025-03-27
Owner Boston Dynamics, Inc. (USA)
Inventor Dellon, Brian Todd

Abstract

A gripper mechanism includes a pair of gripper jaws, a linear actuator, and a rocker bogey. The linear actuator drives a first gripper jaw to move relative to a second gripper jaw. Here, the linear actuator includes a screw shaft and a drive nut, where the drive nut includes a protrusion having a protrusion axis extending along a length of the protrusion. The protrusion axis is perpendicular to an actuation axis of the linear actuator along a length of the screw shaft. The rocker bogey is coupled to the drive nut at the protrusion to form a pivot point for the rocker bogey and to enable the rocker bogey to pivot about the protrusion axis when the linear actuator drives the first gripper jaw to move relative to the second gripper jaw.

64.

PERCEPTION MAST FOR AN INTEGRATED MOBILE MANIPULATOR ROBOT

      
Application Number 18969444
Status Pending
Filing Date 2024-12-05
First Publication Date 2025-03-20
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Turpin, Matthew
  • Zelnick, Benjamin
  • Murphy, Michael
  • Perkins, Alex

Abstract

A perception mast for a mobile robot is provided. The mobile robot comprises a mobile base, a turntable operatively coupled to the mobile base, the turntable configured to rotate about a first axis, an arm operatively coupled to a first location on the turntable, and the perception mast operatively coupled to a second location on the turntable, the perception mast configured to rotate about a second axis parallel to the first axis, wherein the perception mast includes, disposed thereon, a first perception module and a second perception module arranged between the first perception module and the turntable.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 5/00 - Manipulators mounted on wheels or on carriages
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

65.

ROBUST DOCKING OF ROBOTS WITH IMPERFECT SENSING

      
Application Number 18941996
Status Pending
Filing Date 2024-11-08
First Publication Date 2025-02-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Gonano, Dion
  • Whitman, Eric Cary
  • Stathis, Christopher
  • Klingensmith, Matthew Jacob

Abstract

A computer-implemented method when executed by data processing hardware of a legged robot causes the data processing hardware to perform operations including receiving sensor data corresponding to an area including at least a portion of a docking station. The operations include determining an estimated pose for the docking station based on an initial pose of the legged robot relative to the docking station. The operations include identifying one or more docking station features from the received sensor data. The operations include matching the one or more identified docking station features to one or more known docking station features. The operations include adjusting the estimated pose for the docking station to a corrected pose for the docking station based on an orientation of the one or more identified docking station features that match the one or more known docking station features.
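One plausible reading of the pose-adjustment step — not stated in the abstract, so purely an assumption — is shifting the estimated dock pose by the average planar offset between identified features and their matched known counterparts; a sketch with hypothetical names:

```python
def correct_dock_pose(estimated_xy, feature_matches):
    """Correct the estimated docking-station pose from feature matches.

    feature_matches -- list of (seen_xy, known_xy) pairs: a feature's
    position as identified in sensor data vs. its known position given
    the estimated dock pose. The mean offset shifts the estimate.
    """
    n = len(feature_matches)
    dx = sum(known[0] - seen[0] for seen, known in feature_matches) / n
    dy = sum(known[1] - seen[1] for seen, known in feature_matches) / n
    return (estimated_xy[0] + dx, estimated_xy[1] + dy)

# Two fiducials observed 10 cm short of where the initial estimate predicts:
matches = [((1.0, 0.0), (1.1, 0.0)), ((2.0, 0.5), (2.1, 0.5))]
corrected = correct_dock_pose((3.0, 0.0), matches)
```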

IPC Classes

  • B60L 53/36 - Means for automatic or assisted adjustment of the relative position of charging devices and vehicles by positioning the vehicle
  • B60L 53/16 - Connectors, e.g. plugs or sockets, specially adapted for charging electric vehicles
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members

66.

Continuous Slip Recovery

      
Application Number 18805204
Status Pending
Filing Date 2024-08-14
First Publication Date 2025-02-13
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Berard, Stephen
  • Khripin, Alex Yu
  • Swilling, Benjamin

Abstract

The disclosure provides systems and methods for mitigating slip of a robot appendage. In one aspect, a method for mitigating slip of a robot appendage includes (i) receiving an input from one or more sensors, (ii) determining, based on the received input, an appendage position of the robot appendage, (iii) determining a filter position for the robot appendage, (iv) determining a distance between the appendage position and the filter position, (v) determining, based on the distance, a force to apply to the robot appendage, (vi) causing one or more actuators to apply the force to the robot appendage, (vii) determining whether the distance is greater than a threshold distance, and (viii) responsive to determining that the distance is greater than the threshold distance, adjusting the filter position to a position that is the threshold distance from the appendage position, for use in the next iteration.
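Steps (iv)–(viii) of the abstract can be sketched as a clamped virtual spring; the gain, threshold, and one-dimensional simplification below are assumptions for illustration:

```python
def slip_filter_step(appendage_pos, filter_pos, stiffness, threshold):
    """One control iteration of the slip-mitigation filter (1-D sketch).

    A spring-like force proportional to the appendage/filter separation
    resists the slip. If the separation exceeds the threshold, the filter
    position is dragged to exactly the threshold distance for the next
    iteration, which bounds the restoring force.
    """
    dist = appendage_pos - filter_pos
    force = -stiffness * dist
    if abs(dist) > threshold:
        filter_pos = appendage_pos - threshold * (1.0 if dist > 0 else -1.0)
    return force, filter_pos

# Foot slips 0.08 m past a 0.05 m threshold: the force opposes the slip
# and the filter is pulled back to within the threshold distance.
force, new_filter = slip_filter_step(0.08, 0.0, 1000.0, 0.05)
```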

IPC Classes

  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B25J 19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid

67.

SYSTEMS AND METHODS FOR PROVIDING MODULAR ARCHITECTURES FOR ROBOTIC END EFFECTORS

      
Application Number 18359349
Status Pending
Filing Date 2023-07-26
First Publication Date 2025-01-30
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Geating, Joshua
  • Thorne, Christopher
  • Jenko, Alexander
  • Bursal, Faruk
  • Demas, Nickolas
  • Baniszewski, Beth

Abstract

A robotic gripper includes a first modular component comprising a set of deformable members, such as a set of vacuum cups or foam members. The robotic gripper also includes a second modular component comprising a set of vacuum valves. Each vacuum valve in the set of vacuum valves is fluidly connected to at least one deformable member in the set of deformable members.

IPC Classes

  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means
  • B65G 47/91 - Devices for picking-up and depositing articles or materials incorporating pneumatic, e.g. suction, grippers

68.

SYSTEMS AND METHODS FOR PROVIDING MODULAR ARCHITECTURES FOR ROBOTIC END EFFECTORS

      
Application Number US2024039523
Publication Number 2025/024647
Status In Force
Filing Date 2024-07-25
Publication Date 2025-01-30
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Geating, Joshua
  • Thorne, Christopher
  • Jenko, Alexander
  • Bursal, Faruk
  • Baniszewski, Beth
  • Demas, Nickolas

Abstract

A robotic gripper includes a first modular component comprising a set of deformable members, such as a set of vacuum cups or foam members. The robotic gripper also includes a second modular component comprising a set of vacuum valves. Each vacuum valve in the set of vacuum valves is fluidly connected to at least one deformable member in the set of deformable members.

IPC Classes

  • B25J 5/00 - Manipulators mounted on wheels or on carriages
  • B25J 15/00 - Gripping heads
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means

69.

Adaptive method for biometrically certified communication

      
Application Number 17154597
Grant Number 12210600
Status In Force
Filing Date 2021-01-21
First Publication Date 2025-01-28
Grant Date 2025-01-28
Owner Google Technology Holdings LLC (USA)
Inventor
  • Slaby, Jiri
  • Ady, Roger W.

Abstract

A recipient communication device and method wherein a user authenticates a message that is being received. The method includes receiving, by a messaging utility of the recipient communication device, a message transmitted from a sender communication device. The messaging utility determines that one of (a) sender authentication of the message and (b) recipient authentication to open the message is required. In response to sender authentication being required, the recipient communication device transmits a request to the sender communication device for sender authentication of the message, and receives a certification of the message based on an authentication of a user input via the sender communication device. When recipient authentication is required, the recipient is prompted to enter biometric input at the recipient device. In one embodiment, a clearinghouse service authenticates a user of a communication device in order for the recipient communication device to receive certification of the user and/or the message.

IPC Classes

  • H04L 9/40 - Network security protocols
  • G06F 21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
  • G06F 21/64 - Protecting data integrity, e.g. using checksums, certificates or signatures
  • H04L 9/32 - Arrangements for secret or secure communications; Network security protocols including means for verifying the identity or authority of a user of the system
  • H04W 12/069 - Authentication using certificates or pre-shared keys

70.

MANIPULATING BOXES USING A ZONED GRIPPER

      
Application Number 18811123
Status Pending
Filing Date 2024-08-21
First Publication Date 2024-12-12
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Chitta, Sachin
  • Hershberger, David
  • Pauwels, Karl

Abstract

A method of manipulating boxes includes receiving a minimum box size for a plurality of boxes varying in size located in a walled container. The method also includes dividing a grip area of a gripper into a plurality of zones. The method further includes locating a set of candidate boxes based on an image from a visual sensor. For each zone, the method additionally includes, determining an overlap of a respective zone with one or more neighboring boxes to the set of candidate boxes. The method also includes determining a grasp pose for a target candidate box that avoids one or more walls of the walled container. The method further includes executing the grasp pose to lift the target candidate box by the gripper where the gripper activates each zone of the plurality of zones that does not overlap a respective neighboring box to the target candidate box.
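The zone-activation rule in the final step can be sketched as follows; representing zones and neighboring boxes as axis-aligned rectangles in the grip plane is an assumed simplification of the patent's grip-area geometry:

```python
def active_zones(zones, neighbor_boxes):
    """Zones of the gripper to activate when lifting the target box:
    every zone that overlaps no neighboring box is turned on.
    Rectangles are (x, y, w, h) tuples; an assumed representation."""
    def overlaps(a, b):
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah
    return [z for z in zones if not any(overlaps(z, n) for n in neighbor_boxes)]
```

Deactivating overlapping zones keeps suction off the neighbors so only the target box is lifted.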

IPC Classes

  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means
  • B66F 9/02 - Stationary loaders or unloaders, e.g. for sacks

71.

LOCATION BASED CHANGE DETECTION WITHIN IMAGE DATA BY A MOBILE ROBOT

      
Application Number 18541874
Status Pending
Filing Date 2023-12-15
First Publication Date 2024-11-14
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Ryde, Julian
  • Dong, Yi
  • Da Silva, Marco
  • Stathis, Christopher
  • Ramachandran, Karthik

Abstract

Systems and methods are described for detecting changes at a location based on image data by a mobile robot. A system can instruct navigation of the mobile robot to a location. For example, the system can instruct navigation to the location as part of an inspection mission. The system can obtain input identifying a change detection. Based on the change detection and obtained image data associated with the location, the system can perform the change detection and detect a change associated with the location. For example, the system can perform the change detection based on one or more regions of interest of the obtained image data. Based on the detected change and a reference model, the system can determine presence of an anomaly condition in the obtained image data.
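The region-of-interest comparison can be sketched as below; the mean-absolute-difference statistic and the threshold are assumptions, since the abstract does not specify how the obtained image is compared against the reference model:

```python
def detect_change(image, reference, roi, threshold):
    """Compare a region of interest of the obtained image against the
    reference model; flag an anomaly when the mean absolute pixel
    difference exceeds `threshold`. Images are 2-D lists of intensities
    and roi is (row0, row1, col0, col1); the statistic is an assumption."""
    row0, row1, col0, col1 = roi
    diffs = [abs(image[r][c] - reference[r][c])
             for r in range(row0, row1) for c in range(col0, col1)]
    mean_diff = sum(diffs) / len(diffs)
    return mean_diff > threshold, mean_diff
```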

IPC Classes

  • G05D 1/689 - Pointing payloads towards fixed or moving targets
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G06T 7/00 - Image analysis
  • G06T 7/70 - Determining position or orientation of objects or cameras

72.

LOCATION BASED CHANGE DETECTION WITHIN IMAGE DATA BY A MOBILE ROBOT

      
Application Number US2023084269
Publication Number 2024/232947
Status In Force
Filing Date 2023-12-15
Publication Date 2024-11-14
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Ryde, Julian
  • Dong, Yi
  • Da Silva, Marco
  • Stathis, Christopher
  • Ramachandran, Karthik

Abstract

Systems and methods are described for detecting changes at a location based on image data by a mobile robot. A system can instruct navigation of the mobile robot to a location. For example, the system can instruct navigation to the location as part of an inspection mission. The system can obtain input identifying a change detection. Based on the change detection and obtained image data associated with the location, the system can perform the change detection and detect a change associated with the location. For example, the system can perform the change detection based on one or more regions of interest of the obtained image data. Based on the detected change and a reference model, the system can determine presence of an anomaly condition in the obtained image data.

IPC Classes

  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G05D 1/225 - Remote-control arrangements operated by off-board computers
  • G05D 1/229 - Command input data, e.g. waypoints
  • G05D 1/249 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G05D 1/689 - Pointing payloads towards fixed or moving targets

73.

Dynamic Planning Controller

      
Application Number 18774604
Status Pending
Filing Date 2024-07-16
First Publication Date 2024-11-07
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Whitman, Eric
  • Khripin, Alex

Abstract

A dynamic planning controller receives a maneuver for a robot and a current state of the robot and transforms the maneuver and the current state of the robot into a nonlinear optimization problem. The nonlinear optimization problem is configured to optimize an unknown force and an unknown position vector. At a first time instance, the controller linearizes the nonlinear optimization problem into a first linear optimization problem and determines a first solution to the first linear optimization problem using quadratic programming. At a second time instance, the controller linearizes the nonlinear optimization problem into a second linear optimization problem based on the first solution at the first time instance and determines a second solution to the second linear optimization problem based on the first solution using the quadratic programming. The controller also generates a joint command to control motion of the robot during the maneuver based on the second solution.
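The iterate-linearize-solve pattern the controller uses can be illustrated on a toy 1-D problem, where each linearized subproblem has a closed-form minimizer standing in for the quadratic-programming step; the robot's actual force and position variables are not modeled here:

```python
def sequential_linearized_solve(x0, iterations=10):
    """Linearize-then-solve iteration as in the abstract, on the toy
    problem min (x**2 - 2)**2: each pass linearizes the residual about
    the previous solution and solves the resulting quadratic exactly."""
    x = x0
    for _ in range(iterations):
        residual = x * x - 2.0     # nonlinear residual at current solution
        jacobian = 2.0 * x         # derivative of the residual
        x -= residual / jacobian   # minimizer of the linearized quadratic
    return x
```

As in the abstract, each linearization is taken about the previous solution, so the iterates converge to the solution of the original nonlinear problem (here, the square root of 2).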

IPC Classes

  • B25J 9/16 - Programme controls
  • G05B 13/04 - Adaptive control systems, i.e. systems automatically adjusting themselves to have a performance which is optimum according to some preassigned criterion electric involving the use of models or simulators
  • G06N 5/01 - Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound

74.

LIGHT OUTPUT USING LIGHT SOURCES OF A ROBOT

      
Application Number 18640544
Status Pending
Filing Date 2024-04-19
First Publication Date 2024-10-31
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Dellon, Brian Todd
  • Vicentini, Federico
  • Needleman, John Frederick
  • Hepler, Leland
  • Bollini, Mario
  • Robert, David Yann

Abstract

Systems and methods are described for outputting light and/or audio using one or more light and/or audio sources of a robot. The light sources may be located on one or more legs of the robot, a bottom portion of the robot, and/or a top portion of the robot. The audio sources may include a speaker and/or an audio resonator. A system can obtain sensor data associated with an environment of the robot. Based on the sensor data, the system can identify an alert. For example, the system can identify an entity based on the sensor data and identify an alert for the entity. The system can instruct an output of light and/or audio indicative of the alert using the one or more light and/or audio sources. The system can adjust parameters of the output based on the sensor data.

75.

Object-Based Robot Control

      
Application Number 18761998
Status Pending
Filing Date 2024-07-02
First Publication Date 2024-10-24
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Bollini, Mario
  • Hepler, Leland

Abstract

A method includes receiving sensor data for an environment about the robot. The sensor data is captured by one or more sensors of the robot. The method includes detecting one or more objects in the environment using the received sensor data. For each detected object, the method includes authoring an interaction behavior indicating a behavior that the robot is capable of performing with respect to the corresponding detected object. The method also includes augmenting a localization map of the environment to reflect the respective interaction behavior of each detected object.

76.

LIGHT AND/OR AUDIO OUTPUT USING LIGHT AND/OR AUDIO SOURCES OF A ROBOT

      
Application Number US2024025417
Publication Number 2024/220811
Status In Force
Filing Date 2024-04-19
Publication Date 2024-10-24
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Dellon, Brian Todd
  • Vicentini, Federico
  • Needleman, John Frederick
  • Hepler, Leland
  • Bollini, Mario
  • Robert, David Yann

Abstract

Systems and methods are described for outputting light and/or audio using one or more light and/or audio sources of a robot. The light sources may be located on one or more legs of the robot, a bottom portion of the robot, and/or a top portion of the robot. The audio sources may include a speaker and/or an audio resonator. A system can obtain sensor data associated with an environment of the robot. Based on the sensor data, the system can identify an alert. For example, the system can identify an entity based on the sensor data and identify an alert for the entity. The system can instruct an output of light and/or audio indicative of the alert using the one or more light and/or audio sources. The system can adjust parameters of the output based on the sensor data.

IPC Classes

  • B25J 9/16 - Programme controls
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • B62D 57/024 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members specially adapted for moving on inclined or vertical surfaces
  • G05D 109/12 - Land vehicles with legs

77.

Systems and methods for actuation of a robotic manipulator

      
Application Number 18750091
Grant Number 12533795
Status In Force
Filing Date 2024-06-21
First Publication Date 2024-10-17
Grant Date 2026-01-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Geating, Joshua Timothy
  • Peyton, Geoffrey
  • Thorne, Christopher Everett
  • Webb, Jacob

Abstract

Disclosed herein are systems and methods directed to an industrial robot that can perform mobile manipulation (e.g., dexterous mobile manipulation). A robotic arm may be capable of precise control when reaching into tight spaces, may be robust to impacts and collisions, and/or may limit the mass of the robotic arm to reduce the load on the battery and increase runtime. A robotic arm may include differently configured proximal joints and/or distal joints. Proximal joints may be designed to promote modularity and may include separate functional units, such as modular actuators, encoder, bearings, and/or clutches. Distal joints may be designed to promote integration and may include offset actuators to enable a through-bore for the internal routing of vacuum, power, and signal connections.

IPC Classes

  • B25J 17/02 - Wrist joints
  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements

78.

Recognizing accented speech

      
Application Number 18748464
Grant Number 12334050
Status In Force
Filing Date 2024-06-20
First Publication Date 2024-10-10
Grant Date 2025-06-17
Owner Google Technology Holdings LLC (USA)
Inventor Gray, Kristin A.

Abstract

Techniques and apparatuses for recognizing accented speech are described. In some embodiments, an accent module recognizes accented speech using an accent library based on device data, uses different speech recognition correction levels based on an application field into which recognized words are set to be provided, or updates an accent library based on corrections made to incorrectly recognized speech.

IPC Classes

  • G10L 15/00 - Speech recognition
  • G06F 40/174 - Form filling; Merging
  • G10L 15/01 - Assessment or evaluation of speech recognition systems
  • G10L 15/06 - Creation of reference templates; Training of speech recognition systems, e.g. adaptation to the characteristics of the speaker's voice
  • G10L 15/187 - Phonemic context, e.g. pronunciation rules, phonotactical constraints or phoneme n-grams
  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G10L 15/30 - Distributed recognition, e.g. in client-server systems, for mobile phones or network applications

79.

ENVIRONMENTAL FEATURE-SPECIFIC ACTIONS FOR ROBOT NAVIGATION

      
Application Number US2023084418
Publication Number 2024/205673
Status In Force
Filing Date 2023-12-15
Publication Date 2024-10-03
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Klingensmith, Matthew Jacob
  • Komoroski, Adam
  • Yamauchi, Brian Masao
  • Mcdonald, Michael James
  • Stathis, Christopher

Abstract

Systems and methods are described for reacting to a feature in an environment of a robot based on a classification of the feature. A system can detect the feature in the environment using a first sensor on the robot. For example, the system can detect the feature using a feature detection system based on sensor data from a camera. The system can detect a mover in the environment using a second sensor on the robot. For example, the system can detect the mover using a mover detection system based on sensor data from a lidar sensor. The system can fuse the data from detecting the feature and detecting the mover to produce fused data. The system can classify the feature based on the fused data and react to the feature based on classifying the feature.

IPC Classes

  • G05D 1/43 - Control of position or course in two dimensions

80.

ENVIRONMENTAL FEATURE-SPECIFIC ACTIONS FOR ROBOT NAVIGATION

      
Application Number 18542082
Status Pending
Filing Date 2023-12-15
First Publication Date 2024-09-26
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Klingensmith, Matthew Jacob
  • Komoroski, Adam
  • Yamauchi, Brian Masao
  • Mcdonald, Michael James
  • Stathis, Christopher

Abstract

Systems and methods are described for reacting to a feature in an environment of a robot based on a classification of the feature. A system can detect the feature in the environment using a first sensor on the robot. For example, the system can detect the feature using a feature detection system based on sensor data from a camera. The system can detect a mover in the environment using a second sensor on the robot. For example, the system can detect the mover using a mover detection system based on sensor data from a lidar sensor. The system can fuse the data from detecting the feature and detecting the mover to produce fused data. The system can classify the feature based on the fused data and react to the feature based on classifying the feature.

81.

PERCEPTION SYSTEM FOR A LOWER BODY POWERED EXOSKELETON

      
Application Number US2023085095
Publication Number 2024/196444
Status In Force
Filing Date 2023-12-20
Publication Date 2024-09-26
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Talebi, Shervin
  • Tao, Ran
  • Saunders, John Aaron
  • Hyun, Dong Jin
  • Park, Sang In

Abstract

Systems and methods for a perception system for a lower body powered exoskeleton device are provided. The perception system includes a camera configured to capture one or more images of terrain in proximity to the exoskeleton device, and at least one processor. The at least one processor is programmed to perform footstep planning for the exoskeleton device based, at least in part, on the captured one or more images of terrain, and issue an instruction to perform a first action based, at least in part, on the footstep planning.

IPC Classes

  • A61H 3/00 - Appliances for aiding patients or disabled persons to walk about
  • A61H 1/02 - Stretching or bending apparatus for exercising
  • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
  • B25J 9/00 - Programme-controlled manipulators
  • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
  • A61F 5/01 - Orthopaedic devices, e.g. long-term immobilising or pressure directing devices for treating broken or deformed bones such as splints, casts or braces
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G06V 20/10 - Terrestrial scenes
  • A61H 3/02 - Crutches

82.

Methods and apparatus for modeling loading dock environments

      
Application Number 18545334
Grant Number 12544932
Status In Force
Filing Date 2023-12-19
First Publication Date 2024-09-12
Grant Date 2026-02-10
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Broad, Alex
  • Ramachandran, Karthik
  • Dong, Yi

Abstract

Methods and apparatus for operating a mobile robot in a loading dock environment are provided. The method comprises capturing, by a camera system of the mobile robot, at least one image of the loading dock environment, and processing, by at least one hardware processor of the mobile robot, the at least one image using a machine learning model trained to identify one or more features of the loading dock environment.

IPC Classes

  • B25J 5/00 - Manipulators mounted on wheels or on carriages
  • B25J 9/16 - Programme controls
  • G06T 17/10 - Volume description, e.g. cylinders, cubes or using CSG [Constructive Solid Geometry]
  • G06V 10/40 - Extraction of image or video features
  • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
  • G06V 20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle

83.

METHODS AND APPARATUS FOR REDUCING MULTIPATH ARTIFACTS FOR A CAMERA SYSTEM OF A MOBILE ROBOT

      
Application Number 18545559
Status Pending
Filing Date 2023-12-19
First Publication Date 2024-09-12
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Turpin, Matthew
  • Hoelscher, Andrew
  • Merkle, Lukas

Abstract

Methods and apparatus for determining a pose of an object sensed by a camera system of a mobile robot are described. The method includes acquiring, using the camera system, a first image of the object from a first perspective and a second image of the object from a second perspective, and determining, by a processor of the camera system, a pose of the object based, at least in part, on a first set of sparse features associated with the object detected in the first image and a second set of sparse features associated with the object detected in the second image.

IPC Classes

  • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
  • G06T 7/13 - Edge detection
  • G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometryDepth or shape recovery from the projection of structured light
  • G06T 7/55 - Depth or shape recovery from multiple images

84.

Transmission with integrated overload protection for a legged robot

      
Application Number 18421354
Grant Number 12320400
Status In Force
Filing Date 2024-01-24
First Publication Date 2024-09-12
Grant Date 2025-06-03
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Jackowski, Zachary John
  • Young, Adam

Abstract

An example robot includes: a motor disposed at a joint configured to control motion of a member of the robot; a transmission including an input member coupled to and configured to rotate with the motor, an intermediate member, and an output member, where the intermediate member is fixed such that as the input member rotates, the output member rotates therewith at a different speed; a pad frictionally coupled to a side surface of the output member of the transmission and coupled to the member of the robot; and a spring configured to apply an axial preload on the pad, wherein the axial preload defines a torque limit such that, when a torque load on the member of the robot exceeds that limit, the output member of the transmission slips relative to the pad.
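A first-order model of this overload behavior is sketched below; the friction formula and all parameter values are illustrative assumptions, not taken from the patent:

```python
def slip_torque_limit(preload, friction_coeff, effective_radius):
    """Torque at which the pad slips: axial preload from the spring times
    the pad's friction coefficient times the effective contact radius
    (a standard first-order friction-clutch approximation)."""
    return friction_coeff * preload * effective_radius

def transmitted_torque(load_torque, limit):
    """Torque passed to the robot member: capped at the slip limit, so
    loads beyond the limit make the output member slip instead."""
    return max(-limit, min(limit, load_torque))
```

Raising the spring preload raises the slip threshold, which is how the axial preload "defines" the torque limit in the abstract.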

IPC Classes

  • F16D 7/00 - Slip couplings, e.g. slipping on overload, for absorbing shock
  • B25J 9/10 - Programme-controlled manipulators characterised by positioning means for manipulator elements
  • B25J 19/06 - Safety devices
  • F16D 7/02 - Slip couplings, e.g. slipping on overload, for absorbing shock of the friction type
  • F16H 35/10 - Arrangements or devices for absorbing overload or preventing damage by overload
  • F16H 49/00 - Other gearing
  • F16H 25/20 - Screw mechanisms

85.

Systems and methods for grasping and placing multiple objects with a robotic gripper

      
Application Number 18545239
Grant Number 12552030
Status In Force
Filing Date 2023-12-19
First Publication Date 2024-09-12
Grant Date 2026-02-17
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Shaw, Samuel
  • Snow, Colin
  • Gilroy, Scott
  • Tutt, Logan
  • Edelberg, Kyle
  • Neville, Neil

Abstract

A method of grasping and/or placing multiple objects by a gripper of a mobile robot. The multi-grasp method includes determining one or more candidate groups of objects to grasp by the suction-based gripper of the mobile robot, each of the one or more candidate groups of objects including a plurality of objects, determining a grasp quality score for each of the one or more candidate groups of objects, and grasping, by the suction-based gripper of the mobile robot, all objects in a candidate group of objects based, at least in part, on the grasp quality score. The multi-place method includes determining an allowed width associated with the conveyor, selecting a multi-place technique based, at least in part, on the allowed width and a dimension of the multiple grasped objects, and controlling the mobile robot to place the multiple grasped objects on the conveyor based on the selected multi-place technique.
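The selection steps of the two methods can be sketched as below; the `score` callable and the technique names are placeholders, since the abstract defines neither the grasp-quality metric nor the available multi-place techniques:

```python
def best_candidate_group(candidate_groups, score):
    """Grasp the candidate group of objects with the highest
    grasp-quality score; `score` is any callable, as the abstract
    does not define the metric."""
    return max(candidate_groups, key=score)

def choose_place_technique(group_width, allowed_width):
    """Select a multi-place technique from the conveyor's allowed width
    and the grasped group's dimension (technique names are placeholders)."""
    return "place-together" if group_width <= allowed_width else "place-one-by-one"
```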

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means

86.

SYSTEMS AND METHODS FOR GRASPING AND PLACING MULTIPLE OBJECTS WITH A ROBOTIC GRIPPER

      
Application Number US2023085080
Publication Number 2024/186375
Status In Force
Filing Date 2023-12-20
Publication Date 2024-09-12
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Shaw, Samuel
  • Snow, Colin
  • Gilroy, Scott
  • Tutt, Logan
  • Edelberg, Kyle
  • Neville, Neil

Abstract

A method of grasping and/or placing multiple objects by a gripper of a mobile robot. The multi-grasp method includes determining one or more candidate groups of objects to grasp by the suction-based gripper of the mobile robot, each of the one or more candidate groups of objects including a plurality of objects, determining a grasp quality score for each of the one or more candidate groups of objects, and grasping, by the suction-based gripper of the mobile robot, all objects in a candidate group of objects based, at least in part, on the grasp quality score. The multi-place method includes determining an allowed width associated with the conveyor, selecting a multi-place technique based, at least in part, on the allowed width and a dimension of the multiple grasped objects, and controlling the mobile robot to place the multiple grasped objects on the conveyor based on the selected multi-place technique.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means
  • B65G 59/04 - De-stacking from the top of the stack by suction or magnetic devices

87.

Limiting arm forces and torques

      
Application Number 18646099
Grant Number 12515323
Status In Force
Filing Date 2024-04-25
First Publication Date 2024-08-15
Grant Date 2026-01-06
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Paolini, Robert Eugene
  • Rizzi, Alfred Anthony
  • Aghasadeghi, Navid
  • Khripin, Alex

Abstract

A computer-implemented method includes generating a joint-torque-limit model for the articulated arm based on allowable joint torque sets corresponding to a base pose of the base. The method also include receiving a first requested joint torque set for a first arm pose of the articulated arm and determining, using the joint-torque-limit model, an optimized joint torque set corresponding to the first requested joint torque set. The method also includes receiving a second requested joint torque set for a second arm pose of the articulated arm and generating an adjusted joint torque set by adjusting the second requested joint torque set based on the optimized joint torque set. The method also includes sending the adjusted joint torque set to the articulated arm.

88.

Detecting and responding to disturbances to a gait of a legged robot

      
Application Number 18587680
Grant Number 12466501
Status In Force
Filing Date 2024-02-26
First Publication Date 2024-07-25
Grant Date 2025-11-11
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Blankespoor, Kevin
  • Perkins, Alex
  • Da Silva, Marco

Abstract

An example method may include i) determining a first distance between a pair of feet of a robot at a first time, where the pair of feet is in contact with a ground surface; ii) determining a second distance between the pair of feet of the robot at a second time, where the pair of feet remains in contact with the ground surface from the first time to the second time; iii) comparing a difference between the determined first and second distances to a threshold difference; iv) determining that the difference between determined first and second distances exceeds the threshold difference; and v) based on the determination that the difference between the determined first and second distances exceeds the threshold difference, causing the robot to react.
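Steps (i)-(v) reduce to a simple distance comparison, sketched here with assumed 2-D foot positions:

```python
def detect_slip(feet_at_t1, feet_at_t2, threshold):
    """Steps (i)-(v): compare the distance between a pair of stance feet
    at two times; a change above `threshold` means the feet moved
    relative to each other (a slip) and the robot should react.
    Each argument is a pair of (x, y) foot positions."""
    def distance(feet):
        (x1, y1), (x2, y2) = feet
        return ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5
    return abs(distance(feet_at_t1) - distance(feet_at_t2)) > threshold
```

Because both feet are assumed to stay in ground contact between the two times, any change in their separation can only come from slipping.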

IPC Classes

  • B62D 57/00 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
  • B25J 9/16 - Programme controls
  • B25J 13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
  • B62D 57/02 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track, with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
  • G05D 1/245 - Arrangements for determining position or orientation using dead reckoning
  • G05D 1/43 - Control of position or course in two dimensions

89.

Enhanced Image Capture

      
Application Number 18615355
Status Pending
Filing Date 2024-03-25
First Publication Date 2024-07-11
Owner Google Technology Holdings LLC (USA)
Inventor
  • Petrescu, Doina I.
  • Lay, Thomas T.
  • Petrie, Steven R.
  • Ryan, Bill
  • Sinha, Snigdha
  • Vanhoof, Jeffrey S.

Abstract

Disclosed are techniques that provide a “best” picture taken within a few seconds of the moment when a capture command is received (e.g., when the “shutter” button is pressed). In some situations, several still images are automatically (that is, without the user's input) captured. These images are compared to find a “best” image that is presented to the photographer for consideration. Video is also captured automatically and analyzed to see if there is an action scene or other motion content around the time of the capture command. If the analysis reveals anything interesting, then the video clip is presented to the photographer. The video clip may be cropped to match the still-capture scene and to remove transitory parts. Higher-precision horizon detection may be provided based on motion analysis and on pixel-data analysis.
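
The "best" still-image selection described above amounts to scoring each automatically captured frame and keeping the top scorer. A sketch under stated assumptions: frames are grayscale pixel grids, and a crude horizontal-gradient score stands in for whatever quality metric the technique actually uses:

```python
def sharpness(frame):
    """Crude sharpness proxy: mean squared difference between
    horizontally adjacent pixels (higher = more detail)."""
    total = count = 0
    for row in frame:
        for a, b in zip(row, row[1:]):
            total += (a - b) ** 2
            count += 1
    return total / count if count else 0.0

def best_frame(frames):
    """Pick the automatically captured frame with the highest score,
    to be presented to the photographer for consideration."""
    return max(frames, key=sharpness)
```

Production metrics would also weigh exposure, faces, and motion blur; the ranking structure stays the same.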

IPC Classes

  • H04N 23/60 - Control of cameras or camera modules
  • H04N 1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
  • H04N 1/21 - Intermediate information storage
  • H04N 23/61 - Control of cameras or camera modules based on recognised objects
  • H04N 23/611 - Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
  • H04N 23/62 - Control of parameters via user interfaces
  • H04N 23/63 - Control of cameras or camera modules by using electronic viewfinders
  • H04N 23/667 - Camera operation mode switching, e.g. between still and video, sport and normal or high and low resolution modes
  • H04N 23/68 - Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
  • H04N 23/80 - Camera processing pipelines; Components thereof

90.

Robotic device

      
Application Number 29849352
Grant Number D1034728
Status In Force
Filing Date 2022-08-10
First Publication Date 2024-07-09
Grant Date 2024-07-09
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

91.

Robotic device

      
Application Number 29849353
Grant Number D1034729
Status In Force
Filing Date 2022-08-10
First Publication Date 2024-07-09
Grant Date 2024-07-09
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

92.

Systems and methods for communicating notifications and textual data associated with applications

      
Application Number 18609364
Grant Number 12443391
Status In Force
Filing Date 2024-03-19
First Publication Date 2024-07-04
Grant Date 2025-10-14
Owner Google Technology Holdings LLC (USA)
Inventor
  • Peng, Long
  • Dai, Hui
  • Guan, Xin

Abstract

Embodiments are provided for communicating notifications and other textual data associated with applications installed on an electronic device. According to certain aspects, a user can interface with an input device to send a wake up trigger to the electronic device. The electronic device retrieves application notifications and converts (288) the application notifications to audio data. The electronic device also sends the audio data to an audio output device for annunciation. The user may also use the input device to send a request to the electronic device to activate the display screen. The electronic device identifies an application corresponding to an annunciated notification, and activates the display screen and initiates the application.

IPC Classes

  • G06F 3/0484 - Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
  • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
  • G06F 3/16 - Sound input; Sound output
  • G10L 13/02 - Methods for producing synthetic speech; Speech synthesisers
  • G10L 13/04 - Details of speech synthesis systems, e.g. synthesiser structure or memory management
  • G10L 13/047 - Architecture of speech synthesisers

93.

Method and apparatus for using image data to aid voice recognition

      
Application Number 18606066
Grant Number 12334074
Status In Force
Filing Date 2024-03-15
First Publication Date 2024-07-04
Grant Date 2025-06-17
Owner Google Technology Holdings LLC (USA)
Inventor
  • Zurek, Robert A.
  • Schuster, Adrian M.
  • Shau, Fu-Lin
  • Wu, Jincheng

Abstract

A device performs a method for using image data to aid voice recognition. The method includes the device capturing image data of a vicinity of the device and adjusting, based on the image data, a set of parameters for voice recognition performed by the device. The set of parameters for the device performing voice recognition includes, but is not limited to: a trigger threshold of a trigger for voice recognition; a set of beamforming parameters; a database for voice recognition; and/or an algorithm for voice recognition. The algorithm may include using noise suppression or using acoustic beamforming.
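
One of the adjusted parameters above is the trigger threshold. A minimal sketch of that idea, assuming boolean vision cues and illustrative scaling factors (none of these names or values come from the patent):

```python
def adjust_trigger_threshold(base_threshold, face_visible, lips_moving):
    """Adjust the voice-recognition trigger threshold from image
    cues: lower it when the camera suggests the user is addressing
    the device, so triggering becomes easier."""
    threshold = base_threshold
    if face_visible:
        threshold *= 0.8  # face toward the device
    if lips_moving:
        threshold *= 0.8  # visible speech motion
    return threshold
```

The same pattern extends to the other listed parameters, e.g. steering beamforming toward a detected face.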

IPC Classes

  • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialog
  • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
  • G06V 20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
  • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
  • G06V 40/18 - Eye characteristics, e.g. of the iris
  • G06V 40/19 - Sensors therefor
  • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
  • G10L 15/20 - Speech recognition techniques specially adapted for robustness in adverse environments, e.g. in noise or of stress induced speech
  • G10L 15/24 - Speech recognition using non-acoustical features
  • G10L 15/25 - Speech recognition using non-acoustical features using position of the lips, movement of the lips or face analysis
  • G10L 15/26 - Speech to text systems
  • G10L 21/0208 - Noise filtering
  • G10L 21/0216 - Noise filtering characterised by the method used for estimating noise
  • G10L 25/78 - Detection of presence or absence of voice signals

94.

Methods and apparatus for controlling a gripper of a robotic device

      
Application Number 18545148
Grant Number 12447620
Status In Force
Filing Date 2023-12-19
First Publication Date 2024-07-04
Grant Date 2025-10-21
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Neville, Neil
  • Edelberg, Kyle
  • Gilroy, Scott

Abstract

Methods and apparatus for controlling a robotic gripper of a robotic device are provided. The method includes activating a plurality of vacuum assemblies of the robotic gripper to grasp one or more objects, disabling one or more of the plurality of vacuum assemblies having a seal quality with the one or more objects that is less than a first threshold, assigning a score to each of the one or more disabled vacuum assemblies, reactivating the one or more disabled vacuum assemblies in an order based, at least in part, on the assigned scores, and grasping the one or more objects with the robotic gripper when a grasp quality of the robotic gripper is higher than a second threshold.
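
The disable-then-reactivate logic in the abstract can be sketched as a selection and sort. Assumptions for illustration: seal qualities and scores are parallel lists indexed by assembly, and a higher score means reactivate sooner:

```python
def reactivation_order(seal_quality, scores, seal_threshold):
    """Identify vacuum assemblies whose seal quality with the grasped
    object falls below the threshold, then order them for
    reactivation by assigned score, highest first."""
    disabled = [i for i, q in enumerate(seal_quality) if q < seal_threshold]
    return sorted(disabled, key=lambda i: scores[i], reverse=True)

# Assemblies 1 and 2 seal poorly; assembly 2 scores higher, so it
# is reactivated first.
order = reactivation_order([0.9, 0.3, 0.5, 0.95], [0.0, 0.7, 0.9, 0.0], 0.6)
```

The grasp is then accepted once overall grasp quality clears the second threshold, which the sketch leaves to the caller.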

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means

95.

Robotic device

      
Application Number 29849351
Grant Number D1033501
Status In Force
Filing Date 2022-08-10
First Publication Date 2024-07-02
Grant Date 2024-07-02
Owner Boston Dynamics, Inc. (USA)
Inventor Abroff, Aaron

96.

METHODS AND APPARATUS FOR AUTOMATED CEILING DETECTION

      
Application Number 18545050
Status Pending
Filing Date 2023-12-19
First Publication Date 2024-06-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Merkle, Lukas
  • Turpin, Matthew
  • Tutt, Logan

Abstract

Methods and apparatus for estimating a ceiling location of a container within which a mobile robot is configured to operate are provided. The method comprises sensing distance measurement data associated with the ceiling of the container using one or more distance sensors arranged on an end effector of the mobile robot, and determining a ceiling estimate of the container based on the distance measurement data.
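
A minimal sketch of a ceiling estimate from upward range readings, assuming the sensor height above the floor is known and using a median purely as an illustrative robust aggregate (the filing does not specify the aggregation):

```python
from statistics import median

def ceiling_height(sensor_height, upward_distances):
    """Estimate the container ceiling height as the end-effector
    sensor's height above the floor plus the median upward range
    reading; the median discounts spurious returns."""
    return sensor_height + median(upward_distances)

# Sensor 1.2 m up; readings include one spurious 5.0 m return.
estimate = ceiling_height(1.2, [1.5, 1.4, 1.5, 5.0])
```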

IPC Classes

  • B25J 9/16 - Programme controls
  • G01S 13/88 - Radar or analogous systems, specially adapted for specific applications
  • G05D 1/648 - Performing a task within a working area or space, e.g. cleaning

97.

METHODS AND APPARATUS FOR CONTROLLING A GRIPPER OF A ROBOTIC DEVICE

      
Application Number US2023085076
Publication Number 2024/137781
Status In Force
Filing Date 2023-12-20
Publication Date 2024-06-27
Owner BOSTON DYNAMICS, INC. (USA)
Inventor
  • Neville, Neil
  • Edelberg, Kyle
  • Gilroy, Scott

Abstract

Methods and apparatus for controlling a robotic gripper of a robotic device are provided. The method includes activating a plurality of vacuum assemblies of the robotic gripper to grasp one or more objects, disabling one or more of the plurality of vacuum assemblies having a seal quality with the one or more objects that is less than a first threshold, assigning a score to each of the one or more disabled vacuum assemblies, reactivating the one or more disabled vacuum assemblies in an order based, at least in part, on the assigned scores, and grasping the one or more objects with the robotic gripper when a grasp quality of the robotic gripper is higher than a second threshold.

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 15/06 - Gripping heads with vacuum or magnetic holding means

98.

METHODS AND APPARATUS FOR LIDAR ALIGNMENT AND CALIBRATION

      
Application Number 18545124
Status Pending
Filing Date 2023-12-19
First Publication Date 2024-06-27
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Turpin, Matthew
  • Hoelscher, Andrew
  • Shimelis, Eyassu
  • Murphy, Michael
  • Nehrkorn, Mark
  • Vicentini, Federico
  • Neville, Neil

Abstract

Methods and apparatus for automated calibration for a LIDAR system of a mobile robot are provided. The method comprises capturing a plurality of LIDAR measurements. The plurality of LIDAR measurements include a first set of LIDAR measurements as the mobile robot spins in a first direction at a first location, the first location being a first distance to a calibration target, and a second set of LIDAR measurements as the mobile robot spins in a second direction at a second location, the second location being a second distance to the calibration target, wherein the first direction and the second direction are different and the second distance is different than the first distance. The method further comprises processing the plurality of LIDAR measurements to determine calibration data, and generating alignment instructions for the LIDAR system based, at least in part, on the calibration data.
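
The calibration step — turning the pooled measurements from both spins into calibration data — might, for a constant mount misalignment, reduce to averaging bearing residuals. A hedged sketch; the residual model and function name are assumptions, not the patent's method:

```python
from statistics import mean

def yaw_misalignment(expected_bearings, measured_bearings):
    """Estimate a constant LIDAR mount yaw offset (radians) as the
    mean residual between measured and expected target bearings,
    pooled over the spins at both calibration distances."""
    return mean(m - e for e, m in zip(expected_bearings, measured_bearings))

# A consistent +0.02 rad residual across the spin suggests a
# 0.02 rad mount yaw offset to fold into alignment instructions.
offset = yaw_misalignment([0.0, 1.0, 2.0], [0.02, 1.02, 2.02])
```

Using two distances, as the claim does, separates angular misalignment from translational offset, which a single range cannot.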

IPC Classes

  • G01S 7/497 - Means for monitoring or calibrating
  • G01S 17/931 - Lidar systems, specially adapted for specific applications for anti-collision purposes of land vehicles

99.

ANCHORING BASED TRANSFORMATION FOR ALIGNING SENSOR DATA OF A ROBOT WITH A SITE MODEL

      
Application Number 18531152
Status Pending
Filing Date 2023-12-06
First Publication Date 2024-06-13
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Klingensmith, Matthew Jacob
  • Jonak, Dom
  • Hepler, Leland
  • Basmajian, Christopher
  • Ringley, Brian

Abstract

Systems and methods are described for the display of a transformed virtual representation of sensor data overlaid on a site model. A system can obtain a site model identifying a site. For example, the site model can include a map, a blueprint, or a graph. The system can obtain sensor data from a sensor of a robot. The sensor data can include route data identifying route waypoints and/or route edges associated with the robot. The system can receive input identifying an association between a virtual representation of the sensor data and the site model. Based on the association, the system can transform the virtual representation of the sensor data and instruct display of the transformed data overlaid on the site model.
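
The anchoring-based transformation — mapping robot sensor data into site-model coordinates from user-identified associations — can be illustrated with a 2-D similarity transform solved from two anchor correspondences. A sketch using complex arithmetic (`dst = a*src + b`); the two-anchor setup is an illustrative minimum, not the patent's interface:

```python
def anchor_transform(src_anchors, dst_anchors):
    """Solve a 2-D similarity transform (uniform scale, rotation,
    translation) from two point correspondences by treating each
    (x, y) point as the complex number x + yi."""
    s0, s1 = (complex(x, y) for x, y in src_anchors)
    d0, d1 = (complex(x, y) for x, y in dst_anchors)
    a = (d1 - d0) / (s1 - s0)  # encodes scale and rotation
    b = d0 - a * s0            # translation

    def apply(point):
        z = a * complex(*point) + b
        return (z.real, z.imag)
    return apply

# Two route waypoints pinned to the blueprint fix the mapping for
# every other waypoint and edge.
to_site = anchor_transform([(0.0, 0.0), (1.0, 0.0)],
                           [(10.0, 10.0), (12.0, 10.0)])
```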

IPC Classes

  • G05D 1/222 - Remote-control arrangements operated by humans
  • B62D 57/032 - Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and leg; with alternately or sequentially lifted feet or skid
  • G05D 1/229 - Command input data, e.g. waypoints
  • G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
  • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
  • G06T 11/20 - Drawing from basic elements, e.g. lines or circles

100.

Arm and body coordination

      
Application Number 18443180
Grant Number 12440970
Status In Force
Filing Date 2024-02-15
First Publication Date 2024-06-13
Grant Date 2025-10-14
Owner Boston Dynamics, Inc. (USA)
Inventor
  • Berard, Stephen George
  • Barry, Andrew James
  • Swilling, Benjamin John
  • Rizzi, Alfred Anthony

Abstract

A computer-implemented method, when executed by data processing hardware of a robot having an articulated arm and a base, causes the data processing hardware to perform operations. The operations include determining a first location of a workspace of the articulated arm associated with a current base configuration of the base of the robot. The operations also include receiving a task request defining a task for the robot to perform outside of the workspace of the articulated arm at the first location. The operations also include generating base parameters associated with the task request. The operations further include instructing, using the generated base parameters, the base of the robot to move from the current base configuration to an anticipatory base configuration.
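
The anticipatory move can be sketched in the plane, assuming a circular workspace of radius `reach` around the base; the geometry and names are illustrative, not the patent's base-parameter generation:

```python
import math

def anticipatory_base(base, task, reach):
    """If the task point is outside the arm's reach from the current
    base position, slide the base along the line toward the task
    until the task is just within reach; otherwise stay put."""
    d = math.dist(base, task)
    if d <= reach:
        return base
    frac = (d - reach) / d  # fraction of the gap the base must cover
    return (base[0] + frac * (task[0] - base[0]),
            base[1] + frac * (task[1] - base[1]))

# Task 4 m away with 1 m of reach: the base advances 3 m.
new_base = anticipatory_base((0.0, 0.0), (4.0, 0.0), 1.0)
```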

IPC Classes

  • B25J 9/16 - Programme controls
  • B25J 5/00 - Manipulators mounted on wheels or on carriages