Loading a pallet into a trailer using an autonomous mobile robot. The robot determines a pose of the trailer based on sensor data and navigates to a first goal position inside the trailer, determined based on the pose of the trailer. The robot then side-shifts the fork toward the trailer's side wall until sensors detect contact between the pallet and the side wall, with the pallet positioned above a lip on the trailer wall. The robot retracts the fork by a distance corresponding to the lip's width to prevent the pallet and the side wall of the trailer from scraping each other. The robot then navigates in a straight line forward to a second goal position, which is within a predetermined threshold distance from the drop position. The robot releases the pallet at the second goal position.
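The side-shift step above lends itself to a simple control loop. The sketch below is a minimal illustration under assumed names: the robot interface (side_shift_fork, contact_detected) and the step and travel limits are hypothetical, not the patented controller.

```python
# Minimal sketch of the side-shift / retract step, assuming a hypothetical robot API.
SHIFT_STEP_M = 0.005   # incremental side-shift per control cycle (assumed)
MAX_SHIFT_M = 0.30     # safety limit on total side-shift travel (assumed)


def place_pallet_against_wall(robot, lip_width_m):
    """Side-shift toward the trailer wall until contact, then back off by the lip width."""
    shifted = 0.0
    # Shift the carried pallet toward the side wall until the sensors report contact.
    while not robot.contact_detected() and shifted < MAX_SHIFT_M:
        robot.side_shift_fork(SHIFT_STEP_M)
        shifted += SHIFT_STEP_M
    # Retract by the lip width so the pallet clears the lip and does not scrape the
    # side wall while the robot drives forward to the drop position.
    robot.side_shift_fork(-lip_width_m)
```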
Loading the last few rows of pallets onto a trailer using an autonomous mobile robot. The robot determines that the next pallet to be loaded is the first in a row where the trailer does not have sufficient space to accommodate the robot. The robot identifies a pose of a previous pallet in the immediately prior row and determines a front plane of the immediately prior row. The robot navigates to a first goal position at least partially inside the trailer, determined based on the pose of the previous pallet and the pose of the trailer. The robot then side-shifts the fork toward the trailer's side wall until contact is detected and subsequently adjusts the fork back to prevent scraping. The robot proceeds to a second goal position, which is within a predetermined threshold distance of the drop position, before lowering and releasing the pallet.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
Unloading the first few rows of pallets from a trailer using an autonomous mobile robot. The robot determines that a pallet is the first in a row where the trailer does not have sufficient space to accommodate the robot. The robot determines a pose of each observable pallet in the trailer and determines a front plane for the pallets in the same row as the target pallet. The robot navigates to a first goal position inside the trailer, based on the pallet and trailer poses, then picks up the pallet. The robot side-shifts toward an adjacent pallet until detecting contact, then adjusts back by a predetermined distance to maximize clearance between the pallet and a side wall of the trailer. The robot navigates backward in a straight line to a second goal position on a ramp, and then proceeds to drop off the pallet in the staging area.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
Loading and unloading pallets in a trailer using an autonomous mobile robot. The method involves the robot determining the pose of the trailer and using sensor data to navigate to goal positions inside the trailer for precise pallet placement or retrieval. During loading, the robot side-shifts the fork to align the pallet with the trailer's side wall and retracts the fork to prevent scraping before advancing to a second goal position to release the pallet. For unloading, the robot identifies the pose of the target pallet, navigates to align the fork, lifts the pallet, and proceeds to a drop-off location. In scenarios where space within the trailer is limited, the robot adjusts its movements to avoid scraping and maximize clearance, ensuring efficient loading or unloading operations. The method enables autonomous management of the first or last rows of pallets even in confined spaces.
A method implemented at an autonomous mobile robot equipped with a fork to carry a pallet. The robot transitions between first and second piecewise flat floor segments with differing geometries. The robot uses sensor data from sensors such as LIDAR, stereo cameras, GPS, and ultrasound sensors to determine the transition between the first and second piecewise flat floor segments. The fork operates based on parameters that meet reference constraints. When the robot detects that the second floor segment's geometry would cause these parameters to no longer meet the reference constraints, the robot determines new parameters that will satisfy the reference constraints. Control signals are then sent to adjust the fork's operation as the robot transitions to the second floor segment.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
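As a rough illustration of the parameter re-solve described in the abstract above, the sketch below checks assumed reference constraints (a minimum fork height and a tilt matched to the upcoming incline) and returns adjusted parameters when the next segment's geometry would violate them. The constraint values and the ForkParams fields are assumptions, not the patented method.

```python
from dataclasses import dataclass


@dataclass
class ForkParams:
    height_m: float   # fork height above the floor segment the robot is on
    tilt_deg: float   # fork tilt angle


def params_for_transition(current: ForkParams, next_incline_deg: float,
                          min_height_m: float = 0.05,
                          max_tilt_error_deg: float = 1.0) -> ForkParams:
    """Return fork parameters that keep the (assumed) reference constraints on the
    next piecewise flat floor segment."""
    tilt_error = abs(current.tilt_deg - next_incline_deg)
    if current.height_m >= min_height_m and tilt_error <= max_tilt_error_deg:
        return current  # current parameters still satisfy the constraints
    # Otherwise re-solve: lift to the minimum safe height and match the new incline,
    # then send the corresponding control signals before crossing the transition.
    return ForkParams(height_m=max(current.height_m, min_height_m),
                      tilt_deg=next_incline_deg)
```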
Unloading pallets from a trailer using an autonomous mobile robot. The robot determines a pose of the trailer and a pose of each observable pallet inside. The robot identifies a target pallet for retrieval based on the observed poses and determines a front plane for the pallets in the same row as the target pallet. The robot navigates to a first goal position, side-shifts its fork to align the fork with the target pallet's pockets, inserts the fork, and lifts the pallet. The robot then navigates in reverse to a second goal position, determined based on the front plane and trailer pose. From the second position, the robot proceeds to a drop-off point in a staging area, side-shifting the fork towards the center during transit.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
A method implemented at an autonomous mobile robot equipped with a fork to carry a pallet. The robot transitions between first and second piecewise flat floor segments with differing geometries. The robot uses sensor data from sensors such as LIDAR, stereo cameras, GPS, and ultrasound sensors to determine the transition between the first and second piecewise flat floor segments. The fork operates based on parameters that meet reference constraints. When the robot detects that the second floor segment's geometry would cause these parameters to no longer meet the reference constraints, the robot determines new parameters that will satisfy the reference constraints. Control signals are then sent to adjust the fork's operation as the robot transitions to the second floor segment.
12 - Land, air and water vehicles; parts of land vehicles
Goods & Services
Automatic guided vehicles; Electric vehicles; Self-loading vehicles; Self-propelled electric vehicle; Vehicles; Self-driving robots for delivery; Autonomous forklift trucks for unloading and loading of truck trailers.
9.
AI-POWERED LOAD STABILITY ESTIMATION FOR PALLET HANDLING
An autonomous mobile robot receives sensor data from one or more sensors. The sensor data includes image data depicting a load coupled to a pallet and depth data indicating the distance of surfaces of the load or the pallet from the one or more sensors. A first machine-learning model is applied to the image data to generate a first mask and a second mask. The first mask represents the load, and the second mask represents the pallet. The first mask, the second mask, and/or the depth data are then used to determine a load orientation and a load size. Based on the load orientation and load size, the robot evaluates the load's stability. If the load is deemed safe to lift, the robot is caused to lift the pallet.
B65G 43/02 - Control devices, e.g. for safety, warning or fault-correcting detecting dangerous physical condition of load- carriers, e.g. for interrupting the drive in the event of overheating
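A compact sketch of how the masks and depth data might feed a stability check is shown below. The segmentation call, the plane-fit tilt proxy, the overhang measure, and the thresholds are all assumptions for illustration; they are not the patented pipeline.

```python
import numpy as np

MAX_TILT_DEG = 5.0       # assumed limit on apparent load lean
MAX_OVERHANG_FRAC = 0.1  # assumed limit on load area outside the pallet mask


def is_safe_to_lift(image, depth, segment_load_and_pallet):
    """image: HxWx3 array, depth: HxW array, segment_load_and_pallet: hypothetical
    ML model returning boolean (load_mask, pallet_mask) arrays of shape HxW."""
    load_mask, pallet_mask = segment_load_and_pallet(image)

    # Orientation proxy: fit a plane to the 3D points under the load mask and measure
    # the angle of its normal against the camera's optical axis.
    ys, xs = np.nonzero(load_mask)
    pts = np.column_stack([xs, ys, depth[ys, xs]]).astype(float)
    pts -= pts.mean(axis=0)
    normal = np.linalg.svd(pts, full_matrices=False)[2][-1]
    tilt_deg = np.degrees(np.arccos(min(1.0, abs(normal[2]))))

    # Size proxy: fraction of load pixels that fall outside the pallet footprint.
    overhang = np.logical_and(load_mask, ~pallet_mask).sum() / max(load_mask.sum(), 1)

    return tilt_deg <= MAX_TILT_DEG and overhang <= MAX_OVERHANG_FRAC
```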
A system and method are described that provide for mapping features of a warehouse environment with improved workflow. In one example of the system/method of the present invention, a mapping robot is navigated through a warehouse environment, and sensors of the mapping robot collect geospatial data as part of a mapping mode. A Frontend block of a map framework may be responsible for reading and processing the geospatial data from the sensors of the mapping robot, as well as various other functions. The data may be stored in a keyframe object at a keyframe database. A Backend block of the map framework may be useful for detecting loop constraints, building submaps, optimizing a pose graph using keyframe data from one or more trajectory blocks, and/or various other functions.
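A minimal sketch of the keyframe and pose-graph bookkeeping mentioned above is given below. The field names and structure are illustrative assumptions; the Frontend/Backend split of the actual map framework is not reproduced here.

```python
from dataclasses import dataclass, field


@dataclass
class Keyframe:
    keyframe_id: int
    trajectory_id: int       # which trajectory block the keyframe belongs to
    pose: tuple              # (x, y, heading) estimate when the data was captured
    scan: list               # geospatial sensor data collected by the mapping robot


@dataclass
class PoseGraph:
    keyframes: dict = field(default_factory=dict)    # the keyframe database
    constraints: list = field(default_factory=list)  # odometry and loop constraints

    def add_keyframe(self, kf: Keyframe) -> None:
        # Frontend-style step: store the processed sensor data as a keyframe object.
        self.keyframes[kf.keyframe_id] = kf

    def add_loop_constraint(self, id_a: int, id_b: int, relative_pose: tuple) -> None:
        # Backend-style step: record a detected loop closure between two keyframes,
        # to be used later when optimizing the pose graph and building submaps.
        self.constraints.append((id_a, id_b, relative_pose))
```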
A system and method are described that provide for queueing robot operations in a warehouse environment based on workflow optimization instructions. In one example of the system/method of the present invention, a control system causes certain robots to queue proximate to one another to permit resources to be obtained, transported, deposited, etc. without the robots crashing into one another (or into other objects), or forming traffic jams. A robot may remain at an assigned queue position at least until another position assigned to the robot becomes available.
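The queueing behavior above can be pictured with the small sketch below: a robot keeps its assigned queue position until the slot ahead of it frees up. The data layout and function name are illustrative assumptions.

```python
def advance_queue(queue_positions, robot_at):
    """queue_positions: position ids ordered front (closest to the resource) to back.
    robot_at: dict mapping position id -> robot id, or None if the slot is free."""
    for ahead, behind in zip(queue_positions, queue_positions[1:]):
        if robot_at.get(ahead) is None and robot_at.get(behind) is not None:
            # The slot ahead is free, so the waiting robot advances one position;
            # all other robots hold their assigned positions, avoiding collisions
            # and traffic jams.
            robot_at[ahead] = robot_at[behind]
            robot_at[behind] = None
    return robot_at
```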
In a method for subpixel disparity calculation, image data for various images, each representing a field of view of an input device, is received by a processor, and the image data is applied to a machine learning model. The machine learning model uses the image data to compute an output representing the calculated subpixel disparity between the various images. In an example of the method, the machine learning model is a neural network that produces accurate and reliable subpixel disparity estimation in real time using synthetically generated data.
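The abstract describes a learned model; as a compact stand-in, the sketch below shows the classical parabolic refinement that turns an integer disparity from a matching-cost volume into a subpixel estimate, i.e. the kind of output the neural network is trained to produce. It is not the patented model.

```python
import numpy as np


def subpixel_disparity(cost_volume):
    """cost_volume: (H, W, D) array of matching costs per candidate disparity.
    Returns an (H, W) array of subpixel disparities."""
    d = np.argmin(cost_volume, axis=2)                   # integer disparity per pixel
    d = np.clip(d, 1, cost_volume.shape[2] - 2)          # keep both neighbours in range
    rows, cols = np.indices(d.shape)
    c0 = cost_volume[rows, cols, d - 1]
    c1 = cost_volume[rows, cols, d]
    c2 = cost_volume[rows, cols, d + 1]
    # Fit a parabola through the three costs; its minimum gives the subpixel offset.
    denom = c0 - 2.0 * c1 + c2
    offset = np.where(denom != 0, 0.5 * (c0 - c2) / np.where(denom == 0, 1.0, denom), 0.0)
    return d + offset
```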
To load and unload a trailer, an autonomous mobile robot determines its location and the location of objects within the trailer relative to the trailer itself, rather than relative to a warehouse. The autonomous mobile robot navigates within the trailer and manipulates objects within the trailer from the trailer's reference frame. Additionally, the autonomous mobile robot uses a centerline heuristic to compute a path for itself within the trailer. The centerline heuristic evaluates nodes within the trailer based on how far those nodes are from the centerline: nodes that are farther from the centerline are assigned a higher cost. Thus, when the autonomous mobile robot computes a path, the path is more likely to stay near the centerline of the trailer rather than drift toward the sides.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
14.
PATHFINDING USING CENTERLINE HEURISTICS FOR AN AUTONOMOUS MOBILE ROBOT
To load and unload a trailer, an autonomous mobile robot determines its location and the location of objects within the trailer relative to the trailer itself, rather than relative to a warehouse. The autonomous mobile robot navigates within the trailer and manipulates objects within the trailer from the trailer's reference frame. Additionally, the autonomous mobile robot uses a centerline heuristic to compute a path for itself within the trailer. The centerline heuristic evaluates nodes within the trailer based on how far those nodes are from the centerline: nodes that are farther from the centerline are assigned a higher cost. Thus, when the autonomous mobile robot computes a path, the path is more likely to stay near the centerline of the trailer rather than drift toward the sides.
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
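A minimal sketch of the centerline cost described above, as it might be used inside a grid planner such as A*, is shown below; the trailer-frame coordinates and the weight are assumptions.

```python
def centerline_cost(node_y, trailer_width, weight=1.0):
    """Penalty proportional to a node's lateral distance from the trailer centerline.
    node_y is the lateral coordinate in the trailer's reference frame, with 0 at one
    side wall and trailer_width at the other."""
    return weight * abs(node_y - trailer_width / 2.0)


def node_cost(step_cost, node_y, trailer_width):
    # Total cost of expanding a node: the usual step cost plus the centerline penalty,
    # so computed paths tend to stay near the middle of the trailer.
    return step_cost + centerline_cost(node_y, trailer_width)
```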
An autonomous mobile robot uses a capability-aware pathfinding algorithm to traverse from a start pose to an end pose efficiently and effectively. The robot receives a start pose and an end pose, and determines a primary path from the start pose to the end pose based on a primary pathfinding algorithm. The robot may smooth the primary path using Bezier curves. The robot may identify a conflict point on the primary path that the robot cannot traverse, and may determine a secondary path from a first point before the conflict point to a second point after the conflict point. The secondary path may be generated by a secondary pathfinding algorithm that uses motion primitives of the robot, so that the secondary path respects the robot's motion capabilities. The robot may then traverse from the start pose to the end pose based on the primary path and the secondary path.
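As an illustration of the smoothing step mentioned above, the sketch below rounds each interior waypoint of a primary path with a cubic Bezier segment. The control-point choice is a common heuristic and an assumption here, not necessarily the robot's.

```python
import numpy as np


def cubic_bezier(p0, p1, p2, p3, n=20):
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)


def smooth_path(waypoints, n=20):
    """waypoints: (N, 2) array of primary-path points. Returns a denser, smoothed polyline."""
    pts = np.asarray(waypoints, dtype=float)
    if len(pts) < 3:
        return pts
    out = [pts[:1]]
    for a, b, c in zip(pts[:-2], pts[1:-1], pts[2:]):
        # Round off the corner at b: start and end halfway along the adjacent segments,
        # with both inner control points pulled toward b.
        out.append(cubic_bezier(a + 0.5 * (b - a), b, b, b + 0.5 * (c - b), n))
    out.append(pts[-1:])
    return np.vstack(out)
```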
An autonomous mobile robot may use an improved steering system. The improved steering system may include a steering motor that is operably coupled to a motor shaft. The motor shaft may be aligned at an offset position relative to a center axis of the autonomous mobile robot. The motor shaft may be operably coupled to a front and rear steering linkage. Each steering linkage may include a pitman arm that is coupled to the motor shaft and a drag link. The drag link may be coupled to a first steering arm and a tie rod. The tie rod may also be coupled to a second steering arm. The first steering arm may be coupled to a first wheel and the second steering arm may be coupled to a second wheel.
B62D 7/14 - Steering linkage; Stub axles or their mountings for individually-pivoted wheels, e.g. on king-pins; the pivotal axes being situated in more than one plane transverse to the longitudinal centre line of the vehicle, e.g. all-wheel steering
B62D 5/04 - Power-assisted or power-driven steering electrical, e.g. using an electric servo-motor connected to, or forming part of, the steering gear
17.
AREA-BASED OPERATION BY AUTONOMOUS ROBOTS IN A FACILITY CONTEXT
A system and a method are disclosed that identify a source area within a facility comprising a plurality of objects, and determine a destination area within the facility to which the plurality of objects are to be transported and unloaded. The system selects robots within the facility based on a capability of the robots and/or a location of the robots within the facility. The system provides an instruction to the robots to transport the plurality of objects from the source area to the destination area. The robots are configured to autonomously select an object based on a position and location of the object within the source area, transport the selected object to the destination area along a route selected by the robot, and unload the selected object at a location within the destination area selected based on a number of objects yet to be unloaded within the destination area.
B65G 1/06 - Storage devices mechanical with means for presenting articles for removal at predetermined position or level
B65G 1/137 - Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/02 - Control of position or course in two dimensions
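A small sketch of the robot-selection step described above is given below: filter by capability, then prefer robots nearest the source area. The record layout and distance criterion are assumptions.

```python
import math


def select_robots(robots, required_capability, source_xy, count):
    """robots: iterable of dicts like {"id": ..., "capabilities": {...}, "xy": (x, y)}.
    Returns up to `count` robots that have the capability, nearest to the source first."""
    capable = [r for r in robots if required_capability in r["capabilities"]]
    capable.sort(key=lambda r: math.dist(r["xy"], source_xy))
    return capable[:count]
```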
A system and a method are disclosed where an autonomous robot captures an image of an object to be transported from a source to a destination. The robot generates a bounding box within the image surrounding the object. The robot applies a machine-learned model to the image with the bounding box, the machine-learned model configured to identify an object type of the object, and to identify features of the object based on the identified object type and the image. The robot determines which of the identified features of the object are visible to the autonomous robot, and determines a three-dimensional pose of the object based on the features determined to be visible to the autonomous robot.
B65G 1/06 - Storage devices mechanical with means for presenting articles for removal at predetermined position or level
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05D 1/02 - Control of position or course in two dimensions
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B65G 1/137 - Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
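Once the visible features and their image locations are known, as described in the abstract above, the 3D pose can be recovered with a PnP solve; the sketch below uses OpenCV's solvePnP as a stand-in for whatever solver the robot runs, and the feature format and camera intrinsics are assumptions.

```python
import numpy as np
import cv2


def object_pose(visible_features, camera_matrix):
    """visible_features: list of (object_xyz, image_xy) pairs for the features the
    model judged visible (at least four are needed). camera_matrix: 3x3 intrinsics."""
    object_pts = np.array([f[0] for f in visible_features], dtype=np.float64)
    image_pts = np.array([f[1] for f in visible_features], dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, camera_matrix, None)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation of the object in the camera frame
    return rotation, tvec               # translation of the object in the camera frame
```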
A system and a method are disclosed that generate for display to a remote operator a user interface comprising a map, the map comprising visual representations of a source area, a plurality of candidate robots, and a plurality of candidate destination areas. The system receives, via the user interface, a selection of a visual representation of a candidate robot of the plurality of candidate robots, and detects a drag-and-drop gesture within the user interface of the visual representation of the candidate robot being dragged-and-dropped to a visual representation of a candidate destination area of the plurality of candidate destination areas. Responsive to detecting the drag-and-drop gesture, the system generates a mission, where the mission causes the candidate robot to autonomously transport an object from the source area to the candidate destination area.
B65G 1/06 - Storage devices mechanical with means for presenting articles for removal at predetermined position or level
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05D 1/02 - Control of position or course in two dimensions
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B65G 1/137 - Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
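The gesture-to-mission step described above can be summarized by the small sketch below; the mission fields are illustrative assumptions rather than the system's actual schema.

```python
def on_robot_dropped(robot_id, source_area_id, destination_area_id):
    """Called when the operator drags a robot's icon onto a candidate destination area."""
    return {
        "robot": robot_id,
        "pick_from": source_area_id,     # area whose objects are to be transported
        "drop_at": destination_area_id,  # area the robot icon was dropped onto
        "autonomous": True,              # the robot plans and executes the transport itself
    }
```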
21.
SAFETY MODE TOGGLING BY AUTONOMOUS ROBOTS IN A FACILITY CONTEXT
A system and a method are disclosed that cause a robot to traverse along a route based on a first minimum distance to be maintained between the autonomous mobile robot and an obstacle, the first minimum distance corresponding to a first mode. The robot determines that the route cannot be continued without the distance between the robot and a detected obstacle becoming less than the first minimum distance, and responsively determines whether the route can be continued without that distance becoming less than a second minimum distance that is less than the first minimum distance, the second minimum distance corresponding to a second mode. Responsive to determining that the route can be continued without the distance between the autonomous mobile robot and the detected obstacle becoming less than the second minimum distance, the robot is configured to operate in the second mode and continues traversal of the route.
B65G 1/137 - Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
B65G 1/06 - Storage devices mechanical with means for presenting articles for removal at predetermined position or level
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05D 1/02 - Control of position or course in two dimensions
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06F 18/21 - Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
G06F 3/04847 - Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
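The mode decision described above reduces to a clearance comparison, sketched below with assumed numeric thresholds.

```python
FIRST_MODE_MIN_M = 0.50   # minimum obstacle clearance in the first mode (assumed)
SECOND_MODE_MIN_M = 0.20  # smaller clearance tolerated in the second mode (assumed)


def choose_mode(predicted_clearance_m):
    """predicted_clearance_m: closest approach to the detected obstacle along the route."""
    if predicted_clearance_m >= FIRST_MODE_MIN_M:
        return "first_mode"    # keep the normal safety margin
    if predicted_clearance_m >= SECOND_MODE_MIN_M:
        return "second_mode"   # accept the tighter margin and continue the route
    return "stop"              # the route cannot be continued safely in either mode
```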
22.
DYNAMIC TRAVERSAL PROTOCOL SELECTION BY AUTONOMOUS ROBOTS IN A FACILITY CONTEXT
A system and a method are disclosed where a robot operating under a first traversal protocol traverses autonomously along a first route that is defined by markers that are detectable by the robot, wherein the robot is configured to move only based on a presence and type of each marker when the robot is configured to operate based on the first traversal protocol. The robot detects, while traversing along the first route, a triggering condition corresponding to a change in operation by the robot from the first traversal protocol to a second traversal protocol. Responsive to detecting the triggering condition, the robot is configured to operate in the second traversal protocol, wherein the robot, when configured to operate based on the second traversal protocol, determines a second route autonomously without regard to a presence of any of the markers.
B65G 1/06 - Storage devices mechanical with means for presenting articles for removal at predetermined position or level
B65G 1/137 - Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
B66F 9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G05D 1/222 - Remote-control arrangements operated by humans
G05D 1/223 - Command input arrangements on the remote controller, e.g. joysticks or touch screens
G05D 1/224 - Output arrangements on the remote controller, e.g. displays, haptics or speakers
G05D 1/225 - Remote-control arrangements operated by off-board computers
G05D 1/226 - Communication links with the remote-control arrangements
G05D 1/227 - Handing over between remote control and on-board control; Handing over between remote control arrangements
G05D 1/247 - Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
G05D 1/617 - Safety or protection, e.g. defining protection zones around obstacles or avoiding hazards
G05D 1/646 - Following a predefined trajectory, e.g. a line marked on the floor or a flight path
G05D 1/69 - Coordinated control of the position or course of two or more vehicles
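The protocol hand-off described above can be pictured as the small state machine sketched below; the robot interface calls and the example trigger are placeholders, not the patented system.

```python
def traverse(robot):
    protocol = "marker_following"  # first traversal protocol
    while not robot.at_destination():
        if protocol == "marker_following":
            marker = robot.next_detected_marker()
            robot.move_per_marker(marker)              # motion dictated by marker presence/type
            if robot.triggering_condition_detected():  # e.g. the marked route has ended (assumed)
                protocol = "autonomous"                # switch to the second traversal protocol
        else:
            route = robot.plan_route()                 # planned without regard to any markers
            robot.follow(route)
```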