A base station may include a parking plate on which the drone is to be landed. The base station may include a plurality of pushers slidable on the parking plate, the plurality of pushers configured to push the legs of the drone to move the drone towards a reference location on the parking plate. The base station may include a gripper carried by the parking plate, the gripper configured to removably secure the body of the drone in place relative to the reference location. The base station may include an alignment sensor carried by the parking plate and positioned at the reference location, the alignment sensor configured to detect whether a marker on the drone is in alignment with the reference location.
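The abstract above describes a mechanical docking sequence rather than an algorithm, but the control flow it implies can be sketched. The Python below is purely illustrative: the class BaseStationSim, its method names, and all numeric values are assumptions, not details from the publication.

import random

class BaseStationSim:
    # Toy stand-in for the base-station hardware; names are illustrative only.
    def __init__(self):
        self.offset_mm = 40.0          # initial landing offset from the reference location
        self.gripped = False

    def close_pushers(self):
        # each pusher stroke nudges the drone legs closer to the reference location
        self.offset_mm = max(0.0, self.offset_mm - random.uniform(15.0, 30.0))

    def marker_aligned(self, tolerance_mm=2.0):
        # alignment sensor at the reference location checks the drone marker
        return self.offset_mm <= tolerance_mm

    def engage_gripper(self):
        self.gripped = True

def secure_drone(station, max_strokes=5):
    # Centering-and-securing sequence suggested by the abstract: check alignment, push, grip.
    for _ in range(max_strokes):
        if station.marker_aligned():
            station.engage_gripper()
            return True
        station.close_pushers()
    return False

if __name__ == "__main__":
    print("secured:", secure_drone(BaseStationSim()))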
An aerial robot includes an image sensor for capturing images of an environment. The robot receives a first image captured at a first location. The robot identifies one or more first pixels in the first image. The first pixels correspond to one or more targeted features of an object identified in the first image. The robot receives a second image captured at a second location. The robot receives distance data that estimates its movement from the first location to the second location. The robot identifies second pixels in the second image. The second pixels correspond to the targeted features of the object as they appear in the second image. The robot determines an estimated distance between the robot and the object based on the change in the locations of the second pixels relative to the first pixels, given the movement of the robot provided by the distance data.
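The abstract outlines distance estimation from the pixel displacement of tracked features between two views separated by a known movement. The publication gives no formulas; the sketch below assumes, for simplicity, a purely lateral translation between captures and uses standard two-view triangulation. The function name, variable names, and example numbers are illustrative.

def estimate_depth(px_first, px_second, baseline_m, focal_px):
    # Depth from the pixel shift of the same feature across two views.
    # Assumes a purely lateral camera translation of baseline_m metres between captures.
    # px_first / px_second: horizontal pixel coordinates of the tracked feature.
    # focal_px: focal length expressed in pixels.
    disparity = abs(px_second - px_first)
    if disparity == 0:
        raise ValueError("no pixel shift: object too far or movement too small")
    return focal_px * baseline_m / disparity

# Example: a 0.5 m sideways move shifts a feature by 24 px with f = 600 px.
print(f"estimated distance: {estimate_depth(412, 436, 0.5, 600.0):.1f} m")   # 12.5 m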
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 20/17 - Terrestrial scenes taken from planes or by drones
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
3.
AUTONOMOUS INVENTORY MANAGEMENT USING BATTERY SWAPPING DRONES
A computing device may generate a command for performing inventory management of a storage site based on inventory management data. The computing device may cause an inventory aerial robot to perform an inventory management trip based on the command. The inventory management trip may include receiving an input that includes coordinates of a plurality of target locations in the storage site, departing from a base station, navigating through the storage site to the plurality of target locations, capturing an image associated with an inventory item location at one of the target locations, and returning to the base station. The computing device may perform an analysis of the image captured by the inventory aerial robot. The base station may swap a battery pack of the inventory aerial robot to prepare the robot for another inventory management trip.
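As a rough illustration of the trip workflow described above (depart, visit targets, capture images, return, swap), here is a minimal Python sketch. The ToyRobot class and every method name are invented stand-ins, not the publication's interfaces.

class ToyRobot:
    # Minimal stand-in for the inventory aerial robot (names are illustrative).
    def __init__(self):
        self.position = "base"
    def depart(self):
        self.position = "airborne"
    def navigate_to(self, target):
        self.position = target
    def capture_image(self):
        return f"image@{self.position}"      # placeholder for a real capture
    def return_to_base(self):
        self.position = "base"

def run_inventory_trip(robot, targets):
    # One trip as outlined in the abstract: depart, visit each target location,
    # capture an image per inventory location, and return for a battery swap.
    robot.depart()
    captured = {}
    for target in targets:
        robot.navigate_to(target)
        captured[target] = robot.capture_image()
    robot.return_to_base()
    return captured                           # images handed off for analysis

if __name__ == "__main__":
    print(run_inventory_trip(ToyRobot(), [(3, 1, 2), (3, 1, 3)]))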
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B64U 10/14 - Flying platforms with four distinct rotor axes, e.g. quadcopters
B64U 80/25 - Transport or storage specially adapted for unmanned aerial vehicles with arrangements for servicing the UAV for recharging batteries; Transport or storage specially adapted for unmanned aerial vehicles with arrangements for servicing the UAV for refuelling
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
B64U 101/70 - Unmanned aerial vehicles specially adapted for specific uses or applications for use inside enclosed spaces, e.g. in buildings or in vehicles
Embodiments relate to a drone and a base station. The drone is adapted to navigate a storage site and carry a swappable battery. The base station is adapted to receive the drone to perform a battery swap. The base station may include charging ports to recharge swappable batteries and may be adapted to predict a return timing of the drone, charge a battery to a target level in anticipation of the return timing, and, after the drone returns to the base station, swap the drone's battery with the charged battery.
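One way to read "charge a battery to a target level in anticipation of the return timing" is as a scheduling calculation. The back-of-the-envelope Python below assumes a constant charge rate; the function name, units, and numbers are assumptions, not the publication's model.

def required_charge_start(predicted_return_s, target_soc, current_soc, charge_rate_soc_per_s):
    # Latest time (seconds from now) at which charging must start so that a spare
    # pack reaches target_soc (%) by the drone's predicted return.
    needed = max(0.0, target_soc - current_soc)
    charge_duration = needed / charge_rate_soc_per_s
    return max(0.0, predicted_return_s - charge_duration)

# Example: return predicted in 1800 s, spare pack at 40 %, charging at 0.05 %/s,
# target 95 % -> charging must start no later than 700 s from now.
print(required_charge_start(1800, 95.0, 40.0, 0.05))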
B60L 58/10 - Methods or circuit arrangements for monitoring or controlling batteries or fuel cells, specially adapted for electric vehicles, for monitoring or controlling batteries
B64U 70/90 - Launching from or landing on platforms
H02J 7/00 - Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
A base station may include a parking plate on which the drone is to be landed. The base station may include pushers slidable on the parking plate. The pushers may be configured to push the legs of the drone to move the drone towards a reference location on the parking plate. The base station may include a gripper carried by the parking plate. The gripper may removably secure the body of the drone in place relative to the reference location. The base station may include an alignment sensor that detects whether a marker on the drone is in alignment with the reference location. A base station may also include a battery swapping system to replace the battery of the drone after the drone is secured.
B64U 70/97 - Means for guiding the UAV to a specific location on the platform, e.g. platform structures preventing landing off the platform
B64F 1/22 - Ground or aircraft-carrier-deck installations for handling aircraft
B64F 5/10 - Manufacture or assembly of aircraft, e.g. jigs therefor
An aerial drone may include a drone body having a longitudinal housing carrying a processing circuit and a battery, the longitudinal housing extending in a first direction. The drone may include a sensor rod carried by the drone body and extending from the drone body in a second direction different from the first direction, the sensor rod carrying a sensor at a distal end of the sensor rod. The drone may include a propeller guard connected to the distal end of the sensor rod and supported at least partially by the sensor rod, the propeller guard forming part of a periphery of the aerial drone.
A drone may include a drone body for carrying a battery pack. The drone may include a slide guide carried by the drone body, wherein the slide guide suspends from a surface of the drone body and creates a channel between the slide guide and the surface, and wherein the battery pack is slidable along the channel. The drone may include a slide-guide contact sensor carried on the surface of the drone body, the slide-guide contact sensor configured to detect whether the battery pack is in contact with the surface. The drone may include a connection port carried by the drone body. The drone may include a port contact sensor carried by the drone body, the port contact sensor configured to detect whether the battery pack is slid into the connection port.
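The two contact sensors suggest a simple seating check. The sketch below is an interpretation only; the status labels are invented for illustration.

def battery_seated(slide_guide_contact, port_contact):
    # Interpret the two contact sensors described in the abstract.
    if slide_guide_contact and port_contact:
        return "installed"                   # pack flush on the surface and in the port
    if slide_guide_contact:
        return "in channel, not connected"   # sliding along the channel, not yet at the port
    return "not inserted"

for sensors in [(True, True), (True, False), (False, False)]:
    print(sensors, "->", battery_seated(*sensors))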
A base station may include a frame configured to provide mechanical support to the base station. The base station may include a parking plate carried by the frame, the parking plate comprising a landing surface configured to be in contact with the drone when the drone is landed and a component-carrying surface opposing the landing surface. The base station may include a plurality of chargers carried on the component-carrying surface of the parking plate. The base station may include a battery pack carrier carried on the component-carrying surface of the parking plate, the battery pack carrier movable to carry a battery pack connected to one of the chargers to the drone. The base station may include a shutter on the parking plate, the shutter openable to provide access of the battery pack carrier from the component-carrying surface to the landing surface.
A base station may include a cabinet comprising a parking plate and one or more walls forming an enclosure for one or more internal components of the base station, the parking plate configured to receive a drone. The base station may include a plurality of chargers carried within the enclosure, each charger configured to provide power to a battery pack being charged at the charger. The base station may include a temperature sensor carried within the enclosure, the temperature sensor configured to measure a temperature within the enclosure. The base station may include a temperature regulator configured to regulate the temperature within the enclosure to maintain the temperature of a plurality of battery packs charged at the plurality of chargers within a temperature range.
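The temperature regulator can be pictured as a simple hysteresis controller. The thresholds and the bang-bang strategy in this Python sketch are assumptions; the publication only states that the battery packs are kept within a temperature range.

def regulate(temp_c, heating, cooling, t_min=15.0, t_max=35.0, margin=2.0):
    # Keep the enclosure temperature inside [t_min, t_max] with simple hysteresis.
    if temp_c < t_min:
        heating, cooling = True, False
    elif temp_c > t_max:
        heating, cooling = False, True
    elif t_min + margin <= temp_c <= t_max - margin:
        heating = cooling = False            # well inside the band: idle
    return heating, cooling

state = (False, False)
for reading in [12.0, 16.0, 25.0, 36.5, 33.5, 30.0]:
    state = regulate(reading, *state)
    print(f"{reading:5.1f} C -> heat={state[0]} cool={state[1]}")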
H01M 10/46 - Accumulators structurally combined with charging apparatus
H01M 10/48 - Accumulators combined with arrangements for measuring, testing or indicating the condition of cells, e.g. the level or density of the electrolyte
H01M 10/617 - Types of temperature control for achieving uniformity or a desired distribution of temperature
H01M 50/249 - Mountings; Secondary casings or frames; Racks, modules or packs; Suspension devices; Shock absorbers; Transport or carrying devices; Holders specially adapted for aircraft or vehicles, e.g. cars or trains
H02J 7/00 - Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries
A base station may include a parking plate on which the drone is to be landed. The base station may include a plurality of pushers slidable on the parking plate, the plurality of pushers configured to push the legs of the drone to move the drone towards a reference location on the parking plate. The base station may include a gripper carried by the parking plate, the gripper configured to removably secure the body of the drone in place relative to the reference location. The base station may include an alignment sensor carried by the parking plate and positioned at the reference location, the alignment sensor configured to detect whether a marker on the drone is in alignment with the reference location.
A base station may include a parking plate for receiving a drone. The base station may include a gripper carried by the parking plate, the gripper configured to secure the drone in place at a parking position at the parking plate. The base station may include a plurality of chargers, each charger comprising a battery latch and a power port, the battery latch configured to mechanically hold a battery pack in place with the charger and the power port configured to provide power to the battery pack being charged. The base station may include a battery pack carrier movable among the plurality of chargers and the parking position, the battery pack carrier configured to: remove a first battery pack from the drone and move the first battery pack to a first charger, and carry and install a second battery pack to the drone.
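The carrier's remove-then-install sequence can be sketched as follows. The Charger class, its fields, and the selection rule (install the fullest charged pack) are assumptions made for illustration.

class Charger:
    # Toy charger slot holding at most one battery pack (names are illustrative).
    def __init__(self, pack_soc=None):
        self.pack_soc = pack_soc             # None means the slot is empty
    def dock(self, soc):
        self.pack_soc = soc                  # battery latch holds the pack; power port charges it
    def undock(self):
        soc, self.pack_soc = self.pack_soc, None
        return soc

def swap_battery(drone_soc, chargers):
    # Swap sequence suggested by the abstract: move the depleted pack to a free
    # charger, then install the fullest charged pack on the drone.
    free = next(c for c in chargers if c.pack_soc is None)
    best = max((c for c in chargers if c.pack_soc is not None), key=lambda c: c.pack_soc)
    new_soc = best.undock()
    free.dock(drone_soc)
    return new_soc                           # state of charge now on the drone

slots = [Charger(95.0), Charger(60.0), Charger(None)]
print("drone now at", swap_battery(12.0, slots), "%")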
A robotic control system may be configured to translate a first set of location coordinates based on a first location numbering system to a second set of location coordinates based on a unified location numbering system. The robotic control system may receive layout data of a warehouse, the layout data containing a plurality of location coordinates of racks and storage locations. The location coordinates may be of a first format based on the first location numbering system that is specific to the warehouse. The robotic control system may analyze the format of the location coordinates to select, from a plurality of candidate conversion algorithms, a suitable conversion algorithm to translate the plurality of location coordinates of the first format to a second format based on the unified numbering system. The robotic control system may store the translated location coordinates for use in generating a topometric map of the warehouse.
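The format analysis and algorithm selection step can be illustrated with pattern matching over sample coordinates. The two formats below are invented examples; the publication does not disclose concrete site-specific numbering schemes.

import re

# Candidate parsers for site-specific location strings (formats are invented examples).
CANDIDATES = [
    # e.g. "A-03-02" -> (aisle, bay, level)
    (re.compile(r"^([A-Z])-(\d{2})-(\d{2})$"),
     lambda m: (ord(m[1]) - ord("A") + 1, int(m[2]), int(m[3]))),
    # e.g. "12.005.3" -> (aisle, bay, level)
    (re.compile(r"^(\d+)\.(\d+)\.(\d+)$"),
     lambda m: (int(m[1]), int(m[2]), int(m[3]))),
]

def select_converter(samples):
    # Pick the candidate whose pattern matches every sample coordinate.
    for pattern, convert in CANDIDATES:
        if all(pattern.match(s) for s in samples):
            return lambda s: convert(pattern.match(s))
    raise ValueError("no candidate conversion algorithm fits this layout")

convert = select_converter(["A-03-02", "B-11-04"])
print([convert(s) for s in ["A-03-02", "B-11-04"]])   # unified (aisle, bay, level) tuples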
G05D 1/246 - Arrangements for determining position or orientation using environment maps, e.g. simultaneous localisation and mapping [SLAM]
G05D 1/244 - Arrangements for determining position or orientation using passive navigation aids external to the vehicle, e.g. markers, reflectors or magnetic means
G05D 1/46 - Control of position or course in three dimensions
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
G05D 107/70 - Industrial sites, e.g. warehouses or factories
B64U 101/25 - Unmanned aerial vehicles specially adapted for specific uses or applications for manufacturing or servicing
15.
TOPOMETRIC MAP BASED AUTONOMOUS NAVIGATION FOR INVENTORY DRONE
A topometric map enables autonomous navigation of an inventory robot. The topometric map is generated from the layout of a storage site and is made up of vertices and edges. Vertices are generated at pallet locations and other structural locations, and edges are generated between neighboring vertices. Vertices and edges have associated metrics that aid in the routing of the robot. The metrics of the vertices and edges may be updated as the robot navigates through the storage site using a perception engine and state estimator. The metrics of the edges can be used to calculate an energy cost. The robot determines a shortest path between a source vertex and a destination vertex based on the energy cost associated with each edge.
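The shortest-path step over energy-weighted edges is a standard graph search; the sketch below uses Dijkstra's algorithm over a toy topometric graph. The vertex names, edge costs, and the choice of Dijkstra are assumptions for illustration.

import heapq

def shortest_energy_path(edges, source, destination):
    # Dijkstra over a topometric graph whose edge weights are energy costs.
    # edges maps a vertex to a list of (neighbour, energy_cost) pairs.
    best = {source: 0.0}
    prev = {}
    queue = [(0.0, source)]
    while queue:
        cost, v = heapq.heappop(queue)
        if v == destination:
            break
        if cost > best.get(v, float("inf")):
            continue
        for nxt, e in edges.get(v, []):
            new_cost = cost + e
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                prev[nxt] = v
                heapq.heappush(queue, (new_cost, nxt))
    path, v = [destination], destination
    while v != source:
        v = prev[v]
        path.append(v)
    return list(reversed(path)), best[destination]

graph = {"A1": [("A2", 2.0), ("B1", 5.0)],
         "A2": [("B2", 1.5)],
         "B1": [("B2", 2.0)],
         "B2": []}
print(shortest_energy_path(graph, "A1", "B2"))   # (['A1', 'A2', 'B2'], 3.5)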
A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures to navigate through the storage site using various object detection and image segmentation techniques. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images from the image sensor as it moves along the path. The robot analyzes the captured images to determine its current location along the path by tracking the number of regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
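The counting-based localization can be pictured as tallying detections of rack bars along the path. The detection labels and the stop rule in the Python sketch below are assumptions made for illustration.

def localize_by_counting(detections, target_column, target_row):
    # Track how many rack columns (vertical bars) and rows (horizontal bars) have
    # been passed; stop when the counts match the target cell.
    columns_passed = rows_passed = 0
    for label in detections:
        if label == "vertical_bar":
            columns_passed += 1
        elif label == "horizontal_bar":
            rows_passed += 1
        if columns_passed == target_column and rows_passed == target_row:
            return True, (columns_passed, rows_passed)   # arrived at the target cell
    return False, (columns_passed, rows_passed)

stream = ["vertical_bar", "vertical_bar", "horizontal_bar", "vertical_bar", "horizontal_bar"]
print(localize_by_counting(stream, target_column=3, target_row=2))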
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B64C 39/02 - Aircraft not otherwise provided for characterised by special use
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures to navigate through the storage site using various object detection and image segmentation techniques. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images from the image sensor as it moves along the path. The robot analyzes the captured images to determine its current location along the path by tracking the number of regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
B64C 39/02 - Aircraft not otherwise provided for characterised by special use
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
An aerial robot may include a distance sensor and a visual inertial sensor. The aerial robot may determine a first height estimate relative to a first region with a first surface level using data from the distance sensor. The aerial robot may fly over at least a part of the first region based on the first height estimate. The aerial robot may determine that it is in a transition region between the first region and a second region with a second surface level different from the first surface level. The aerial robot may determine a second height estimate using data from the visual inertial sensor. The aerial robot may control its flight using the second height estimate in the transition region. In the second region, the aerial robot may revert to using the distance sensor to estimate its height.
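The switch between height sources can be sketched as follows. Detecting the transition from a jump in the range readings, and the 0.5 m threshold, are assumptions; the publication gives no numbers.

def fused_height(ranges_m, vio_heights_m, jump_threshold_m=0.5):
    # Per-sample height estimate following the abstract's strategy: use the downward
    # distance sensor while the measured surface level is consistent, and fall back to
    # the visual-inertial estimate across a transition where the range jumps.
    heights = [ranges_m[0]]
    for prev_r, r, vio in zip(ranges_m, ranges_m[1:], vio_heights_m[1:]):
        in_transition = abs(r - prev_r) > jump_threshold_m
        heights.append(vio if in_transition else r)
    return heights

# Floor (~3 m below the robot), then a pallet stack (~1.2 m below), then floor again.
ranges = [3.0, 3.0, 1.2, 1.2, 3.0]
vio    = [3.0, 3.0, 3.0, 3.1, 3.1]
print(fused_height(ranges, vio))    # [3.0, 3.0, 3.0, 1.2, 3.1]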
An aerial robot includes an image sensor for capturing images of an environment. The robot receives a first image captured at a first location. The robot identifies one or more first pixels in the first image. The first pixels correspond to one or more targeted features of an object identified in the first image. The robot receives a second image captured at a second location. The robot receives distance data that estimates its movement from the first location to the second location. The robot identifies second pixels in the second image. The second pixels correspond to the targeted features of the object as they appear in the second image. The robot determines an estimated distance between the robot and the object based on the change in the locations of the second pixels relative to the first pixels, given the movement of the robot provided by the distance data.
G05D 1/10 - Simultaneous control of position or course in three dimensions
B64C 39/02 - Aircraft not otherwise provided for characterised by special use
G01C 21/16 - Navigation; Navigational instruments not provided for in groups by using measurement of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
G06V 10/44 - Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
G06V 10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
G06V 10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
G06V 20/17 - Terrestrial scenes taken from planes or by drones
G06V 20/58 - Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
21.
THIN OBJECT DETECTION AND AVOIDANCE IN AERIAL ROBOTS
An aerial robot includes an image sensor for capturing images of an environment. The robot receives a first image captured at a first location. The robot identifies one or more first pixels in the first image. The first pixels correspond to one or more targeted features of an object identified in the first image. The robot receives a second image captured at a second location. The robot receives distance data that estimates its movement from the first location to the second location. The robot identifies second pixels in the second image. The second pixels correspond to the targeted features of the object as they appear in the second image. The robot determines an estimated distance between the robot and the object based on the change in the locations of the second pixels relative to the first pixels, given the movement of the robot provided by the distance data.
G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
An aerial robot may include a distance sensor and a visual inertial sensor. The aerial robot may determine a first height estimate relative to a first region with a first surface level using data from the distance sensor. The aerial robot may fly over at least a part of the first region based on the first height estimate. The aerial robot may determine that it is in a transition region between the first region and a second region with a second surface level different from the first surface level. The aerial robot may determine a second height estimate using data from the visual inertial sensor. The aerial robot may control its flight using the second height estimate in the transition region. In the second region, the aerial robot may revert to using the distance sensor to estimate its height.
A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures to navigate through the storage site using various object detection and image segmentation techniques. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images from the image sensor as it moves along the path. The robot analyzes the captured images to determine its current location along the path by tracking the number of regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
B64U 101/30 - Unmanned aerial vehicles specially adapted for specific uses or applications for imaging, photography or videography
A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures to navigate through the storage site using various object detection and image segmentation techniques. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images from the image sensor as it moves along the path. The robot analyzes the captured images to determine its current location along the path by tracking the number of regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
G05D 1/02 - Control of position or course in two dimensions
G05D 1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
An aerial robot may include a distance sensor and a visual inertial sensor. The aerial robot may determine a first height estimate relative to a first region with a first surface level using data from the distance sensor. The aerial robot may fly over at least a part of the first region based on the first height estimate. The aerial robot may determine that it is in a transition region between the first region and a second region with a second surface level different from the first surface level. The aerial robot may determine a second height estimate using data from the visual inertial sensor. The aerial robot may control its flight using the second height estimate in the transition region. In the second region, the aerial robot may revert to using the distance sensor to estimate its height.
A robot includes an image sensor that captures images of the environment of a storage site. The robot visually recognizes regularly shaped structures to navigate through the storage site using various object detection and image segmentation techniques. In response to receiving a target location in the storage site, the robot moves toward the target location along a path. The robot receives images from the image sensor as it moves along the path. The robot analyzes the captured images to determine its current location along the path by tracking the number of regularly shaped structures in the storage site that it has passed. The regularly shaped structures may be racks, horizontal bars of the racks, and vertical bars of the racks. The robot can identify the target location by counting the number of rows and columns it has passed.
An aerial robot includes an image sensor for capturing images of an environment. The robot receives a first image captured at a first location. The robot identifies one or more first pixels in the first image. The first pixels correspond to one or more targeted features of an object identified in the first image. The robot receives a second image captured at a second location. The robot receives distance data that estimates its movement from the first location to the second location. The robot identifies second pixels in the second image. The second pixels correspond to the targeted features of the object as they appear in the second image. The robot determines an estimated distance between the robot and the object based on the change in the locations of the second pixels relative to the first pixels, given the movement of the robot provided by the distance data.
G06T 7/521 - Depth or shape recovery from laser ranging, e.g. using interferometry; Depth or shape recovery from the projection of structured light
G06T 7/70 - Determining position or orientation of objects or cameras
G06V 10/46 - Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
G06V 10/764 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects