Towards a safe interaction between humans and industrial robots through perception algorithms and control strategies
RAGAGLIA, MATTEO
Abstract
In recent years, the need for greater flexibility in industrial production has drawn growing attention, in the field of industrial robotics, to the possibility of having humans work in direct contact with robots. Indeed, Human-Robot Interaction (HRI) is widely regarded as the key factor that will enable industrial robots to spread into Small and Medium-sized Enterprises (SMEs). Nevertheless, HRI introduces a series of safety issues that are uncommon in industrial settings, where physical separation between robot and human workspaces is typically enforced. In order to achieve safe and efficient HRI, this thesis was developed around two main goals: i) enhancing the perception capabilities of a typical industrial robot control system by integrating information coming from different exteroceptive sensors, such as RGB and depth cameras; ii) developing reactive control strategies and trajectory generation algorithms that not only rely on the information acquired by these sensors, but also guarantee human workers' safety by satisfying safety standards and regulations. From the perception perspective, two main problems have been addressed. First, we developed sensor fusion strategies able to merge information coming from several RGB cameras, from multiple depth cameras, or from both kinds of sensors. Then we chose a simple yet effective human kinematic model and developed algorithms able to detect, track and predict human motion on the basis of the information acquired via sensor fusion. Moving from the perception domain to the control perspective, we first approached the problem of formalizing safety requirements and regulations mathematically. To this end, "safety constraints" have been formalized to express collision avoidance requirements with respect to both a priori known obstacles and obstacles perceived at runtime.
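A collision-avoidance constraint of this kind can be sketched, for instance, as a minimum-distance requirement; the notation below is illustrative and not taken from the thesis:

```latex
% Illustrative distance-based safety constraint: every control point
% p_i on the robot structure must keep at least d_safe away from every
% obstacle o_j, whether known a priori or perceived at runtime.
\[
\left\| p_i\bigl(q(t)\bigr) - o_j(t) \right\| \;\ge\; d_{\mathrm{safe}},
\qquad \forall i, \; \forall j, \; \forall t
\]
```

where \(q(t)\) is the robot configuration, \(p_i\) are control points on the robot structure, \(o_j\) are obstacle positions, and \(d_{\mathrm{safe}}\) is a minimum separation distance.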
On the basis of these safety constraints, several safety-oriented control strategies and trajectory generation algorithms have been developed to exploit the information acquired from the perception system. Finally, the problem of safety in physical HRI has also been investigated, with a particular focus on Lead-Through Programming (LTP).
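As a simplified illustration of how such safety-oriented strategies can exploit perception data, the sketch below scales the robot's commanded speed by the measured distance to the closest perceived human, in the spirit of speed-and-separation monitoring. All names and thresholds here are hypothetical, not taken from the thesis.

```python
def velocity_scale(distance_m: float, d_stop: float = 0.5, d_slow: float = 1.5) -> float:
    """Return a velocity scaling factor in [0, 1].

    distance_m: distance (in metres) from the robot to the closest
                perceived human, as estimated by the perception system.
    d_stop:     protective separation below which the robot must stop.
    d_slow:     distance above which the robot may move at full speed.
    (Both thresholds are illustrative placeholders.)
    """
    if distance_m <= d_stop:
        return 0.0          # inside the protective distance: stop
    if distance_m >= d_slow:
        return 1.0          # far from the human: full commanded speed
    # linear ramp between the stop and full-speed thresholds
    return (distance_m - d_stop) / (d_slow - d_stop)
```

At each control cycle, the commanded joint or Cartesian velocity would be multiplied by this factor, so the robot slows down smoothly as a human approaches and halts inside the protective separation.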
| File | Description | Size | Format |
|---|---|---|---|
| 2016_02_PhD_Ragaglia.pdf (Open Access since 26/01/2017) | Thesis text | 42.2 MB | Adobe PDF |
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/117804