The thesis focuses on the application of OpenPose, an open-source computer vision library based on deep learning algorithms, to the monitoring of manual operations. In a first stage, data processing algorithms were developed to fully reconstruct body motion from OpenPose keypoints, turning ordinary cameras into sensors. These algorithms were then coupled with stereo vision libraries, enabling a 3D reconstruction of the scene. The result of this stage is a new tool for the analysis of human motion, used in the second part of the project. The second part presents a methodology for the assessment of assembly sequences based on deep neural networks. This approach connects human tasks to digital tools, providing live feedback to the worker while increasing the reliability and quality of manual processes. The proposed network architecture proved to be a good solution to the problem under study, reaching a sequence-to-sequence classification accuracy of 94.4% and a sequence-to-label classification accuracy of 98.3%. Finally, the methodology was validated on a real operation, obtaining a sequence-to-sequence classification accuracy of 89.7%.
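The abstract describes coupling OpenPose 2D keypoints with stereo vision to obtain a 3D reconstruction of body motion. The thesis's own implementation is not reproduced here, but the standard building block for this step is linear (DLT) triangulation of a matched keypoint seen by two calibrated cameras. A minimal sketch, assuming known 3x4 projection matrices for the two cameras (the function name and interface are illustrative, not the author's code):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one keypoint seen by two cameras.

    P1, P2 : 3x4 camera projection matrices.
    x1, x2 : 2D pixel coordinates of the same keypoint in each view.
    Returns the estimated 3D point in the world frame.
    """
    # Each image observation contributes two linear constraints on the
    # homogeneous 3D point X: x * (P[2] @ X) - (P[0] @ X) = 0, etc.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The least-squares solution is the right singular vector of A
    # with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

In practice this would be run per OpenPose keypoint and per frame, after calibrating the stereo pair; libraries such as OpenCV expose the same operation as `cv2.triangulatePoints`.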
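The abstract reports two distinct accuracy figures. Assuming the usual definitions for sequence classification, sequence-to-sequence accuracy is the fraction of individual time steps labelled correctly, while sequence-to-label accuracy is the fraction of whole sequences whose single aggregated label is correct. A short sketch of both metrics (illustrative only; the thesis may aggregate differently, e.g. with a dedicated output layer rather than a majority vote):

```python
from collections import Counter

def seq2seq_accuracy(pred_seqs, true_seqs):
    """Fraction of individual time steps classified correctly."""
    correct = total = 0
    for pred, true in zip(pred_seqs, true_seqs):
        correct += sum(p == t for p, t in zip(pred, true))
        total += len(true)
    return correct / total

def seq2label_accuracy(pred_seqs, true_labels):
    """Fraction of sequences whose aggregated label is correct.

    Here each sequence is reduced to one label by a majority vote
    over its per-time-step predictions (an illustrative choice).
    """
    hits = sum(Counter(pred).most_common(1)[0][0] == label
               for pred, label in zip(pred_seqs, true_labels))
    return hits / len(true_labels)
```

Under these definitions, sequence-to-label accuracy is typically higher than sequence-to-sequence accuracy (98.3% vs 94.4% in the thesis), since the aggregation step can absorb occasional per-frame misclassifications.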
A deep learning-based methodology for manual operations monitoring and control
GRANDE GIL, JUAN FERNANDO
2018/2019
Abstract
| File | Description | Size | Format | |
|---|---|---|---|---|
| 2019_04_Grande.pdf | Thesis text (accessible online only to authorized users) | 13.24 MB | Adobe PDF | Visualizza/Apri |
Documents in POLITesi are protected by copyright and all rights are reserved, unless otherwise indicated.
https://hdl.handle.net/10589/145584