Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/4793
Title: Deep Vision for Prosthetic Grasp
Authors: Ghazaei, Ghazal
Issue Date: 2019
Publisher: Newcastle University
Abstract: The loss of a hand limits an individual's natural ability to grasp and manipulate objects and affects their quality of life. Prosthetic hands can help users overcome these limitations and regain that ability. Despite considerable technical advances, however, the control of commercial hand prostheses is still limited to a few degrees of freedom, and switching a prosthetic hand into a desired grip mode can be tiring, so there is substantial room to improve their performance. The main aim of this thesis is to improve the functionality, performance and flexibility of current hand prostheses by augmenting commercial hand prostheses with a vision module. By giving the prosthesis the capacity to see objects, appropriate grip modes can be determined autonomously and quickly. Several deep learning-based approaches were designed in this thesis to realise such a vision-reinforced prosthetic system. Importantly, the user interacting with this learning structure may act as a supervisor, accepting or correcting the suggested grasp. Amputee participants evaluated the designed system and provided feedback. The following objectives for prosthetic hands were met:
1. Chapter 3: Design, implementation and real-time testing of a semi-autonomous vision-reinforced prosthetic control structure, empowered with a baseline convolutional neural network deep learning structure.
2. Chapter 4: Development of an advanced deep learning structure to simultaneously detect unknown objects and estimate their grasp maps, in the presence of ambiguity.
3. Chapter 5: Design and development of several deep learning set-ups for concurrent prediction of depth and grasp maps as well as human grasp types.
Publicly available datasets of common graspable objects, namely the Amsterdam Library of Object Images (ALOI) and the Cornell grasp library, were used within this thesis.
Moreover, to have access to real data, a small dataset of household objects, the Newcastle Grasp Library, was gathered for the experiments.
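The semi-autonomous control structure described above, in which the vision module proposes a grip mode and the user acts as a supervisor who may accept or correct it, can be sketched as a minimal decision loop. This is an illustrative sketch only, not the thesis implementation; all names (`GRIP_MODES`, `suggest_grip`, `control_loop`) and the example grip modes are assumptions introduced here.

```python
# Hypothetical sketch of a vision-reinforced, user-supervised grip selection
# loop. The grip-mode vocabulary and function names are illustrative.
GRIP_MODES = ["palmar", "pinch", "tripod", "lateral"]

def suggest_grip(class_scores):
    """Return the grip mode with the highest classifier confidence.

    class_scores: dict mapping grip mode -> confidence in [0, 1],
    e.g. the softmax output of a CNN looking at the target object.
    """
    return max(class_scores, key=class_scores.get)

def control_loop(class_scores, user_override=None):
    """Return the grip mode to execute on the prosthesis.

    The vision module proposes a grip autonomously; if the user supplies
    a valid override, the supervision signal wins and the user's
    correction is executed instead of the proposal.
    """
    proposal = suggest_grip(class_scores)
    if user_override is not None and user_override in GRIP_MODES:
        return user_override
    return proposal

# Example: the classifier favours "pinch"; the user either accepts the
# proposal (no override) or corrects it to "lateral".
scores = {"palmar": 0.1, "pinch": 0.7, "tripod": 0.15, "lateral": 0.05}
accepted = control_loop(scores)                          # -> "pinch"
corrected = control_loop(scores, user_override="lateral")  # -> "lateral"
print(accepted, corrected)
```

The point of the structure is that autonomy only sets the default: the proposal is executed quickly when the user does nothing, while a correction always takes priority.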
Description: PhD Thesis
URI: http://theses.ncl.ac.uk/jspui/handle/10443/4793
Appears in Collections:School of Engineering

Files in This Item:
File               Description  Size      Format
Ghazaei G 2019.pdf  Thesis      10.82 MB  Adobe PDF
dspacelicence.pdf   Licence     43.82 kB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.