Please use this identifier to cite or link to this item: http://theses.ncl.ac.uk/jspui/handle/10443/2632
Full metadata record
DC Field | Value | Language
dc.contributor.author | Hamouda, Wafya | -
dc.date.accessioned | 2015-04-28T13:48:20Z | -
dc.date.available | 2015-04-28T13:48:20Z | -
dc.date.issued | 2014 | -
dc.identifier.uri | http://hdl.handle.net/10443/2632 | -
dc.description | PhD Thesis | en_US
dc.description.abstract | In the age of the internet, email, and social media there is an increasing need for processing online information, for example, to support education and business. This has led to the rapid development of natural language processing technologies such as computational linguistics, information retrieval, and data mining. As a branch of computational linguistics, anaphora resolution has attracted much interest. This is reflected in the large number of papers on the topic published in journals such as Computational Linguistics. Mitkov (2002) and Ji et al. (2005) have argued that the overall quality of anaphora resolution systems remains low, despite practical advances in the area, and that major challenges include dealing with real-world knowledge and accurate parsing. This thesis investigates the following research question: can an algorithm be found for the resolution of the anaphor nafs in Arabic text which is accurate to at least 90%, scales linearly with text size, and requires a minimum of knowledge resources? A resolution algorithm intended to satisfy these criteria is proposed. Testing on a corpus of contemporary Arabic shows that it does indeed satisfy the criteria. | en_US
dc.description.sponsorship | Egyptian Government | en_US
dc.language.iso | en | en_US
dc.publisher | Newcastle University | en_US
dc.title | Anaphora resolution for Arabic machine translation: a case study of nafs | en_US
dc.type | Thesis | en_US
Appears in Collections: School of English Literature, Language and Linguistics

Files in This Item:
File | Description | Size | Format
Hamouda 14.pdf | Thesis | 2.08 MB | Adobe PDF
dspacelicence.pdf | Licence | 43.82 kB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.