Please use this identifier to cite or link to this item: http://ir.juit.ac.in:8080/jspui/jspui/handle/123456789/9841
Full metadata record
DC Field | Value | Language
dc.contributor.author | Agnihotri, Aryan | -
dc.contributor.author | Agarwal, Tanmay | -
dc.contributor.author | Gandotra, Ekta | -
dc.date.accessioned | 2023-09-02T11:42:17Z | -
dc.date.available | 2023-09-02T11:42:17Z | -
dc.date.issued | 2023 | -
dc.identifier.uri | http://ir.juit.ac.in:8080/jspui/jspui/handle/123456789/9841 | -
dc.description | Enrolment No. 191401, 191416 | en_US
dc.description.abstract | Cataracts are the leading cause of reversible blindness and visual impairment, and cataract surgery, the only treatment for cataract, is among the oldest and most commonly performed surgeries in the world. As of 2022, approximately 1 billion people worldwide live with blindness or vision impairment from conditions including cataracts (95 million), glaucoma (7.7 million), and refractive errors (84.4 million). In addition, 1 to 2 million people go blind every year; worldwide, someone goes blind every five seconds and a child every minute, and in 75% of these cases the blindness is treatable or preventable. Deep learning convolutional neural networks (CNNs) used for pattern recognition can now help automate image classification. This study proposes to minimize loss and increase the accuracy of cataract identification by training over alternating numbers of epochs (a sketch of such a training loop follows this record). The results show that the number of epochs affects both the accuracy and the loss of a convolutional neural network; in this study, training for 51 epochs gave the highest accuracy, 98%. | en_US
dc.language.iso | en_US | en_US
dc.publisher | Jaypee University of Information Technology, Solan, H.P. | en_US
dc.subject | Cataract | en_US
dc.subject | Machine learning | en_US
dc.title | Cataract Detection using Machine Learning | en_US
dc.type | Project Report | en_US
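
The abstract describes training a CNN image classifier over alternating epoch counts and comparing accuracy and loss. Below is a minimal, hypothetical sketch of such an experiment, not the report's actual code: it assumes a Keras-style CNN and 224x224 RGB eye images arranged in class subfolders under data/train and data/val; the directory names, architecture, and hyperparameters are all illustrative assumptions.

# Minimal sketch of the epoch-variation experiment described in the abstract.
# Assumptions (not from the report): Keras CNN, binary labels (cataract vs.
# normal), images in class subfolders under data/train and data/val.
import tensorflow as tf

IMG_SIZE = (224, 224)
BATCH = 32

# Load images from hypothetical directories; each class in its own subfolder.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=BATCH, label_mode="binary")
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=BATCH, label_mode="binary")

def build_cnn():
    # A small CNN: conv/pool blocks followed by a dense head, as is typical
    # for binary image classification.
    return tf.keras.Sequential([
        tf.keras.layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 3)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

# Alternate the number of training epochs and record validation accuracy and
# loss; 51 epochs is the value the abstract reports as best. The other epoch
# counts here are illustrative.
for epochs in (10, 25, 51):
    model = build_cnn()
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(train_ds, validation_data=val_ds,
                        epochs=epochs, verbose=0)
    print(f"{epochs} epochs: val_acc={history.history['val_accuracy'][-1]:.3f}, "
          f"val_loss={history.history['val_loss'][-1]:.3f}")

Comparing the final validation metrics across runs gives the kind of epoch-versus-accuracy comparison the abstract reports.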
Appears in Collections: B.Tech. Project Reports

Files in This Item:
File | Description | Size | Format
Cataract Detection using Machine Learning.pdf | | 2.43 MB | Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.