Crowdsourcing as a screening tool to detect clinical features of glaucomatous optic neuropathy from digital photography.
**Aim:** Crowdsourcing is the process of simplifying and outsourcing numerous tasks to many untrained individuals. Our aim was to assess the performance and repeatability of crowdsourcing in the classification of normal and glaucomatous discs from optic disc images.

**Methods:** Optic disc images (N = 127) with pre-determined disease status were selected by consensus agreement of grading experts from a large cohort study. After reading brief illustrative instructions, knowledge workers (KWs) on a crowdsourcing platform (Amazon MTurk) were asked to classify each image as normal or abnormal. Each image was classified 20 times by different KWs. Two study designs were examined to assess the effect of varying KW experience, and both designs were run twice to check consistency. Performance was assessed by comparing sensitivity, specificity and the area under the receiver operating characteristic curve (AUC).

**Results:** Overall, 2,540 classifications were received in under 24 hours at minimal cost. Sensitivity ranged from 83% to 88% across both trials and study designs; specificity, however, was poor, ranging from 35% to 43%. The highest AUC (95% CI) was 0.64 (0.62-0.66) in trial 1 and 0.63 (0.61-0.65) in trial 2. There were no significant differences between study designs or between trials.

**Conclusions:** Crowdsourcing is a cost-effective method of image analysis with good repeatability and high sensitivity. Variables such as reward schemes, mode of image presentation, expanded response options and the incorporation of training modules should be examined to determine their effect on the accuracy and reliability of this technique in retinal image analysis.
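The abstract does not state how the 20 classifications per image were aggregated into the reported sensitivity, specificity and AUC. The sketch below is not the authors' code: the vote counts are simulated and the majority-vote threshold is a hypothetical choice, but it illustrates one plausible way such metrics are computed from per-image crowd votes.

```python
# Minimal sketch, assuming simulated data and a majority-vote rule (neither is
# from the study): aggregate crowd votes per image, then estimate sensitivity,
# specificity and AUC as the abstract describes.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_images = 127                     # the study used 127 optic disc images
votes_per_image = 20               # each image was classified 20 times

# Hypothetical ground truth: 1 = glaucomatous/abnormal, 0 = normal.
truth = rng.integers(0, 2, size=n_images)

# Hypothetical crowd behaviour: abnormal images attract more "abnormal" votes.
# Rates chosen to mirror the reported pattern (high sensitivity, low specificity).
p_abnormal_vote = np.where(truth == 1, 0.85, 0.60)
abnormal_votes = rng.binomial(votes_per_image, p_abnormal_vote)

# Continuous score per image: fraction of KWs calling it abnormal.
score = abnormal_votes / votes_per_image

# Binary call by majority vote, then sensitivity and specificity.
call = (score >= 0.5).astype(int)
tp = np.sum((call == 1) & (truth == 1))
fn = np.sum((call == 0) & (truth == 1))
tn = np.sum((call == 0) & (truth == 0))
fp = np.sum((call == 1) & (truth == 0))
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# AUC computed over the vote-fraction score.
auc = roc_auc_score(truth, score)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f} auc={auc:.2f}")
```

Treating the fraction of "abnormal" votes as a continuous score is what allows a full ROC curve, rather than a single operating point, to be traced from binary classifications.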
| Main Authors: | Danny Mitry, Tunde Peto, Shabina Hayat, Peter Blows, James Morgan, Kay-Tee Khaw, Paul J Foster |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Public Library of Science (PLoS), 2015-01-01 |
| Series: | PLoS ONE |
| Online Access: | https://doi.org/10.1371/journal.pone.0117401 |
| Record ID: | doaj-art-6ed13e12aa874961be2e37d4dfebf36c |
|---|---|
| Collection: | DOAJ |
| Institution: | Kabale University |
| ISSN: | 1932-6203 |
| Citation: | PLoS ONE, vol. 10, no. 2, e0117401 (2015). doi:10.1371/journal.pone.0117401 |