A COMPARATIVE ANALYSIS OF DEEP TRANSFER LEARNING TECHNIQUES FOR MAMMOGRAPHIC IMAGE CLASSIFICATION

Bibliographic Details
Main Authors: Bhavesh Gupta, Akshay Singh, Anjana Gosain
Format: Article
Language: English
Published: University of Kragujevac, 2024-12-01
Series: Proceedings on Engineering Sciences
Subjects:
Online Access: https://pesjournal.net/journal/v6-n4/51.pdf
Description
Summary: Among all newly diagnosed cancer cases, breast cancer leads in incidence, followed by prostate and lung cancer. It also has the highest cure rate when diagnosed early, extending the lives not only of women but also of the smaller number of affected men. To support early diagnosis, deep learning models pre-trained on the ImageNet database are partially retrained (fine-tuned) on small mammography image databases, enabling diagnosis without the need for large datasets or tissue analysis (biopsy). The pre-trained convolutional neural network models VGG-16, VGG-19, ResNet50, and Inception V3 are applied as deep transfer learning on two databases: the Mammographic Image Analysis Society (MIAS) database, containing 321 images, and the Chinese Mammography Database (CMMD), containing 3,744 mammograms, of which 2,000 images are used for training. The models are evaluated on accuracy, precision, recall, and F1-score. On the MIAS database, VGG-19 performed best, with 98.44% accuracy and precision, recall, and F1-score of 0.99 each. On CMMD, VGG-16 performed best, with 99.50% accuracy, precision of 1.0, recall of 0.99, and F1-score of 0.99.
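The evaluation metrics reported above follow the standard binary-classification definitions computed from a confusion matrix. A minimal sketch, using hypothetical counts for illustration only (not the paper's actual confusion matrices):

```python
def binary_metrics(tp, fp, fn, tn):
    """Compute the four metrics used in the study from a binary
    confusion matrix: tp/fp/fn/tn are raw counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)          # of predicted positives, how many were correct
    recall = tp / (tp + fn)             # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1

# Illustrative counts (assumed, not taken from the article):
acc, prec, rec, f1 = binary_metrics(tp=95, fp=2, fn=3, tn=100)
print(f"accuracy={acc:.4f} precision={prec:.4f} recall={rec:.4f} f1={f1:.4f}")
```

In medical screening, recall (sensitivity) is often weighted most heavily, since a false negative means a missed cancer, which is why the study reports recall alongside overall accuracy.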
ISSN: 2620-2832
2683-4111