Thesis Information

Document type
Thesis (dissertation)

Author
Gelan Ayana (Kumoh National Institute of Technology, Graduate School of Kumoh National Institute of Technology)

Advisor
Se-woon Choe

Year of publication
2023

Copyright
Theses from Kumoh National Institute of Technology are protected by copyright.


Abstract

Despite advances in screening technologies and awareness campaigns, early diagnosis of breast cancer remains challenging. Mammography, the main screening tool, has limitations in detecting small tumors, particularly in women with dense breast tissue, and can yield false-positive results. Ultrasound and biopsy are alternative diagnostic tools that offer valuable additional information for early breast cancer diagnosis. However, they also face challenges such as operator dependency, false-positive and false-negative results, invasiveness, cost, and limited detection in certain cases. To address these challenges, this thesis developed a robust transfer learning method based on convolutional neural networks and vision transformers for breast cancer detection from multiple imaging modalities. This thesis presents important contributions, including patchless multistage transfer learning for mammogram images, a novel multistage transfer learning approach for breast ultrasound images, and vision transformer-based transfer learning for human epidermal growth factor receptor 2 (HER2) expression staging. The results demonstrated that the developed methods improved breast cancer diagnosis. Moreover, the proposed approaches offer cost-effective solutions and the potential to increase global access to accurate early breast cancer diagnosis. These findings have significant implications for the early diagnosis of breast cancer, particularly in women with dense breasts and in resource-limited settings.
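The abstract describes the multistage transfer learning approach only at a high level. As a rough illustration, the sketch below shows one way such a pipeline could be wired up in PyTorch: an ImageNet-pretrained backbone is fine-tuned first on an intermediate dataset and then on the target mammogram dataset. The backbone choice (ResNet-50), dataset paths, class counts, and training settings are placeholders chosen for illustration, not values taken from the thesis.

# Minimal sketch of a multistage transfer learning pipeline (illustrative only).
# Stage 1: start from an ImageNet-pretrained backbone.
# Stage 2: fine-tune on an intermediate medical-imaging dataset.
# Stage 3: fine-tune on the target dataset (e.g., mammograms, benign vs. malignant).
# Dataset paths, epoch counts, and learning rates are placeholders, not thesis settings.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

def make_loader(root: str, batch_size: int = 32) -> DataLoader:
    # Basic preprocessing: resize to the backbone's expected input size.
    tfm = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    return DataLoader(datasets.ImageFolder(root, tfm), batch_size=batch_size, shuffle=True)

def finetune(model: nn.Module, loader: DataLoader, num_classes: int,
             epochs: int = 5, lr: float = 1e-4) -> nn.Module:
    # Replace the classification head for the current stage's label space,
    # then update all weights on that stage's data.
    model.fc = nn.Linear(model.fc.in_features, num_classes)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
    return model

# Stage 1: ImageNet-pretrained weights.
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
# Stage 2: intermediate domain (hypothetical path and class count).
backbone = finetune(backbone, make_loader("data/intermediate_domain"), num_classes=2)
# Stage 3: target domain, e.g., mammograms labeled benign/malignant (hypothetical path).
model = finetune(backbone, make_loader("data/mammograms"), num_classes=2)

A vision transformer backbone could be substituted for the ResNet by swapping the model constructor and its classification head; the staged fine-tuning structure stays the same.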

Table of Contents

[List of Tables] i
[List of Figures] iii
[List of Abbreviations] v
[Bibliographic Notes] vii
Chapter I Introduction 1
1.1 Patchless multistage transfer learning for mammogram images 8
1.2 A novel multistage transfer learning for breast ultrasound images 9
1.3 HER2 X-transformer 11
Chapter II Patchless multistage transfer learning for mammogram images 14
2.1 Introduction 14
2.2 Methods 21
2.2.1 Datasets 21
2.2.2 Preprocessing 26
2.2.3 The multistage transfer learning method 28
2.2.4 Evaluation metrics 40
2.3 Results 41
2.3.1 Convolutional neural network-based multistage transfer learning model for breast mammograms 41
2.3.2 Vision transformer-based multistage transfer learning model for breast mammograms 47
2.4 Discussion 57
Chapter III A novel multistage transfer learning for breast ultrasound images 68
3.1 Introduction 68
3.2 Methods 75
3.2.1 Datasets 75
3.2.2 Preprocessing 77
3.2.3 The multistage transfer learning method 79
3.2.4 Evaluation metrics 87
3.3 Results 88
3.3.1 The convolutional neural network-based multistage transfer learning for breast ultrasound 88
3.3.2 The vision transformer-based multistage transfer learning for breast ultrasound 93
3.4 Discussion 100
Chapter IV HER2 X-transformer: Vision transformers for breast cancer human epidermal growth factor receptor 2 (HER2) expression staging without immunohistochemical (IHC) staining 105
4.1 Introduction 105
4.1.1 Related works 107
4.2 Methods 109
4.2.1 The proposed model 109
4.2.2 Localization module 112
4.2.3 Attention module 113
4.2.4 Loss module 114
4.2.5 Dataset 116
4.2.6 Implementation details 117
4.2.7 Evaluation metrics 118
4.3 Results 118
4.3.1 Experimental settings 118
4.3.2 Experimental results 119
4.4 Discussion 124
Chapter V Conclusion 128
[References] 130
[Acknowledgement] 165
