Title:
SYSTEM, METHOD AND COMPUTER-ACCESSIBLE MEDIUM FOR A PATIENT SELECTION FOR A DUCTAL CARCINOMA IN SITU OBSERVATION AND DETERMINATIONS OF ACTIONS BASED ON THE SAME
Document Type and Number:
WIPO Patent Application WO/2019/222675
Kind Code:
A1
Abstract:
An exemplary system, method and computer-accessible medium for determining ductal carcinoma in situ (DCIS) information regarding a patient(s) can include, for example, receiving image(s) of internal portion(s) of a breast of the patient(s), and automatically determining the DCIS information by applying a neural network(s) to the image(s). The DCIS information can include predicting (i) pure DCIS or (ii) DCIS with invasion. Input information of the patient(s) can be selected for a DCIS observation for determining the DCIS information. The image(s) can be a mammographic image(s). The image(s) can be one of a magnetic resonance image or a computer tomography image.

Inventors:
HA RICHARD (US)
Application Number:
PCT/US2019/032946
Publication Date:
November 21, 2019
Filing Date:
May 17, 2019
Assignee:
UNIV COLUMBIA (US)
HA RICHARD (US)
International Classes:
A61B5/055; A61B6/03
Other References:
SHI, BIBO ET AL.: "Prediction of Occult Invasive Disease in Ductal Carcinoma in Situ Using Deep Learning Features", JOURNAL OF THE AMERICAN COLLEGE OF RADIOLOGY, vol. 15, no. 3PB, March 2018 (2018-03-01), pages 527 - 534, XP085355869
BHOOSHAN, NEHA ET AL.: "Cancerous Breast Lesions on Dynamic Contrast-enhanced MR Images: Computerized Characterization for Image-based Prognostic Markers", RADIOLOGY, vol. 254, no. 3, March 2010 (2010-03-01), pages 680 - 690, XP055311371
ZHU, ZHE ET AL.: "Deep Learning Analysis of Breast MRIs for Prediction of Occult Invasive Disease in Ductal Carcinoma in situ", ARXIV, 28 November 2017 (2017-11-28), XP081297966, Retrieved from the Internet
Attorney, Agent or Firm:
ABELEV, Gary (US)
Claims:
WHAT IS CLAIMED IS:

1. A non-transitory computer-accessible medium having stored thereon computer-executable instructions for determining ductal carcinoma in situ (DCIS) information regarding at least one patient, wherein, when a computer arrangement executes the instructions, the computer arrangement is configured to perform procedures comprising:

receiving at least one image of at least one internal portion of a breast of the at least one patient; and

automatically determining the DCIS information by applying at least one neural network to the at least one image.

2. The computer-accessible medium of claim 1, wherein the DCIS information includes predicting (i) pure DCIS or (ii) DCIS with invasion.

3. The computer-accessible medium of claim 1, wherein the computer arrangement is further configured to select input information of the at least one patient for a DCIS observation for determining the DCIS information.

4. The computer-accessible medium of claim 1, wherein the at least one image is at least one mammographic image.

5. The computer-accessible medium of claim 1, wherein the at least one image is one of a magnetic resonance image or a computer tomography image.

6. The computer-accessible medium of claim 1, wherein the at least one image contains at least one calcification.

7. The computer-accessible medium of claim 1, wherein the computer arrangement is further configured to segment and resize the at least one image.

8. The computer-accessible medium of claim 7, wherein the computer arrangement is further configured to center the at least one image using a histogram-based z score normalization of non-air pixel intensity values.

9. The computer-accessible medium of claim 1, wherein the computer arrangement is further configured to at least one of (i) randomly flip the at least one image, (ii) randomly rotate the at least one image, or (iii) randomly crop the at least one image.

10. The computer-accessible medium of claim 1, wherein the computer arrangement is further configured to apply a random affine shear to the at least one image.

11. The computer-accessible medium of claim 1, wherein the at least one neural network is a convolutional neural network (CNN).

12. The computer-accessible medium of claim 11, wherein the CNN includes a plurality of layers.

13. The computer-accessible medium of claim 12, wherein the CNN includes 15 hidden layers.

14. The computer-accessible medium of claim 12, wherein the CNN includes five residual layers.

15. The computer-accessible medium of claim 12, wherein the CNN includes at least one inception style layer after a ninth hidden layer.

16. The computer-accessible medium of claim 12, wherein the CNN includes at least one fully connected layer after a 13th layer thereof.

17. The computer-accessible medium of claim 16, wherein the at least one fully connected layer includes 16 neurons.

18. The computer-accessible medium of claim 12, wherein the CNN includes at least one linear layer after a 13th layer.

19. The computer-accessible medium of claim 18, wherein the at least one linear layer includes 8 neurons.

20. The computer-accessible medium of claim 1, wherein the computer arrangement is further configured to determine what action to perform or whether to perform any action based on the determined DCIS information.

21. A method for determining ductal carcinoma in situ (DCIS) information regarding at least one patient, comprising:

receiving at least one image of at least one internal portion of a breast of the at least one patient; and

using a computer hardware arrangement, automatically determining the DCIS information by applying at least one neural network to the at least one image.

22. The method of claim 21, wherein the DCIS information includes predicting (i) pure DCIS or (ii) DCIS with invasion.

23. The method of claim 21, further comprising selecting input information of the at least one patient for a DCIS observation for determining the DCIS information.

24. The method of claim 21, wherein the at least one image is at least one mammographic image.

25. The method of claim 21, wherein the at least one image is one of a magnetic resonance image or a computer tomography image.

26. The method of claim 21, wherein the at least one image contains at least one calcification.

27. The method of claim 21, further comprising segmenting and resizing the at least one image.

28. The method of claim 27, further comprising centering the at least one image using a histogram-based z score normalization of non-air pixel intensity values.

29. The method of claim 21, further comprising at least one of (i) randomly flipping the at least one image, (ii) randomly rotating the at least one image, or (iii) randomly cropping the at least one image.

30. The method of claim 21, further comprising applying a random affine shear to the at least one image.

31. The method of claim 21, wherein the at least one neural network is a convolutional neural network (CNN).

32. The method of claim 31, wherein the CNN includes a plurality of layers.

33. The method of claim 32, wherein the CNN includes 15 hidden layers.

34. The method of claim 32, wherein the CNN includes five residual layers.

35. The method of claim 32, wherein the CNN includes at least one inception style layer after a ninth hidden layer.

36. The method of claim 32, wherein the CNN includes at least one fully connected layer after a 13th layer thereof.

37. The method of claim 36, wherein the at least one fully connected layer includes 16 neurons.

38. The method of claim 32, wherein the CNN includes at least one linear layer after a 13th layer.

39. The method of claim 38, wherein the at least one linear layer includes 8 neurons.

40. The method of claim 21, further comprising determining what action to perform or whether to perform any action based on the determined DCIS information.

41. A system for determining ductal carcinoma in situ (DCIS) information regarding at least one patient, comprising:

a computer hardware arrangement configured to:

receive at least one image of at least one internal portion of a breast of the at least one patient; and

automatically determine the DCIS information by applying at least one neural network to the at least one image.

42. The system of claim 41, wherein the DCIS information includes predicting (i) pure DCIS or (ii) DCIS with invasion.

43. The system of claim 41, wherein the computer arrangement is further configured to select input information of the at least one patient for a DCIS observation for determining the DCIS information.

44. The system of claim 41, wherein the at least one image is at least one mammographic image.

45. The system of claim 41, wherein the at least one image is one of a magnetic resonance image or a computer tomography image.

46. The system of claim 41, wherein the at least one image contains at least one calcification.

47. The system of claim 41 , wherein the computer arrangement is further configured to segment and resize the at least one image.

48. The system of claim 47, wherein the computer arrangement is further configured to center the at least one image using a histogram-based z score normalization of non-air pixel intensity values.

49. The system of claim 41, wherein the computer arrangement is further configured to at least one of (i) randomly flip the at least one image, (ii) randomly rotate the at least one image, or (iii) randomly crop the at least one image.

50. The system of claim 41, wherein the computer arrangement is further configured to apply a random affine shear to the at least one image.

51. The system of claim 41, wherein the at least one neural network is a convolutional neural network (CNN).

52. The system of claim 51, wherein the CNN includes a plurality of layers.

53. The system of claim 52, wherein the CNN includes 15 hidden layers.

54. The system of claim 52, wherein the CNN includes five residual layers.

55. The system of claim 52, wherein the CNN includes at least one inception style layer after a ninth hidden layer.

56. The system of claim 52, wherein the CNN includes at least one fully connected layer after a 13th layer thereof.

57. The system of claim 56, wherein the at least one fully connected layer includes 16 neurons.

58. The system of claim 52, wherein the CNN includes at least one linear layer after a 13th layer.

59. The system of claim 58, wherein the at least one linear layer includes 8 neurons.

60. The system of claim 41, wherein the computer arrangement is further configured to determine what action to perform or whether to perform any action based on the determined DCIS information.

Description:
SYSTEM, METHOD AND COMPUTER-ACCESSIBLE MEDIUM FOR A PATIENT SELECTION FOR A DUCTAL CARCINOMA IN SITU OBSERVATION AND DETERMINATIONS OF ACTIONS BASED ON THE SAME

CROSS-REFERENCE TO RELATED APPLICATION(S)

[0001] This application relates to and claims priority from U.S. Patent Application No. 62/672,945, filed on May 17, 2018, the entire disclosure of which is incorporated herein by reference.

FIELD OF THE DISCLOSURE

[0002] The present disclosure relates generally to Ductal Carcinoma observation and/or determination, and more specifically, to exemplary embodiments of an exemplary system, method and computer-accessible medium for patient selection for Ductal Carcinoma in Situ observation and/or determination of possible actions based on the same.

BACKGROUND INFORMATION

[0003] Attempts to minimize over-diagnoses and treatment of Ductal Carcinoma in Situ (“DCIS”) have led to clinical trials of observing patients with DCIS instead of surgery. Despite careful selection of “low risk” DCIS patients, occult invasive cancers can occur in a significant number of these patients.

[0004] Thus, it may be beneficial to provide an exemplary system, method and computer-accessible medium for patient selection for ductal carcinoma in situ observation and/or determination of possible actions based on the same which can overcome at least some of the deficiencies described herein above.

SUMMARY OF EXEMPLARY EMBODIMENTS

[0005] An exemplary system, method and computer-accessible medium for determining ductal carcinoma in situ (DCIS) information regarding a patient(s) can include, for example, receiving image(s) of internal portion(s) of a breast of the patient(s), and automatically determining the DCIS information by applying a neural network(s) to the image(s). The DCIS information can include predicting (i) pure DCIS or (ii) DCIS with invasion. Input information of the patient(s) can be selected for a DCIS observation for determining the DCIS information. The image(s) can be a mammographic image(s). The image(s) can be one of a magnetic resonance image or a computer tomography image.

[0006] In some exemplary embodiments of the present disclosure, the image(s) can contain a calcification(s). The image can be segmented and/or resized. The image can be centered using a histogram-based z score normalization of non-air pixel intensity values. The image(s) can be (i) randomly flipped, (ii) randomly rotated, or (iii) randomly cropped. A random affine shear can be applied to the image(s). The neural network(s) can be a convolutional neural network (CNN). The CNN can include a plurality of layers. The CNN can include 15 hidden layers. The CNN can include five residual layers. The CNN can include an inception style layer(s) after a ninth hidden layer. The CNN can include a fully connected layer(s) after a 13th layer thereof. The fully connected layer(s) can include 16 neurons. The CNN can include a linear layer(s) after a 13th layer. The linear layer(s) can include 8 neurons. A determination can be made as to what action to perform or whether to perform any action based on the determined DCIS information.

[0007] These and other objects, features and advantages of the exemplary embodiments of the present disclosure will become apparent upon reading the following detailed description of the exemplary embodiments of the present disclosure, when taken in conjunction with the appended claims.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] Further objects, features and advantages of the present disclosure will become apparent from the following detailed description taken in conjunction with the accompanying Figures showing illustrative embodiments of the present disclosure, in which:

[0009] Figures 1A-1C are exemplary input images for the exemplary convolutional neural network of patients with DCIS according to an exemplary embodiment of the present disclosure;

[0010] Figure 2 is an exemplary diagram of the exemplary convolutional neural network according to an exemplary embodiment of the present disclosure;

[0011] Figure 3 is an exemplary flow diagram of an exemplary method for determining DCIS information regarding a patient according to an exemplary embodiment of the present disclosure; and

[0012] Figure 4 is an illustration of an exemplary block diagram of an exemplary system in accordance with certain exemplary embodiments of the present disclosure.

[0013] Throughout the drawings, the same reference numerals and characters, unless otherwise stated, are used to denote like features, elements, components or portions of the illustrated embodiments. Moreover, while the present disclosure will now be described in detail with reference to the figures, it is done so in connection with the illustrative embodiments and is not limited by the particular embodiments illustrated in the figures and the appended claims.

DETAILED DESCRIPTION

Exemplary Definitions

[0014] Convolutional neural networks: Convolutional neural networks can be, but not limited to, networks composed of neurons with learnable weights and biases. Raw data (e.g., an image) is input into the machine, which encodes defining characteristics into the network architecture. Each neuron receives multiple inputs, calculates a weighted sum that goes through an activation function, and creates an output.

[0015] Convolutional layer: The convolutional layer can apply a filter that slides over the entire image to calculate the dot product of each particular region. In this procedure, one image can become a stack of filtered images.

[0016] Pooling layer: The pooling layer can reduce the spatial size of each feature map. Maximum pooling can apply a filter that slides over the entire image and keeps only the maximum value for each particular region.

[0017] Rectified linear units: Rectified linear units can be, but not limited to, computation units that perform normalization of the stack of images; in a rectified linear unit, for example, all negative values can be changed to zero.
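As an illustration of the convolution, pooling, and rectified linear unit operations defined above, consider the following minimal NumPy sketch. The 4x4 image, 2x2 filter, and pooling size are illustrative toys, not values from the disclosure.

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide a filter over the image and take the dot product of each region
    (a 'valid' cross-correlation, as in a convolutional layer)."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    """Keep only the maximum value of each size x size region,
    reducing the spatial size of the feature map."""
    h = feature_map.shape[0] // size
    w = feature_map.shape[1] // size
    return feature_map[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

def relu(x):
    """Rectified linear unit: change all negative values to zero."""
    return np.maximum(x, 0.0)

image = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 "image"
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])      # toy 2x2 filter
features = relu(max_pool(convolve2d(image, kernel)))
```

In a real CNN, one image becomes a stack of such filtered maps, one per learned filter, with pooling and rectification applied to each.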

[0018] Inception layer: The inception layer can reduce the computation burden by making use of dual computational layers.

[0019] Fully connected layer: In the fully connected layer, as an example, every feature value from the created stack of filtered images can have a weighted output, which can be averaged to create a prediction.

[0020] Back propagation: In back propagation, the error of the final prediction can be calculated, and can be used to adjust each feature value to improve future predictions.

[0021] Dropout: Dropout can be, but not limited to, a regularization procedure used to reduce overfitting of the network by preventing coadaptation of training data. Dropout randomly selects neurons to be ignored during training.

[0022] L2 regularization: L2 regularization can be, but not limited to, a regularization procedure used to reduce overfitting by decreasing the weighted value of features to simplify the model.
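A minimal sketch of these two regularization procedures, assuming a generic "inverted dropout" formulation and a simple sum-of-squares penalty; the dropout rate, weight values, and seed are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, rate=0.25):
    """Randomly select neurons to ignore during training: zero a fraction
    `rate` of units and rescale the survivors (inverted dropout)."""
    keep = rng.random(activations.shape) >= rate
    return activations * keep / (1.0 - rate)

def l2_penalty(weights, lam=0.1):
    """L2 regularization term: lam * sum of squared weights, added to the
    loss to shrink weight magnitudes and simplify the model."""
    return lam * np.sum(weights ** 2)

acts = np.ones(1000)
dropped = dropout(acts, rate=0.25)                 # ~25% of units zeroed
penalty = l2_penalty(np.array([3.0, -4.0]))        # 0.1 * (9 + 16) = 2.5
```

At inference time, dropout is disabled; the inverted rescaling keeps the expected activation magnitude the same in both modes.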

[0023] The exemplary system, method, and computer-accessible medium, according to an exemplary embodiment of the present disclosure, can utilize a convolutional neural network (“CNN”) for predicting patients with pure DCIS versus DCIS with invasion using, for example, mammographic images; however, it should be understood that other imaging modalities can be used.

Exemplary Procedures and Methods

[0024] A retrospective study utilizing the exemplary CNN was performed, which included 246 unique images from 123 patients. Additionally, 164 images in 82 patients diagnosed with DCIS by stereotactic-guided biopsy of calcifications without any upgrade at the time of surgical excision (e.g., pure DCIS group) were used. 82 images in 41 patients with mammographic calcifications yielding occult invasive carcinoma as the final upgraded diagnosis on surgery (e.g., occult invasive group) were used. Two mammographic magnification views (e.g., bilateral craniocaudal and mediolateral or lateromedial) of the calcifications were used for analysis. Calcifications were segmented using an exemplary 3D Slicer, which were then resized to fit a 128x128 pixel bounding box. A 15 hidden layer topology was used to implement the exemplary CNN. The exemplary network architecture included 5 residual layers and a dropout of 0.25 after each convolution. Cases were randomly separated into a training set (e.g., 80%) and a validation set (e.g., 20%).

Exemplary Data Preparation

[0025] An original pathology report was determined to be ground truth information and was used as the basis for dividing patients. Eighty percent of the available patients were randomly selected to develop the exemplary network, and the remaining 20% of patients were used to test the exemplary CNN.
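The patient-level 80/20 split described above can be sketched as follows; the seed and the helper name `split_patients` are illustrative assumptions, with the split performed on patient identifiers so that both images of a patient stay on the same side.

```python
import numpy as np

rng = np.random.default_rng(7)

def split_patients(patient_ids, train_fraction=0.8):
    """Randomly assign 80% of patients to training and 20% to testing,
    splitting at the patient level rather than the image level."""
    ids = np.array(patient_ids)
    rng.shuffle(ids)
    cut = int(len(ids) * train_fraction)
    return ids[:cut], ids[cut:]

train, test = split_patients(range(123))  # 123 unique patients in the study
```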

Exemplary Data Augmentation and Segregation

[0026] The magnification views of each patient’s mammogram were loaded into a 3D segmentation program. Segments were extracted using an exemplary automatic segmentation procedure to include the regions of the magnification view that contained calcifications. Each image was scaled in size on the basis of the radius of the segmentations and was resized to fit a bounding box of 128 x 128 pixels. Figures 1A-1C illustrate exemplary input images for the exemplary CNN of patients with DCIS according to an exemplary embodiment of the present disclosure. The entire image batch was centered using histogram-based z score normalization of the non-air pixel intensity values. Exemplary data augmentation was performed to limit overfitting. Some of the magnification views (e.g., orthogonal magnification views) were randomly flipped vertically, horizontally, or in both directions. Additionally, some of the magnification views were rotated by a random angle between 0.52 and -0.52 radians, and were randomly cropped to a box 80% of the initial size. Random affine shear was applied to each input image.
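The centering and augmentation steps described above can be sketched as follows. The air threshold, seed, and helper names are assumptions; to keep the example dependency-free, the rotation angle is only sampled (a real pipeline would pass it, and the affine shear, to an image-warping routine).

```python
import numpy as np

rng = np.random.default_rng(42)

def zscore_non_air(image, air_threshold=0.0):
    """Center the image with a z score computed over non-air pixels only
    (air/background pixels are excluded from the mean and std)."""
    tissue = image[image > air_threshold]
    return (image - tissue.mean()) / tissue.std()

def augment(image):
    """Randomly flip vertically/horizontally, draw a rotation angle in
    [-0.52, 0.52] radians, and crop to a box 80% of the initial size."""
    if rng.random() < 0.5:
        image = np.flipud(image)
    if rng.random() < 0.5:
        image = np.fliplr(image)
    angle = rng.uniform(-0.52, 0.52)  # would be fed to a rotation routine
    h, w = image.shape
    ch, cw = int(h * 0.8), int(w * 0.8)
    i = rng.integers(0, h - ch + 1)
    j = rng.integers(0, w - cw + 1)
    return image[i:i + ch, j:j + cw], angle

img = rng.random((128, 128)) + 0.1  # toy non-air mammogram patch
normalized = zscore_non_air(img)
cropped, angle = augment(normalized)
```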

Exemplary Network Architecture

[0027] A topology with multiple layers, for example, 15 hidden layers, can be used to implement the exemplary CNN. The exemplary CNN can include fully convolutional (“FC”) layers. The exemplary CNN can include the application of a series of convolutional matrices to a vectorized input image that can iteratively separate the input to a target vector space.

The exemplary CNN can include five residual layers. The residual neural networks can be used to stabilize gradients during back propagation, facilitating improved optimization and greater network depth. For example, starting with the 10th hidden layer, inception V2 style layers can be used. The inception layer architecture can facilitate a computationally efficient procedure for facilitating a network to selectively determine the appropriate filter architectures for an input feature map, providing improved learning rates.

[0028] A fully connected layer with, for example, 16 neurons can be implemented after, as an example, the 13th hidden layer, which can be followed by implementation of a linear layer with eight neurons. A final softmax function output layer with two classes can be inserted as the last layer. Training was performed using an exemplary optimization procedure (e.g., the AdamOptimizer optimization procedure) (see, e.g., Reference 20), combined with an exemplary accelerated gradient procedure (e.g., the Nesterov accelerated gradient procedure). (See, e.g., References 21 and 22). Parameters were initialized using an exemplary heuristic. (See, e.g., Reference 23). L2 regularization was performed to prevent over-fitting of data by limiting the squared magnitude of the kernel weights. Dropout (e.g., 25% randomly) was also used to prevent overfitting by limiting unit coadaptation. (See, e.g., Reference 24). Batch normalization was used to improve network training speed and regularize performance by reducing internal covariate shift. (See, e.g., Reference 25).
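The 15-hidden-layer topology described above can be summarized as a plain configuration list. The split of the middle layers into plain convolutions and the kernel annotations are assumptions made so the layer count adds up as stated; they are not the disclosed design.

```python
# Sketch of the described topology: 5 residual layers, inception style layers
# starting at the 10th hidden layer, a 16-neuron fully connected layer after
# the 13th layer, an 8-neuron linear layer, then a two-class softmax output.
topology = (
    [("residual", {"kernel": "3x3"}) for _ in range(5)]        # layers 1-5
    + [("conv", {"kernel": "3x3"}) for _ in range(4)]          # layers 6-9 (assumed)
    + [("inception_v2", {}) for _ in range(4)]                 # layers 10-13
    + [("fully_connected", {"neurons": 16, "dropout": 0.25})]  # layer 14
    + [("linear", {"neurons": 8})]                             # layer 15
)
output_layer = ("softmax", {"classes": 2})  # pure DCIS vs. DCIS with invasion

assert len(topology) == 15  # 15 hidden layers in total
```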

[0029] Figure 2 shows an exemplary diagram of the exemplary CNN according to an exemplary embodiment of the present disclosure. For example, as shown in Figure 2, a DCIS image 205 can be input into the exemplary CNN. Image 205 can be input into a set of residual layers 210 (e.g., four layers, which can include R1: 3x3x16; R2: 3x3x32; R3: 3x3x64; and R4: 3x3x128). A plurality of inception layers 215 can be used (e.g., four inception layers, which can include I1: x256; I2: x256; I3: x256; and I4: x256). Multiple fully connected layers 220 can be implemented (e.g., 15 fully connected layers, which can include one or more fully connected layers, for example, FC14: 1x16 dropout). Additionally, multiple linear layers 225 can be used (e.g., 15 linear layers, which can include one or more fully connected layers, for example, FC: 1x8). The exemplary CNN can produce an output 230, which can be used, for example, to (i) predict pure DCIS or DCIS with invasion and/or (ii) select a patient for a DCIS observation.

[0030] Softmax with cross-entropy hinge loss was used as the primary objective function of the network to provide a more intuitive output of normalized class probabilities. A class-sensitive cost function penalizing incorrect classification of the underrepresented class was used. A final softmax score threshold of 0.5 from the mean of raw logits from the ML and CC views was used for two-class classification. The area under the curve (“AUC”) value was used as the primary performance metric. Sensitivity, specificity, and accuracy were also calculated as secondary performance metrics.
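The objective and two-view decision rule described above can be sketched as follows; the logits, labels, class weights, and helper names are toy assumptions for illustration.

```python
import numpy as np

def softmax(logits):
    """Normalized class probabilities from raw logits."""
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def weighted_cross_entropy(logits, labels, class_weights):
    """Class-sensitive cross-entropy: errors on the underrepresented class
    are penalized more heavily via per-class weights."""
    probs = softmax(logits)
    w = class_weights[labels]
    return float(np.mean(-w * np.log(probs[np.arange(len(labels)), labels])))

def two_view_predict(logits_ml, logits_cc, threshold=0.5):
    """Average raw logits from the ML and CC views, then threshold the
    softmax score of the positive class at 0.5 for two-class output."""
    mean_logits = (logits_ml + logits_cc) / 2.0
    return (softmax(mean_logits)[:, 1] >= threshold).astype(int)

def sens_spec_acc(pred, truth):
    """Secondary metrics: sensitivity, specificity, accuracy."""
    tp = np.sum((pred == 1) & (truth == 1))
    tn = np.sum((pred == 0) & (truth == 0))
    fp = np.sum((pred == 1) & (truth == 0))
    fn = np.sum((pred == 0) & (truth == 1))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(truth)

logits_ml = np.array([[2.0, -1.0], [-1.0, 2.0], [0.5, 1.5], [2.0, 0.0]])
logits_cc = np.array([[1.0, -2.0], [-2.0, 1.0], [0.0, 2.0], [1.0, 1.0]])
truth = np.array([0, 1, 1, 0])
pred = two_view_predict(logits_ml, logits_cc)
```

AUC would be computed over the continuous softmax scores rather than the thresholded predictions.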

[0031] Visualization of network predictions was performed using an exemplary gradient-weighted class activation mapping (“Grad-CAM”) procedure. (See, e.g., Reference 26). Each Grad-CAM map was generated by an exemplary prediction model along with every input image. The salient region of the averaged Grad-CAM map illustrates where important features come from when the exemplary prediction model makes classification decisions.

Exemplary Results

[0032] The exemplary CNN procedure for predicting patients with pure DCIS achieved an overall accuracy of about 74.6% (e.g., about 95% CI, ± 5%) with an area under the ROC curve of about 0.71 (e.g., about 95% CI, ± 0.04), a specificity of about 49.4% (e.g., about 95% CI, ± 6%) and a sensitivity of about 91.6% (e.g., about 95% CI, ± 5%).

[0033] Thus, as described above, the exemplary system, method, and computer-accessible medium, according to an exemplary embodiment of the present disclosure, can utilize the exemplary CNN to distinguish pure DCIS from DCIS with invasion using, for example, mammographic images.

[0034] Figure 3 shows an exemplary flow diagram of an exemplary method 300 for determining DCIS information regarding a patient according to an exemplary embodiment of the present disclosure. For example, at procedure 305, an image of an internal portion of a breast of a patient can be received. At procedure 310, the image can be segmented and resized. At procedure 315, the image can be centered using a histogram-based z score normalization of non-air pixel intensity values. At procedure 320, the image can be randomly flipped, randomly rotated, and/or randomly cropped. At procedure 325, a random affine shear can be applied to the image. At procedure 330, input information of the patient for a DCIS observation can be selected for determining the DCIS information. At procedure 335, the DCIS information can be automatically determined by applying a neural network to the image. At procedure 340, a determination can be made as to what action to perform or whether to perform any action based on the determined DCIS information.
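The flow of exemplary method 300 can be sketched end to end. Every helper below is a hypothetical stand-in (trivial placeholders, not the disclosed implementation); the augmentation procedures 320-325 and the patient-selection procedure 330 are omitted for brevity, and the action labels are illustrative assumptions.

```python
import numpy as np

def segment_and_resize(img):         # stand-in for procedure 310
    return np.resize(img, (128, 128))

def center_zscore(img):              # stand-in for procedure 315
    return (img - img.mean()) / (img.std() + 1e-8)

def toy_network(img):                # stand-in for procedure 335: P(invasion)
    return 1.0 / (1.0 + np.exp(-img.mean()))

def choose_action(p_invasion):       # stand-in for procedure 340
    return "surgical excision" if p_invasion >= 0.5 else "observation"

def method_300(image):
    """Chain the sketched procedures 310, 315, 335, and 340 for one image."""
    img = center_zscore(segment_and_resize(image))
    return choose_action(toy_network(img))

decision = method_300(np.ones((200, 200)))
```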

[0035] Figure 4 shows a block diagram of an exemplary embodiment of a system according to the present disclosure. For example, exemplary procedures in accordance with the present disclosure described herein can be performed by a processing arrangement and/or a computing arrangement (e.g., computer hardware arrangement) 405. Such processing/computing arrangement 405 can be, for example, entirely or a part of, or include, but not limited to, a computer/processor 410 that can include, for example, one or more microprocessors, and use instructions stored on a computer-accessible medium (e.g., RAM, ROM, hard drive, or other storage device).

[0036] As shown in Figure 4, for example, a computer-accessible medium 415 (e.g., as described herein above, a storage device such as a hard disk, floppy disk, memory stick, CD-ROM, RAM, ROM, etc., or a collection thereof) can be provided (e.g., in communication with the processing arrangement 405). The computer-accessible medium 415 can contain executable instructions 420 thereon. In addition or alternatively, a storage arrangement 425 can be provided separately from the computer-accessible medium 415, which can provide the instructions to the processing arrangement 405 so as to configure the processing arrangement to execute certain exemplary procedures, processes, and methods, as described herein above, for example.

[0037] Further, the exemplary processing arrangement 405 can be provided with or include input/output ports 435, which can include, for example, a wired network, a wireless network, the internet, an intranet, a data collection probe, a sensor, etc. As shown in Figure 4, the exemplary processing arrangement 405 can be in communication with an exemplary display arrangement 430, which, according to certain exemplary embodiments of the present disclosure, can be a touch-screen configured for inputting information to the processing arrangement in addition to outputting information from the processing arrangement, for example. Further, the exemplary display arrangement 430 and/or a storage arrangement 425 can be used to display and/or store data in a user-accessible format and/or user-readable format.

[0038] The foregoing merely illustrates the principles of the disclosure. Various modifications and alterations to the described embodiments will be apparent to those skilled in the art in view of the teachings herein. It will thus be appreciated that those skilled in the art will be able to devise numerous systems, arrangements, and procedures which, although not explicitly shown or described herein, embody the principles of the disclosure and can be thus within the spirit and scope of the disclosure. Various different exemplary embodiments can be used together with one another, as well as interchangeably therewith, as should be understood by those having ordinary skill in the art. In addition, certain terms used in the present disclosure, including the specification, drawings and claims thereof, can be used synonymously in certain instances, including, but not limited to, for example, data and information. It should be understood that, while these words, and/or other words that can be synonymous to one another, can be used synonymously herein, that there can be instances when such words can be intended to not be used synonymously. Further, to the extent that the prior art knowledge has not been explicitly incorporated by reference herein above, it is explicitly incorporated herein in its entirety. All publications referenced are incorporated herein by reference in their entireties.

EXEMPLARY REFERENCES

[0039] The following references are hereby incorporated by reference in their entireties, as follows:

[1] Nguyen CV, Albarracin CT, Whitman GJ, Lopez A, Sneige N. Atypical ductal hyperplasia in directional vacuum-assisted biopsy of breast microcalcifications: considerations for surgical excision. Ann Surg Oncol 2011; 18:752-761.

[2] Sinn HP, Kreipe H. A brief overview of the WHO classification of breast tumors, 4th edition, focusing on issues and updates from the 3rd edition. Breast Care (Basel) 2013.

[3] Racz JM, Carter JM, Degnim AC. Lobular neoplasia and atypical ductal hyperplasia on core biopsy: current surgical management recommendations. Ann Surg Oncol 2017; 24:2848-2854.

[4] Ko E, Han W, Lee JW, et al. Scoring system for predicting malignancy in patients diagnosed with atypical ductal hyperplasia at ultrasound-guided core needle biopsy. Breast Cancer Res Treat 2008; 112:189-195.

[5] Menen RS, Ganesan N, Bevers T, et al. Long-term safety of observation in selected women following core biopsy diagnosis of atypical ductal hyperplasia. Ann Surg Oncol 2017; 24:70-76.

[6] Pankratz VS, Hartmann LC, Degnim AC, et al. Assessment of the accuracy of the Gail model in women with atypical hyperplasia. J Clin Oncol 2008; 26:5374-5379.

[7] Deshaies I, Provencher L, Jacob S, et al. Factors associated with upgrading to malignancy at surgery of atypical ductal hyperplasia diagnosed on core biopsy. Breast 2011; 20:50-55.

[8] Bendifallah S, Defert S, Chabbert-Buffet N, et al. Scoring to predict the possibility of upgrades to malignancy in atypical ductal hyperplasia diagnosed by an 11-gauge vacuum-assisted biopsy device: an external validation study. Eur J Cancer 2012; 48:30-36.

[9] Yu YH, Liang C, Yuan XZ. Diagnostic value of vacuum-assisted breast biopsy for breast carcinoma: a meta-analysis and systematic review. Breast Cancer Res Treat 2010; 120:469-479.

[10] Song JL, Chen C, Yuan JP, Sun SR. Progress in the clinical detection of heterogeneity in breast cancer. Cancer Med 2016; 5:3475-3488.

[11] Gomes DS, Porto SS, Balabram D, Gobbi H. Inter-observer variability between general pathologists and a specialist in breast pathology in the diagnosis of lobular neoplasia, columnar cell lesions, atypical ductal hyperplasia and ductal carcinoma in situ of the breast. Diagn Pathol 2014; 9:121.

[12] Ha R, Chang P, Mutasa S, et al. Convolutional neural network using a breast MRI tumor dataset can predict Oncotype Dx recurrence score. J Magn Reson Imaging 2018 Aug 21.

[13] Ha R, Chang P, Mema E, et al. Fully automated convolutional neural network method for quantification of breast MR fibroglandular tissue and background parenchymal enhancement. J Digit Imaging 2018 Aug 3.

[14] Ha R, Chang P, Karcich J, et al. Convolutional neural network based cancer risk stratification using a mammographic dataset. Acad Radiol 2018 Jul 31.

[15] Ribli D, Horvath A, Unger Z, Pollner P, Csabai I. Detecting and classifying lesions in mammograms with Deep Learning. Sci Rep 2018; 8:4165.

[16] Mohamed AA, Berg WA, Peng H, Luo Y, Jankowitz RC, Wu S. A deep learning method for classifying mammographic breast density categories. Med Phys 2018; 45:314-321.

[17] LeCun Y, Bottou L, Bengio Y, Haffner P. Gradient-based learning applied to document recognition. Proc IEEE 1998; 11:2278-2324.

[18] He K, Zhang X, Ren S, Sun J. Deep residual learning for image recognition. IEEE Xplore Digital Library website. ieeexplore.ieee.org/document/7780459. Published 2016.

[19] Szegedy C, Liu W, Jia Y, et al. Going deeper with convolutions. IEEE Xplore Digital Library website. ieeexplore.ieee.org/document/7298594. Published 2015.

[20] Kingma DP, Ba J. Adam: a method for stochastic optimization. arXiv website. arxiv.org/abs/1412.6980. Published 2014.

[21] Nesterov Y. Gradient methods for minimizing composite objective function. Optimization Online website. www.optimization-online.org/DB_FILE/2007/09/1784.pdf. Published 2007.

[22] Dozat T. Incorporating Nesterov momentum into Adam. Stanford University website. cs229.stanford.edu/proj2015/054_report.pdf. Published 2016.

[23] Glorot X, Bengio Y. Understanding the difficulty of training deep feedforward neural networks. Proceedings of Machine Learning Research website. proceedings.mlr.press/v9/glorot10a/glorot10a.pdf. Published 2010.

[24] Srivastava N, Hinton GE, Krizhevsky A, et al. Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res 2014; 15:1929-1958.

[25] Ioffe S, Szegedy C. Batch normalization: accelerating deep network training by reducing internal covariate shift. Proceedings of Machine Learning Research website. proceedings.mlr.press/v37/ioffe15.html. Published 2015.

[26] Selvaraju RR, Das A, Vedantam R, et al. Grad-CAM: why did you say that? Visual explanations from deep networks via gradient-based localization. arXiv website. arxiv.org/abs/1610.02391v1. Published 2016.

[27] Araujo T, Aresta G, Castro E, et al. Classification of breast cancer histology images using convolutional neural networks. PLoS One 2017; 12:e0177544.

[28] Bejnordi BE, Zuidhof G, Balkenhol M, et al. Context-aware stacked convolutional neural networks for classification of breast carcinomas in whole-slide histopathology images. J Med Imaging (Bellingham) 2017; 4:044504.

[29] Tsuchiya K, Mori N, Schacht DV, et al. Value of breast MRI for patients with a biopsy showing atypical ductal hyperplasia (ADH). J Magn Reson Imaging 2017; 46:1738-1747.

[30] Menes T, Kerlikowske K, Jaffer S, et al. Rates of atypical ductal hyperplasia have declined with less use of postmenopausal hormone treatment: findings from the Breast Cancer Surveillance Consortium. Cancer Epidemiol Biomarkers Prev 2009; 18:2822-2828.