A newly developed neural network is highly accurate in identifying key landmarks important in breast surgery, opening the potential for objective assessment of breast symmetry, suggests a study in the February issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS).

“Neural networks and machine learning have the potential to improve evaluation of breast symmetry in reconstructive and cosmetic breast surgery, enabling rapid, automated detection of features used by plastic surgeons,” comments lead author Nitzan Kenig, MD, of Albacete University Hospital in Spain.

Creating Neural Networks for Breast Evaluation

Breast symmetry is a key concern in breast surgery and is generally assessed through simple, subjective evaluations by both patients and surgeons. Computer programs can provide more objective assessments, but with limitations, including the need to enter data manually and lengthy calculation times.

Neural networks—an artificial intelligence technique that seeks to emulate the way the human brain processes data—are being explored for their potential to improve care in several areas of medical practice. Kenig and his colleagues developed an “ad hoc convolutional neural network” to detect key breast features used in assessing breast symmetry.

Using an open-source algorithm called YOLOv3 (“You Only Look Once,” version 3), the researchers trained their neural network to identify three anatomic features used in assessing the female breast: the breast boundaries, the nipple-areola complex (nipple and surrounding tissue), and the suprasternal notch (the depression at the base of the neck, at the top of the breastbone).
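The study itself does not publish code, but a detection pass of this kind can be sketched with OpenCV's DNN module and Darknet-format YOLOv3 files. The configuration and weight file names, the photograph name, and the three class labels below are illustrative assumptions, not artifacts released by the authors.

```python
# Minimal sketch, assuming a Darknet-format YOLOv3 model trained on three landmark
# classes. File names and class labels are illustrative, not the authors' model.
import cv2
import numpy as np

CLASSES = ["breast_boundary", "nipple_areola_complex", "suprasternal_notch"]  # assumed labels

net = cv2.dnn.readNetFromDarknet("landmarks_yolov3.cfg", "landmarks_yolov3.weights")
image = cv2.imread("frontal_photo.jpg")
h, w = image.shape[:2]

# YOLOv3 expects a square, normalized input blob (416x416 is the usual default).
blob = cv2.dnn.blobFromImage(image, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)
outputs = net.forward(net.getUnconnectedOutLayersNames())

detections = []
for layer_output in outputs:
    for row in layer_output:
        # Each row: [center_x, center_y, width, height, objectness, class scores...]
        scores = row[5:]
        class_id = int(np.argmax(scores))
        confidence = float(scores[class_id])
        if confidence > 0.5:
            cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
            detections.append((CLASSES[class_id], confidence,
                               (cx - bw / 2, cy - bh / 2, bw, bh)))

for name, conf, box in detections:
    print(f"{name}: confidence={conf:.2f}, box={box}")
```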

The neural network was trained using 200 frontal photographs of patients who underwent breast surgery. Its performance in identifying key breast features was then tested using an additional set of 47 photographs of patients who underwent breast reconstruction after breast cancer surgery.
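Training a YOLOv3 detector requires bounding-box annotations for each photograph; in the standard Darknet format, each image is paired with a text file containing one line per object, giving a class index and a box normalized to the image size. The snippet below is a minimal sketch of reading such a label file; the file name and class ordering are assumptions for illustration, not the study's annotation files.

```python
# Minimal sketch, assuming standard Darknet/YOLO label files: one line per object,
# "class_id x_center y_center width height", with box values normalized to [0, 1].
# The file name and class ordering are illustrative assumptions.
CLASSES = ["breast_boundary", "nipple_areola_complex", "suprasternal_notch"]

def read_yolo_labels(path: str):
    boxes = []
    with open(path) as f:
        for line in f:
            class_id, cx, cy, bw, bh = line.split()
            boxes.append((CLASSES[int(class_id)],
                          float(cx), float(cy), float(bw), float(bh)))
    return boxes

for name, cx, cy, bw, bh in read_yolo_labels("patient_001.txt"):
    print(f"{name}: center=({cx:.2f}, {cy:.2f}), size=({bw:.2f}, {bh:.2f})")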

Swift, Automated, Objective Breast Symmetry Evaluation

After training, the neural network was highly accurate in localizing the three features, with a total detection rate of 97.7%. For the right and left breast boundaries and nipple-areola complexes, the detection rate was 100%; for the suprasternal notch, it dipped to 87%. Processing was quick, with an average detection time of 0.52 seconds.
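For readers unfamiliar with the metric, a detection rate of this kind is simply the share of expected landmarks that the network localized. The sketch below illustrates that arithmetic; the per-photograph landmark count and the detection count are placeholders for illustration, not the study's raw data.

```python
# Illustrative only: how an overall detection rate is computed, assuming it is
# defined as detected landmarks divided by expected landmarks. Counts are placeholders.
def detection_rate(found: int, expected: int) -> float:
    """Percentage of expected landmarks that the detector localized."""
    return 100.0 * found / expected

n_photos = 47            # size of the test set reported in the study
landmarks_per_photo = 5  # assumption: 2 boundaries, 2 nipple-areola complexes, 1 notch
found = 230              # placeholder count of landmarks the detector returned
print(f"overall detection rate: "
      f"{detection_rate(found, n_photos * landmarks_per_photo):.1f}%")
```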

The neural network was able to detect and localize the key features even in visibly asymmetrical breast reconstructions. The high success rate confirmed that the training data set was sufficient, with no need for data augmentation techniques.

“Neural networks and machine learning have a potential of improving the evaluation of breast symmetry in the field of plastic surgery, by automated and quick detection of features used by surgeons in practice,” Kenig and his coauthors conclude. They believe that, with further advances in image detection capabilities and their applications to breast surgery, neural networks could play a role in evaluation of breast symmetry and planning of both aesthetic and reconstructive plastic surgery.