Meet BCECNN: The AI that detects breast cancer and explains its logic
Researchers have developed an AI model that not only detects breast cancer with near-perfect accuracy but also visually explains its diagnosis to doctors, improving trust and transparency in screening
Breast Cancer Awareness Month: AI model detects breast cancer with high accuracy and explains its findings. (Photo: AdobeStock)
3 min read Last Updated : Oct 14 2025 | 1:46 PM IST
A new artificial intelligence model has achieved 98.75 per cent accuracy in detecting breast cancer and, for the first time, can also visually explain how it reached its diagnosis.
Published in the journal BMC Medical Informatics and Decision Making, the study titled BCECNN: an explainable deep ensemble architecture for accurate diagnosis of breast cancer introduces a model called Breast Cancer Ensemble Convolutional Neural Network (BCECNN). It not only spots tumours in mammograms but also justifies its decisions. Developed by researchers from Afyon Kocatepe University, Turkey, the model could make AI-driven cancer detection far more transparent and trustworthy for doctors and patients alike.
What makes this AI system different from previous breast cancer tools?
Most AI tools that scan medical images act like black boxes. They predict but cannot explain. BCECNN stands apart because it doesn’t rely on a single algorithm. Instead, it combines the “brains” of five powerful neural networks — AlexNet, VGG16, ResNet-18, EfficientNetB0, and XceptionNet — and uses a “majority vote” to decide.
According to the published scientific paper, these models were trained using transfer learning (TL) and evaluated on five distinct sub-datasets generated from the Artificial Intelligence Smart Solution Laboratory (AISSLab) dataset, which consists of 266 mammography images labelled and validated by radiologists.
This ensemble strategy, known as Triple Ensemble CNN (TECNN), mimics how expert radiologists might consult each other before making a final call.
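The majority-vote idea can be sketched in a few lines of Python. This is a minimal illustration, not the paper's code: each of the five networks (AlexNet, VGG16, ResNet-18, EfficientNetB0, XceptionNet) is stood in for by a single class label it might output for one mammogram.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the class label chosen by the most ensemble members."""
    label, _ = Counter(predictions).most_common(1)[0]
    return label

# Hypothetical per-model verdicts for one image: 0 = benign, 1 = malignant,
# in the order AlexNet, VGG16, ResNet-18, EfficientNetB0, XceptionNet.
votes = [1, 1, 0, 1, 0]
print(majority_vote(votes))  # -> 1 (malignant wins the vote 3-2)
```

With an odd number of voters on a two-class problem, a tie is impossible, which is one practical reason ensembles like this often use an odd member count.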
To make the AI’s reasoning visible, the researchers embedded Explainable Artificial Intelligence (XAI) methods, specifically Grad-CAM (Gradient-weighted Class Activation Mapping) and LIME (Local Interpretable Model-Agnostic Explanations).
These techniques create heatmaps over mammogram images, showing the areas that most influenced the AI’s verdict — for example, highlighting a specific lesion that looked suspicious. When these heatmaps were reviewed by an expert radiologist, they closely matched human interpretations, suggesting that BCECNN’s focus mirrored a clinician’s eye.
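The core Grad-CAM computation behind such heatmaps is short enough to sketch. This is a generic NumPy illustration of the standard Grad-CAM formula, not the study's implementation: it assumes you already have the last convolutional layer's activations and the gradients of the class score with respect to them.

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Compute a Grad-CAM heatmap.

    feature_maps: (C, H, W) activations from the final conv layer
    gradients:    (C, H, W) gradients of the class score w.r.t. those maps
    """
    # Global-average-pool the gradients to get one importance weight per channel.
    weights = gradients.mean(axis=(1, 2))
    # Weighted sum of the feature maps over the channel axis.
    cam = np.tensordot(weights, feature_maps, axes=1)
    # ReLU keeps only regions that positively influence the class score.
    cam = np.maximum(cam, 0)
    # Normalise to [0, 1] so the map can be overlaid on the mammogram.
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam
```

The resulting (H, W) map is upsampled to the input image's size and drawn as a colour overlay, which is what the radiologist compares against their own reading.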
Why is this so important for doctors and patients?
According to the researchers, the biggest hurdle for AI in healthcare is not performance but trust. Doctors are reluctant to rely on a system that cannot justify its conclusions. BCECNN bridges this gap: by making the decision-making process transparent, it gives radiologists a tool they can verify and build confidence in.
It also has the potential to reduce diagnostic errors and unnecessary biopsies, particularly in regions with fewer trained specialists. The model's strong results, even with a small dataset, make it well suited to low-resource medical settings.
How BCECNN could shape future medical AI
The authors believe BCECNN could form the foundation for next-generation clinical decision-support systems, digital assistants that work alongside radiologists. Future versions could be trained on larger, multi-institutional datasets, integrated with ultrasound or MRI scans, and even extended to other cancers.
The researchers conclude that transparency, not just accuracy, will define the next era of medical AI, where doctors and machines collaborate, not compete, to improve patient outcomes.