Published on Jan. 15, 2026


**Abstract**

Image classification remains a fundamental task in computer vision, with applications spanning medical imaging, autonomous systems, and intelligent surveillance. Although convolutional neural networks (CNNs) have achieved remarkable success in this domain, their performance is often constrained by limited training data and poor generalization to unseen samples. Data augmentation has therefore become an essential strategy for improving model robustness. This study proposes an enhanced data augmentation approach aimed at increasing classification accuracy while reducing overfitting in CNN-based models. The proposed method builds upon existing masking-based augmentation techniques by selectively obscuring informative and non-informative regions of input images in a controlled manner during training. Unlike conventional random augmentation, the method dynamically adapts to image content, encouraging the network to learn more discriminative and context-aware features. Experiments were conducted using benchmark image datasets, including MNIST, Fashion-MNIST, and CIFAR-10, to evaluate the effectiveness of the approach. Performance was assessed using standard metrics such as accuracy, precision, recall, and loss convergence. The results demonstrate that models trained with the proposed augmentation technique consistently outperform baseline models and those trained with traditional augmentation methods. Improvements were observed in both training stability and generalization performance across all datasets. These findings suggest that the proposed technique is a practical and efficient enhancement for CNN training pipelines. The study contributes to ongoing research on data-centric approaches in deep learning and provides insights into designing augmentation strategies that improve model reliability in real-world image classification tasks.
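The abstract does not specify the exact region-selection rule, so the following sketch is only one plausible reading of a content-adaptive masking augmentation: local patch variance stands in for "informativeness", and a square patch with the highest (or lowest) score is blanked out. The function name, the variance proxy, and every parameter here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def adaptive_mask(image, mask_frac=0.25, hide_informative=True, rng=None):
    """Mask one square patch, chosen by a simple variance-based saliency proxy.

    Illustrative sketch only: patch variance is used as a stand-in for
    'informativeness'; a real implementation might use saliency maps or
    gradient-based attribution instead.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = image.shape[:2]
    size = max(1, int(min(h, w) * mask_frac))
    # Sample a few candidate top-left corners and score each patch by variance.
    candidates = [(rng.integers(0, h - size + 1), rng.integers(0, w - size + 1))
                  for _ in range(8)]
    scores = [image[y:y + size, x:x + size].var() for y, x in candidates]
    idx = int(np.argmax(scores)) if hide_informative else int(np.argmin(scores))
    y, x = candidates[idx]
    out = image.copy()
    out[y:y + size, x:x + size] = image.mean()  # fill with the image mean
    return out
```

Applied per-image during training, alternating `hide_informative` between `True` and `False` would obscure informative and non-informative regions in the controlled manner the abstract describes.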



Contact
  • Directorate of Academic Planning and Quality Assurance (DAPQA),
    UBIDS, P. O. Box WA 64, Wa, Upper West Region, Ghana
  • +233 208071685
  • kpeprah@ubids.edu.gh
  • +233 208071685
  • Facebook
  • LinkedIn
  • Academia
© 2026 Ecothink GH — Journal of AI Production and Consumption