Deep sky object detection in astronomical imagery using YOLO models: a comparative assessment

Leo Thomas Ramos*, Francklin Rivas-Echeverría

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

This study presents a comparative analysis of three YOLO architectures: YOLOv8, YOLOv9, and YOLOv10, focused on their effectiveness in detecting deep sky objects (DSOs). An extended version of the DeepSpaceYoloDataset is used, originally consisting of 4696 images. To enhance data diversity, data augmentation techniques are applied. These include cropping, rotations, blurring, and noise addition. As a result, the dataset expands to 8421 images, better reflecting real astronomical conditions. Each YOLO variant is trained using default parameters for 100 epochs in a standardized environment. The models’ performance is evaluated based on precision, recall, mean average precision (mAP), and computational efficiency. Results show that YOLOv8 performs better than YOLOv9 and YOLOv10. Particularly, YOLOv8 consistently achieves high precision, recall, and mAP scores across various scenarios. Additionally, YOLOv8 models require less training time, even in their most complex versions, while maintaining low latency. However, in detection tests, YOLOv10 models frequently match or even outperform YOLOv8, often achieving confidence scores above 90%. In contrast, YOLOv9 models exhibit lower performance in both quantitative and qualitative evaluations, along with higher training times and latency, making them the least effective option for DSO detection. These findings serve as a reference for selecting suitable models in astronomical research. They also promote the adoption of these technologies to improve the accuracy and efficiency of DSO detection. The augmented dataset used in this study can be accessed at: https://github.com/Leo-Thomas/Augmented-DeepSpaceYolo.
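The abstract lists four augmentation families used to extend the DeepSpaceYoloDataset: cropping, rotations, blurring, and noise addition. As a rough illustration only, the sketch below applies one variant of each to a single-channel image array with NumPy; the crop ratio, blur kernel, and noise level are illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(image: np.ndarray) -> list[np.ndarray]:
    """Return four augmented variants of a 2-D image array:
    a centre crop, a 90-degree rotation, a 3x3 mean-filter blur,
    and additive Gaussian noise (parameters are illustrative)."""
    h, w = image.shape
    variants = []

    # Centre crop to 80% of each dimension (assumed ratio)
    ch, cw = int(h * 0.8), int(w * 0.8)
    top, left = (h - ch) // 2, (w - cw) // 2
    variants.append(image[top:top + ch, left:left + cw])

    # Rotation by 90 degrees
    variants.append(np.rot90(image))

    # Simple 3x3 mean-filter blur (valid region only, shrinks by 2 px)
    blurred = (
        image[:-2, :-2] + image[:-2, 1:-1] + image[:-2, 2:] +
        image[1:-1, :-2] + image[1:-1, 1:-1] + image[1:-1, 2:] +
        image[2:, :-2] + image[2:, 1:-1] + image[2:, 2:]
    ) / 9.0
    variants.append(blurred)

    # Additive Gaussian noise, clipped to the 0-255 intensity range
    noisy = np.clip(image + rng.normal(0.0, 10.0, image.shape), 0, 255)
    variants.append(noisy)

    return variants
```

In a detection setting the bounding-box labels would also need to be transformed alongside each image (e.g. re-cropped or rotated), which the sketch above omits.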

Original language: English
Article number: 100481
Journal: Neural Computing and Applications
DOI
Status: Accepted/In press - 2025
Published externally

Bibliographical note

Publisher Copyright:
© The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature 2025.
