A NOVEL TWO-STEP APPROACH FOR RECREATIONAL FISHERS DETECTION AND CLASSIFICATION / Jose, N.; Narang, G.; Galdelli, A.; Cannella, F.; Mancini, A.; Zingaretti, P.; Bolognini, L. - 5:(2025). (ASME 2025 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC-CIE 2025, Hilton Anaheim, USA, 2025) [10.1115/DETC2025-168964].
A NOVEL TWO-STEP APPROACH FOR RECREATIONAL FISHERS DETECTION AND CLASSIFICATION
Jose N.; Narang G.; Galdelli A.; Mancini A.; Zingaretti P.
2025-01-01
Abstract
The detection of illegal and unregulated fishing activities remains a significant challenge for environmentally sustainable fisheries and regulatory enforcement. Manual surveying and electronic monitoring systems require extensive human effort and intervention, adding inefficiencies and limiting scalability. Advances in deep-learning-based image analysis provide an alternative for implementing automated monitoring strategies that support fisheries sustainability goals. This study introduces a novel two-step detection and classification framework designed to identify recreational fishers, adaptable to remote image sources such as fixed cameras and drones. The framework is evaluated on the YOLOv8, YOLOv10, YOLOv11, and YOLOv12 architectures and compared against a single-step detection baseline. Our proposed two-step approach improves classification reliability by refining object localization and minimizes false positives by exploiting exhaustive features of the underlying dataset, ensuring more accurate identification of fishing-related activities. Across all model configurations tested with our two-step approach, YOLOv11 stands out with superior generalization: its large detection variant achieves a precision of 0.704, a recall of 0.444, and an mAP50 of 0.503 while maintaining competitive inference efficiency, making it an ideal choice for our downstream application. Model explainability through EigenCAM confirms that network attention aligns with relevant fishing objects. Our tests show that the proposed approach offers a scalable, real-time-adaptable solution for monitoring, demonstrating an advantage over conventional single-step models by reducing misclassification and enabling fine-grained activity recognition.
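The two-step idea described in the abstract — first localize candidate fishers, then classify each cropped region — can be sketched as follows. This is a minimal illustration only: the `detect` and `classify` callables, the `"fishing"` label, and the thresholds are hypothetical stand-ins for the paper's trained YOLO detection and classification models, not the authors' actual implementation.

```python
# Hedged sketch of a two-step detect-then-classify pipeline.
# `detect` and `classify` are placeholder callables (assumptions),
# standing in for trained detection/classification networks.

from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Detection:
    box: Tuple[int, int, int, int]  # (x1, y1, x2, y2) in pixel coordinates
    score: float                    # detector confidence

def two_step_pipeline(
    image: List[list],              # image as rows of pixels (e.g. H x W)
    detect: Callable[[List[list]], List[Detection]],
    classify: Callable[[List[list]], Tuple[str, float]],
    det_thresh: float = 0.5,
    cls_thresh: float = 0.5,
) -> List[Tuple[Detection, str, float]]:
    """Step 1: localize candidates; step 2: classify each tight crop.

    Classifying tight crops rather than the whole frame is what lets the
    second stage focus on fine-grained, fishing-related cues and reject
    detector false positives.
    """
    results = []
    for det in detect(image):
        if det.score < det_thresh:
            continue                      # discard weak localizations
        x1, y1, x2, y2 = det.box
        crop = [row[x1:x2] for row in image[y1:y2]]  # refined region
        label, conf = classify(crop)
        if label == "fishing" and conf >= cls_thresh:
            results.append((det, label, conf))
    return results
```

In this two-stage design the detector only has to propose plausible regions, while the classifier makes the final fine-grained call on each crop, which is the mechanism the abstract credits with reducing misclassification relative to a single-step detector.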


