Thermal images for wildfire core detection

Citation Author(s):
Linfeng Wang, Jeonbuk National University
Oualid Doukhi, Jeonbuk National University
Deok Jin Lee, Jeonbuk National University
Submitted by:
Linfeng Wang
Last updated:
Tue, 11/26/2024 - 21:14
DOI:
10.21227/k0kc-rv13
Data Format:
License:

Abstract 

Wildfires increasingly damage ecosystems and threaten human life, making early wildfire detection in complex outdoor environments critical. With the advancement of drones and remote sensing technology, infrared cameras have become essential for wildfire detection. However, as the demand for higher detection accuracy grows, so do model size and computational cost, making it challenging to deploy high-precision detection algorithms for real-time fire detection on the edge computing devices onboard drones. This paper introduces a novel infrared wildfire detection network, FCDNet, to tackle this issue. It combines an Efficient Processing (EP) module, built on the novel Partial Depthwise Convolution (PDWConv), with a lightweight feature-sharing decoupled detection head (Fast Head) to achieve wildfire detection with a small model size and low computational cost. An Adaptive Sample Attention (ASA) Loss is introduced, in combination with a Normalized Wasserstein Distance (NWD) Loss, to improve the detection accuracy of wildfire cores. Experiments show that FCDNet is only 4.0 MB, 63.5% of the baseline YOLOv8n model size, with 63.3% of its parameters. It requires just 5 giga floating-point operations (GFLOPs), 38.3% fewer than the baseline, and achieves 77.5% mAP@50-95 IoU, a 1% increase, at a 460×460 input size. Compared to the state-of-the-art YOLOv11n, FCDNet reduces parameters, computation, and model size by 26.9%, 20.6%, and 27.3%, respectively. The thermal dataset and training code used in this study are publicly available at: https://github.com/WangLF1996/FCDNet-Dataset-and-Algorithm
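The abstract names PDWConv but does not spell out its design here; the sketch below is a minimal PyTorch interpretation, assuming that, in the spirit of FasterNet-style partial convolution, PDWConv applies a depthwise convolution to only a fraction of the input channels and passes the rest through unchanged. The class name, the ratio parameter, and the channel-split strategy are illustrative assumptions, not the authors' exact implementation (see the linked repository for that).

import torch
import torch.nn as nn

class PDWConv(nn.Module):
    """Illustrative Partial Depthwise Convolution (assumed design).

    A k x k depthwise convolution is applied to the first `ratio`
    fraction of channels; the remaining channels pass through
    untouched, cutting parameters and FLOPs versus a full depthwise conv.
    """

    def __init__(self, channels: int, kernel_size: int = 3, ratio: float = 0.25):
        super().__init__()
        self.conv_channels = max(1, int(channels * ratio))
        self.conv = nn.Conv2d(
            self.conv_channels, self.conv_channels, kernel_size,
            padding=kernel_size // 2,
            groups=self.conv_channels,  # depthwise: one filter per channel
            bias=False,
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Convolve a channel subset; keep the rest as an identity path.
        x1, x2 = torch.split(
            x, [self.conv_channels, x.size(1) - self.conv_channels], dim=1
        )
        return torch.cat((self.conv(x1), x2), dim=1)

# Shape check: a 64-channel feature map keeps its size end to end.
x = torch.randn(1, 64, 60, 60)
assert PDWConv(64)(x).shape == x.shape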
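The NWD Loss mentioned above follows the Normalized Gaussian Wasserstein Distance of Wang et al. (2021), which models each axis-aligned box as a 2-D Gaussian and compares boxes by a closed-form Wasserstein distance, giving smoother gradients than IoU for tiny targets such as distant fire cores. The sketch below is of that published formula, not FCDNet's code; the constant c is a dataset-dependent scale (the default here is arbitrary), and how FCDNet weights this term against the ASA Loss is not stated in the abstract.

import torch

def nwd_loss(pred: torch.Tensor, target: torch.Tensor,
             c: float = 12.0, eps: float = 1e-7) -> torch.Tensor:
    """Normalized Wasserstein Distance loss for (cx, cy, w, h) boxes.

    Each box maps to the Gaussian N((cx, cy), diag((w/2)^2, (h/2)^2));
    the 2nd-order Wasserstein distance between two such Gaussians is
    the Euclidean distance between their (cx, cy, w/2, h/2) vectors.
    """
    w2 = torch.sqrt(
        (pred[..., 0] - target[..., 0]) ** 2            # center-x gap
        + (pred[..., 1] - target[..., 1]) ** 2          # center-y gap
        + ((pred[..., 2] - target[..., 2]) / 2) ** 2    # half-width gap
        + ((pred[..., 3] - target[..., 3]) / 2) ** 2    # half-height gap
        + eps
    )
    # NWD lies in (0, 1]; 1 - NWD vanishes for a perfect match.
    return 1.0 - torch.exp(-w2 / c)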

Funding Agency: 
This research was performed under the Jeonbuk National University-LIG Nex1 cooperation and was supported by the Unmanned Vehicles Core Technology Research and Development Program through the National Research Foundation of Korea (NRF), Unm
Grant Number: 
2020M3C1C1A01082375

Documentation

Attachment: README.txt (1016 bytes)