DBFormer: A Dual-Branch Adaptive Remote Sensing Image Resolution Fine-Grained Weed Segmentation Network
Xiangfei She, Zhankui Tang, Xin Pan, Jian Zhao, Wenyu Liu

Remote sensing image segmentation holds significant application value in precision agriculture, environmental monitoring, and other fields. However, in the task of fine-grained segmentation of weeds and crops, traditional deep learning methods often fail to balance global semantic information with local detail features, resulting in over-segmentation or under-segmentation. To address this challenge, this paper proposes DBFormer, a segmentation model based on a dual-branch Transformer architecture, to improve the accuracy of weed detection in remote sensing images. The model integrates two techniques: (1) a dynamic context aggregation branch (DCA-Branch) with adaptive downsampling attention that models long-range dependencies and suppresses background noise, and (2) a local detail enhancement branch (LDE-Branch) that applies depthwise-separable convolutions with residual refinement to preserve and sharpen the edges of small weeds. An Edge-Aware Loss module further reinforces boundary clarity. On the Tobacco Dataset, DBFormer achieves an mIoU of 86.48%, outperforming the best baseline by 3.83 percentage points; on the Sunflower Dataset, it reaches 85.49% mIoU, an absolute gain of 4.43 points. These results demonstrate that the dual-branch synergy effectively resolves the global–local conflict, delivering superior accuracy and stability in practical agricultural applications.
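The two-branch design described above can be caricatured with a minimal NumPy sketch. Everything here is an illustrative assumption rather than the authors' implementation: the DCA-Branch is stood in for by simple average-pool downsampling and nearest-neighbor upsampling (in place of adaptive downsampling attention), and the LDE-Branch by a plain depthwise-separable convolution, with residual-style addition as the fusion step.

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """LDE-Branch stand-in: depthwise conv per channel, then a 1x1 pointwise mix.

    x: (C, H, W) feature map; dw_kernels: (C, k, k); pw_weights: (C_out, C).
    Shapes and the 'same' padding are illustrative choices, not from the paper.
    """
    C, H, W = x.shape
    k = dw_kernels.shape[1]
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    dw = np.zeros_like(x)
    for c in range(C):          # depthwise: each channel filtered independently
        for i in range(H):
            for j in range(W):
                dw[c, i, j] = np.sum(xp[c, i:i + k, j:j + k] * dw_kernels[c])
    # pointwise 1x1 convolution mixes information across channels
    return np.einsum('oc,chw->ohw', pw_weights, dw)

def dynamic_context(x, stride=2):
    """Crude DCA-Branch stand-in: pool to a coarse grid, upsample back.

    The paper's adaptive downsampling *attention* is replaced here by plain
    average pooling purely to show the coarse-context data flow.
    """
    C, H, W = x.shape
    pooled = x.reshape(C, H // stride, stride, W // stride, stride).mean(axis=(2, 4))
    return pooled.repeat(stride, axis=1).repeat(stride, axis=2)

def dual_branch_fuse(x, dw_kernels, pw_weights, stride=2):
    # Residual-style fusion (an assumption): sharpened local detail plus
    # coarse global context, matching the global-local synergy in the abstract.
    return depthwise_separable_conv(x, dw_kernels, pw_weights) + dynamic_context(x, stride)
```

With an identity depthwise kernel (center tap 1) and an identity pointwise matrix, the local branch passes features through unchanged, which makes the fusion behave as input plus smoothed context; this is only a shape-level sanity check of the data flow, not a claim about DBFormer's learned behavior.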