
Research Article
FrequencyFormer: Oriented Object Detection with Frequency Transformer
@article{10.4108/airo.10701,
  author    = {Shuai Liu and Haiming Wang and Zhibin Li and Peiyang Wei},
  title     = {FrequencyFormer: Oriented Object Detection with Frequency Transformer},
  journal   = {EAI Endorsed Transactions on AI and Robotics},
  journal_a = {AIRO},
  volume    = {4},
  number    = {1},
  publisher = {EAI},
  year      = {2025},
  month     = {12},
  keywords  = {Transformer, Oriented object, Frequency},
  doi       = {10.4108/airo.10701}
}
Abstract
Detecting objects with oriented bounding boxes has shown impressive generalization in challenging scenes with densely packed objects at arbitrary rotations. Existing oriented object detectors rely on customized operations such as anchor pre-definition and NMS post-processing to improve accuracy. However, these components usually incur substantial computational cost and complicate the pipeline, which limits the scalability of existing methods. In this paper, we propose a new paradigm, FrequencyFormer, for end-to-end oriented object detection. Built on the Transformer-based encoder-decoder framework, two key ingredients are proposed to adapt it to detect oriented objects robustly. First, a frequency-boosted query update strategy is designed to enhance the shape encoding of object queries by incorporating the frequency vectors of oriented objects. Second, a dynamic matching strategy is introduced to facilitate training, in which the matching weights are adjusted adaptively as training progresses. Experimental results on the DOTA and HRSC2016 datasets demonstrate that our FrequencyFormer achieves performance competitive with state-of-the-art methods.
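The abstract does not give implementation details of the frequency vectors. Purely as an illustrative sketch (the function name, the DFT-based descriptor, and all parameter choices below are assumptions for illustration, not the authors' method), a compact frequency encoding of an object's shape can be obtained by applying a discrete Fourier transform to sampled contour points and keeping the low-frequency coefficients:

```python
import numpy as np

def frequency_vector(contour_xy: np.ndarray, num_coeffs: int = 8) -> np.ndarray:
    """Illustrative shape descriptor: DFT of a closed contour.

    contour_xy: (N, 2) array of boundary points, e.g. the four corners of
    an oriented bounding box (optionally densified by interpolation).
    This is a hypothetical sketch, not the FrequencyFormer descriptor.
    """
    # Treat 2-D points as complex numbers so a single 1-D DFT captures the shape.
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
    z = z - z.mean()                 # subtract centroid for translation invariance
    coeffs = np.fft.fft(z) / len(z)  # normalized DFT coefficients
    # Keep the lowest non-DC frequencies as a compact shape encoding.
    kept = coeffs[1:num_coeffs + 1]
    return np.concatenate([kept.real, kept.imag])

# Example: a unit square sampled at its four corners.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
vec = frequency_vector(square, num_coeffs=2)  # 2 complex coeffs -> 4 real values
```

Such a fixed-length vector could then be concatenated with, or added to, a query embedding; because the centroid is subtracted, the descriptor depends only on the object's shape and orientation, not its position.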
Copyright © 2025 Shuai Liu et al., licensed to EAI. This is an open access article distributed under the terms of the CC BY-NC-SA 4.0 license, which permits copying, redistributing, remixing, transforming, and building upon the material in any medium, so long as the original work is properly cited.


