MB-TaylorFormer: Multi-Branch Efficient Transformer Expanded by Taylor Formula for Image Dehazing

Y Qiu, K Zhang, C Wang, W Luo… - Proceedings of the …, 2023 - openaccess.thecvf.com
Abstract
In recent years, Transformer networks have begun to replace pure convolutional neural networks (CNNs) in computer vision due to their global receptive field and adaptability to input. However, the quadratic computational complexity of softmax-attention limits its wide application to the image dehazing task, especially for high-resolution images. To address this issue, we propose a new Transformer variant that applies the Taylor expansion to approximate softmax-attention and achieves linear computational complexity. A multi-scale attention refinement module is proposed as a complement to correct the error of the Taylor expansion. Furthermore, we introduce a multi-branch architecture with multi-scale patch embedding to the proposed Transformer, which embeds features via overlapping deformable convolutions of different scales. The design of multi-scale patch embedding is based on three key ideas: 1) various sizes of the receptive field; 2) flexible shapes of the receptive field; 3) multi-level semantic information. Our model, named Multi-branch Transformer expanded by Taylor formula (MB-TaylorFormer), can embed coarse-to-fine features more flexibly at the patch-embedding stage and capture long-distance pixel interactions at limited computational cost. Experimental results on several dehazing benchmarks show that MB-TaylorFormer achieves state-of-the-art performance with a light computational burden.
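The core idea of linearizing softmax-attention via a Taylor expansion can be illustrated with a generic NumPy sketch. This is not the authors' exact formulation (the paper also adds a multi-scale attention refinement module to correct the expansion error); it only shows how the first-order approximation exp(q·k) ≈ 1 + q·k lets the N×N attention matrix be avoided by regrouping the matrix products, reducing cost from O(N²d) to O(Nd²). The L2-normalization of Q and K is an assumption here, used to keep q·k in [-1, 1] so the approximated kernel 1 + q·k stays non-negative.

```python
import numpy as np

def taylor_linear_attention(Q, K, V):
    """Linear-complexity attention via a first-order Taylor expansion.

    With exp(q . k) ~= 1 + q . k, the attention output
        out_i = sum_j exp(q_i . k_j) v_j / sum_j exp(q_i . k_j)
    becomes
        num_i = sum_j v_j + q_i @ (K^T V)     # (K^T V) is d x d_v
        den_i = N + q_i @ sum_j k_j
    so the N x N attention matrix is never formed.
    Q: (N, d), K: (N, d), V: (N, d_v).
    """
    # L2-normalize queries/keys so 1 + q . k >= 0 (an assumption of this sketch)
    Q = Q / np.linalg.norm(Q, axis=-1, keepdims=True)
    K = K / np.linalg.norm(K, axis=-1, keepdims=True)
    N = K.shape[0]
    kv = K.T @ V                       # (d, d_v), computed in O(N d d_v)
    k_sum = K.sum(axis=0)              # (d,)
    num = V.sum(axis=0) + Q @ kv       # (N, d_v)
    den = N + Q @ k_sum                # (N,)
    return num / den[:, None]
```

Because the regrouping is exact for the first-order kernel, this linear form matches a direct quadratic computation of (1 + QKᵀ)-weighted averaging; the approximation error relative to true softmax-attention is what the paper's refinement module targets.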