In this paper, we propose a novel attention mechanism, the Multipole Attention Neural Operator (MANO), which leverages the multipole method to overcome a key limitation of existing Transformers: their difficulty in processing high-resolution inputs. By reframing attention as an interaction problem between grid points, MANO computes attention in a distance-based, multi-scale manner. This yields linear time and memory complexity in the number of grid points while preserving a global receptive field in every attention head. Experimental results on image classification and Darcy flow show that MANO performs comparably to state-of-the-art models such as ViT and the Swin Transformer, while reducing running time and peak memory usage by several orders of magnitude. Our code is publicly available.
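To make the multi-scale idea concrete, the following is a minimal two-level sketch of distance-based attention on a regular 1D grid, written in PyTorch. It is an illustration under our own assumptions, not the MANO implementation: the function name `multiscale_attention`, the `window` and `n_coarse` parameters, and the use of a single fixed coarse level are hypothetical. Each query attends exactly to its near-field neighbors and, through an average-pooled coarse grid, to a compressed far field, so the cost is O(N·(window + n_coarse)) rather than O(N²); the actual method builds a full multipole-style hierarchy of levels over the grid points.

```python
import torch
import torch.nn.functional as F

def multiscale_attention(q, k, v, window=8, n_coarse=16):
    """Two-level, multipole-style attention sketch for a 1D grid (illustrative only).

    q, k, v: (batch, n_points, dim) tensors on a regular 1D grid.
    Near field: each query attends to keys inside a local window.
    Far field: each query attends to a fixed-size coarse summary of all
    keys/values, obtained by average pooling along the grid.
    Cost is O(n_points * (window + n_coarse)), i.e. linear in n_points.
    """
    b, n, d = q.shape
    scale = d ** -0.5

    # Far field: pool keys/values onto a coarse grid of n_coarse cells.
    k_c = F.adaptive_avg_pool1d(k.transpose(1, 2), n_coarse).transpose(1, 2)  # (b, n_coarse, d)
    v_c = F.adaptive_avg_pool1d(v.transpose(1, 2), n_coarse).transpose(1, 2)

    # Near field: gather a local window of keys/values around every query.
    pad = window // 2
    k_pad = F.pad(k.transpose(1, 2), (pad, pad)).transpose(1, 2)  # (b, n + 2*pad, d)
    v_pad = F.pad(v.transpose(1, 2), (pad, pad)).transpose(1, 2)
    idx = (torch.arange(n, device=q.device).unsqueeze(1)
           + torch.arange(window, device=q.device).unsqueeze(0))   # (n, window)
    k_loc = k_pad[:, idx]                                          # (b, n, window, d)
    v_loc = v_pad[:, idx]

    # Joint softmax over local (near-field) and coarse (far-field) keys.
    logits_loc = torch.einsum('bnd,bnwd->bnw', q, k_loc) * scale   # (b, n, window)
    logits_far = torch.einsum('bnd,bmd->bnm', q, k_c) * scale      # (b, n, n_coarse)
    attn = torch.softmax(torch.cat([logits_loc, logits_far], dim=-1), dim=-1)
    a_loc, a_far = attn[..., :window], attn[..., window:]

    # Combine near-field and far-field contributions.
    return (torch.einsum('bnw,bnwd->bnd', a_loc, v_loc)
            + torch.einsum('bnm,bmd->bnd', a_far, v_c))

# Example usage: 1024 grid points, 64-dim features, linear-cost attention.
q = k = v = torch.randn(2, 1024, 64)
out = multiscale_attention(q, k, v)  # (2, 1024, 64)
```

Recursing this near/far split over progressively coarser grids is what gives the multipole construction its hierarchical, distance-dependent resolution; the two-level version above is only the simplest instance of that pattern.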