This paper presents a novel method for improving marine visibility by fusing real-time image data with nautical chart information. The system overlays nautical chart data on the live image feed by detecting navigational aids such as buoys and matching their image representations with the corresponding chart entries. To ensure a strong correspondence, we introduce a transformer-based end-to-end neural network that predicts bounding boxes and confidence scores for buoy queries, directly associating image-space detections with nautical chart markers in the world coordinate system. The proposed method is compared against two baselines: a ray-tracing model that estimates buoy positions via camera projections, and an extended YOLOv7-based network with a distance-estimation module. Experimental results on a real-world maritime scene dataset demonstrate that the proposed method significantly improves object localization and association accuracy in dynamic and challenging environments.