In this paper, we propose ChatSR, a novel symbolic regression method that leverages the knowledge and language-understanding capabilities of multimodal large language models to guide formula generation with prior knowledge expressed in natural language. Unlike existing symbolic regression methods, which generate formulas directly from observed data, ChatSR understands and exploits natural-language prior knowledge to improve the quality of the generated formulas. Experiments on 13 datasets show that ChatSR achieves state-of-the-art performance on standard symbolic regression tasks and, notably, exhibits zero-shot capability with respect to prior knowledge not present in the training data.