This paper presents NetGPT, a pre-trained model for effectively modeling Internet network traffic. While pre-training has proven highly successful in natural language processing, comparable efforts have been lacking in the network traffic domain. NetGPT transforms diverse network traffic into unified text inputs, supporting both traffic understanding and traffic generation tasks. It improves the adaptability of the pre-trained model to downstream tasks through techniques such as header field shuffling, packet segmentation, and the incorporation of task labels into prompts. Experimental results on diverse traffic datasets, including cryptographic software, DNS, proprietary industry protocols, and cryptocurrency mining, demonstrate that NetGPT significantly outperforms the best existing models on both traffic understanding and traffic generation tasks.