This paper presents WeatherEdit, a novel weather editing pipeline that generates realistic weather effects in 3D scenes with controllable types and intensities. WeatherEdit consists of two main components: weather background editing and weather particle generation. For weather background editing, we introduce an all-in-one adapter that integrates multiple weather styles into a single pre-trained diffusion model, enabling diverse weather effects to be generated on 2D image backgrounds. During inference, we design a temporal-view (TV) attention mechanism that aggregates temporal and spatial information in a specific order, ensuring consistent editing across multi-frame and multi-view images. To generate weather particles, we first reconstruct the 3D scene from the edited images and then introduce a dynamic 4D Gaussian field to generate snow, rain, and fog. The properties and dynamics of these particles are precisely controlled through physics-based modeling and simulation, ensuring realistic weather representation and flexible intensity adjustment. Finally, we integrate the 4D Gaussian field with the 3D scene to render consistent and highly realistic weather effects. Experiments on multiple driving datasets demonstrate that WeatherEdit generates a variety of weather effects with controllable condition intensities, highlighting its potential for simulating autonomous driving in severe weather.
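To illustrate the kind of physics-based particle dynamics the abstract describes, the sketch below simulates falling weather particles (e.g., rain or snow) under gravity, air drag, and wind, with an intensity parameter controlling particle count. This is a minimal, hypothetical illustration, not the paper's implementation: the function name `simulate_particles` and all parameter values are assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical sketch (not the paper's code): advance falling weather
# particles under gravity, linear air drag, and a constant wind field.
# The particle count stands in for the controllable weather intensity.
def simulate_particles(n_particles, steps, dt=0.02, gravity=9.8,
                       drag=0.5, wind=(1.0, 0.0, 0.0), seed=0):
    rng = np.random.default_rng(seed)
    wind = np.asarray(wind, dtype=float)
    # Spawn particles uniformly in a 20 m x 20 m x 20 m volume.
    pos = rng.uniform([-10, -10, 0], [10, 10, 20], size=(n_particles, 3))
    vel = np.zeros((n_particles, 3))
    for _ in range(steps):
        # Gravity pulls down; drag relaxes velocity toward the wind field.
        acc = np.array([0.0, 0.0, -gravity]) + drag * (wind - vel)
        vel = vel + acc * dt
        pos = pos + vel * dt
        # Respawn particles that fell below the ground plane (z = 0)
        # at the top of the volume, keeping particle density steady.
        below = pos[:, 2] < 0.0
        pos[below, 2] += 20.0
    return pos

positions = simulate_particles(n_particles=1000, steps=50)
print(positions.shape)  # (1000, 3)
```

Heavier particles (rain) reach higher terminal velocity than light ones (snow); in this sketch that trade-off is governed by the `gravity`/`drag` ratio, which is one simple way such properties could be exposed as controllable parameters.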