In this paper, we propose SegQuant, a novel quantization framework based on Post-Training Quantization (PTQ) that can be applied to pre-trained models without retraining, addressing the high computational cost of deploying diffusion models. SegQuant combines two techniques: SegLinear, a strategy that captures the structural semantics and spatial heterogeneity of the model, and DualScale, which preserves the polarity-asymmetric activations that are critical to the visual fidelity of generated outputs. SegQuant generalizes across model architectures, overcoming the architecture-specific limitations of existing PTQ methods, while remaining compatible with mainstream deployment tools and improving performance.
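To make the dual-scale idea concrete, the sketch below fake-quantizes the positive and negative halves of an activation tensor with separate scale factors, so a polarity-asymmetric distribution (e.g., post-SiLU activations, bounded below but long-tailed above) is not flattened by a single symmetric scale. This is a minimal illustrative sketch under assumed design choices (function name, signed 8-bit range, per-tensor scales); it is not the paper's actual DualScale implementation.

```python
# Hypothetical dual-scale fake-quantizer: each polarity gets its own scale.
import torch

def dual_scale_quantize(x: torch.Tensor, n_bits: int = 8) -> torch.Tensor:
    """Fake-quantize x with separate scales for positive and negative values."""
    qmax = 2 ** (n_bits - 1) - 1  # e.g., 127 for 8-bit

    # Independent scale per polarity (assumed per-tensor max calibration).
    scale_pos = x.clamp(min=0).max() / qmax
    scale_neg = x.clamp(max=0).min().abs() / qmax

    # Guard against division by zero when one half is empty.
    scale_pos = scale_pos.clamp(min=1e-8)
    scale_neg = scale_neg.clamp(min=1e-8)

    # Quantize and dequantize each half with its own scale, then recombine.
    pos = torch.round(x.clamp(min=0) / scale_pos).clamp(0, qmax) * scale_pos
    neg = torch.round(x.clamp(max=0) / scale_neg).clamp(-qmax, 0) * scale_neg
    return pos + neg

# Example: a skewed, SiLU-like activation distribution.
x = torch.nn.functional.silu(torch.randn(1024) * 3)
x_q = dual_scale_quantize(x)
print((x - x_q).abs().max())  # error stays small on both polarities
```

With a single symmetric scale, the small negative range of such activations would be quantized with the same coarse step size as the large positive range; giving each polarity its own scale preserves the negative tail, which is the asymmetry the paragraph above describes DualScale as protecting.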