This paper comprehensively analyzes and categorizes existing methods for addressing the oversmoothing problem in graph neural networks (GNNs). Oversmoothing refers to the phenomenon in which node embeddings become increasingly similar as GNN layers deepen, making nodes difficult to distinguish from one another. We present existing oversmoothing mitigation techniques from a unified perspective called ATNPA, which consists of five core steps: Augmentation, Transformation, Normalization, Propagation, and Aggregation. In addition, we propose a taxonomy that organizes approaches to oversmoothing into three themes and classifies existing methods into six categories, analyzing in detail each method's relationship to ATNPA as well as its strengths and weaknesses, thereby providing an in-depth understanding of existing research and suggesting future research directions.
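The oversmoothing phenomenon described above can be sketched numerically. The toy example below (not from the paper; the graph, features, and spread measure are illustrative assumptions) repeatedly applies a GCN-style propagation step without learned weights and shows that the node embeddings collapse toward a single shared vector:

```python
import numpy as np

# Toy 4-node path graph with self-loops added (A_hat = A + I),
# as in standard GCN propagation.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)

# Random-walk normalization: P = D^{-1} A_hat (rows sum to 1).
P = A_hat / A_hat.sum(axis=1, keepdims=True)

# Distinct initial node features (2-dimensional, chosen arbitrarily).
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.5]])

def spread(H):
    """Max distance of any node embedding from the mean embedding."""
    return np.max(np.linalg.norm(H - H.mean(axis=0), axis=1))

# Apply 50 propagation layers (pure aggregation, no transformation),
# mimicking a very deep GNN.
H = X
for _ in range(50):
    H = P @ H

# The spread of the embeddings shrinks dramatically: the nodes have
# become nearly indistinguishable, which is exactly oversmoothing.
print(spread(X), spread(H))
```

Because the propagation matrix is row-stochastic on a connected graph with self-loops, its repeated application drives every row of `H` toward the same limit vector, which is why deep stacks of plain aggregation layers lose discriminative power.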