This paper highlights that, despite improved accuracy in skin lesion classification models, distrust of AI remains a problem in the medical field: beyond high accuracy, diagnoses must be reliable and explainable. To overcome the limitations of existing explainability methods such as LIME and Class Activation Mapping (CAM), we propose the Global Class Activation Probabilistic Map Evaluation (GCAPE) method. GCAPE analyzes the activation probability maps of all classes probabilistically, pixel by pixel, to provide an integrated visualization of the diagnostic process, thereby reducing the risk of misdiagnosis. SafeML is additionally applied to detect incorrect diagnoses and to alert physicians and patients when needed, further enhancing diagnostic reliability and patient safety. The method was evaluated on the ISIC dataset using MobileNetV2 and a Vision Transformer.
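The abstract does not give GCAPE's exact formulation, but the idea of analyzing all classes' activation maps probabilistically, pixel by pixel, can be illustrated with a minimal sketch. The function name `gcape_map` and the use of a pixel-wise softmax followed by an entropy summary are assumptions for illustration, not the paper's actual algorithm:

```python
import numpy as np

def gcape_map(cams):
    """Hypothetical sketch of a GCAPE-style aggregation: given one class
    activation map per class, apply a pixel-wise softmax across classes to
    obtain per-pixel class probabilities, then summarize them into a single
    uncertainty (entropy) map.

    cams: array of shape (num_classes, H, W) with raw CAM scores.
    Returns (probs, entropy): per-pixel class probabilities with shape
    (num_classes, H, W), and an (H, W) entropy map that is high where the
    class evidence is ambiguous and low where one class dominates.
    """
    cams = np.asarray(cams, dtype=np.float64)
    # Pixel-wise softmax across the class axis (axis 0).
    z = cams - cams.max(axis=0, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    probs = e / e.sum(axis=0, keepdims=True)
    # Per-pixel Shannon entropy of the class distribution.
    entropy = -(probs * np.log(probs + 1e-12)).sum(axis=0)
    return probs, entropy

# Toy example: two classes, 2x2 maps with opposing activations.
cams = np.array([[[2.0, 0.0], [0.0, 2.0]],
                 [[0.0, 2.0], [2.0, 0.0]]])
probs, entropy = gcape_map(cams)
```

Pixels where the entropy map is high mark regions where classes compete for evidence; surfacing them in one integrated view is the kind of misdiagnosis-risk signal the abstract describes.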