ZetA is a novel deep learning optimization algorithm that integrates dynamic scaling based on the Riemann zeta function into Adam. It improves generalization and robustness through a hybrid update mechanism that combines adaptive decay, cosine-similarity-based momentum boosting, an entropy-regularized loss, and Sharpness-Aware Minimization (SAM)-style perturbations. In experiments with a lightweight fully-connected network trained for 5 epochs under mixed-precision settings, ZetA achieves higher test accuracy than Adam on SVHN, CIFAR10, CIFAR100, STL10, and noisy CIFAR10. These results suggest ZetA is a computationally efficient and robust alternative to Adam, especially for noisy or high-dimensional classification tasks.
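
The abstract does not give the exact update rule, so the following is only a minimal sketch of how a zeta-function-based step-size scale and a cosine-similarity momentum boost could be layered on a standard Adam step. The class name `ZetaAdamSketch`, the parameter `s`, the boost formula, and the decay schedule are assumptions for illustration; the entropy-regularized loss and the SAM-style perturbation belong to the training loop and are not shown here.

```python
# Illustrative sketch only: zeta-based scaling and cosine-similarity boosting
# are hypothetical placeholders on top of a standard Adam update.
import math
import torch
import torch.nn.functional as F
from torch.optim import Optimizer


class ZetaAdamSketch(Optimizer):
    """Adam variant with an assumed zeta-function step-size scale and an
    assumed cosine-similarity momentum boost (not the official ZetA rule)."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8, s=2.0):
        defaults = dict(lr=lr, betas=betas, eps=eps, s=s)
        super().__init__(params, defaults)

    @staticmethod
    def _zeta(s, terms=64):
        # Truncated Riemann zeta series: zeta(s) = sum_{n>=1} n^(-s), s > 1.
        return sum(n ** (-s) for n in range(1, terms + 1))

    @torch.no_grad()
    def step(self, closure=None):
        loss = closure() if closure is not None else None
        for group in self.param_groups:
            beta1, beta2 = group["betas"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                grad = p.grad
                state = self.state[p]
                if not state:
                    state["step"] = 0
                    state["exp_avg"] = torch.zeros_like(p)
                    state["exp_avg_sq"] = torch.zeros_like(p)
                state["step"] += 1
                t = state["step"]
                m, v = state["exp_avg"], state["exp_avg_sq"]

                # Assumed cosine-similarity momentum boost: enlarge the step
                # when the current gradient aligns with the running momentum.
                cos = F.cosine_similarity(grad.flatten(), m.flatten(),
                                          dim=0, eps=group["eps"])
                boost = 1.0 + 0.5 * float(cos.clamp(min=0.0))

                # Standard Adam moment estimates with bias correction.
                m.mul_(beta1).add_(grad, alpha=1 - beta1)
                v.mul_(beta2).addcmul_(grad, grad, value=1 - beta2)
                m_hat = m / (1 - beta1 ** t)
                v_hat = v / (1 - beta2 ** t)

                # Assumed zeta-based dynamic scaling: the step size decays
                # smoothly with the step count via zeta(s + log t).
                scale = (self._zeta(group["s"] + math.log(t))
                         / self._zeta(group["s"]))

                p.addcdiv_(m_hat, v_hat.sqrt().add_(group["eps"]),
                           value=-group["lr"] * scale * boost)
        return loss
```

Under these assumptions the optimizer drops in wherever `torch.optim.Adam` is used, e.g. `opt = ZetaAdamSketch(model.parameters(), lr=1e-3)` followed by the usual `loss.backward(); opt.step(); opt.zero_grad()` loop.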