To address the unstable learning dynamics and mode-dropping issues of existing implicit generative models, this paper proposes Pareto-ISL, an extension of the invariant statistical loss (ISL) method that accurately models the tails of a distribution along with its central features. To overcome the restriction of existing ISL to one-dimensional data, we propose a generator based on the Generalized Pareto Distribution (GPD) and a novel loss function that handles multidimensional data through random projections. Experiments demonstrate its performance in multidimensional generative modeling and its potential as a pre-training technique for GANs to prevent mode collapse. In particular, we focus on effectively handling the heavy-tailed distributions encountered in real-world phenomena.
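To make the two ingredients named above concrete, the following is a minimal, self-contained sketch (not the paper's actual architecture or loss): it assumes a GPD-tailed sampler built on `scipy.stats.genpareto`, reduces multidimensional samples to one-dimensional marginals via random projections, and uses a simplified rank-uniformity discrepancy as a stand-in for the ISL objective. All function names and parameter values are hypothetical illustrations.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

def gpd_tail_sampler(n, xi=0.3, sigma=1.0, loc=0.0):
    """Draw samples from a Generalized Pareto Distribution (heavy-tailed for xi > 0)."""
    return genpareto.rvs(c=xi, loc=loc, scale=sigma, size=n, random_state=rng)

def random_projections(x, n_proj=10):
    """Project d-dimensional samples onto n_proj random unit directions -> 1D marginals."""
    d = x.shape[1]
    dirs = rng.standard_normal((n_proj, d))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return x @ dirs.T  # shape: (n_samples, n_proj)

def rank_uniformity_loss(real_1d, fake_1d, N=20):
    """Surrogate 1D discrepancy: for each real point, count how many of N fake
    samples fall below it; if the generator matches the data, this rank statistic
    is (approximately) uniform on {0, ..., N} -- the invariance that ISL exploits.
    This L1 deviation from uniformity is a simplified stand-in, not the paper's loss."""
    ranks = []
    for y in real_1d:
        fakes = rng.choice(fake_1d, size=N, replace=True)
        ranks.append(int((fakes < y).sum()))
    hist = np.bincount(ranks, minlength=N + 1) / len(ranks)
    target = np.full(N + 1, 1.0 / (N + 1))
    return np.abs(hist - target).sum()

# Toy usage: heavy-tailed 2D data versus a GPD-tailed "generator" with a lighter tail.
real = np.column_stack([gpd_tail_sampler(1000), gpd_tail_sampler(1000)])
fake = np.column_stack([gpd_tail_sampler(1000, xi=0.1), gpd_tail_sampler(1000, xi=0.1)])

real_proj = random_projections(real)
fake_proj = random_projections(fake)
loss = np.mean([rank_uniformity_loss(real_proj[:, k], fake_proj[:, k])
                for k in range(real_proj.shape[1])])
print(f"average projected rank-uniformity discrepancy: {loss:.4f}")
```

Averaging the one-dimensional discrepancy over random projections is what lets a univariate criterion act on multidimensional samples; in the full method this quantity would be differentiable and minimized over the generator's parameters.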