In this paper, we present a method for learning group-invariant functions by learning a quadratic form $x^T A x$ corresponding to a (known or unknown) group from data. We exploit the assumption that the underlying symmetry group is orthogonal, i.e., that it preserves the learned quadratic form. Using the corresponding symmetric matrix and its eigendecomposition, we develop a simple and efficient model by incorporating appropriate inductive biases into the neural network architecture. This approach yields models that are invariant under norm-preserving group actions, expressed as the pointwise product of a component that depends only on the norm and a scale-invariant component. Furthermore, we extend this approach to the more general setting of diagonal (or multiplicative) group actions on tuples of input vectors. In this extension, the invariant function is decomposed into angular components extracted from the normalized first vector and scale-invariant components that depend on the entire Gram matrix of the tuple. This decomposition captures interdependencies between multiple inputs while preserving the underlying group symmetry. We evaluate the effectiveness of our framework on several tasks, including polynomial regression, top-quark tagging, and moment-of-inertia matrix prediction. Comparative analysis demonstrates that our model consistently outperforms baseline methods in discovering underlying symmetries and efficiently learning the corresponding invariant functions.
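The following is a minimal sketch of the two decompositions described above, under our own assumptions: the learned matrix `A`, the components `g`, `h`, `phi`, and `psi`, and all function names are illustrative placeholders rather than the paper's actual parameterization.

```python
import numpy as np

def learned_norm(A, x):
    # Norm induced by a learned symmetric matrix A: ||x||_A = sqrt(x^T A x).
    # Assumes A is positive definite so the square root is real and nonzero.
    return np.sqrt(x @ A @ x)

def invariant_model(A, g, h, x):
    # Single-vector case: the product of a component that depends only on the
    # learned norm and a scale-invariant component of the normalized input.
    r = learned_norm(A, x)
    return g(r) * h(x / r)

def tuple_invariant_model(phi, psi, xs):
    # Tuple case: an angular component extracted from the normalized first
    # vector, multiplied by a component that depends on the full Gram matrix.
    X = np.stack(xs)                  # shape (k, n): k input vectors
    x1 = X[0] / np.linalg.norm(X[0])  # normalized first vector
    gram = X @ X.T                    # Gram matrix G_ij = x_i . x_j
    return phi(x1) * psi(gram)

# Toy usage with hypothetical components g, h, phi, psi:
A = np.eye(3)
x = np.array([1.0, 2.0, 2.0])
print(invariant_model(A, g=np.exp, h=lambda u: u.sum(), x=x))
print(tuple_invariant_model(phi=lambda u: u[0], psi=np.trace, xs=[x, x + 1.0]))
```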