Daily Arxiv

This page curates AI-related papers published worldwide.
All content here is summarized using Google Gemini, and the site is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

APTx Neuron: A Unified Trainable Neuron Architecture Integrating Activation and Computation

Created by
  • Haebom

Author

Ravin Kumar

Outline

In this paper, we propose the APTx neuron, a novel unified neural computational unit that integrates nonlinear activation and linear transformation into a single learnable expression. The APTx neuron is derived from the APTx activation function and requires no separate activation layer, yielding an architecture that is both computationally efficient and elegant. The proposed neuron has the functional form $y = \sum_{i=1}^{n} ((\alpha_i + \tanh(\beta_i x_i)) \cdot \gamma_i x_i) + \delta$, where all parameters $\alpha_i$, $\beta_i$, $\gamma_i$, and $\delta$ are learnable. We validate an APTx neuron-based architecture on the MNIST dataset, achieving a test accuracy of up to 96.69% in only 20 epochs with about 332K learnable parameters. These results highlight the superior expressive power and computational efficiency of APTx neurons compared to conventional neurons, and suggest a new paradigm for unified neuron design and for architectures built on it.
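For concreteness, here is a minimal sketch of the APTx neuron as a PyTorch layer, vectorizing the functional form quoted above. The class name `APTxLayer`, the parameter shapes, and the initialization are illustrative assumptions, not the author's reference implementation.

```python
import torch
import torch.nn as nn

class APTxLayer(nn.Module):
    """A layer of APTx neurons (assumed vectorization of the paper's formula):
    y_j = sum_i (alpha_ji + tanh(beta_ji * x_i)) * gamma_ji * x_i + delta_j
    """
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # One learnable (alpha, beta, gamma) triple per input connection of
        # each neuron, plus one bias delta per neuron. Initialization values
        # here are assumptions, not taken from the paper.
        self.alpha = nn.Parameter(torch.ones(out_features, in_features))
        self.beta = nn.Parameter(torch.ones(out_features, in_features))
        self.gamma = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.delta = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features); broadcast against (out, in) parameters.
        x = x.unsqueeze(1)  # (batch, 1, in_features)
        terms = (self.alpha + torch.tanh(self.beta * x)) * self.gamma * x
        return terms.sum(dim=-1) + self.delta  # (batch, out_features)

# Hypothetical MNIST classifier: no separate activation layers are needed,
# since the nonlinearity lives inside each neuron.
model = nn.Sequential(nn.Flatten(), APTxLayer(784, 128), APTxLayer(128, 10))
```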

Takeaways, Limitations

Takeaways:
  • Presents a novel neuron architecture that integrates nonlinear activation and linear transformation, improving computational efficiency.
  • Demonstrates, through experiments on the MNIST dataset, superior expressive power and accuracy compared to conventional neurons.
  • Suggests a new paradigm for unified neuron design.
Limitations:
  • Validated only on the MNIST dataset; performance on other, more complex datasets still needs to be verified.
  • Further study is needed on the generalization performance of the proposed APTx neurons.
  • Deeper comparative analysis against other neuron architectures should be conducted.