Optimizing Spectral Activation Functions for Spectral Convolutional Neural Networks: Balancing Efficiency and Fidelity
DOI: https://doi.org/10.11113/elektrika.v24n2.651

Keywords: activation function, memory usage, spectral convolutional neural networks, spectral activation function

Abstract
Spectral Convolutional Neural Networks (SpCNNs) are an important class of models in signal processing and machine learning, and integrating activation functions (AFs) directly in the spectral domain presents significant computational and theoretical challenges. This study systematically evaluates six spectral-domain AFs (ACReLU, SReLU, Split-tanh, Split-LReLU, PhaseReLU, and ModReLU) across multiple critical metrics, including signal quality, computational efficiency, and spectral-domain characteristics. Using the MNIST dataset and a high-performance computing infrastructure, we conducted a rigorous comparative analysis of these functions. The findings reveal substantial variation in signal-processing capability, with the newer AFs demonstrating marked improvements in signal-quality metrics. Notably, Split-tanh and Split-LReLU exhibited significant noise reduction, achieving Signal-to-Noise Ratio (SNR) and Peak Signal-to-Noise Ratio (PSNR) values approaching 0 dB and 9.47 dB, respectively, compared with traditional AFs. The study offers critical insight into the trade-off between computational efficiency and signal-transformation capability, highlighting the potential of advanced spectral-domain AFs to improve neural network performance on complex signal-processing tasks.
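The abstract does not reproduce the activation definitions, but the split-type and modulus-type functions named above are well established in the complex-valued network literature. The NumPy sketch below shows how Split-tanh, Split-LReLU, and ModReLU are commonly defined and applied to a signal's spectrum, together with simple SNR/PSNR helpers matching the metrics reported here; the bias value, leak slope, and the helper functions are illustrative assumptions, not the study's exact experimental setup.

    import numpy as np

    def split_tanh(z: np.ndarray) -> np.ndarray:
        """Split-tanh: apply tanh independently to real and imaginary parts."""
        return np.tanh(z.real) + 1j * np.tanh(z.imag)

    def split_lrelu(z: np.ndarray, alpha: float = 0.01) -> np.ndarray:
        """Split-LReLU: leaky ReLU on real and imaginary parts separately.
        The slope alpha = 0.01 is an illustrative default, not the paper's setting."""
        leaky = lambda x: np.where(x >= 0.0, x, alpha * x)
        return leaky(z.real) + 1j * leaky(z.imag)

    def modrelu(z: np.ndarray, b: float = -0.1) -> np.ndarray:
        """ModReLU (Arjovsky et al., 2016): shift the magnitude by a bias b,
        rectify it, and keep the original phase. b = -0.1 is illustrative."""
        mag = np.abs(z)
        phase = z / np.maximum(mag, 1e-12)  # guard against division by zero
        return np.maximum(mag + b, 0.0) * phase

    def snr_db(reference: np.ndarray, processed: np.ndarray) -> float:
        """Signal-to-noise ratio of `processed` against `reference`, in dB."""
        noise_power = np.sum((reference - processed) ** 2)
        return 10.0 * np.log10(np.sum(reference ** 2) / noise_power)

    def psnr_db(reference: np.ndarray, processed: np.ndarray, peak: float = 1.0) -> float:
        """Peak signal-to-noise ratio in dB, for signals scaled to [0, peak]."""
        mse = np.mean((reference - processed) ** 2)
        return 10.0 * np.log10(peak ** 2 / mse)

    # Apply a spectral activation to one MNIST-sized image (28x28, values in [0, 1]).
    rng = np.random.default_rng(0)
    x = rng.random((28, 28))
    X = np.fft.fft2(x)                      # forward transform into the spectral domain
    y = np.fft.ifft2(split_tanh(X)).real    # activate, then return to the spatial domain
    print(f"SNR:  {snr_db(x, y):6.2f} dB")
    print(f"PSNR: {psnr_db(x, y):6.2f} dB")

The split-type functions act coordinate-wise on the real and imaginary channels and so can change a coefficient's phase, whereas ModReLU rectifies only the magnitude and preserves phase, which is one source of the signal-quality differences the study measures.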
License
Copyright of articles that appear in Elektrika belongs exclusively to Penerbit Universiti Teknologi Malaysia (Penerbit UTM Press). This copyright covers the rights to reproduce the article, including reprints, electronic reproductions, or any other reproductions of similar nature.