Himanshu Singh
This repository accompanies the research paper "Machine Learning Application of Generalized Gaussian Radial Basis Function and Its Reproducing Kernel Theory," published in Mathematics (MDPI), 2024.
The goal of this project is to investigate a Generalized Gaussian Radial Basis Function (GGRBF) kernel and demonstrate how it can be used across several machine learning architectures.
The work bridges mathematical function theory with modern AI models by studying the reproducing kernel Hilbert space (RKHS) generated by the kernel and its use in practical learning algorithms.
Gaussian radial basis functions are among the most widely used kernels in machine learning due to their strong empirical performance and theoretical properties.
However, little work has explored generalized exponential radial kernels in learning systems.
This research investigates:
• a generalized Gaussian radial basis function kernel
• the Hilbert space generated by the kernel
• applications in machine learning architectures
In several experiments, the resulting kernel demonstrates improved classification accuracy and lower misclassification error compared to the standard Gaussian RBF kernel and to sigmoid and ReLU activation functions.
Let r = ||x − y|| denote the distance between two inputs. One convenient parameterization of the generalized Gaussian RBF kernel (written here with a single shape parameter ε and an additional parameter β; the paper's exact form may use separate shape parameters) is

K(x, y) = exp(−ε² r²) · exp(β (exp(−ε² r²) − 1)).

This formulation introduces an additional exponential structure that extends the classical Gaussian RBF kernel. When the additional parameter β vanishes, the kernel reduces to the standard Gaussian RBF kernel.
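As a concrete sketch, the kernel can be implemented in a few lines of NumPy. The parameterization below (shape parameter eps, extra parameter beta, with beta = 0 recovering the plain Gaussian RBF) is an assumption for illustration and may differ from the paper's exact definition:

```python
import numpy as np

def ggrbf(x, y, eps=1.0, beta=0.5):
    """Illustrative GGRBF: the Gaussian factor times an extra exponential
    term controlled by beta; beta = 0 recovers the standard Gaussian RBF.
    Parameter names and the exact form are assumptions for illustration."""
    g = np.exp(-eps**2 * np.sum((np.asarray(x) - np.asarray(y)) ** 2))
    return g * np.exp(beta * (g - 1.0))

print(ggrbf([0.0, 0.0], [0.0, 0.0]))  # identical inputs -> kernel value 1.0
```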
The paper establishes several theoretical results for the kernel:
• characterization of the Hilbert function space generated by the kernel
• construction of the reproducing kernel Hilbert space (RKHS)
• analysis of orthonormal bases
• discussion of universality for machine learning applications
These results provide a mathematical foundation for using the kernel in learning algorithms.
The repository demonstrates how the kernel can be applied in several machine learning settings.
The GGRBF can be used as a custom kernel in support vector machine (SVM) classification.
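A minimal sketch of plugging such a kernel into scikit-learn's SVC, which accepts a callable that maps two data matrices to a Gram matrix. The ggrbf_gram parameterization below is an illustrative assumption, not the paper's exact form:

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

def ggrbf_gram(X, Y, eps=1.0, beta=0.5):
    """Gram matrix for an illustrative GGRBF parameterization
    (beta = 0 recovers the standard Gaussian RBF)."""
    # Pairwise squared distances, clipped at zero for numerical safety.
    d2 = np.maximum(
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2.0 * X @ Y.T,
        0.0,
    )
    g = np.exp(-eps**2 * d2)
    return g * np.exp(beta * (g - 1.0))

X, y = make_moons(n_samples=200, noise=0.1, random_state=0)
clf = SVC(kernel=ggrbf_gram).fit(X, y)
print(clf.score(X, y))
```

Because SVC only requires the Gram matrix, any positive definite variant of the kernel can be swapped in without touching the rest of the pipeline.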
The kernel provides a flexible basis for nonlinear regression.
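For regression, scikit-learn's KernelRidge accepts a callable evaluated on each pair of samples. A minimal sketch fitting noisy sine data, again with an assumed GGRBF parameterization:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def ggrbf(x, y, eps=1.0, beta=0.5):
    # Illustrative GGRBF value for a single pair of samples
    # (beta = 0 recovers the standard Gaussian RBF).
    g = np.exp(-eps**2 * np.sum((x - y) ** 2))
    return g * np.exp(beta * (g - 1.0))

# Noisy samples of sin(x) on [0, 2*pi].
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 80))[:, None]
y = np.sin(X).ravel() + 0.05 * rng.normal(size=80)

# KernelRidge calls the kernel on each pair of rows of X.
model = KernelRidge(kernel=ggrbf, alpha=1e-3).fit(X, y)
mse = np.mean((model.predict(X) - np.sin(X).ravel()) ** 2)
print(mse)
```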
The generalized radial basis function can be used as an activation function inside neural architectures.
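As an activation, the radial profile is applied elementwise to pre-activations. A minimal NumPy forward pass for a one-hidden-layer network is sketched below; the activation's exact form is an assumption (the paper's version may differ), with beta = 0 recovering a plain Gaussian bump:

```python
import numpy as np

def ggrbf_activation(z, beta=0.5):
    # GGRBF-style activation: a Gaussian bump times an extra exponential
    # factor; beta = 0 recovers exp(-z**2). Illustrative parameterization.
    g = np.exp(-(z**2))
    return g * np.exp(beta * (g - 1.0))

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def forward(X):
    # One hidden layer with the GGRBF activation in place of ReLU/sigmoid.
    h = ggrbf_activation(X @ W1 + b1)
    return h @ W2 + b2

out = forward(rng.normal(size=(4, 2)))
print(out.shape)  # -> (4, 1)
```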
Experiments in the paper show that the generalized kernel can outperform several traditional activation functions in classification tasks.
Several interesting theoretical directions remain open:
• spectral decomposition of the generalized kernel
• Mercer eigenfunction analysis
• connections with Hermite polynomial expansions
• scalability in deep learning architectures
These questions form part of ongoing research.
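While the spectral questions above remain open analytically, the Mercer spectrum can be probed numerically: the eigenvalues of the Gram matrix scaled by 1/n approximate the kernel's Mercer eigenvalues under the sampling distribution (a Nystrom-style approximation). A sketch, with the kernel parameterization assumed for illustration:

```python
import numpy as np

def ggrbf_gram(X, eps=1.0, beta=0.5):
    # Illustrative GGRBF Gram matrix (beta = 0 gives the Gaussian RBF).
    s = np.sum(X**2, axis=1)
    d2 = np.maximum(s[:, None] + s[None, :] - 2.0 * X @ X.T, 0.0)
    g = np.exp(-eps**2 * d2)
    return g * np.exp(beta * (g - 1.0))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(300, 1))
K = ggrbf_gram(X)

# Eigenvalues of K / n, sorted in decreasing order, approximate the
# Mercer eigenvalues of the kernel under the uniform sampling measure.
eigvals = np.linalg.eigvalsh(K / len(X))[::-1]
print(eigvals[:5])
```

The rapid decay of these eigenvalues is one empirical window into the smoothness of the associated RKHS.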
Himanshu Singh
Research interests:
• scientific machine learning
• kernel methods
• sparse learning
• operator learning
• mechanistic interpretability