Generalized Gaussian Radial Basis Function for Artificial Intelligence

Mathematical Foundations and Machine Learning Applications

Himanshu Singh



Overview

This repository accompanies the research paper "Machine Learning Application of Generalized Gaussian Radial Basis Function and Its Reproducing Kernel Theory", published in Mathematics (MDPI), 2024.

The goal of this project is to investigate a Generalized Gaussian Radial Basis Function (GGRBF) kernel and demonstrate how it can be used across several machine learning architectures.

The work bridges mathematical function theory with modern AI models by studying the reproducing kernel Hilbert space (RKHS) generated by the kernel and its use in practical learning algorithms.


Research Motivation

Gaussian radial basis functions are among the most widely used kernels in machine learning due to their strong empirical performance and theoretical properties.

However, little work has explored generalized exponential radial kernels in learning systems.

This research investigates:

• a generalized Gaussian radial basis function kernel
• the Hilbert space generated by the kernel
• applications in machine learning architectures

In several experiments, the resulting kernel achieves higher classification accuracy and lower misclassification error than the standard Gaussian RBF kernel and the sigmoid and ReLU activation functions.


The Generalized Gaussian RBF Kernel

Let $x,y \in \mathbb{R}^d$. The generalized Gaussian radial basis function kernel is defined as

$$ K(x,y) = \exp\!\left(-\alpha\|x-y\|^2\right)\exp\!\left(-\beta\, e^{-\|x-y\|^2}\right), \qquad \alpha > 0,\ \beta \geq 0. $$

This formulation introduces an additional exponential factor that extends the classical Gaussian RBF kernel. When the additional parameter $\beta$ vanishes, the kernel reduces to the standard Gaussian RBF kernel.
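A minimal Python sketch of the kernel (the repository's own code is MATLAB; the points and parameter values here are illustrative, not from the paper) that also checks the reduction to the standard Gaussian RBF at β = 0:

```python
import numpy as np

def ggrbf_kernel(x, y, alpha=1.0, beta=1.0):
    """GGRBF kernel K(x, y) = exp(-alpha*r2) * exp(-beta*exp(-r2)), r2 = ||x - y||^2."""
    r2 = np.sum((np.asarray(x) - np.asarray(y)) ** 2)
    return np.exp(-alpha * r2) * np.exp(-beta * np.exp(-r2))

x = np.array([0.0, 1.0])
y = np.array([1.0, 0.0])

# With beta = 0 the second factor is 1, so the kernel collapses to exp(-alpha*||x-y||^2).
gaussian = np.exp(-1.0 * np.sum((x - y) ** 2))
assert np.isclose(ggrbf_kernel(x, y, alpha=1.0, beta=0.0), gaussian)
```

At $r = 0$ the kernel evaluates to $e^{-\beta}$ rather than 1, which is one visible difference from the classical Gaussian RBF.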

Kernel Preview


Mathematical Contributions

The paper establishes several theoretical results for the kernel:

• characterization of the Hilbert function space generated by the kernel
• construction of the reproducing kernel Hilbert space (RKHS)
• analysis of orthonormal bases
• discussion of universality for machine learning applications

These results provide a mathematical foundation for using the kernel in learning algorithms.
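For reference, the reproducing property that any RKHS construction guarantees can be stated as follows (a standard RKHS fact, not a result specific to this paper):

$$ f(x) = \langle f, K(\cdot, x) \rangle_{\mathcal{H}_K} \quad \text{for all } f \in \mathcal{H}_K,\ x \in \mathbb{R}^d, $$

where $\mathcal{H}_K$ is the closure of $\operatorname{span}\{K(\cdot, y) : y \in \mathbb{R}^d\}$ under the inner product induced by $K$.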


Machine Learning Applications

The repository demonstrates how the kernel can be applied in several machine learning settings.

Support Vector Machines

The GGRBF kernel can be used as a kernel function in SVM classification.
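As a sketch of this usage, the snippet below plugs a GGRBF Gram matrix into scikit-learn's `SVC` as a callable kernel. This is a Python illustration (the repository's code is MATLAB), and the dataset and the parameter values α = 2, β = 0.5 are illustrative assumptions, not choices from the paper:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

def ggrbf_gram(X, Y, alpha=1.0, beta=0.5):
    """Gram matrix of the GGRBF kernel between rows of X and rows of Y."""
    # Pairwise squared Euclidean distances, clipped at 0 against round-off.
    r2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    r2 = np.maximum(r2, 0.0)
    return np.exp(-alpha * r2) * np.exp(-beta * np.exp(-r2))

X, y = make_moons(n_samples=200, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# scikit-learn calls the kernel with two data matrices and expects the Gram matrix back.
clf = SVC(kernel=lambda A, B: ggrbf_gram(A, B, alpha=2.0, beta=0.5))
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```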

Kernel Regression

The kernel provides a flexible basis for nonlinear regression.
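A minimal kernel ridge regression sketch with the GGRBF, again in Python with illustrative data; the regularization strength and the values of α and β below are assumptions, not values from the paper:

```python
import numpy as np

def ggrbf_gram(X, Y, alpha=1.0, beta=0.5):
    """Gram matrix of the GGRBF kernel between rows of X and rows of Y."""
    r2 = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    r2 = np.maximum(r2, 0.0)
    return np.exp(-alpha * r2) * np.exp(-beta * np.exp(-r2))

# Noisy samples of a sine curve (toy data for illustration only).
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 60)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

lam = 1e-2                                          # ridge regularization strength
K = ggrbf_gram(X, X)
coef = np.linalg.solve(K + lam * np.eye(60), y)     # solve (K + lam*I) c = y

x_new = np.array([[0.5]])
y_pred = float(ggrbf_gram(x_new, X) @ coef)         # prediction f(x) = sum_i c_i K(x, x_i)
print(f"prediction at x=0.5: {y_pred:.3f} (true sin(0.5) = {np.sin(0.5):.3f})")
```

The predictor is a linear combination of kernel sections $K(\cdot, x_i)$ centered at the training points, which is exactly the representer-theorem form for kernel regression.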

Neural Networks

The generalized radial basis function can be used as an activation function inside neural architectures.

Experiments in the paper show that the generalized kernel can outperform several traditional activation functions in classification tasks.
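As a sketch of the activation-function use, the GGRBF can be applied elementwise to pre-activations. The toy two-layer forward pass below uses random (hypothetical) weights purely to illustrate the mechanics:

```python
import numpy as np

def ggrbf_activation(z, alpha=1.0, beta=1.0):
    # Elementwise radial activation: g(z) = exp(-alpha*z^2) * exp(-beta*exp(-z^2)).
    return np.exp(-alpha * z**2) * np.exp(-beta * np.exp(-z**2))

# Tiny 2 -> 4 -> 1 network with the GGRBF activation in the hidden layer.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 2)); b1 = np.zeros(4)
W2 = rng.standard_normal((1, 4)); b2 = np.zeros(1)

def forward(x):
    h = ggrbf_activation(W1 @ x + b1)
    return W2 @ h + b2

out = forward(np.array([0.3, -0.7]))
print("network output:", out)
```

Unlike ReLU or sigmoid, this activation is radial: it peaks near $z = 0$ (at the value $e^{-\beta}$) and decays for large $|z|$.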


Future Research Directions

Several interesting theoretical directions remain open:

• spectral decomposition of the generalized kernel

• Mercer eigenfunction analysis

• connections with Hermite polynomial expansions

• scalability in deep learning architectures

These questions form part of ongoing research.

Author

Himanshu Singh

Research interests:

• scientific machine learning

• kernel methods

• sparse learning

• operator learning

• mechanistic interpretability

About

This is my recent work on the importance and application of mathematical function theory, in particular Hilbert function space theory, to artificial intelligence algorithms. The main motivation was to improve the convergence and learning rates of various learning algorithms via the Generalized Gaussian Radial Basis Function.
