Neural Network Trigonometric Approximation

حوراء عباس فاضل
28/11/2018 06:33:05

In this paper, we define a weighted norm to construct a weighted (Lp,alpha)-space of 2pi-periodic functions. We then prove that any periodic Lebesgue-integrable function can be approximated by a feedforward neural network with sigmoidal hidden neurons, with the error estimated in terms of the kth modulus of smoothness; this is what we call Neural Network Trigonometric Approximation.
Keywords: Neural Network, Trigonometric Approximation, Modulus of Smoothness.
1. Introduction
In recent years, trigonometric polynomials have played a central role in approximating functions in Lp-spaces as well as in other, more general spaces. They are widely used to construct neural networks as approximators for such functions (see, for example, [Cao F. L., Zhang Y. Q. and Zhang W. G., 2007; Wang J. and Xu Z., 2011]), given the great importance of neural networks in different fields and the essential need for approximating neural networks in various applications.
Three-layer feedforward neural networks are an important class of neural networks that can approximate a desired function well. Most papers treat the rate of approximation as a tool for understanding this approximation capability.
In this work, we aim to introduce a new space with a new norm in order to shed light on the relationship between the approximation error and the properties of the neural network used. For our space, a three-layer feedforward neural network (FNN) can be shown to exist that approximates a given function well.
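To make the setting concrete, the following is a minimal sketch (not the paper's construction) of a three-layer FNN of the form N(x) = sum_j c_j * sigma(w_j x + b_j) with sigmoidal hidden neurons, approximating the 2pi-periodic function f(x) = sin(x). The hidden weights and biases, the neuron count, and the least-squares fit of the output weights are all illustrative assumptions; only the network's architectural form matches the text.

```python
import numpy as np

def sigmoid(t):
    """Sigmoidal activation used in the hidden layer."""
    return 1.0 / (1.0 + np.exp(-t))

rng = np.random.default_rng(0)
n_hidden = 20                        # hidden-layer width (assumed)
w = rng.uniform(-3, 3, n_hidden)     # input-to-hidden weights (assumed)
b = rng.uniform(-3, 3, n_hidden)     # hidden biases (assumed)

# Sample the 2*pi-periodic target function on one period.
x = np.linspace(0, 2 * np.pi, 200)
f = np.sin(x)

# Hidden-layer outputs, shape (200, n_hidden).
H = sigmoid(np.outer(x, w) + b)

# Fit the hidden-to-output weights c by least squares, so that H @ c ~ f.
c, *_ = np.linalg.lstsq(H, f, rcond=None)

approx = H @ c
max_err = np.max(np.abs(approx - f))
print(max_err)
```

Fixing the hidden parameters at random and solving only for the output layer keeps the sketch linear and short; the paper instead derives the existence of such a network with an error bound in terms of the kth modulus of smoothness.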
