Funding: Supported partly by the National Natural Science Foundation of China (No. 10471010), and partly by the project "Representation Theory and Related Topics" of the "985 Program" of Beijing Normal University and the Beijing Natural Science Foundation (1062004).
Abstract: Let L^2([0, 1], x) be the space of real-valued, measurable, square-integrable functions on [0, 1] with weight x, and let H_n be the subspace of L^2([0, 1], x) spanned by the functions J_0(μ_k x), where J_0 is the Bessel function of order 0 and {μ_k} is the strictly increasing sequence of all positive zeros of J_0. For f ∈ L^2([0, 1], x), let E(f, H_n) be the error of best approximation of f in L^2([0, 1], x) by elements of H_n. The shift operator of f at a point x ∈ [0, 1] with step t ∈ [0, 1] is defined by T(t)f(x) = (1/π) ∫_0^π f(√(x^2 + t^2 − 2xt cos θ)) dθ. The differences (I − T(t))^{r/2} f = ∑_{j=0}^∞ (−1)^j (r/2 choose j) T^j(t) f of order r ∈ (0, ∞) and the L^2([0, 1], x)-modulus of continuity ω_r(f, τ) = sup{‖(I − T(t))^{r/2} f‖ : 0 ≤ t ≤ τ} of order r are defined in the standard way, where T^0(t) = I is the identity operator. In this paper, we establish the sharp Jackson inequality between E(f, H_n) and ω_r(f, τ) for some cases of r and τ. More precisely, we find the smallest constant κ_n(τ, r), depending only on n, τ, and r, such that the inequality E(f, H_n) ≤ κ_n(τ, r) ω_r(f, τ) is valid.
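As a numerical aside (not part of the abstract), the subspace spanned by J_0(μ_k x) rests on the classical orthogonality of these functions in L^2([0, 1], x): ∫_0^1 J_0(μ_j x) J_0(μ_k x) x dx = δ_{jk} J_1(μ_k)^2 / 2. A minimal SciPy check, where `jn_zeros` and the weighted inner product are standard and the helper names are ours:

```python
import numpy as np
from scipy import special, integrate

# First n positive zeros mu_1 < mu_2 < ... < mu_n of the Bessel function J0
n = 4
mu = special.jn_zeros(0, n)

# Weighted inner product on L^2([0, 1], x):  <f, g> = int_0^1 f(x) g(x) x dx
def inner(j, k):
    val, _ = integrate.quad(
        lambda x: special.j0(mu[j] * x) * special.j0(mu[k] * x) * x, 0.0, 1.0
    )
    return val

# Gram matrix of the J0(mu_k x): diagonal with entries J1(mu_k)^2 / 2
gram = np.array([[inner(j, k) for k in range(n)] for j in range(n)])
diag = special.j1(mu) ** 2 / 2
```

The off-diagonal entries vanish (to quadrature accuracy), which is what makes E(f, H_n) computable coefficient-by-coefficient in a Fourier–Bessel expansion.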
Funding: Supported by the National Natural Science Foundation of China (11071019) and the Beijing Natural Science Foundation (1132001).
Abstract: We consider the Jackson inequality in L^2(B^d × T, W_{κ,μ}^B(x)), where the weight function W_{κ,μ}^B(x) is defined on the ball B^d and associated with a reflection group, and obtain the sharp Jackson inequality E_{n−1,m−1}(f)_2 ≤ κ_{n,m}(τ, r) ω_r(f, τ)_2, τ ≥ 2τ_{n,λ}, where τ_{n,λ} is the first positive zero of the Gegenbauer cosine polynomial C_n^λ(cos θ) (n ∈ N).
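The threshold 2τ_{n,λ} above is governed by the first positive zero of C_n^λ(cos θ). As an illustration outside the abstract, this zero is easy to locate numerically with SciPy's `eval_gegenbauer`; the helper below is our own sketch, and for λ = 1 it can be checked against the closed form τ_{n,1} = π/(n+1), since C_n^1(cos θ) = sin((n+1)θ)/sin θ:

```python
import numpy as np
from scipy import special, optimize

def tau(n, lam):
    """Smallest theta > 0 with C_n^lam(cos theta) = 0: scan for the first
    sign change, then refine by bisection (brentq)."""
    f = lambda th: special.eval_gegenbauer(n, lam, np.cos(th))
    grid = np.linspace(1e-6, np.pi, 2000)
    vals = f(grid)
    i = int(np.argmax(np.sign(vals[:-1]) != np.sign(vals[1:])))  # first sign change
    return optimize.brentq(f, grid[i], grid[i + 1])
```

For example, tau(5, 1.0) should recover π/6 to within the bisection tolerance.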
Abstract: The relationship between the order of approximation by neural networks based on scattered threshold-value nodes and the number of neurons in a single hidden layer is investigated. The results obtained show that the degree of approximation by a periodic neural network with one hidden layer and scattered threshold-value nodes improves as the number of neurons in the hidden layer and the smoothness of the excitation function increase.
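The qualitative effect of neuron count can be seen in a toy sketch that is not the construction of the paper: a single-hidden-layer network ∑_k c_k σ(w_k x + b_k) with sigmoid excitation σ, randomly fixed inner weights w_k and scattered thresholds b_k (all hypothetical choices of ours), where only the output weights c_k are fitted by linear least squares. Nesting the first m of 64 pre-drawn units makes the L^2 fitting error non-increasing in m:

```python
import numpy as np

rng = np.random.default_rng(0)
W_ALL = rng.uniform(-20.0, 20.0, 64)   # hypothetical fixed inner weights
B_ALL = rng.uniform(-10.0, 10.0, 64)   # hypothetical scattered threshold values

def fit_error(m, target=lambda x: np.sin(2 * np.pi * x)):
    """RMS error of the best least-squares fit using the first m hidden units."""
    x = np.linspace(0.0, 1.0, 400)
    y = target(x)
    # Sigmoid feature matrix: column k is sigma(w_k x + b_k)
    Phi = 1.0 / (1.0 + np.exp(-(np.outer(x, W_ALL[:m]) + B_ALL[:m])))
    c, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return np.linalg.norm(Phi @ c - y) / np.sqrt(len(x))
```

Because the m-unit feature columns are a subset of the 64-unit ones, the least-squares residual cannot grow as m increases, mirroring the monotone dependence on neuron count stated in the abstract.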
Funding: Supported by the Russian Foundation for Basic Research (Grant No. 16-01-00308).
Abstract: We study Jackson's inequality between the best approximation of a function f ∈ L^2(R^3) by entire functions of exponential spherical type and its generalized modulus of continuity. We prove Jackson's inequality with the exact constant and the optimal argument in the modulus of continuity. In particular, Jackson's inequality with the optimal parameters is obtained for the classical modulus of continuity of order r and the Thue–Morse modulus of continuity of order r ∈ N. These results are based on the solution of the generalized Logan problem for entire functions of exponential type. For this we construct new quadrature formulas for entire functions of exponential type.
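In the L^2 setting, the best approximation by entire functions of exponential type σ has an explicit Fourier-analytic description: by the Paley–Wiener theorem it is the inverse transform of f̂ restricted to the ball of radius σ, so the squared error is the tail energy of f̂. A one-dimensional illustration (simpler than the R^3 setting of the abstract, with our own helper names): for f(x) = exp(−x²/2), with f̂(ξ) = √(2π) exp(−ξ²/2), Plancherel gives E(f, σ)² = (1/2π) ∫_{|ξ|>σ} |f̂(ξ)|² dξ = √π · erfc(σ).

```python
import numpy as np
from scipy import special, integrate

def best_approx_error(sigma):
    """E(f, sigma) for f(x) = exp(-x^2/2) in L^2(R): the tail energy of fhat
    outside [-sigma, sigma], computed by numerical quadrature."""
    # (1/2pi) * int_{|xi|>sigma} 2*pi*exp(-xi^2) dxi = 2 * int_sigma^inf exp(-xi^2) dxi
    tail, _ = integrate.quad(lambda xi: np.exp(-xi**2), sigma, np.inf)
    return np.sqrt(2.0 * tail)

# Closed form for comparison: sqrt(sqrt(pi) * erfc(sigma))
closed_form = lambda s: np.sqrt(np.sqrt(np.pi) * special.erfc(s))
```

The error decays rapidly in σ, which is why sharp constants (rather than rates) are the interesting quantity in such Jackson inequalities.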
Funding: Supported by the National Natural Science Foundation of China (Grant No. 11071019), the Research Fund for the Doctoral Program of Higher Education, and the Beijing Natural Science Foundation (Grant No. 1102011).
Abstract: In this paper, we study the sharp Jackson inequality for the best approximation of f ∈ L_{2,k}(R^d) by a subspace E_k^2(σ) (respectively SE_k^2(σ)), which is a subspace of entire functions of exponential type (respectively spherical exponential type) at most σ. Here L_{2,k}(R^d) denotes the space of all d-variate functions f endowed with the L^2-norm with the weight v_k(x) = ∏_{ζ∈R_+} |⟨ζ, x⟩|^{2k(ζ)}, which is defined by a positive subsystem R_+ of a finite root system R ⊂ R^d and a function k(ζ): R → R_+ invariant under the reflection group G(R) generated by R. In the case G(R) = Z_2^d, we obtain some exact results. Moreover, the deviation of the best approximation by the subspace E_k^2(σ) (SE_k^2(σ)) of certain classes of smooth functions in the space L_{2,k}(R^d) is obtained.
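In the special case G(R) = Z_2^d singled out in the abstract, the positive subsystem can be taken as R_+ = {e_1, …, e_d}, so the weight collapses to a product of coordinate powers, v_k(x) = ∏_i |x_i|^{2k_i}. A small sketch of this reduction (the function name is ours):

```python
import numpy as np

def v_k(x, k):
    """Dunkl weight for G(R) = Z_2^d with R_+ = {e_1, ..., e_d}:
    v_k(x) = prod_i |<e_i, x>|^(2 k_i) = prod_i |x_i|^(2 k_i)."""
    x = np.asarray(x, dtype=float)
    k = np.asarray(k, dtype=float)
    return float(np.prod(np.abs(x) ** (2.0 * k)))
```

For instance, with x = (2, −3) and multiplicities k = (1, 1/2), the weight is |2|² · |−3|¹ = 12; invariance of k under Z_2^d is automatic here since each sign change fixes |x_i|.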