Journal Articles
21 articles found
1. PRECISE ASYMPTOTICS IN SELF-NORMALIZED SUMS OF ITERATED LOGARITHM FOR MULTIDIMENSIONALLY INDEXED RANDOM VARIABLES (Cited: 3)
Authors: Jiang Chaowei, Yang Xiaorong. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2007, No. 1, pp. 87-94.
Abstract: In the case of Z₊^d (d ≥ 2), the positive d-dimensional lattice points with partial ordering ≤, let {X_k, k ∈ Z₊^d} be i.i.d. random variables with mean 0, S_n = ∑_{k≤n} X_k and V_n² = ∑_{j≤n} X_j². The precise asymptotics of ∑_n 1/(|n|(log|n|)^d) P(|S_n/V_n| ≥ ε√(log log|n|)) and ∑_n (log|n|)^b/(|n|(log|n|)^{d-1}) P(|S_n/V_n| ≥ ε√(log|n|)), as ε ↓ 0, are established.
Keywords: multidimensionally indexed random variables; precise asymptotics; self-normalized sums; Davis law of large numbers; law of the iterated logarithm.
2. A note on self-normalized Dickey-Fuller test for unit root in autoregressive time series with GARCH errors (Cited: 1)
Authors: Yang Xiao-rong, Zhang Li-xin. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2008, No. 2, pp. 197-201.
Abstract: In this article, the unit root test for the AR(p) model with GARCH errors is considered. The Dickey-Fuller test statistics are rewritten in the form of self-normalized sums, and the asymptotic distribution of the test statistics is derived under weak conditions.
Keywords: unit root; AR(p)-GARCH(1,1); self-normalized Dickey-Fuller test statistic.
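The setting of this note can be made concrete with a small simulation. The sketch below is not the paper's self-normalized construction (all function names and parameter values are our own choices): it simulates a unit-root AR(1) series driven by GARCH(1,1) errors and computes the classical Dickey-Fuller t-statistic by ordinary least squares.

```python
import numpy as np

def simulate_garch_errors(n, omega=0.1, alpha=0.1, beta=0.8, rng=None):
    """GARCH(1,1) innovations: e_t = sigma_t * z_t,
    sigma_t^2 = omega + alpha * e_{t-1}^2 + beta * sigma_{t-1}^2."""
    if rng is None:
        rng = np.random.default_rng(0)
    e = np.zeros(n)
    sigma2 = omega / (1.0 - alpha - beta)  # start at the unconditional variance
    for t in range(n):
        e[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * e[t] ** 2 + beta * sigma2
    return e

def dickey_fuller_stat(y):
    """DF t-statistic for rho = 1 in y_t = rho * y_{t-1} + e_t (no drift):
    OLS of diff(y) on lagged y, then slope / standard error."""
    y_lag, dy = y[:-1], np.diff(y)
    rho_minus_1 = (y_lag @ dy) / (y_lag @ y_lag)
    resid = dy - rho_minus_1 * y_lag
    s2 = (resid @ resid) / (len(dy) - 1)
    return rho_minus_1 / np.sqrt(s2 / (y_lag @ y_lag))

rng = np.random.default_rng(42)
e = simulate_garch_errors(500, rng=rng)
y = np.cumsum(e)  # exact unit root: y_t = y_{t-1} + e_t
print(dickey_fuller_stat(y))
```

Under the null of a unit root the statistic follows the (non-normal) Dickey-Fuller distribution, which is why rewriting it as a self-normalized sum, as the paper does, is a natural route to its asymptotics.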
3. A weak invariance principle for self-normalized products of sums of mixing sequences
Authors: Fu Ke-ang, Huang Wei. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2008, No. 2, pp. 183-189.
Abstract: Let {X, X_n, n ≥ 1} be a sequence of strictly stationary φ-mixing positive random variables in the domain of attraction of the normal law. Under some suitable conditions, the weak invariance principle for self-normalized products of partial sums is obtained.
Keywords: self-normalized products; domain of attraction of the normal law; φ-mixing; Wiener process.
4. A Note on "Limit Distributions of Self-Normalized Sums" Using Cauchy-Generated Samples
Authors: Jan Vrbik. Applied Mathematics, 2019, No. 11, pp. 863-875.
Abstract: In this case study, we illustrate the utility of characteristic functions, using an example of a sample statistic defined for samples from the Cauchy distribution. The derivation of the corresponding asymptotic probability density function is based on [1], elaborating and expanding the individual steps of their presentation and including a small extension; our reason for such a plagiarism is to make the technique, its mathematical tools and ingenious arguments available to the widest possible audience.
Keywords: self-normalized sum; Cauchy distribution; characteristic functions; Fourier transform; Padé approximation.
5. Ext-ICAS: A Novel Self-Normalized Extractive Intra Cosine Attention Similarity Summarization
Authors: P. Sharmila, C. Deisy, S. Parthasarathy. Computer Systems Science & Engineering, SCIE EI, 2023, No. 4, pp. 377-393.
Abstract: With the continuous growth of online news articles, there arises the necessity for an efficient abstractive summarization technique to address information overload. Abstractive summarization is highly complex and requires deeper understanding and proper reasoning to come up with its own summary outline. The abstractive summarization task is framed as seq2seq modeling. Existing seq2seq methods perform better on short sequences; for long sequences, however, performance degrades due to high computation. Hence, a two-phase self-normalized deep neural document summarization model, consisting of an improvised extractive cosine-normalization phase and a seq2seq abstractive phase, is proposed in this paper. The novelty is to parallelize the sequence computation training by incorporating a feed-forward, self-normalized neural network in the extractive phase using Intra Cosine Attention Similarity (Ext-ICAS) with sentence dependency position; no explicit normalization technique is required. The proposed abstractive Bidirectional Long Short Term Memory (Bi-LSTM) encoder sequence model performs better than the Bidirectional Gated Recurrent Unit (Bi-GRU) encoder, with minimum training loss and fast convergence. The model was evaluated on the Cable News Network (CNN)/Daily Mail dataset; an average ROUGE score of 0.435 was achieved, and computational training in the extractive phase was reduced by 59% in the average number of similarity computations.
Keywords: abstractive summarization; natural language processing; sequence-to-sequence learning (seq2seq); self-normalization; intra (self) attention.
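The paper's Ext-ICAS scoring is not reproduced here. As a rough, hypothetical illustration of the underlying extractive idea — score each sentence by its intra-document cosine similarity to the other sentences and keep the most central ones — consider this sketch with made-up toy embeddings:

```python
import numpy as np

def cosine_sim_matrix(X):
    """Pairwise cosine similarities between row vectors (sentence embeddings)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    Xn = X / np.clip(norms, 1e-12, None)
    return Xn @ Xn.T

def extract_top_k(embeddings, k=2):
    """Score each sentence by its mean cosine similarity to the others
    (a simple centrality heuristic) and return indices of the top-k."""
    S = cosine_sim_matrix(embeddings)
    np.fill_diagonal(S, 0.0)           # ignore self-similarity
    scores = S.mean(axis=1)
    return np.argsort(scores)[::-1][:k]

# toy 4-"sentence" document: three mutually similar embeddings and one outlier
emb = np.array([[1.0, 0.1], [0.9, 0.2], [1.0, 0.0], [-1.0, 1.0]])
print(extract_top_k(emb, k=2))
```

The outlier sentence (index 3) scores lowest and is never selected; the two most central of the similar trio form the extract.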
6. Berry-Esseen bounds for self-normalized sums of locally dependent random variables
Authors: Zhuo-Song Zhang. Science China Mathematics, SCIE CSCD, 2024, No. 11, pp. 2629-2652.
Abstract: The Berry-Esseen bound provides an upper bound on the Kolmogorov distance between a random variable and the normal distribution. In this paper, we establish Berry-Esseen bounds with optimal rates for self-normalized sums of locally dependent random variables, assuming only a second-moment condition. Our proof leverages Stein's method and introduces a novel randomized concentration inequality, which may also be of independent interest for other applications. Our main results are applied to self-normalized sums of m-dependent random variables and graph dependency models.
Keywords: Berry-Esseen bounds; self-normalized sums; local dependence; m-dependence; graph dependency.
7. Normalized and self-normalized Cramér-type moderate deviations for the Euler-Maruyama scheme for the SDE
Authors: Xiequan Fan, Haijuan Hu, Lihu Xu. Science China Mathematics, SCIE CSCD, 2024, No. 8, pp. 1865-1880.
Abstract: In this paper, we establish normalized and self-normalized Cramér-type moderate deviations for the Euler-Maruyama scheme for SDEs. As consequences of our results, Berry-Esseen bounds and moderate deviation principles are also obtained. Our normalized Cramér-type moderate deviations refine the recent work of Lu et al. (2022).
Keywords: Euler-Maruyama scheme; Cramér-type moderate deviations; self-normalized sequences; Berry-Esseen bounds.
8. Self-normalized moderate deviations for independent random variables (Cited: 2)
Authors: JING BingYi, LIANG HanYing, ZHOU Wang. Science China Mathematics, SCIE, 2012, No. 11, pp. 2297-2315.
Abstract: Let X₁, X₂, ... be a sequence of independent random variables (r.v.s) belonging to the domain of attraction of a normal or stable law. In this paper, we study moderate deviations for the self-normalized sum ∑_{i=1}^n X_i / V_{n,p}, where V_{n,p} = (∑_{i=1}^n |X_i|^p)^{1/p} (p > 1). Applications to the self-normalized law of the iterated logarithm, Studentized increments of partial sums, the t-statistic, and weighted sums of independent and identically distributed (i.i.d.) r.v.s are considered.
Keywords: self-normalized sum; moderate deviation; t-statistic; LIL; increment.
9. Berry-Esseen Bounds for Self-Normalized Martingales (Cited: 2)
Authors: Xiequan Fan, Qi-Man Shao. Communications in Mathematics and Statistics, SCIE, 2018, No. 1, pp. 13-27.
Abstract: A Berry-Esseen bound is obtained for self-normalized martingales under the assumption of finite moments. The bound coincides with the classical Berry-Esseen bound for standardized martingales. An example is given to show the optimality of the bound. Applications to Student's statistic and autoregressive processes are also discussed.
Keywords: self-normalized processes; Berry-Esseen bounds; martingales; Student's statistic; autoregressive process.
10. Self-normalized Cramér-type Moderate Deviations for Functionals of Markov Chain (Cited: 1)
Authors: Xin-wei Feng, Qi-Man Shao. Acta Mathematicae Applicatae Sinica, SCIE CSCD, 2020, No. 2, pp. 294-313.
Abstract: Let {x_n, n ≥ 0} be a Markov chain with a countable state space S, let f(·) be a measurable function from S to R, and consider the functionals of the Markov chain y_n := f(x_n). We construct a new type of self-normalized sums based on the random-block scheme and establish Cramér-type moderate deviations for self-normalized sums of functionals of the Markov chain.
Keywords: self-normalized partial sums; Cramér-type moderate deviations; Markov chain; ergodic.
11. A strong approximation of self-normalized sums (Cited: 1)
Authors: Csörgő Miklós, HU ZhiShui. Science China Mathematics, SCIE, 2013, No. 1, pp. 149-160.
Abstract: Let {X, X_n, n ≥ 1} be a sequence of independent identically distributed random variables with EX = 0, and assume that EX²I(|X| ≤ x) is slowly varying as x → ∞, i.e., X is in the domain of attraction of the normal law. In this paper, a Strassen-type strong approximation is established for self-normalized sums of such random variables.
Keywords: strong approximation; self-normalized sums; domain of attraction of the normal law.
12. An Almost Sure Central Limit Theorem for Self-normalized Weighted Sums (Cited: 1)
Authors: Yong Zhang, Xiao-yun Yang. Acta Mathematicae Applicatae Sinica, SCIE CSCD, 2013, No. 1, pp. 79-92.
Abstract: Let X, X₁, X₂, ... be a sequence of nondegenerate i.i.d. random variables with zero means, which is in the domain of attraction of the normal law. Let {a_{ni}, 1 ≤ i ≤ n, n ≥ 1} be an array of real numbers satisfying some suitable conditions. In this paper, we show that a central limit theorem for self-normalized weighted sums holds. We also deduce a version of the almost sure central limit theorem (ASCLT) for self-normalized weighted sums.
Keywords: almost sure central limit theorem; self-normalized weighted sums; domain of attraction of the normal law.
13. On necessary and sufficient conditions for the self-normalized central limit theorem (In Honor of Professor Chuanrong Lu on His 85th Birthday) (Cited: 1)
Authors: Qiman Shao. Science China Mathematics, SCIE CSCD, 2018, No. 10, pp. 1741-1748.
Abstract: Let X₁, X₂, ... be a sequence of independent random variables, and let S_n = ∑_{i=1}^n X_i and V_n² = ∑_{i=1}^n X_i². When the elements of the sequence are i.i.d., it is known that the self-normalized sum S_n/V_n converges to a standard normal distribution if and only if max_{1≤i≤n} |X_i|/V_n → 0 in probability and the mean of X₁ is zero. In this paper, sufficient conditions for the self-normalized central limit theorem are obtained for general independent random variables. It is also shown that if max_{1≤i≤n} |X_i|/V_n → 0 in probability, then these sufficient conditions are necessary.
Keywords: central limit theorem; self-normalized; independent random variables.
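Two elementary facts sit behind the quantities in this abstract and are easy to check numerically: by Cauchy-Schwarz, |S_n/V_n| ≤ √n for every sample, and trivially max_{1≤i≤n}|X_i|/V_n ≤ 1. A minimal sketch (our own illustration, not the paper's construction) with heavy-tailed Student-t data:

```python
import numpy as np

def self_normalized_sum(x):
    """S_n / V_n with S_n = sum of x_i and V_n^2 = sum of x_i^2."""
    return x.sum() / np.sqrt((x ** 2).sum())

rng = np.random.default_rng(7)
# Student t with 2 degrees of freedom: infinite variance, but E[X^2 1{|X|<=x}]
# is slowly varying, so X is in the domain of attraction of the normal law
# and S_n/V_n is still asymptotically standard normal
x = rng.standard_t(df=2, size=10_000)

r = self_normalized_sum(x)
print(r)

# deterministic bounds, valid for every sample:
assert abs(r) <= np.sqrt(len(x))                   # Cauchy-Schwarz: |S_n/V_n| <= sqrt(n)
assert np.abs(x).max() <= np.sqrt((x ** 2).sum())  # max|X_i|/V_n <= 1
```

For this heavy-tailed example the classical standardized sum S_n/(σ√n) is unusable (σ does not exist), while the self-normalized ratio remains well behaved — the robustness the self-normalized CLT formalizes.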
14. Self-Normalized Moderate Deviation and Laws of the Iterated Logarithm Under G-Expectation (Cited: 3)
Authors: Li-Xin Zhang. Communications in Mathematics and Statistics, SCIE, 2016, No. 2, pp. 229-263.
Abstract: The sub-linear expectation, or G-expectation, is a non-linear expectation with the advantage of modeling non-additive probability problems and volatility uncertainty in finance. Let {X_n; n ≥ 1} be a sequence of independent random variables in a sub-linear expectation space (Ω, H, Ê). Denote S_n = ∑_{k=1}^n X_k and V_n² = ∑_{k=1}^n X_k². In this paper, a moderate deviation for self-normalized sums, that is, the asymptotic capacity of the event {S_n/V_n ≥ x_n} for x_n = o(√n), is found both for identically distributed random variables and for independent but not necessarily identically distributed random variables. As an application, the self-normalized laws of the iterated logarithm are obtained. A Bernstein-type inequality is also established for proving the law of the iterated logarithm.
Keywords: non-linear expectation; capacity; self-normalization; law of the iterated logarithm; moderate deviation.
15. Convergence to a self-normalized G-Brownian motion (Cited: 1)
Authors: Zhengyan Lin, Li-Xin Zhang. Probability, Uncertainty and Quantitative Risk, 2017, No. 1, pp. 87-111.
Abstract: G-Brownian motion has a very rich and interesting new structure that nontrivially generalizes the classical Brownian motion. Its quadratic variation process is also a continuous process with independent and stationary increments. We prove a self-normalized functional central limit theorem for independent and identically distributed random variables under the sub-linear expectation, with the limit process being a G-Brownian motion self-normalized by its quadratic variation. To prove the self-normalized central limit theorem, we also establish a new Donsker's invariance principle with the limit process being a generalized G-Brownian motion.
Keywords: sub-linear expectation; G-Brownian motion; central limit theorem; invariance principle; self-normalization.
16. A Self-normalized Law of the Iterated Logarithm for the Geometrically Weighted Random Series
Authors: Ke Ang Fu, Wei Huang. Acta Mathematica Sinica, English Series, SCIE CSCD, 2016, No. 3, pp. 384-392.
Abstract: Let {X, X_n; n ≥ 0} be a sequence of independent and identically distributed random variables with EX = 0, and assume that EX²I(|X| ≤ x) is slowly varying as x → ∞, i.e., X is in the domain of attraction of the normal law. In this paper, a self-normalized law of the iterated logarithm for the geometrically weighted random series ∑_{n=0}^∞ βⁿX_n (0 < β < 1) is obtained under some minimal conditions.
Keywords: domain of attraction of the normal law; geometrically weighted series; law of the iterated logarithm; self-normalization; slowly varying.
17. Large-scale self-normalizing neural networks
Authors: Zhaodong Chen, Weiqin Zhao, Lei Deng, Yufei Ding, Qinghao Wen, Guoqi Li, Yuan Xie. Journal of Automation and Intelligence, 2024, No. 2, pp. 101-110.
Abstract: Self-normalizing neural networks (SNNs) regulate the activation and gradient flows through activation functions with the self-normalization property. As SNNs do not rely on norms computed from minibatches, they are more friendly to data parallelism, kernel fusion, and emerging architectures such as ReRAM-based accelerators. However, existing SNNs have mainly demonstrated their effectiveness on toy datasets and fall short in accuracy when dealing with large-scale tasks like ImageNet. They lack the strong normalization, regularization, and expression power required for wider, deeper models and larger-scale tasks. To enhance the normalization strength, this paper introduces a comprehensive and practical definition of the self-normalization property in terms of the stability and attractiveness of the statistical fixed points. It is comprehensive as it jointly considers all the fixed points used by existing studies: the first and second moments of the forward activation and the expected Frobenius norm of the backward gradient. The practicality comes from the analytical equations provided by our paper to assess the stability and attractiveness of each fixed point, which are derived from theoretical analysis of the forward and backward signals. The proposed definition is applied to a meta activation function inspired by prior research, leading to a stronger self-normalizing activation function named "bi-scaled exponential linear unit with backward standardized" (bSELU-BSTD). We provide both theoretical and empirical evidence to show that it is superior to existing studies. To enhance the regularization and expression power, we further propose scaled-Mixup and channel-wise scale & shift. With these three techniques, our approach achieves 75.23% top-1 accuracy on ImageNet with Conv MobileNet V1, surpassing the performance of existing self-normalizing activation functions. To the best of our knowledge, this is the first SNN that achieves accuracy comparable to batch normalization on ImageNet.
Keywords: self-normalizing neural network; mean-field theory; block dynamical isometry; activation function.
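The paper's bSELU-BSTD activation is not reproduced here. For context, the classic self-normalizing activation it builds on is SELU (Klambauer et al., 2017), whose published constants make (mean 0, variance 1) a stable, attracting fixed point of the layer-to-layer activation statistics. A small numpy sketch of that fixed-point behavior:

```python
import numpy as np

# SELU constants from Klambauer et al. (2017); for these values the
# mean/variance map of a layer has (0, 1) as an attracting fixed point
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled exponential linear unit: lambda * (x if x > 0 else alpha*(e^x - 1))."""
    x = np.asarray(x, dtype=float)
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(x))

# push a standardized batch through several random dense layers; with
# LeCun-normal weights, activation statistics stay near mean 0, variance 1
rng = np.random.default_rng(0)
h = rng.standard_normal((1024, 256))
for _ in range(8):
    W = rng.standard_normal((256, 256)) / np.sqrt(256)  # LeCun normal init
    h = selu(h @ W)
print(h.mean(), h.var())
```

This is exactly the property the paper generalizes: it adds the backward-gradient norm as a third fixed point and derives stability conditions for all three jointly.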
18. Self-normalization: Taming a wild population in a heavy-tailed world (Cited: 2)
Authors: SHAO Qi-man, ZHOU Wen-xin. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2017, No. 3, pp. 253-269.
Abstract: The past two decades have witnessed the active development of a rich probability theory of Studentized statistics or self-normalized processes, typified by Student's t-statistic as introduced by W. S. Gosset more than a century ago, and their applications to statistical problems in high dimensions, including feature selection and ranking, large-scale multiple testing, and sparse, high-dimensional signal detection. Many of these applications rely on the robustness property of Studentization/self-normalization against heavy-tailed sampling distributions. This paper gives an overview of the salient progress of self-normalized limit theory, from Student's t-statistic to more general Studentized nonlinear statistics. Prototypical examples include Studentized one- and two-sample U-statistics. Furthermore, we go beyond independence and glimpse some very recent advances in self-normalized moderate deviations under dependence.
Keywords: Berry-Esseen inequality; Hotelling's T²-statistic; large deviation; moderate deviation; self-normalization; Student's t-statistic; U-statistic.
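The bridge between Student's t-statistic and the self-normalized sum S_n/V_n that this survey builds on is the standard algebraic identity t_n = r·√((n−1)/(n−r²)) with r = S_n/V_n, so the two carry the same information. A quick numerical check (toy data of our own choosing):

```python
import numpy as np

def t_statistic(x):
    """Classical one-sample Student t-statistic for H0: mean = 0."""
    n = len(x)
    return x.mean() / (x.std(ddof=1) / np.sqrt(n))

def t_from_self_normalized(x):
    """The same statistic written as a monotone function of r = S_n/V_n."""
    n = len(x)
    r = x.sum() / np.sqrt((x ** 2).sum())  # self-normalized sum S_n / V_n
    return r * np.sqrt((n - 1) / (n - r ** 2))

x = np.array([0.3, -1.2, 2.1, 0.7, -0.4, 1.5])
print(t_statistic(x), t_from_self_normalized(x))
```

Because the map r ↦ t is strictly increasing, limit theorems proved for S_n/V_n transfer directly to the t-statistic, which is why self-normalized limit theory covers Studentized statistics.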
19. Saddlepoint approximations for studentized compound Poisson sums with no moment conditions in audit sampling
Authors: ZHOU GuoLiang (Institute of Accounting and Finance, Shanghai University of Finance and Economics), ZHOU Wang (Department of Statistics and Applied Probability, National University of Singapore). Science China Mathematics, SCIE, 2010, No. 12, pp. 3131-3138.
Abstract: Saddlepoint approximations for studentized compound Poisson sums with no moment conditions in audit sampling are derived. This result not only provides a very accurate approximation for studentized compound Poisson sums, but can also be applied much more widely in statistical inference of the error amount in an audit population of accounts to check the validity of the financial statements of a firm. Some numerical illustrations and a comparison with the normal approximation method are presented.
Keywords: saddlepoint approximation; Laplace approximation; self-normalized compound Poisson sum; studentized compound Poisson sum.
20. Studentized Increments of Partial Sums (Cited: 2)
Authors: Csörgő Miklós, Lin Zhengyan, Shao Qiman. Science China Mathematics, SCIE, 1994, No. 3, pp. 265-276.
Abstract: Using suitable self-normalization for partial sums of i.i.d. random variables, Griffin and Kuelbs established the law of the iterated logarithm for all distributions in the domain of attraction of a normal law. We obtain the corresponding results for Studentized increments of partial sums under the same condition.
Keywords: increments of partial sums; self-normalization; domain of attraction of a normal law.