Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 10675090, 10535010, and 10775068; the National Fund for Fostering Talents of Basic Science under Grant No. J0630316; the 973 State Key Basic Research and Development Program of China under Grant No. 2007CB815004; the CAS Knowledge Innovation Project under Grant No. KJCX2-SW-N02; and the Research Fund of Doctoral Points under Grant No. 20070284016.
Abstract: The experimental values of 2059 β-decay half-lives are systematically analyzed and investigated. We have found that they are in satisfactory agreement with Benford's law, which states that the frequency of occurrence of each figure, 1-9, as the first significant digit in a surprisingly large number of different data sets follows a logarithmic distribution favoring the smaller ones. Benford's logarithmic distribution of β-decay half-lives can be explained in terms of Newcomb's justification of Benford's law and the empirical exponential law of β-decay half-lives. Moreover, we test the calculated values of 6721 β-decay half-lives with the aid of Benford's law. This indicates that Benford's law is useful for theoretical physicists to test their methods for calculating β-decay half-lives.
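A minimal sketch (not from the paper) of the kind of first-digit check described above: it computes the Benford frequencies log10(1 + 1/d) and compares them with the observed first-digit frequencies of a sample of half-life values. The exponential/log-uniform sample used here is a placeholder assumption, not the paper's data.

```python
import numpy as np

def benford_expected():
    """Benford frequencies for first digits 1..9: log10(1 + 1/d)."""
    d = np.arange(1, 10)
    return np.log10(1.0 + 1.0 / d)

def first_digit(x):
    """First significant digit of a positive number."""
    x = abs(x)
    while x >= 10:
        x /= 10
    while x < 1:
        x *= 10
    return int(x)

def observed_frequencies(values):
    digits = np.array([first_digit(v) for v in values if v > 0])
    return np.array([(digits == d).mean() for d in range(1, 10)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Placeholder data: half-lives spread over many decades (log-uniform),
    # which is known to be close to Benford; real data would come from a table.
    halflives = 10.0 ** rng.uniform(-3, 9, size=2059)
    print("expected:", np.round(benford_expected(), 3))
    print("observed:", np.round(observed_frequencies(halflives), 3))
```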
Funding: Natural Science Foundation of Shandong Province, China (Grant No. ZR202111230202).
Abstract: Hyperspectral image classification stands as a pivotal task within the field of remote sensing, yet achieving high-precision classification remains a significant challenge. In response to this challenge, a Spectral Convolutional Neural Network model based on the Adaptive Fick's Law Algorithm (AFLA-SCNN) is proposed. The Adaptive Fick's Law Algorithm (AFLA) constitutes a novel metaheuristic algorithm introduced herein, encompassing three new strategies: an adaptive weight factor, Gaussian mutation, and a probability update policy. With the adaptive weight factor, the algorithm can adjust the weights according to the change in the number of iterations to improve its performance. Gaussian mutation helps the algorithm avoid falling into local optimal solutions and improves its search ability. The probability update strategy helps to improve the exploitability and adaptability of the algorithm. Within the AFLA-SCNN model, AFLA is employed to optimize two hyperparameters in the SCNN model, namely "numEpochs" and "miniBatchSize", to attain their optimal values. AFLA's performance is initially validated across 28 functions in 10D, 30D, and 50D for CEC2013 and 29 functions in 10D, 30D, and 50D for CEC2017. Experimental results indicate AFLA's marked performance superiority over nine other prominent optimization algorithms. Subsequently, the AFLA-SCNN model was compared with the Spectral Convolutional Neural Network model based on the Fick's Law Algorithm (FLA-SCNN), the Spectral Convolutional Neural Network model based on Harris Hawks Optimization (HHO-SCNN), the Spectral Convolutional Neural Network model based on Differential Evolution (DE-SCNN), the Spectral Convolutional Neural Network (SCNN) model, and the Support Vector Machines (SVM) model using the Indian Pines dataset and the Pavia University dataset. The experimental results show that the AFLA-SCNN model outperforms the other models in terms of Accuracy, Precision, Recall, and F1-score on Indian Pines and Pavia University. Among them, the Accuracy of the AFLA-SCNN model on Indian Pines reached 99.875%, and the Accuracy on Pavia University reached 98.022%. In conclusion, our proposed AFLA-SCNN model is deemed to significantly enhance the precision of hyperspectral image classification.
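A toy sketch (assumptions only, not the paper's implementation) of the outer optimization loop described above: a population-based search over the two hyperparameters, where `train_and_score` is a hypothetical stand-in for training the SCNN and returning validation accuracy, and the update rule is a generic random perturbation rather than the actual AFLA operators.

```python
import random

def train_and_score(num_epochs: int, mini_batch_size: int) -> float:
    """Hypothetical placeholder: train the SCNN with these hyperparameters
    and return validation accuracy. Replaced here by a dummy score."""
    return 1.0 - abs(num_epochs - 60) / 200 - abs(mini_batch_size - 64) / 512

def optimize(pop_size=10, iterations=20, seed=0):
    rng = random.Random(seed)
    # Each candidate is (numEpochs, miniBatchSize) within assumed bounds.
    pop = [(rng.randint(10, 200), rng.choice([16, 32, 64, 128, 256]))
           for _ in range(pop_size)]
    best = max(pop, key=lambda c: train_and_score(*c))
    for _ in range(iterations):
        new_pop = []
        for epochs, batch in pop:
            # Generic perturbation step; the real AFLA uses Fick's-law-inspired
            # moves plus adaptive weights, Gaussian mutation, and probability updates.
            epochs = min(200, max(10, epochs + rng.randint(-10, 10)))
            batch = rng.choice([16, 32, 64, 128, 256])
            new_pop.append((epochs, batch))
        pop = new_pop
        best = max(pop + [best], key=lambda c: train_and_score(*c))
    return best

if __name__ == "__main__":
    print("best (numEpochs, miniBatchSize):", optimize())
```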
Abstract: The Newcomb-Benford law, which describes the uneven distribution of the frequencies of digits in data sets, is by its nature probabilistic. Therefore, the main goal of this work was to derive formulas for the permissible deviations of the above frequencies (confidence intervals). For this, a previously developed method was used, which represents an alternative to the traditional approach. The alternative formula expressing the Newcomb-Benford law is re-derived. As shown in general form, it is numerically equivalent to the original Benford formula. The obtained formulas for confidence intervals for Benford's law are shown to be useful for checking arrays of numerical data. Consequences for numeral systems with different bases are analyzed. The alternative expression for the frequencies of digits at the second decimal place is deduced together with the corresponding deviation intervals. In general, in this approach, all the presented results are a consequence of the positionality property of digital systems such as decimal, binary, etc.
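The paper derives its own interval formulas; as a simple stand-in illustration (an assumption, not the authors' formulas), the sketch below checks an array of first-digit counts against the Benford frequencies using ordinary normal-approximation binomial confidence intervals.

```python
import math

def benford_p(d: int, base: int = 10) -> float:
    """Expected Newcomb-Benford frequency of leading digit d in a given base."""
    return math.log(1.0 + 1.0 / d, base)

def check_counts(counts, z=1.96):
    """counts[d-1] = number of observations with leading digit d (d = 1..9).
    Flags digits whose observed frequency falls outside a normal-approximation
    binomial interval around the Benford expectation."""
    n = sum(counts)
    report = []
    for d, c in enumerate(counts, start=1):
        p = benford_p(d)
        half_width = z * math.sqrt(p * (1 - p) / n)
        obs = c / n
        ok = abs(obs - p) <= half_width
        report.append((d, round(obs, 4), round(p, 4), round(half_width, 4), ok))
    return report

if __name__ == "__main__":
    example_counts = [301, 176, 125, 97, 79, 67, 58, 51, 46]  # roughly Benford, n = 1000
    for row in check_counts(example_counts):
        print(row)
```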
Abstract: Benford's law is a logarithmic law for the distribution of leading digits, formulated as P[D=d] = log(1+1/d), where d is the leading digit or group of digits. It is named after Frank Albert Benford (1938), who formulated a mathematical model of this probability. Before him, the same observation was made by Simon Newcomb. This law has changed the usual presumption of equal probability of each digit at each position in a number. The main characteristic properties of this law are base, scale, sum, inverse, and product invariance. Base invariance means that the logarithmic law is valid for any base. Inverse invariance means that the logarithmic law for leading digits holds for the inverse values in a sample. Multiplication invariance means that if a random variable X follows Benford's law and Y is an arbitrary random variable with a continuous density, then XY follows Benford's law too. Sum invariance means that the sums of significands are the same for any leading digit or group of digits. In this text, a method of testing the sum invariance property is proposed.
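A small sketch (an illustration under assumptions, not the paper's test) of the sum invariance property: for an approximately Benford-distributed sample, the sum of significands taken over each leading digit should be roughly constant across digits 1-9.

```python
import numpy as np

def significand(x):
    """Significand s in [1, 10): x = s * 10**k."""
    return x / 10.0 ** np.floor(np.log10(x))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # A log-uniform sample over many decades is (approximately) Benford-distributed.
    sample = 10.0 ** rng.uniform(0, 8, size=200_000)
    s = significand(sample)
    first = np.floor(s).astype(int)
    sums = [s[first == d].sum() for d in range(1, 10)]
    # Sum invariance: these nine sums should be nearly equal (ratios close to 1).
    print(np.round(np.array(sums) / sums[0], 3))
```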
Abstract: Tampering of biometric data has attracted a great deal of attention recently. Furthermore, there could be an intentional or accidental use of a particular biometric sample instead of another for a particular application. Therefore, there exists a need to propose a method to detect data tampering, as well as to differentiate biometric samples in cases of intentional or accidental use for a different application. In this paper, fingerprint image tampering is studied. Furthermore, optically acquired fingerprints, synthetically generated fingerprints, and contact-less acquired fingerprints are studied for separation purposes using the Benford's law divergence metric. Benford's law has been shown in the literature to be very effective in detecting tampering of natural images. In this paper, Benford's law features with a support vector machine are proposed for the detection of malicious tampering of JPEG fingerprint images. This method is aimed at protecting against insider attackers and hackers. The proposed method detected tampering effectively, with an Equal Error Rate (EER) of 2.08%. Again, the experimental results illustrate that optically acquired fingerprints, synthetically generated fingerprints, and contact-less acquired fingerprints can be separated effectively by the proposed method.
Abstract: From a basic probabilistic argumentation, the Zipfian distribution and Benford's law are derived. It is argued that Zipf's law fits the calculation of the rank probabilities of identical, indistinguishable objects and that Benford's distribution fits the calculation of the rank probabilities of distinguishable objects; i.e., in the distribution of words in long texts all the words in a given rank are identical, and therefore the rank distribution is Zipfian. In logarithmic tables, the objects with identical first digits are distinguishable, as there are many different digits in the 2nd, 3rd, … places, and therefore the distribution follows Benford's law. The Pareto 20-80 rule is shown to be an outcome of Benford's distribution: when the number of ranks is about 10, the probability of the 20% high-probability ranks is equal to the probability of the remaining 80% low-probability ranks. It is argued that all these distributions, including the central limit theorem, are outcomes of Planck's law and are the result of the quantization of energy. This argumentation may be considered a physical origin of probability.
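As a quick numerical check of the 20-80 statement (my own arithmetic, using the standard Benford first-digit frequencies for the nine digit ranks): the top two of the nine ranks, roughly 20% of them, carry almost exactly half of the probability,

$$P(1)+P(2)=\log_{10}2+\log_{10}\tfrac{3}{2}=\log_{10}3\approx 0.477,\qquad \sum_{d=3}^{9}P(d)=1-\log_{10}3\approx 0.523,$$

so the two "high-probability" ranks and the remaining seven split the total probability nearly evenly.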
Abstract: In the communication field, during transmission, a source signal undergoes a convolutive distortion between its symbols and the channel impulse response. This distortion is referred to as Intersymbol Interference (ISI) and can be reduced significantly by applying a blind adaptive deconvolution process (blind adaptive equalizer) to the distorted received symbols. But, since the entire blind deconvolution process is carried out with no training symbols and the channel's coefficients are obviously unknown to the receiver, no actual indication can be given (via the mean square error (MSE) or ISI expression) during the deconvolution process as to whether the blind adaptive equalizer succeeded in removing the heavy ISI from the transmitted symbols or not. Up to now, the output of a convolution and deconvolution process was mainly investigated from the ISI point of view. In this paper, the output of a convolution and deconvolution process is inspected from the leading digit point of view. Simulation results indicate that for the 4PAM (Pulse Amplitude Modulation) and 16QAM (Quadrature Amplitude Modulation) input cases, the number "1" is the leading digit at the output of the convolution and deconvolution process, respectively, as long as heavy ISI exists. However, this leading digit does not follow exactly Benford's law but follows approximately the leading digit (digit 1) of a Gaussian process for independent identically distributed input symbols and a channel with many coefficients.
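A brief sketch (illustrative assumptions, not the paper's simulation) comparing the leading-digit frequencies of a zero-mean Gaussian signal, which the abstract says the heavy-ISI output approximately follows, with Benford's law.

```python
import numpy as np

def leading_digit(x):
    """Leading significant digits of the nonzero entries of x."""
    x = np.abs(x)
    x = x[x > 0]
    return np.floor(x / 10.0 ** np.floor(np.log10(x))).astype(int)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    gauss = rng.normal(0.0, 1.0, size=500_000)   # stand-in for the heavy-ISI output
    digits = leading_digit(gauss)
    observed = np.array([(digits == d).mean() for d in range(1, 10)])
    benford = np.log10(1.0 + 1.0 / np.arange(1, 10))
    print("digit 1 (Gaussian):", round(observed[0], 3))
    print("digit 1 (Benford): ", round(benford[0], 3))
    print("all digits (Gaussian):", np.round(observed, 3))
```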
Abstract: The basic physics of unsteady Hele-Shaw flow at high Reynolds numbers is studied mainly by experimental measurement. In order to examine Darcy's law in a Hele-Shaw cell, exploiting the analogy between flow in such cells and flow in porous media, progressive water waves are used to set up an unsteady flow in a Hele-Shaw cell, and the complex wave number is measured by a wave height gauge. Meanwhile, theoretical analyses are used for comparison with the experimental data. The results show that Darcy's law is not exactly correct for unsteady Hele-Shaw flows, and a modified Darcy's law is expected.
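For reference (the standard textbook form, not taken from this paper), steady Hele-Shaw flow between plates separated by a gap b obeys a Darcy-type law, which is what the unsteady experiment above puts to the test:

$$\bar{\mathbf{u}} \;=\; -\,\frac{b^{2}}{12\,\mu}\,\nabla p,$$

where $\bar{\mathbf{u}}$ is the gap-averaged velocity, $\mu$ the dynamic viscosity, and $p$ the pressure; for unsteady flow an additional inertial (time-derivative) correction is expected to enter.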
Abstract: With the advent and growing popularity of image rendering software, photorealistic computer graphics are becoming more and more perceptually indistinguishable from photographic images. If the faked images are abused, they may lead to potential social, legal, or private consequences. To this end, it is very necessary and also challenging to find effective methods to differentiate between them. In this paper, a novel method to identify computer graphics based on the leading digit law, also called Benford's law, is proposed. More specifically, statistics of the most significant digits are extracted from the image's Discrete Cosine Transform (DCT) coefficients and the magnitudes of the image's gradient, and then Support Vector Machine (SVM) based classifiers are built. Results of experiments on the image datasets indicate that the proposed method is comparable to prior works. Besides, it possesses low-dimensional features and low computational complexity.
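A minimal sketch of the kind of feature extraction described above (hedged: the block size, digit handling, and downstream classifier settings are my assumptions, not the paper's): first-significant-digit frequencies of block-DCT coefficients are collected into a 9-dimensional feature vector that could feed an SVM.

```python
import numpy as np
from scipy.fftpack import dct

def block_dct_coeffs(gray, block=8):
    """2-D DCT of non-overlapping blocks of a grayscale image (float array)."""
    h, w = gray.shape
    h, w = h - h % block, w - w % block
    coeffs = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            b = gray[i:i + block, j:j + block]
            c = dct(dct(b, axis=0, norm="ortho"), axis=1, norm="ortho")
            coeffs.append(c.ravel()[1:])   # drop the DC term
    return np.concatenate(coeffs)

def first_digit_histogram(values):
    """Normalized frequencies of the first significant digits 1..9."""
    v = np.abs(values)
    v = v[v > 1e-8]
    d = np.floor(v / 10.0 ** np.floor(np.log10(v))).astype(int)
    return np.array([(d == k).mean() for k in range(1, 10)])

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    toy_image = rng.integers(0, 256, size=(128, 128)).astype(float)  # placeholder image
    features = first_digit_histogram(block_dct_coeffs(toy_image))
    print(features)   # 9-D feature vector; an SVM would be trained on many such vectors
```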
Funding: Project supported by NSFC (10131040) and SRFDP (2002335090).
Abstract: A law of the iterated logarithm for R/S statistics is shown with the help of strong approximations of R/S statistics by functions of a Wiener process.
Abstract: Let $\{X_n;\, n\ge 1\}$ be a sequence of i.i.d. random variables with finite variance, and let $Q(n)$ be the related R/S statistic. It is proved that
$$\lim_{\varepsilon \downarrow 0}\ \varepsilon^{2} \sum_{n=1}^{\infty} \frac{1}{n \log n}\, P\bigl\{Q(n) \ge \varepsilon \sqrt{2 n \log\log n}\bigr\} \;=\; \frac{1}{2}\, E Y^{2},$$
where $Y = \sup_{0 \le t \le 1} B(t) - \inf_{0 \le t \le 1} B(t)$ and $B(t)$ is a Brownian bridge.
Abstract: Data augmentation is one of the effective ways to improve the performance of deep learning models. To address the imbalance of detection performance across classes in multi-class object detection, an offline data augmentation method is proposed for "short-board classes" (classes whose detection performance is far below the model's average detection performance). Inspired by Cannikin's Law, a scene-diversity augmentation method based on a copy-paste mechanism is adopted: instance regions of the short-board classes are randomly sampled from the training set, and augmentation target samples in the training set are selected through a similarity-measurement mechanism for random pasting. To reduce the occlusion problems caused by random pasting, an augmentation method based on a self-occlusion (cut-replace) mechanism is adopted to improve the model's ability to represent occlusion: regions cropped from the sample itself are used to occlude the regions with the most salient feature responses. Experiments show that the mean average precision (mAP) of the FCOS object detection framework on the PASCAL VOC data is improved from 79.10% to 83.90%, with a particularly significant improvement of 20.8 percentage points on the short-board classes. On the MS-COCO data, the mean average precision is improved by 0.9 percentage points.
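A rough sketch of the copy-paste idea described above (hedged: a generic implementation with assumed inputs, not the paper's code): instances of under-performing classes are cropped from source images and pasted at random locations of target images, with the corresponding boxes added to the labels.

```python
import random
import numpy as np

def copy_paste(src_img, src_box, dst_img, dst_boxes, rng=random):
    """Paste the region src_box = (x1, y1, x2, y2) from src_img into dst_img
    at a random location; return the augmented image and updated box list.
    Images are H x W x C uint8 arrays; the pasted box would also carry the
    short-board class label, and occlusion handling is omitted here."""
    x1, y1, x2, y2 = src_box
    patch = src_img[y1:y2, x1:x2]
    ph, pw = patch.shape[:2]
    H, W = dst_img.shape[:2]
    if ph >= H or pw >= W:
        return dst_img, dst_boxes
    nx = rng.randint(0, W - pw - 1)
    ny = rng.randint(0, H - ph - 1)
    out = dst_img.copy()
    out[ny:ny + ph, nx:nx + pw] = patch
    return out, dst_boxes + [(nx, ny, nx + pw, ny + ph)]

if __name__ == "__main__":
    rng = random.Random(0)
    src = np.zeros((100, 100, 3), dtype=np.uint8)
    dst = np.full((200, 200, 3), 255, dtype=np.uint8)
    aug, boxes = copy_paste(src, (10, 10, 50, 60), dst, [], rng)
    print(aug.shape, boxes)
```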
Funding: Supported by the National Natural Science Foundation of China (10671149).
Abstract: We give some theorems on the strong law of large numbers and complete convergence for sequences of φ-mixing random variables. In particular, Wittmann's strong law of large numbers and Teicher's strong law of large numbers for independent random variables are generalized to the case of φ-mixing random variables.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 10972151 and 11272227) and the Innovation Program for Postgraduate in Higher Education Institutions of Jiangsu Province, China (Grant No. CXLX11_0961).
Abstract: This paper focuses on the Noether symmetries and the conserved quantities for both holonomic and nonholonomic systems based on a new non-conservative dynamical model introduced by El-Nabulsi. First, the El-Nabulsi dynamical model, which is based on a fractional integral extended by periodic laws, is introduced, and the El-Nabulsi-Hamilton canonical equations for a non-conservative Hamilton system with holonomic or nonholonomic constraints are established. Second, the definitions and criteria of El-Nabulsi-Noether symmetrical transformations and quasi-symmetrical transformations are presented in terms of the invariance of the El-Nabulsi-Hamilton action under the infinitesimal transformations of the group. Finally, Noether's theorems for the non-conservative Hamilton system under the El-Nabulsi dynamical system are established, which reveal the relationship between the Noether symmetry and the conserved quantity of the system.
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 11705256 and 11605264).
Abstract: Beverloo's scaling law can describe the flow rate of grains discharging from hoppers. In this paper, we show that Beverloo's scaling law remains valid for varying material parameters. The flow rates from a hopper with different hopper and orifice sizes (D, D_0) are studied by running large-scale simulations. When the hopper size is fixed, the numerical results show that Beverloo's law is valid even if the orifice diameter is very large, and the criteria for this law are then discussed. To eliminate the effect of walls, it is found that the criteria can be suggested as D - D_0 ≥ 40d or D/D_0 ≥ 2. Interestingly, it is found that there is still a scaling relation between the flow rate and the orifice diameter if D/D_0 is fixed and less than 2. When the orifice diameter is close to the hopper size, the velocity field changes and the vertical velocities of grains above the free-fall region are much larger. Then, the free-fall arch assumption is invalid and Beverloo's law is inapplicable.
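For context (the commonly quoted form of the law, stated here from the general literature rather than from this paper), Beverloo's scaling relates the discharge rate W to the orifice diameter D_0 and the grain diameter d as

$$W \;=\; C\,\rho_{b}\,\sqrt{g}\,\bigl(D_{0}-k\,d\bigr)^{5/2},$$

where $\rho_{b}$ is the bulk density, $g$ the gravitational acceleration, and $C$, $k$ are empirical constants.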
Funding: Project supported by the National Natural Science Foundation of China (Grant Nos. 61661008 and 61603104); the Natural Science Foundation of Guangxi Zhuang Autonomous Region, China (Grant Nos. 2015GXNSFBA139256 and 2016GXNSFCA380017); the Funding of Overseas 100 Talents Program of Guangxi Provincial Higher Education, China; the Research Project of Guangxi University of China (Grant No. KY2016YB059); the Guangxi Key Laboratory of Multi-source Information Mining & Security, China (Grant No. MIMS15-07); the Doctoral Research Foundation of Guangxi Normal University; the Guangxi Provincial Experiment Center of Information Science; and the Innovation Project of Guangxi Graduate Education (Grant No. YCSZ2017055).
Abstract: In this paper, a novel image encryption scheme based on Kepler's third law and a random Hadamard transform is proposed to ensure the security of a digital image. First, a set of Kepler periodic sequences is generated to permute the image data, which is characteristic of the plain-image and Kepler's third law. Then, a random Hadamard matrix is constructed by combining the standard Hadamard matrix with the hyperchaotic Chen system, which is used to further scramble the image coefficients when the image is transformed through the random Hadamard transform. In the end, the permuted image undergoes interweaving diffusion based on two special matrices, which are constructed from the Kepler periodic sequence and the chaotic system. The experimental results and performance analysis show that the proposed encryption scheme is highly sensitive to the plain-image and the external keys, and has high security and speed, which makes it very suitable for secure real-time communication of image data.
Funding: Supported by the National Natural Science Foundation of China (Nos. 30970522 and 40576058) and the National Natural Science Foundation of China for Creative Research Groups (No. 41121064).
Abstract: The Henry's law constant (k) for phosphine in seawater was determined by multiple phase equilibration combined with headspace gas chromatography. The effects of pH, temperature, and salinity on k were studied. The k value for phosphine in natural seawater was 6.415 at room temperature (approximately 23℃). This value increases with increasing temperature and salinity, but no obvious change was observed at different pH levels. At the same temperature, there was no significant difference between the k for phosphine in natural seawater and that in artificial seawater. This implies that temperature and salinity are the major determining factors for k in the marine environment. A double linear regression of the Henry's law constants for phosphine as a function of temperature and salinity confirmed our observations. These results provide a basis for the measurement of trace phosphine concentrations in seawater, and will be helpful for future research on the status of phosphine in the oceanic biogeochemical cycle of phosphorus.