The main objective of this paper is to establish the Poisson distribution conditions for the v-spirallike function classes S(γ;ψ) and K(γ;ψ). We also investigate an integral operator associated with the Poisson distribution.
In this paper, we consider the simultaneous estimation of the parameters (means) of independent Poisson distributions using the following loss functions: L0(θ,T) = Σ_{i=1}^{n} (T_i - θ_i)^2 and L1(θ,T) = Σ_{i=1}^{n} (T_i - θ_i)^2/θ_i. We develop an estimator which is better than the maximum likelihood estimator X simultaneously under L0(θ,T) and L1(θ,T). Our estimator possesses substantially smaller risk than the usual estimator X for estimating the parameters (means) of independent Poisson distributions.
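The abstract does not reproduce the estimator itself. As an illustrative stand-in, the classical Clevenson-Zidek shrinkage estimator, which is known to dominate the MLE under the normalized loss L1, can be compared against the MLE by Monte Carlo risk estimation. The sketch below uses that estimator as an assumption; it is not necessarily the paper's own construction.

```python
import math
import random

def poisson_sample(rng, lam):
    """Knuth's Poisson sampler; adequate for small means."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def risks(theta, n_sims=20000, seed=0):
    """Monte Carlo risk under L1(theta, T) = sum((T_i - theta_i)^2 / theta_i)
    for the MLE X and the Clevenson-Zidek shrinkage estimator
    d_i(X) = (1 - (n - 1) / (Z + n - 1)) * X_i with Z = sum(X)."""
    rng = random.Random(seed)
    n = len(theta)
    r_mle = r_cz = 0.0
    for _ in range(n_sims):
        x = [poisson_sample(rng, t) for t in theta]
        shrink = 1.0 - (n - 1) / (sum(x) + n - 1)
        r_mle += sum((xi - t) ** 2 / t for xi, t in zip(x, theta))
        r_cz += sum((shrink * xi - t) ** 2 / t for xi, t in zip(x, theta))
    return r_mle / n_sims, r_cz / n_sims
```

For small means the shrinkage toward zero pays off most; the MLE's risk under L1 is exactly n, which the simulation reproduces.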
In this study, we propose a two-stage randomized response model. Improved unbiased estimators of the mean number of persons possessing a rare sensitive attribute are proposed under two different situations. The proposed estimators are evaluated using a relative efficiency comparison. It is shown that our estimators are more efficient than existing estimators when the parameter of the rare unrelated attribute is known and, in the unknown case, depending on the probability of selecting a question.
Many articles dealing with individual cell lag phase determination assume that growth, when observed, comes from one cell. This assumption is not in agreement with the Poisson distribution, which uses the probability of growth in a sample to predict how many samples contain one, two, or some other number of cells. This article analyses and compares different approaches to improve the accuracy of lag phase estimation for individual cells and micropopulations. It argues that if the highest initial load, as predicted by the Poisson distribution, is assigned to the sample with the shortest lag phase, the second highest to the sample with the second shortest lag phase, and so on, the resulting lag phase distributions would be more accurate. This study also proposes the use of a robust test, the permutation test, to compare lag phase distributions obtained in different situations.
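The ranking idea described above can be sketched as follows. The fraction of samples showing growth estimates P(N ≥ 1) for a Poisson inoculum N, which fixes the Poisson mean; the predicted inoculum sizes are then handed out largest-first to the samples with the shortest lags. The rounding scheme used to turn expected frequencies into whole samples is an illustrative choice, not the paper's procedure.

```python
import math

def assign_initial_loads(lag_times, p_growth):
    """Assign Poisson-predicted initial cell numbers to growing samples:
    the largest predicted inoculum goes to the shortest lag, and so on.
    lag_times: lag of each sample that showed growth.
    p_growth: fraction of all samples that showed growth = P(N >= 1)."""
    lam = -math.log(1.0 - p_growth)      # Poisson mean of cells per sample
    n = len(lag_times)
    # expected number of growing samples that started from k cells,
    # using P(N = k | N >= 1) = P(N = k) / p_growth
    loads = []
    for k in range(1, 40):
        p_k = math.exp(-lam) * lam ** k / math.factorial(k) / p_growth
        loads.extend([k] * round(p_k * n))
    loads = sorted((loads + [1] * n)[:n], reverse=True)  # pad/trim to n samples
    # rank samples by lag (shortest first) and hand out loads in order
    out = [0] * n
    for load, idx in zip(loads, sorted(range(n), key=lambda i: lag_times[i])):
        out[idx] = load
    return out
```

With 30% of samples growing, about 83% of growing samples are predicted to start from a single cell, so only the shortest-lag samples receive multi-cell loads.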
Acceptance sampling is used to decide whether a whole lot will be accepted or rejected, based on inspection of randomly sampled items from the lot. As an alternative to traditional sampling plans, it is possible to use Bayesian approaches that exploit previous knowledge of process variation. This study presents a Bayesian two-sided group chain sampling plan (BTSGChSP) using various combinations of design parameters. In BTSGChSP, inspection is based on preceding as well as succeeding lots. The Poisson function is used to derive the probability of lot acceptance based on defective and non-defective products, and the gamma distribution is taken as a suitable prior for the Poisson distribution. Four quality regions are found, namely: (i) quality decision region (QDR), (ii) probabilistic quality region (PQR), (iii) limiting quality region (LQR) and (iv) indifference quality region (IQR). Producer's risk and consumer's risk are considered to estimate the quality regions, where the acceptable quality level (AQL) is associated with producer's risk and the limiting quality level (LQL) with consumer's risk. Moreover, AQL and LQL are used in the selection of design parameters for BTSGChSP. Values based on all possible combinations of design parameters for BTSGChSP are presented and inflection-point values are found. The findings show that BTSGChSP is a better substitute for the existing plan for industrial practitioners.
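The gamma-prior-on-Poisson construction mentioned above has a convenient consequence: the marginal distribution of the defect count is negative binomial, so the prior-averaged probability of lot acceptance has a closed form. The sketch below computes it for an acceptance number c; the parameter values in the usage note are illustrative, not taken from the paper.

```python
import math

def bayes_accept_prob(c, a, b):
    """Average probability of lot acceptance P(X <= c) when the defect count
    X | lam ~ Poisson(lam) and lam ~ Gamma(shape=a, rate=b): the marginal of
    X is negative binomial with success probability p = b / (b + 1)."""
    p = b / (b + 1.0)
    total = 0.0
    for x in range(c + 1):
        log_pmf = (math.lgamma(x + a) - math.lgamma(a) - math.lgamma(x + 1)
                   + a * math.log(p) + x * math.log(1.0 - p))
        total += math.exp(log_pmf)
    return total
```

For example, with an exponential prior (a = 1, b = 1) and c = 2, the marginal is geometric and the acceptance probability is 1 - (1/2)^3 = 0.875.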
We introduce here the concept of Bayesian networks in the compound Poisson model, which provides a graphical modeling framework that encodes the joint probability distribution for a set of random variables within a directed acyclic graph. We propose an approach which offers a new mixed implicit estimator. We show that the implicit approach applied to the compound Poisson model is very attractive for its ability to understand data and does not require any prior information. A comparative study between learned estimates given by the implicit and the standard Bayesian approaches is established. Under some conditions, and based on minimal squared-error calculations, we show that the mixed implicit estimator is better than the standard Bayesian and the maximum likelihood estimators. We illustrate our approach with a simulation study in the context of mobile communication networks.
A countable Markov chain in a Markovian environment is considered. A Poisson limit theorem for the chain recurring to small cylindrical sets is the main result. In order to prove this theorem, the entropy function h is introduced and the Shannon-McMillan-Breiman theorem for the Markov chain in a Markovian environment is shown. It is well known that a Markov process in a Markovian environment is generally not a standard Markov chain, so an example of Poisson approximation for a process which is not a Markov process is given. On the other hand, when the environmental process degenerates to a constant sequence, a Poisson limit theorem for countable Markov chains is obtained, which generalizes Pitskel's result for finite Markov chains.
In the present study, we undertake the task of hypothesis testing in the context of Poisson-distributed data. The primary objective of our investigation is to ascertain whether two distinct sets of discrete data share the same Poisson rate. We delve into a comprehensive review and comparative analysis of various frequentist and Bayesian methodologies specifically designed to address this problem. Among these are the conditional test, the likelihood ratio test, and the Bayes factor. Additionally, we employ the posterior predictive p-value in our analysis, coupled with its corresponding calibration procedures. As the culmination of our investigation, we apply these diverse methodologies to both simulated datasets and real-world data. The latter consists of the offspring distributions linked to COVID-19 cases in two disparate geographies, Hong Kong and Rwanda. This allows us to provide a practical demonstration of the methodologies' applications and their potential implications in the field of epidemiology.
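Of the methods listed above, the conditional test has a particularly compact form: conditional on the total count n = x1 + x2, the first count is binomial with success probability t1/(t1 + t2) under the null of equal rates. A minimal sketch (the exact two-sided convention varies between references; here all outcomes no more probable than the observed one are summed):

```python
import math

def poisson_two_sample_test(x1, t1, x2, t2):
    """Exact conditional test of H0: lam1 = lam2 for counts x1, x2 observed
    over exposures t1, t2. Conditional on n = x1 + x2, X1 ~ Binomial(n, p)
    with p = t1 / (t1 + t2) under H0; the two-sided p-value sums all
    outcomes no more probable than the observed one."""
    n, p = x1 + x2, t1 / (t1 + t2)
    pmf = [math.comb(n, k) * p ** k * (1.0 - p) ** (n - k) for k in range(n + 1)]
    obs = pmf[x1]
    return sum(q for q in pmf if q <= obs + 1e-12)
```

With equal exposures, observing 10 versus 2 events gives a p-value of about 0.039, so the equal-rate hypothesis would be rejected at the 5% level.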
In view of the fact that traditional air target threat assessment methods struggle to reflect the combat characteristics of uncertain, dynamic and hybrid formations, an algorithm is proposed to solve multi-target threat assessment problems. The target attribute weight is calculated by the intuitionistic fuzzy entropy (IFE) algorithm, and the time-series weight is obtained by the Poisson distribution method based on multi-time data. Finally, assessment and sequencing of the air multi-target threat model based on IFE and dynamic Vlse Kriterijumska Optimizacija I Kompromisno Resenje (VIKOR) is established with an example which indicates that the method is reasonable and effective.
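The abstract does not give the exact form of the Poisson-based time-series weights; a common construction is to take the Poisson pmf at the time indices and normalize. The sketch below follows that assumption and may differ from the paper's definition.

```python
import math

def poisson_time_weights(m, lam=1.0):
    """Weights for m observation instants from the Poisson pmf, normalized
    to sum to one; with k = 1 the most recent instant, weights decay for
    older data. The exact weighting used in the paper may differ."""
    raw = [math.exp(-lam) * lam ** k / math.factorial(k) for k in range(1, m + 1)]
    s = sum(raw)
    return [r / s for r in raw]
```

With m = 3 and lam = 1 this yields weights (0.6, 0.3, 0.1), so the most recent observation dominates the fused assessment.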
COVID-19, the virus of fear and anxiety, is one of the most recent and emergent of various respiratory disorders. It is similar to MERS-CoV and SARS-CoV, the viruses that affected large populations in different countries in the years 2012 and 2002, respectively. Various standard models have been used for COVID-19 epidemic prediction, but they suffered from low accuracy due to limited data availability and a high level of uncertainty. The proposed approach uses a machine-learning-based time-series Facebook NeuralProphet model to predict the numbers of deaths and confirmed cases, and compares it with the Poisson distribution and random forest models. The analysis was performed on a dataset covering the period from January 1st, 2020 to July 16th, 2021, and the model was developed to obtain forecast values up to September 2021. This study aimed to predict the second wave of the coronavirus pandemic in India using the latest time-series model to observe and predict the situation across the country. In India, cases had been increasing rapidly day by day since mid-February 2021. The proposed model forecasts the death rate of the COVID-19 dataset well, essentially in the second wave, and works effectively for future validation.
Surface-enhanced resonance Raman scattering (SERRS) of Rhodamine 6G (R6G) adsorbed on colloidal silver clusters has been studied. Based on the great enhancement of the Raman signal and the quenching of the fluorescence, the SERRS spectra of R6G were recorded for samples of dye colloidal solution at different concentrations. Spectral inhomogeneity behaviours from single molecules in the dried sample films were observed, with complementary evidence such as spectral polarization, spectral diffusion, intensity fluctuation of vibrational lines and even "breathing" of the molecules. Sequential spectra observed from a liquid sample with an average of 0.3 dye molecules in the probed volume exhibited the Poisson distribution expected for actually measuring 0, 1 or 2 molecules. Differences between the SERRS spectra of R6G excited by linearly and circularly polarized light were also measured experimentally.
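The single-molecule argument above rests on simple Poisson arithmetic: with a mean occupancy of 0.3 molecules in the probed volume, the probabilities of finding 0, 1 or 2 molecules follow directly from the pmf.

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

# mean occupancy of 0.3 molecules in the probed volume, as in the experiment
probs = [poisson_pmf(k, 0.3) for k in range(3)]
# most probed volumes are empty, and among occupied ones
# single-molecule events dominate over two-molecule events
```

This gives P(0) ≈ 0.741, P(1) ≈ 0.222 and P(2) ≈ 0.033, which is the occupancy statistics the sequential spectra were checked against.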
This paper describes a multi-threat real-time separating system for a broadband anti-radiation missile seeker. It presents a method, with a dual-port memory as comparator, to perform PF and PW hardware real-time separation and to determine the time-of-arrival (TOA) by use of the sequential difference histogram (SDIF). The method has been applied in practice and has achieved good results.
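The SDIF technique named above can be illustrated in a few lines: histogram the differences between TOAs a fixed number of pulses apart, and look for a dominant bin as a pulse repetition interval (PRI) candidate. This is a generic sketch of the histogram step only, not the paper's hardware implementation; the jittered pulse train in the example is made up.

```python
def sdif_histogram(toas, level=1, bin_width=1.0):
    """Sequential DIFference (SDIF) histogram: bin the differences between
    TOAs 'level' pulses apart; a dominant bin is a candidate pulse
    repetition interval (PRI)."""
    hist = {}
    for a, b in zip(toas, toas[level:]):
        bin_center = round((b - a) / bin_width) * bin_width
        hist[bin_center] = hist.get(bin_center, 0) + 1
    return hist

# a single pulse train with PRI 5 and small TOA jitter
h = sdif_histogram([0.0, 5.0, 10.1, 14.9, 20.0, 25.1])
pri = max(h, key=h.get)
```

For interleaved emitters, higher `level` values are scanned until a bin exceeds a detection threshold, which is where the real deinterleaving work lies.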
The marginal recursive equations on an excess-of-loss reinsurance treaty are investigated, under the assumption that the number of claims belongs to the family consisting of the Poisson, binomial and negative binomial distributions, and that the severity distribution has a bounded continuous density function. Conditional on the numbers of claims associated with the reinsurer and the cedent, some recursive equations are obtained for the marginal distributions of the total payments of the reinsurer and the cedent.
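The claim-number family named above (Poisson, binomial, negative binomial) is exactly the class for which Panjer-style recursions apply. As background for the kind of recursion the paper develops, here is the standard Panjer recursion for a compound Poisson total with discrete severities; the paper's own equations for the reinsurer/cedent split are more involved.

```python
import math

def compound_poisson_pmf(lam, sev, kmax):
    """Panjer recursion for the total claim S = X1 + ... + XN with
    N ~ Poisson(lam) and discrete severity pmf sev[j] = P(X = j):
    g_0 = exp(-lam * (1 - f_0)),  g_k = (lam / k) * sum_j j * f_j * g_{k-j}."""
    f = list(sev) + [0.0] * max(0, kmax + 1 - len(sev))
    g = [math.exp(-lam * (1.0 - f[0]))]
    for k in range(1, kmax + 1):
        g.append((lam / k) * sum(j * f[j] * g[k - j] for j in range(1, k + 1)))
    return g
```

A quick sanity check: with severities degenerate at 1, the compound total is just the claim count, so the recursion must reproduce the Poisson pmf itself.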
Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a major cause of damage in forests in this region, particularly during the regeneration phase. The model developed in this study is based on a systematic sample inventory of forest cockchafer larvae by excavation across the Hessian Ried. These forest cockchafer larvae data were characterized by excess zeros and overdispersion. Methods: Using specific generalized additive regression models, different discrete distributions, including the Poisson, negative binomial and zero-inflated Poisson distributions, were compared. The methodology employed allowed the simultaneous estimation of non-linear model effects of causal covariates and, to account for spatial autocorrelation, of a 2-dimensional spatial trend function. In the validation of the models, both the Akaike information criterion (AIC) and more detailed graphical procedures based on randomized quantile residuals were used. Results: The negative binomial distribution was superior to the Poisson and the zero-inflated Poisson distributions, providing a near perfect fit to the data, which was proven in an extensive validation process. The causal predictors found to affect the density of larvae significantly were distance to the water table and percentage of pure clay layer in the soil to a depth of 1 m. Model predictions showed that larva density increased with an increase in distance to the water table up to almost 4 m, after which it remained constant, and with a reduction in the percentage of pure clay layer. However, this latter correlation was weak and requires further investigation. The 2-dimensional trend function indicated a strong spatial effect, and thus explained by far the highest proportion of variation in larva density. Conclusions: As such, the model can be used to support forest practitioners in their decision making for regeneration and forest protection planning in the Hessian Ried. However, the application of the model for predicting future spatial patterns of the larva density is somewhat limited because the causal effects are still comparatively weak.
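The AIC comparison of count distributions described above can be illustrated on a toy scale. The sketch below fits a plain Poisson and a zero-inflated Poisson (ZIP) to a made-up zero-heavy sample, the ZIP by a crude grid search rather than the paper's generalized additive models, and compares AIC values; everything here (data, grids) is illustrative.

```python
import math

# synthetic zero-inflated, overdispersed counts: value -> frequency
counts = {0: 70, 2: 15, 3: 10, 5: 5}
n = sum(counts.values())

def pois_ll(lam):
    """Poisson log-likelihood over the tabulated counts."""
    return sum(c * (-lam + k * math.log(lam) - math.lgamma(k + 1))
               for k, c in counts.items())

def zip_ll(pi, lam):
    """Zero-inflated Poisson log-likelihood: extra zeros with probability pi."""
    ll = 0.0
    for k, c in counts.items():
        if k == 0:
            ll += c * math.log(pi + (1.0 - pi) * math.exp(-lam))
        else:
            ll += c * (math.log(1.0 - pi) - lam
                       + k * math.log(lam) - math.lgamma(k + 1))
    return ll

lam_hat = sum(k * c for k, c in counts.items()) / n   # Poisson MLE = sample mean
aic_pois = 2 * 1 - 2 * pois_ll(lam_hat)
best_zip = max(zip_ll(p / 100.0, l / 20.0)            # grid-search ZIP fit
               for p in range(1, 95) for l in range(1, 121))
aic_zip = 2 * 2 - 2 * best_zip                        # lower AIC wins
```

On such zero-heavy data the ZIP's extra parameter is easily worth its AIC penalty; in the paper the negative binomial won the analogous comparison.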
Wireless Body Area Network (WBAN) technologies are emerging with extensive applications in several domains. Health is a fascinating domain of WBAN for smart monitoring of a patient's condition. An important factor to consider in WBAN is a node's lifetime. Improving the lifetime of nodes is critical to address many issues, such as utility and reliability. Existing routing protocols have addressed the energy conservation problem but considered only a few parameters, thus affecting their performance. Moreover, most of the existing schemes did not consider traffic prioritization, which is critical in WBANs. In this paper, an adaptive multi-cost routing protocol is proposed with a multi-objective cost function considering minimum distance from the sink, temperature of sensor nodes, priority of sensed data, and maximum residual energy on sensor nodes. The performance of the proposed protocol is compared with existing schemes for the parameters network lifetime, stability period, throughput, energy consumption, and path loss. It is evident from the obtained results that the proposed protocol improves network lifetime and stability period by 30% and 15%, respectively, and outperforms the existing protocols in terms of throughput, energy consumption, and path loss.
From the point of view of the interplay between order and chaos, the most regular single-particle motion of neutrons has been found in superheavy systems based on the Skyrme-Hartree-Fock model and on the relativistic mean-field model. It has been shown that the statistical analysis of spectra can give valuable information about the stability of superheavy systems. In addition, it may yield deep insight into the single-particle motion in the mean field formed by the superheavy system.
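The abstract does not say which spectral statistic was used. A standard diagnostic in this order-versus-chaos context is the mean adjacent-gap ratio of the level spectrum, which is about 0.386 for a regular (Poisson) spectrum and about 0.536 for a chaotic (GOE) one; the sketch below checks the Poisson value on synthetic spacings and is offered only as an illustration of this kind of analysis.

```python
import math
import random

def mean_gap_ratio(spacings):
    """Mean adjacent-gap ratio <r> = <min(s_i, s_{i+1}) / max(s_i, s_{i+1})>:
    about 0.386 for a regular (Poisson) spectrum, about 0.536 for a chaotic
    (GOE) spectrum."""
    ratios = [min(a, b) / max(a, b) for a, b in zip(spacings, spacings[1:])]
    return sum(ratios) / len(ratios)

rng = random.Random(0)
# a regular spectrum has independent, exponentially distributed level spacings
poisson_spacings = [rng.expovariate(1.0) for _ in range(20000)]
r = mean_gap_ratio(poisson_spacings)   # close to 2*ln(2) - 1 = 0.386...
```

Unlike nearest-neighbour spacing histograms, the gap ratio needs no unfolding of the spectrum, which is why it is a convenient first check of regularity.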
Serial Analysis of Gene Expression (SAGE) is a powerful tool to analyze whole-genome expression profiles. SAGE data, characterized by large quantity and high dimensionality, require dimension reduction and feature extraction to improve accuracy and efficiency when used for pattern recognition and clustering analysis. A Poisson Model-based Kernel (PMK) is proposed based on the Poisson distribution of SAGE data. Kernel Principal Component Analysis (KPCA) with PMK is then applied to feature-extraction analysis of mouse retinal SAGE data. The computational results show that this algorithm can extract features effectively and reduce the dimensionality of SAGE data.
The probability of occurrence of strong (M_W ≥ 6.0) earthquakes in the area of Aeghion (Central Greece) is determined by Bayes statistics. A catalogue of strong shocks around the city of Aeghion since 1794 is used. For the purposes of our study, two distributions of earthquake occurrence are considered. In applying the Bayes approach, a Poisson distribution, which is memoryless, is assumed. In order to reinforce the result, a time-dependent model (normal distribution) is also used. The probabilities of earthquake occurrence for successive decades are determined by both distributions. The estimated probability for a strong earthquake to occur during 1996-2005 in relation to the Bayes approach shows that the year 2004 is the most likely for this future event. A pattern is also revealed which suggests that the earthquakes in the examined area occurred in clusters (in time). The strong earthquakes in these clusters occurred in quadruplets.
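Under the memoryless Poisson model mentioned above, the probability of at least one strong event in a future window depends only on the catalogue rate and the window length. A minimal sketch (the event count and observation span in the test are illustrative, not the Aeghion catalogue):

```python
import math

def poisson_occurrence_prob(n_events, years_observed, horizon):
    """Probability of at least one event within 'horizon' years under the
    memoryless Poisson model, with the rate estimated from a catalogue."""
    lam = n_events / years_observed   # estimated events per year
    return 1.0 - math.exp(-lam * horizon)
```

The Bayes treatment in the paper goes further by placing a prior on the rate instead of plugging in the point estimate, but the exponential form of the occurrence probability is the same building block.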
Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. The storm surge disaster in Qingdao depends on various influencing factors such as the intensity, duration, and route of the passing typhoon, and thus a comprehensive understanding of natural coastal hazards is essential. In order to compensate for the shortcomings of merely using the warning water level, this paper presents two statistical distribution models (the Poisson bivariable Gumbel logistic distribution and the Poisson bivariable log-normal distribution) to classify the intensity of storm surge. We emphasize the joint return period of typhoon-induced water levels and wave heights measured in the coastal area of Qingdao since 1949. The present study establishes a new criterion to classify the intensity grade of catastrophic storms using the typhoon surge estimated by the two models. A case study demonstrates that the new criterion is well defined in terms of probability concepts, is easy to implement, and fits well the calculation of storm surge intensity. The procedures with the proposed statistical models would be useful for disaster mitigation in other coastal areas influenced by typhoons.
Historically, Crescent City is one of the most vulnerable communities impacted by tsunamis along the west coast of the United States, largely attributable to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss and human death. How to determine the return values of tsunami height using relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all the above distributions. Both the Kolmogorov-Smirnov test and the root-mean-square-error method are utilized for goodness-of-fit testing, and the better-fitting distribution is selected. Assuming that the number of tsunami occurrences in each year follows the Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then point and interval estimates of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
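The compound construction above combines a Poisson count of events per year with a fitted parent distribution F of per-event heights, giving an annual-maximum distribution F_annual(x) = exp(-lam * (1 - F(x))). Inverting it at 1 - 1/T yields the T-year return level. The sketch below assumes a Gumbel parent purely for illustration; the paper selects the parent by goodness-of-fit among six candidates.

```python
import math

def return_level(T, lam, mu, sigma):
    """T-year return level under the Poisson compound extreme value model
    F_annual(x) = exp(-lam * (1 - F(x))), with a Gumbel parent
    F(x) = exp(-exp(-(x - mu) / sigma)) fitted to per-event tsunami heights."""
    tail = -math.log(1.0 - 1.0 / T) / lam    # required exceedance 1 - F(x_T)
    # invert the Gumbel CDF at F = 1 - tail
    return mu - sigma * math.log(-math.log(1.0 - tail))
```

A forward check (plug the returned x_T back into F_annual) should recover 1 - 1/T exactly, which is a useful unit test for any such inversion.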
Funding (individual cell lag phase study): supported by the Ministerio de Educacion y Ciencia (Spain), Program Consolider CARNISENUSA CSD2007-0016 and AGL-2010-16598.
Funding (Bayesian two-sided group chain sampling plan study): supported by the Ministry of Higher Education (MoHE) through the Fundamental Research Grant Scheme (FRGS/1/2020/STG06/UUM/02/2).
Funding (Poisson hypothesis testing study): supported by a grant from City University of Hong Kong (Project No. 9610639).
Funding (air multi-target threat assessment study): supported by the National Natural Science Foundation of China (61401363), the Science and Technology on Avionics Integration Laboratory and Aeronautical Science Foundation (20155153034), the Innovative Talents Promotion Plan in Shaanxi Province (2017KJXX-15), and the Fundamental Research Funds for the Central Universities (3102016AXXX005).
Funding (COVID-19 prediction study): supported by the Taif University Researchers Supporting Project (TURSP-2020/254).
文摘COVID-19,being the virus of fear and anxiety,is one of the most recent and emergent of various respiratory disorders.It is similar to the MERS-COV and SARS-COV,the viruses that affected a large population of different countries in the year 2012 and 2002,respectively.Various standard models have been used for COVID-19 epidemic prediction but they suffered from low accuracy due to lesser data availability and a high level of uncertainty.The proposed approach used a machine learning-based time-series Facebook NeuralProphet model for prediction of the number of death as well as confirmed cases and compared it with Poisson Distribution,and Random Forest Model.The analysis upon dataset has been performed considering the time duration from January 1st 2020 to16th July 2021.The model has been developed to obtain the forecast values till September 2021.This study aimed to determine the pandemic prediction of COVID-19 in the second wave of coronavirus in India using the latest Time-Series model to observe and predict the coronavirus pandemic situation across the country.In India,the cases are rapidly increasing day-by-day since mid of Feb 2021.The prediction of death rate using the proposed model has a good ability to forecast the COVID-19 dataset essentially in the second wave.To empower the prediction for future validation,the proposed model works effectively.
Abstract: Surface-enhanced resonance Raman scattering (SERRS) of Rhodamine 6G (R6G) adsorbed on colloidal silver clusters has been studied. Based on the great enhancement of the Raman signal and the quenching of the fluorescence, SERRS spectra of R6G were recorded for samples of dye colloidal solution at different concentrations. Spectral inhomogeneity from single molecules in the dried sample films was observed with complementary evidence, such as spectral polarization, spectral diffusion, intensity fluctuation of vibrational lines and even "breathing" of the molecules. Sequential spectra observed from a liquid sample with an average of 0.3 dye molecules in the probed volume exhibited the Poisson distribution expected for actually measuring 0, 1 or 2 molecules. Differences between the SERRS spectra of R6G excited by linearly and circularly polarized light were experimentally measured.
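The Poisson occupancy argument for an average of 0.3 molecules in the probed volume is easy to reproduce; a minimal sketch:

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# With 0.3 molecules on average in the probed volume, most spectra come from
# 0 molecules, roughly a fifth from 1, and only a few percent from 2.
probs = {k: poisson_pmf(k, 0.3) for k in (0, 1, 2)}
```

This is why sequential single-molecule spectra alternate between empty frames and one- or (rarely) two-molecule signals at this dilution.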
Abstract: This paper describes a multi-threat real-time separating system for a broadband anti-radiation missile seeker. It presents a method, with a dual-port memory as comparator, to perform PF and PW hardware real-time separation and to determine the time of arrival (TOA) by use of a sequential difference histogram (SDIF). The method has been applied in practice and has achieved good results.
Funding: Project supported by the National Natural Science Foundation of China (Nos. 10471008, 19831020).
Abstract: The marginal recursive equations on an excess-of-loss reinsurance treaty are investigated, under the assumption that the number of claims belongs to the family consisting of the Poisson, binomial and negative binomial distributions, and that the severity distribution has a bounded continuous density function. Conditional on the numbers of claims associated with the reinsurer and the cedent, some recursive equations are obtained for the marginal distributions of the total payments of the reinsurer and the cedent.
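The recursions themselves are not reproduced in the abstract. For the Poisson claim-number case they are close in spirit to the classical Panjer recursion for a compound Poisson total, sketched here under the simplifying assumption of an integer-valued severity (the paper treats continuous severities):

```python
import math

def panjer_poisson(lam, f, n_max):
    """Panjer recursion for a compound Poisson total S = X_1 + ... + X_N:
    N ~ Poisson(lam), i.i.d. severities X_i on {0, 1, 2, ...} with pmf f.
    Returns P(S = k) for k = 0 .. n_max."""
    g = [math.exp(-lam * (1.0 - f[0]))]          # P(S = 0)
    for k in range(1, n_max + 1):
        s = sum(j * f[j] * g[k - j] for j in range(1, min(k, len(f) - 1) + 1))
        g.append(lam * s / k)
    return g

# Severity 0 or 1 with equal probability: S is then a thinned Poisson count,
# i.e. S ~ Poisson(lam * f[1]), which gives a closed-form check.
g = panjer_poisson(2.0, [0.5, 0.5], 6)
```

With `lam = 2.0` and `f = [0.5, 0.5]`, the recursion should reproduce the Poisson(1) pmf exactly, which makes the implementation easy to verify.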
Abstract: Background: In this paper, a regression model for predicting the spatial distribution of forest cockchafer larvae in the Hessian Ried region (Germany) is presented. The forest cockchafer, a native biotic pest, is a major cause of damage in forests in this region, particularly during the regeneration phase. The model developed in this study is based on a systematic sample inventory of forest cockchafer larvae by excavation across the Hessian Ried. These forest cockchafer larvae data were characterized by excess zeros and overdispersion. Methods: Using specific generalized additive regression models, different discrete distributions, including the Poisson, negative binomial and zero-inflated Poisson distributions, were compared. The methodology employed allowed the simultaneous estimation of non-linear model effects of causal covariates and, to account for spatial autocorrelation, of a 2-dimensional spatial trend function. In the validation of the models, both the Akaike information criterion (AIC) and more detailed graphical procedures based on randomized quantile residuals were used. Results: The negative binomial distribution was superior to the Poisson and the zero-inflated Poisson distributions, providing a near perfect fit to the data, which was proven in an extensive validation process. The causal predictors found to affect the density of larvae significantly were distance to the water table and percentage of pure clay layer in the soil to a depth of 1 m. Model predictions showed that larva density increased with an increase in distance to the water table up to almost 4 m, after which it remained constant, and with a reduction in the percentage of pure clay layer. However, this latter correlation was weak and requires further investigation. The 2-dimensional trend function indicated a strong spatial effect, and thus explained by far the highest proportion of variation in larva density.
Conclusions: As such, the model can be used to support forest practitioners in their decision making for regeneration and forest protection planning in the Hessian Ried. However, the application of the model for predicting future spatial patterns of the larva density is somewhat limited because the causal effects are still comparatively weak.
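The model-selection step, comparing a plain Poisson against a zero-inflated alternative by AIC on zero-heavy count data, can be illustrated without the GAM machinery. The toy counts and the crude grid search below are assumptions for illustration only:

```python
import math

def poisson_loglik(counts, lam):
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1) for k in counts)

def zip_loglik(counts, pi, lam):
    """Zero-inflated Poisson: with probability pi an observation is a structural zero."""
    ll = 0.0
    for k in counts:
        if k == 0:
            ll += math.log(pi + (1.0 - pi) * math.exp(-lam))
        else:
            ll += math.log(1.0 - pi) - lam + k * math.log(lam) - math.lgamma(k + 1)
    return ll

# Toy zero-heavy counts standing in for the larvae-per-excavation-pit data.
counts = [0] * 60 + [1] * 10 + [2] * 8 + [3] * 6 + [5] * 4 + [8] * 2

lam_mle = sum(counts) / len(counts)                  # Poisson MLE (1 parameter)
aic_pois = 2 * 1 - 2 * poisson_loglik(counts, lam_mle)

# Crude grid search over the two ZIP parameters (illustration only).
best_ll = max(zip_loglik(counts, p / 100.0, l / 10.0)
              for p in range(1, 95) for l in range(1, 80))
aic_zip = 2 * 2 - 2 * best_ll
```

With this many excess zeros the zero-inflated model wins on AIC despite its extra parameter, which is the same reasoning the study applies (there, the negative binomial won overall).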
Abstract: Wireless Body Area Network (WBAN) technologies are emerging with extensive applications in several domains. Health is a fascinating domain of WBAN for smart monitoring of a patient's condition. An important factor to consider in WBAN is a node's lifetime. Improving the lifetime of nodes is critical to address many issues, such as utility and reliability. Existing routing protocols have addressed the energy conservation problem but considered only a few parameters, thus affecting their performance. Moreover, most of the existing schemes did not consider traffic prioritization, which is critical in WBANs. In this paper, an adaptive multi-cost routing protocol is proposed with a multi-objective cost function considering minimum distance from the sink, temperature of sensor nodes, priority of sensed data, and maximum residual energy of sensor nodes. The performance of the proposed protocol is compared with existing schemes for the following parameters: network lifetime, stability period, throughput, energy consumption, and path loss. It is evident from the obtained results that the proposed protocol improves network lifetime and stability period by 30% and 15%, respectively, and outperforms the existing protocols in terms of throughput, energy consumption, and path loss.
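A multi-objective cost of this shape can be sketched as a weighted combination of the four criteria the abstract names. The weights, the [0, 1] normalisation of each input, and the exact functional form are hypothetical, not the paper's calibrated cost function:

```python
def route_cost(next_hop, w_dist=0.3, w_temp=0.2, w_prio=0.2, w_energy=0.3):
    """Hypothetical multi-objective cost (lower is better). Inputs are assumed
    normalised to [0, 1]; priority near 1 means more urgent sensed data."""
    return (w_dist * next_hop["dist_to_sink"]        # prefer shorter paths
            + w_temp * next_hop["temperature"]       # avoid hot (overused) nodes
            + w_prio * (1.0 - next_hop["priority"])  # favour urgent traffic
            - w_energy * next_hop["residual_energy"])  # favour fresh batteries

candidates = [
    {"dist_to_sink": 0.9, "temperature": 0.2, "priority": 0.5, "residual_energy": 0.8},
    {"dist_to_sink": 0.4, "temperature": 0.3, "priority": 0.5, "residual_energy": 0.7},
]
best = min(candidates, key=route_cost)  # forwarder with the lowest cost
```

Selecting the forwarder by minimising such a cost at each hop is what makes the protocol adaptive: as nodes heat up or drain, their cost rises and traffic shifts away from them.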
Abstract: From the point of view of the interplay between order and chaos, the most regular single-particle motion of neutrons has been found in the superheavy system with and based on the Skyrme–Hartree–Fock model and in the system with and based on the relativistic mean-field model. It has been shown that the statistical analysis of spectra can give valuable information about the stability of superheavy systems. In addition, it may yield deep insight into the single-particle motion in the mean field formed by the superheavy system.
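The spectral-statistics diagnostic behind such order-chaos analyses contrasts the Poisson spacing law (regular motion) with the Wigner surmise (chaotic motion). A sketch of the two reference densities for unfolded nearest-neighbour level spacings; the paper's exact statistical measures are not given in the abstract:

```python
import math

def poisson_spacing(s):
    """Nearest-neighbour spacing density for a regular (integrable) spectrum:
    uncorrelated levels, so spacings cluster near zero."""
    return math.exp(-s)

def wigner_spacing(s):
    """Wigner surmise (GOE) for a chaotic spectrum: level repulsion at s = 0."""
    return (math.pi / 2.0) * s * math.exp(-math.pi * s * s / 4.0)

# The qualitative difference at s = 0 is the standard order/chaos signature.
p0, w0 = poisson_spacing(0.0), wigner_spacing(0.0)
```

Both densities are normalised with unit mean spacing; a measured spacing histogram close to the Poisson curve signals regular single-particle motion.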
Funding: Supported by the National Natural Science Foundation of China (No. 50877004).
Abstract: Serial Analysis of Gene Expression (SAGE) is a powerful tool for analyzing whole-genome expression profiles. SAGE data, characterized by large quantity and high dimensionality, require dimension reduction and feature extraction to improve accuracy and efficiency when used for pattern recognition and clustering analysis. A Poisson Model-based Kernel (PMK) was proposed based on the Poisson distribution of SAGE data. Kernel Principal Component Analysis (KPCA) with PMK was proposed and used in feature-extraction analysis of mouse retinal SAGE data. The computational results show that this algorithm can extract features effectively and reduce the dimensionality of SAGE data.
Abstract: The probability of occurrence of strong (MW≥6.0) earthquakes in the area of Aeghion (Central Greece) is determined by Bayes statistics. A catalogue of strong shocks around the city of Aeghion since 1794 is used. For the purposes of our study, two distributions of earthquake occurrence are considered. In applying the Bayes approach, a Poisson distribution, which is memoryless, is assumed. To reinforce the result, a time-dependent model (normal distribution) is also used. The probabilities of earthquake occurrence for successive decades are determined by both distributions. The estimated probability for a strong earthquake to occur during 1996~2005 under the Bayes approach shows that the year 2004 is the most likely for this future event. A pattern is also revealed which suggests that the earthquakes in the examined area occurred in clusters (in time). The strong earthquakes in these clusters occurred in quadruplets.
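Under the memoryless Poisson assumption, the decade-level occurrence probability reduces to a one-liner. The event rate below (7 strong shocks over a roughly 200-year catalogue) is a hypothetical stand-in, not the paper's estimated rate:

```python
import math

def prob_at_least_one(rate_per_year, years):
    """P(N >= 1) over the window for a homogeneous Poisson process."""
    return 1.0 - math.exp(-rate_per_year * years)

# Hypothetical rate: 7 strong (MW >= 6.0) shocks in a ~200-year catalogue.
lam = 7.0 / 200.0
p_decade = prob_at_least_one(lam, 10.0)  # probability within one decade
```

The memoryless property means this probability is the same for every decade, which is exactly why the paper adds a time-dependent (normal) model to capture clustering.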
Funding: Supported by the National Natural Science Foundation of China (Nos. 51279186, 51479183), the National Program on Key Basic Research Project (2011CB013704), the 111 Project (B14028), and the Marine and Fishery Information Center Project of Jiangsu Province (SJC2014110338).
Abstract: Storm surge is one of the predominant natural threats to coastal communities. Qingdao is located on the southern coast of the Shandong Peninsula in China. The storm surge disaster in Qingdao depends on various influencing factors, such as the intensity, duration, and route of the passing typhoon, and thus a comprehensive understanding of natural coastal hazards is essential. In order to make up for the shortcomings of merely using the warning water level, this paper presents two statistical distribution models (Poisson Bivariable Gumbel Logistic Distribution and Poisson Bivariable Log-normal Distribution) to classify the intensity of storm surges. We emphasize the joint return period of typhoon-induced water levels and wave heights measured in the coastal area of Qingdao since 1949. The present study establishes a new criterion to classify the intensity grade of catastrophic storms using the typhoon surge estimated by the two models. A case study demonstrates that the new criterion is well defined in terms of probability concepts, is easy to implement, and fits the calculation of storm surge intensity well. The procedures with the proposed statistical models would be useful for disaster mitigation in other coastal areas influenced by typhoons.
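One way a Poisson bivariate Gumbel logistic model yields a joint return period is sketched below. The logistic dependence form is the textbook Gumbel logistic model, the "OR" exceedance definition and all parameter values are hypothetical, and neither need match the paper's exact variant:

```python
import math

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

def logistic_joint_cdf(x, y, mx, bx, my, by, m):
    """Gumbel logistic bivariate CDF; m >= 1 controls dependence
    (m = 1 recovers independence of the two Gumbel margins)."""
    u = -math.log(gumbel_cdf(x, mx, bx))
    v = -math.log(gumbel_cdf(y, my, by))
    return math.exp(-(u**m + v**m) ** (1.0 / m))

def joint_return_period(x, y, lam, mx, bx, my, by, m):
    """Mean years between typhoon events in which the water level exceeds x
    or the wave height exceeds y ("OR" case), with Poisson(lam) events/year."""
    return 1.0 / (lam * (1.0 - logistic_joint_cdf(x, y, mx, bx, my, by, m)))

# Hypothetical thresholds (m) and fitted parameters for illustration.
T = joint_return_period(3.0, 4.0, lam=2.0, mx=1.0, bx=0.5, my=1.5, by=0.8, m=2.0)
```

Grading storm intensity then amounts to banding events by the joint return period of their observed (water level, wave height) pair.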
Funding: Supported by the National Natural Science Foundation of China (51279186, 51479183, 51509227), the National Key Research and Development Program (2016YFC0802301), the National Program on Key Basic Research Project (2011CB013704), and the Shandong Province Natural Science Foundation, China (ZR2014EEQ030).
Abstract: Historically, Crescent City is one of the communities most vulnerable to tsunamis along the west coast of the United States, largely owing to its offshore geography. Trans-ocean tsunamis usually produce large wave runup at Crescent Harbor, resulting in catastrophic damage, property loss, and human death. Determining the return values of tsunami height using relatively short-term observation data is of great significance for assessing tsunami hazards and improving engineering design along the coast of Crescent City. In the present study, the extreme tsunami heights observed along the coast of Crescent City from 1938 to 2015 are fitted using six different probabilistic distributions, namely the Gumbel distribution, the Weibull distribution, the maximum entropy distribution, the lognormal distribution, the generalized extreme value distribution, and the generalized Pareto distribution. The maximum likelihood method is applied to estimate the parameters of all of the above distributions. Both the Kolmogorov-Smirnov test and the root mean square error method are utilized for goodness-of-fit testing, and the better-fitting distribution is selected. Assuming that the number of tsunami occurrences in each year follows a Poisson distribution, the Poisson compound extreme value distribution can be used to fit the annual maximum tsunami amplitude, and then point and interval estimates of return tsunami heights are calculated for structural design. The results show that the Poisson compound extreme value distribution fits the tsunami heights very well and is suitable for determining return tsunami heights for coastal disaster prevention.
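The Poisson compound extreme value construction can be sketched directly: if events arrive as Poisson(λ) per year and each event height follows a CDF G, the annual maximum has CDF exp(-λ(1 - G(x))). A Gumbel margin and all numeric parameters below are illustrative assumptions, not the fitted Crescent City values:

```python
import math

def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

def compound_annual_cdf(x, lam, mu, beta):
    """Poisson compound extreme value CDF: Poisson(lam) tsunami events per year,
    per-event heights ~ Gumbel(mu, beta); annual max CDF is exp(-lam*(1-G(x)))."""
    return math.exp(-lam * (1.0 - gumbel_cdf(x, mu, beta)))

def return_height(T_years, lam, mu, beta, lo=0.0, hi=20.0):
    """Invert the annual CDF at probability 1 - 1/T by bisection."""
    target = 1.0 - 1.0 / T_years
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if compound_annual_cdf(mid, lam, mu, beta) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical fitted parameters; the 100-year return height follows directly.
h100 = return_height(100.0, lam=0.8, mu=1.2, beta=0.6)
```

The compound form uses the event count explicitly, which is what lets relatively short records support long return-period estimates.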