Count data is almost always over-dispersed where the variance exceeds the mean. Several count data models have been proposed by researchers but the problem of over-dispersion still remains unresolved, more so in the c...Count data is almost always over-dispersed where the variance exceeds the mean. Several count data models have been proposed by researchers but the problem of over-dispersion still remains unresolved, more so in the context of change point analysis. This study develops a likelihood-based algorithm that detects and estimates multiple change points in a set of count data assumed to follow the Negative Binomial distribution. Discrete change point procedures discussed in literature work well for equi-dispersed data. The new algorithm produces reliable estimates of change points in cases of both equi-dispersed and over-dispersed count data;hence its advantage over other count data change point techniques. The Negative Binomial Multiple Change Point Algorithm was tested using simulated data for different sample sizes and varying positions of change. Changes in the distribution parameters were detected and estimated by conducting a likelihood ratio test on several partitions of data obtained through step-wise recursive binary segmentation. Critical values for the likelihood ratio test were developed and used to check for significance of the maximum likelihood estimates of the change points. The change point algorithm was found to work best for large datasets, though it also works well for small and medium-sized datasets with little to no error in the location of change points. The algorithm correctly detects changes when present and fails to detect changes when change is absent in actual sense. Power analysis of the likelihood ratio test for change was performed through Monte-Carlo simulation in the single change point setting. Sensitivity analysis of the test power showed that likelihood ratio test is the most powerful when the simulated change points are located mid-way through the sample data as opposed to when changes were located in the periphery. Further, the test is more powerful when the change was located three-quarter-way through the sample data compared to when the change point is closer (quarter-way) to the first observation.展开更多
In this paper, asymptotic expansions of the distribution of the likelihood ratio statistic for testing sphericity in a crowth curve model have been derived in the null and nonnull cases when the alternatives are dose ...In this paper, asymptotic expansions of the distribution of the likelihood ratio statistic for testing sphericity in a crowth curve model have been derived in the null and nonnull cases when the alternatives are dose to the null hypothesis. These expansions are given in series form of beta distributions.展开更多
A novel technique is proposed to improve the performance of voice activity detection(VAD) by using deep belief networks(DBN) with a likelihood ratio(LR). The likelihood ratio is derived from the speech and noise spect...A novel technique is proposed to improve the performance of voice activity detection(VAD) by using deep belief networks(DBN) with a likelihood ratio(LR). The likelihood ratio is derived from the speech and noise spectral components that are assumed to follow the Gaussian probability density function(PDF). The proposed algorithm employs DBN learning in order to classify voice activity by using the input signal to calculate the likelihood ratio. Experiments show that the proposed algorithm yields improved results in various noise environments, compared to the conventional VAD algorithms. Furthermore, the DBN based algorithm decreases the detection probability of error with [0.7, 2.6] compared to the support vector machine based algorithm.展开更多
While the conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory,digital forensics examiners rarely if eve...While the conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory,digital forensics examiners rarely if ever do so.In this paper,we review some of the quantitative tools and techniques which are available for use in digital forensic investigations,including Bayesian networks,complexity theory,information theory and probability theory,and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence.The potential benefits of such quantitative measures for modern digital forensics are also outlined.展开更多
Suppose that several different imperfect instruments and one perfect instrument are independently used to measure some characteristics of a population. Thus, measurements of two or more sets of samples with varying ac...Suppose that several different imperfect instruments and one perfect instrument are independently used to measure some characteristics of a population. Thus, measurements of two or more sets of samples with varying accuracies are obtained. Statistical inference should be based on the pooled samples. In this article, the authors also assumes that all the imperfect instruments are unbiased. They consider the problem of combining this information to make statistical tests for parameters more relevant. They define the empirical likelihood ratio functions and obtain their asymptotic distributions in the presence of measurement error.展开更多
This paper investigates the modified likelihood ratio test(LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limit distribution of the modified likelihood ...This paper investigates the modified likelihood ratio test(LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limit distribution of the modified likelihood ratio test is X^2(1).展开更多
The environment of the wireless communication system in the coal mine has unique characteristics: great noise, strong multiple path interference, and the wireless communication of orthogonal frequency division multip...The environment of the wireless communication system in the coal mine has unique characteristics: great noise, strong multiple path interference, and the wireless communication of orthogonal frequency division multiplexing (OFDM) in underground coal mine is sensitive to the frequency selection of multiple path fading channel, whose decoding is separated from the traditional channel estimation algorithm. In order to increase its accuracy and reliability, a new iterating channel estimation algorithm combining the logarithm likelihood ratio (LLR) decode iterate based on the maximum likelihood estimation (ML) is proposed in this paper, which estimates iteration channel in combination with LLR decode. Without estimating the channel noise power, it exchanges the information between the ML channel estimation and the LLR decode using the feedback information of LLR decode. The decoding speed is very quick, and the satisfied result will be obtained by iterating in some time. The simulation results of the shortwave broadband channel in the coal mine show that the error rate of the system is basically convergent after the iteration in two times.展开更多
A distributed turbo codes( DTC) scheme with log likelihood ratio( LLR)-based threshold at the relay for a two-hop relay networks is proposed. Different from traditional DTC schemes,the retransmission scheme at the...A distributed turbo codes( DTC) scheme with log likelihood ratio( LLR)-based threshold at the relay for a two-hop relay networks is proposed. Different from traditional DTC schemes,the retransmission scheme at the relay,where imperfect decoding occurs,is considered in the proposed scheme. By employing a LLR-based threshold at the relay in the proposed scheme,the reliability of decoder-LLRs can be measured. As a result,only reliable symbols will be forwarded to the destination and a maximum ratio combiner( MRC) is used to combine signals received from both the source and the relay. In order to obtain the optimal threshold at the relay,an equivalent model of decoderLLRs is investigated,so as to derive the expression of the bit error probability( BEP) of the proposed scheme under binary phase shift keying( BPSK) modulation. Simulation results demonstrate that the proposed scheme can effectively mitigate error propagation at the relay and also outperforms other existing methods.展开更多
Global Navigation Satellite System(GNSS)-based passive radar(GBPR)has been widely used in remote sensing applications.However,for moving target detection(MTD),the quadratic phase error(QPE)introduced by the non-cooper...Global Navigation Satellite System(GNSS)-based passive radar(GBPR)has been widely used in remote sensing applications.However,for moving target detection(MTD),the quadratic phase error(QPE)introduced by the non-cooperative target motion is usually difficult to be compensated,as the low power level of the GBPR echo signal renders the estimation of the Doppler rate less effective.Consequently,the moving target in GBPR image is usually defocused,which aggravates the difficulty of target detection even further.In this paper,a spawning particle filter(SPF)is proposed for defocused MTD.Firstly,the measurement model and the likelihood ratio function(LRF)of the defocused point-like target image are deduced.Then,a spawning particle set is generated for subsequent target detection,with reference to traditional particles in particle filter(PF)as their parent.After that,based on the PF estimator,the SPF algorithm and its sequential Monte Carlo(SMC)implementation are proposed with a novel amplitude estimation method to decrease the target state dimension.Finally,the effectiveness of the proposed SPF is demonstrated by numerical simulations and pre-liminary experimental results,showing that the target range and Doppler can be estimated accurately.展开更多
Wireless Communication is a system for communicating information from one point to other,without utilizing any connections like wire,cable,or other physical medium.Cognitive Radio(CR)based systems and networks are a r...Wireless Communication is a system for communicating information from one point to other,without utilizing any connections like wire,cable,or other physical medium.Cognitive Radio(CR)based systems and networks are a revolutionary new perception in wireless communications.Spectrum sensing is a vital task of CR to avert destructive intrusion with licensed primary or main users and discover the accessible spectrum for the efficient utilization of the spectrum.Centralized Cooperative Spectrum Sensing(CSS)is a kind of spectrum sensing.Most of the test metrics designed till now for sensing the spectrum is produced by using the Sample Covariance Matrix(SCM)of the received signal.Some of the methods that use the SCM for the process of detection are Pietra-Ricci Index Detector(PRIDe),Hadamard Ratio(HR)detector,Gini Index Detector(GID),etc.This paper presents the simulation and comparative perfor-mance analysis of PRIDe with various other detectors like GID,HR,Arithmetic to Geometric Mean(AGM),Volume-based Detector number 1(VD1),Maximum-to-Minimum Eigenvalue Detection(MMED),and Generalized Likelihood Ratio Test(GLRT)using the MATLAB software.The PRIDe provides better performance in the presence of variations in the power of the signal and the noise power with less computational complexity.展开更多
This article provides a brief overview of various approaches that may be utilized for the analysis of human semen test results. Reference intervals are the most widely used tool for the interpretation of clinical labo...This article provides a brief overview of various approaches that may be utilized for the analysis of human semen test results. Reference intervals are the most widely used tool for the interpretation of clinical laboratory results. Reference interval development has classically relied on concepts elaborated by the International Federation of Clinical Chemistry Expert Panel on Reference Values during the 1980s. These guidelines involve obtaining and classifying samples from a healthy population of at least 120 individuals and then identifying the outermost 5% of observations to use in defining limits for two-sided or one-sided reference intervals. More recently, decision limits based on epidemiological outcome analysis have also been introduced to aid in test interpretation. The reference population must be carefully defined on the basis of the intended clinical use of the underlying test. To determine appropriate reference intervals for use in male fertility assessment, a reference population of men with documented time to pregnancy of 〈 12 months would be most suitable. However, for epidemiological assessment of semen testing results, a reference population made up ofunselected healthy men would be preferred. Although reference and decision limits derived for individual semen analysis test results will undoubtedly be the interpretational tools of choice in the near future, in the long term, multivariate methods for the interpretation of semen analysis alone or in combination with information from the female partner seem to represent better means for assessing the likelihood of achieving a successful pregnancy in a subfertile couple.展开更多
The occurrence of lightning-induced forest fires during a time period is count data featuring over-dispersion (i.e., variance is larger than mean) and a high frequency of zero counts. In this study, we used six gene...The occurrence of lightning-induced forest fires during a time period is count data featuring over-dispersion (i.e., variance is larger than mean) and a high frequency of zero counts. In this study, we used six generalized linear models to examine the relationship between the occurrence of lightning-induced forest fires and meteorological factors in the Northern Daxing'an Mountains of China. The six models included Poisson, negative binomial (NB), zero- inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), Poisson hurdle (PH), and negative binomial hurdle (NBH) models. Goodness-of-fit was compared and tested among the six models using Akaike information criterion (AIC), sum of squared errors, likelihood ratio test, and Vuong test. The predictive performance of the models was assessed and compared using independent validation data by the data-splitting method. Based on the model AIC, the ZINB model best fitted the fire occurrence data, followed by (in order of smaller AIC) NBH, ZIP, NB, PH, and Poisson models. The ZINB model was also best for pre- dicting either zero counts or positive counts (〉1). The two Hurdle models (PH and NBH) were better than ZIP, Poisson, and NB models for predicting positive counts, but worse than these three models for predicting zero counts. Thus, the ZINB model was the first choice for modeling the occurrence of lightning-induced forest fires in this study, which implied that the excessive zero counts of lightning- induced fires came from both structure and sampling zeros.展开更多
Hypothesis testing analysis and unknown parameter estimation of both the intermediate frequency(IF) and baseband GPS signal detection are given by using the generalized likelihood ratio test(GLRT) approach,applying th...Hypothesis testing analysis and unknown parameter estimation of both the intermediate frequency(IF) and baseband GPS signal detection are given by using the generalized likelihood ratio test(GLRT) approach,applying the model of GPS signal in white Gaussian noise,It is proved that the test statistic follows central or noncentral F distribution,It is also pointed out that the test statistic is nearly identical to central or noncentral chi-squared distribution because the processing samples are large enough to be considered as infinite in GPS acquisition problem.It is also proved that the probability of false alarm,the probability of detection and the threshold are affected largely when the hypothesis testing refers to the full pseudorandom noise(PRN) code phase and Doppler frequency search space cells instead of each individual cell.The performance of the test statistic is also given with combining the noncoherent integration.展开更多
As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of a...As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to lo- cation-verification techniques in wireless networks. In particular, there has been a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on for- mal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.展开更多
When all the rules of sensor decision are known, the optimal distributeddecision fusion, which relies only on the joint conditional probability densities, can be derivedfor very general decision systems. They include ...When all the rules of sensor decision are known, the optimal distributeddecision fusion, which relies only on the joint conditional probability densities, can be derivedfor very general decision systems. They include those systems with interdependent sensorobservations and any network structure. It is also valid for m-ary Bayesian decision problems andbinary problems under the Neyman-Pearson criterion. Local decision rules of a sensor withcommunication from other sensors that are optimal for the sensor itself are also presented, whichtake the form of a generalized likelihood ratio test. Numerical examples are given to reveal someinteresting phenomena that communication between sensors can improve performance of a senordecision, but cannot guarantee to improve the global fusion performance when sensor rules were givenbefore fusing.展开更多
As traditional two-parameter constant false alarm rate (CFAR) target detec-tion algorithms in SAR images ignore target distribution, their performances are not the best or near best. As the resolution of SAR images ...As traditional two-parameter constant false alarm rate (CFAR) target detec-tion algorithms in SAR images ignore target distribution, their performances are not the best or near best. As the resolution of SAR images increases, small targets present more pixels in SAR images. So the target distribution is of much significance. Distribution-based CFAR detection algorithm is presented. We unite the pixels around the test cell, and estimate the distribution of test cell by them. Generalized Likelihood Ratio Test (GLRT) is used to deduce the detectors. The performance of the distribution-based CFAR (DBCFAR) detectors is analyzed theoretically. False alarms of DBCFAR detection are fewer than those of CFAR at the same detection rate. Finally experiments are done and the results show the performance of DBCFAR is out of conventional CFAR. False alarms of DBCFAR detection are concentrated while those of CFAR detection are dispersive.展开更多
Varying-coefficient models are a useful extension of classical linear model. They are widely applied to economics, biomedicine, epidemiology, and so on. There are extensive studies on them in the latest three decade y...Varying-coefficient models are a useful extension of classical linear model. They are widely applied to economics, biomedicine, epidemiology, and so on. There are extensive studies on them in the latest three decade years. In this paper, many of models related to varying-coefficient models are gathered up. All kinds of the estimation procedures and theory of hypothesis test on the varying-coefficients model are summarized. Prom my opinion, some aspects waiting to study are proposed.展开更多
In this papert we give an approach for detecting one or more outliers inrandomized linear model.The likelihood ratio test statistic and its distributions underthe null hypothesis and the alternative hypothesis are giv...In this papert we give an approach for detecting one or more outliers inrandomized linear model.The likelihood ratio test statistic and its distributions underthe null hypothesis and the alternative hypothesis are given. Furthermore,the robustnessof the test statistic in a certain sense is proved. Finally,the optimality properties of thetest are derived.展开更多
The problem of detecting signal with multiple input mul-tiple output (MIMO) radar in correlated Gaussian clutter dominated scenario with unknown covariance matrix is dealt with. The gen-eral MIMO model, with widely ...The problem of detecting signal with multiple input mul-tiple output (MIMO) radar in correlated Gaussian clutter dominated scenario with unknown covariance matrix is dealt with. The gen-eral MIMO model, with widely separated sub-arrays and co-located antennas at each sub-array, is adopted. Firstly, the generalized likelihood ratio test (GLRT) with known covariance matrix is ob-tained, and then the Rao and Wald detectors are devised, which have proved that the Rao and Wald test coincide with GLRT detec-tor. To make the detectors fully adaptive, the secondary data with signal-free will be collected to estimate the covariance. The per-formance of the proposed detector is analyzed, however, it is just ancillary. A thorough performance assessment by several numer-ical examples is also given, which has considered the sense with co-located antennas configure of transmitters and receivers array. The results show that the performance the proposed adaptive de-tector is better than LJ-GLRT, and the loss can be acceptable in comparison to their non-adaptive counterparts.展开更多
文摘Count data is almost always over-dispersed where the variance exceeds the mean. Several count data models have been proposed by researchers but the problem of over-dispersion still remains unresolved, more so in the context of change point analysis. This study develops a likelihood-based algorithm that detects and estimates multiple change points in a set of count data assumed to follow the Negative Binomial distribution. Discrete change point procedures discussed in literature work well for equi-dispersed data. The new algorithm produces reliable estimates of change points in cases of both equi-dispersed and over-dispersed count data;hence its advantage over other count data change point techniques. The Negative Binomial Multiple Change Point Algorithm was tested using simulated data for different sample sizes and varying positions of change. Changes in the distribution parameters were detected and estimated by conducting a likelihood ratio test on several partitions of data obtained through step-wise recursive binary segmentation. Critical values for the likelihood ratio test were developed and used to check for significance of the maximum likelihood estimates of the change points. The change point algorithm was found to work best for large datasets, though it also works well for small and medium-sized datasets with little to no error in the location of change points. The algorithm correctly detects changes when present and fails to detect changes when change is absent in actual sense. Power analysis of the likelihood ratio test for change was performed through Monte-Carlo simulation in the single change point setting. Sensitivity analysis of the test power showed that likelihood ratio test is the most powerful when the simulated change points are located mid-way through the sample data as opposed to when changes were located in the periphery. Further, the test is more powerful when the change was located three-quarter-way through the sample data compared to when the change point is closer (quarter-way) to the first observation.
文摘In this paper, asymptotic expansions of the distribution of the likelihood ratio statistic for testing sphericity in a crowth curve model have been derived in the null and nonnull cases when the alternatives are dose to the null hypothesis. These expansions are given in series form of beta distributions.
基金supported by the KERI Primary Research Program through the Korea Research Council for Industrial Science & Technology funded by the Ministry of Science,ICT and Future Planning (No.15-12-N0101-46)
文摘A novel technique is proposed to improve the performance of voice activity detection(VAD) by using deep belief networks(DBN) with a likelihood ratio(LR). The likelihood ratio is derived from the speech and noise spectral components that are assumed to follow the Gaussian probability density function(PDF). The proposed algorithm employs DBN learning in order to classify voice activity by using the input signal to calculate the likelihood ratio. Experiments show that the proposed algorithm yields improved results in various noise environments, compared to the conventional VAD algorithms. Furthermore, the DBN based algorithm decreases the detection probability of error with [0.7, 2.6] compared to the support vector machine based algorithm.
文摘While the conventional forensic scientists routinely validate and express the results of their investigations quantitatively using statistical measures from probability theory,digital forensics examiners rarely if ever do so.In this paper,we review some of the quantitative tools and techniques which are available for use in digital forensic investigations,including Bayesian networks,complexity theory,information theory and probability theory,and indicate how they may be used to obtain likelihood ratios or odds ratios for the relative plausibility of alternative explanations for the creation of the recovered digital evidence.The potential benefits of such quantitative measures for modern digital forensics are also outlined.
基金This work is supported by NNSF of China (10571093)
文摘Suppose that several different imperfect instruments and one perfect instrument are independently used to measure some characteristics of a population. Thus, measurements of two or more sets of samples with varying accuracies are obtained. Statistical inference should be based on the pooled samples. In this article, the authors also assumes that all the imperfect instruments are unbiased. They consider the problem of combining this information to make statistical tests for parameters more relevant. They define the empirical likelihood ratio functions and obtain their asymptotic distributions in the presence of measurement error.
基金Supported by the National Natural Science Foundation of China(10661003)the SRF for ROCS,SEM([2004]527)the NSF of Guangxi(0728092)
文摘This paper investigates the modified likelihood ratio test(LRT) for homogeneity in normal mixtures of two samples with mixing proportions unknown. It is proved that the limit distribution of the modified likelihood ratio test is X^2(1).
文摘The environment of the wireless communication system in the coal mine has unique characteristics: great noise, strong multiple path interference, and the wireless communication of orthogonal frequency division multiplexing (OFDM) in underground coal mine is sensitive to the frequency selection of multiple path fading channel, whose decoding is separated from the traditional channel estimation algorithm. In order to increase its accuracy and reliability, a new iterating channel estimation algorithm combining the logarithm likelihood ratio (LLR) decode iterate based on the maximum likelihood estimation (ML) is proposed in this paper, which estimates iteration channel in combination with LLR decode. Without estimating the channel noise power, it exchanges the information between the ML channel estimation and the LLR decode using the feedback information of LLR decode. The decoding speed is very quick, and the satisfied result will be obtained by iterating in some time. The simulation results of the shortwave broadband channel in the coal mine show that the error rate of the system is basically convergent after the iteration in two times.
文摘A distributed turbo codes( DTC) scheme with log likelihood ratio( LLR)-based threshold at the relay for a two-hop relay networks is proposed. Different from traditional DTC schemes,the retransmission scheme at the relay,where imperfect decoding occurs,is considered in the proposed scheme. By employing a LLR-based threshold at the relay in the proposed scheme,the reliability of decoder-LLRs can be measured. As a result,only reliable symbols will be forwarded to the destination and a maximum ratio combiner( MRC) is used to combine signals received from both the source and the relay. In order to obtain the optimal threshold at the relay,an equivalent model of decoderLLRs is investigated,so as to derive the expression of the bit error probability( BEP) of the proposed scheme under binary phase shift keying( BPSK) modulation. Simulation results demonstrate that the proposed scheme can effectively mitigate error propagation at the relay and also outperforms other existing methods.
基金supported by the National Natural Science Foundation of China(62101014)the National Key Laboratory of Science and Technology on Space Microwave(6142411203307).
文摘Global Navigation Satellite System(GNSS)-based passive radar(GBPR)has been widely used in remote sensing applications.However,for moving target detection(MTD),the quadratic phase error(QPE)introduced by the non-cooperative target motion is usually difficult to be compensated,as the low power level of the GBPR echo signal renders the estimation of the Doppler rate less effective.Consequently,the moving target in GBPR image is usually defocused,which aggravates the difficulty of target detection even further.In this paper,a spawning particle filter(SPF)is proposed for defocused MTD.Firstly,the measurement model and the likelihood ratio function(LRF)of the defocused point-like target image are deduced.Then,a spawning particle set is generated for subsequent target detection,with reference to traditional particles in particle filter(PF)as their parent.After that,based on the PF estimator,the SPF algorithm and its sequential Monte Carlo(SMC)implementation are proposed with a novel amplitude estimation method to decrease the target state dimension.Finally,the effectiveness of the proposed SPF is demonstrated by numerical simulations and pre-liminary experimental results,showing that the target range and Doppler can be estimated accurately.
文摘Wireless Communication is a system for communicating information from one point to other,without utilizing any connections like wire,cable,or other physical medium.Cognitive Radio(CR)based systems and networks are a revolutionary new perception in wireless communications.Spectrum sensing is a vital task of CR to avert destructive intrusion with licensed primary or main users and discover the accessible spectrum for the efficient utilization of the spectrum.Centralized Cooperative Spectrum Sensing(CSS)is a kind of spectrum sensing.Most of the test metrics designed till now for sensing the spectrum is produced by using the Sample Covariance Matrix(SCM)of the received signal.Some of the methods that use the SCM for the process of detection are Pietra-Ricci Index Detector(PRIDe),Hadamard Ratio(HR)detector,Gini Index Detector(GID),etc.This paper presents the simulation and comparative perfor-mance analysis of PRIDe with various other detectors like GID,HR,Arithmetic to Geometric Mean(AGM),Volume-based Detector number 1(VD1),Maximum-to-Minimum Eigenvalue Detection(MMED),and Generalized Likelihood Ratio Test(GLRT)using the MATLAB software.The PRIDe provides better performance in the presence of variations in the power of the signal and the noise power with less computational complexity.
文摘This article provides a brief overview of various approaches that may be utilized for the analysis of human semen test results. Reference intervals are the most widely used tool for the interpretation of clinical laboratory results. Reference interval development has classically relied on concepts elaborated by the International Federation of Clinical Chemistry Expert Panel on Reference Values during the 1980s. These guidelines involve obtaining and classifying samples from a healthy population of at least 120 individuals and then identifying the outermost 5% of observations to use in defining limits for two-sided or one-sided reference intervals. More recently, decision limits based on epidemiological outcome analysis have also been introduced to aid in test interpretation. The reference population must be carefully defined on the basis of the intended clinical use of the underlying test. To determine appropriate reference intervals for use in male fertility assessment, a reference population of men with documented time to pregnancy of 〈 12 months would be most suitable. However, for epidemiological assessment of semen testing results, a reference population made up ofunselected healthy men would be preferred. Although reference and decision limits derived for individual semen analysis test results will undoubtedly be the interpretational tools of choice in the near future, in the long term, multivariate methods for the interpretation of semen analysis alone or in combination with information from the female partner seem to represent better means for assessing the likelihood of achieving a successful pregnancy in a subfertile couple.
基金funded by Asia–Pacific Forests Net(APFNET/2010/FPF/001)National Natural Science Foundation of China(Grant No.31400552)
文摘The occurrence of lightning-induced forest fires during a time period is count data featuring over-dispersion (i.e., variance is larger than mean) and a high frequency of zero counts. In this study, we used six generalized linear models to examine the relationship between the occurrence of lightning-induced forest fires and meteorological factors in the Northern Daxing'an Mountains of China. The six models included Poisson, negative binomial (NB), zero- inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), Poisson hurdle (PH), and negative binomial hurdle (NBH) models. Goodness-of-fit was compared and tested among the six models using Akaike information criterion (AIC), sum of squared errors, likelihood ratio test, and Vuong test. The predictive performance of the models was assessed and compared using independent validation data by the data-splitting method. Based on the model AIC, the ZINB model best fitted the fire occurrence data, followed by (in order of smaller AIC) NBH, ZIP, NB, PH, and Poisson models. The ZINB model was also best for pre- dicting either zero counts or positive counts (〉1). The two Hurdle models (PH and NBH) were better than ZIP, Poisson, and NB models for predicting positive counts, but worse than these three models for predicting zero counts. Thus, the ZINB model was the first choice for modeling the occurrence of lightning-induced forest fires in this study, which implied that the excessive zero counts of lightning- induced fires came from both structure and sampling zeros.
文摘Hypothesis testing analysis and unknown parameter estimation of both the intermediate frequency(IF) and baseband GPS signal detection are given by using the generalized likelihood ratio test(GLRT) approach,applying the model of GPS signal in white Gaussian noise,It is proved that the test statistic follows central or noncentral F distribution,It is also pointed out that the test statistic is nearly identical to central or noncentral chi-squared distribution because the processing samples are large enough to be considered as infinite in GPS acquisition problem.It is also proved that the probability of false alarm,the probability of detection and the threshold are affected largely when the hypothesis testing refers to the full pseudorandom noise(PRN) code phase and Doppler frequency search space cells instead of each individual cell.The performance of the test statistic is also given with combining the noncoherent integration.
基金supported by the University of New South Wales and the Australian Research Council under grant No.DP120102607
文摘As location-based techniques and applications have become ubiquitous in emerging wireless networks, the verification of location information has become more important. In recent years, there has been an explosion of activity related to lo- cation-verification techniques in wireless networks. In particular, there has been a specific focus on intelligent transport systems because of the mission-critical nature of vehicle location verification. In this paper, we review recent research on wireless location verification related to vehicular networks. We focus on location verification systems that rely on for- mal mathematical classification frameworks and show how many systems are either partially or fully encompassed by such frameworks.
文摘When all the rules of sensor decision are known, the optimal distributeddecision fusion, which relies only on the joint conditional probability densities, can be derivedfor very general decision systems. They include those systems with interdependent sensorobservations and any network structure. It is also valid for m-ary Bayesian decision problems andbinary problems under the Neyman-Pearson criterion. Local decision rules of a sensor withcommunication from other sensors that are optimal for the sensor itself are also presented, whichtake the form of a generalized likelihood ratio test. Numerical examples are given to reveal someinteresting phenomena that communication between sensors can improve performance of a senordecision, but cannot guarantee to improve the global fusion performance when sensor rules were givenbefore fusing.
文摘As traditional two-parameter constant false alarm rate (CFAR) target detec-tion algorithms in SAR images ignore target distribution, their performances are not the best or near best. As the resolution of SAR images increases, small targets present more pixels in SAR images. So the target distribution is of much significance. Distribution-based CFAR detection algorithm is presented. We unite the pixels around the test cell, and estimate the distribution of test cell by them. Generalized Likelihood Ratio Test (GLRT) is used to deduce the detectors. The performance of the distribution-based CFAR (DBCFAR) detectors is analyzed theoretically. False alarms of DBCFAR detection are fewer than those of CFAR at the same detection rate. Finally experiments are done and the results show the performance of DBCFAR is out of conventional CFAR. False alarms of DBCFAR detection are concentrated while those of CFAR detection are dispersive.
基金Foundation item: Supported by the National Natural Science Foundation of China(10501053) Acknowledgement I would like to thank Henan Society of Applied Statistics for which give me a chance to declare my opinion about the varying-coefficient model.
文摘Varying-coefficient models are a useful extension of classical linear model. They are widely applied to economics, biomedicine, epidemiology, and so on. There are extensive studies on them in the latest three decade years. In this paper, many of models related to varying-coefficient models are gathered up. All kinds of the estimation procedures and theory of hypothesis test on the varying-coefficients model are summarized. Prom my opinion, some aspects waiting to study are proposed.
文摘In this papert we give an approach for detecting one or more outliers inrandomized linear model.The likelihood ratio test statistic and its distributions underthe null hypothesis and the alternative hypothesis are given. Furthermore,the robustnessof the test statistic in a certain sense is proved. Finally,the optimality properties of thetest are derived.
基金supported by the Fundamental Research Funds for the Central Universities (103.1.2-E022050205)
文摘The problem of detecting signal with multiple input mul-tiple output (MIMO) radar in correlated Gaussian clutter dominated scenario with unknown covariance matrix is dealt with. The gen-eral MIMO model, with widely separated sub-arrays and co-located antennas at each sub-array, is adopted. Firstly, the generalized likelihood ratio test (GLRT) with known covariance matrix is ob-tained, and then the Rao and Wald detectors are devised, which have proved that the Rao and Wald test coincide with GLRT detec-tor. To make the detectors fully adaptive, the secondary data with signal-free will be collected to estimate the covariance. The per-formance of the proposed detector is analyzed, however, it is just ancillary. A thorough performance assessment by several numer-ical examples is also given, which has considered the sense with co-located antennas configure of transmitters and receivers array. The results show that the performance the proposed adaptive de-tector is better than LJ-GLRT, and the loss can be acceptable in comparison to their non-adaptive counterparts.