As shallow resources are increasingly depleted, mechanics theory and testing technology for deep in-situ rock have become an urgent need. Traditional coring technologies obtain rock samples without retaining the in-situ environmental conditions, leading to distortion of the measured parameters. Herein, a coring and testing system that retains in-situ geological conditions is presented: the coring system obtains in-situ rock samples, and the transfer and testing system stores and analyzes the rocks under a reconstructed environment. The ICP-Coring system mainly consists of a pressure controller, an active insulated core reactor, an insulation layer, and a sealing film. An ultimate bearing strength of 100 MPa for pressure preservation and a temperature control accuracy of 0.97% for temperature retention are realized. The CH₄ and CO permeabilities of the optimized sealing film are as low as 3.85 and 0.33 ppm/min, respectively. The average tensile elongation of the film is 152.4%, and its light transmittance is reduced to 0%. Additionally, the transfer and storage system reconstructs the in-situ environment with a pressure accuracy of up to 1% and a steady-state temperature accuracy of ±0.2. The error recorded by the non-contact sensor ring, made of a low-density polymer, differs from that of the contact test by less than 6%. The system can provide technical support for deep in-situ rock mechanics research, improving deep resource acquisition capabilities and further clarifying deep-earth processes.
According to the physical and chemical characteristics of superfine powder extinguishing agents, three test methods are selected to measure flowability. By studying and comparing the various test methods, apparatus, and conditions, the optimum method and conditions for testing the flow properties of superfine powder extinguishing agents are confirmed.
The main purpose of many randomized trials is to make an inference about the average causal effect of a treatment. Therefore, for a binary outcome, the null hypothesis of the hypothesis test should be that the causal risks are equal in the two groups. This null hypothesis is referred to as the weak causal null hypothesis. Nevertheless, at present, the hypothesis tests applied in actual randomized trials are not tests of this null hypothesis; Fisher's exact test is a test of the sharp causal null hypothesis that the causal effect of treatment is the same for all subjects. In general, rejection of the sharp causal null hypothesis does not mean that the weak causal null hypothesis is rejected. Recently, Chiba developed new exact tests for the weak causal null hypothesis: a conditional exact test, which requires that a marginal total be fixed, and an unconditional exact test, which does not require a fixed marginal total and instead depends on the ratio of random assignment. To apply these exact tests in actual randomized trials, a sample size calculation must be performed during study design. In this paper, we present a sample size calculation procedure for these exact tests. Given the sample size, the procedure derives the exact test power, because it examines all the data patterns that can be observed under the alternative hypothesis, without large-sample theory or any additional assumptions.
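The enumeration principle behind such an exact power computation can be sketched in a few lines. The example below uses the ordinary Fisher's exact test rather than Chiba's exact tests for the weak causal null (whose rejection regions are not reproduced here), and the group sizes, risks, and significance level are hypothetical: every possible 2×2 outcome table is enumerated, and the binomial probabilities of the tables that reject are summed.

```python
# A minimal sketch of exact power by full enumeration, assuming the ordinary
# Fisher's exact test (NOT Chiba's tests for the weak causal null) and
# hypothetical group sizes and response risks.
from math import comb

from scipy.stats import fisher_exact


def exact_power(n1, n2, p1, p2, alpha=0.05):
    """Sum the probabilities of all 2x2 tables that reject at level alpha."""
    power = 0.0
    for x1 in range(n1 + 1):          # responders in the treatment arm
        for x2 in range(n2 + 1):      # responders in the control arm
            _, pval = fisher_exact([[x1, n1 - x1], [x2, n2 - x2]])
            if pval < alpha:
                # probability of observing this table under risks (p1, p2)
                prob = (comb(n1, x1) * p1**x1 * (1 - p1)**(n1 - x1)
                        * comb(n2, x2) * p2**x2 * (1 - p2)**(n2 - x2))
                power += prob
    return power


print(exact_power(n1=30, n2=30, p1=0.6, p2=0.2))  # exact power for this toy design
```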
Nonparametric time-of-arrival (TOA) estimators for impulse radio ultra-wideband (IR-UWB) signals are proposed. Nonparametric detection is useful in situations where detailed information about the statistics of the noise is unavailable or inaccurate. The TOA estimators are obtained from conditional statistical tests that assume only a symmetric noise probability density function. The nonparametric estimators are attractive choices for low-resolution IR-UWB digital receivers, which can be implemented with fast comparators or high-sampling-rate, low-resolution analog-to-digital converters (ADCs) in place of high-sampling-rate, high-resolution ADCs that may not be available in practice. Simulation results demonstrate that the nonparametric TOA estimators provide more effective and robust performance than typical energy detection (ED) based estimators.
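As an assumed illustration of the idea (not the paper's specific conditional tests), the sketch below implements a simple sign-test detector: under zero-median symmetric noise the 1-bit samples are equally likely to be positive or negative, so the first window whose count of positive signs is binomially improbable is declared the arrival. The window length, significance level, and toy signal are placeholders.

```python
# An assumed sign-test sketch of nonparametric TOA detection: under zero-median
# symmetric noise each 1-bit sample is positive with probability 1/2, so the
# windowed count of positive signs is Binomial(window, 0.5); the first window
# whose count is improbably high or low is declared the arrival.
import numpy as np
from scipy.stats import binom


def sign_test_toa(samples, window=32, alpha=1e-5):
    signs = (samples > 0).astype(int)                 # 1-bit quantization
    k_hi = binom.ppf(1 - alpha / 2, window, 0.5)      # two-sided thresholds on
    k_lo = binom.ppf(alpha / 2, window, 0.5)          # the count of positives
    for start in range(len(signs) - window + 1):
        k = signs[start:start + window].sum()
        if k > k_hi or k < k_lo:
            return start                              # estimated arrival index
    return None


rng = np.random.default_rng(0)
samples = rng.standard_t(df=3, size=2000)             # heavy-tailed symmetric noise
samples[1200:1232] += 4.0                             # toy "pulse" starting at index 1200
print(sign_test_toa(samples))                         # typically prints an index near 1200
```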
This research proposes a new offshore wind energy generation system that uses a tension leg platform (TLP) and describes experiments performed on a TLP-type wind turbine in both waves and wind. The following conclusions can be drawn from the results. 1) In coexisting wave-wind fields, the wind effect stabilizes the pitch motion. 2) The wind effect decreases vibration of the mooring lines when waves and wind coexist; in particular, the springing (2nd- or 3rd-order force) also decreases in this field. 3) The reduction in the rate of electrical power generation due to the heel angle is estimated to be up to about 6%. In addition, the annual amount of electricity generated and the utilization factor were estimated based on the experimental results.
A Bayesian network is a popular approach to uncertain knowledge representation and reasoning. Structure learning is the first step in learning a Bayesian network, and score-based methods are among the most popular ways of learning the structure. In most cases, the score of a Bayesian network is defined as the sum of a log-likelihood score and a complexity score weighted by a penalty function. If the penalty function is set unreasonably, it may hurt the performance of the structure search. Thus, Bayesian network structure learning is essentially a bi-objective optimization problem. However, existing bi-objective structure learning algorithms can only be applied to small-scale networks. To this end, this paper proposes a bi-objective evolutionary Bayesian network structure learning algorithm via skeleton constraint (BBS) for medium-scale networks. To boost search performance, BBS introduces the random order prior (ROP) initial operator. ROP generates a skeleton to constrain the search space, which is the key to scaling up the structure learning problem. Acyclic structures are then guaranteed by adding the orders of variables in the initial skeleton. After that, BBS designs Pareto-rank-based crossover and skeleton-guided mutation operators. These operators act on the skeleton obtained by ROP to make the search more targeted. Finally, BBS provides a strategy for choosing the final solution. The experimental results show that BBS always finds structures closer to the ground truth than single-objective structure learning methods. Furthermore, compared with existing bi-objective structure learning methods, BBS is scalable and can be applied to medium-scale Bayesian network datasets. On the educational problem of discovering the factors influencing students' academic performance, BBS provides higher-quality solutions and offers flexibility in solution selection compared with widely used Bayesian network structure learning methods.
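The bi-objective view can be made concrete with a small sketch of Pareto ranking over candidate structures scored by (negative log-likelihood, complexity). This is only the generic dominance ranking that Pareto-rank-based operators build on, not BBS itself, and the candidate scores below are hypothetical.

```python
# A sketch of Pareto ranking over candidate structures scored by two objectives
# (negative log-likelihood, number of edges); the candidates are hypothetical.

def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def pareto_ranks(objectives):
    """Rank 0 is the non-dominated front; later fronts are peeled off in turn."""
    remaining = set(range(len(objectives)))
    ranks, rank = {}, 0
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks


# (negative log-likelihood, edge count) for four hypothetical candidate DAGs
candidates = [(120.5, 9), (118.2, 14), (121.0, 7), (130.4, 10)]
print(pareto_ranks(candidates))   # candidates 0-2 form the first front; 3 is dominated
```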
Linear mixed models are popularly used to fit continuous longitudinal data, and the random effects are commonly assumed to have a normal distribution. However, this assumption needs to be tested so that further analysis can proceed properly. In this paper, we consider the Baringhaus-Henze-Epps-Pulley (BHEP) tests, which are based on an empirical characteristic function. Differing from the standard setting, we consider normality checking for the random effects, which are unobservable, so the test must be based on their predictors. The test is consistent against global alternatives, and is sensitive to local alternatives converging to the null at a rate arbitrarily close to 1/√n, where n is the sample size. Furthermore, to overcome the problem that the limiting null distribution of the test is not tractable, we suggest a new method: use a conditional Monte Carlo test (CMCT) to approximate the null distribution and then simulate p-values. The test is compared with existing methods, its power is examined, and several examples are presented to illustrate the usefulness of our test in the analysis of longitudinal data.
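For reference, a minimal sketch of the BHEP statistic computed from its standard closed form is given below, applied to standardized predictors of the random effects. The smoothing parameter and simulated data are placeholders, and the conditional Monte Carlo calibration described above is not reproduced.

```python
# A sketch of the BHEP statistic in its standard closed form, applied to
# standardized predictors of the random effects; the smoothing parameter and
# simulated data are placeholders.
import numpy as np


def bhep_statistic(Y, beta=1.0):
    """Y: (n, d) array of standardized random-effect predictors."""
    n, d = Y.shape
    sq_pair = np.sum((Y[:, None, :] - Y[None, :, :]) ** 2, axis=-1)   # ||Yj - Yk||^2
    sq_norm = np.sum(Y ** 2, axis=1)                                  # ||Yj||^2
    term1 = np.exp(-beta**2 * sq_pair / 2).sum() / n
    term2 = 2 * (1 + beta**2) ** (-d / 2) * np.exp(
        -beta**2 * sq_norm / (2 * (1 + beta**2))).sum()
    term3 = n * (1 + 2 * beta**2) ** (-d / 2)
    return term1 - term2 + term3                      # large values speak against normality


rng = np.random.default_rng(1)
Y = rng.standard_normal((200, 2))   # in practice: standardized predictors (EBLUPs)
print(bhep_statistic(Y))
```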
Learning Bayesian network structure is one of the most exciting challenges in machine learning. Discovering a correct skeleton of the directed acyclic graph (DAG) is the foundation of dependency analysis algorithms for this problem. Considering the unreliability of high-order conditional independence (CI) tests, the key steps in improving the efficiency of a dependency analysis algorithm are to use as few CI tests as possible and to reduce the sizes of the conditioning sets as much as possible. Based on these considerations and inspired by the PC algorithm, we present an algorithm named fast and efficient PC (FEPC) for learning the adjacent neighbourhood of every variable. FEPC carries out the CI tests in three kinds of orders, which reduces the number of high-order CI tests significantly. Compared with current algorithm proposals, the experimental results show that FEPC has better accuracy with fewer CI tests and smaller conditioning sets. The highest reduction in the number of CI tests is 83.3% for FEPC compared with the PC algorithm.
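For orientation, the sketch below shows the generic PC-style skeleton phase that FEPC refines: edges are deleted as soon as some conditioning set of growing size makes the endpoints conditionally independent. The CI test here is a toy oracle for a three-variable chain; FEPC's specific test orderings are not reproduced.

```python
# A sketch of the generic PC skeleton phase: starting from a complete graph,
# an edge X - Y is removed once some conditioning set Z of growing size makes
# X and Y conditionally independent. The CI test is a toy oracle for the chain
# A - B - C; FEPC's own test orderings are not shown.
from itertools import combinations


def pc_skeleton(variables, ci_test, max_cond_size=3):
    adj = {v: set(variables) - {v} for v in variables}    # complete undirected graph
    for size in range(max_cond_size + 1):                 # low-order tests first
        for x in variables:
            for y in list(adj[x]):
                others = adj[x] - {y}
                if len(others) < size:
                    continue
                for Z in combinations(sorted(others), size):
                    if ci_test(x, y, set(Z)):             # X independent of Y given Z
                        adj[x].discard(y)
                        adj[y].discard(x)
                        break
    return adj


def toy_ci(x, y, Z):
    """Oracle for the chain A - B - C: only A and C are independent, given B."""
    return {x, y} == {"A", "C"} and "B" in Z


print(pc_skeleton(["A", "B", "C"], toy_ci))   # keeps A-B and B-C, drops A-C
```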
Inferring gene regulatory networks (GRNs) is a challenging task in bioinformatics. In this paper, an algorithm, PCHMS, is introduced to infer GRNs. This method applies the path consistency (PC) algorithm based on the conditional mutual information test (PCA-CMI). In PC-based algorithms, a separator set is determined to detect the dependency between variables, and the PCHMS algorithm attempts to select this set in a smart way. For this purpose, the edges of the resulting skeleton are directed based on the PC algorithm's direction rule and the mutual information test (MIT) score. The separator set is then selected according to the directed network by considering a suitable sequential order of genes. The effectiveness of this method is benchmarked on several networks from the DREAM challenge and the widely used SOS DNA repair network of Escherichia coli. Results show that the PCHMS algorithm improves the precision of learning GRN structures in comparison with currently popular approaches.
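A minimal sketch of the Gaussian conditional mutual information used by PCA-CMI-style tests is given below: CMI(X;Y|Z) is computed from log-determinants of covariance submatrices, and a value near zero suggests conditional independence. The toy data and the absence of a formal threshold are simplifications.

```python
# A sketch of the Gaussian conditional mutual information behind PCA-CMI-style
# tests: CMI(X; Y | Z) from log-determinants of covariance submatrices, with a
# value near zero suggesting conditional independence. Toy data, no threshold.
import numpy as np


def gaussian_cmi(data, x, y, z):
    """data: (n, p) expression matrix; x, y: column indices; z: list of indices."""
    def logdet(cols):
        if not cols:
            return 0.0
        cov = np.cov(data[:, cols], rowvar=False).reshape(len(cols), len(cols))
        return np.linalg.slogdet(cov)[1]
    return 0.5 * (logdet([x] + z) + logdet([y] + z)
                  - logdet(z) - logdet([x, y] + z))


rng = np.random.default_rng(2)
zs = rng.standard_normal(500)
xs = zs + 0.3 * rng.standard_normal(500)   # xs and ys are related only through zs
ys = zs + 0.3 * rng.standard_normal(500)
data = np.column_stack([xs, ys, zs])
print(gaussian_cmi(data, 0, 1, []))        # clearly positive: marginally dependent
print(gaussian_cmi(data, 0, 1, [2]))       # near zero: independent given zs
```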
The conditional kernel correlation is proposed to measure the relationship between two random variables given covariates for multivariate data. Relying on the framework of reproducing kernel Hilbert spaces, we give the definitions of the conditional kernel covariance and the conditional kernel correlation. We also provide their respective sample estimators and derive their asymptotic properties, which allow us to construct a conditional independence test. According to the numerical results, the proposed test is more effective than the existing one under the scenarios considered. A real data set is further analyzed to illustrate the efficacy of the proposed method.
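Since the paper's estimator is not reproduced here, the sketch below illustrates the unconditional analogue: an HSIC-style kernel covariance trace(KHLH)/n², normalized into a kernel correlation. The conditional version would additionally kernel-smooth these quantities over the covariates; bandwidths and toy data are placeholders.

```python
# A sketch of an (unconditional) RKHS kernel correlation: the HSIC-style kernel
# covariance trace(K H L H) / n^2, normalized by the same quantity for each
# variable. The paper's conditional version additionally kernel-smooths these
# quantities over the covariates; bandwidths and toy data here are placeholders.
import numpy as np


def gram(x, bandwidth=1.0):
    """Gaussian Gram matrix of a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * bandwidth**2))


def kernel_correlation(x, y):
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    K, L = gram(x), gram(y)
    cov = np.trace(K @ H @ L @ H) / n**2         # kernel covariance (HSIC form)
    vx = np.trace(K @ H @ K @ H) / n**2          # kernel variances
    vy = np.trace(L @ H @ L @ H) / n**2
    return cov / np.sqrt(vx * vy)


rng = np.random.default_rng(3)
x = rng.standard_normal(300)
print(kernel_correlation(x, x**2))                      # nonlinear dependence: large
print(kernel_correlation(x, rng.standard_normal(300)))  # independent: near zero
```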
Funding (in-situ condition-preserved coring and testing system): supported by the Program for Guangdong Introducing Innovative and Entrepreneurial Teams (No. 2019ZT08G315) and the National Natural Science Foundation of China (Nos. 51827901, U2013603, and 52004166).
Funding (nonparametric TOA estimators for IR-UWB signals): supported by the National High Technology Research and Development Program of China (863 Program) (2009AA011204).
Funding (TLP offshore wind turbine experiments): supported by The Japan Science Society (Foundation Grant No. 23-708K).
Funding (bi-objective Bayesian network structure learning, BBS): supported by the Fundamental Research Funds for the Central Universities, the Science and Technology Commission of Shanghai Municipality (No. 19511120601), the Scientific and Technological Innovation 2030 Major Projects (No. 2018AAA0100902), the CCF-AFSG Research Fund (No. CCF-AFSG RF20220205), and the "Chenguang Program" sponsored by the Shanghai Education Development Foundation and the Shanghai Municipal Education Commission (No. 21CGA32).
Funding (BHEP tests for random effects in linear mixed models): supported in part by a grant from the Research Grants Council of Hong Kong and the National Natural Science Foundation of China (Grant No. 11101157).
Funding (FEPC algorithm): supported by the National Natural Science Foundation of China (61403290, 11301408, 11401454), the Foundation for Youths of Shaanxi Province (2014JQ1020), the Foundation of Baoji City (2013R7-3), and the Foundation of Baoji University of Arts and Sciences (ZK15081).
Funding (conditional kernel correlation): partially supported by the Knowledge Innovation Program of Hubei Province (No. 2019CFB810), the NSFC (No. 12325110), and the CAS Project for Young Scientists in Basic Research (No. YSBR-034).