In system fault detection algorithms, the false alarm rate and missed detection rate generated by the residual chi-square test can affect the stability of filters. The paper proposes a fault detection algorithm based on a sequential residual chi-square test and applies it to fault detection in an integrated navigation system. The simulation results show that the algorithm can accurately detect fault information from the global positioning system (GPS), eliminate the influence of false alarms and missed detections on the filter, and enhance the fault tolerance of integrated navigation systems.
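To make the detection rule concrete, the following is a minimal sketch (not the paper's implementation) of a residual chi-square test with a sequential sliding-window extension; the filter innovation, its covariance, the window length, and the threshold level are all assumptions chosen for illustration.

```python
import numpy as np
from scipy.stats import chi2

def residual_chi_square(innovation, innovation_cov):
    """Chi-square fault-detection statistic for one filter step."""
    r = np.atleast_1d(innovation)
    S = np.atleast_2d(innovation_cov)
    return float(r @ np.linalg.solve(S, r))

def sequential_fault_test(stats, dof, window=10, alpha=0.001):
    """Sliding-window (sequential) variant: sum the most recent
    statistics and compare against a chi-square threshold with
    window * dof degrees of freedom."""
    recent = stats[-window:]
    threshold = chi2.ppf(1.0 - alpha, df=dof * len(recent))
    return sum(recent) > threshold  # True -> declare a GPS fault

# Example: simulate no-fault innovations, then a biased (faulty) stretch.
rng = np.random.default_rng(0)
dof, S = 3, np.eye(3)
stats = [residual_chi_square(rng.standard_normal(3), S) for _ in range(50)]
stats += [residual_chi_square(rng.standard_normal(3) + 4.0, S) for _ in range(10)]
print(sequential_fault_test(stats, dof))  # expected: True after the fault
```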
Schizophrenia (SZ) is one of the most common mental diseases. Its main characteristics are abnormal social behavior and an inability to correctly understand real things. In recent years, the magnetic resonance imaging (MRI) technique has been widely used to study SZ. However, it remains a great challenge to reveal the essential information contained in MRI data. In this paper, we propose a biomarker selection approach based on multiple hypothesis testing techniques to explore the differences between SZ patients and healthy controls using both functional and structural MRI data, in which biomarkers represent both abnormal brain functional connectivity and abnormal brain regions. By applying the biomarker selection approach, six abnormal brain regions and twenty-three abnormal functional connections in the brains of SZ patients are identified. It is found that, compared with healthy controls, the significantly reduced gray matter volumes are mainly distributed in the limbic lobe and the basal ganglia, while the significantly increased gray matter volumes are distributed in the frontal gyrus. Meanwhile, the significantly strengthened connections are those between the middle frontal gyrus and the superior occipital gyrus, between the superior occipital gyrus and the middle occipital gyrus, and between the middle occipital gyrus and the fusiform gyrus; the remaining connections are significantly weakened.
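The abstract does not specify the multiple-testing procedure, so the sketch below uses the standard Benjamini-Hochberg false-discovery-rate correction over per-feature two-sample t-tests as a stand-in for the biomarker selection step; the data and the FDR level are invented.

```python
import numpy as np
from scipy.stats import ttest_ind

def bh_select(p_values, q=0.05):
    """Benjamini-Hochberg step-up procedure: return indices of
    hypotheses rejected at false-discovery rate q."""
    p = np.asarray(p_values)
    order = np.argsort(p)
    m = len(p)
    thresh = q * (np.arange(1, m + 1) / m)
    below = p[order] <= thresh
    if not below.any():
        return np.array([], dtype=int)
    k = np.nonzero(below)[0].max()  # largest i with p_(i) <= q*i/m
    return order[: k + 1]

# Example: 100 features, controls vs. patients; features 0-4 truly differ.
rng = np.random.default_rng(1)
controls = rng.standard_normal((40, 100))
patients = rng.standard_normal((30, 100))
patients[:, :5] += 1.0
pvals = [ttest_ind(controls[:, j], patients[:, j]).pvalue for j in range(100)]
print(bh_select(pvals))  # indices flagged as candidate biomarkers
```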
Eliminating false intersections (deghosting) is a difficult problem in passive cross-location systems. Using a decentralized decision fusion topology, a new deghosting algorithm derived from hypothesis testing theory is developed. It uses the difference between ghosts and true targets in the statistical error between their projection angles onto a deghosting sensor and the angles measured by that sensor, and constructs a corresponding test statistic. Under the Gaussian assumption, ghosts and true targets are discriminated using the chi-square distribution. Simulation results show the feasibility of the algorithm.
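A minimal sketch of the decision rule described: under the Gaussian assumption, the noise-normalized squared difference between the bearing predicted from a candidate intersection and the bearing measured by the deghosting sensor is chi-square with one degree of freedom, so exceeding a chi-square threshold flags a ghost. The geometry, noise level, and threshold below are assumptions for illustration.

```python
import numpy as np
from scipy.stats import chi2

def is_ghost(candidate_xy, sensor_xy, measured_bearing, sigma, alpha=0.01):
    """Decide ghost vs. true target from one deghosting sensor.

    Under H0 (true target) the normalized squared bearing error is
    chi-square with 1 degree of freedom; exceeding the threshold
    flags the intersection as a ghost."""
    dx, dy = np.subtract(candidate_xy, sensor_xy)
    predicted = np.arctan2(dy, dx)
    err = np.angle(np.exp(1j * (measured_bearing - predicted)))  # wrap to [-pi, pi]
    stat = (err / sigma) ** 2
    return stat > chi2.ppf(1.0 - alpha, df=1)

# True target at (10, 5) seen from a sensor at the origin, 1 deg bearing noise.
sigma = np.deg2rad(1.0)
true_bearing = np.arctan2(5, 10) + 0.5 * sigma
print(is_ghost((10, 5), (0, 0), true_bearing, sigma))   # False: kept
print(is_ghost((12, -3), (0, 0), true_bearing, sigma))  # True: rejected as ghost
```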
Introduction: This work investigates whether to conduct a medical study from the point of view of the expected net benefit, taking into account statistical power, time and cost. The hypothesis of this paper is that the expected net benefit is equal to zero. Methods: Information was obtained from a pilot medical study that investigated the effects of two diagnostic modalities, magnetic resonance imaging (MRI) and computerized axial tomography (CT), on patients with acute stroke. A statistical procedure was applied for planning and contrasting the equivalence, non-inferiority and inequality hypotheses of the study for effectiveness, health benefits and costs. A statistical simulation model was applied to test the hypothesis that conducting the study would or would not result in overall net benefits. If the null hypothesis is not rejected, no benefits occur and the two diagnostic-and-treatment arms are of equal net benefit. If the null hypothesis is rejected, net benefits occur if patients are diagnosed with the more favourable diagnostic modality. Results: For any hypothesis design, the expected net benefits are in the range of 366 to 1796 per patient at 80% statistical power if the study is conducted. The power depends on the monetary value available for a unit of health improvement. Conclusion: The statistical simulations suggest that diagnosing patients with CT provides more favourable health outcomes, showing statistically significant expected net benefits in comparison with MRI.
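A hedged Monte Carlo sketch of the kind of simulation described: net benefit per patient is taken as the monetary value of the health effect minus cost, and the null hypothesis of zero expected net benefit difference between the two arms is tested on simulated samples. All effectiveness and cost figures below are invented, not the study's data.

```python
import numpy as np
from scipy.stats import ttest_ind

# Illustrative Monte Carlo sketch of a net-benefit comparison; every
# effectiveness/cost number below is an assumption, not from the study.
rng = np.random.default_rng(2)
n, wtp = 200, 30000.0  # patients per arm; monetary value per unit health gain

effect_ct = rng.normal(0.40, 0.10, n)   # assumed health gain, CT arm
effect_mri = rng.normal(0.35, 0.10, n)  # assumed health gain, MRI arm
cost_ct = rng.normal(400.0, 50.0, n)    # assumed diagnostic cost, CT
cost_mri = rng.normal(900.0, 80.0, n)   # assumed diagnostic cost, MRI

nb_ct = wtp * effect_ct - cost_ct       # net benefit = value of effect - cost
nb_mri = wtp * effect_mri - cost_mri

t, p = ttest_ind(nb_ct, nb_mri)
print(f"mean incremental net benefit: {nb_ct.mean() - nb_mri.mean():.0f}")
print(f"p-value for H0 'expected net benefit difference = 0': {p:.4f}")
```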
When a statistical test of hypothesis for a population mean is performed, we are faced with the possibility of committing a Type II error by not rejecting the null hypothesis when in fact the population mean has changed. We consider this issue and quantify matters in a manner that differs a bit from what is commonly done. In particular, we define the probability distribution function for Type II errors. We then explore some interesting properties that we have not seen mentioned elsewhere for this probability distribution function. Finally, we discuss several Maple procedures that can be used to perform various calculations using the distribution.
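For a concrete instance of the quantity being studied, the sketch below computes the classical Type II error probability β(δ) for a two-sided z-test of a population mean as a function of the true shift δ; this is a Python stand-in for the Maple procedures mentioned, under assumed σ, n, and α.

```python
import numpy as np
from scipy.stats import norm

def type_ii_probability(delta, sigma, n, alpha=0.05):
    """beta(delta): probability of not rejecting H0: mu = mu0 in a
    two-sided z-test when the true mean is mu0 + delta."""
    z = norm.ppf(1.0 - alpha / 2.0)
    shift = delta * np.sqrt(n) / sigma
    return norm.cdf(z - shift) - norm.cdf(-z - shift)

# Example: beta shrinks (power grows) as the true shift grows;
# at delta = 0 it equals 1 - alpha by construction.
for d in (0.0, 0.2, 0.5, 1.0):
    print(d, round(type_ii_probability(d, sigma=1.0, n=25), 4))
```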
One of the important fields in statistics is testing hypotheses about the correlation coefficient. The extension of the idea of testing correlation to fuzzy hypotheses is of great interest. In this study, we examine the use of a fuzzy hypothesis testing approach for the Sequential Probability Ratio Test (SPRT) of the correlation coefficient. The use of fuzzy hypothesis testing for the correlation coefficient with the SPRT is illustrated by an example.
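The sketch below illustrates a crisp (non-fuzzy) Wald SPRT for the correlation coefficient of a standard bivariate normal stream, as a baseline for the fuzzy version discussed; the hypothesized values ρ0 and ρ1 and the error rates are assumptions.

```python
import numpy as np

def bvn_loglik(x, y, rho):
    """Log-density of a standard bivariate normal with correlation rho."""
    q = (x * x - 2.0 * rho * x * y + y * y) / (1.0 - rho * rho)
    return -np.log(2.0 * np.pi) - 0.5 * np.log(1.0 - rho * rho) - 0.5 * q

def sprt_correlation(pairs, rho0=0.0, rho1=0.5, alpha=0.05, beta=0.05):
    """Wald SPRT for H0: rho = rho0 vs H1: rho = rho1 on a stream of (x, y)."""
    lower, upper = np.log(beta / (1 - alpha)), np.log((1 - beta) / alpha)
    llr = 0.0
    for n, (x, y) in enumerate(pairs, start=1):
        llr += bvn_loglik(x, y, rho1) - bvn_loglik(x, y, rho0)
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "continue sampling", len(pairs)

# Example: data generated with true correlation 0.5.
rng = np.random.default_rng(3)
data = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], size=500)
print(sprt_correlation(data))  # typically accepts H1 well before 500 pairs
```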
Testing the equality of percentiles (quantiles) between populations is an effective method for robust, nonparametric comparison, especially when the distributions are asymmetric or irregularly shaped. Unlike global nonparametric tests for homogeneity such as the Kolmogorov-Smirnov test, testing the equality of a set of percentiles (i.e., a percentile profile) yields an estimate of the location and extent of the differences between the populations along the entire domain. The Wald test using bootstrap estimates of the variance of the order statistics provides a unified method for hypothesis testing of functions of the population percentiles. Simulation studies are conducted to show the performance of the method under various scenarios and to give suggestions on its use. Several examples are given to illustrate useful applications to real data.
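A minimal sketch of the core construction: bootstrap the covariance of the sample percentile vector in each population, then form a Wald statistic for the difference of the two percentile profiles against a chi-square reference. The percentile set, bootstrap size, and data are assumptions.

```python
import numpy as np
from scipy.stats import chi2

def percentile_profile_wald(x, y, probs=(25, 50, 75), n_boot=2000, seed=0):
    """Wald test of H0: the given percentiles are equal in both
    populations, with bootstrap estimates of the covariance of the
    sample percentile vectors (samples assumed independent)."""
    rng = np.random.default_rng(seed)

    def boot_percentiles(a):
        idx = rng.integers(0, len(a), size=(n_boot, len(a)))
        return np.percentile(a[idx], probs, axis=1).T  # (n_boot, k)

    d = np.percentile(x, probs) - np.percentile(y, probs)
    cov = np.cov(boot_percentiles(x), rowvar=False) \
        + np.cov(boot_percentiles(y), rowvar=False)
    stat = float(d @ np.linalg.solve(cov, d))
    return stat, chi2.sf(stat, df=len(probs))

rng = np.random.default_rng(4)
x = rng.exponential(1.0, 300)         # skewed sample
y = rng.exponential(1.3, 300)         # same shape, different scale
print(percentile_profile_wald(x, y))  # small p-value expected
```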
A novel statistical approach to evaluate the manufacturing quality of press coated tablets in terms of the centering of their core is presented. We also provide a formula to determine the necessary sample size. This approach is applied to real data.
Hypothesis testing analysis and unknown parameter estimation for both intermediate frequency (IF) and baseband GPS signal detection are given using the generalized likelihood ratio test (GLRT) approach, applying the model of a GPS signal in white Gaussian noise. It is proved that the test statistic follows a central or noncentral F distribution. It is also pointed out that the test statistic is nearly identical to a central or noncentral chi-squared distribution, because the number of processing samples is large enough to be considered infinite in the GPS acquisition problem. It is further proved that the probability of false alarm, the probability of detection and the threshold change substantially when the hypothesis test refers to the full pseudorandom noise (PRN) code phase and Doppler frequency search space rather than to each individual cell. The performance of the test statistic when combined with noncoherent integration is also given.
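The cell-versus-search-space effect on the false alarm probability can be sketched numerically. Assuming a chi-square model with 2 degrees of freedom per cell (the abstract's large-sample limit), and with an invented search-space size and noncentrality, the per-cell threshold must rise sharply to hold the overall false alarm rate:

```python
from scipy.stats import chi2, ncx2

# Per-cell vs. whole-search-space detection design, chi-square model.
# Assumed numbers: 1023 code phases x 41 Doppler bins, 2 degrees of
# freedom (I/Q), and a noncentrality lam chosen only for illustration.
cells = 1023 * 41
p_fa_total = 1e-3
# To keep the overall false-alarm rate at p_fa_total across K cells,
# each cell must use p_cell = 1 - (1 - p_fa_total)**(1/K).
p_cell = 1.0 - (1.0 - p_fa_total) ** (1.0 / cells)
threshold = chi2.ppf(1.0 - p_cell, df=2)
lam = 30.0  # assumed noncentrality (signal-to-noise term)
p_detect = ncx2.sf(threshold, df=2, nc=lam)
print(f"per-cell P_FA: {p_cell:.2e}, threshold: {threshold:.1f}, "
      f"P_D at the true cell: {p_detect:.3f}")
```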
Confining stresses serve as a pivotal determinant in shaping the behavior of grouted rock bolts. Nonetheless, prior investigations have oversimplified the three-dimensional stress state, primarily assuming hydrostatic stress conditions. Under these conditions, it is assumed that the intermediate principal stress (σ_(2)) equals the minimum principal stress (σ_(3)). This assumption overlooks the potential variations in the magnitudes of in situ stresses along all three directions near an underground opening where a rock bolt is installed. In this study, a series of push tests was conducted under triaxial conditions. These tests applied non-uniform confining stresses (σ_(2) ≠ σ_(3)) to cubic specimens, aiming to unveil the previously overlooked influence of the intermediate principal stress on the strength properties of rock bolts. The results show that as the confining stresses increase from zero to higher levels, the pre-failure behavior changes from linear to nonlinear forms, with the initial stiffness increasing from 2.08 kN/mm to 32.51 kN/mm. The load-displacement curves further illuminate distinct post-failure behavior at elevated confining stresses, characterized by enhanced stiffness. Notably, the peak load capacity ranged from 27.9 kN to 46.5 kN as the confining stresses advanced from σ_(2) = σ_(3) = 0 to σ_(2) = 20 MPa and σ_(3) = 10 MPa. Additionally, the outcomes highlight an influence of confining stress on the lateral deformation of the samples. Lower levels of confinement prompt overall dilation in lateral deformation, while higher confinements maintain a state of shrinkage. Furthermore, diverse failure modes have been identified, intricately tied to the arrangement of the confining stresses. Lower confinements tend to induce a splitting mode of failure, whereas higher loads bring about a shift towards pure interfacial shear-off and shear-crushed failure mechanisms.
The beyond-dripline oxygen isotopes ^(27,28)O were recently observed at RIKEN and were found to be unbound, decaying into ^(24)O by emitting neutrons. The unbound nature of the heaviest oxygen isotope, ^(28)O, provides an excellent test for state-of-the-art nuclear models. The atomic nucleus is a self-organized quantum many-body system comprising specific numbers of protons Z and neutrons N.
The use of the statistical hypothesis testing procedure to determine type I and type II errors was linked to the measurement of sensitivity and specificity in clinical trial tests and experimental pathogen detection techniques. A theoretical analysis of establishing these types of errors was made and compared to the determination of false positives, false negatives, true positives and true negatives. Experimental laboratory detection methods used to detect Cryptosporidium spp. were used to highlight the relationship between hypothesis testing, sensitivity, specificity and predictive values. The study finds that sensitivity and specificity for the two laboratory methods used for Cryptosporidium detection were low, lowering the probability of detecting a “false null hypothesis” for the presence of Cryptosporidium in the water samples using either microscopy or PCR. Nevertheless, both procedures had high “true negative” rates, increasing the probability of failing to reject a “true null hypothesis”, with a specificity of 1.00 for both the microscopic and PCR laboratory detection methods.
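A small sketch of the quantities being linked, with invented counts: sensitivity, specificity, and the predictive values follow directly from the confusion matrix, and in the paper's analogy a false positive corresponds to a type I error and a false negative to a type II error.

```python
def diagnostic_summary(tp, fp, tn, fn):
    """Sensitivity, specificity and predictive values from counts.

    In the hypothesis-testing analogy used by the paper: a false
    positive plays the role of a type I error (rejecting a true
    null), a false negative that of a type II error."""
    return {
        "sensitivity": tp / (tp + fn),  # P(test+ | pathogen present)
        "specificity": tn / (tn + fp),  # P(test- | pathogen absent)
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Invented counts for illustration only (not the study's data).
print(diagnostic_summary(tp=12, fp=3, tn=80, fn=15))
```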
Soil liquefaction is one of the complex research topics in geotechnical engineering and engineering geology. Especially after the 1964 Niigata earthquake (Japan) induced many soil liquefaction incidents, a variety of soil liquefaction studies were conducted and reported, including liquefaction potential assessment methods utilizing the shear wave velocity (V_(s)) or SPT-N profiles (SPT: standard penetration test). This study used the V_(s) and SPT methods recommended by the National Center for Earthquake Engineering Research (NCEER) to examine which is more conservative, based on assessment results for 41 liquefiable soil layers at sites in two major cities in Taiwan. Statistical hypothesis testing was used to make the analysis more quantitative and objective. Based on three sets of hypothesis tests, the hypothesis that the SPT method is more conservative than the V_(s) method was not rejected at the 5% level of significance.
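The paired comparison (both methods applied to the same 41 layers) could be cast, for illustration, as a one-sided sign test on the layers where the two methods disagree; this is a stand-in, not the paper's three hypothesis test sets, and the counts below are invented.

```python
from scipy.stats import binomtest

# Sign-test sketch for "the SPT method is more conservative than Vs":
# among layers where the two methods disagree, count how often SPT is
# the more conservative one. Counts are invented for illustration.
spt_more_conservative = 18   # disagreeing layers favoring the hypothesis
disagreements = 25           # all disagreeing layers out of 41
res = binomtest(spt_more_conservative, disagreements, p=0.5,
                alternative="greater")
print(f"one-sided p-value: {res.pvalue:.4f}")
# A small p-value would favor SPT being the more conservative method.
```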
BACKGROUND Nonadherence is a major problem in the treatment of psychotic disorders. It has been hypothesized that nonadherent patients with schizophrenia are not a homogeneous population and that subtypes of nonadherence might exist, but this hypothesis has not been specifically tested. AIM To test the hypothesis of subtypes of nonadherence in schizophrenia and schizoaffective disorder. METHODS This prospective study included 110 consecutively admitted patients diagnosed with schizophrenia or schizoaffective disorder. Assessments were performed at baseline and at 6-month follow-up after discharge. Sociodemographic, clinical, psychopathological and treatment-related variables were evaluated. Adherence was defined as the concurrence of adherence to antipsychotic treatment and outpatient follow-up during the six-month period. Adherence to antipsychotic treatment was defined as the concurrence of objective and subjective adherence. Sixty-four patients (58%) fulfilled nonadherence criteria at the end of the follow-up period and were categorized according to their subtype of nonadherence. RESULTS Among nonadherent patients (n = 64), 32 (50%) fulfilled criteria of intentional nonadherence and 32 (50%) of unintentional nonadherence (UNA). Unintentional nonadherent patients, compared to intentional nonadherent patients, are characterized by older age, lower educational level, worse cognitive and negative symptoms, greater severity, worse knowledge of their treatment regimen, greater prevalence of supervision of the treatment, a lower number of prior hospitalizations and greater use of nonpsychiatric treatment, anticholinergics and hypnotics. Low educational level (OR = 26.1; 95%CI: 2.819-241), worse treatment knowledge at six months (OR per unit = 0.904; 95%CI: 0.853-0.957) and nonpsychiatric treatment at six months (OR = 15.8; 95%CI: 1.790-139) were independently associated with UNA. CONCLUSION Differentiated subtypes of nonadherence according to intentionality seem to exist in patients with schizophrenia and schizoaffective disorder. Our findings suggest the need for a differentiated approach, both in future research and in clinical practice.
Point-of-care testing (POCT) is the practice of diagnosing and monitoring diseases where the patient is located, as opposed to traditional treatment conducted solely in a medical laboratory or other clinical setting. POCT was less common in the recent past due to a lack of portable medical devices capable of facilitating effective medical testing. However, recent growth has occurred in this field due to advances in diagnostic technologies, device miniaturization, and progress in wearable electronics. Among these developments, electrochemical sensors have attracted interest in the POCT field due to their high sensitivity, compact size, and affordability. They are used in various applications, from disease diagnosis to health status monitoring. In this paper we explore recent advancements in electrochemical sensors, the methods of fabricating them, and the various types of sensing mechanisms that can be used. Furthermore, we delve into methods for immobilizing specific biorecognition elements, including enzymes, antibodies, and aptamers, onto electrode surfaces, and into how these sensors are used in real-world POCT settings.
Knowledge of the mechanical behavior of planetary rocks is indispensable for space exploration. The scarcity of pristine samples and the irregular shapes of planetary meteorites make it difficult to obtain representative samples for conventional macroscale rock mechanics experiments (macro-RMEs). This critical review discusses recent advances in microscale RME (micro-RME) techniques and the upscaling methods for extracting mechanical parameters. Methods of mineralogical and microstructural analysis, along with non-destructive mechanical techniques, have provided new opportunities for studying planetary rocks with unprecedented precision and capability. First, we summarize several mainstream methods for obtaining the mineralogy and microstructure of planetary rocks. Then, the nondestructive micromechanical testing methods, nanoindentation and atomic force microscopy (AFM), are reviewed in detail, illustrating their principles, advantages, influencing factors, and available testing results from the literature. Subsequently, several feasible upscaling methods that bridge micro-measurements on meteorite pieces to the strength of the intact body are introduced. Finally, the potential applications of planetary rock mechanics research to guiding the design and execution of space missions are envisioned, ranging from sample return missions and planetary defense to extraterrestrial construction. These discussions are expected to broaden the understanding of the microscale mechanical properties of planetary rocks and their significant role in deep space exploration.
This paper discusses two versions of the Whorf hypothesis, summarizing their commonalities and differences. In addition, some classical experiments testing the hypothesis are described, and the author constructs an experimental model for testing the weak version of the hypothesis under the guidance of Carroll's three kinds of thinking.
The purpose of this paper is to examine the relationship between the balance of foreign trade and the real exchange rate in an econometric framework using time series methods. The authors used annual data on the foreign trade deficit, the real exchange rate, and the gross domestic product (GDP) of Turkey from 1989 to 2014, and analyzed their long-term relationship using the ARDL bounds testing method. According to the test results, although there was a long-term relationship between the balance of foreign trade, the real exchange rate, and the GDP of Turkey and of the world, the coefficient of the real exchange rate was statistically insignificant. The GDP coefficients of Turkey and of the world were statistically significant, and it was concluded that the relationship was also economically meaningful.
Edge devices, due to their limited computational and storage resources, often require the use of compilers for program optimization. Therefore, ensuring the security and reliability of these compilers is of paramount importance in the emerging field of edge AI. One widely used testing method for this purpose is fuzz testing, which detects bugs by inputting random test cases into the target program. However, this process consumes significant time and resources. To improve the efficiency of compiler fuzz testing, it is common practice to utilize test case prioritization techniques. Some researchers use machine learning to predict the code coverage of test cases, aiming to maximize the test capability for the target compiler by increasing the overall predicted coverage of the test cases. Nevertheless, these methods can only forecast the code coverage of the compiler at a specific optimization level, potentially missing many optimization-related bugs. In this paper, we introduce C-CORE (short for Clustering by Code Representation), the first framework to prioritize test cases according to their code representations, which are derived directly from the source code. This approach avoids being limited to specific compiler states and extends to a broader range of compiler bugs. Specifically, we first train a scaled pre-trained programming language model to capture as many common features as possible from the test cases generated by a fuzzer. Using this pre-trained model, we then train two downstream models: one for predicting the likelihood of triggering a bug and another for identifying code representations associated with bugs. Subsequently, we cluster the test cases according to their code representations and select the highest-scoring test case from each cluster as a high-quality test case. This reduction in redundant test cases leads to time savings. Comprehensive evaluation results reveal that code representations are better at distinguishing test capabilities and that C-CORE significantly enhances testing efficiency. Across four datasets, C-CORE increases the average percentage of faults detected (APFD) value by 0.16 to 0.31 and reduces test time by over 50% in 46% of cases. When compared to the best results from approaches using predicted code coverage, C-CORE improves the APFD value by 1.1% to 12.3% and achieves an overall time saving of 159.1%.
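A minimal sketch of the clustering-and-selection step described, with stand-ins for the two downstream models: the embeddings and bug-likelihood scores below are random placeholders, and k-means plays the role of the clustering algorithm (the abstract does not state which algorithm the paper uses).

```python
import numpy as np
from sklearn.cluster import KMeans

def prioritize(embeddings, bug_scores, n_clusters=10, seed=0):
    """Select one representative test case per cluster of code
    representations: the one with the highest predicted bug-trigger
    score. Returns indices ordered by descending score."""
    labels = KMeans(n_clusters=n_clusters, random_state=seed,
                    n_init=10).fit_predict(embeddings)
    chosen = []
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        chosen.append(members[np.argmax(bug_scores[members])])
    return sorted(chosen, key=lambda i: -bug_scores[i])

# Stand-ins for the two downstream models' outputs: random embeddings
# and bug-likelihood scores for 200 fuzzer-generated test cases.
rng = np.random.default_rng(5)
emb = rng.standard_normal((200, 64))
scores = rng.random(200)
print(prioritize(emb, scores)[:5])  # run these test cases first
```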