A novel technique, the Moveable Reduction Bed Hydride Generator (MRBHG), was applied to the hydride generation or cold vapor generation of As, Se, Ge, and Hg in Traditional Chinese Medicinal Materials (TCM). The simultaneous determination of the multi-elements was performed with ICP-MS. A solid reduction system involving the use of potassium tetraborohydride and tartaric acid was applied to generate metal hydride or cold vapor efficiently. The factors affecting the metal cold vapor generation were studied. The main advantage of the technique is that only a 4 μL sample volume was required for the cold vapor generation.
This paper advances the viewpoints and methods of a rapid sample-product trial manufacture technique for developing new water meter products by CAD and simulation, computer virtual assembly and optimization, rapid machining, and measurement, because at present the design and sample-product trial manufacture process for new water meter products is long in development period and low in development efficiency.
Integrating machine learning and data mining is crucial for processing big data and extracting valuable insights to enhance decision-making. However, imbalanced target variables within big data present technical challenges that hinder the performance of supervised learning classifiers on key evaluation metrics, limiting their overall effectiveness. This study presents a comprehensive review of both common and recently developed Supervised Learning Classifiers (SLCs) and evaluates their performance in data-driven decision-making. The evaluation uses various metrics, with a particular focus on the Harmonic Mean Score (F-1 score), on an imbalanced real-world bank target marketing dataset. The findings indicate that grid-search random forest and random-search random forest excel in precision and area under the curve, while Extreme Gradient Boosting (XGBoost) outperforms other traditional classifiers in terms of F-1 score. Employing oversampling methods to address the imbalanced data shows significant performance improvement in XGBoost, delivering superior results across all metrics, particularly when using the SMOTE variant known as the BorderlineSMOTE2 technique. The study concludes with several key factors for effectively addressing the challenges of supervised learning with imbalanced datasets. These factors include the importance of selecting appropriate datasets for training and testing, choosing the right classifiers, employing effective techniques for processing and handling imbalanced datasets, and identifying suitable metrics for performance evaluation. They also entail the utilisation of effective exploratory data analysis in conjunction with visualisation techniques to yield insights conducive to data-driven decision-making.
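The BorderlineSMOTE2 oversampling credited above with the best XGBoost results interpolates new minority samples between existing ones. A minimal sketch of that core interpolation idea (plain SMOTE, on synthetic data; `smote_sketch` and all parameters are illustrative, not the study's implementation):

```python
import numpy as np

def smote_sketch(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    between a random minority sample and one of its k nearest
    minority neighbours (the core idea behind SMOTE)."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        # distances from sample i to every other minority sample
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        d[i] = np.inf
        nn = np.argsort(d)[:k]            # k nearest minority neighbours
        j = rng.choice(nn)
        gap = rng.random()                # interpolation factor in [0, 1)
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.vstack(out)
```

Each synthetic point is a convex combination of two real minority points, so it stays inside the minority region; the Borderline variants additionally restrict `i` to samples near the class boundary.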
Threshold signature has been widely used in electronic wills, electronic elections, cloud computing, secure multiparty computation and other fields. Until now, certificateless threshold signature schemes have all been based on traditional mathematical theory, so they cannot resist quantum computing attacks. In view of this, we combine the advantages of lattice-based and certificateless cryptosystems to construct a certificateless threshold signature from lattices (LCLTS) that is efficient and resistant to quantum algorithm attacks. LCLTS has the threshold characteristics and can resist quantum computing attacks, and the analysis shows that it is unforgeable against adaptive Chosen-Message Attacks (UF-CMA) under the difficulty of the Inhomogeneous Small Integer Solution (ISIS) problem. In addition, LCLTS avoids the problems of certificate management and key escrow.
Deep-sea sediment is extremely important in marine scientific research, such as that concerning marine geology and microbial communities. The research findings are closely related to the in-situ information of the sediment. One prerequisite for investigations of deep-sea sediment is sampling techniques capable of preventing distortion during recovery. As the fruit of such sampling techniques, samplers designed for obtaining sediment have become indispensable equipment, owing to their low cost, light weight, compactness, easy operation, and high adaptability to sea conditions. This paper introduces the research and application of typical deep-sea sediment samplers. Then, a representative sampler recently developed in China is analyzed. On this basis, a review and analysis is conducted regarding the key techniques of various deep-sea sediment samplers, including sealing, pressure and temperature retaining, low-disturbance sampling, and no-pressure-drop transfer. The shortcomings in the key techniques for deep-sea sediment sampling are then identified. Finally, prospects for the future development of key techniques for deep-sea sediment sampling are proposed, from the perspectives of structural diversification, functional integration, intelligent operation, and high-fidelity samples. This paper summarizes the existing samplers in the context of the key techniques mentioned above, and can provide a reference for the optimized design of samplers and the development of key sampling techniques.
The accumulator is used as a pressure compensation device to realize deep-sea microbe gastight sampling. Four key states of the accumulator are proposed to describe the pressure compensation process, and a corresponding mathematical model is established to investigate the relationship between the results of pressure compensation and the parameters of the accumulator. Simulation results show that during the falling process of the sampler, the accumulator's real opening pressure is greater than its precharge pressure; when the sampling depth is 6000 m and the accumulator's precharge pressure is less than 30 MPa, increasing the precharge pressure can obviously improve the pressure compensation results. Laboratory experiments at 60 MPa show that the accumulator is an effective and reliable pressure compensation device for deep-sea microbe samplers. The success of a sea trial at a depth of 2000 m in the South China Sea shows that the mathematical model and laboratory experiment results are reliable.
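The precharge-pressure effect described above stems from the gas side of the accumulator compressing under ambient pressure as the sampler descends. A minimal isothermal (Boyle's law) sketch of that relation, not the paper's full four-state model; the function and numbers are illustrative assumptions:

```python
def gas_volume_at_pressure(p_precharge, v_total, p_ambient):
    """Isothermal (Boyle's law) sketch of an accumulator's gas side:
    gas precharged to p_precharge (MPa) in volume v_total (L) is
    compressed to the ambient pressure p_ambient, freeing volume on
    the oil side for pressure compensation.
    Returns (gas_volume, oil_volume_available)."""
    if p_ambient <= p_precharge:
        return v_total, 0.0          # bladder not yet compressed
    v_gas = p_precharge * v_total / p_ambient
    return v_gas, v_total - v_gas
```

Under this simplification, raising the precharge pressure toward the ambient pressure shrinks the compensating oil volume per cycle but raises the pressure at which compensation begins, which is the trade-off the paper's simulations explore.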
Atmospheric radionuclide monitoring usually includes two sampling techniques, namely ultra-high volume aerosol samplers to collect atmospheric particles by using filter media, and radioactive noble gas samplers to collect atmospheric noble gas based on an adsorption method. Atmospheric sampling techniques have been researched at the Northwest Institute of Nuclear Technology since the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was signed in 1996. Several ultra-high volume aerosol samplers and several types of radioactive xenon isotope samplers have been developed. For the aerosol sampler, the sampling flow is between 450 and 800 m3/h, with a minimum detectable concentration (MDC) of 131I less than 5 pBq/m3. For the xenon sampler, the sampling capacity of xenon is more than 4 ml per day, with an MDC of 133Xe less than 0.25 mBq/m3. After the Fukushima nuclear accident in 2011, monitoring of atmospheric radionuclides was carried out for 3 months at Xi'an, and some radionuclides were detected with concentrations higher than their backgrounds in that period, including 131I, 134Cs, 137Cs and 133Xe.
The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of the results. After size reduction in grinding mills, samples of bauxite are collected at intervals of 3 to 4 hours. Large bauxite processing industries producing 1 million tons of pure aluminium can have three grinding mills, so the total number of samples to be tested in one day reaches a figure of 18 to 24. The sample of bauxite ore coming from the grinding mill is tested for its particle size and composition. For testing the composition, the bauxite ore sample is first prepared by fusing it with X-ray flux. Then the sample is sent for X-ray fluorescence analysis. Afterwards, the crucibles are washed in ultrasonic baths to be used for the next testing. The whole procedure takes about 2 - 3 hours. With a large number of samples reaching the laboratory, the chances of error in composition analysis increase. In this study, we have used a composite sampling methodology to reduce the number of samples reaching the laboratory without compromising their validity. The results of the average composition of fifteen samples were measured against composite samples. The mean of the differences was calculated. The standard deviation and paired t-test values were evaluated against predetermined critical values obtained using a two-tailed test. It was found from the results that the paired t-test values were much lower than the critical values, thus validating the composition attained through composite sampling. The composite sampling approach reduced not only the number of samples but also the chemicals used in the laboratory. The objective of an improved analytical protocol to reduce the number of samples reaching the laboratory was successfully achieved without compromising the quality of the analytical results.
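The paired t-test validation described above can be sketched with SciPy; the composition values below are hypothetical, not the study's data:

```python
import numpy as np
from scipy import stats

# Hypothetical Al2O3 % figures: average of individual samples in each
# interval vs. the corresponding composite sample
avg_of_individuals = np.array([48.2, 47.9, 48.5, 48.1, 47.8])
composite          = np.array([48.0, 48.1, 48.3, 48.2, 47.9])

t_stat, p_value = stats.ttest_rel(avg_of_individuals, composite)

# Two-tailed critical value at alpha = 0.05 with n - 1 degrees of freedom
t_crit = stats.t.ppf(1 - 0.025, df=len(composite) - 1)

# |t| below the critical value: no significant difference, so the
# composite result is accepted as equivalent to the individual averages
equivalent = abs(t_stat) < t_crit
```

This mirrors the study's acceptance criterion: the paired t statistic is compared against the predetermined two-tailed critical value rather than against a p-value directly.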
Developments in biomedical science and signal processing technologies have led Electroencephalography (EEG) signals to be widely used in the diagnosis of brain disease and in the field of Brain-Computer Interface (BCI). The collected EEG signals are processed using Machine Learning (Random Forest and Naive Bayes) and Deep Learning (Recurrent Neural Network (RNN), Neural Network (NN) and Long Short-Term Memory (LSTM)) algorithms to obtain the recent mood of a person. The algorithms mentioned above were applied to the data set in order to find out what the person is feeling at a particular moment. This thesis is conducted to find out one of the following moods (happy, surprised, disgust, fear, anger and sadness) of a person at an instant, with the aim of obtaining the result with the least time delay, as the mood changes. The accuracy of the output varies depending upon the algorithm used and the time taken to process the data, so the reliability and dependability of one algorithm can be compared against another prior to practical implementation. The data sets that were used had an imbalanced class and thus overfitting occurred. This problem was handled by generating artificial data with the SMOTE oversampling technique.
[Objective] The paper was to study the spatial distribution pattern of fourth-generation mature larvae of cotton bollworm in corn fields. [Method] Plots with different occurrence densities of fourth-generation cotton bollworm were investigated from August to September in 2009. Six groups of sampling data were obtained, and seven indicators, including the aggregation index method, Iwao method, and Taylor method, were used to determine its spatial distribution pattern. [Result] The aggregation index test showed that in all plots, Moore I < 0, Lloyd m*/m < 1, Kuno Ca < 0, diffusion coefficient C < 1, diffusion index Iδ < 1, and negative binomial distribution K < 0, indicating that mature larvae of cotton bollworm showed uniform distribution in summer corn. The Iwao regression equation of fourth-generation mature larvae of cotton bollworm in summer corn was m* = 0.0906 + 0.7669m, r = 0.9863, indicating that the basic component of the cotton bollworm distribution was the single individual, and that mature larvae of cotton bollworm in summer corn showed uniform distribution. The optimal sampling number of fourth-generation mature larvae of cotton bollworm in corn under different population densities could be calculated using the formula N1 = (1.0906/m - 0.2331)/D^2. [Conclusion] The results provide a basis for accurate evaluation of population quantities and the variation law of cotton bollworm, as well as prediction and control of the pest.
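The optimal-sample-number formula quoted in the abstract can be evaluated directly; here `D` is the allowable (relative) sampling error, and the defaults simply restate the fitted Iwao coefficients from the regression m* = 0.0906 + 0.7669m:

```python
def optimal_sample_number(m, D=0.2, alpha=0.0906, beta=0.7669):
    """Iwao-based optimal sample size:
    N = ((alpha + 1)/m + beta - 1) / D**2,
    which with the abstract's fitted coefficients reduces to
    N = (1.0906/m - 0.2331)/D**2.
    m is the mean larval density; D is the allowable error."""
    return ((alpha + 1.0) / m + beta - 1.0) / D ** 2
```

As expected for a uniform-leaning distribution, the required sample number falls quickly as the mean density m rises.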
For imbalanced datasets, the focus of classification is to identify samples of the minority class. The performance of current data mining algorithms is not good enough for processing imbalanced datasets. The synthetic minority over-sampling technique (SMOTE) is specifically designed for learning from imbalanced datasets, generating synthetic minority class examples by interpolating between nearby minority class examples. However, SMOTE encounters the overgeneralization problem. The density-based spatial clustering of applications with noise (DBSCAN) is not rigorous when dealing with samples near the borderline. We optimize the DBSCAN algorithm for this problem to make clustering more reasonable. This paper integrates the optimized DBSCAN and SMOTE, and proposes a density-based synthetic minority over-sampling technique (DSMOTE). First, the optimized DBSCAN is used to divide the samples of the minority class into three groups: core samples, borderline samples, and noise samples; the noise samples of the minority class are then removed so that more effective samples can be synthesized. In order to make full use of the information in core samples and borderline samples, different strategies are used to over-sample them. Experiments show that DSMOTE achieves better results than SMOTE and Borderline-SMOTE in terms of precision, recall and F-value.
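The DSMOTE pre-step, dividing the minority class into core, borderline, and noise groups, can be sketched with scikit-learn's stock DBSCAN (the paper uses an optimized variant; the helper name and parameters here are illustrative):

```python
import numpy as np
from sklearn.cluster import DBSCAN

def split_minority(X_min, eps=0.5, min_samples=4):
    """Label each minority sample as core, borderline, or noise with
    DBSCAN, so noise can be dropped before SMOTE-style interpolation.
    Returns three boolean masks over X_min."""
    db = DBSCAN(eps=eps, min_samples=min_samples).fit(X_min)
    core = np.zeros(len(X_min), dtype=bool)
    core[db.core_sample_indices_] = True
    noise = db.labels_ == -1               # density-unreachable points
    borderline = ~core & ~noise            # in a cluster but not core
    return core, borderline, noise
```

Only the core and borderline masks would then feed the over-sampling stage, with different interpolation strategies for each, as the abstract describes.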
As far as vibration signal processing is concerned, the component of the vibration signal resulting from incipient localized faults in a gearbox is too weak to be detected by the traditional detecting technology available now. A two-step method is introduced: the vibration signal from the gearbox is first processed by the synchronous average sampling technique, and then analyzed by the complex continuous wavelet transform to diagnose gear faults. Two different kinds of faults in the gearbox, i.e. shaft eccentricity and an initial crack in the tooth fillet, are detected and distinguished from each other successfully.
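The synchronous average sampling step can be sketched as follows, assuming the signal has already been resampled to a fixed number of points per shaft revolution (the function and test signal are illustrative):

```python
import numpy as np

def synchronous_average(signal, samples_per_rev):
    """Time-synchronous averaging: cut the signal into whole shaft
    revolutions and average them point-by-point, reinforcing
    components locked to the rotation and cancelling those that are
    not (noise, other shafts)."""
    n_rev = len(signal) // samples_per_rev
    segs = np.reshape(signal[:n_rev * samples_per_rev],
                      (n_rev, samples_per_rev))
    return segs.mean(axis=0)
```

The averaged revolution is then passed to the wavelet analysis; weak fault components survive the averaging while asynchronous noise is attenuated roughly as 1/sqrt(number of revolutions).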
Since its introduction, endoscopic ultrasound (EUS)-guided fine needle aspiration and fine needle biopsy have become an indispensable tool for the diagnosis of lesions within the gastrointestinal tract and surrounding organs. It has proved to be an effective diagnostic method with high accuracy and low complication rates. Several factors can influence the accuracy and the diagnostic yield of this procedure, including the experience of the endosonographer, the availability of onsite cytopathology services, the method of cytopathology preparation, the location and physical characteristics of the lesion, sampling techniques, and the type and size of the needle used. In this review we outline the recent studies evaluating EUS-guided tissue acquisition and provide practical recommendations to maximize tissue yield.
Information on forest structure is important for forest management decisions, yet it is inadequate in many situations, especially where timber is not of primary interest. We analyzed the structure of two forest types in the Oban Division of Cross River National Park, Nigeria. A systematic sampling technique was used to establish two transects measuring 2,000 m x 2 m, at a 600 m interval, in the two forest types in four locations. Four 50 m x 50 m plots were located alternately at 500 m intervals along each transect, constituting 32 plots per forest type and 64 plots in all. Diameter at breast height (DBH), diameters at base, middle, and top, crown diameter, total height, and crown length were measured on all trees with DBH ≥ 10 cm. There were 159 stems/ha in the closed-canopy forest and 132 stems/ha in the secondary forest. The mean DBH were 34.5 cm and 33.62 cm, respectively. The mean heights were 24.79 m and 23.97 m, respectively. Basal areas were 41.59 m2/ha and 27.38 m2/ha for the two forest types. The majority of the trees encountered in the two forest types belonged to the middle stratum, which has implications for small mammal populations. Emergent trees, which are otherwise scarce in other parts of the country, were recorded, which also has implications for density thinning and seed supplies.
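The basal-area-per-hectare figures above come from summing stem cross-sections computed from DBH and scaling the plot total to one hectare. A sketch of that standard computation (the function name and plot size are illustrative):

```python
import math

def basal_area_per_ha(dbh_cm, plot_area_m2):
    """Stand basal area (m^2/ha) from a list of DBH values (cm)
    measured on one plot: sum the stem cross-sections pi*(d/2)^2,
    then scale from the plot area to one hectare (10,000 m^2)."""
    ba_m2 = sum(math.pi * (d / 100.0 / 2.0) ** 2 for d in dbh_cm)
    return ba_m2 * 10_000.0 / plot_area_m2
```

For example, a single 20 cm DBH tree on a 50 m x 50 m plot contributes about 0.126 m2/ha.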
Summary: To compare and evaluate two methodologies for the harvesting of vitreous humor, entire-sampling and micro-sampling, the vitreous humor of rabbits was sampled with the two methods respectively, and the concentrations of calcium, chlorine, potassium, sodium and phosphorus of the specimens were measured. The results showed that the differences in the variance coefficient and two-eye concentrations of micro-sampled specimens were less than those of the entire-sampled specimens. In the micro-sampling group, the concentrations from repeated micro-sampling showed no differences among different groups (P > 0.05), and the intra-ocular fluid dynamics did not have a significant influence on post-mortem sampling. The sampling technique may affect the concentrations of the specimens collected. Our study suggests that micro-sampling is less influenced by the human factor and is reliable, reproducible, and more suitable for forensic investigation.
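The variance-coefficient comparison above can be sketched with the standard coefficient of variation; the helper and the values in the test are illustrative, not the study's measurements:

```python
import numpy as np

def coefficient_of_variation(x):
    """CV (%) = sample standard deviation / mean * 100, the
    repeatability measure compared between entire-sampling and
    micro-sampling electrolyte concentrations."""
    x = np.asarray(x, dtype=float)
    return 100.0 * x.std(ddof=1) / x.mean()
```

A smaller CV across repeated measurements of the same eye indicates the more reproducible sampling method.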
The purpose of this work is to apply a game theory approach to determine patients' preferences of healthcare facilities for quality healthcare in Akwa Ibom State. A cross-sectional descriptive study and purposive sampling technique were adopted in order to collect the relevant data. Factors influencing patients' preferences of healthcare facilities between public and private hospitals in Akwa Ibom State were assessed using a set of questionnaires which were distributed to 9976 patients in University of Uyo Teaching Hospital, Uyo, Akwa Ibom State. A two-person zero-sum game theory approach was applied. Perception of the quality of healthcare services received at respondents' preferred facilities between public and private hospitals was examined. The reasons for patients' persistence with their preferred facilities were also evaluated using the questionnaire. The optimal strategy and the value of the game were determined using the factors influencing patients' preferences of healthcare facilities, and analysed with a two-person zero-sum game. The facility that gives its clients the best satisfaction was identified. The data collected through the questionnaire were analysed using the rules of dominance in a two-person zero-sum game, and the TORA statistical software was employed. The result shows that the value of the game is v = 330, which implies that the game is favourable to the public hospital. The result also showed that patients preferred public hospitals due to costs of services with probability one (1), while private hospitals attributed their preferences to the attitude of healthcare providers with probability one (1).
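The two-person zero-sum game solution (optimal mixed strategies and game value) can be sketched as a linear program, here with SciPy rather than the TORA software used in the study; the payoff matrix in the example is hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(A):
    """Optimal mixed strategy x and game value v for the row player
    of a two-person zero-sum game with payoff matrix A (rows: row
    player's strategies): maximize v subject to x^T A[:, j] >= v for
    every column j, sum(x) = 1, x >= 0."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    # decision variables: x_1..x_m and v (v last, sign-unrestricted)
    c = np.zeros(m + 1)
    c[-1] = -1.0                               # linprog minimizes, so -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])  # v - x^T A[:, j] <= 0
    b_ub = np.zeros(n)
    A_eq = np.ones((1, m + 1))
    A_eq[0, -1] = 0.0                          # probabilities sum to 1
    b_eq = [1.0]
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds)
    return res.x[:m], res.x[-1]
```

When dominance reduces the matrix to a saddle point, the LP returns a pure strategy (a probability of one on a single row), matching the probability-one outcomes reported above.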
By means of the polymerase chain reaction (PCR) technique, direct smear fluorescence microscopy and bacterial culture, the sputa and purulent secretions of 122 TB patients were examined to detect Mycobacterium tuberculosis.
When a mass spreads in a turbulent flow, areas with obviously higher concentration of the mass than the surrounding areas are formed by organized structures of turbulence. In this study, we extract the high concentration areas and investigate their diffusion process. For this purpose, a combination of Planar Laser Induced Fluorescence (PLIF) and Particle Image Velocimetry (PIV) techniques was employed to obtain simultaneously the two fields of the concentration of injected dye and of the velocity in a turbulent water channel flow. Focusing on quasi-homogeneous turbulence in the central region of the channel, a series of PLIF and PIV images was acquired at several different downstream positions. We applied a conditional sampling technique to the PLIF images to extract the high concentration areas, or spikes, and calculated the conditionally averaged statistics of the extracted areas, such as length scale, mean concentration, and turbulent diffusion coefficient. We found that the averaged length scale was constant with downstream distance from the diffusion source and was smaller than the integral scale of the turbulent eddies. The spanwise distribution of the mean concentration was basically Gaussian, and the spanwise width of the spikes increased linearly with downstream distance from the diffusion source. Moreover, the turbulent diffusion coefficient was found to increase in proportion to the spanwise distance from the source. These results reveal aspects different from those of regular mass diffusion and let us conclude that the diffusion process of the spikes differs from that of regular mass diffusion.
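The conditional sampling used to extract the high-concentration spikes can be sketched as simple thresholding against the field mean; the threshold factor here is an illustrative choice, not the study's criterion:

```python
import numpy as np

def extract_spikes(c, threshold_factor=2.0):
    """Conditional-sampling sketch: flag points whose concentration
    exceeds threshold_factor times the field mean as 'spikes' (high
    concentration areas), and report their conditional mean.
    Returns (boolean mask, mean concentration over the spikes)."""
    c = np.asarray(c, dtype=float)
    mask = c > threshold_factor * c.mean()
    return mask, c[mask].mean() if mask.any() else 0.0
```

Conditionally averaged statistics (spike length scale, width, and so on) are then computed only over the flagged regions rather than the whole field.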
The improved line sampling (LS) technique, an effective numerical simulation method, is employed to analyze the probabilistic characteristics and reliability sensitivity of flutter with random structural parameters in transonic flow. The improved LS technique is a novel methodology for reliability and sensitivity analysis of high-dimensional, low-probability problems with an implicit limit state function, and it does not require any approximating surrogate of the implicit limit state equation. The improved LS is used to estimate the flutter reliability and sensitivity of a two-dimensional wing, in which some structural properties, such as frequency, center-of-gravity parameters, and mass ratio, are considered as random variables. A computational fluid dynamics (CFD) based unsteady aerodynamic reduced order model (ROM) method is used to construct the aerodynamic state equations. Coupling the structural state equations with the aerodynamic state equations, the safety margin of flutter is formulated using the critical flutter velocity. The results show that the improved LS technique can effectively decrease the computational cost of the random uncertainty analysis of flutter. The reliability sensitivity, defined as the partial derivative of the failure probability with respect to the distribution parameter of a random variable, can help to identify the important parameters and guide structural optimization design.
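Line sampling estimates a small failure probability by finding, along each of many lines parallel to an important direction in standard normal space, the distance to the limit state, and averaging the resulting one-dimensional tail probabilities. A minimal sketch for a limit state `g` that decreases along the important direction (the fixed bracket and plain bisection are simplifying assumptions, not the improved LS of the paper):

```python
import numpy as np
from scipy.stats import norm

def line_sampling_pf(g, alpha, n_lines=100, dim=2, rng=0):
    """Minimal line-sampling estimator: for each random line parallel
    to the unit important direction alpha, locate the root of
    c -> g(u + c*alpha) by bisection on [0, 10] and accumulate the
    exact 1-D tail probability Phi(-c_root)."""
    rng = np.random.default_rng(rng)
    alpha = np.asarray(alpha, dtype=float)
    alpha /= np.linalg.norm(alpha)
    pf = 0.0
    for _ in range(n_lines):
        u = rng.standard_normal(dim)
        u -= (u @ alpha) * alpha              # project out alpha: line origin
        lo, hi = 0.0, 10.0                    # assumed root bracket
        for _ in range(60):                   # bisection to machine precision
            mid = 0.5 * (lo + hi)
            if g(u + mid * alpha) > 0:
                lo = mid
            else:
                hi = mid
        pf += norm.cdf(-0.5 * (lo + hi))
    return pf / n_lines
```

Because each line contributes an exact Gaussian tail probability, far fewer samples are needed than with crude Monte Carlo for the same small failure probability, which is the cost advantage the abstract reports.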
文摘A novel technique of Moveable Reduction Bed Hydride Generator(MRBHG)was applied tohe hydride generation or cold vapor generation of As,Se,Ge,and Hg existing In TraditionalChinese Medicinal Material(TCM).The simultaneous determination of the multi-elements wasperformed with ICP-MS.A solid reduction system involving the use of potassiumtetraborohydride and tartaric acid was applied to generating metal hydride or cold vaporefficiently.The factors affecting the metal cold vapor generation were studied.The mainadvantage of the technique is that only a 4μL volume of sample was required for the cold vapor
文摘This paper advances the viewpoints and methods of the rapid sample product trial manufacture technique for developing water meter new products by CAD and simulation, computer virtual assembling and optimizing, rapid machining process and measurement etc. as the design and sample product trial manufacture process of water meter new products are long in product development period, and low in product development efficiency in the present time.
基金support from the Cyber Technology Institute(CTI)at the School of Computer Science and Informatics,De Montfort University,United Kingdom,along with financial assistance from Universiti Tun Hussein Onn Malaysia and the UTHM Publisher’s office through publication fund E15216.
文摘Integrating machine learning and data mining is crucial for processing big data and extracting valuable insights to enhance decision-making.However,imbalanced target variables within big data present technical challenges that hinder the performance of supervised learning classifiers on key evaluation metrics,limiting their overall effectiveness.This study presents a comprehensive review of both common and recently developed Supervised Learning Classifiers(SLCs)and evaluates their performance in data-driven decision-making.The evaluation uses various metrics,with a particular focus on the Harmonic Mean Score(F-1 score)on an imbalanced real-world bank target marketing dataset.The findings indicate that grid-search random forest and random-search random forest excel in Precision and area under the curve,while Extreme Gradient Boosting(XGBoost)outperforms other traditional classifiers in terms of F-1 score.Employing oversampling methods to address the imbalanced data shows significant performance improvement in XGBoost,delivering superior results across all metrics,particularly when using the SMOTE variant known as the BorderlineSMOTE2 technique.The study concludes several key factors for effectively addressing the challenges of supervised learning with imbalanced datasets.These factors include the importance of selecting appropriate datasets for training and testing,choosing the right classifiers,employing effective techniques for processing and handling imbalanced datasets,and identifying suitable metrics for performance evaluation.Additionally,factors also entail the utilisation of effective exploratory data analysis in conjunction with visualisation techniques to yield insights conducive to data-driven decision-making.
基金supported by the Key Project of Natural Science Basic Research Plan of Shaanxi Province under the Grant 2020JZ-54.
文摘Threshold signature has been widely used in electronic wills,electronic elections,cloud computing,secure multiparty computation and other fields.Until now,certificateless threshold signature schemes are all based on traditional mathematic theory,so they cannot resist quantum computing attacks.In view of this,we combine the advantages of lattice-based cryptosystem and certificateless cryptosystem to construct a certificateless threshold signature from lattice(LCLTS)that is efficient and resistant to quantum algorithm attacks.LCLTS has the threshold characteristics and can resist the quantum computing attacks,and the analysis shows that it is unforgeable against the adaptive Chosen-Message Attacks(UF-CMA)with the difficulty of Inhomogeneous Small Integer Solution(ISIS)problem.In addition,LCLTS solves the problems of the certificate management through key escrow.
Funding: Supported by the National Key Research and Development Program of China (Grant No. 2016YFC0300502), the Hunan Provincial Innovation Foundation for Postgraduate (Grant No. CX2018B658), the National Natural Science Foundation of China (Grant Nos. 51705145, 517779092), the Scientific Research Fund of Hunan Provincial Education Department (Grant No. 18B205), and the Hunan Province Natural Science Foundation (Grant No. 2019JJ50182).
Abstract: Deep-sea sediment is extremely important in marine scientific research, such as that concerning marine geology and microbial communities. The research findings are closely related to the in-situ information of the sediment. One prerequisite for investigations of deep-sea sediment is providing sampling techniques capable of preventing distortion during recovery. As the fruit of such sampling techniques, samplers designed for obtaining sediment have become indispensable equipment, owing to their low cost, light weight, compactness, easy operation, and high adaptability to sea conditions. This paper introduces the research and application of typical deep-sea sediment samplers. Then, a representative sampler recently developed in China is analyzed. On this basis, a review and analysis is conducted regarding the key techniques of various deep-sea sediment samplers, including sealing, pressure and temperature retaining, low-disturbance sampling, and no-pressure-drop transfer. Then, the shortcomings in the key techniques for deep-sea sediment sampling are identified. Finally, prospects for the future development of key techniques for deep-sea sediment sampling are proposed, from the perspectives of structural diversification, functional integration, intelligent operation, and high-fidelity samples. This paper summarizes the existing samplers in the context of the key techniques mentioned above, and can provide a reference for the optimized design of samplers and the development of key sampling techniques.
Abstract: The accumulator is used as a pressure compensation device to realize deep-sea microbe gastight sampling. Four key states of the accumulator are proposed to describe the pressure compensation process, and a corresponding mathematical model is established to investigate the relationship between the results of pressure compensation and the parameters of the accumulator. Simulation results show that during the descent of the sampler, the accumulator's real opening pressure is greater than its precharge pressure; when the sampling depth is 6000 m and the accumulator's precharge pressure is less than 30 MPa, increasing the precharge pressure can improve pressure compensation results markedly. Laboratory experiments at 60 MPa show that the accumulator is an effective and reliable pressure compensation device for deep-sea microbe samplers. The success of a sea trial at a depth of 2000 m in the South China Sea shows that the mathematical model and laboratory experiment results are reliable.
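A first-order sanity check on the numbers above can be made with hydrostatic pressure and Boyle's law. This is a minimal isothermal sketch, not the paper's four-state accumulator model; the seawater density, gravity value, and gas-law assumption are mine.

```python
# First-order check only: isothermal gas behaviour, constant seawater
# density (assumptions), not the paper's four-state accumulator model.
RHO_SEAWATER = 1025.0   # kg/m^3, assumed mean seawater density
G = 9.81                # m/s^2

def hydrostatic_pressure_mpa(depth_m):
    """Ambient gauge pressure at depth: p = rho * g * h, in MPa."""
    return RHO_SEAWATER * G * depth_m / 1e6

def gas_pressure_isothermal(p0, v0, v1):
    """Boyle's law for the precharged gas: p0 * v0 = p1 * v1."""
    return p0 * v0 / v1
```

At 6000 m this gives roughly 60 MPa of ambient pressure, consistent with the 60 MPa laboratory experiments mentioned in the abstract.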
Abstract: Atmospheric radionuclide monitoring usually includes two sampling techniques, namely ultra-high-volume aerosol samplers that collect atmospheric particles on filter media, and radioactive noble gas samplers that collect atmospheric noble gases by adsorption. Atmospheric sampling techniques have been researched at the Northwest Institute of Nuclear Technology since the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was signed in 1996. Several ultra-high-volume aerosol samplers and several types of radioactive xenon isotope samplers have been developed. For the aerosol sampler, the sampling flow is between 450 and 800 m3/h, with a minimum detectable concentration (MDC) of 131I less than 5 pBq/m3. For the xenon sampler, the sampling capacity of xenon is more than 4 ml per day, with an MDC of 133Xe less than 0.25 mBq/m3. After the Fukushima nuclear accident in 2011, atmospheric radionuclide monitoring was carried out for 3 months at Xi'an, and some radionuclides were detected with concentrations higher than their backgrounds in that period, including 131I, 134Cs, 137Cs and 133Xe.
Abstract: The laboratories in the bauxite processing industry are always under a heavy workload of sample collection, analysis, and compilation of results. After size reduction in the grinding mills, samples of bauxite are collected at intervals of 3 to 4 hours. A large bauxite processing plant producing 1 million tons of pure aluminium can have three grinding mills, so the total number of samples to be tested in one day reaches 18 to 24. The sample of bauxite ore coming from the grinding mill is tested for its particle size and composition. For testing the composition, the bauxite ore sample is first prepared by fusing it with X-ray flux and then sent for X-ray fluorescence analysis. Afterwards, the crucibles are washed in ultrasonic baths to be used for the next test. The whole procedure takes about 2 - 3 hours. With a large number of samples reaching the laboratory, the chance of error in composition analysis increases. In this study, we used a composite sampling methodology to reduce the number of samples reaching the laboratory without compromising their validity. The average composition of fifteen samples was measured against composite samples, and the mean of the differences was calculated. The standard deviation and paired t-test values were evaluated against predetermined critical values obtained using a two-tailed test. The paired t-test values were much lower than the critical values, thus validating the composition attained through composite sampling. The composite sampling approach reduced not only the number of samples but also the chemicals used in the laboratory. The objective of an improved analytical protocol to reduce the number of samples reaching the laboratory was achieved without compromising the quality of the analytical results.
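The validation logic above, comparing composite-sample results against the mean of individual samples with a paired t-test, can be sketched directly. The readings below are hypothetical placeholders (the paper's data are not given in the abstract), and the critical value is the standard two-tailed t for alpha = 0.05 at the example's degrees of freedom.

```python
import numpy as np

def paired_t(a, b):
    """Paired t statistic: mean difference over its standard error."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Hypothetical Al2O3 readings (%): average-of-individual vs composite.
individual = [51.2, 50.8, 51.5, 50.9, 51.1]
composite  = [51.0, 50.9, 51.3, 51.0, 51.2]
t = paired_t(individual, composite)
T_CRIT = 2.776   # two-tailed critical value, alpha = 0.05, df = n - 1 = 4
significant = abs(t) > T_CRIT   # False here: the two methods agree
```

When |t| falls below the critical value, as in the study, the composite result is statistically indistinguishable from the average of the individual samples.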
Abstract: Developments in biomedical science and signal processing technologies have led electroencephalography (EEG) signals to be widely used in the diagnosis of brain disease and in the field of Brain-Computer Interface (BCI). The collected EEG signals are processed using machine learning (Random Forest and Naive Bayes) and deep learning (Recurrent Neural Network (RNN), Neural Network (NN) and Long Short-Term Memory (LSTM)) algorithms to obtain the current mood of a person. These algorithms were applied to the data set in order to find out what the person is feeling at a particular moment. The study aims to identify one of the following moods (happy, surprised, disgust, fear, anger and sadness) of a person at an instant, with the goal of obtaining the result with the least time delay, as the mood changes over time. The accuracy of the output varies depending on the algorithm used and the time taken to process the data, so that the reliability and dependability of one algorithm can be compared with another prior to practical implementation. The data sets used had an imbalanced class distribution, and thus overfitting occurred. This problem was handled by generating artificial data sets with the SMOTE oversampling technique.
Abstract: [Objective] The aim of the paper was to study the spatial distribution pattern of fourth-generation mature larvae of cotton bollworm in corn fields. [Method] Plots with different occurrence densities of fourth-generation cotton bollworm were investigated from August to September 2009. Six groups of sampling data were obtained, and seven indicators, including the aggregation index method, Iwao method and Taylor method, were used to determine the spatial distribution pattern. [Result] The aggregation index test showed that in all plots, Moore I < 0, Lloyd m*/m < 1, Kuno Ca < 0, diffusion coefficient C < 1, diffusion index Iδ < 1, and negative binomial distribution K < 0, indicating that mature larvae of cotton bollworm showed a uniform distribution in summer corn. The Iwao regression equation of fourth-generation mature larvae of cotton bollworm in summer corn was m* = 0.0906 + 0.7669m, r = 0.9863, indicating that the basic component of the cotton bollworm distribution was the single individual, and that mature larvae of cotton bollworm in summer corn showed a uniform distribution. The optimal sampling number of fourth-generation mature larvae of cotton bollworm in corn under different population densities could be calculated using the formula N = (1.0906/m - 0.2331)/D². [Conclusion] The results provide a basis for accurate evaluation of the population size and variation law of cotton bollworm, as well as prediction and control of the pest.
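The dispersion indices used above are simple functions of the sample mean and variance of per-plot counts, and the optimal sampling number follows directly from the Iwao regression coefficients (note that 1.0906 = α + 1 and -0.2331 = β - 1 for the reported α = 0.0906, β = 0.7669). A minimal sketch, with my own helper names; the thresholds follow the standard interpretation that C < 1 and m*/m < 1 indicate a uniform distribution:

```python
import numpy as np

def aggregation_indices(counts):
    """Dispersion indices from per-plot larval counts (sketch).

    C    = s^2 / m        dispersion coefficient (< 1 -> uniform)
    m*   = m + s^2/m - 1  Lloyd's mean crowding
    m*/m                  Lloyd's patchiness index (< 1 -> uniform)
    """
    x = np.asarray(counts, float)
    m, s2 = x.mean(), x.var(ddof=1)
    m_star = m + s2 / m - 1.0
    return {"mean": m, "C": s2 / m,
            "m_star": m_star, "patchiness": m_star / m}

def iwao_optimal_n(m, alpha=0.0906, beta=0.7669, D=0.2):
    """Optimal sample size from the Iwao regression m* = alpha + beta*m:
    N = ((alpha + 1)/m + beta - 1) / D^2, with allowable error D."""
    return ((alpha + 1.0) / m + beta - 1.0) / D ** 2
```

With the abstract's coefficients, iwao_optimal_n reproduces N = (1.0906/m - 0.2331)/D².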
Funding: Supported by the National Key Research and Development Program of China (2018YFB1003700), the Scientific and Technological Support Project (Society) of Jiangsu Province (BE2016776), the "333" Project of Jiangsu Province (BRA2017228, BRA2017401), and the Talent Project in Six Fields of Jiangsu Province (2015-JNHB-012).
Abstract: For imbalanced datasets, the focus of classification is to identify samples of the minority class. The performance of current data mining algorithms is not good enough for processing imbalanced datasets. The synthetic minority over-sampling technique (SMOTE) is specifically designed for learning from imbalanced datasets, generating synthetic minority class examples by interpolating between nearby minority class examples. However, SMOTE encounters the overgeneralization problem. Density-based spatial clustering of applications with noise (DBSCAN) is not rigorous when dealing with samples near the borderline. We optimize the DBSCAN algorithm for this problem to make clustering more reasonable. This paper integrates the optimized DBSCAN and SMOTE, and proposes a density-based synthetic minority over-sampling technique (DSMOTE). First, the optimized DBSCAN is used to divide the samples of the minority class into three groups, including core samples, borderline samples and noise samples, and the noise samples of the minority class are then removed so that more effective samples can be synthesized. In order to make full use of the information in core samples and borderline samples, different strategies are used to over-sample core samples and borderline samples. Experiments show that DSMOTE can achieve better results than SMOTE and Borderline-SMOTE in terms of precision, recall and F-value.
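The core/borderline/noise split that DSMOTE builds on can be sketched without the full DBSCAN machinery. The following is an illustrative NumPy-only approximation (function names and the single interpolation strategy are mine, not the paper's): points with at least min_pts neighbours within eps are core, non-core points adjacent to a core point are borderline, and the rest are noise, which is excluded from oversampling.

```python
import numpy as np

def density_split(X, eps=1.0, min_pts=3):
    """Split points into core / borderline / noise, DBSCAN-style (sketch)."""
    X = np.asarray(X, float)
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    neighbours = D <= eps
    core = neighbours.sum(axis=1) >= min_pts      # count includes self
    border = ~core & (neighbours & core).any(axis=1)
    noise = ~core & ~border
    return core, border, noise

def dsmote_sketch(X_min, eps=1.0, min_pts=3, n_new=10, rng=0):
    """Oversample only core + borderline minority points; drop noise."""
    core, border, _ = density_split(X_min, eps, min_pts)
    keep = np.asarray(X_min, float)[core | border]
    r = np.random.default_rng(rng)
    i = r.integers(0, len(keep), n_new)
    j = r.integers(0, len(keep), n_new)
    gap = r.uniform(0.0, 1.0, (n_new, 1))
    return keep[i] + gap * (keep[j] - keep[i])    # interpolated samples
```

The paper's DSMOTE additionally applies different oversampling strategies to core and borderline samples; the sketch treats them uniformly for brevity.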
Funding: Provincial Natural Science Foundation of Shanxi, China (No. 991051) and Provincial Foundation for Homecoming Personnel from Study Abroad of Shanxi, China (No. 194-101005).
Abstract: As far as vibration signal processing is concerned, the component of the vibration signal resulting from incipient localized faults in a gearbox is too weak to be detected by the traditional detecting technology available now. A method comprising two steps is introduced: the vibration signal from the gearbox is first processed by a synchronous average sampling technique, and then analyzed by the complex continuous wavelet transform to diagnose the gear fault. Two different kinds of faults in the gearbox, i.e., shaft eccentricity and an initial crack in the tooth fillet, are detected and distinguished from each other successfully.
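Synchronous (time-synchronous) averaging, the first step described above, folds the vibration record into segments of exactly one shaft revolution and averages them, so shaft-locked components reinforce while asynchronous noise averages toward zero. A minimal sketch, assuming the signal has already been resampled to a fixed number of samples per revolution:

```python
import numpy as np

def synchronous_average(signal, samples_per_rev):
    """Time-synchronous average: fold the signal into one shaft revolution.

    Components locked to shaft rotation (gear mesh, eccentricity) add
    coherently; asynchronous noise is attenuated by ~1/sqrt(n_revs).
    """
    signal = np.asarray(signal, float)
    n_rev = signal.size // samples_per_rev
    folded = signal[:n_rev * samples_per_rev].reshape(n_rev, samples_per_rev)
    return folded.mean(axis=0)
```

The averaged revolution is then a much cleaner input for the wavelet analysis of the second step.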
Abstract: Since its introduction, endoscopic ultrasound (EUS)-guided fine needle aspiration and fine needle biopsy have become an indispensable tool for the diagnosis of lesions within the gastrointestinal tract and surrounding organs. It has proved to be an effective diagnostic method with high accuracy and low complication rates. Several factors can influence the accuracy and the diagnostic yield of this procedure, including the experience of the endosonographer, the availability of onsite cytopathology services, the method of cytopathology preparation, the location and physical characteristics of the lesion, sampling techniques, and the type and size of the needle used. In this review we outline recent studies evaluating EUS-guided tissue acquisition and provide practical recommendations to maximize tissue yield.
Abstract: Information on forest structure is important for forest management decisions. Such information is inadequate in many situations, especially where timber is not of primary interest. We analyzed the structure of two forest types in the Oban Division of Cross River National Park, Nigeria. A systematic sampling technique was used to establish two transects measuring 2,000 m x 2 m, at a 600 m interval, in the two forest types in four locations. Four 50 m x 50 m plots were located alternately at 500 m intervals along each transect, constituting 32 plots per forest type and 64 plots in all. Diameters at breast height (DBH), base, middle and top; crown diameter; total height; and crown length were measured on all trees with DBH ≥ 10 cm. There were 159 stems/ha in the closed-canopy forest and 132 stems/ha in the secondary forest. The mean DBH values were 34.5 cm and 33.62 cm, respectively. The mean heights were 24.79 m and 23.97 m, respectively. Basal areas were 41.59 m²/ha and 27.38 m²/ha for the two forest types. The majority of the trees encountered in the two forest types belonged to the middle stratum, which has implications for small mammal populations. Emergent trees, which are otherwise scarce in other parts of the country, were recorded, which also has implications for density thinning and seed supplies.
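Basal area per hectare, reported above, is the summed cross-sectional stem area at breast height divided by the sampled area. A small sketch of the standard conversion (DBH in cm to stem area in m², function names my own):

```python
import math

def basal_area_m2(dbh_cm):
    """Cross-sectional stem area at breast height (m^2) from DBH in cm.

    BA = pi * (DBH/2)^2, with DBH converted from cm to m (divide by 100),
    hence the /200 for the radius.
    """
    return math.pi * (dbh_cm / 200.0) ** 2

def stand_basal_area(dbh_cm_list, plot_area_ha):
    """Stand basal area in m^2 per hectare."""
    return sum(basal_area_m2(d) for d in dbh_cm_list) / plot_area_ha
```

Note that a stand's basal area computed from individual DBH values exceeds the value implied by the mean DBH alone, because the mean of squared diameters exceeds the squared mean.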
Abstract: To compare and evaluate two methodologies for harvesting vitreous humor, entire-sampling and micro-sampling, the vitreous humor of rabbits was sampled with the two methods respectively, and the concentrations of calcium, chlorine, potassium, sodium and phosphorus in the specimens were measured. The results showed that the differences in the coefficient of variation and the two-eye concentrations of micro-sampled specimens were less than those of the entire-sampled specimens. In the micro-sampling group, the concentrations obtained from repeated micro-sampling showed no differences among the groups (P > 0.05), and intra-ocular fluid dynamics did not have a significant influence on post-mortem sampling. The sampling technique may affect the concentrations of the specimens collected. Our study suggests that micro-sampling is less influenced by the human factor and is reliable, reproducible, and more suitable for forensic investigation.
Abstract: The purpose of this work is to apply a game theory approach to determine patients' preferences of healthcare facilities for quality healthcare in Akwa Ibom State. A cross-sectional descriptive study and purposive sampling technique were adopted to collect the relevant data. Factors influencing patients' preferences of healthcare facilities between public and private hospitals in Akwa Ibom State were assessed using a set of questionnaires distributed to 9976 patients in the University of Uyo Teaching Hospital, Uyo, Akwa Ibom State. A two-person zero-sum game theory approach was applied. The perception of the quality of healthcare services received at the respondents' preferred facilities, public or private, was examined, and the reasons for patients' persistence with their preferred facilities were evaluated using the questionnaire. The optimal strategy and the value of the game were determined from the factors influencing patients' preferences of healthcare facilities and analysed as a two-person zero-sum game. The facility that gives its clients the best satisfaction was identified. The data collected through the questionnaire were analysed using the rules of dominance in a two-person zero-sum game, and the TORA statistical software was employed. The result shows that the value of the game is v = 330, which implies that the game is favourable to the public hospital. The result also shows that patients preferred public hospitals due to the cost of services with probability one (1), while preferences for private hospitals were attributed to the attitude of healthcare providers with probability one (1).
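The dominance-and-value analysis described above can be reproduced for any payoff matrix. The sketch below finds a pure-strategy saddle point of a two-person zero-sum game, if one exists; the 2x2 matrix used in the usage note is hypothetical, since the abstract reports the game value v = 330 and the pure optimal strategies but not the payoff matrix itself.

```python
import numpy as np

def saddle_point(A):
    """Pure-strategy solution of a two-person zero-sum game, if one exists.

    The row player maximises; an entry that is simultaneously the minimum
    of its row and the maximum of its column is a saddle point, and its
    payoff is the value of the game.  Returns (row, col, value) or None.
    """
    A = np.asarray(A, float)
    for i in range(A.shape[0]):
        for j in range(A.shape[1]):
            if A[i, j] == A[i, :].min() == A[:, j].max():
                return i, j, float(A[i, j])
    return None
```

For the hypothetical payoff matrix [[330, 400], [200, 500]], saddle_point returns (0, 0, 330.0): both players play a single pure strategy with probability one, the situation the abstract reports. A matrix like [[1, -1], [-1, 1]] has no saddle point and would require mixed strategies.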
Abstract: By means of the polymerase chain reaction (PCR) technique, direct smear fluorescence microscopy and bacterial culture, the sputa and purulent secretions of 122 TB patients were examined to detect Mycobacterium tuberculosis.
Abstract: When a mass spreads in a turbulent flow, areas with markedly higher concentration of the mass than the surrounding areas are formed by the organized structures of turbulence. In this study, we extract these high-concentration areas and investigate their diffusion process. For this purpose, a combination of Planar Laser Induced Fluorescence (PLIF) and Particle Image Velocimetry (PIV) techniques was employed to obtain simultaneously the concentration field of the injected dye and the velocity field in a turbulent water channel flow. Focusing on the quasi-homogeneous turbulence in the channel's central region, a series of PLIF and PIV images were acquired at several different downstream positions. We applied a conditional sampling technique to the PLIF images to extract the high-concentration areas, or spikes, and calculated conditionally averaged statistics of the extracted areas such as length scale, mean concentration, and turbulent diffusion coefficient. We found that the averaged length scale was constant with downstream distance from the diffusion source and was smaller than the integral scale of the turbulent eddies. The spanwise distribution of the mean concentration was basically Gaussian, and the spanwise width of the spikes increased linearly with downstream distance from the diffusion source. Moreover, the turbulent diffusion coefficient was found to increase in proportion to the spanwise distance from the source. These results reveal aspects different from those of regular mass diffusion and let us conclude that the diffusion process of the spikes differs from that of regular mass diffusion.
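The conditional sampling step, extracting contiguous high-concentration "spikes" from a concentration record, can be sketched in one dimension as follows. The threshold choice and the output format are my own simplifications of the technique described above:

```python
import numpy as np

def extract_spikes(c, threshold):
    """Conditionally sample a 1-D concentration record: contiguous runs
    above `threshold` are 'spikes'.  Returns (start, length, mean) tuples:
    each spike's position, extent in samples, and mean concentration."""
    c = np.asarray(c, float)
    above = c > threshold
    edges = np.flatnonzero(np.diff(above.astype(int)))
    bounds = np.concatenate(([0], edges + 1, [c.size]))   # run boundaries
    return [(int(s), int(e - s), float(c[s:e].mean()))
            for s, e in zip(bounds[:-1], bounds[1:]) if above[s]]
```

Averaging the lengths and mean concentrations of the extracted tuples over many records gives conditionally averaged spike statistics of the kind reported above.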
Funding: National Natural Science Foundation of China (NSFC 10572117, 10802063, 50875213), National High-tech Research and Development Program (2007AA04Z401), Aeronautical Science Foundation of China (2007ZA53012), New Century Program for Excellent Talents of Ministry of Education of China (NCET-05-0868), and Ph.D. Program Foundation of Northwestern Polytechnical University (CX200801).
Abstract: The improved line sampling (LS) technique, an effective numerical simulation method, is employed to analyze the probabilistic characteristics and reliability sensitivity of flutter with random structural parameters in transonic flow. The improved LS technique is a novel methodology for reliability and sensitivity analysis of high-dimensional, low-probability problems with implicit limit state functions, and it does not require any approximating surrogate of the implicit limit state equation. The improved LS is used to estimate the flutter reliability and sensitivity of a two-dimensional wing, in which some structural properties, such as frequency, parameters of the gravity center, and mass ratio, are considered as random variables. A computational fluid dynamics (CFD) based unsteady aerodynamic reduced-order model (ROM) method is used to construct the aerodynamic state equations. Coupling the structural state equations with the aerodynamic state equations, the safety margin of flutter is formulated using the critical flutter velocity. The results show that the improved LS technique can effectively decrease the computational cost of the random uncertainty analysis of flutter. The reliability sensitivity, defined as the partial derivative of the failure probability with respect to a distribution parameter of a random variable, can help to identify the important parameters and guide structural optimization design.
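Line sampling estimates a small failure probability by scanning, along an important direction in standard normal space, for the limit-state root on each random line and averaging Φ(-c_root). The sketch below is a generic textbook version with a brute-force grid search for the root, not the paper's improved LS, and uses a simple linear limit state for illustration; all names are mine.

```python
import numpy as np
from math import erf, sqrt

def std_norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def line_sampling_pf(g, alpha, dim, n_lines=50, rng=0):
    """Basic line sampling estimate of Pf = P[g(U) <= 0] in standard
    normal space.  alpha is the important direction; each random line
    u = u_perp + c * alpha is scanned for the root of g, contributing
    Phi(-c_root) to the average."""
    r = np.random.default_rng(rng)
    alpha = np.asarray(alpha, float)
    alpha = alpha / np.linalg.norm(alpha)
    c_grid = np.linspace(0.0, 8.0, 1601)      # brute-force root search
    total = 0.0
    for _ in range(n_lines):
        u = r.standard_normal(dim)
        u_perp = u - u.dot(alpha) * alpha     # remove the alpha component
        hit = [c for c in c_grid if g(u_perp + c * alpha) <= 0.0]
        total += std_norm_cdf(-hit[0]) if hit else 0.0
    return total / n_lines
```

For a linear limit state aligned with alpha, every line contributes the exact Φ(-beta), which is why line sampling needs far fewer samples than crude Monte Carlo at small failure probabilities; in the paper, g would be the implicit flutter safety margin evaluated through the CFD-based ROM.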