BACKGROUND Postoperative delirium, particularly prevalent in elderly patients after abdominal cancer surgery, presents significant challenges in clinical management. AIM To develop a synthetic minority oversampling technique (SMOTE)-based model for predicting postoperative delirium in elderly abdominal cancer patients. METHODS In this retrospective cohort study, we analyzed data from 611 elderly patients who underwent abdominal malignant tumor surgery at our hospital between September 2020 and October 2022. The incidence of postoperative delirium was recorded for 7 d post-surgery. Patients were divided into delirium and non-delirium groups based on the occurrence of postoperative delirium. A multivariate logistic regression model was used to identify risk factors and develop a predictive model for postoperative delirium. The SMOTE technique was applied to enhance the model by oversampling the delirium cases. The model's predictive accuracy was then validated. RESULTS Multivariate logistic regression analysis identified significant risk factors for postoperative delirium: the Charlson comorbidity index, American Society of Anesthesiologists classification, history of cerebrovascular disease, surgical duration, perioperative blood transfusion, and postoperative pain score. The incidence rate of postoperative delirium was 22.91%. The original predictive model (P1) exhibited an area under the receiver operating characteristic curve of 0.862. In comparison, the SMOTE-based logistic early warning model (P2) showed a slightly lower but comparable area under the curve of 0.856, suggesting no significant difference in performance between the two predictive approaches. CONCLUSION This study confirms that the SMOTE-enhanced predictive model for postoperative delirium in elderly abdominal tumor patients performs on par with the traditional model while effectively addressing data imbalance.
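The P1-versus-P2 comparison described above can be reproduced in outline with scikit-learn and imbalanced-learn. The sketch below is illustrative only: the feature names, split ratio, and SMOTE settings are assumptions, not details taken from the study.

```python
# Hedged sketch: logistic-regression early-warning model with SMOTE oversampling,
# mirroring the P1 (original) vs. P2 (SMOTE) comparison in the abstract.
# Assumes a pandas DataFrame `df` with the listed risk factors and a binary
# `delirium` outcome; all names here are illustrative.
import pandas as pd
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

features = ["charlson_index", "asa_class", "cerebrovascular_history",
            "surgery_minutes", "transfusion", "pain_score"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["delirium"], test_size=0.3, stratify=df["delirium"])

p1 = LogisticRegression(max_iter=1000).fit(X_train, y_train)          # original model
X_res, y_res = SMOTE(random_state=0).fit_resample(X_train, y_train)   # oversample minority class
p2 = LogisticRegression(max_iter=1000).fit(X_res, y_res)              # SMOTE-based model

for name, model in [("P1", p1), ("P2", p2)]:
    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(name, "AUC:", round(auc, 3))
```

Note that SMOTE is applied only to the training portion; oversampling before the split would leak synthetic copies of test cases into training and inflate the AUC.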
BACKGROUND The induced-membrane technique was initially described by Masquelet as an effective treatment for large bone defects, especially those caused by infection. Here, we report a case of chronic osteomyelitis of the radius associated with a 9 cm bone defect, which was filled with a large allogeneic cortical bone graft from a bone bank. Complete bony union was achieved after 14 months of follow-up. Previous studies have used autogenous bone as the primary bone source for the Masquelet technique; in our case, the exclusive use of allograft was as successful as the use of autologous bone grafts. With the advent of bone banks, it is possible to obtain a virtually unlimited amount of allograft, and the Masquelet technique may be further improved on the basis of this new source of bone graft. CASE SUMMARY We report the repair of a long bone defect of the forearm in a 40-year-old male patient, treated with allograft cortical bone combined with the Masquelet technique. The patient's functional recovery of the forearm was remarkable, which broadens the scope of application of the Masquelet technique and reinforces its efficacy in the treatment of long bone defects. CONCLUSION Allograft cortical bone combined with the Masquelet technique provides a new treatment option for large bone defects.
Lung cancer continues to be a leading cause of cancer-related deaths worldwide, emphasizing the critical need for improved diagnostic techniques. Early detection of lung tumors significantly increases the chances of successful treatment and survival. However, current diagnostic methods often fail to detect tumors at an early stage or to accurately pinpoint their location within the lung tissue. Single-model deep learning technologies for lung cancer detection, while beneficial, cannot capture the full range of features present in medical imaging data, leading to incomplete or inaccurate detection. Furthermore, a single model may not be robust enough to handle the wide variability in medical images arising from different imaging conditions, patient anatomy, and tumor characteristics. To overcome these disadvantages, dual-model or multi-model approaches can be employed. This research focuses on enhancing the detection of lung cancer by combining two learning models: a Convolutional Neural Network (CNN) for categorization and the You Only Look Once (YOLOv8) architecture for real-time identification and localization of tumors. CNNs automatically learn to extract hierarchical features from raw image data, capturing patterns such as edges, textures, and complex structures that are crucial for identifying lung cancer. YOLOv8 incorporates multiscale feature extraction, enabling the detection of tumors of varying sizes and scales within a single image. This is particularly beneficial for identifying small or irregularly shaped tumors that may be challenging to detect. Furthermore, through the use of cutting-edge data augmentation methods, such as Deep Convolutional Generative Adversarial Networks (DCGAN), the suggested approach handles the issue of limited data and boosts the models' ability to learn from diverse and comprehensive datasets. The combined method not only improved accuracy and localization but also ensured efficient real-time processing, which is crucial for practical clinical applications. The CNN achieved an accuracy of 97.67% in classifying lung tissues into healthy and cancerous categories. The YOLOv8 model achieved an Intersection over Union (IoU) score of 0.85 for tumor localization, reflecting high precision in detecting and marking tumor boundaries within the images. Finally, the incorporation of synthetic images generated by DCGAN led to a 10% improvement in both the CNN classification accuracy and the YOLOv8 detection performance.
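As a concrete illustration of the dual-model idea, the sketch below pairs a small Keras CNN classifier with the Ultralytics YOLOv8 API. The architecture, input size, weights file, and thresholds are placeholders, not the paper's configuration.

```python
# Hedged sketch of the dual-model pipeline: a CNN classifies a CT slice as
# healthy/cancerous, then YOLOv8 localizes tumors in slices flagged as cancerous.
# All layer sizes, paths, and hyperparameters below are illustrative assumptions.
import tensorflow as tf
from ultralytics import YOLO

cnn = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(256, 256, 1)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # healthy vs. cancerous
])
cnn.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

detector = YOLO("yolov8n.pt")  # pretrained weights, assumed fine-tuned on lung CT boxes

def screen(image_batch, image_path):
    """Classify first; run the detector only on slices the CNN flags."""
    if cnn.predict(image_batch)[0, 0] > 0.5:
        return detector(image_path)  # returns bounding boxes with confidence scores
    return None
```

Splitting the workload this way keeps the expensive detector off slices the cheaper classifier already rules out, which is one plausible reading of the real-time claim above.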
Despite advancements in neuroimaging, false-positive diagnoses of intracranial aneurysms remain a significant concern. This article examines the causes, prevalence, and implications of such false-positive diagnoses. We discuss how conditions such as arterial occlusion with vascular stump formation and infundibular widening can mimic aneurysms, particularly in the anterior circulation. The article compares various imaging modalities, including computed tomography angiography, magnetic resonance imaging/angiography, and digital subtraction angiography, highlighting their strengths and limitations. We emphasize the importance of accurate differentiation to avoid unnecessary surgical interventions. The potential of emerging technologies, such as high-resolution vessel wall imaging and deep neural networks for automated detection, is explored as a promising avenue for improving diagnostic accuracy. This manuscript underscores the need for continued research and clinical vigilance in the diagnosis of intracranial aneurysms.
[Objective] The aim was to study the spatial distribution pattern and field sampling method of the aphid population in spring wheat. [Method] The aphid quantity in the tested wheat field was counted, the field distribution pattern of wheat aphid was determined using aggregation index methods, the cause of aggregation was analyzed, and the theoretical field sampling number was ascertained. [Result] The wheat aphid population showed an aggregated distribution fitting the negative binomial distribution, and this aggregation was induced by the interaction of the aphid's behavior and environmental factors. The theoretical field sampling number of wheat aphid was related to the sample variance and the permissible error: the smaller the permissible error (d) was, the larger the theoretical sampling number (without replacement) was; when the initial population numbers differed, the theoretical sampling numbers also differed; and with the permissible error fixed, the larger the sample variance (S²) was, the larger the theoretical sampling number was. [Conclusion] This study supplies a scientific basis for the prediction and field control of wheat aphid.
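The dependence on variance and permissible error described above matches the standard theoretical sample-size expression; the form below is the textbook version, not necessarily the exact model fitted in the study.

```latex
% Textbook theoretical sampling number for confidence coefficient t,
% sample variance S^2, and absolute permissible error d (illustrative form):
n = \frac{t^{2} S^{2}}{d^{2}}
% n grows with the variance S^2 and shrinks as the permissible error d widens,
% which is the relationship the abstract reports.
```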
A single-chip CMOS image sensor based on a 0.35 μm process, along with its design and implementation, is introduced. The chip uses an active pixel sensor architecture. The fill factor of a pixel cell reaches 43%, higher than the traditional 30%. Moreover, compared with the conventional method, whose fixed pattern noise (FPN) is around 0.5%, a dynamic digital double sampling technique is developed, which has a simpler circuit architecture and better FPN suppression. The CMOS image sensor chip was implemented in Chartered's 0.35 μm mixed-signal process through an MPW run. The experimental results show that the chip operates well, with an FPN of about 0.17%.
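For context, the generic principle behind double sampling for FPN suppression is shown below; this is the textbook form, not the chip's specific dynamic digital implementation.

```latex
% Generic double-sampling principle: each pixel is read twice, and the
% pixel-specific offset V_offset common to both reads cancels in the difference.
V_{\text{out}} = (V_{\text{reset}} + V_{\text{offset}}) - (V_{\text{signal}} + V_{\text{offset}})
              = V_{\text{reset}} - V_{\text{signal}}
% Removing the per-pixel offset is what suppresses fixed pattern noise.
```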
Deep-sea sediment is extremely important in marine scientific research, such as that concerning marine geology and microbial communities. The research findings are closely related to the in-situ information of the sediment. One prerequisite for investigations of deep-sea sediment is sampling techniques capable of preventing distortion during recovery. As the fruit of such sampling techniques, samplers designed for obtaining sediment have become indispensable equipment, owing to their low cost, light weight, compactness, easy operation, and high adaptability to sea conditions. This paper introduces the research and application of typical deep-sea sediment samplers. A representative sampler recently developed in China is then analyzed. On this basis, the key techniques of various deep-sea sediment samplers are reviewed and analyzed, including sealing, pressure and temperature retaining, low-disturbance sampling, and no-pressure-drop transfer. The shortcomings in the key techniques for deep-sea sediment sampling are then identified. Finally, prospects for the future development of key techniques for deep-sea sediment sampling are proposed, from the perspectives of structural diversification, functional integration, intelligent operation, and high-fidelity samples. This paper summarizes the existing samplers in the context of the key techniques mentioned above, and can serve as a reference for the optimized design of samplers and the development of key sampling techniques.
The accumulator is used as a pressure compensation device to realize gastight sampling of deep-sea microbes. Four key states of the accumulator are proposed to describe the pressure compensation process, and a corresponding mathematical model is established to investigate the relationship between the pressure compensation results and the parameters of the accumulator. Simulation results show that during the descent of the sampler, the accumulator's real opening pressure is greater than its precharge pressure; when the sampling depth is 6000 m and the accumulator's precharge pressure is less than 30 MPa, increasing the precharge pressure markedly improves the pressure compensation. Laboratory experiments at 60 MPa show that the accumulator is an effective and reliable pressure compensation device for deep-sea microbe samplers. The success of a sea trial at a depth of 2000 m in the South China Sea shows that the mathematical model and laboratory experiment results are reliable.
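A simple way to see why the precharge pressure matters is an isothermal (Boyle's-law) gas-spring model of the accumulator. The sketch below is a deliberately simplified illustration under that assumption; it is not the paper's four-state model, and the volumes and depths are made-up values.

```python
# Hedged sketch: isothermal gas model of a gas-charged accumulator used as a
# pressure compensator (p0 * V0 = p * V). Illustrative only; the paper's model
# also covers the real opening pressure during descent.
def gas_volume_at_depth(p_precharge_mpa, v_gas_l, p_ambient_mpa):
    """Gas volume once ambient pressure has compressed the precharged gas."""
    return p_precharge_mpa * v_gas_l / p_ambient_mpa

v0 = 2.0                      # litres of precharged gas (assumed)
for p0 in (10, 20, 30):       # candidate precharge pressures, MPa
    v = gas_volume_at_depth(p0, v0, 60.0)   # 60 MPa ~ 6000 m depth
    # The volume swing v0 - v is the fluid the accumulator can exchange to
    # offset pressure changes; a higher precharge leaves more usable swing.
    print(f"precharge {p0} MPa: exchangeable volume ≈ {v0 - v:.2f} L")
```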
Atmospheric radionuclide monitoring usually involves two sampling techniques: ultra-high-volume aerosol samplers that collect atmospheric particles on filter media, and radioactive noble gas samplers that collect atmospheric noble gases by adsorption. Atmospheric sampling techniques have been researched at the Northwest Institute of Nuclear Technology since the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was signed in 1996. Several ultra-high-volume aerosol samplers and several types of radioactive xenon isotope samplers have been developed. For the aerosol sampler, the sampling flow is between 450 and 800 m3/h, with a minimum detectable concentration (MDC) of 131I of less than 5 pBq/m3. For the xenon sampler, the sampling capacity is more than 4 ml of xenon per day, with an MDC of 133Xe of less than 0.25 mBq/m3. After the Fukushima nuclear accident in 2011, atmospheric radionuclides were monitored for 3 months at Xi'an, and some radionuclides were detected at concentrations higher than their backgrounds during this period, including 131I, 134Cs, 137Cs and 133Xe.
The pressure changes and body deformation of the sample inside the corer are analyzed and calculated for the process of sampling deep-sea shallow sediment with a non-piston corer for gas hydrate investigation. Two conclusions are drawn: (1) the stress increments associated with the corer during the sampling process do not affect the stability of the gas hydrate; (2) the body deformation of the sample is serious, and the incremental filling ratio (IFR) is less than unity. For taking samples with in-situ pressure and structure preserved, and drawing on the design theories of the pressure-tight corer, we have designed a piston corer, named the gas hydrate pressure-tight piston corer. Several sea tests have been conducted. The test results indicate that the piston corer is well able to take sediment samples on the seafloor while maintaining their original in-situ pressure, meeting the requirements of gas hydrate exploration in deep-sea shallow sediment layers.
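For readers unfamiliar with the IFR criterion, the expression below gives the definition commonly used in coring studies; the paper's exact formulation may differ in detail.

```latex
% Incremental filling ratio as commonly defined in coring studies (illustrative):
% the increment of recovered core length dL_c per increment of corer penetration dL_p.
\mathrm{IFR} = \frac{\mathrm{d}L_{c}}{\mathrm{d}L_{p}},
\qquad \mathrm{IFR} < 1 \;\Rightarrow\; \text{the sample is shortened (deformed) relative to penetration.}
```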
Clustering, in data mining, is a useful technique for discovering interesting data distributions and patterns in the underlying data, and has many application fields, such as statistical data analysis, pattern recognition, and image processing. We combine sampling techniques with the DBSCAN algorithm to cluster large spatial databases, and two sampling-based DBSCAN (SDBSCAN) algorithms are developed. One algorithm introduces the sampling technique inside DBSCAN, and the other uses a sampling procedure outside DBSCAN. Experimental results demonstrate that our algorithms are effective and efficient in clustering large-scale spatial databases.
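The general flavor of "sampling outside DBSCAN" can be sketched as follows: cluster a random sample, then assign every remaining point to the cluster of its nearest core point. This illustrates the idea only; the paper's SDBSCAN algorithms differ in their details.

```python
# Hedged sketch of sampling-based DBSCAN: run DBSCAN on a random sample, then
# label the full dataset by nearest sampled core point (noise if farther than eps).
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

def sdbscan(X, eps=0.5, min_samples=5, sample_frac=0.2, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=int(sample_frac * len(X)), replace=False)
    db = DBSCAN(eps=eps, min_samples=min_samples).fit(X[idx])
    core = db.components_                          # coordinates of core points
    core_labels = db.labels_[db.core_sample_indices_]
    nn = NearestNeighbors(n_neighbors=1).fit(core)
    dist, nearest = nn.kneighbors(X)               # nearest core point per row
    labels = core_labels[nearest.ravel()]
    labels[dist.ravel() > eps] = -1                # too far from any core point: noise
    return labels
```

The cost saving comes from running the O(n log n)-to-O(n²) DBSCAN scan only on the sample, with a single nearest-neighbor pass over the full database afterwards.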
[Objective] To further improve the prediction, forecasting and continuous control of broccoli clubroot disease. [Methods] The spatial distribution pattern of diseased or infected plants was analyzed using the least squares method, frequency distribution, aggregation indices, m*-m regression analysis and Taylor's power law model. [Result] The field distribution of broccoli plants with clubroot disease tended to be aggregated. m*-m regression analysis showed that the elementary component of the spatial distribution of diseased or infected plants was the individual colony, with individuals attracting each other; the disease had obvious disease foci in the field, and the individual colonies showed a uniform distribution pattern in the field. Taylor's power law showed that the spatial pattern of individual diseased or infected plants tended toward a uniform distribution as density increased. On this basis, the Iwao optimal theoretical sampling model and a sequential sampling model were established, namely N = 273.9541/m − 59.6985 and T0(N) = 0.3684N ± 1.9268√N, respectively. This means that when surveying N plants, if the cumulative incidence exceeds the upper bound, the field can be designated for control; if the cumulative incidence does not reach the lower bound, the field can be left uncontrolled; and if the cumulative incidence lies between the bounds, the survey should continue up to the maximum sample size, which occurs at the critical mean density m0 = 0.3684 (a disease incidence of 15% per the survey), giving a sampling number of 684 plants. [Conclusion] The research results have very important instructive meaning for disease control.
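The two fitted models quoted above translate directly into code. The constants below are the abstract's own; the wrapper logic around them is illustrative.

```python
# Hedged sketch implementing the abstract's models: the Iwao optimal sampling
# number N(m) = 273.9541/m - 59.6985 and the sequential bounds
# T0(N) = 0.3684*N ± 1.9268*sqrt(N). Only the constants come from the abstract.
import math

def optimal_sample_size(m):
    """Theoretical sampling number at mean density m."""
    return 273.9541 / m - 59.6985

def sequential_decision(n_surveyed, cum_diseased):
    upper = 0.3684 * n_surveyed + 1.9268 * math.sqrt(n_surveyed)
    lower = 0.3684 * n_surveyed - 1.9268 * math.sqrt(n_surveyed)
    if cum_diseased > upper:
        return "designate field for control"
    if cum_diseased < lower:
        return "no control needed"
    return "keep surveying"

print(round(optimal_sample_size(0.3684)))   # -> 684, the abstract's maximum size
print(sequential_decision(100, 45))         # between bounds -> keep surveying
```

Evaluating N(m) at the critical density m0 = 0.3684 indeed returns 684, consistent with the sampling number stated in the abstract.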
In China, little is known about the nutrient requirements of jackfruit, and traditional nutrient management usually depends on experience. Therefore, this study attempted to standardize the leaf sampling technique and the suitable range of leaf nutrient concentrations for diagnosing the nutrient status of jackfruit (Artocarpus heterophyllus Lam.). Because sampling results are affected by canopy height, leaf age and time of sampling, these three factors were studied. The results showed that nutrient concentrations were most stable in 3-6-month-old leaves from the central part of the canopy, and the most stable period for leaf sampling was from April to May. It is recommended that the stable intra-canopy position and the stable period of nutrient concentrations be used as the standards for the leaf sampling technique. Based on this leaf sampling technique, a standard of leaf nutrient concentrations was summarized, which can be used as the standard for nutrient suitability evaluation.
This study analyzes the sample influx (samples per case file) into a forensic science laboratory (FSL) and the corresponding analysis costs, and uses arbitrary re-sampling plans to establish the minimum cost function. The demand for forensic analysis increased for all disciplines, especially biology/DNA, between 2014 and 2015. While the average distribution of case files was about 42.5%, 40.6% and 17% across the three disciplines, the distribution of samples was rather different: 12%, 82.5% and 5.5% for samples requiring forensic biology, chemistry and toxicology analysis, respectively. The results show that most of the analysis workload fell on forensic chemistry. The cost of analysis for case files and the corresponding sample influx varied in the ratios 35:6:1 and 28:12:1 for forensic chemistry, biology/DNA and toxicology in 2014 and 2015, respectively. In both years, the cost of forensic chemistry analysis was comparatively very high, necessitating re-sampling. The time series of sample influx in all disciplines are strongly stochastic, with the largest magnitude for chemistry, followed by biology/DNA and toxicology. The PDFs of the sample influx data are highly skewed to the right, especially for forensic toxicology and biology/DNA, with peaks at 1 and 3 samples per case file. The arbitrary re-sampling plans were best suited to forensic chemistry case files (where the re-sampling conditions apply). The locus of the arbitrary number of samples to take from the submitted forensic samples was used to establish the minimum scientifically acceptable number of samples, by applying the minimization function developed in this paper. A cost minimization function was also developed based on the average cost per sample and the choice of re-sampling plan depending on the range of sample influx, from which the savings were determined and maximized. The study thus gives the forensic scientist a business model and a scientific decision-making tool for the minimum number of samples to analyze, focusing on savings in analysis cost.
This paper reveals some problems in the application of forest sampling investigation and points out its defects. A method for determining sample size is derived precisely from the origin of the formula in the simple random sampling procedure. In stratified random sampling, two cases are distinguished: the stratum variances S_h^2 are equal for all h, and not all S_h^2 are equal. This method makes the construction of confidence intervals more reliable.
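The standard results consistent with the two cases the abstract distinguishes are the textbook sample-size and allocation formulas below; they illustrate the setting rather than reproduce the paper's exact derivation.

```latex
% Simple random sampling with permissible error d, confidence coefficient t,
% and population size N (textbook form):
n_0 = \frac{t^{2} S^{2}}{d^{2}}, \qquad n = \frac{n_0}{1 + n_0 / N}.
% Stratified sampling: if all stratum variances S_h^2 are equal, proportional
% allocation suffices; otherwise Neyman allocation weights each stratum h by N_h S_h:
n_h = n \cdot \frac{N_h S_h}{\sum_{k} N_k S_k}.
```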
The aim of this study is to investigate the impacts of landslide and non-landslide sampling strategies on the performance of landslide susceptibility assessment (LSA). The study area is the Feiyun catchment in Wenzhou City, Southeast China. Two types of landslide samples, combined with seven non-landslide sampling strategies, resulted in a total of 14 scenarios. The corresponding landslide susceptibility map (LSM) for each scenario was generated using a random forest model. The receiver operating characteristic (ROC) curve and statistical indicators were calculated and used to assess the impact of each dataset sampling strategy. The results showed that higher accuracies were achieved when using the landslide core as positive samples, combined with non-landslide samples drawn from the very low susceptibility zone or the buffer zone. The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA, providing a reference for subsequent researchers aiming to obtain a more reasonable LSM.
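One scenario of the kind compared above can be sketched as follows with scikit-learn. The feature matrix `X`, labels `y`, and hyperparameters are assumptions standing in for the study's conditioning factors and sampling design.

```python
# Hedged sketch of a single LSA scenario: fit a random forest on landslide
# (positive) vs. non-landslide (negative) samples and score it with ROC-AUC.
# X: conditioning factors per cell (slope, lithology, rainfall, ...);
# y: 1 = landslide-core cell, 0 = non-landslide cell drawn from, e.g., the
# very-low-susceptibility zone. Both are assumed to exist; names are illustrative.
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
susceptibility = rf.predict_proba(X_test)[:, 1]   # per-cell susceptibility score
print("AUC:", roc_auc_score(y_test, susceptibility))
```

Repeating this fit once per positive/negative sampling combination and comparing the AUCs is the mechanism behind the 14-scenario comparison the abstract reports.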
The red turpentine beetle (RTB), Dendroctonus valens LeConte, is a destructive invasive forest pest in China; it mainly attacks Pinus tabuliformis and P. bungeana. Since its first outbreak in Shanxi Province in 1998, it has spread rapidly to Shaanxi, Hebei, Henan and Beijing, and has caused extensive tree mortality. The space-time dynamics of the D. valens population and a spatial sampling technique based on its spatial distribution pattern were analyzed using geostatistical methods in pure P. tabuliformis forests and mixed-wood stands at different damage levels. According to the spatial distribution of the D. valens population, a specific spatial sampling technique was also studied and then compared with the traditional sampling technique. The spatial sampling technique combines sampling theory with the biological characteristics of the D. valens population; it can not only calculate the sampling error but also determine the optimal sampling number and the optimum plot size for different damage levels and stand types. This helps to explain the population expansion and colonization mechanism of D. valens, and provides a good reference for adopting suitable control measures.
A new method for sampling deep-sea microplankton, with the functions of in-situ concentrated sampling and gastight sampling, is proposed. The in-situ concentrated sampling technique works as follows: a microplankton membrane is used as the filtration membrane, and a deep-sea pump is used to pump seawater; the microplankton are captured and their density increases as the seawater flows through the filtration membrane. The gastight sampling technique works as follows: a precharged accumulator is used as a pressure compensator, and during the lifting of the sampler the accumulator continuously compensates the pressure drop. Laboratory experimental results show that with the in-situ concentrated sampling technique, a maximum concentration ratio of up to 500 is reached. With the accumulator-based pressure compensation technique, gastight sampling is realized: when sampling at 6 km with an accumulator precharge pressure of 18 MPa, the pressure drop of the sample is less than 2% of its original pressure. Deep-sea experiments (at 1.9 km) show that the sampler achieves both in-situ concentrated sampling and gastight sampling.
In this paper, the problem of nonparametric estimation of the finite population quantile function using a multiplicative bias correction technique is considered. A robust estimator of the finite population quantile function based on multiplicative bias correction is derived with the aid of a superpopulation model. Most studies have concentrated on kernel smoothers in the estimation of regression functions, and this technique has also been applied to the various methods of nonparametric estimation of the finite population quantile already under review. A major problem with the use of nonparametric kernel-based regression over a finite interval, such as the estimation of finite population quantities, is bias at boundary points. By correcting the boundary problems associated with previous model-based estimators, the multiplicative bias corrected estimator produces better results in estimating the finite population quantile function. Furthermore, the asymptotic behavior of the proposed estimator is presented. It is observed that the estimator is asymptotically unbiased and statistically consistent when certain conditions are satisfied. The simulation results show that the suggested estimator performs quite well in terms of relative bias, mean squared error, and relative root mean error. As a result, the multiplicative bias corrected estimator is strongly recommended for the estimation of the finite population quantile function in survey sampling.
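To fix ideas, the generic form of multiplicative bias correction for a kernel smoother is shown below. This is the standard construction from the smoothing literature, offered only as background; the paper's estimator applies the idea to the quantile function under a superpopulation model and will differ in its specifics.

```latex
% Generic multiplicative bias correction for a kernel smoother \hat{m} with
% kernel weights w_i(x) (textbook form, illustrative only): the corrected
% estimator rescales \hat{m} by a smoothed ratio of data to fit,
\tilde{m}(x) = \hat{m}(x)\,\hat{r}(x), \qquad
\hat{r}(x) = \sum_{i} w_i(x)\,\frac{y_i}{\hat{m}(x_i)},
% so regions where \hat{m} systematically under- or over-shoots the data,
% including near the boundary, are corrected multiplicatively.
```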
Huge amounts of various polymers are used in many fields, with numerous benefits. However, their high ignitability and rapid flame spread make these materials dangerous to human life and property, owing to the release of highly toxic combustion products. The present work investigates several methods of sampling and identifying the aromatic hydrocarbons produced by the controlled burning of low-density polyethylene (LDPE) in a toxicity tube furnace. Five different sampling methods were used: solid-phase microextraction (SPME), syringe, Tedlar bags, sorption tubes, and gas-solution absorbers (midget impingers). The produced hydrocarbons were analysed by gas chromatography coupled to mass spectrometry, with and without pyrolysis. Among the tested techniques, the most convenient sampling method was a syringe with a glass vessel, which allowed detection of the largest amount of aromatic hydrocarbons at both 800°C and 600°C, followed by SPME. The gas-solution absorber (midget impinger) showed poorer results, while the Tedlar bags and sorption tubes did not give satisfactory results. Several carcinogenic or possibly carcinogenic compounds were identified in the combustion products, such as benzene, naphthalene, anthracene and pyrene.