The proportion of elderly patients in intensive care is increasing, and a significant proportion of them require mechanical ventilation. How to implement safe and effective mechanical ventilation for elderly patients, and when it is appropriate to take them off the ventilator, is an important issue in the field of critical care medicine. Appropriate sedation can improve patient outcomes, but excessive sedation may lead to prolonged mechanical ventilation and increase the risk of complications. Elderly patients should be closely monitored and evaluated on an individual basis during weaning, and the sedation regimen should be dynamically adjusted. This requires the healthcare team to consider the patient's sedation needs, disease status, and the pharmacodynamics and pharmacokinetics of the drugs used in order to arrive at the best strategy. Although current research has provided valuable insights and strategies for sedation and weaning management, many problems remain to be further explored and solved.
Many fields, such as neuroscience, are experiencing the vast proliferation of cellular data, underscoring the need for organizing and interpreting large datasets. A popular approach partitions data into manageable subsets via hierarchical clustering, but objective methods to determine the appropriate classification granularity are missing. We recently introduced a technique to systematically identify when to stop subdividing clusters based on the fundamental principle that cells must differ more between than within clusters. Here we present the corresponding protocol to classify cellular datasets by combining data-driven unsupervised hierarchical clustering with statistical testing. These general-purpose functions are applicable to any cellular dataset that can be organized as two-dimensional matrices of numerical values, including molecular, physiological, and anatomical datasets. We demonstrate the protocol using cellular data from the Janelia MouseLight project to characterize morphological aspects of neurons.
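As an illustration of the kind of stopping rule this protocol describes, here is a minimal Python sketch (not the authors' released functions): a candidate split is accepted only when between-cluster distances statistically exceed within-cluster distances. The Ward linkage, the Mann-Whitney test, and the significance threshold are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist, squareform
from scipy.stats import mannwhitneyu

def split_is_supported(X, labels, alpha=0.05):
    """Accept a split only if between-cluster distances exceed
    within-cluster distances (one-sided Mann-Whitney U test)."""
    D = squareform(pdist(X))
    same = labels[:, None] == labels[None, :]
    iu = np.triu_indices_from(D, k=1)          # each cell pair once
    within, between = D[iu][same[iu]], D[iu][~same[iu]]
    if len(within) == 0 or len(between) == 0:
        return False
    _, p = mannwhitneyu(between, within, alternative="greater")
    return p < alpha

# toy example: a cells-by-features matrix with two well-separated groups
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(4, 1, (30, 5))])
Z = linkage(X, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")   # candidate split into 2 clusters
print("keep split:", split_is_supported(X, labels))
```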
With the rise of remote collaboration, the demand for advanced storage and collaboration tools has rapidly increased. However, traditional collaboration tools primarily rely on access control, leaving data stored on cloud servers vulnerable due to insufficient encryption. This paper introduces a novel mechanism that encrypts data in 'bundle' units, designed to meet the dual requirements of efficiency and security for frequently updated collaborative data. Each bundle includes update information, allowing only the updated portions to be re-encrypted when changes occur. The encryption method proposed in this paper addresses the inefficiencies of traditional encryption modes, such as Cipher Block Chaining (CBC) and Counter (CTR), which require decrypting and re-encrypting the entire dataset whenever updates occur. The proposed method leverages update-specific information embedded within data bundles and metadata that maps the relationship between these bundles and the plaintext data. By utilizing this information, the method accurately identifies the modified portions and selectively re-encrypts only those sections. This approach significantly enhances the efficiency of data updates while maintaining high performance, particularly in large-scale data environments. To validate this approach, we conducted experiments measuring execution time as both the size of the modified data and the total dataset size varied. Results show that the proposed method significantly outperforms CBC and CTR modes in execution speed, with greater performance gains as data size increases. Additionally, our security evaluation confirms that this method provides robust protection against both passive and active attacks.
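To make the bundle idea concrete, here is a small Python sketch assuming the `cryptography` package's Fernet cipher; the bundle size, the metadata layout, and the helper names are illustrative assumptions, not the paper's implementation. Only bundles that overlap an edited byte range are re-encrypted, which is the efficiency argument made above.

```python
import os
from cryptography.fernet import Fernet  # pip install cryptography

BUNDLE_SIZE = 4096                      # illustrative bundle granularity (assumption)
fernet = Fernet(Fernet.generate_key())

def encrypt_bundles(plaintext: bytes):
    """Encrypt data bundle by bundle; metadata maps each bundle to a plaintext range."""
    bundles, metadata = [], []
    for i in range(0, len(plaintext), BUNDLE_SIZE):
        chunk = plaintext[i:i + BUNDLE_SIZE]
        bundles.append(fernet.encrypt(chunk))
        metadata.append((i, i + len(chunk)))
    return bundles, metadata

def apply_update(bundles, metadata, plaintext, start, new_bytes):
    """Re-encrypt only the bundles that overlap the modified byte range."""
    plaintext = plaintext[:start] + new_bytes + plaintext[start + len(new_bytes):]
    for idx, (lo, hi) in enumerate(metadata):
        if lo < start + len(new_bytes) and start < hi:   # bundle overlaps the edit
            bundles[idx] = fernet.encrypt(plaintext[lo:hi])
    return bundles, plaintext

data = os.urandom(BUNDLE_SIZE * 8)                       # 8 bundles of sample data
bundles, meta = encrypt_bundles(data)
bundles, data = apply_update(bundles, meta, data, start=5000, new_bytes=b"edited!")
```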
A remarkable marine heatwave, known as the "Blob", occurred in the Northeast Pacific Ocean from late 2013 to early 2016, displaying strong warm anomalies extending from the surface to a depth of 300 m. This study employed two assimilation schemes based on the global Climate Forecast System of Nanjing University of Information Science and Technology (NUIST-CFS 1.0) to investigate the impact of ocean data assimilation on the seasonal prediction of this extreme marine heatwave. The sea surface temperature (SST) nudging scheme assimilates SST only, while the deterministic ensemble Kalman filter (EnKF) scheme assimilates observations from the surface to the deep ocean. The latter notably improves the forecasting skill for subsurface temperature anomalies, especially at depths of 100-300 m (the lower layer), outperforming the SST nudging scheme. It excels in predicting both horizontal and vertical heat transport in the lower layer, contributing to improved forecasts of lower-layer warming during the Blob. These improvements stem from the assimilation of subsurface observational data, which are important for predicting upper-ocean conditions. The results suggest that assimilating ocean data with the EnKF scheme significantly enhances the accuracy of predicting subsurface temperature anomalies during the Blob and offers a better understanding of its underlying mechanisms.
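The abstract contrasts SST-only nudging with an EnKF that also ingests subsurface observations. The toy Python sketch below shows the generic (stochastic, perturbed-observation) EnKF analysis update on a three-level temperature column; it is not NUIST-CFS 1.0's deterministic EnKF, and the ensemble, observation operators, and error variances are invented for illustration. Comparing the two calls mirrors the SST-only versus surface-to-depth assimilation contrast.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_var):
    """One EnKF analysis step (stochastic form with perturbed observations).
    ensemble: (n_members, n_state); obs: (n_obs,); H: (n_obs, n_state)."""
    n_members = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)              # ensemble anomalies
    Pf = X.T @ X / (n_members - 1)                    # forecast error covariance
    R = np.eye(len(obs)) * obs_err_var
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)    # Kalman gain
    rng = np.random.default_rng(0)
    analysis = np.empty_like(ensemble)
    for m in range(n_members):
        y_pert = obs + rng.normal(0.0, np.sqrt(obs_err_var), size=len(obs))
        analysis[m] = ensemble[m] + K @ (y_pert - H @ ensemble[m])
    return analysis

# toy column of ocean temperature anomalies (surface + two subsurface levels)
truth = np.array([2.0, 1.5, 1.0])
ens = np.random.default_rng(1).normal(0.0, 1.0, size=(20, 3))
H_sst = np.array([[1.0, 0.0, 0.0]])                   # SST-only observation operator
H_full = np.eye(3)                                    # surface-to-depth observations
print(enkf_update(ens, truth[:1], H_sst, 0.1).mean(axis=0))
print(enkf_update(ens, truth, H_full, 0.1).mean(axis=0))
```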
There is a growing body of clinical research on the utility of synthetic data derivatives, an emerging research tool in medicine. In nephrology, clinicians can use machine learning and artificial intelligence as powerful aids in their clinical decision-making while also preserving patient privacy. This is especially important given the epidemiology of chronic kidney disease, renal oncology, and hypertension worldwide. However, a framework is still needed to guide how synthetic data can be better utilized as a practical tool in this research.
In the face of data scarcity in the optimization of maintenance strategies for civil aircraft, traditional failure-data-driven methods are encountering challenges owing to the increasing reliability of aircraft design. This study addresses this issue by presenting a novel combined data fusion algorithm, which enhances the accuracy and reliability of failure rate analysis for a specific aircraft model by integrating historical failure data from similar models as supplementary information. Through a comprehensive analysis of two different maintenance projects, this study illustrates the application process of the algorithm. Building upon the analysis results, this paper introduces the innovative equal integral value method as a replacement for the conventional equal interval method in the context of maintenance schedule optimization. A Monte Carlo simulation example validates that the equal integral value method surpasses the traditional method by over 20% in terms of inspection efficiency ratio. This finding indicates that the equal integral value method not only upholds maintenance efficiency but also substantially decreases workload and maintenance costs. The findings of this study open up new perspectives for airlines grappling with data scarcity, offer fresh strategies for the optimization of aviation maintenance practices, and chart a course toward more efficient and cost-effective maintenance schedule optimization through refined data analysis.
Air temperature is an important indicator for analyzing climate change in mountainous areas. ERA5 reanalysis air temperature data are important products that have been widely used to analyze temperature change in mountainous areas. However, the reliability of ERA5 reanalysis air temperature over the Qilian Mountains (QLM) is unclear. In this study, we evaluated the reliability of ERA5 monthly averaged reanalysis 2 m air temperature data using observations at 17 meteorological stations in the QLM from 1979 to 2017. The results showed that ERA5 reanalysis monthly averaged air temperature data have good applicability in the QLM in general (R² = 0.99), although they generally overestimate the observed temperature. Root mean square error (RMSE) increases with elevation, showing that the reliability of ERA5 reanalysis temperature data is worse at higher elevations than at lower altitudes. ERA5 reanalysis temperature captures the observed warming rates well. The smallest warming rates of both the observed and the ERA5 reanalysis temperature are found in winter, at 0.393°C/10a and 0.360°C/10a, respectively. This study provides a reference for the application of ERA5 reanalysis monthly averaged air temperature data at different elevation ranges in the Qilian Mountains.
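A minimal Python sketch of the kind of station-versus-reanalysis comparison described here, computing bias, RMSE, R², and linear warming rates in °C per decade; the series below are synthetic placeholders, not the QLM station or ERA5 data.

```python
import numpy as np

def evaluate_reanalysis(obs, era5, years):
    """Compare station observations with reanalysis temperatures:
    mean bias, RMSE, R^2, and linear trends in degC per decade."""
    bias = np.mean(era5 - obs)
    rmse = np.sqrt(np.mean((era5 - obs) ** 2))
    r2 = np.corrcoef(obs, era5)[0, 1] ** 2
    trend_obs = np.polyfit(years, obs, 1)[0] * 10.0     # degC / 10a
    trend_era5 = np.polyfit(years, era5, 1)[0] * 10.0
    return bias, rmse, r2, trend_obs, trend_era5

# hypothetical annual-mean series for one station
years = np.arange(1979, 2018)
obs = 0.04 * (years - 1979) + np.random.default_rng(0).normal(0, 0.3, len(years))
era5 = obs + 0.8 + np.random.default_rng(1).normal(0, 0.2, len(years))  # warm bias
print(evaluate_reanalysis(obs, era5, years))
```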
Semantic communication (SemCom) aims to achieve high-fidelity information delivery under low communication consumption by guaranteeing only semantic accuracy. Nevertheless, semantic communication still suffers from unexpected channel volatility, so developing a re-transmission mechanism (e.g., hybrid automatic repeat request [HARQ]) becomes indispensable. In that regard, instead of discarding previously transmitted information, incremental knowledge-based HARQ (IK-HARQ) is deemed a more effective mechanism that can fully utilize the information semantics. However, considering the possible existence of semantic ambiguity in image transmission, a simple bit-level cyclic redundancy check (CRC) might compromise the performance of IK-HARQ. Therefore, there is a strong incentive to redesign the CRC mechanism so as to reap the benefits of both SemCom and HARQ more effectively. In this paper, built on top of Swin Transformer-based joint source-channel coding (JSCC) and IK-HARQ, we propose a semantic image transmission framework, SC-TDA-HARQ. In particular, different from the conventional CRC, we introduce a topological data analysis (TDA)-based error detection method, which extracts the inner topological and geometric information of images, to capture semantic information and determine the necessity of re-transmission. Extensive numerical results validate the effectiveness and efficiency of the proposed SC-TDA-HARQ framework, especially under limited bandwidth, and demonstrate the superiority of the TDA-based error detection method in image transmission.
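To show where a semantic check replaces the bit-level CRC inside an IK-HARQ loop, here is a toy, self-contained Python sketch. The channel, the combining rule, and especially the `semantic_check` stand-in are made-up simplifications (the paper's actual check applies topological data analysis to the decoded image), so this only illustrates the control flow of keeping and combining earlier transmissions instead of discarding them.

```python
import numpy as np

rng = np.random.default_rng(0)

def channel(x, snr_db=5.0):
    """Toy AWGN channel acting on a semantic feature vector."""
    noise_power = np.mean(x ** 2) / (10 ** (snr_db / 10))
    return x + rng.normal(0.0, np.sqrt(noise_power), x.shape)

def combine(copies):
    """IK-HARQ receiver: combine every transmission received so far
    (here by simple averaging) rather than discarding earlier copies."""
    return np.mean(copies, axis=0)

def semantic_check(decoded, reference_stat, tol=0.15):
    """Stand-in for the TDA-based check: accept when a coarse structural
    statistic of the decoded features is close to what the source signalled."""
    return abs(np.std(decoded) - reference_stat) < tol

def ik_harq_transmit(features, max_retx=4):
    copies, ok = [], False
    for attempt in range(max_retx + 1):
        copies.append(channel(features))          # (re)transmission over the channel
        decoded = combine(copies)
        ok = semantic_check(decoded, reference_stat=np.std(features))
        if ok:                                    # semantics recovered: stop retransmitting
            break
    return decoded, attempt + 1, ok

features = rng.normal(0.0, 1.0, 256)              # toy semantic features of an image
decoded, n_tx, ok = ik_harq_transmit(features)
print(f"transmissions: {n_tx}, semantic check passed: {ok}")
```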
Wireless sensor network deployment optimization is a classic NP-hard problem and a popular topic in academic research. However, current research on wireless sensor network deployment uses overly simplistic models, and there is a significant gap between research results and actual wireless sensor networks. Some scholars have now modeled data fusion networks to make them more suitable for practical applications. This paper explores the deployment problem of a stochastic data fusion wireless sensor network (SDFWSN), a model that reflects the randomness of environmental monitoring and uses the data fusion techniques widely employed in actual sensor networks for information collection. The deployment problem of SDFWSN is modeled as a multi-objective optimization problem, with the network life cycle, spatiotemporal coverage, detection rate, and false alarm rate used as objectives for optimizing the deployment of network nodes. This paper proposes an enhanced multi-objective dwarf mongoose optimization algorithm (EMODMOA) to solve the deployment problem of SDFWSN. First, to overcome the shortcomings of the DMOA algorithm, such as its slow convergence and tendency to get stuck in local optima, an encircling and hunting strategy is introduced into the original algorithm to yield the EDMOA algorithm. The EDMOA algorithm is then extended to the EMODMOA algorithm by selecting reference points using the K-Nearest Neighbor (KNN) algorithm. To verify its effectiveness, the EMODMOA algorithm was tested on the CEC 2020 benchmark and achieved good results. On the SDFWSN deployment problem, the algorithm was compared with the Non-dominated Sorting Genetic Algorithm II (NSGA-II), Multiple Objective Particle Swarm Optimization (MOPSO), the Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D), and the Multi-Objective Grey Wolf Optimizer (MOGWO). Comparison and analysis of the performance evaluation metrics and the optimization results of the objective functions show that the proposed algorithm outperforms the other algorithms on the SDFWSN deployment problem. To further demonstrate its superiority, simulations of diverse test cases were also performed, with good results.
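Because the deployment problem is cast as multi-objective optimization, candidate deployments are compared by Pareto dominance. Below is a short, generic Python sketch of extracting the non-dominated set from scored candidates; it is a building block common to NSGA-II-style methods, not the EMODMOA algorithm itself, and the four objective scores are random placeholders for (transformed) lifetime, coverage, detection rate, and false alarm rate.

```python
import numpy as np

def pareto_front(objectives):
    """Return indices of non-dominated solutions (all objectives minimized).
    objectives: (n_solutions, n_objectives) array."""
    n = len(objectives)
    dominated = np.zeros(n, dtype=bool)
    for i in range(n):
        for j in range(n):
            if (i != j
                    and np.all(objectives[j] <= objectives[i])
                    and np.any(objectives[j] < objectives[i])):
                dominated[i] = True      # solution j dominates solution i
                break
    return np.where(~dominated)[0]

# toy deployment candidates: negate maximization objectives so all are minimized
rng = np.random.default_rng(0)
scores = rng.random((50, 4))
print("non-dominated deployments:", pareto_front(scores))
```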
This study presents a machine learning-based method for predicting the fragment velocity distribution in warhead fragmentation under explosive loading conditions. The fragment resultant velocities are correlated with key design parameters, including casing dimensions and detonation positions. The paper details the finite element analysis of fragmentation, the characterization of the dynamic hardening and fracture models, the generation of comprehensive datasets, and the training of the ANN model. The results show the influence of casing dimensions on fragment velocity distributions, with resultant velocity tending to increase with reduced casing thickness and increased length and diameter. The model's predictive capability is demonstrated through accurate predictions for both the training and testing datasets, showing its potential for real-time prediction of fragmentation performance.
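As a sketch of the surrogate-modeling idea (design parameters in, resultant velocity out), here is a small scikit-learn example. The feature ranges, the synthetic target function, and the network size are invented placeholders standing in for the finite-element-generated dataset; only the train-then-predict workflow mirrors the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# hypothetical design parameters: casing thickness, length, diameter, detonation position
rng = np.random.default_rng(0)
X = rng.uniform([5.0, 100.0, 50.0, 0.0], [15.0, 300.0, 150.0, 1.0], size=(500, 4))
# synthetic stand-in for FE-simulated resultant velocities (not real data)
y = (1800.0 - 40.0 * X[:, 0] + 1.5 * X[:, 1] + 2.0 * X[:, 2] + 100.0 * X[:, 3]
     + rng.normal(0.0, 30.0, 500))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("test R^2:", model.score(X_te, y_te))   # predictive skill on held-out designs
```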
The integration of image analysis through deep learning (DL) into rock classification represents a significant leap forward in geological research. While traditional methods remain invaluable for their expertise and historical context, DL offers a powerful complement by enhancing the speed, objectivity, and precision of the classification process. This research explores the significance of image data augmentation techniques in optimizing the performance of convolutional neural networks (CNNs) for geological image analysis, particularly the classification of igneous, metamorphic, and sedimentary rock types from rock thin section (RTS) images. The study focuses on classic image augmentation techniques and evaluates their impact on model accuracy and precision. Results demonstrate that augmentation techniques such as Equalize significantly enhance the model's classification capabilities, achieving an F1-score of 0.9869 for igneous rocks, 0.9884 for metamorphic rocks, and 0.9929 for sedimentary rocks, improvements over the baseline results. Moreover, the weighted average F1-score across all classes and techniques is 0.9886, indicating an overall enhancement. Conversely, methods such as Distort lead to decreased accuracy and F1-score, with an F1-score of 0.949 for igneous rocks, 0.954 for metamorphic rocks, and 0.9416 for sedimentary rocks, degrading performance relative to the baseline. The study underscores the practicality of image data augmentation in geological image classification and advocates the adoption of DL methods in this domain for automation and improved results. The findings can benefit various fields, including remote sensing, mineral exploration, and environmental monitoring, by enhancing the accuracy of geological image analysis for both scientific research and industrial applications.
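For readers unfamiliar with the Equalize augmentation, the short Python sketch below applies histogram equalization to thin-section images with Pillow; the file paths are hypothetical and this is only one plausible way to generate such augmented copies, not the study's pipeline.

```python
from PIL import Image, ImageOps  # pip install pillow

def augment_equalize(path):
    """Histogram-equalize a rock thin-section image, i.e. the kind of
    'Equalize' augmentation the study reports as most beneficial."""
    img = Image.open(path).convert("RGB")
    return ImageOps.equalize(img)

# doubling the training set with equalized copies (paths are hypothetical)
paths = ["rts/igneous_001.jpg", "rts/metamorphic_014.jpg"]
for p in paths:
    augment_equalize(p).save(p.replace(".jpg", "_eq.jpg"))
```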
The effects of on-line solution, off-line solution, and aging heat treatment on the microstructure and hardness of die-cast AZ91D alloys were investigated. The Brinell hardness of die-cast AZ91D alloy increases after on-line solution and off-line aging treatment but decreases after off-line solution treatment. X-ray diffractometry, optical microscopy, differential thermal analysis, scanning electron microscopy, and X-ray energy dispersive spectroscopy show that the microstructures of the die-cast AZ91D magnesium alloy before and after on-line solution and off-line aging are similar, consisting of α-Mg and β-Al12Mg17. The precipitation of the Al element is prevented by the on-line solution treatment, so the effect of solid solution strengthening is enhanced. The β-Al12Mg17 phases precipitate from the supersaturated Mg solid solution after off-line aging treatment and lead to microstructure refinement of the AZ91D alloy, so the effect of precipitation hardening is enhanced. The β-Al12Mg17 phases dissolve into the substructure after off-line solution treatment, which significantly reduces the grain-boundary strengthening phase and lowers the hardness of die-cast AZ91D.
The IAP (Institute of Atmospheric Physics) land-surface model (IAP94) is described. This comprehensive model provides a detailed description of vegetation, snow, and soil processes, with particular attention to cases in which all three water phases are present in the surface media. On the basis of mixture theory and the fluid dynamics of porous media, a system of universal conservation equations for water and heat in soil, snow, and the vegetation canopy has been constructed. On this basis, all important factors that may affect the water and heat balance in the media can be considered naturally, and each factor and term possesses a distinct physical meaning. In the computation of water content and temperature, water phase change and heat transport by water flow are taken into account. Moreover, particular attention has been given to water vapor diffusion in soil under arid or semi-arid conditions, and to snow compaction. In the treatment of surface turbulent fluxes, the difference between aerodynamic and thermal roughness is taken into account. The aerodynamic roughness of vegetation is calculated as a function of canopy density, height, and zero-plane displacement. An extrapolation of log-linear and exponential relationships is used when calculating the wind profile within the canopy. The model has been validated against field measurements in off-line simulations. The model's desirable performance leads to the conclusion that IAP94 is able to reproduce the main physical mechanisms governing the energy and water balances at the global land surface. Part II of the present study will concern validation in a 3-D experiment coupled with the IAP Two-Level AGCM.
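The abstract does not spell out the wind-profile relationships; as a hedged illustration, the standard forms such schemes typically combine are a logarithmic profile above the canopy and an exponential decay within it. The attenuation coefficient a and the matching at canopy height h below are assumptions of this sketch, not values from IAP94:

```latex
u(z) = \frac{u_*}{\kappa}\,\ln\!\left(\frac{z-d}{z_0}\right), \quad z \ge h,
\qquad
u(z) = u(h)\,\exp\!\left[-a\left(1-\frac{z}{h}\right)\right], \quad z < h,
```

where u_* is the friction velocity, κ the von Kármán constant, d the zero-plane displacement, z_0 the aerodynamic roughness length, and h the canopy height.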
The off-line programming (OLP) system has become one of the most important programming modules for the robotic belt grinding process; however, research on enlarging the dexterous grinding space by means of the OLP system is lacking. A new type of grinding robot and a novel robotic belt grinding workcell are presented, and their features are briefly introduced. An open, object-oriented off-line programming system is developed for this robotic belt grinding system. The parameters of the trimmed surface are read from the initial graphics exchange specification (IGES) file of the CAD model of the workpiece. The de Boor-Cox basis function is used to sample the grinding targets, with local contact frames, on the workpiece. A numerical formulation of the inverse kinematics is set up based on Newton's iterative procedure to calculate the grinding robot configurations corresponding to the grinding targets. Once the grinding path is obtained, the OLP system proves more effective than the teach-by-showing system. In order to improve the grinding workspace, an optimization algorithm for a dynamic tool frame is proposed and applied to this robotic belt grinding system. The initial tool frame and the interval between neighboring tool frames are defined in preparation for the algorithm. An optimized local tool frame can then be selected to grind the complex surface with a maximum dexterity index of the robot. Using the optimization algorithm, a simulation of grinding a vane is included, and the grinding workspace is compared before and after tool frame optimization; the algorithm enlarges the grinding workspace. Moreover, the dynamic tool frame can be considered to add one degree of freedom to the grinding kinematic chain, which provides theoretical support for improving robotic dexterity in complex surface grinding.
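As background on the inverse-kinematics step, here is a compact Python sketch of Newton's iteration with a finite-difference Jacobian, applied to a toy planar two-link arm; the link lengths, tolerances, and pseudo-inverse update are illustrative choices, not the grinding robot's actual kinematics.

```python
import numpy as np

def ik_newton(forward_kin, q0, target, tol=1e-6, max_iter=100, eps=1e-6):
    """Solve forward_kin(q) = target by Newton's iteration with a numerical Jacobian.
    forward_kin maps joint angles q (n,) to a task-space position vector (m,)."""
    q = np.asarray(q0, dtype=float)
    for _ in range(max_iter):
        err = target - forward_kin(q)
        if np.linalg.norm(err) < tol:
            break
        J = np.zeros((len(err), len(q)))
        for j in range(len(q)):                  # finite-difference Jacobian column
            dq = np.zeros_like(q)
            dq[j] = eps
            J[:, j] = (forward_kin(q + dq) - forward_kin(q)) / eps
        q = q + np.linalg.pinv(J) @ err          # Newton / Gauss-Newton update
    return q

# toy planar 2-link arm as a stand-in for the grinding robot (link lengths assumed)
L1, L2 = 0.5, 0.4
def fk(q):
    return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                     L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

print(ik_newton(fk, q0=[0.1, 0.1], target=np.array([0.6, 0.3])))
```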
In order to further understand the land surface processes over the northern Tibetan Plateau, this study performed an off-line simulation experiment at the Bujiao site on the northern Tibetan Plateau from June 2002 to April 2004, using the Noah Land Surface Model (Noah LSM) and observed data from the CAMP/Tibet experiment. The observed data were corrected as necessary, and the number of soil layers in the Noah LSM was increased from 4 to 10 to enable this off-line simulation and analysis. The main conclusions are as follows. The Noah LSM performed well on the northern Tibetan Plateau. The simulated net radiation, upward longwave radiation, and upward shortwave radiation showed the same remarkable annual and seasonal variation as the observed data, especially the upward longwave radiation. The simulated soil temperatures were acceptably close to the observed temperatures, especially in the shallow soil layers. The simulated freezing and melting processes started from the surface soil layer and spread down to the deep soil layers, but they took longer than the observed processes. However, the Noah LSM did not adequately simulate the soil moisture. Therefore, additional high-quality, long-term observations of land surface-atmosphere processes over the Tibetan Plateau will be a key factor in properly adjusting the model parameters in the future.
Segmentation of cursive text has been one of the major problems in Arabic writing. The difficulty is that the shape of a letter is context sensitive, depending on its location within a word. Many text recognition systems recognize text imagery at the character level and assemble words from the recognized characters. Unfortunately, this approach does not work well with Arabic text. In this paper we describe a new approach to segment Arabic text imagery at the word level, without analyzing individual characters. This approach avoids the problem of segmenting individual characters and can overcome local errors in character recognition.
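The abstract does not give the algorithm; as a hedged illustration of what word-level (rather than character-level) segmentation can look like, the Python sketch below splits a binarized text-line image on wide background gaps in its vertical projection profile, using a gap threshold to separate inter-word gaps from the narrower gaps inside a word. The threshold and image conventions are assumptions, not the paper's method.

```python
import numpy as np

def segment_words(binary_line, gap_threshold=8):
    """Split a binarized text-line image (rows x cols, ink = 1) into word images
    by detecting wide background runs in the column-wise ink profile,
    without ever segmenting individual characters."""
    profile = binary_line.sum(axis=0)          # amount of ink in each column
    words, start, run = [], None, 0
    for col, ink in enumerate(profile):
        if ink > 0:
            if start is None:
                start = col                    # a word begins here
            run = 0
        else:
            run += 1
            if start is not None and run >= gap_threshold:
                words.append(binary_line[:, start:col - run + 1])
                start = None                   # wide gap: the word has ended
    if start is not None:
        words.append(binary_line[:, start:])
    return words

# tiny synthetic example: two "words" of ink separated by a wide gap
line = np.zeros((10, 60), dtype=int)
line[:, 5:20] = 1
line[:, 40:55] = 1
print([w.shape for w in segment_words(line)])
```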
A study of interference simulation for robot welding of a radar pedestal was carried out using the KUKA Sim Pro simulation software and off-line programming technology. Compared with the actual robot welding process, it was found that the trajectory of the simulated welding process agreed with that recorded in the actual welding process, and the actual limits and interference appeared at the same places as in the simulation. There was no interference on the outside weld seam; on the internal weld seam, especially at the weld joint where the support plate connects to the cylinder, interference did appear. Using the simulation to guide the actual robot welding is helpful for protecting the robot from collisions and reducing weld defects.