Refined 3D modeling of mine slopes is pivotal for precise prediction of geological hazards. Aiming at the inadequacy of existing single modeling methods in comprehensively representing the overall and localized characteristics of mining slopes, this study introduces a new method that fuses model data from unmanned aerial vehicle (UAV) tilt photogrammetry and 3D laser scanning through a data alignment algorithm based on control points. First, the mini-batch K-Medoids algorithm is used to cluster the point cloud data from ground 3D laser scanning. Then, the elbow rule is applied to determine the optimal cluster number (K0), and the feature points are extracted. Next, the nearest-neighbor point algorithm is employed to match the feature points obtained from UAV tilt photogrammetry, and the internal point coordinates are adjusted through distance-weighted averaging to construct a 3D model. Finally, in an engineering case study, the K0 value is determined to be 8, with a matching accuracy between the two model datasets ranging from 0.0669 to 1.0373 mm. Compared with a modeling method using the standard K-Medoids clustering algorithm, the new method significantly enhances the computational efficiency, the accuracy of selecting the optimal number of feature points in 3D laser scanning, and the precision of the 3D model derived from UAV tilt photogrammetry. This method provides a research foundation for constructing mine slope models.
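To make the clustering-plus-elbow step concrete, the sketch below (not the authors' code) runs a mini-batch K-Medoids on a synthetic stand-in for the TLS point cloud, picks K0 with the elbow rule, and matches the resulting feature points to a second point set with a nearest-neighbour query; the cloud, batch size, and K range are all illustrative assumptions.

```python
# Minimal sketch: mini-batch K-Medoids, elbow-rule choice of K0, nearest-neighbour matching.
import numpy as np
from scipy.spatial import cKDTree

def kmedoids_minibatch(points, k, batch_size=1000, n_iter=10, seed=0):
    """Approximate K-Medoids: each iteration refines the medoids on a random mini-batch."""
    rng = np.random.default_rng(seed)
    medoids = points[rng.choice(len(points), k, replace=False)]
    for _ in range(n_iter):
        batch = points[rng.choice(len(points), min(batch_size, len(points)), replace=False)]
        labels = np.argmin(np.linalg.norm(batch[:, None] - medoids[None], axis=2), axis=1)
        for j in range(k):
            members = batch[labels == j]
            if len(members) == 0:
                continue
            # new medoid = member minimising the total in-cluster distance
            d = np.linalg.norm(members[:, None] - members[None], axis=2).sum(axis=1)
            medoids[j] = members[np.argmin(d)]
    return medoids

def total_cost(points, medoids):
    return np.linalg.norm(points[:, None] - medoids[None], axis=2).min(axis=1).sum()

cloud = np.random.default_rng(1).normal(size=(10000, 3)) * [50.0, 50.0, 10.0]  # fake TLS cloud

ks = list(range(2, 13))
costs = [total_cost(cloud, kmedoids_minibatch(cloud, k)) for k in ks]
second_diff = np.diff(costs, 2)               # elbow rule: where the cost curve flattens
k0 = ks[int(np.argmax(second_diff)) + 1]
print("elbow-selected K0 =", k0)

# the K0 medoids act as feature points, later matched to UAV-photogrammetry
# feature points with a nearest-neighbour (KD-tree) query
feature_points = kmedoids_minibatch(cloud, k0)
uav_points = feature_points + np.random.default_rng(2).normal(scale=0.01, size=feature_points.shape)
dist, _ = cKDTree(uav_points).query(feature_points)
print("nearest-neighbour matching residuals:", np.round(dist, 4))
```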
The accurate estimation of parameters is the premise for establishing a high-fidelity simulation model of a valve-controlled cylinder system. Bench test data are easily obtained, but it is challenging to emulate actual loads in research on parameter estimation of valve-controlled cylinder systems. Although the operating data of the control valve contain actual load information, acquiring that information remains challenging. This paper proposes a method that fuses bench test and operating data for parameter estimation to address these problems. The proposed method is based on Bayesian theory, and its core is a pooled fusion of prior information from bench test and operating data. First, a system model is established and the parameters in the model are analysed. Second, the bench and operating data of the system are collected. Then, the model parameters and weight coefficients are estimated using the data fusion method. Finally, the parameter estimates obtained with the data fusion method, the Bayesian method, and the particle swarm optimisation (PSO) algorithm are compared. The research shows that the weight coefficient represents the contribution of different prior information to the parameter estimation result. Parameter estimation based on the data fusion method outperforms both the Bayesian method and the PSO algorithm. Increasing load complexity leads to a decrease in model accuracy, highlighting the crucial role of the data fusion method in parameter estimation studies.
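As a rough illustration of the pooled-prior idea (not the paper's implementation), the sketch below linearly pools a bench-test prior and an operating-data prior for a single scalar parameter, estimates the pooling weight from the data, and forms the posterior on a grid; all distributions and numbers are invented placeholders.

```python
# Pool fusion of two priors plus a grid-based Bayesian update for one parameter.
import numpy as np
from scipy import stats

theta = np.linspace(0.5, 2.5, 2001)              # candidate values of the parameter
dtheta = theta[1] - theta[0]
prior_bench = stats.norm(1.2, 0.15).pdf(theta)   # prior inferred from bench tests
prior_oper = stats.norm(1.6, 0.30).pdf(theta)    # prior inferred from operating data

rng = np.random.default_rng(0)
obs = rng.normal(1.45, 0.1, size=25)             # stand-in measurements of the parameter
lik = np.prod(stats.norm(theta[:, None], 0.1).pdf(obs[None, :]), axis=1)

def pooled_prior(w):
    return w * prior_bench + (1.0 - w) * prior_oper   # pool fusion of the two priors

def evidence(w):
    return (pooled_prior(w) * lik).sum() * dtheta     # marginal likelihood given weight w

# treat the weight coefficient itself as unknown (uniform prior) and estimate it
ws = np.linspace(0.0, 1.0, 101)
ev = np.array([evidence(w) for w in ws])
w_hat = float((ws * ev).sum() / ev.sum())             # posterior-mean weight

post = pooled_prior(w_hat) * lik
post /= post.sum() * dtheta
theta_hat = (theta * post).sum() * dtheta
print(f"estimated weight coefficient w = {w_hat:.2f}")
print(f"posterior mean of the parameter = {theta_hat:.3f}")
```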
With the popularisation of intelligent power, power devices have different shapes, numbers, and specifications. As a result, power data exhibit distributional variability and the model learning process cannot extract data features sufficiently, which seriously affects the accuracy and performance of anomaly detection. Therefore, this paper proposes a deep learning-based anomaly detection model for power data that integrates a data alignment enhancement technique based on random sampling and an adaptive feature fusion method leveraging dimension reduction. To handle the distributional variability of power data, a sliding window-based data adjustment method was developed for this model, which addresses the problems of high-dimensional feature noise and low-dimensional missing data. To address insufficient feature fusion, an adaptive feature fusion method based on feature dimension reduction and dictionary learning is proposed to improve the anomaly detection accuracy of the model. To verify the effectiveness of the proposed method, ablation experiments were conducted. The experimental results show that, compared with traditional anomaly detection methods, the proposed method not only has an advantage in model accuracy but also reduces the amount of parameter computation during feature matching and improves detection speed.
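The sliding-window adjustment is only outlined above; one minimal interpretation, under the assumption that the window both fills missing samples from their neighbourhood and suppresses noisy readings, might look like the following (window length and synthetic records are placeholders, not the paper's settings).

```python
# Illustrative sliding-window adjustment of power records: impute gaps, damp noise.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2024-01-01", periods=500, freq="15min")
data = pd.DataFrame(
    {"active_power": 10 + np.sin(np.arange(500) / 20) + rng.normal(0, 0.5, 500),
     "reactive_power": 2 + rng.normal(0, 0.2, 500)},
    index=idx,
)
data.iloc[rng.choice(500, 30, replace=False), 0] = np.nan   # simulate missing samples

window = 8                                                  # sliding-window length (assumed)
rolling_mean = data.rolling(window, min_periods=1, center=True).mean()
filled = data.fillna(rolling_mean)                          # impute gaps from their window
adjusted = filled.rolling(window, min_periods=1, center=True).median()  # damp spikes/noise

print("missing values before/after adjustment:",
      int(data.isna().sum().sum()), "/", int(adjusted.isna().sum().sum()))
```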
In order to obtain more accurate precipitation data and better simulate precipitation on the Tibetan Plateau, the ability of 14 Coupled Model Intercomparison Project Phase 6 (CMIP6) models to simulate historical precipitation (1982-2014) on the Qinghai-Tibetan Plateau was evaluated in this study. Analysis of the Taylor index and of temporal and spatial statistical parameters indicates that all models overestimate precipitation. To correct the overestimation, a fusion correction method combining Backpropagation Neural Network Correction (BP) and Quantile Mapping (QM) correction, named the BQ method, was proposed. With this method, the historical precipitation of each model was corrected in space and time, respectively. The correction results were then compared in time, in space, and by analysis of variance (ANOVA) with those corrected by the BP and QM methods. Finally, the fusion-corrected results for each model were compared with the Climatic Research Unit (CRU) data for significance analysis to obtain the precipitation trends of each model. The results show that, among the uncorrected data, the IPSL-CM6A-LR model simulates historical precipitation on the Qinghai-Tibetan Plateau relatively well (R = 0.7, RMSE = 0.15). In time, the total precipitation corrected by the fusion method has the same interannual trend as, and the closest precipitation values to, the CRU data; in space, the annual average precipitation corrected by the fusion method has the smallest difference from the CRU data, and the total historical annual average precipitation is not significantly different from the CRU data, which is better than BP and QM. Therefore, the fusion method corrects the historical precipitation of each model better than the QM and BP methods. Precipitation in the central and northeastern parts of the plateau shows a significant increasing trend. The correlation coefficients between monthly precipitation and station-observed precipitation for all models after BQ correction exceed 0.8.
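The QM half of the BQ correction is a standard empirical quantile-mapping step; a minimal sketch with synthetic monthly precipitation (not the CMIP6 or CRU data) is shown below.

```python
# Empirical quantile mapping: map model values onto the reference distribution.
import numpy as np

def quantile_mapping(model_hist, ref_hist, model_new):
    """Correct model values by matching empirical quantiles of a calibration period."""
    quantiles = np.linspace(0.01, 0.99, 99)
    mq = np.quantile(model_hist, quantiles)   # model quantiles (calibration period)
    rq = np.quantile(ref_hist, quantiles)     # reference quantiles (same period)
    return np.interp(model_new, mq, rq)       # corrected values

rng = np.random.default_rng(0)
ref = rng.gamma(shape=2.0, scale=30.0, size=396)                  # "observed" monthly precipitation, mm
model = np.clip(ref * 1.4 + rng.normal(0, 10, size=396), 0, None)  # model overestimates precipitation
corrected = quantile_mapping(model, ref, model)
print("mean bias before/after QM: "
      f"{(model - ref).mean():.1f} mm -> {(corrected - ref).mean():.1f} mm")
```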
The spaceborne platform has provided unprecedented global eddy-permitting (typically about 0.25°) products of sea surface salinity (SSS); however, existing SSS products can hardly resolve mesoscale motions because of the heavy noise they contain and the over-smoothing introduced by denoising. By means of multi-fractal fusion (MFF), a high-resolution SSS product is synthesized using sea surface temperature (SST) as the template. Two low-resolution SSS products and four SST products are considered as the source data and the templates, respectively, to determine the best combination. The fused products are validated against in situ observations and intercompared via SSS maps, singularity exponent maps, and wavenumber spectra. The results demonstrate that MFF performs well in mitigating the noise and improving the resolution. The combination of the Climate Change Initiative SSS and the Remote Sensing Systems SST can produce a 0.1° denoised product whose global mean standard deviation of salinity against Argo is 0.21 and whose feature resolution reaches 30-40 km.
There has been a growing body of research on swarm unmanned aerial vehicles (UAVs) in recent years; as radar targets they are characterized by small size, low speed, and low altitude. To counter swarm UAVs, a design for an anti-UAV radar system based on multiple-input multiple-output (MIMO) is put forward, which improves the resolution, angular accuracy, data rate, and tracking flexibility for swarm UAV detection. Target resolution and detection are the core problems in detecting swarm UAVs. The distinct advantage of the MIMO system in angular accuracy measurement is demonstrated by comparing MIMO radar with phased array radar. Although MIMO radar performs better in resolution, swarm UAV detection still faces difficulty in target detection. This paper proposes a multi-mode data fusion algorithm based on deep neural networks to improve the detection effect. Signal processing and data processing based on this detection fusion algorithm are then designed, forming a high-resolution detection loop. Several simulations are designed to illustrate the feasibility of the designed system and the proposed algorithm.
Mangroves are indispensable for coastlines, maintaining biodiversity, and mitigating climate change. Therefore, improving the accuracy of mangrove information identification is crucial for their ecological protection. To address the limited morphological information in synthetic aperture radar (SAR) images, which are strongly affected by noise, and the susceptibility of optical images to weather and lighting conditions, this paper proposes a pixel-level weighted fusion method for SAR and optical images. Image fusion enhances the target features and makes mangrove monitoring more comprehensive and accurate. To address the high similarity between mangrove forests and other forests, the model is based on the U-Net convolutional neural network, with an attention mechanism added in the feature extraction stage so that the model pays more attention to mangrove vegetation areas in the image. To accelerate convergence and normalize the input, a batch normalization (BN) layer and a Dropout layer are added after each convolutional layer. Since mangroves are a minority class in the image, an improved cross-entropy loss function is introduced to improve the model's ability to recognize mangroves. The AttU-Net model for mangrove recognition in high-similarity environments is thus constructed on the fused images. Comparison experiments show that the overall accuracy of the improved U-Net model trained on the fused images is significantly improved. On the fused images, the recognition results of the proposed AttU-Net model are compared with those of its benchmark model, U-Net, and of the Dense-Net, Res-Net, and Seg-Net methods. The AttU-Net model captures the complex structures and textural features of mangroves in images more effectively. The average OA, F1-score, and Kappa coefficient in the four tested regions were 94.406%, 90.006%, and 84.045%, significantly higher than those of the other methods. This method can provide technical support for the monitoring and protection of mangrove ecosystems.
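A pixel-level weighted fusion of co-registered SAR and optical bands can be sketched as below; the min-max normalisation and the 0.3 SAR weight are assumptions used only to illustrate the idea, not the calibrated weights of the paper.

```python
# Pixel-level weighted fusion of one SAR band with each optical band.
import numpy as np

def minmax(x):
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

rng = np.random.default_rng(0)
optical = rng.uniform(0, 0.4, size=(256, 256, 3))   # e.g. red/green/NIR reflectance (placeholder)
sar_vv = rng.gamma(2.0, 0.05, size=(256, 256))      # e.g. Sentinel-1 VV backscatter (placeholder)

w_sar = 0.3                                          # assumed contribution of SAR structure
fused = np.empty_like(optical)
for b in range(optical.shape[-1]):
    # each fused band = weighted sum of the normalised optical band and the SAR band
    fused[..., b] = (1 - w_sar) * minmax(optical[..., b]) + w_sar * minmax(sar_vv)

print(fused.shape, float(fused.min()), float(fused.max()))
```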
Non-contact remote sensing techniques, such as terrestrial laser scanning (TLS) and unmanned aerial vehicle (UAV) photogrammetry, have been applied worldwide for landslide monitoring in high and steep mountainous areas. These techniques acquire terrain data and enable ground deformation monitoring. However, their practical application still faces many difficulties owing to complex terrain, limited access, and dense vegetation. For instance, high and steep slopes can obstruct the TLS sightline, and the accuracy of the UAV model may be compromised by the absence of ground control points (GCPs). This paper proposes a TLS- and UAV-based method for monitoring landslide deformation in high mountain valleys that uses traditional real-time kinematic (RTK)-based control points (RCPs), low-precision TLS-based control points (TCPs), and assumed control points (ACPs) to achieve high-precision surface deformation analysis under obstructed-vision and impassable conditions. The effects of GCP accuracy, GCP quantity, and automatic tie point (ATP) quantity on the accuracy of UAV modeling and surface deformation analysis were comprehensively analyzed. The results show that the proposed method allows the landslide monitoring accuracy to exceed that of the GCPs themselves by adding additional low-accuracy GCPs. The proposed method was implemented to monitor the Xinhua landslide in Baoxing County, China, and was validated against data from multiple sources.
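As a simplified stand-in for the control-point workflow (the full photogrammetric adjustment in the paper is more involved), the sketch below rigidly aligns a model to control points of mixed accuracy with a least-squares (Kabsch) fit and then evaluates independent checkpoints; all coordinates and accuracy values are synthetic assumptions.

```python
# Rigid alignment to mixed-accuracy control points, checked against independent checkpoints.
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t with R @ src + t ~= dst (Kabsch)."""
    cs, cd = src.mean(0), dst.mean(0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:               # keep a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

rng = np.random.default_rng(0)
true_pts = rng.uniform(0, 500, size=(40, 3))         # points in the "true" terrain frame
angle = np.deg2rad(3.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
model_pts = (true_pts - true_pts.mean(0)) @ R_true.T + true_pts.mean(0) + [5.0, -3.0, 1.0]

# control points: a few accurate RTK-like points plus many low-accuracy assumed points
gcp_idx = np.arange(12)
gcp_noise = np.r_[np.full(4, 0.02), np.full(8, 0.30)]   # metres (assumed accuracies)
gcp_obs = true_pts[gcp_idx] + rng.normal(0, gcp_noise[:, None], (12, 3))

R, t = rigid_fit(model_pts[gcp_idx], gcp_obs)
aligned = model_pts @ R.T + t
check = np.delete(np.arange(40), gcp_idx)               # independent checkpoints
rmse = np.sqrt(np.mean(np.sum((aligned[check] - true_pts[check]) ** 2, axis=1)))
print(f"checkpoint 3D RMSE after alignment: {rmse:.3f} m")
```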
Dead fine fuel moisture content (DFFMC) is a key factor affecting the spread of forest fires and plays an important role in evaluating forest fire risk. To achieve high-precision, real-time measurement of DFFMC, this study established a long short-term memory (LSTM) network optimized by the particle swarm optimization (PSO) algorithm as the measurement model. A multi-point surface monitoring scheme combining a near-infrared measurement method and a meteorological measurement method is proposed. The near-infrared spectral information of dead fine fuels and the meteorological factors in the region are processed by data fusion technology to construct a spectral-meteorological data set. The surface dead fine fuels of Mongolian oak (Quercus mongolica Fisch. ex Ledeb.), white birch (Betula platyphylla Suk.), larch (Larix gmelinii (Rupr.) Kuzen.), and Manchurian walnut (Juglans mandshurica Maxim.) in the Maoershan Experimental Forest Farm of Northeast Forestry University were investigated. The PSO-LSTM moisture content model was used to compare the near-infrared, meteorological, and spectral-meteorological fusion methods. The results show that the mean absolute error of the DFFMC of the four stands obtained with the spectral-meteorological fusion method was 1.1% for Mongolian oak, 1.3% for white birch, 1.4% for larch, and 1.8% for Manchurian walnut, lower than the values obtained with the near-infrared method and the meteorological method. The spectral-meteorological fusion method provides a new way to measure the moisture content of dead fine fuel with high precision.
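The PSO component can be sketched independently of the LSTM: the loop below searches two hypothetical hyper-parameters (hidden units and learning rate) against a placeholder objective that stands in for the LSTM's validation error; bounds and PSO constants are assumptions.

```python
# Minimal particle swarm optimisation loop of the kind used to tune the LSTM.
import numpy as np

def objective(x):
    # placeholder: in the real pipeline this would train an LSTM with
    # hidden_units = x[0], learning_rate = x[1] and return its validation MAE
    hidden, lr = x
    return (hidden - 64) ** 2 / 1000 + (np.log10(lr) + 3) ** 2

rng = np.random.default_rng(0)
n_particles, n_iter = 20, 50
lo, hi = np.array([8, 1e-4]), np.array([256, 1e-1])      # search bounds (assumed)
pos = rng.uniform(lo, hi, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

w, c1, c2 = 0.7, 1.5, 1.5                                 # inertia and acceleration constants
for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("PSO-selected hyper-parameters (hidden units, learning rate):", gbest)
```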
Sea surface temperature (SST) is one of the important parameters in global ocean and climate research and can be retrieved by satellite infrared and passive microwave remote sensing instruments. While satellite infrared SST offers high spatial resolution, it is limited by cloud cover. Passive microwave SST, on the other hand, provides all-weather observation but suffers from poor spatial resolution and susceptibility to environmental factors such as rainfall, coastal effects, and high wind speeds. To obtain high-precision, complete, and high-resolution SST data, it is essential to fuse infrared and microwave SST measurements. In this study, SST data from the Fengyun-3D (FY-3D) Medium Resolution Spectral Imager II (MERSI-II) and microwave imager (MWRI) were fused. First, the accuracy of both MERSI-II SST and MWRI SST was verified, and the latter was bilinearly interpolated onto the 5 km resolution grid of the MERSI SST. After pretreatment and quality control of the MERSI SST and MWRI SST, a piecewise regression method was employed to correct biases in the MWRI SST. Subsequently, SST data were selected based on spatial resolution and accuracy within a 3-day window of the analysis date. Finally, an optimal interpolation method was applied to fuse the FY-3D MERSI-II SST and MWRI SST. The results demonstrate a significant improvement in spatial coverage compared with either MERSI-II SST or MWRI SST alone. Furthermore, the fused SST retains true spatial distribution details and exhibits an accuracy of −0.12 ± 0.74°C compared with the OSTIA SST. This study has improved the accuracy of FY satellite fusion SST products in China.
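The optimal-interpolation step follows the standard analysis equation x_a = x_b + K(y − Hx_b); the toy example below applies it on a schematic 1-D grid with an assumed Gaussian background-error covariance, not the operational FY-3D configuration.

```python
# Schematic optimal interpolation: blend sparse observations with a background field.
import numpy as np

rng = np.random.default_rng(0)
n = 40                                     # 1-D grid of analysis points (schematic)
x = np.linspace(0, 400, n)                 # km
background = 20 + 0.005 * x                # first-guess SST field (e.g. a previous analysis)

truth = 20 + 0.005 * x + 0.8 * np.sin(x / 60)
obs_idx = rng.choice(n, 12, replace=False) # grid cells holding satellite SSTs
obs = truth[obs_idx] + rng.normal(0, 0.3, len(obs_idx))

L, sig_b, sig_o = 80.0, 0.6, 0.3           # decorrelation length (km) and error std devs (assumed)
B = sig_b**2 * np.exp(-0.5 * ((x[:, None] - x[None, :]) / L) ** 2)  # background-error covariance
H = np.zeros((len(obs_idx), n))
H[np.arange(len(obs_idx)), obs_idx] = 1.0                            # observation operator
R = sig_o**2 * np.eye(len(obs_idx))                                  # observation-error covariance

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)          # gain matrix
analysis = background + K @ (obs - H @ background)     # OI update
print("RMS error, background vs analysis:",
      np.sqrt(np.mean((background - truth) ** 2)).round(3),
      np.sqrt(np.mean((analysis - truth) ** 2)).round(3))
```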
To address missed and false detections of targets on the road in autonomous driving, a 3D object detection model for autonomous driving based on an improved Centerfusion is proposed. The model fuses camera information with radar features to form multi-channel feature inputs, which strengthens the robustness of the object detection network and reduces missed detections. To obtain more accurate and richer 3D detection information, an improved attention mechanism is introduced to enhance the fusion of radar point clouds and visual information within the frustum grid. An improved loss function is used to optimize the accuracy of bounding-box prediction. The model was validated and compared on the Nuscenes dataset. Experimental results show that, compared with the traditional Centerfusion model, the proposed model improves the mean Average Precision (mAP) by 1.3% and the Nuscenes Detection Score (NDS) by 1.2%.
To address the difficulties in fusing multi-mode sensor data for complex industrial machinery, an adaptive deep coupling convolutional auto-encoder (ADCCAE) fusion method was proposed. First, the multi-mode features extracted synchronously by the coupling convolutional auto-encoder (CCAE) were stacked and fed to multi-channel convolution layers for fusion. Then, the fused data were passed to fully connected layers for compression and fed to a Softmax module for classification. Finally, the coupling loss function coefficients and the network parameters were optimized adaptively using the gray wolf optimization (GWO) algorithm. Experimental comparisons showed that the proposed ADCCAE fusion model was superior to existing models for multi-mode data fusion.
Magnesium (Mg) alloys are considered a new generation of revolutionary medical metals. Laser-beam powder bed fusion (PBF-LB) is suitable for fabricating metal implants with personalized and complicated structures. However, as-built parts usually exhibit an undesirable microstructure and unsatisfactory performance. In this work, WE43 parts were first fabricated by PBF-LB and then subjected to heat treatment. Although a high densification rate of 99.91% was achieved using suitable processes, the as-built parts exhibited an anisotropic, layered microstructure with heterogeneously precipitated Nd-rich intermetallics. After heat treatment, fine, nano-scaled Mg24Y5 particles were precipitated. Meanwhile, the α-Mg grains underwent recrystallization and coarsened slightly, which effectively weakened the texture intensity and reduced the anisotropy. As a consequence, the yield strength and ultimate tensile strength were significantly improved to (250.2 ± 3.5) MPa and (312 ± 3.7) MPa, respectively, while the elongation was maintained at a high level of 15.2%. Furthermore, the homogenized microstructure reduced the tendency toward localized corrosion and favored the development of a uniform passivation film. Thus, the degradation rate of the WE43 parts was decreased by an order of magnitude. In addition, in-vitro cell experiments proved their favorable biocompatibility.
Metal additive manufacturing (AM) has been extensively studied in recent decades. Despite the significant progress achieved in manufacturing complex shapes and structures, challenges such as severe cracking when using existing alloys for laser powder bed fusion (L-PBF) AM have persisted. These challenges arise because commercial alloys are primarily designed for conventional casting or forging processes, overlooking the fast cooling rates, steep temperature gradients and multiple thermal cycles of L-PBF. To address this, there is an urgent need to develop novel alloys specifically tailored for L-PBF technologies. This review provides a comprehensive summary of the strategies employed in alloy design for L-PBF. It aims to guide future research on designing novel alloys dedicated to L-PBF instead of adapting existing alloys. The review begins by discussing the features of the L-PBF processes, focusing on rapid solidification and intrinsic heat treatment. Next, the printability of the four main existing alloys (Fe-, Ni-, Al- and Ti-based alloys) is critically assessed, with a comparison of their conventional weldability. It was found that the weldability criteria are not always applicable in estimating printability. Furthermore, the review presents recent advances in alloy development and associated strategies, categorizing them into crack mitigation-oriented, microstructure manipulation-oriented and machine learning-assisted approaches. Lastly, an outlook and suggestions are given to highlight the issues that need to be addressed in future work.
For many environmental and agricultural applications, an accurate estimation of surface soil moisture is essential. This study sought to determine whether combining Sentinel-1A, Sentinel-2A, and meteorological data with artificial neural networks (ANN) could improve soil moisture estimation in various land cover types. To train and evaluate the model's performance, we used field data (provided by La Tuscia University) on the study area collected between October 2022 and December 2022. Surface soil moisture was measured at 29 locations. The feed-forward ANN model was trained, validated, and tested using a 60:10:30 split of the input features. The ANN model exhibited high precision in predicting soil moisture, achieving a coefficient of determination (R²) of 0.71 and a correlation coefficient (R) of 0.84. Furthermore, incorporating Random Forest (RF) algorithms for soil moisture prediction resulted in an improved R² of 0.89. The unique combination of active microwave, meteorological, and multispectral data provides an opportunity to exploit the complementary nature of the datasets. Through preprocessing, fusion, and ANN modeling, this research contributes to advancing soil moisture estimation techniques and provides valuable insights for water resource management and agricultural planning in the study area.
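A minimal version of the described workflow, with synthetic stand-ins for the Sentinel-1/2 and meteorological features and a scikit-learn feed-forward network, might look like this; the 60:10:30 split mirrors the text, while the features, network size, and targets are assumptions.

```python
# Feed-forward ANN for soil-moisture regression on a 60/10/30 train/validation/test split.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 300
X = np.column_stack([
    rng.normal(-12, 3, n),     # VV backscatter (dB), assumed proxy for Sentinel-1A
    rng.uniform(0.1, 0.8, n),  # NDVI, assumed proxy for Sentinel-2A
    rng.uniform(0, 30, n),     # antecedent rainfall (mm), assumed meteorological input
])
y = 0.10 + 0.004 * X[:, 0] + 0.15 * X[:, 1] + 0.005 * X[:, 2] + rng.normal(0, 0.02, n)

X_train, X_tmp, y_train, y_tmp = train_test_split(X, y, train_size=0.6, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_tmp, y_tmp, test_size=0.75, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
model.fit(X_train, y_train)
print("validation R2:", round(r2_score(y_val, model.predict(X_val)), 3))
print("test R2:      ", round(r2_score(y_test, model.predict(X_test)), 3))
```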
Laser powder bed fusion (L-PBF) has attracted significant attention in both industry and academia since its inception, providing unprecedented advantages for fabricating complex-shaped metallic components. The printing quality and performance of L-PBF alloys are influenced by numerous variables consisting of feedstock powders, the manufacturing process, and post-treatment. As the starting materials, metallic powders play a critical role in influencing the fabrication cost, printing consistency, and properties. Given their deterministic roles, the present review aims to retrospect the recent progress on metallic powders for L-PBF, including characterization, preparation, and reuse. Powder characterization mainly serves printing consistency, while powder preparation and reuse are introduced to reduce fabrication costs. Various powder characterization and preparation methods are presented first, analyzing the measurement principles, advantages, and limitations. Subsequently, the effect of powder reuse on the powder characteristics and mechanical performance of L-PBF parts is analyzed, focusing on steels, nickel-based superalloys, titanium and titanium alloys, and aluminum alloys. The evolution trends of powders and L-PBF parts vary depending on specific alloy systems, which makes the proposal of a unified reuse protocol infeasible. Finally, perspectives are presented to cater to the increased applications of L-PBF technologies for future investigations. The present state-of-the-art work can pave the way for broad industrial applications of L-PBF by enhancing printing consistency and reducing total costs from the perspective of powders.
Data fusion generates fused data by combining multiple sources, resulting in information that is more consistent, accurate, and useful than any individual source and more reliable than the raw original data, which are often imperfect, inconsistent, complex, and uncertain. Traditional data fusion methods, such as probabilistic fusion, set-based fusion, and evidential belief reasoning, are computationally complex and require accurate classification and proper handling of raw data. Data fusion is the process of integrating multiple data sources; data filtering means examining a dataset to exclude, rearrange, or apportion data according to given criteria. The advancement in hardware acceleration and the abundance of data from various sensors have led to the development of machine learning (ML) algorithms that are expected to address the limitations of traditional methods. However, many open issues remain when ML algorithms are used for data fusion. From the literature, nine issues have been identified, irrespective of application. Decision-makers should pay attention to these issues as data fusion becomes more applicable and successful. A fuzzy analytical hierarchical process (FAHP) enables us to handle these issues: it yields a weight for each issue and ranks the issues based on these calculated weights. The most significant issue identified is the lack of deep learning models used for data fusion that improve accuracy and learning quality, with a weight of 0.141. The least significant is cross-domain multimodal data fusion, with a weight of 0.076, because the full semantic knowledge of multimodal data cannot be captured.
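The FAHP weighting can be illustrated with Buckley's geometric-mean method on a small, invented pairwise-comparison matrix; the triangular fuzzy judgements below are placeholders chosen only to show the mechanics, not the survey data behind the reported 0.141 and 0.076 weights.

```python
# Simplified fuzzy AHP (Buckley's geometric-mean method) for weighting a few issues.
import numpy as np

issues = ["deep-learning fusion models", "data quality", "cross-domain multimodal fusion"]
# triangular fuzzy judgements (l, m, u); reciprocal entries are implied
tfn = {
    (0, 1): (1, 2, 3),   # issue 0 judged roughly twice as important as issue 1
    (0, 2): (2, 3, 4),
    (1, 2): (1, 2, 3),
}

n = len(issues)
M = np.ones((n, n, 3))
for (i, j), (l, m, u) in tfn.items():
    M[i, j] = (l, m, u)
    M[j, i] = (1 / u, 1 / m, 1 / l)        # fuzzy reciprocal

geo = np.prod(M, axis=1) ** (1 / n)        # fuzzy geometric mean of each row
total = geo.sum(axis=0)
fuzzy_w = geo / total[::-1]                # (l, m, u) divided by (u_sum, m_sum, l_sum)
crisp = fuzzy_w.mean(axis=1)               # defuzzify by averaging l, m, u
weights = crisp / crisp.sum()              # normalise to sum to 1

for name, w in sorted(zip(issues, weights), key=lambda p: -p[1]):
    print(f"{name}: {w:.3f}")
```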
With the development of Industry 4.0 and big data technology, the Industrial Internet of Things (IIoT) is hampered by inherent issues such as privacy, security, and fault tolerance, which pose challenges to its rapid development. Blockchain technology offers immutability, decentralization, and autonomy, which can greatly mitigate the inherent defects of the IIoT. In the traditional blockchain, data are stored in a Merkle tree. As data continue to grow, the size of the proofs used to validate them grows as well, threatening the efficiency, security, and reliability of blockchain-based IIoT. Accordingly, this paper first analyzes the inefficiency of the traditional blockchain structure in verifying the integrity and correctness of data. To solve this problem, a new Vector Commitment (VC) structure, the Partition Vector Commitment (PVC), is proposed by improving the traditional VC structure. Second, this paper uses the PVC instead of the Merkle tree to store big data generated by IIoT; the PVC improves the efficiency of traditional VC in the commitment and opening processes. Finally, this paper uses the PVC to build a blockchain-based IIoT data security storage mechanism and conducts comparative experiments. This mechanism can greatly reduce communication loss and maximize the rational use of storage space, which is of great significance for maintaining the security and stability of blockchain-based IIoT.
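For context on why proof size matters, the sketch below implements the plain Merkle-tree baseline described above: membership proofs grow with log2 of the number of records. It is an illustration of the baseline only, not of the proposed Partition Vector Commitment.

```python
# Merkle-tree commitment: the proof for one record contains one hash per tree level.
import hashlib

def h(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def merkle_root_and_proof(leaves, index):
    """Return the Merkle root and the sibling path proving leaves[index]."""
    level = [h(x) for x in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return level[0], proof

def verify(root, leaf, proof):
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

data = [f"iiot-record-{i}".encode() for i in range(1024)]
root, proof = merkle_root_and_proof(data, 37)
print("proof verifies:", verify(root, data[37], proof))
print("proof length for 1024 records:", len(proof), "hashes")   # grows as log2(n)
```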