Tunnel Boring Machines (TBMs) are vital for tunnel and underground construction due to their high safety and efficiency. Accurately predicting TBM operational parameters based on the surrounding environment is crucial for planning schedules and managing costs. This study investigates the effectiveness of tree-based machine learning models, including Random Forest, Extremely Randomized Trees, Adaptive Boosting Machine, Gradient Boosting Machine, Extreme Gradient Boosting Machine (XGBoost), Light Gradient Boosting Machine, and CatBoost, in predicting the Penetration Rate (PR) of TBMs by considering rock mass and material characteristics. These techniques can capture the relationship between input and output parameters and thereby achieve a high level of accuracy. To this end, a comprehensive database comprising various rock mass and material parameters, including Rock Mass Rating, Brazilian Tensile Strength, and Weathering Zone, was utilized for model development. The practical application of these models was assessed with a new dataset representing diverse rock mass and material properties. To evaluate model performance, ranking systems and Taylor diagrams were employed. CatBoost emerged as the most accurate model during training and testing, with R2 scores of 0.927 and 0.861, respectively. However, during validation, XGBoost demonstrated superior performance with an R2 of 0.713. Despite these variations, all tree-based models showed promising accuracy in predicting TBM performance, providing valuable insights for similar projects in the future.
This paper investigates whether e-hailing performs better than on-street searching for taxi services. By adopting the Poisson point process to model the temporal-spatial distributions of idle vehicles, passengers' waiting time distributions under on-street searching and e-hailing are explicitly modeled, and closed-form results for their expected waiting times are given. It is proved that whether e-hailing performs better than on-street searching mainly depends on the density of idle vehicles within the matching area and the matching period. It is also proved that, given the advantage of e-hailing in rapidly pairing passengers and idle vehicles, the expected waiting time for on-street searching is always longer than that of e-hailing as long as the number of idle vehicles within a passenger's dominant temporal-spatial area is lower than 4/π. Moreover, we extend our analysis to explore the market equilibria for both e-hailing and on-street searching, and present the equilibrium conditions for a taxi market operating under e-hailing versus on-street searching. With a special reciprocal passenger demand function, it is shown that the performance difference between e-hailing and on-street searching is mainly determined by the ratio of fleet size to maximum potential passenger demand. This suggests that e-hailing can achieve a higher capacity utilization rate of vehicles than on-street searching when vehicle density is relatively low. Furthermore, it is shown that an extended average trip duration improves the chance that e-hailing performs better than on-street searching. The optimal vehicle fleet sizes that maximize total social welfare/profit are then analyzed, and the corresponding maximization problems are formulated.
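The closed-form waiting-time results rest on distances to the nearest idle vehicle under a 2D Poisson point process, for which the expected nearest-vehicle distance at intensity λ is 1/(2√λ). Below is a minimal Monte Carlo sketch of that standard identity, not the paper's own model: all names and parameters are ours, and the Poisson process is approximated by a fixed-count uniform scatter.

```python
import math
import random

def nearest_vehicle_distance(lam, half_width, rng):
    """One draw of the distance from a passenger at the origin to the
    nearest of ~lam*area idle vehicles scattered uniformly in a square
    (a binomial approximation of a Poisson point process)."""
    n = max(1, round(lam * (2 * half_width) ** 2))
    return min(math.hypot(rng.uniform(-half_width, half_width),
                          rng.uniform(-half_width, half_width))
               for _ in range(n))

def mean_nearest_vehicle_distance(lam, trials=2000, half_width=10.0, seed=1):
    """Monte Carlo estimate of E[distance to nearest idle vehicle]."""
    rng = random.Random(seed)
    return sum(nearest_vehicle_distance(lam, half_width, rng)
               for _ in range(trials)) / trials

theory = 1.0 / (2.0 * math.sqrt(1.0))       # E[d] = 1/(2*sqrt(lambda)), lambda = 1
estimate = mean_nearest_vehicle_distance(lam=1.0)
```

With λ = 1 the estimate should land near the theoretical 0.5. Note that the paper's 4/π threshold concerns the vehicle count in a passenger's dominant temporal-spatial area, not this distance directly.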
Terrestrial laser scanning (TLS) accurately captures tree structural information and provides prerequisites for tree-scale estimations of forest biophysical attributes. Quantifying tree-scale attributes from TLS point clouds requires segmentation, yet occlusion effects severely affect the accuracy of automated individual tree segmentation. In this study, we proposed a novel method using ellipsoid directional searching and point compensation algorithms to alleviate occlusion effects. Firstly, region growing and point compensation algorithms are used to determine the locations of tree roots. Secondly, neighbor points are extracted within an ellipsoid neighborhood to mitigate occlusion effects, compared with k-nearest neighbor (KNN) search. Thirdly, neighbor points are uniformly subsampled by the directional searching algorithm, based on the Fibonacci principle, in multiple spatial directions to reduce memory consumption. Finally, a graph describing connectivity between a point and its neighbors is constructed and used to complete individual tree segmentation based on the shortest path algorithm. The proposed method was evaluated on a public TLS dataset comprising six forest plots in three complexity categories in Evo, Finland, and it reached the highest mean accuracy of 77.5%, higher than previous studies on tree detection. We also extracted and validated the tree structure attributes using manual segmentation reference values. The RMSE, RMSE%, bias, and bias% of tree height, crown base height, crown projection area, crown surface area, and crown volume were used to evaluate the segmentation accuracy. Overall, the proposed method avoids many inherent limitations of current methods and can accurately map canopy structures in occluded complex forest stands.
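The directional subsampling step draws multiple near-uniform spatial directions by a Fibonacci principle. One common construction consistent with that description is the golden-angle Fibonacci sphere lattice, sketched below; this is our illustration, not the authors' exact sampling code.

```python
import math

def fibonacci_sphere_directions(n):
    """Return n near-uniformly distributed unit direction vectors,
    placed by the golden-angle (Fibonacci) sphere lattice."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))  # ~2.39996 rad
    dirs = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n           # latitudes uniform in [-1, 1]
        r = math.sqrt(max(0.0, 1.0 - z * z))    # radius of the z-slice
        theta = golden_angle * i                # longitude advances by golden angle
        dirs.append((r * math.cos(theta), r * math.sin(theta), z))
    return dirs

directions = fibonacci_sphere_directions(64)
```

Each returned vector has unit length, and the latitude spacing makes the points cover the sphere nearly uniformly, which is what makes the subsampling directionally unbiased.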
Determination of Shear Bond Strength (SBS) at the interlayer of double-layer asphalt concrete is crucial in flexible pavement structures. The study used three Machine Learning (ML) models, including K-Nearest Neighbors (KNN), Extra Trees (ET), and Light Gradient Boosting Machine (LGBM), to predict SBS based on easily determinable input parameters. The Grid Search technique was employed for hyper-parameter tuning of the ML models, and cross-validation and learning curve analysis were used for training the models. The models were built on a database of 240 experimental results and three input variables: temperature, normal pressure, and tack coat rate. Model validation was performed using three statistical criteria: the coefficient of determination (R2), the Root Mean Square Error (RMSE), and the Mean Absolute Error (MAE). Additionally, SHAP (Shapley Additive exPlanations) analysis was used to validate the importance of the input variables in the prediction of the SBS. Results show that these models accurately predict SBS, with LGBM providing outstanding performance. SHAP analysis for LGBM indicates that temperature is the most influential factor on SBS. Consequently, the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete, serving practical applications in flexible pavement structure design.
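The three validation criteria have standard definitions; minimal reference implementations (generic sketches, not tied to the study's data) are:

```python
import math

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

def rmse(y_true, y_pred):
    """Root Mean Square Error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean Absolute Error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)
```

A constant offset of 1 between predictions and targets, for example, gives MAE = RMSE = 1 while R2 depends on the spread of the targets.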
Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks, including slow training speed, a tendency to become trapped in local minima, and initialization of weights and thresholds with pseudo-random numbers, leading to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. Firstly, two health features (HFs) considering temperature factors and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman neural network model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested on the Oxford battery aging dataset. The experimental results demonstrate that the method developed in this study achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
Permissions in tasks often cross and couple with one another in many fields, and the design of role views must rely on the activities of the task process. Based on the Role-Based Access Control (RBAC) model, this paper puts forward a Role Tree-Based Access Control (RTBAC) model. The model definition and a formal description of its constraints are also discussed. The RTBAC model realizes dynamic organization, self-determination, and convenience in the design of role views, and at the same time guarantees least role permission when tasks are separated.
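Organizing roles in a tree means a role's effective permissions can be resolved by walking the path toward the root. The toy sketch below is our own illustration of that idea, not the paper's formal RTBAC definition; all class and attribute names are assumptions.

```python
class Role:
    """A node in a role tree; permissions accumulate along the root path."""

    def __init__(self, name, permissions=(), parent=None):
        self.name = name
        self.permissions = set(permissions)
        self.parent = parent

    def effective_permissions(self):
        """Union of this role's own permissions and all ancestors'."""
        perms, node = set(), self
        while node is not None:
            perms |= node.permissions
            node = node.parent
        return perms

# A hypothetical two-level role tree: an auditor role refines a staff role.
staff = Role("staff", {"read"})
auditor = Role("auditor", {"view_logs"}, parent=staff)
```

Keeping each node's own permission set minimal and deriving the rest from the tree is one simple way to approach the least-permission goal the abstract mentions.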
The Cross-domain Heuristic Search Challenge (CHeSC) is a competition focused on creating efficient search algorithms adaptable to diverse problem domains. Selection hyper-heuristics are a class of algorithms that dynamically choose heuristics during the search process. Numerous selection hyper-heuristics have different implementation strategies. However, comparisons between them are lacking in the literature, and previous works have not highlighted the beneficial and detrimental implementation methods of different components. The question is how to effectively employ them to produce an efficient search heuristic. Furthermore, the algorithms that competed in the inaugural CHeSC have not been collectively reviewed. This work conducts a review analysis of the top twenty competitors from this competition to identify effective and ineffective strategies influencing algorithmic performance. A summary of the main characteristics and classification of the algorithms is presented. The analysis underlines efficient and inefficient methods in eight key components: search points, search phases, heuristic selection, move acceptance, feedback, the Tabu mechanism, the restart mechanism, and low-level heuristic parameter control. This review analyzes the components with reference to the competition's final leaderboard and discusses future research directions for these components. The effective approaches, identified as having the highest quality index, are mixed search points, iterated search phases, relay hybridization selection, threshold acceptance, mixed learning, Tabu heuristics, stochastic restart, and dynamic parameters. Findings are also compared with recent trends in hyper-heuristics. This work enhances the understanding of selection hyper-heuristics, offering valuable insights for researchers and practitioners aiming to develop effective search algorithms for diverse problem domains.
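Several of the reviewed components can be seen together in a compact skeleton: an epsilon-greedy heuristic-selection rule (a simple stand-in for the learning-based selection strategies discussed) combined with threshold move acceptance. Everything below is a toy sketch under our own assumptions, not a CHeSC competitor; the objective and low-level heuristics are placeholders.

```python
import random

def hyper_heuristic_minimize(dim=8, iters=400, epsilon=0.2, threshold=0.05, seed=3):
    """Minimize sum(x_i^2) with a tiny selection hyper-heuristic:
    epsilon-greedy choice among three low-level heuristics, threshold
    move acceptance, and improvement-based score updates."""
    rng = random.Random(seed)

    def cost(x):
        return sum(v * v for v in x)

    def shrink(x):                      # always contracts toward 0
        return [v * 0.9 for v in x]

    def jitter(x):                      # small random perturbation
        return [v + rng.uniform(-0.1, 0.1) for v in x]

    def zero_one(x):                    # reset one random coordinate
        j = rng.randrange(len(x))
        return [0.0 if i == j else v for i, v in enumerate(x)]

    heuristics = [shrink, jitter, zero_one]
    scores = [1.0] * len(heuristics)
    x = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
    best = cost(x)
    for _ in range(iters):
        if rng.random() < epsilon:      # explore a random heuristic
            h = rng.randrange(len(heuristics))
        else:                           # exploit the best-scoring heuristic
            h = max(range(len(heuristics)), key=lambda i: scores[i])
        cand = heuristics[h](x)
        delta = cost(cand) - cost(x)
        if delta < threshold:           # threshold acceptance
            x = cand
            scores[h] += max(0.0, -delta)   # reward actual improvement
        best = min(best, cost(x))
    return best

final_cost = hyper_heuristic_minimize()
```

The skeleton rewards a low-level heuristic by the improvement it produces, so selection adapts toward the heuristics that work on the current landscape, while the acceptance threshold still permits small worsening moves.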
In consideration of the limitations of the super-peer overlay network, semantic information was introduced into the super-peers' organization, and a novel P2P (peer-to-peer) searching model, SSP2P, was put forward. The peers in the model were organized into natural area autonomy systems (AASs) based on the small-world theory. A super-peer was selected in each AAS based on the power law, and all the super-peers formed different super-peer semantic networks. Thus, a hierarchical super-peer overlay network was formed. The results show that the model reduces the communication cost and enhances the search efficiency while ensuring system expansibility. This proves that the introduction of semantic information into the construction of a super-peer overlay is favorable to P2P system capability.
Distributed data sources that employ a taxonomy hierarchy to describe the contents of their objects are considered, and a super-peer-based semantic overlay network (SSON) is proposed for sharing and searching their data objects. In SSON, peers are dynamically clustered into many semantic clusters based on the semantics of their data objects and organized into a semantic overlay network. Each semantic cluster consists of a super-peer and several peers, and is only responsible for answering queries in its semantic subspace. A query is first routed to the appropriate semantic clusters by an efficient searching algorithm, and then it is forwarded to the specific peers that hold the relevant data objects. Experimental results indicate that SSON has good scalability and achieves a competitive trade-off between search efficiency and costs.
A novel idea, called the optimal shape subspace (OSS), is first proposed for optimizing the active shape model (ASM) search. It is constructed from the principal shape subspace and the principal shape variance subspace. It allows the reconstructed shape to vary more than one reconstructed in the standard ASM shape space, and hence is more expressive in representing shapes in real life. A cost function is then developed, based on a study of the search process. An optimal searching method using the feedback information provided by the evaluation cost is proposed to improve the performance of ASM alignment. Experimental results show that the proposed OSS can offer the maximum shape variation while preserving the principal information, and a unique local optimal shape is acquired after optimal searching. The combination of OSS and optimal searching can greatly improve ASM performance.
For density inversion of gravity anomaly data, once the inversion method is determined, the main factors affecting the inversion result are the inversion parameters and the subdivision scheme. A set of reasonable inversion parameters and a reasonable subdivision scheme can not only improve the efficiency of the inversion process but also ensure the accuracy of the inversion result. The gravity inversion method based on correlation searching and the golden section algorithm is an effective potential field inversion method. It can be used to invert 2D and 3D physical properties with potential data observed on flat or rough surfaces. In this paper, we introduce in detail the density inversion principles based on correlation searching and the golden section algorithm. Considering that the golden section algorithm is not globally optimized, we present a heuristic method to ensure the inversion result is globally optimized. With a series of model tests, we systematically compare and analyze the inversion efficiency and accuracy under different parameters. Based on the model test results, we summarize the selection principles for each inversion parameter, with which the inversion accuracy can be obviously improved.
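The golden section algorithm referenced here is the classic 1D line search for a unimodal objective. A textbook sketch follows (the paper's correlation-searching wrapper and global heuristic are not reproduced; the quadratic test function is our own).

```python
import math

def golden_section_minimize(f, a, b, tol=1e-8):
    """Locate the minimizer of a unimodal f on [a, b] to within tol,
    shrinking the bracket by the inverse golden ratio each step."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):                     # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                               # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2.0

x_star = golden_section_minimize(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

Because golden section search is only reliable for unimodal objectives, wrapping it with a heuristic, as the paper does, is what pursues a globally optimized inversion result.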
A new contact searching algorithm for contact-impact systems is proposed in this paper. In terms of the cell structure and the linked list, this algorithm solves the problem of sorting and searching contacts in three dimensions by transforming it into a retrieval process over two one-dimensional arrays, so that binary searching is no longer required. Using this algorithm, the cost of contact searching is reduced to the order of O(N) instead of O(N log2 N) for traditional ones, where N is the number of nodes in the system. Moreover, this algorithm can handle contact systems with arbitrary mesh layouts. Due to its simplicity, the algorithm can be easily implemented in a dynamic explicit finite element program. Our numerical experimental results show that this algorithm is reliable and efficient for contact searching in three-dimensional systems.
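The cell idea can be sketched compactly: bucket each node by its integer cell coordinates, then gather candidate contacts for a query from the 27 surrounding cells only, which is what removes the sorting-based O(N log2 N) cost. The dictionary of cells below stands in for the paper's two one-dimensional arrays; it is our illustration, valid when the search radius does not exceed the cell size.

```python
from collections import defaultdict

def build_cells(points, cell_size):
    """Bucket point indices by integer 3D cell coordinates."""
    cells = defaultdict(list)
    for idx, (x, y, z) in enumerate(points):
        key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        cells[key].append(idx)
    return cells

def neighbors_within(points, cell_size, radius, query):
    """Indices of points within `radius` of `query`; only the 27 cells
    around the query are scanned, so radius must not exceed cell_size."""
    cells = build_cells(points, cell_size)
    qx, qy, qz = query
    kx, ky, kz = int(qx // cell_size), int(qy // cell_size), int(qz // cell_size)
    found = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                for idx in cells.get((kx + dx, ky + dy, kz + dz), ()):
                    px, py, pz = points[idx]
                    if (px - qx) ** 2 + (py - qy) ** 2 + (pz - qz) ** 2 <= radius ** 2:
                        found.append(idx)
    return sorted(found)

pts = [(0.1, 0.1, 0.1), (0.4, 0.0, 0.0), (2.5, 2.5, 2.5)]
near = neighbors_within(pts, cell_size=1.0, radius=0.5, query=(0.0, 0.0, 0.0))
```

Each point is inserted once and each query touches a constant number of cells, which is the source of the O(N) behavior for roughly uniform meshes.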
Existing methods of local search mostly focus on how to reach an optimal solution. However, in some emergency situations, search time is a hard constraint for the job shop scheduling problem while an optimal solution is not necessary. In this situation, existing local search methods are not fast enough. This paper presents an emergency local search (ELS) approach, consisting of three phases, which can reach a feasible and nearly optimal solution within limited search time. The ELS approach is desirable for the aforementioned emergency situations where search time is limited and a nearly optimal solution is sufficient. Firstly, in order to reach a feasible and nearly optimal solution, infeasible solutions are repaired, and a repair technique named group repair is proposed. Secondly, in order to save time, the number of local search moves needs to be reduced, and this is achieved by a quick search method named critical path search (CPS). Finally, CPS sometimes stops at a solution far from the optimal one; in order to escape this search dilemma, a jump technique based on the critical part is used to improve CPS. Furthermore, a scheduling system based on ELS has been developed, and experiments based on this system were completed on a computer with an Intel Pentium(R) 2.93 GHz processor. The experimental results show that the optimal solutions of small-scale instances are reached in 2 s, and nearly optimal solutions of large-scale instances are reached in 4 s. The proposed ELS approach can stably reach nearly optimal solutions within manageable search time, and can be applied in emergency situations.
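CPS concentrates moves on the critical path, i.e., the longest path through the schedule's precedence graph, since only operations on it determine the makespan. A minimal longest-path computation on a DAG illustrates the quantity involved; this is our sketch with a toy four-operation graph, not the paper's full disjunctive job shop model.

```python
def critical_path_length(durations, edges):
    """durations: dict node -> processing time; edges: (u, v) precedence
    pairs. Returns the longest-path length (the makespan)."""
    succ = {u: [] for u in durations}
    indeg = {u: 0 for u in durations}
    for u, v in edges:
        succ[u].append(v)
        indeg[v] += 1
    # earliest-finish times propagated in topological (Kahn) order
    finish = {u: durations[u] for u in durations}
    queue = [u for u in durations if indeg[u] == 0]
    while queue:
        u = queue.pop()
        for v in succ[u]:
            finish[v] = max(finish[v], finish[u] + durations[v])
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    return max(finish.values())

makespan = critical_path_length(
    {"a": 3, "b": 2, "c": 4, "d": 1},
    [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")],
)
```

Here the critical path is a → c → d with length 3 + 4 + 1 = 8; a local search move that does not touch it cannot shorten the schedule, which is why restricting moves to it saves time.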
We present the design and performance of a home-built scanning tunneling microscope (STM), which is compact (66 mm tall and 25 mm in diameter), yet equipped with a 3D atomic-precision piezoelectric motor in which the Z coarse approach relies on a high-simplicity friction-type walker (of our own invention) driven by an axially cut piezoelectric tube. The walker is vertically inserted in a piezoelectric scanner tube (PST), with its brim lying flat on the PST end as the inertial slider (driven by the PST) for the XZ (sample plane) motion. The STM is designed to be capable of searching for rare microscopic targets (defects, dopants, boundaries, nano-devices, etc.) in a macroscopic sample area (square millimeters) under the extreme conditions (low temperatures, strong magnetic fields, etc.) in which it fits. It gives good atomic-resolution images when scanning a highly oriented pyrolytic graphite sample in air at room temperature.
A hybrid carrier (HC) scheme based on the weighted-type fractional Fourier transform (WFRFT) has been proposed recently. While most works focus on the HC scheme's inherent characteristics, little attention has been paid to WFRFT modulation recognition. In this paper, a new theory is provided to recognize WFRFT modulation based on higher-order cumulants (HOC). First, it is deduced that the optimal WFRFT received order can be obtained through the minimization of the 4th-order cumulant C42. Then, a combinatorial searching algorithm is designed to minimize C42. Finally, simulation results show that the designed scheme has a high recognition rate and that the combinatorial searching algorithm is effective and reliable.
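For a zero-mean complex signal x, the statistic being minimized is C42 = E[|x|⁴] − |E[x²]|² − 2(E[|x|²])². Below is a direct sample estimator of that standard definition, checked on an ideal unit-power QPSK constellation, for which C42 = −1; this is a generic sketch, not the paper's simulation setup.

```python
def c42(samples):
    """Sample estimate of the 4th-order cumulant C42 of a zero-mean
    complex sequence: M4 - |M20|^2 - 2*M21^2."""
    n = len(samples)
    m21 = sum(abs(x) ** 2 for x in samples) / n     # E[|x|^2]
    m20 = sum(x * x for x in samples) / n           # E[x^2]
    m4 = sum(abs(x) ** 4 for x in samples) / n      # E[|x|^4]
    return m4 - abs(m20) ** 2 - 2.0 * m21 ** 2

s = 2 ** -0.5
qpsk = [complex(a, b) for a in (s, -s) for b in (s, -s)]  # ideal QPSK symbols
value = c42(qpsk)
```

Distinct modulations (and, in the WFRFT setting, distinct receiving orders) leave distinct C42 signatures, which is what makes minimizing it a usable recognition criterion.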
The Artificial Searching Swarm Algorithm (ASSA) is a new optimization algorithm. ASSA simulates soldiers searching for an enemy's important goal, and transforms the process of solving an optimization problem into the process of searching for the optimal goal by a searching swarm with set rules. This work selects complicated, high-dimension functions to deeply analyse the performance on unconstrained and constrained optimization problems, and the results produced by ASSA, the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Artificial Fish-Swarm Algorithm (AFSA) are compared. The main factors which influence the performance of ASSA are also discussed. The results demonstrate the effectiveness of the proposed ASSA optimization algorithm.
Two deficiencies in traditional iterative closest point simultaneous localization and mapping (ICP-SLAM) usually result in poor real-time performance. On one hand, the relative position between the current scan frame and the global map cannot be known in advance; as a result, the ICP algorithm takes a large number of iterations to reach convergence. On the other hand, establishment of correspondence is done by global searching, which requires enormous computational time. To overcome these two problems, a fast ICP-SLAM with rough alignment and narrowing-scale nearby searching is proposed. To decrease the number of iterations, rough alignment based on an initial pose matrix is proposed: the initial pose matrix is obtained from a micro-electro-mechanical system (MEMS) magnetometer and global landmarks, and rough alignment is applied between the current scan frame and the global map at the beginning of the ICP algorithm using this matrix. To accelerate the establishment of correspondence, narrowing-scale nearby searching with a dynamic threshold is proposed, where match-points are found within a progressively constrictive range. Compared to traditional ICP-SLAM, the experimental results show that the number of iterations for the ICP algorithm to reach convergence reduces to 92.34% and the ICP algorithm runtime reduces to 98.86% on average. In addition, the computational cost is kept at a stable level due to the elimination of the accumulation of computational consumption. Moreover, great improvement is also achieved in SLAM quality and robustness.
Wireless Mesh Network (WMN) is seen as an effective Internet access solution for dynamic wireless applications. Given the low mobility of mesh routers in WMN, the backbone topology can be effectively maintained by a proactive routing protocol. Previous proposals such as the Tree Based Routing (TBR) protocol and the Root Driven Routing (RDR) protocol are so centralized that they make the gateway become a bottleneck which severely restricts network performance. We propose an Optimized Tree-based Routing (OTR) protocol that logically separates the proactive tree into pieces, so that routes are partly computed by the branches instead of the root. We also discuss the operation of multiple Internet gateways, which is a main issue in WMN. The new proposal lightens the load on the root, reduces overhead, and improves throughput. Numerical analysis and simulation results confirm that the performance of WMN is improved and that OTR is more suitable for large-scale WMN.
The current Grover quantum searching algorithm cannot identify differences in the importance of search targets when it is applied to an unsorted quantum database, and the probability of finding each search target is equal. To solve this problem, a Grover searching algorithm based on weighted targets is proposed. First, each target is endowed with a weight coefficient according to its importance. Applying these different weight coefficients, the targets are represented as quantum superposition states. Second, the novel Grover searching algorithm based on the quantum superposition of the weighted targets is constructed. Using this algorithm, the probability of obtaining each target can be approximated to the corresponding weight coefficient, which shows the flexibility of the algorithm. Finally, the validity of the algorithm is proved by a simple searching example.
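The weighted algorithm builds on the standard Grover iteration: an oracle phase flip on the target followed by inversion about the mean. A plain state-vector simulation of that underlying iteration for a single marked item out of N = 8 is sketched below; the weighted-target superposition construction itself is not reproduced here.

```python
import math

def grover_success_probability(n_items, marked, iterations):
    """Simulate Grover iterations on a real amplitude vector and return
    the final probability of measuring the marked item."""
    amp = [1.0 / math.sqrt(n_items)] * n_items   # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]               # oracle: phase-flip the target
        mean = sum(amp) / n_items
        amp = [2.0 * mean - a for a in amp]      # diffusion: invert about the mean
    return amp[marked] ** 2

best_k = round(math.pi / 4.0 * math.sqrt(8))     # near-optimal iteration count
p = grover_success_probability(8, marked=3, iterations=best_k)
```

With N = 8, the near-optimal iteration count round(π√N/4) = 2 drives the success probability above 0.94, against 1/8 for classical random guessing.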
The sensor virus is a serious threat, as an attacker can simply send a single packet to compromise an entire sensor network. Epidemics become drastic with link additions among sensors when small-world phenomena occur. Two immunization strategies, uniform immunization and temporary immunization, are conducted on small worlds of tree-based wireless sensor networks to combat sensor viruses. With the former strategy, the infection extends exponentially, although the immunization effectively reduces the contagion speed. With the latter strategy, recurrent contagion oscillations occur in the small world when the spatial-temporal dynamics of the epidemic are considered. The oscillations come from the small-world structure and the temporary immunization. Mathematical analyses on the small world of the Cayley tree are presented to reveal the epidemic dynamics under the two immunization strategies.
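The exponential spread underlying the Cayley tree analysis follows from its shell growth: with coordination number z, the shell at depth d holds z(z−1)^(d−1) nodes. A small tree builder plus a BFS shell count (our illustration, not the paper's epidemic model) verifies this:

```python
from collections import deque

def build_cayley_tree(z, depth):
    """Adjacency list of a Cayley tree: the root has z neighbors and every
    other internal node has z - 1 children, out to the given depth."""
    adj = {0: []}
    frontier, next_id = [0], 1
    for _ in range(depth):
        grown = []
        for node in frontier:
            kids = z if node == 0 else z - 1
            for _ in range(kids):
                adj[node].append(next_id)
                adj[next_id] = [node]
                grown.append(next_id)
                next_id += 1
        frontier = grown
    return adj

def shell_sizes(adj, root=0):
    """Node counts at each hop distance from the root (BFS)."""
    dist, queue = {root: 0}, deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    depth = max(dist.values())
    return [sum(1 for d in dist.values() if d == level) for level in range(depth + 1)]

shells = shell_sizes(build_cayley_tree(z=3, depth=4))
```

For z = 3 the shells grow as 1, 3, 6, 12, 24, i.e., doubling beyond the first shell; an unimpeded infection front sweeping shell by shell therefore grows exponentially in time, which is what immunization must counteract.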
Funding (e-hailing taxi study): supported by the National Natural Science Foundation of China (Grant Nos. 72361137002 and 72288101) and the Fundamental Research Funds for the Central Universities (Grant No. 2023XKRC038).
Funding (TLS individual tree segmentation study): supported by the National Natural Science Foundation of China (Nos. 32171789, 32211530031, 12411530088), the National Key Research and Development Program of China (No. 2023YFF1303901), the Joint Open Funded Project of the State Key Laboratory of Geo-Information Engineering and the Key Laboratory of the Ministry of Natural Resources for Surveying and Mapping Science and Geo-spatial Information Technology (2022-02-02), the Background Resources Survey in Shennongjia National Park (SNJNP2022001), and the Open Project Fund of the Hubei Provincial Key Laboratory for Conservation Biology of Shennongjia Snub-nosed Monkeys (SNJGKL2022001).
Funding: funded by the University of Transport Technology under grant number DTTD2022-12.
Abstract: Determining the Shear Bond Strength (SBS) at the interlayer of double-layer asphalt concrete is crucial in flexible pavement structures. This study used three Machine Learning (ML) models, K-Nearest Neighbors (KNN), Extra Trees (ET), and Light Gradient Boosting Machine (LGBM), to predict SBS from easily determinable input parameters. The Grid Search technique was employed for hyper-parameter tuning of the ML models, and cross-validation and learning-curve analysis were used to train them. The models were built on a database of 240 experimental results with three input variables: temperature, normal pressure, and tack coat rate. Model validation was performed using three statistical criteria: the coefficient of determination (R2), the Root Mean Square Error (RMSE), and the Mean Absolute Error (MAE). Additionally, SHAP (Shapley Additive exPlanations) analysis was used to assess the importance of the input variables in the prediction of SBS. Results show that these models predict SBS accurately, with LGBM providing outstanding performance. SHAP analysis for LGBM indicates that temperature is the most influential factor on SBS. Consequently, the proposed ML models can quickly and accurately predict the SBS between two layers of asphalt concrete, serving practical applications in flexible pavement structure design.
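Of the three models, KNN is simple enough to sketch from scratch: the prediction is the mean target value of the k training samples closest to the query in feature space (here, temperature, normal pressure, tack coat rate). This is an illustrative toy, not the study's library-based, grid-searched implementation, and the variable names are hypothetical:

```python
import math

def knn_predict(train_X, train_y, query, k=3):
    """Predict a value as the mean target of the k nearest
    training samples under Euclidean distance."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train_X, train_y)
    )
    nearest = dists[:k]
    return sum(y for _, y in nearest) / k
```

In practice the features would be standardized first, since temperature, pressure, and tack coat rate have very different scales and raw Euclidean distance would be dominated by the largest one.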
Funding: supported by the National Natural Science Foundation of China (NSFC) under Grant No. 51677058.
Abstract: Precisely estimating the state of health (SOH) of lithium-ion batteries is essential for battery management systems (BMS), as it plays a key role in ensuring the safe and reliable operation of battery systems. However, current SOH estimation methods often overlook the valuable temperature information that can effectively characterize battery aging during capacity degradation. Additionally, the Elman neural network, which is commonly employed for SOH estimation, exhibits several drawbacks: slow training, a tendency to become trapped in local minima, and initialization of weights and thresholds with pseudo-random numbers, which leads to unstable model performance. To address these issues, this study proposes a method for estimating the SOH of lithium-ion batteries based on differential thermal voltammetry (DTV) and an SSA-Elman neural network. First, two health features (HFs) reflecting temperature and battery voltage are extracted from the differential thermal voltammetry curves and incremental capacity curves. Next, the Sparrow Search Algorithm (SSA) is employed to optimize the initial weights and thresholds of the Elman neural network, forming the SSA-Elman model. To validate the performance, various neural networks, including the proposed SSA-Elman network, are tested on the Oxford battery aging dataset. The experimental results demonstrate that the proposed method achieves superior accuracy and robustness, with a mean absolute error (MAE) of less than 0.9% and a root mean square error (RMSE) below 1.4%.
Funding: Knowledge Innovation Project and Intelligent Information Service and Support Project of the Shanghai Education Commission, China.
Abstract: Crossing and coupled permissions among tasks exist widely in many fields, and the design of a role view must rely on the activities of the task process. Building on the Role-Based Access Control (RBAC) model, this paper puts forward a Role Tree-Based Access Control (RTBAC) model. The model definition and a formal description of its constraints are also discussed. The RTBAC model realizes dynamic organization, self-determination, and convenience in the design of role views, while at the same time guaranteeing least role permission when tasks are separated.
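The abstract's formal definitions are not reproduced here, but the core data structure of a role tree — roles arranged hierarchically, with a role's effective permissions resolved by walking the tree — can be sketched as follows. The class layout and inheritance direction are illustrative assumptions, not the RTBAC formalism:

```python
class Role:
    """A node in a role tree; here a role is assumed to inherit
    the permissions of its ancestors (illustrative convention)."""
    def __init__(self, name, permissions=(), parent=None):
        self.name = name
        self.permissions = set(permissions)
        self.parent = parent

    def has_permission(self, perm):
        # Walk up the tree: a role holds a permission if it or
        # any ancestor grants it.
        role = self
        while role is not None:
            if perm in role.permissions:
                return True
            role = role.parent
        return False
```

Enforcing least role permission then amounts to attaching each permission at the deepest tree node where it is actually needed, so separated tasks see only their own subtree's grants.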
Funding: funded by the Ministry of Higher Education (MoHE) Malaysia under the Transdisciplinary Research Grant Scheme (TRGS/1/2019/UKM/01/4/2).
Abstract: The Cross-domain Heuristic Search Challenge (CHeSC) is a competition focused on creating efficient search algorithms adaptable to diverse problem domains. Selection hyper-heuristics are a class of algorithms that dynamically choose heuristics during the search process. Numerous selection hyper-heuristics use different implementation strategies. However, comparisons between them are lacking in the literature, and previous works have not highlighted which implementation methods of the different components are beneficial or detrimental. The question is how to employ them effectively to produce an efficient search heuristic. Furthermore, the algorithms that competed in the inaugural CHeSC have not been collectively reviewed. This work reviews the top twenty competitors of that competition to identify effective and ineffective strategies influencing algorithmic performance. A summary of the main characteristics and a classification of the algorithms are presented. The analysis highlights efficient and inefficient methods in eight key components: search points, search phases, heuristic selection, move acceptance, feedback, the Tabu mechanism, the restart mechanism, and low-level heuristic parameter control. The components are analyzed with reference to the competition's final leaderboard, and future research directions for them are discussed. The effective approaches, identified as having the highest quality index, are mixed search points, iterated search phases, relay hybridization selection, threshold acceptance, mixed learning, Tabu heuristics, stochastic restart, and dynamic parameters. The findings are also compared with recent trends in hyper-heuristics. This work enhances the understanding of selection hyper-heuristics, offering valuable insights for researchers and practitioners aiming to develop effective search algorithms for diverse problem domains.
Funding: The National Natural Science Foundation of China (No. 60573127) and the Specialized Research Fund for the Doctoral Program of Higher Education (No. 20040533036).
Abstract: In consideration of the limitations of super-peer overlay networks, semantic information was introduced into the organization of super-peers, and a novel P2P (peer-to-peer) searching model, SSP2P, was put forward. Peers in the model are organized into natural area autonomy systems (AAS) based on small-world theory. A super-peer is selected in each AAS based on a power law, and all super-peers form different super-peer semantic networks, yielding a hierarchical super-peer overlay network. The results show that the model reduces communication cost and enhances search efficiency while ensuring system expansibility, demonstrating that introducing semantic information into the construction of a super-peer overlay benefits P2P system capability.
Funding: The National Natural Science Foundation of China (No. 60573089), the Natural Science Foundation of Liaoning Province (No. 20052031), and the National High Technology Research and Development Program of China (863 Program) (No. 2006AA09Z139).
Abstract: Distributed data sources that employ a taxonomy hierarchy to describe the contents of their objects are considered, and a super-peer-based semantic overlay network (SSON) is proposed for sharing and searching their data objects. In SSON, peers are dynamically clustered into semantic clusters based on the semantics of their data objects and organized into a semantic overlay network. Each semantic cluster consists of a super-peer and several peers, and is responsible only for answering queries in its semantic subspace. A query is first routed to the appropriate semantic clusters by an efficient searching algorithm and then forwarded to the specific peers that hold the relevant data objects. Experimental results indicate that SSON has good scalability and achieves a competitive trade-off between search efficiency and cost.
Funding: 21st Century Education Revitalization Project (No. 301703201).
Abstract: A novel idea, called the optimal shape subspace (OSS), is first proposed for optimizing the active shape model (ASM) search. It is constructed from the principal shape subspace and the principal shape variance subspace. It allows the reconstructed shape to vary more than a shape reconstructed in the standard ASM shape space, and is hence more expressive in representing real-life shapes. A cost function is then developed based on a study of the search process, and an optimal searching method using the feedback information provided by the evaluation cost is proposed to improve the performance of ASM alignment. Experimental results show that the proposed OSS offers maximum shape variation while preserving the principal information, and that a unique locally optimal shape is acquired after optimal searching. The combination of OSS and optimal searching improves ASM performance greatly.
Funding: supported by the Specialized Research Fund for the Doctoral Program of Higher Education of China (20110022120004) and the Fundamental Research Funds for the Central Universities.
Abstract: For density inversion of gravity anomaly data, once the inversion method is determined, the main factors affecting the inversion result are the inversion parameters and the subdivision scheme. A reasonable set of inversion parameters and a suitable subdivision scheme not only improve the efficiency of the inversion process but also ensure the accuracy of the inversion result. The gravity inversion method based on correlation searching and the golden section algorithm is an effective potential field inversion method; it can be used to invert 2D and 3D physical properties with potential field data observed on flat or rough surfaces. In this paper, we introduce in detail the density inversion principles based on correlation searching and the golden section algorithm. Considering that the golden section algorithm is not globally optimal, we present a heuristic method to ensure that the inversion result is globally optimized. With a series of model tests, we systematically compare and analyze the inversion efficiency and accuracy under different parameters. Based on the model test results, we conclude with selection principles for each inversion parameter with which the inversion accuracy can be markedly improved.
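The golden section step the method builds on is the standard one-dimensional bracketing minimizer; a minimal sketch of that textbook algorithm (not the authors' inversion code):

```python
import math

def golden_section_min(f, a, b, tol=1e-8):
    """Locate the minimum of a unimodal function f on [a, b] by
    repeatedly shrinking the bracket in the golden ratio."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                      # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                      # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return (a + b) / 2.0
```

Because this converges only to a local minimum of the bracketed interval, a multi-start or heuristic wrapper of the kind the abstract describes is needed to pursue a globally optimized result.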
Funding: The project was supported by the National Natural Science Foundation of China (59875045) and the State Key Laboratory of Automobile Safety and Energy Saving (K9705).
Abstract: A new contact searching algorithm for contact-impact systems is proposed in this paper. Using a cell structure and a linked list, the algorithm solves the problem of sorting and searching contacts in three dimensions by transforming it into a retrieval process over two one-dimensional arrays, so binary searching is no longer required. With this algorithm, the cost of contact searching is reduced to the order of O(N) instead of the O(N log2 N) of traditional methods, where N is the number of nodes in the system. Moreover, the algorithm can handle contact systems with arbitrary mesh layouts. Due to its simplicity, it can easily be implemented in a dynamic explicit finite element program. Our numerical experimental results show that this algorithm is reliable and efficient for contact searching in three-dimensional systems.
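The cell-structure idea amounts to bucketing nodes into a uniform grid so that candidate contact partners are retrieved from the 27 surrounding cells rather than by global sorting, giving O(N) overall cost. A minimal hash-grid sketch of the idea (a dictionary stand-in for the paper's array-and-linked-list scheme):

```python
from collections import defaultdict

def build_grid(points, cell):
    """Bucket 3-D points into a uniform grid of edge length `cell`."""
    grid = defaultdict(list)
    for idx, (x, y, z) in enumerate(points):
        grid[(int(x // cell), int(y // cell), int(z // cell))].append(idx)
    return grid

def nearby(points, grid, cell, idx):
    """Candidate contact partners of point idx: every point stored
    in the 27 cells surrounding (and including) its own cell."""
    x, y, z = points[idx]
    cx, cy, cz = int(x // cell), int(y // cell), int(z // cell)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                out.extend(j for j in grid.get((cx + dx, cy + dy, cz + dz), ())
                           if j != idx)
    return out
```

Choosing the cell edge on the order of the largest contact search radius keeps each cell's occupancy roughly constant, which is what makes the total cost linear in N.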
Funding: supported by the National Natural Science Foundation of China (Grant No. 61004109) and the Fundamental Research Funds for the Central Universities of China (Grant No. FRF-TP-12-071A).
Abstract: Existing methods of local search mostly focus on how to reach an optimal solution. However, in some emergency situations, search time is the hard constraint for the job shop scheduling problem while an optimal solution is not necessary. In such situations, existing local search methods are not fast enough. This paper presents an emergency local search (ELS) approach that can reach a feasible and nearly optimal solution within limited search time, which is desirable for the aforementioned emergency situations where search time is limited and a nearly optimal solution is sufficient. The approach consists of three phases. First, to reach a feasible and nearly optimal solution, infeasible solutions are repaired, and a repair technique named group repair is proposed. Second, to save time, the number of local search moves must be reduced; this is achieved by a quick search method named critical path search (CPS). Finally, CPS sometimes stops at a solution far from the optimum; to escape this search dilemma, a jump technique based on the critical part is used to improve CPS. Furthermore, a scheduling system based on ELS has been developed, and experiments with this system were run on an Intel Pentium(R) 2.93 GHz computer. The experimental results show that optimal solutions of small-scale instances are reached in 2 s, and nearly optimal solutions of large-scale instances are reached in 4 s. The proposed ELS approach stably reaches nearly optimal solutions within manageable search time and can be applied in emergency situations.
Abstract: We present the design and performance of a home-built scanning tunneling microscope (STM) that is compact (66 mm tall and 25 mm in diameter) yet equipped with a 3D atomic-precision piezoelectric motor, in which the Z coarse approach relies on a highly simple friction-type walker (of our own invention) driven by an axially cut piezoelectric tube. The walker is vertically inserted in a piezoelectric scanner tube (PST), with its brim lying flat on the PST end, acting as the inertial slider (driven by the PST) for XZ (sample plane) motion. The STM is designed to search for rare microscopic targets (defects, dopants, boundaries, nano-devices, etc.) over a macroscopic sample area (square millimeters) under the extreme conditions (low temperatures, strong magnetic fields, etc.) in which it fits. It produces good atomic-resolution images when scanning a highly oriented pyrolytic graphite sample in air at room temperature.
Funding: supported by the National Natural Science Foundation of China (61271250, 61571460).
Abstract: A hybrid carrier (HC) scheme based on the weighted-type fractional Fourier transform (WFRFT) has been proposed recently. While most works focus on the HC scheme's inherent characteristics, little attention has been paid to WFRFT modulation recognition. In this paper, a new theory is provided to recognize WFRFT modulation based on higher-order cumulants (HOC). First, it is deduced that the optimal WFRFT receiving order can be obtained through minimization of the 4th-order cumulant C42. Then, a combinatorial searching algorithm is designed to minimize C42. Finally, simulation results show that the designed scheme has a high recognition rate and that the combinatorial searching algorithm is effective and reliable.
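The statistic being minimized is the standard 4th-order cumulant C42 = M42 − |M20|² − 2·M21², built from the sample moments of a zero-mean complex signal. A minimal sample estimator, following the usual HOC definitions rather than any notation specific to this paper:

```python
def c42(samples):
    """Sample estimate of the 4th-order cumulant C42 for a
    zero-mean complex signal: C42 = M42 - |M20|^2 - 2*M21^2."""
    n = len(samples)
    m20 = sum(x * x for x in samples) / n          # E[x^2]
    m21 = sum(abs(x) ** 2 for x in samples) / n    # E[|x|^2]
    m42 = sum(abs(x) ** 4 for x in samples) / n    # E[|x|^4]
    return m42 - abs(m20) ** 2 - 2.0 * m21 ** 2
```

For unit-power constellations this reproduces the well-known theoretical values, e.g. −2 for BPSK and −1 for QPSK, which is what makes C42 useful as a discriminating feature across candidate receiving orders.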
Abstract: The Artificial Searching Swarm Algorithm (ASSA) is a new optimization algorithm. ASSA simulates soldiers searching for an enemy's important target, transforming the process of solving an optimization problem into the process of a searching swarm with set rules seeking an optimal goal. This work selects complicated, high-dimensional functions to analyze in depth the performance on unconstrained and constrained optimization problems, and the results produced by ASSA, the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Artificial Fish-Swarm Algorithm (AFSA) are compared. The main factors influencing the performance of ASSA are also discussed. The results demonstrate the effectiveness of the proposed ASSA optimization algorithm.
Abstract: Two deficiencies in traditional iterative closest point simultaneous localization and mapping (ICP-SLAM) usually result in poor real-time performance. On the one hand, the relative position between the current scan frame and the global map cannot be known in advance, so the ICP algorithm takes a large number of iterations to reach convergence. On the other hand, correspondences are established by global searching, which requires enormous computational time. To overcome these two problems, a fast ICP-SLAM with rough alignment and narrowing-scale nearby searching is proposed. To decrease the number of iterations, rough alignment based on an initial pose matrix is proposed: the initial pose matrix is obtained from a micro-electro-mechanical system (MEMS) magnetometer and global landmarks, and rough alignment is applied between the current scan frame and the global map at the start of the ICP algorithm. To accelerate the establishment of correspondences, narrowing-scale nearby searching with a dynamic threshold is proposed, in which match points are found within a progressively narrowing range. Compared with traditional ICP-SLAM, the experimental results show that the number of iterations needed for the ICP algorithm to converge is reduced by 92.34% and the ICP runtime is reduced by 98.86% on average. In addition, the computational cost is kept at a stable level because the accumulation of computational consumption is eliminated. Moreover, great improvements are also achieved in SLAM quality and robustness.
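The nearby-searching idea can be sketched as nearest-neighbor matching restricted to a distance threshold: candidates outside the current radius are never considered, and the radius is tightened as the alignment improves. This 2-D toy illustrates the thresholded matching step only (function and variable names are hypothetical, and the paper's scheme is more elaborate):

```python
def match_within(src, dst, radius):
    """For each 2-D source point, find the nearest destination point
    strictly within `radius`; points with no candidate in range are
    skipped rather than matched globally."""
    pairs = []
    for i, (sx, sy) in enumerate(src):
        best, best_d = None, radius
        for j, (dx, dy) in enumerate(dst):
            d = ((sx - dx) ** 2 + (sy - dy) ** 2) ** 0.5
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            pairs.append((i, best))
    return pairs
```

Calling this with a radius that shrinks from one ICP iteration to the next mimics the dynamic threshold: early iterations tolerate coarse misalignment, later ones only accept tight correspondences.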
Funding: This paper was supported by the Major National Science and Technology Program under Grant No. 2011ZX03005-002, the National Natural Science Foundation of China under Grant No. 61100233, and the Fundamental Research Funds for the Central Universities under Grant No. K50510030010.
Abstract: A Wireless Mesh Network (WMN) is seen as an effective Internet access solution for dynamic wireless applications. Given the low mobility of mesh routers in a WMN, the backbone topology can be effectively maintained by a proactive routing protocol. Previous proposals such as the Tree-Based Routing (TBR) protocol and the Root-Driven Routing (RDR) protocol are so centralized that they make the gateway a bottleneck, which severely restricts network performance. We propose an Optimized Tree-based Routing (OTR) protocol that logically separates the proactive tree into pieces, so that routes are partly computed by the branches instead of the root. We also discuss the operation of multiple Internet gateways, which is a main issue in WMNs. The new proposal lightens the load on the root, reduces the overhead, and improves the throughput. Numerical analysis and simulation results confirm that the performance of the WMN is improved and that OTR is more suitable for large-scale WMNs.
Funding: the National Natural Science Foundation of China (60773065).
Abstract: The current Grover quantum searching algorithm cannot identify differences in the importance of search targets when applied to an unsorted quantum database, and the probability for each search target is equal. To solve this problem, a Grover searching algorithm based on weighted targets is proposed. First, each target is endowed with a weight coefficient according to its importance. Applying these weight coefficients, the targets are represented as quantum superposition states. Second, a novel Grover searching algorithm based on the quantum superposition of the weighted targets is constructed. Using this algorithm, the probability of obtaining each target approximates its corresponding weight coefficient, which shows the flexibility of the algorithm. Finally, the validity of the algorithm is proved by a simple searching example.
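The encoding step underlying such weighting — mapping weight coefficients to amplitudes so that measurement probabilities track the weights — can be illustrated classically. This toy computes the amplitudes a_i = sqrt(w_i / Σw), for which |a_i|² = w_i / Σw; it is an illustration of the superposition's statistics, not a quantum implementation of the paper's algorithm:

```python
import math

def weighted_amplitudes(weights):
    """Encode target weights as superposition amplitudes:
    a_i = sqrt(w_i / sum(w)), so the measurement probability
    |a_i|^2 equals the normalized weight w_i / sum(w)."""
    total = sum(weights)
    return [math.sqrt(w / total) for w in weights]
```

With weights (1, 2, 3), for example, the third target is measured with probability 1/2, which is the behavior the abstract describes: outcome probabilities approximating the weight coefficients.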
Abstract: A sensor virus is a serious threat, as an attacker can compromise an entire sensor network by simply sending a single packet. Epidemics become drastic with link additions among sensors when small-world phenomena occur. Two immunization strategies, uniform immunization and temporary immunization, are applied to small worlds of tree-based wireless sensor networks to combat sensor viruses. With the former strategy, the infection extends exponentially, although the immunization effectively reduces the contagion speed. With the latter strategy, recurrent contagion oscillations occur in the small world when the spatial-temporal dynamics of the epidemic are considered; the oscillations arise from the small-world structure and the temporary immunization. Mathematical analyses on the small world of the Cayley tree are presented to reveal the epidemic dynamics under the two immunization strategies.