Journal Articles
60,363 articles found
1. Distributed Sampling Measurement Model in a Large-Scale High-Speed IP Networks (Cited by: 1)
Authors: 龚俭, 程光. Journal of Southeast University (English Edition), EI CAS, 2002, No. 1, pp. 40-45.
The distributed passive measurement is an important technology for network behavior research. To achieve a consistent measurement, the same packets should be sampled at distributed measurement points, and in order to estimate the character of traffic statistics, the traffic sample should be statistically random. A distributed sampling-mask measurement model is introduced to tackle the difficulty of measuring the full trace of high-speed networks. The key point of the model is to choose bits that are suitable to serve as the sampling mask. In this paper, the bit entropy and bit-flow entropy of IP packet headers in the CERNET backbone are analyzed, and we find that the 16 bits of the identification field in the IP packet header are fit to be the matching field of the sampling mask. Measurement traffic can also be used to analyze the statistical character of the measurement sample and the randomness of the model. The experimental results indicate that the model has good sampling performance.
Keywords: sampling measurement, bit entropy, matching field, identification field
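For orientation, the mask-matching rule the abstract describes can be sketched as follows. The specific mask and match values here are illustrative assumptions, not taken from the paper; the idea is only that every measurement point applies the same test to the IP identification field, so the same packets are selected everywhere.

```python
def sample_packet(ip_id: int, mask: int = 0x00FF, match: int = 0x0037) -> bool:
    """Sample a packet iff the masked bits of its 16-bit IP identification
    field equal the match value (consistent across measurement points)."""
    return (ip_id & mask) == match

# A mask covering 8 of the 16 identification bits gives an expected
# sampling rate of 1/256 when those bits are uniformly distributed.
sampled = [i for i in range(65536) if sample_packet(i)]
rate = len(sampled) / 65536
```

Because the decision depends only on packet content, no coordination between measurement points is needed to obtain a consistent sample.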
2. Large-scale laboratory investigation of pillar-support interaction
Authors: Akash Chaurasia, Gabriel Walton, Sankhaneel Sinha, Timothy J. Batchler, Kieran Moore, Nicholas Vlachopoulos, Bradley Forbes. Journal of Rock Mechanics and Geotechnical Engineering, 2025, No. 1, pp. 71-93.
Underground mine pillars provide natural stability to the mine area, allowing safe operations for workers and machinery. Extensive prior research has been conducted to understand pillar failure mechanics and design safe pillar layouts. However, limited studies (mostly based on empirical field observation and small-scale laboratory tests) have considered pillar-support interactions under monotonic loading conditions for the design of pillar-support systems. This study used a series of large-scale laboratory compression tests on porous limestone blocks to analyze rock and support behavior at a sufficiently large scale (specimens with an edge length of 0.5 m) for the incorporation of actual support elements, with consideration of different width-to-height (w/h) ratios. Both unsupported and supported (grouted rebar rockbolt and wire mesh) tests were conducted, and the surface deformations of the specimens were monitored using three-dimensional (3D) digital image correlation (DIC). Rockbolts instrumented with distributed fiber-optic strain sensors were used to study rockbolt strain distribution, load mobilization, and localized deformation at different w/h ratios. Both axial and bending strains were observed in the rockbolts, which became more prominent in the post-peak region of the stress-strain curve.
Keywords: grouted rockbolt, welded wire mesh, porous limestone, digital image correlation, distributed fiber optic sensing, large-scale laboratory tests
3. Research on the Large-scale Network Intrusion Mode based on Principal Component Analysis and Drop Quality Sampling
Authors: Yanmei Zhang. International Journal of Technology Management, 2016, No. 7, pp. 8-11.
In this paper, we conduct research on the large-scale network intrusion mode based on principal component analysis and drop quality sampling. With the growth of network security issues, intrusion detection has become a study hotspot. There are two main types of intrusion detection technology: misuse detection and anomaly detection. Misuse detection can detect attacks more accurately but has a high non-response rate, while anomaly detection can detect unknown attacks but has a higher rate of false positives. The network intrusion detection problem can be summed up as a discrimination problem on network data flows, namely judging whether a network data flow is normal or malicious; in this sense the intrusion detection problem can be understood as a pattern recognition problem. Our research integrates PCA and sampling techniques to propose a new approach to IDS that is innovative and will promote the development of the corresponding techniques.
Keywords: network intrusion, principal component analysis, drop quality sampling, scale
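The PCA step the abstract relies on can be sketched in a few lines. This is a generic eigendecomposition-based PCA on synthetic flow features, not the paper's pipeline; the data shapes and component count are assumptions for illustration.

```python
import numpy as np

def pca_reduce(X: np.ndarray, k: int) -> np.ndarray:
    """Project feature vectors onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                     # center each feature
    cov = np.cov(Xc, rowvar=False)              # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]   # top-k eigenvectors
    return Xc @ top

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                  # e.g. 200 flows, 10 features
Z = pca_reduce(X, 3)                            # reduced to 3 dimensions
```

A classifier for normal vs. malicious flows would then be trained on the reduced representation `Z` rather than on the raw features.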
4. A Two-Layer Encoding Learning Swarm Optimizer Based on Frequent Itemsets for Sparse Large-Scale Multi-Objective Optimization (Cited by: 1)
Authors: Sheng Qi, Rui Wang, Tao Zhang, Xu Yang, Ruiqing Sun, Ling Wang. IEEE/CAA Journal of Automatica Sinica, SCIE EI CSCD, 2024, No. 6, pp. 1342-1357.
Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs) where most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Nevertheless, existing optimizers often focus on locating non-zero variable positions to optimize the binary variable Mask. However, approximating the sparse distribution of real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal the correlation between data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent terms of multiple particles with better target values to find Mask combinations that can obtain better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in terms of performance and convergence speed.
Keywords: evolutionary algorithms, learning swarm optimization, sparse large-scale optimization, sparse large-scale multi-objective problems, two-layer encoding
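The two-layer encoding the abstract refers to can be illustrated in one step: a binary Mask marks the non-zero positions and a real-valued Dec carries their magnitudes, so the actual solution is their element-wise product. The dimension and sparsity level below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000                                    # decision-space dimension (assumed)

mask = (rng.random(n) < 0.05).astype(float) # binary layer: ~5% positions active
dec = rng.uniform(-1.0, 1.0, size=n)        # real layer: candidate magnitudes
x = mask * dec                              # decoded solution is sparse

sparsity = 1.0 - np.count_nonzero(x) / n
```

Optimizing `mask` (which positions are non-zero) and `dec` (their values) as separate layers is what lets such algorithms exploit the sparsity of Pareto optimal solutions.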
5. How do the landslide and non-landslide sampling strategies impact landslide susceptibility assessment? A catchment-scale case study from China (Cited by: 2)
Authors: Zizheng Guo, Bixia Tian, Yuhang Zhu, Jun He, Taili Zhang. Journal of Rock Mechanics and Geotechnical Engineering, SCIE CSCD, 2024, No. 3, pp. 877-894.
The aim of this study is to investigate the impacts of the sampling strategies for landslide and non-landslide data on the performance of landslide susceptibility assessment (LSA). The study area is the Feiyun catchment in Wenzhou City, Southeast China. Two types of landslide samples, combined with seven non-landslide sampling strategies, resulted in a total of 14 scenarios. The corresponding landslide susceptibility map (LSM) for each scenario was generated using the random forest model. The receiver operating characteristic (ROC) curve and statistical indicators were calculated and used to assess the impact of the dataset sampling strategy. The results showed that higher accuracies were achieved when using the landslide core as positive samples, combined with non-landslide sampling from the very low zone or buffer zone. The results reveal the influence of landslide and non-landslide sampling strategies on the accuracy of LSA, which provides a reference for subsequent researchers aiming to obtain a more reasonable LSM.
Keywords: landslide susceptibility, sampling strategy, machine learning, random forest, China
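The ROC-based evaluation used to compare sampling scenarios can be sketched without any modeling machinery: the area under the ROC curve equals the probability that a randomly chosen positive (landslide) cell receives a higher susceptibility score than a randomly chosen negative one, computable via the rank-sum statistic. The synthetic scores below are assumptions for illustration, not the paper's random-forest outputs.

```python
import numpy as np

def roc_auc(scores_pos: np.ndarray, scores_neg: np.ndarray) -> float:
    """AUC via the Mann-Whitney rank-sum statistic (assumes no tied scores)."""
    all_scores = np.concatenate([scores_pos, scores_neg])
    ranks = all_scores.argsort().argsort() + 1   # 1-based ranks
    r_pos = ranks[: len(scores_pos)].sum()
    n_p, n_n = len(scores_pos), len(scores_neg)
    return (r_pos - n_p * (n_p + 1) / 2) / (n_p * n_n)

rng = np.random.default_rng(0)
pos = rng.normal(1.0, 1.0, 500)   # susceptibility scores at landslide cells
neg = rng.normal(0.0, 1.0, 500)   # scores at non-landslide cells
auc = roc_auc(pos, neg)
```

Comparing this AUC across the 14 sampling scenarios is exactly the kind of head-to-head evaluation the abstract describes.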
6. Assessment of Wet Season Precipitation in the Central United States by the Regional Climate Simulation of the WRFG Member in NARCCAP and Its Relationship with Large-Scale Circulation Biases (Cited by: 1)
Authors: Yating ZHAO, Ming XUE, Jing JIANG, Xiao-Ming HU, Anning HUANG. Advances in Atmospheric Sciences, SCIE CAS CSCD, 2024, No. 4, pp. 619-638.
Assessment of past-climate simulations by regional climate models (RCMs) is important for understanding the reliability of RCMs when used to project future regional climate. Here, we assess the performance, and discuss possible causes of biases, of a WRF-based RCM with a grid spacing of 50 km, named WRFG, from the North American Regional Climate Change Assessment Program (NARCCAP), in simulating wet season precipitation over the Central United States for a period when observational data are available. The RCM reproduces key features of the precipitation distribution during late spring to early summer, although it tends to underestimate the magnitude of precipitation. This dry bias is partially due to the model's lack of skill in simulating nocturnal precipitation, related to the lack of eastward-propagating convective systems in the simulation. Inaccuracy in reproducing the large-scale circulation and environmental conditions is another contributing factor. The simulated pressure gradient between the Rocky Mountains and the Gulf of Mexico is too weak, resulting in weaker southerly winds in between and a reduction of warm moist air transport from the Gulf to the Central Great Plains. The simulated low-level horizontal convergence fields are also less favorable for upward motion than in the NARR, and hence for the development of moist convection. Therefore, a careful examination of an RCM's deficiencies and identification of the sources of error are important when using the RCM to project precipitation changes in future climate scenarios.
Keywords: NARCCAP, Central United States, precipitation, low-level jet, large-scale environment, diurnal variation
7. Research on a Monte Carlo global variance reduction method based on an automatic importance sampling method (Cited by: 1)
Authors: Yi-Sheng Hao, Zhen Wu, Shen-Shen Gao, Rui Qiu, Hui Zhang, Jun-Li Li. Nuclear Science and Techniques, SCIE EI CAS CSCD, 2024, No. 5, pp. 200-215.
Global variance reduction is a bottleneck in Monte Carlo shielding calculations. The global variance reduction problem requires that the statistical error over the entire space be uniform. This study proposes a grid-AIS method for the global variance reduction problem based on the automatic importance sampling (AIS) method, implemented in the Monte Carlo program MCShield. The proposed method was validated using the VENUS-Ⅲ international benchmark problem and a self-shielding calculation example. The results from the VENUS-Ⅲ benchmark problem showed that the grid-AIS method achieved a significant reduction in the variance of the statistical errors of the MESH grids, decreasing from 1.08×10^(-2) to 3.84×10^(-3), a 64.00% reduction. This demonstrates that the grid-AIS method is effective in addressing global problems. The results of the self-shielding calculation demonstrate that the grid-AIS method produced accurate computational results. Moreover, the grid-AIS method exhibited a computational efficiency approximately one order of magnitude higher than that of the AIS method and approximately two orders of magnitude higher than that of the conventional Monte Carlo method.
Keywords: Monte Carlo, global variance reduction, reactor shielding, automatic importance sampling
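The core idea behind importance sampling, drawing from a distribution concentrated where the score is non-zero and reweighting by the likelihood ratio, can be shown on a toy rare-event problem. This is a generic sketch, not the grid-AIS method: estimating P(X > 4) for a standard normal X (true value about 3.17e-5) stands in for a deep-penetration tally.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Plain Monte Carlo: almost no samples land in the tail, so the
# estimate has huge relative variance.
x = rng.normal(size=N)
p_mc = np.mean(x > 4)

# Importance sampling: draw from N(4, 1), centered on the rare region,
# and reweight each sample by the density ratio p(y)/q(y).
y = rng.normal(4.0, 1.0, size=N)
w = np.exp(-0.5 * y**2) / np.exp(-0.5 * (y - 4.0) ** 2)
p_is = np.mean((y > 4) * w)
```

With the same sample budget, the importance-sampling estimate lands within about one percent of the true tail probability, while the plain estimate typically sees only a handful of tail events.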
8. Some results of modeling D-D seismogenic pattern by the fracture model experiment of large-scale rock samples (I) (Cited by: 2)
Authors: 陆阳泉, 赵家骝, 钱家栋, 王玉祥, 刘建毅. Acta Seismologica Sinica (English Edition), CSCD, 1998, No. 2, pp. 95-102.
Using natural limestone samples taken from the field, with dimensions of 500 mm × 500 mm × 1000 mm, the D-D (dilatancy-diffusion) seismogeny pattern was modeled under the condition of water injection, observing the time-space evolutionary features of the relevant physical fields of the loaded samples, from deformation and formation of microcracks to the occurrence of the main rupture. The results of the observed apparent resistivity show: ① the process of deformation from microcrack to main rupture of the loaded rock sample could be characterized by precursory spatial-temporal changes in the observed apparent resistivity; ② the precursory temporal changes in apparent resistivity could be divided into several stages, and their spatial distribution differs in different parts of the rock sample; ③ before the main rupture of the rock sample, obvious "tendency anomalies" and "short-term anomalies" were observed, and some of them could likely be considered "impending earthquake" anomaly precursors of apparent resistivity. The changes and distribution features of apparent resistivity show that they are intrinsically related to the dilatancy phenomenon of the loaded rock sample. Finally, this paper discusses the mechanism of the resistivity change of the loaded rock sample theoretically.
Keywords: fracture experiment of large-scale rock sample, D-D seismogenic pattern, apparent resistivity
9. Enhancing Evolutionary Algorithms With Pattern Mining for Sparse Large-Scale Multi-Objective Optimization Problems
Authors: Sheng Qi, Rui Wang, Tao Zhang, Weixiong Huang, Fan Yu, Ling Wang. IEEE/CAA Journal of Automatica Sinica, SCIE EI CSCD, 2024, No. 8, pp. 1786-1801.
Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large-scale property means high dimensionality of the decision space, requiring algorithms to traverse a vast expanse with limited computational resources. Furthermore, in the context of sparsity, most variables in Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This substantial enhancement dramatically improves the efficacy of frequent pattern mining. In this way, selecting candidate sets is no longer based on the quantity of non-zero variables they contain but on a higher proportion of non-zero variables within specific dimensions. Additionally, we unveil a novel approach to association rule mining, which delves into the intricate relationships between non-zero variables. This methodology aids in identifying sparse distributions that can potentially expedite reductions in the objective function value. We extensively tested our algorithm across eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Keywords: evolutionary algorithms, pattern mining, sparse large-scale multi-objective problems (SLMOPs), sparse large-scale optimization
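The frequent-pattern-mining step such methods build on can be sketched minimally: treat each elite solution as the set of its non-zero variable indices and count which index pairs co-occur often. The elite sets and support threshold below are illustrative assumptions, not data from the paper.

```python
from collections import Counter
from itertools import combinations

# Elite sparse solutions, each represented by the set of indices of
# its non-zero decision variables (illustrative data).
elites = [
    {1, 4, 7},
    {1, 4, 9},
    {1, 4, 7, 9},
    {2, 4, 7},
]

def frequent_pairs(solutions, min_support=3):
    """Mine variable-index pairs co-occurring in at least `min_support`
    solutions (a minimal frequent-itemset pass, pairs only)."""
    counts = Counter()
    for s in solutions:
        counts.update(combinations(sorted(s), 2))
    return {pair for pair, c in counts.items() if c >= min_support}

pairs = frequent_pairs(elites)
```

Mined pairs like these suggest which variables tend to be non-zero together in good solutions, which is the correlation information the abstract's association-rule step exploits.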
10. Multivariate form of Hermite sampling series
Authors: Rashad M. Asharabi. Applied Mathematics (A Journal of Chinese Universities), SCIE CSCD, 2024, No. 2, pp. 253-265.
In this paper, we establish a new multivariate Hermite sampling series involving samples from the function itself and its mixed and non-mixed partial derivatives of arbitrary order. This multivariate form of Hermite sampling is valid for some classes of multivariate entire functions satisfying certain growth conditions. We show that many known results, including those in Commun Korean Math Soc, 2002, 17: 731-740; Turk J Math, 2017, 41: 387-403; and Filomat, 2020, 34: 3339-3347, are special cases of our results. Moreover, we estimate the truncation error of this sampling based on localized sampling without decay assumptions. Illustrative examples are also presented.
Keywords: multidimensional sampling series, sampling with partial derivatives, contour integral, truncation error
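For context, the classical univariate Hermite sampling series that such multivariate results generalize reconstructs a function from integer samples of the function and its first derivative. One standard form (stated here for suitable entire functions of exponential type, bounded on the real line; the precise function classes treated in the paper differ) is:

```latex
f(t) = \sum_{n=-\infty}^{\infty} \Big[ f(n) + (t-n)\, f'(n) \Big]\, \operatorname{sinc}^{2}(t-n),
\qquad \operatorname{sinc}(t) := \frac{\sin \pi t}{\pi t}.
```

Compared with the Shannon series, using derivative samples doubles the information per node, which is what permits the wider bandwidth (or, equivalently, the coarser sampling grid).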
11. Large-scale model testing of high-pressure grouting reinforcement for bedding slope with rapid-setting polyurethane
Authors: ZHANG Zhichao, TANG Xuefeng, LIU Kan, YE Longzhen, HE Xiang. Journal of Mountain Science, SCIE CSCD, 2024, No. 9, pp. 3083-3093.
A bedding slope is a typical heterogeneous slope consisting of different soil/rock layers and is likely to slide along the weakest interface. Conventional slope protection methods for bedding slopes, such as retaining walls, stabilizing piles, and anchors, are time-consuming and labor- and energy-intensive. This study proposes an innovative polymer grout method to improve the bearing capacity and reduce the displacement of bedding slopes. A series of large-scale model tests were carried out to verify the effectiveness of polymer grout in protecting bedding slopes. Specifically, load-displacement relationships and failure patterns were analyzed for different testing slopes with various dosages of polymer. Results show the great potential of polymer grout in improving bearing capacity, reducing settlement, and protecting slopes from being crushed under shearing. The polymer-treated slopes remained structurally intact, while the untreated slope exhibited considerable damage when subjected to loads surpassing the bearing capacity. It was also found that polymer-cemented soils concentrate around the injection pipe, forming a fan-shaped, sheet-like structure. This study proves the improvement provided by polymer grouting for bedding slope treatment and will contribute to the development of a fast method to protect bedding slopes from landslides.
Keywords: polyurethane, bedding slope, grouting, slope protection, large-scale model test
12. Modified DS np Chart Using Generalized Multiple Dependent State Sampling under Time Truncated Life Test
Authors: Wimonmas Bamrungsetthapong, Pramote Charongrattanasakul. Computer Modeling in Engineering & Sciences, SCIE EI, 2024, No. 3, pp. 2471-2495.
This study presents the design of a modified attribute control chart based on a double sampling (DS) np chart applied in combination with generalized multiple dependent state (GMDS) sampling to monitor the mean life of a product, based on a time truncated life test employing the Weibull distribution. The developed control chart supports the examination of mean lifespan variation for a particular product in the manufacturing process. Three control limit levels are used: the warning control limit, inner control limit, and outer control limit. Together, they enhance the capability for variation detection. A genetic algorithm can be used for optimization of the in-control process, whereby the optimal parameters can be established for the proposed control chart. The control chart performance is assessed using the average run length, while the influence of the model parameters on the control chart solution is assessed via sensitivity analysis based on an orthogonal experimental design with multiple linear regression. A comparative study was conducted based on the out-of-control average run length, in which the developed control chart offered greater sensitivity in the detection of process shifts while using smaller samples on average than existing control charts. Finally, to exhibit the utility of the developed control chart, this paper presents its application using simulated data with parameters drawn from a real data set.
Keywords: modified DS np chart, generalized multiple dependent state sampling, time truncated life test, Weibull distribution, average run length, average sample size
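The average run length (ARL) metric the abstract uses can be illustrated on the plain np chart that the DS/GMDS design extends: when each sample signals independently with probability p_signal, the run to the first signal is geometric, so ARL = 1 / p_signal. The chart parameters below are illustrative assumptions, not the paper's optimized design.

```python
from math import comb

def np_chart_arl(n: int, p: float, ucl: int) -> float:
    """ARL of a basic np chart: with defect probability p and sample
    size n, a point signals when the defective count exceeds `ucl`;
    runs to first signal are geometric, so ARL = 1 / P(signal).
    (Sketch of the classical np chart, not the modified DS-np/GMDS chart.)"""
    p_no_signal = sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(ucl + 1))
    return 1.0 / (1.0 - p_no_signal)

arl = np_chart_arl(n=50, p=0.02, ucl=4)   # in-control ARL, roughly 310
```

A shifted (out-of-control) p lowers the ARL, which is exactly the sensitivity comparison the abstract reports for the modified chart.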
13. Online identification and extraction method of regional large-scale adjustable load-aggregation characteristics
Authors: Siwei Li, Liang Yue, Xiangyu Kong, Chengshan Wang. Global Energy Interconnection, EI CSCD, 2024, No. 3, pp. 313-323.
This article introduces the concept of load aggregation, which involves a comprehensive analysis of loads to acquire their external characteristics for the purpose of modeling and analyzing power systems. The online identification method is a computer-based approach to data collection, processing, and system identification, commonly used for adaptive control and prediction. This paper proposes a method for dynamically aggregating large-scale adjustable loads to support high proportions of new energy integration, aiming to study the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction methods. The experiment selected 300 central air conditioners as the research subject and analyzed their regulation characteristics, economic efficiency, and comfort. The experimental results show that as the adjustment time of the air conditioners increases from 5 minutes to 35 minutes, the stable adjustment quantity during the adjustment period decreases from 28.46 to 3.57, indicating that air-conditioning loads can be controlled over a long period and have better adjustment effects in the short term. Overall, the experimental results demonstrate that analyzing the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction algorithms is effective.
Keywords: load aggregation, regional large-scale, online recognition, feature extraction method
14. A semantic vector map-based approach for aircraft positioning in GNSS/GPS denied large-scale environment
Authors: Chenguang Ouyang, Suxing Hu, Fengqi Long, Shuai Shi, Zhichao Yu, Kaichun Zhao, Zheng You, Junyin Pi, Bowen Xing. Defence Technology, SCIE EI CAS CSCD, 2024, No. 4, pp. 1-10.
Accurate positioning is one of the essential requirements for numerous applications of remote sensing data, especially in the event of a noisy or unreliable satellite signal. Toward this end, we present a novel framework for aircraft geo-localization over a large range that requires only a downward-facing monocular camera, an altimeter, a compass, and an open-source Vector Map (VMAP). The algorithm combines matching and particle filter methods. A shape vector and the correlation between two building contour vectors are defined, and a coarse-to-fine building vector matching (CFBVM) method is proposed for the matching stage, in which the original matching results are described by a Gaussian mixture model (GMM). Subsequently, an improved resampling strategy is designed to reduce computing expenses with a huge number of initial particles, and a credibility indicator is designed to avoid location mistakes in the particle filter stage. An experimental evaluation of the approach based on flight data is provided. On a flight at a height of 0.2 km over a flight distance of 2 km, the aircraft is geo-localized within a reference map of 11,025 km² using 0.09 km² aerial images without any prior information. The absolute localization error is less than 10 m.
Keywords: large-scale positioning, building vector matching, improved particle filter, GPS-denied, vector map
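The resampling step inside a particle filter, which the abstract improves upon, is commonly implemented as systematic resampling: one uniform offset, then evenly spaced pointers through the cumulative weights. This is the textbook baseline, not the paper's improved strategy; the weight vector is an illustrative assumption.

```python
import numpy as np

def systematic_resample(weights: np.ndarray, rng) -> np.ndarray:
    """Systematic resampling: low-variance, O(N), returns the indices
    of the particles to keep (duplicating high-weight particles)."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n      # one shared offset
    cumsum = np.cumsum(weights / np.sum(weights))
    return np.searchsorted(cumsum, positions)

rng = np.random.default_rng(0)
weights = np.array([0.4, 0.4, 0.1, 0.05, 0.05])
idx = systematic_resample(weights, rng)
```

High-weight particles are duplicated roughly in proportion to their weight, which concentrates the particle set on plausible aircraft poses after each measurement update.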
15. RE-SMOTE: A Novel Imbalanced Sampling Method Based on SMOTE with Radius Estimation
Authors: Dazhi E, Jiale Liu, Ming Zhang, Huiyuan Jiang, Keming Mao. Computers, Materials & Continua, SCIE EI, 2024, No. 12, pp. 3853-3880.
Imbalance is a distinctive feature of many datasets, and how to balance a dataset has become a hot topic in the machine learning field. The Synthetic Minority Oversampling Technique (SMOTE) is the classical method for solving this problem. Although much research has been conducted on SMOTE, the problem of synthetic sample singularity remains. To address the issues of class imbalance and the diversity of generated samples, this paper proposes a hybrid resampling method for binary imbalanced data sets, RE-SMOTE, designed based on improvements of two oversampling methods: parameter-free SMOTE (PF-SMOTE) and SMOTE-Weighted Ensemble Nearest Neighbor (SMOTE-WENN). Initially, minority class samples are divided into safe and boundary minority categories. Boundary minority samples are regenerated through linear interpolation with the nearest majority class samples, while safe minority samples are randomly generated within a circular range centered on the initial safe minority samples, with a radius determined by the distance to the nearest majority class samples. Furthermore, we use Weighted Edited Nearest Neighbor (WENN) and relative density methods to clean the generated samples and remove low-quality samples. Relative density is calculated based on the ratio of majority to minority samples among the reverse k-nearest neighbor samples. To verify the effectiveness and robustness of the proposed model, we conducted a comprehensive experimental study on 40 datasets selected from real applications. The experimental results show the superiority of radius estimation SMOTE (RE-SMOTE) over other state-of-the-art methods. Code is available at: https://github.com/blue9792/RE-SMOTE (accessed on 30 September 2024).
Keywords: imbalanced data sampling, SMOTE, radius estimation
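The classical SMOTE step that RE-SMOTE builds on is a one-line idea: create a synthetic minority point by interpolating between a minority sample and one of its k nearest minority neighbors. The sketch below shows only that baseline step (with brute-force neighbor search on assumed synthetic data), not RE-SMOTE's safe/boundary split or radius-based generation.

```python
import numpy as np

def smote_like(X_min: np.ndarray, n_new: int, k: int, rng) -> np.ndarray:
    """Generate synthetic minority samples by linear interpolation
    between a random minority point and one of its k nearest
    minority-class neighbors (the classical SMOTE step)."""
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        neighbors = np.argsort(d)[1 : k + 1]        # skip the point itself
        j = rng.choice(neighbors)
        gap = rng.random()                           # position along the segment
        synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(synthetic)

rng = np.random.default_rng(0)
X_min = rng.normal(size=(20, 2))                     # minority-class points
X_new = smote_like(X_min, n_new=30, k=5, rng=rng)
```

Because every synthetic point lies on a segment between two minority samples, it stays inside the minority region, which is also why plain SMOTE can produce near-duplicate ("singular") samples that RE-SMOTE's radius estimation is designed to diversify.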
16. Enhancing Deep Learning Semantics: The Diffusion Sampling and Label-Driven Co-Attention Approach
Authors: Chunhua Wang, Wenqian Shang, Tong Yi, Haibin Zhu. Computers, Materials & Continua, SCIE EI, 2024, No. 5, pp. 1939-1956.
The advent of self-attention mechanisms within Transformer models has significantly propelled the advancement of deep learning algorithms, yielding outstanding achievements across diverse domains. Nonetheless, self-attention mechanisms falter when applied to datasets with intricate semantic content and extensive dependency structures. In response, this paper introduces a Diffusion Sampling and Label-Driven Co-attention Neural Network (DSLD), which adopts a diffusion sampling method to capture more comprehensive semantic information from the data. Additionally, the model leverages the joint correlation information of labels and data to introduce the computation of text representation, correcting semantic representation biases in the data and increasing the accuracy of semantic representation. Ultimately, the model computes the corresponding classification results by synthesizing these rich data semantic representations. Experiments on seven benchmark datasets show that our proposed model achieves competitive results compared to state-of-the-art methods.
Keywords: semantic representation, sampling attention, label-driven co-attention, attention mechanisms
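The self-attention building block the abstract starts from is scaled dot-product attention: a row-wise softmax over QKᵀ/√d applied to the values. The sketch below shows that standard mechanism on random tensors, not DSLD's diffusion-sampling or co-attention extensions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal single-head scaled dot-product attention:
    softmax(Q K^T / sqrt(d)) V, with a stable softmax."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # rows sum to one
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, dim 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))
out, attn = scaled_dot_product_attention(Q, K, V)
```

Each output row is a weighted mixture of the value rows, with the weights given by query-key similarity; DSLD's contribution is in how those weights are informed by diffusion sampling and label information.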
17. Large-Scale Multi-Objective Optimization Algorithm Based on Weighted Overlapping Grouping of Decision Variables
Authors: Liang Chen, Jingbo Zhang, Linjie Wu, Xingjuan Cai, Yubin Xu. Computer Modeling in Engineering & Sciences, SCIE EI, 2024, No. 7, pp. 363-383.
The large-scale multi-objective optimization algorithm (LSMOA), based on the grouping of decision variables, is an advanced method for handling high-dimensional decision variables. However, in practical problems the interaction among decision variables is intricate, leading to large group sizes and suboptimal optimization effects; hence a large-scale multi-objective optimization algorithm based on weighted overlapping grouping of decision variables (MOEAWOD) is proposed in this paper. Initially, the decision variables are perturbed and categorized into convergence and diversity variables; subsequently, the convergence variables are subdivided into groups based on the interactions among different decision variables. If the size of a group surpasses the set threshold, that group undergoes a process of weighted and overlapping grouping. Specifically, the interaction strength is evaluated based on the interaction frequency and the number of objectives among various decision variables. The decision variable with the highest interaction in the group is identified and set aside, and the remaining variables are then reclassified into subgroups. Finally, the decision variable with the strongest interaction is added to each subgroup. MOEAWOD minimizes the interactivity between different groups and maximizes the interactivity of decision variables within groups, which contributes to the optimized direction of convergence and diversity exploration within different groups. MOEAWOD was tested on 18 benchmark large-scale optimization problems, and the experimental results demonstrate the effectiveness of our method. Compared with other algorithms, our method retains an advantage.
Keywords: decision variable grouping, large-scale multi-objective optimization algorithms, weighted overlapping grouping, direction-guided evolution
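The perturbation-based interaction detection that grouping methods of this kind rely on can be sketched with a differential-grouping-style probe: variables i and j interact if the effect of perturbing them together differs from the sum of their individual effects. This is a generic separability test under assumed step sizes, not the paper's weighted overlapping grouping procedure.

```python
def interacts(f, i, j, x0, delta=1.0, eps=1e-6):
    """Return True if decision variables i and j are non-separable
    under f: perturbing both together changes f by more (or less)
    than the sum of the two individual perturbation effects."""
    x = list(x0)
    base = f(x)
    x[i] += delta
    d1 = f(x) - base            # effect of moving x_i alone
    x[j] += delta
    d12 = f(x)                  # value after moving both
    x[i] -= delta
    d2 = f(x) - base            # effect of moving x_j alone
    return abs((d12 - base) - (d1 + d2)) > eps

# Illustrative objective: the x0*x1 term couples variables 0 and 1,
# while x2 is additively separable from both.
f = lambda x: x[0] * x[1] + x[2] ** 2
```

Running such probes over variable pairs yields the interaction information from which convergence-variable groups (and, in MOEAWOD, their weighted overlaps) are built.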
18. Recent advances in protein conformation sampling by combining machine learning with molecular simulation
Authors: 唐一鸣, 杨中元, 姚逸飞, 周运, 谈圆, 王子超, 潘瞳, 熊瑞, 孙俊力, 韦广红. Chinese Physics B, SCIE EI CAS CSCD, 2024, No. 3, pp. 80-87.
The rapid advancement and broad application of machine learning (ML) have driven a groundbreaking revolution in computational biology. One of the most cutting-edge and important applications of ML is its integration with molecular simulations to improve the sampling efficiency of the vast conformational space of large biomolecules. This review focuses on recent studies that utilize ML-based techniques in the exploration of the protein conformational landscape. We first highlight the recent development of ML-aided enhanced sampling methods, including heuristic algorithms and neural networks designed to refine the selection of reaction coordinates for the construction of bias potentials, or to facilitate the exploration of unsampled regions of the energy landscape. Further, we review the development of autoencoder-based methods that combine molecular simulations and deep learning to expand the search for protein conformations. Lastly, we discuss cutting-edge methodologies for the one-shot generation of protein conformations with precise Boltzmann weights. Collectively, this review demonstrates the promising potential of machine learning in revolutionizing our insight into the complex conformational ensembles of proteins.
Keywords: machine learning; molecular simulation; protein conformational space; enhanced sampling
Download PDF
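The bias-potential idea behind the enhanced sampling methods surveyed above can be illustrated with a stdlib-only toy, in the spirit of metadynamics: a Metropolis walker on a 1D double well deposits history-dependent Gaussian hills, which gradually fill the well it occupies and push it over the barrier. All parameters (hill height, width, temperature) are illustrative assumptions, and no ML component is included; the ML methods in the review learn the reaction coordinate or the bias rather than using a fixed recipe like this.

```python
import math
import random

def double_well(x):
    """Toy potential with minima at x = ±1 and a barrier of height 1 at x = 0."""
    return (x * x - 1.0) ** 2

def biased_mc(steps=20000, kT=0.1, seed=7):
    """Metropolis sampling with a metadynamics-style history bias.
    Returns the rightmost position reached, to show barrier crossing."""
    rng = random.Random(seed)
    hills = []               # centers of deposited Gaussian hills
    w, sigma = 0.02, 0.2     # hill height and width (assumed values)

    def bias(x):
        return sum(w * math.exp(-(x - c) ** 2 / (2 * sigma ** 2)) for c in hills)

    x, x_max = -1.0, -1.0
    for step in range(steps):
        x_new = x + rng.uniform(-0.2, 0.2)
        dE = (double_well(x_new) + bias(x_new)) - (double_well(x) + bias(x))
        if dE <= 0 or rng.random() < math.exp(-dE / kT):
            x = x_new
        if step % 50 == 0:
            hills.append(x)  # deposit a hill at the current position
        x_max = max(x_max, x)
    return x_max

print(biased_mc())
```

At kT = 0.1 the unbiased barrier-crossing probability per attempt is roughly e^(-10), so an unbiased walker would rarely leave the left well in this many steps; the accumulated hills make the crossing routine, which is the point of a bias potential.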
A Large-Scale Group Decision Making Model Based on Trust Relationship and Social Network Updating
19
Authors: Rongrong Ren, Luyang Su, Xinyu Meng, Jianfang Wang, Meng Zhao 《Computer Modeling in Engineering & Sciences》 SCIE EI 2024, No. 1, pp. 429-458
With the development of big data and social computing, large-scale group decision making (LGDM) is now merging with social networks. Using social network analysis (SNA), this study proposes an LGDM consensus model that considers the trust relationships among decision makers (DMs). In the consensus-measurement process, the social network is constructed according to the social relationships among DMs, and the Louvain method is introduced to partition the network into subgroups. The weight of each decision maker and each subgroup is computed from comprehensive network weights and trust weights. In the consensus-improvement process, a feedback mechanism with four identification rules and two direction rules is designed to guide the adjustment. Based on the trust relationships among DMs, preferences are modified and the corresponding social network is updated to accelerate consensus. Compared with previous research, the proposed model not only allows subgroups to be reconstructed and updated during the adjustment process but also improves the accuracy of the adjustment through the feedback mechanism. Finally, an example analysis is conducted to verify the effectiveness and flexibility of the proposed method, and comparison with previous studies highlights its superiority in solving the LGDM problem.
Keywords: large-scale group decision making; social network updating; trust relationship; group consensus; feedback mechanism
Download PDF
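The weighting and measurement steps of such a consensus model can be sketched in plain Python: derive each DM's weight from the trust others place in them, aggregate the pairwise preference matrices with those weights, and score consensus as closeness to the group preference. This is a simplified stand-in for the paper's comprehensive network-plus-trust weights (no Louvain subgrouping or feedback rules), and the trust values and preference matrices are made up for illustration.

```python
def trust_weights(trust):
    """Weight each decision maker by the total trust others place in them."""
    n = len(trust)
    received = [sum(trust[j][i] for j in range(n) if j != i) for i in range(n)]
    total = sum(received)
    return [r / total for r in received]

def aggregate(prefs, weights):
    """Weighted average of the DMs' pairwise preference matrices."""
    n = len(prefs[0])
    return [[sum(w * p[r][c] for w, p in zip(weights, prefs))
             for c in range(n)] for r in range(n)]

def consensus_level(prefs, group):
    """1 minus the mean absolute deviation from the group preference."""
    n = len(group)
    dev = sum(abs(p[r][c] - group[r][c])
              for p in prefs for r in range(n) for c in range(n))
    return 1.0 - dev / (len(prefs) * n * n)

# Three DMs: trust[i][j] = how much DM i trusts DM j (illustrative values).
trust = [[0.0, 0.8, 0.4],
         [0.6, 0.0, 0.5],
         [0.7, 0.9, 0.0]]
prefs = [[[0.5, 0.7], [0.3, 0.5]],   # one pairwise preference matrix per DM
         [[0.5, 0.6], [0.4, 0.5]],
         [[0.5, 0.8], [0.2, 0.5]]]
weights = trust_weights(trust)
group = aggregate(prefs, weights)
print(round(consensus_level(prefs, group), 3))  # → 0.963
```

If this score fell below a preset threshold, the feedback mechanism of the paper would select which DMs adjust their preferences and in which direction, then update the network and re-measure.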
FPSblo: A Blockchain Network Transmission Model Utilizing Farthest Point Sampling
20
Authors: Longle Cheng, Xiru Li, Shiyu Fang, Wansu Pan, He Zhao, Haibo Tan, Xiaofeng Li 《Computers, Materials & Continua》 SCIE EI 2024, No. 2, pp. 2491-2509
Peer-to-peer (P2P) overlay networks provide message transmission capabilities for blockchain systems, and improving data transmission efficiency in P2P networks can greatly enhance the performance of blockchain systems. However, traditional blockchain P2P networks face a common challenge: a mismatch between the upper-layer traffic requirements and the underlying physical network topology. This mismatch results in redundant data transmission and inefficient routing, severely constraining the scalability of blockchain systems. To address these pressing issues, we propose FPSblo, an efficient transmission method for blockchain networks. Our inspiration for FPSblo stems from the farthest point sampling (FPS) algorithm, a well-established technique widely utilized in point cloud image processing. In this work, we treat blockchain nodes as points in a point cloud and select a representative set of nodes to prioritize message forwarding, so that messages reach the network edge quickly and are evenly distributed. We compare our model with the Kadcast transmission model, a classic improved model for blockchain P2P transmission networks; the experimental findings show that FPSblo reduces transmission redundancy by 34.8% and the overload rate by 37.6%. The experimental analysis shows that FPSblo enhances the transmission capability of P2P networks in blockchain.
Keywords: blockchain; P2P networks; scalability; farthest point sampling
Download PDF
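The farthest point sampling algorithm that FPSblo builds on is simple to state: starting from an arbitrary point, repeatedly select the point whose distance to the already-selected set is largest, so the chosen points spread evenly over the space. A minimal stdlib-only sketch (the classic geometric algorithm, not the paper's blockchain-specific variant; the points and metric are illustrative):

```python
def farthest_point_sampling(points, k, dist):
    """Greedy FPS: start from point 0, then repeatedly select the point
    farthest from the set already selected."""
    selected = [0]
    d = [dist(p, points[0]) for p in points]   # distance to the selected set
    while len(selected) < k:
        nxt = max(range(len(points)), key=lambda i: d[i])
        selected.append(nxt)
        d = [min(d[i], dist(points[i], points[nxt])) for i in range(len(points))]
    return selected

euclid = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
pts = [(0, 0), (0.1, 0), (10, 0), (10, 10), (0, 10), (5, 5)]
print(farthest_point_sampling(pts, 3, euclid))  # → [0, 3, 2]
```

Note how the near-duplicate point (0.1, 0) is never chosen: FPS favors mutually distant representatives, which is the property the paper exploits when analogizing nodes to points and prioritizing forwarding toward a well-spread node set.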