Journal Articles
248,846 articles found
1. Large-scale computational screening of metal–organic frameworks for D₂/H₂ separation (Cited: 2)
Authors: Fei Wang, Zhiyuan Bi, Lifeng Ding, Qingyuan Yang. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2023, No. 2, pp. 323-330 (8 pages)
Deuterium (D₂) is one of the important fuel sources that power nuclear fusion reactors. The existing D₂/H₂ separation technologies that obtain high-purity D₂ are cost-intensive. Recent research has shown that metal–organic frameworks (MOFs) have good potential for D₂/H₂ separation. In this work, a high-throughput computational screening of 12,020 computation-ready experimental MOFs is carried out to determine the best MOFs for hydrogen isotope separation. Meanwhile, the detailed structure-performance correlation is systematically investigated with the aid of machine learning. The results indicate that the ideal D₂/H₂ adsorption selectivity calculated from Henry coefficients is strongly correlated with the 1/ΔAD feature descriptor, that is, the inverse of the adsorbability difference of the two adsorbates. Meanwhile, the machine learning (ML) results show that the prediction accuracy of all four ML methods is significantly improved after the addition of this feature descriptor. In addition, the ML results based on an extreme gradient boosting model also reveal that the 1/ΔAD descriptor has the highest relative importance compared with other commonly used descriptors. To further explore hydrogen isotope separation in binary mixtures, 1548 MOFs with ideal adsorption selectivity greater than 1.5 are simulated at equimolar conditions. The structure-performance relationship shows that MOFs with high adsorption selectivity generally have smaller pore sizes (0.3-0.5 nm) and lower surface areas. Among the top 200 performers, the materials mainly have the sql, pcu, cds, hxl, and ins topologies. Finally, three MOFs with high D₂/H₂ selectivity and good D₂ uptake are identified as the best candidates, all of which have one-dimensional channel pores. The findings obtained in this work may be helpful for the identification of potentially promising candidates for hydrogen isotope separation.
Keywords: metal–organic frameworks; computational screening; hydrogen isotope separation
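A minimal sketch of the two screening quantities the abstract names, the Henry-coefficient-based ideal selectivity and the 1/ΔAD descriptor, assuming illustrative inputs rather than data from the paper:

```python
# Sketch of the screening metrics; all numeric values are hypothetical.
def ideal_selectivity(kh_d2, kh_h2):
    """Ideal D2/H2 adsorption selectivity: ratio of Henry coefficients."""
    return kh_d2 / kh_h2

def inv_delta_ad(ad_d2, ad_h2):
    """1/ΔAD: inverse of the adsorbability difference of the two adsorbates."""
    return 1.0 / abs(ad_d2 - ad_h2)

# hypothetical Henry coefficients (arbitrary units) for one candidate MOF
s = ideal_selectivity(3.0, 2.0)
print(s, s > 1.5)   # 1.5 False -> would not pass the paper's >1.5 screening cut
```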
2. Data-Driven Healthcare: The Role of Computational Methods in Medical Innovation
Authors: Hariharasakthisudhan Ponnarengan, Sivakumar Rajendran, Vikas Khalkar, Gunapriya Devarajan, Logesh Kamaraj. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 1-48 (48 pages)
The purpose of this review is to explore the intersection of computational engineering and biomedical science, highlighting the transformative potential this convergence holds for innovation in healthcare and medical research. The review covers key topics such as computational modelling, bioinformatics, machine learning in medical diagnostics, and the integration of wearable technology for real-time health monitoring. Major findings indicate that computational models have significantly enhanced the understanding of complex biological systems, while machine learning algorithms have improved the accuracy of disease prediction and diagnosis. The synergy between bioinformatics and computational techniques has led to breakthroughs in personalized medicine, enabling more precise treatment strategies. Additionally, the integration of wearable devices with advanced computational methods has opened new avenues for continuous health monitoring and early disease detection. The review emphasizes the need for interdisciplinary collaboration to further advance this field. Future research should focus on developing more robust and scalable computational models, enhancing data integration techniques, and addressing ethical considerations related to data privacy and security. By fostering innovation at the intersection of these disciplines, the potential to revolutionize healthcare delivery and outcomes becomes increasingly attainable.
Keywords: computational models; biomedical engineering; bioinformatics; machine learning; wearable technology
3. Finite element solution based on fast numerical technique for large-scale electromagnetic computation
Authors: 赵阳, 储家美, Satish Udpa. Journal of Southeast University (English Edition) (EI, CAS), 2006, No. 4, pp. 470-474 (5 pages)
A numerical technique of the target-region locating (TRL) solver in conjunction with the wave-front method is presented for the application of the finite element method (FEM) to 3-D electromagnetic computation. First, the principle of the TRL technique is described. Then, the availability of the TRL solver for nonlinear applications is discussed in particular, demonstrating that this solver can be easily used while retaining high efficiency. The implementation of this technique in FEM based on the magnetic vector potential (MVP) is also introduced. Finally, a numerical example of 3-D magnetostatic modeling using the TRL solver and FEMLAB is given. It shows that substantial computing resources can be saved by employing the new solver.
Keywords: finite element method; electromagnetic computation; numerical technique; fast solver
4. Large-scale laboratory investigation of pillar-support interaction
Authors: Akash Chaurasia, Gabriel Walton, Sankhaneel Sinha, Timothy J. Batchler, Kieran Moore, Nicholas Vlachopoulos, Bradley Forbes. Journal of Rock Mechanics and Geotechnical Engineering, 2025, No. 1, pp. 71-93 (23 pages)
Underground mine pillars provide natural stability to the mine area, allowing safe operations for workers and machinery. Extensive prior research has been conducted to understand pillar failure mechanics and design safe pillar layouts. However, limited studies (mostly based on empirical field observation and small-scale laboratory tests) have considered pillar-support interactions under monotonic loading conditions for the design of pillar-support systems. This study used a series of large-scale laboratory compression tests on porous limestone blocks to analyze rock and support behavior at a sufficiently large scale (specimens with an edge length of 0.5 m) to incorporate actual support elements, with consideration of different width-to-height (w/h) ratios. Both unsupported and supported (grouted rebar rockbolt and wire mesh) tests were conducted, and the surface deformations of the specimens were monitored using three-dimensional (3D) digital image correlation (DIC). Rockbolts instrumented with distributed fiber optic strain sensors were used to study rockbolt strain distribution, load mobilization, and localized deformation at different w/h ratios. Both axial and bending strains were observed in the rockbolts, which became more prominent in the post-peak region of the stress-strain curve.
Keywords: grouted rockbolt; welded wire mesh; porous limestone; digital image correlation; distributed fiber optic sensing; large-scale laboratory tests
5. Hysteresis-Loop Criticality in Disordered Ferromagnets: A Comprehensive Review of Computational Techniques
Authors: Djordje Spasojevic, Sanja Janicevic, Svetislav Mijatovic, Bosiljka Tadic. Computer Modeling in Engineering & Sciences, 2025, No. 2, pp. 1021-1107 (87 pages)
Disordered ferromagnets with a domain structure that exhibit a hysteresis loop when driven by an external magnetic field are essential materials for modern technological applications. Therefore, understanding and potentially controlling the hysteresis phenomenon in these materials, especially concerning the disorder-induced critical behavior on the hysteresis loop, has attracted significant experimental, theoretical, and numerical research efforts. We review the challenges of numerically modeling the physical phenomena behind hysteresis-loop critical behavior in disordered ferromagnetic systems, related to the non-equilibrium stochastic dynamics of domain walls driven by external fields. Specifically, using the extended Random Field Ising Model, we present different simulation approaches and advanced numerical techniques that adequately describe the hysteresis loop shapes and the collective nature of the magnetization fluctuations associated with the criticality of the hysteresis loop for different sample shapes and varied parameters of disorder and rate of change of the external field, as well as the influence of thermal fluctuations and demagnetizing fields. The studied examples demonstrate how these numerical approaches reveal new physical insights, providing quantitative measures of pertinent variables extracted from the systems' simulated or experimentally measured Barkhausen noise signals. The described computational techniques using inherent scale-invariance can be applied to the analysis of various complex systems, both quantum and classical, exhibiting a non-equilibrium dynamical critical point or self-organized criticality.
Keywords: disordered ferromagnets; hysteresis-loop criticality; magnetization-reversal avalanches in simulations and experiments; zero-temperature and thermal Random Field Ising Model simulations; computational techniques for multiparameter scaling analysis; multifractal Barkhausen noise; finite driving rates; demagnetizing effects; nonequilibrium critical dynamics
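The avalanche dynamics behind hysteresis-loop criticality can be illustrated with a minimal zero-temperature Random Field Ising Model sweep. The lattice size, Gaussian disorder, and field step below are toy assumptions, not the extended model the review covers:

```python
# Minimal zero-temperature RFIM rising branch on a small periodic square
# lattice; parameters are illustrative only.
import random

def local_field(s, h, i, j, L):
    # ferromagnetic coupling J = 1, periodic boundaries
    return h[i][j] + sum(s[(i + di) % L][(j + dj) % L]
                         for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))

def rising_branch(L=12, disorder=2.0, dH=0.01, seed=1):
    rng = random.Random(seed)
    h = [[rng.gauss(0.0, disorder) for _ in range(L)] for _ in range(L)]
    s = [[-1] * L for _ in range(L)]          # start fully magnetized down
    H, down, sizes = -8.0, L * L, []
    while down:
        H += dH                                # quasi-static field ramp
        front = [(i, j) for i in range(L) for j in range(L)
                 if s[i][j] == -1 and local_field(s, h, i, j, L) + H > 0]
        size = 0
        while front:                           # avalanche: flips trigger neighbors
            nxt = []
            for i, j in front:
                if s[i][j] == -1 and local_field(s, h, i, j, L) + H > 0:
                    s[i][j] = 1
                    size += 1
                    nxt += [((i + 1) % L, j), ((i - 1) % L, j),
                            (i, (j + 1) % L), (i, (j - 1) % L)]
            front = nxt
        down -= size
        if size:
            sizes.append(size)                 # Barkhausen-like event sizes
    return sizes

sizes = rising_branch()
print(sum(sizes))   # 144: every spin flips exactly once on the rising branch
```

The recorded avalanche sizes are the raw material for the scaling analyses the review discusses.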
6. Parallel Computational Algorithm of Substructure Method of Large-Scale Structure Analysis
Authors: 张汝清. Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 1991, No. 1, pp. 93-100 (8 pages)
In this paper, according to the parallel environment of the ELXSI computer, a parallel solving process of the substructure method for static and dynamic analyses of large-scale and complex structures is put forward, and the corresponding parallel computational program has been developed.
Keywords: computer programming; algorithms; computer systems; digital parallel processing
7. A Two-Layer Encoding Learning Swarm Optimizer Based on Frequent Itemsets for Sparse Large-Scale Multi-Objective Optimization (Cited: 1)
Authors: Sheng Qi, Rui Wang, Tao Zhang, Xu Yang, Ruiqing Sun, Ling Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, No. 6, pp. 1342-1357 (16 pages)
Traditional large-scale multi-objective optimization algorithms (LSMOEAs) encounter difficulties when dealing with sparse large-scale multi-objective optimization problems (SLMOPs), where most decision variables are zero. As a result, many algorithms use a two-layer encoding approach to optimize the binary variable Mask and the real variable Dec separately. Nevertheless, existing optimizers often focus on locating non-zero variable positions to optimize the binary Mask. However, approximating the sparse distribution of the real Pareto optimal solutions does not necessarily mean that the objective function is optimized. In data mining, it is common to mine frequent itemsets appearing together in a dataset to reveal the correlation between data. Inspired by this, we propose a novel two-layer encoding learning swarm optimizer based on frequent itemsets (TELSO) to address these SLMOPs. TELSO mines the frequent items of multiple particles with better objective values to find Mask combinations that obtain better objective values for fast convergence. Experimental results on five real-world problems and eight benchmark sets demonstrate that TELSO outperforms existing state-of-the-art sparse large-scale multi-objective evolutionary algorithms (SLMOEAs) in terms of performance and convergence speed.
Keywords: evolutionary algorithms; learning swarm optimization; sparse large-scale optimization; sparse large-scale multi-objective problems; two-layer encoding
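The frequent-itemset idea can be sketched as mining variable-index pairs that are jointly non-zero in the Masks of good particles. The helper name, support threshold, and toy masks are assumptions, not TELSO's implementation:

```python
# Sketch: find index pairs that co-occur as non-zero in elite binary Masks.
from collections import Counter
from itertools import combinations

def frequent_pairs(masks, min_support=2):
    """Index pairs jointly non-zero in at least min_support masks."""
    counts = Counter()
    for mask in masks:
        nonzero = [i for i, bit in enumerate(mask) if bit]
        counts.update(combinations(nonzero, 2))
    return sorted(pair for pair, c in counts.items() if c >= min_support)

masks = [            # binary Masks of three hypothetical elite particles
    [1, 0, 1, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1],
]
print(frequent_pairs(masks))  # [(0, 2), (2, 4)]: positions worth keeping together
```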
8. Assessment of Wet Season Precipitation in the Central United States by the Regional Climate Simulation of the WRFG Member in NARCCAP and Its Relationship with Large-Scale Circulation Biases (Cited: 1)
Authors: Yating ZHAO, Ming XUE, Jing JIANG, Xiao-Ming HU, Anning HUANG. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, No. 4, pp. 619-638 (20 pages)
Assessment of past-climate simulations of regional climate models (RCMs) is important for understanding the reliability of RCMs when used to project future regional climate. Here, we assess the performance and discuss possible causes of biases in a WRF-based RCM with a grid spacing of 50 km, named WRFG, from the North American Regional Climate Change Assessment Program (NARCCAP) in simulating wet season precipitation over the Central United States for a period when observational data are available. The RCM reproduces key features of the precipitation distribution during late spring to early summer, although it tends to underestimate the magnitude of precipitation. This dry bias is partially due to the model's lack of skill in simulating nocturnal precipitation, related to the lack of eastward-propagating convective systems in the simulation. Inaccuracy in reproducing large-scale circulation and environmental conditions is another contributing factor. The too-weak simulated pressure gradient between the Rocky Mountains and the Gulf of Mexico results in weaker southerly winds in between, leading to a reduction of warm moist air transport from the Gulf to the Central Great Plains. The simulated low-level horizontal convergence fields are also less favorable for upward motion, and hence for the development of moist convection, than in the NARR. Therefore, a careful examination of an RCM's deficiencies and identification of the sources of error are important when using the RCM to project precipitation changes in future climate scenarios.
Keywords: NARCCAP; Central United States; precipitation; low-level jet; large-scale environment; diurnal variation
9. Computational Experiments for Complex Social Systems: Experiment Design and Generative Explanation (Cited: 2)
Authors: Xiao Xue, Deyu Zhou, Xiangning Yu, Gang Wang, Juanjuan Li, Xia Xie, Lizhen Cui, Fei-Yue Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, No. 4, pp. 1022-1038 (17 pages)
Powered by advanced information technology, more and more complex systems are exhibiting characteristics of cyber-physical-social systems (CPSS). In this context, the computational experiments method has emerged as a novel approach for the design, analysis, management, control, and integration of CPSS, which can realize the causal analysis of complex systems by means of the "algorithmization" of "counterfactuals". However, because CPSS involve human and social factors (e.g., autonomy, initiative, and sociality), it is difficult for traditional design-of-experiment (DOE) methods to achieve the generative explanation of system emergence. To address this challenge, this paper proposes an integrated approach to the design of computational experiments, incorporating three key modules: 1) Descriptive module: determining the influencing factors and response variables of the system by means of the modeling of an artificial society; 2) Interpretative module: selecting a factorial experimental design solution to identify the relationship between influencing factors and macro phenomena; 3) Predictive module: building a meta-model that is equivalent to the artificial society to explore its operating laws. Finally, a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach, which can reveal the social impact of algorithmic behavior on the "rider race".
Keywords: agent-based modeling; computational experiments; cyber-physical-social systems (CPSS); generative deduction; generative experiments; meta model
10. Enhancing Evolutionary Algorithms With Pattern Mining for Sparse Large-Scale Multi-Objective Optimization Problems
Authors: Sheng Qi, Rui Wang, Tao Zhang, Weixiong Huang, Fan Yu, Ling Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, No. 8, pp. 1786-1801 (16 pages)
Sparse large-scale multi-objective optimization problems (SLMOPs) are common in science and engineering. However, the large-scale aspect means a high-dimensional decision space, requiring algorithms to traverse a vast search space with limited computational resources, and in the sparse context most variables in the Pareto optimal solutions are zero, making it difficult for algorithms to identify non-zero variables efficiently. This paper is dedicated to addressing the challenges posed by SLMOPs. To start, we introduce innovative objective functions customized to mine maximum and minimum candidate sets. This substantial enhancement dramatically improves the efficacy of frequent pattern mining. In this way, selecting candidate sets is no longer based on the quantity of non-zero variables they contain, but on a higher proportion of non-zero variables within specific dimensions. Additionally, we unveil a novel approach to association rule mining, which delves into the intricate relationships between non-zero variables. This methodology aids in identifying sparse distributions that can potentially expedite reductions in the objective function value. We extensively tested our algorithm across eight benchmark problems and four real-world SLMOPs. The results demonstrate that our approach achieves competitive solutions across various challenges.
Keywords: evolutionary algorithms; pattern mining; sparse large-scale multi-objective problems (SLMOPs); sparse large-scale optimization
11. Public cloud computing for seismological research: Calculating large-scale noise cross-correlations using ALIYUN (Cited: 3)
Authors: Weitao Wang, Baoshan Wang, Xiufen Zheng. Earthquake Science (CSCD), 2018, No. 5, pp. 227-233 (7 pages)
The amount of seismological data is rapidly increasing with accumulating observation time and a growing number of stations, requiring modern techniques to provide adequate computing power. In the present study, we propose a framework to calculate large-scale noise cross-correlation functions (NCFs) using the public cloud service from ALIYUN. The entire computation is factorized into small pieces that are performed in parallel on a specified number of virtual servers provided by the cloud. Using data from most seismic stations in China, five NCF databases are built. The results show that, compared to the time cost of a single server, the total time can be reduced by over two orders of magnitude depending on the number of evoked virtual servers. This can reduce computation time from months to less than 12 hours. Based on the massive set of NCFs obtained, global body waves are retrieved through array interferometry and agree well with those from earthquakes. This provides a solution for processing massive seismic datasets within an affordable time and is applicable to other large-scale computing tasks in seismological research.
Keywords: cloud computing; ambient noise; cross-correlation; global body wave
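A single noise cross-correlation function of the kind parallelized here can be sketched with an FFT. Preprocessing steps such as whitening and stacking are omitted, and the synthetic records below are illustrative:

```python
# Sketch: FFT-based cross-correlation of two station records.
import numpy as np

def ncf(a, b, max_lag):
    """Cross-correlate two records via FFT; return (lags, correlation)."""
    n = len(a) + len(b) - 1
    nfft = 1 << (n - 1).bit_length()          # zero-pad to a power of two
    spec = np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft))
    cc = np.fft.irfft(spec, nfft)
    cc = np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))  # lags -m..+m
    return np.arange(-max_lag, max_lag + 1), cc

# synthetic "stations": one record is the other delayed by 5 samples
rng = np.random.default_rng(0)
sig = rng.standard_normal(1024)
lags, cc = ncf(np.roll(sig, 5), sig, max_lag=20)
peak = int(lags[np.argmax(cc)])
print(peak)  # 5: the correlation peak recovers the delay between the records
```

In the paper's framework, many such station-pair computations are the independent "small pieces" distributed across cloud servers.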
12. Hypergraph Computation
Authors: Yue Gao, Shuyi Ji, Xiangmin Han, Qionghai Dai. Engineering (SCIE, EI, CAS, CSCD), 2024, No. 9, pp. 188-201 (14 pages)
Practical real-world scenarios such as the Internet, social networks, and biological networks present the challenges of data scarcity and complex correlations, which limit the applications of artificial intelligence. The graph structure is a typical tool used to formulate such correlations, but it is incapable of modeling high-order correlations among different objects in systems; thus, the graph structure cannot fully convey the intricate correlations among objects. Confronted with these two challenges, hypergraph computation models high-order correlations among data, knowledge, and rules through hyperedges and leverages these high-order correlations to enhance the data. Additionally, hypergraph computation achieves collaborative computation using data and high-order correlations, thereby offering greater modeling flexibility. In particular, we introduce three types of hypergraph computation methods: ① hypergraph structure modeling, ② hypergraph semantic computing, and ③ efficient hypergraph computing. We then specify how to adopt hypergraph computation in practice by focusing on specific tasks such as three-dimensional (3D) object recognition, revealing that hypergraph computation can reduce the data requirement by 80% while achieving comparable performance, or improve the performance by 52% given the same data, compared with a traditional data-based method. A comprehensive overview of the applications of hypergraph computation in diverse domains, such as intelligent medicine and computer vision, is also provided. Finally, we introduce an open-source deep learning library, DeepHypergraph (DHG), which can serve as a tool for the practical usage of hypergraph computation.
Keywords: high-order correlation; hypergraph structure modeling; hypergraph semantic computing; efficient hypergraph computing; hypergraph computation framework
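A minimal sketch of the high-order correlation idea: vertices joined by a hyperedge (a vertex set of any size, not just a pair) exchange information through that hyperedge. The vertex features and averaging rule below are illustrative assumptions, not DHG's API:

```python
# Sketch: one round of vertex -> hyperedge -> vertex feature averaging.
def hyperedge_average(features, hyperedges):
    """Propagate scalar vertex features through hyperedges by averaging."""
    out = {v: 0.0 for v in features}
    deg = {v: 0 for v in features}
    for edge in hyperedges:                   # each hyperedge is a vertex set
        msg = sum(features[v] for v in edge) / len(edge)   # hyperedge state
        for v in edge:
            out[v] += msg
            deg[v] += 1
    return {v: out[v] / deg[v] if deg[v] else features[v] for v in features}

features = {"a": 1.0, "b": 2.0, "c": 3.0, "d": 4.0}
hyperedges = [{"a", "b", "c"}, {"c", "d"}]    # a ternary tie a plain graph edge cannot express
mixed = hyperedge_average(features, hyperedges)
print(mixed["c"])  # 2.75: vertex c blends the states of both hyperedges
```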
13. Numerical investigation of turbulent mass transfer processes in a turbulent fluidized bed by computational mass transfer
Authors: Hailun Ren, Liang Zeng, Wenbin Li, Shuyong Chen, Zhongli Tang, Donghui Zhang. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2024, No. 12, pp. 64-74 (11 pages)
A turbulent fluidized bed possesses a distinct advantage over a bubbling fluidized bed in high solids contact efficiency and thus has great potential in applications to many industrial processes. Simulation of the fluidization of fluid catalytic cracking (FCC) particles and the catalytic reaction of ozone decomposition in a turbulent fluidized bed is conducted using the Eulerian-Eulerian approach, where the recently developed two-equation turbulent (TET) model is introduced to describe the turbulent mass diffusion. The energy minimization multi-scale (EMMS) drag model and the kinetic theory of granular flow (KTGF) are adopted to describe gas-particle and particle-particle interactions, respectively. The TET model features a rigorous closure for the turbulent mass transfer equations and thus enables more reliable simulation. With this model, distributions of ozone concentration, gas-particle two-phase velocity, and volume fraction are obtained and compared against experimental data. The average absolute relative deviation for the simulated ozone concentration is 9.67%, which confirms the validity of the proposed model. Moreover, it is found that the transition velocity from bubbling fluidization to turbulent fluidization for FCC particles is about 0.5 m·s⁻¹, which is consistent with experimental observation.
Keywords: turbulent fluidized bed; simulation; computational mass transfer; turbulence; computational fluid dynamics
14. Joint computation offloading and parallel scheduling to maximize delay-guarantee in cooperative MEC systems
Authors: Mian Guo, Mithun Mukherjee, Jaime Lloret, Lei Li, Quansheng Guan, Fei Ji. Digital Communications and Networks (SCIE, CSCD), 2024, No. 3, pp. 693-705 (13 pages)
The growing development of the Internet of Things (IoT) is accelerating the emergence and growth of new IoT services and applications, which will result in massive amounts of data being generated, transmitted, and processed in wireless communication networks. Mobile Edge Computing (MEC) is a desired paradigm for timely processing of IoT data for value maximization. In MEC, a number of computing-capable devices are deployed at the network edge near data sources to support edge computing, such that the long network transmission delay of the cloud computing paradigm can be avoided. Since an edge device might not always have sufficient resources to process the massive amount of data, computation offloading is significantly important, considering the cooperation among edge devices. However, the dynamic traffic characteristics and heterogeneous computing capabilities of edge devices challenge the offloading. In addition, different scheduling schemes might provide different computation delays to the offloaded tasks. Thus, offloading in mobile nodes and scheduling in the MEC server are coupled in determining service delay. This paper seeks to guarantee low delay for computation-intensive applications by jointly optimizing the offloading and scheduling in such an MEC system. We propose a Delay-Greedy Computation Offloading (DGCO) algorithm to make offloading decisions for new tasks in distributed computing-enabled mobile devices. A Reinforcement Learning-based Parallel Scheduling (RLPS) algorithm is further designed to schedule offloaded tasks in the multi-core MEC server. With an offloading delay broadcast mechanism, DGCO and RLPS cooperate to achieve the goal of delay-guarantee-ratio maximization. Finally, the simulation results show that our proposal can bound the end-to-end delay of various tasks. Even under slightly heavy task load, the delay-guarantee ratio given by DGCO-RLPS can still approximate 95%, while that given by the benchmark algorithms is reduced to an intolerable value. The simulation results demonstrate the effectiveness of DGCO-RLPS for delay guarantee in MEC.
Keywords: edge computing; computation offloading; parallel scheduling; mobile-edge cooperation; delay guarantee
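As an illustration of a delay-greedy offloading rule in this spirit (not the paper's actual DGCO algorithm), a task can be sent wherever its estimated completion delay is lowest. All names, delay models, and numbers below are assumptions:

```python
# Illustrative delay-greedy offloading decision; delay model is hypothetical.
def choose_target(task_cycles, local_speed, edges):
    """Pick the location with the lowest estimated completion delay.
    edges: list of (name, cpu_speed_hz, queue_delay_s, link_delay_s)."""
    best = ("local", task_cycles / local_speed)
    for name, speed, queue, link in edges:
        delay = link + queue + task_cycles / speed
        if delay < best[1]:
            best = (name, delay)
    return best

# hypothetical 2-gigacycle task, 1 GHz local CPU, two cooperative edge servers
edges = [("edge-A", 8e9, 0.05, 0.01), ("edge-B", 4e9, 0.00, 0.02)]
best = choose_target(2e9, 1e9, edges)
print(best[0])  # edge-A: ~0.31 s beats local 2.0 s and edge-B ~0.52 s
```

In the paper, the queue-delay term is what the broadcast mechanism would keep up to date across cooperating servers.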
15. Large-scale model testing of high-pressure grouting reinforcement for bedding slope with rapid-setting polyurethane
Authors: ZHANG Zhichao, TANG Xuefeng, LIU Kan, YE Longzhen, HE Xiang. Journal of Mountain Science (SCIE, CSCD), 2024, No. 9, pp. 3083-3093 (11 pages)
A bedding slope is a typical heterogeneous slope consisting of different soil/rock layers and is likely to slide along the weakest interface. Conventional slope protection methods for bedding slopes, such as retaining walls, stabilizing piles, and anchors, are time-consuming and labor- and energy-intensive. This study proposes an innovative polymer grout method to improve the bearing capacity and reduce the displacement of bedding slopes. A series of large-scale model tests were carried out to verify the effectiveness of polymer grout in protecting bedding slopes. Specifically, load-displacement relationships and failure patterns were analyzed for different testing slopes with various dosages of polymer. Results show the great potential of polymer grout in improving bearing capacity, reducing settlement, and protecting slopes from being crushed under shearing. The polymer-treated slopes remained structurally intact, while the untreated slope exhibited considerable damage when subjected to loads surpassing the bearing capacity. It is also found that polymer-cemented soils concentrate around the injection pipe, forming a fan-shaped sheet-like structure. This study proves the improvement of polymer grouting for bedding slope treatment and will contribute to the development of a fast method to protect bedding slopes from landslides.
Keywords: polyurethane; bedding slope; grouting; slope protection; large-scale model test
16. Secure and Efficient Outsourced Computation in Cloud Computing Environments
Authors: Varun Dixit, Davinderjit Kaur. Journal of Software Engineering and Applications, 2024, No. 9, pp. 750-762 (13 pages)
Secure and efficient outsourced computation in cloud computing environments is crucial for ensuring data confidentiality, integrity, and resource optimization. In this research, we propose novel algorithms and methodologies to address these challenges. Through a series of experiments, we evaluate the performance, security, and efficiency of the proposed algorithms in real-world cloud environments. Our results demonstrate the effectiveness of homomorphic encryption-based secure computation, secure multiparty computation, and trusted execution environment-based approaches in mitigating security threats while ensuring efficient resource utilization. Specifically, our homomorphic encryption-based algorithm exhibits encryption times ranging from 20 to 1000 milliseconds and decryption times ranging from 25 to 1250 milliseconds for payload sizes varying from 100 KB to 5000 KB. Furthermore, our comparative analysis against state-of-the-art solutions reveals the strengths of our proposed algorithms in terms of security guarantees, encryption overhead, and communication latency.
Keywords: secure computation; cloud computing; homomorphic encryption; secure multiparty computation; resource optimization
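The homomorphic-encryption idea behind such outsourcing can be illustrated with a toy Paillier scheme, whose ciphertexts a cloud can multiply to add the underlying plaintexts without ever decrypting them. The tiny hard-coded primes are purely for demonstration and offer no security; this is not the paper's algorithm:

```python
# Toy additively homomorphic (Paillier-style) encryption; insecure demo primes.
import math, random

p, q = 251, 241                        # demo primes only, far too small for real use
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)    # precomputed decryption constant

rng = random.Random(7)

def encrypt(m):
    r = rng.randrange(1, n)
    while math.gcd(r, n) != 1:         # blinding factor must be invertible mod n
        r = rng.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

c1, c2 = encrypt(20), encrypt(22)
c_sum = (c1 * c2) % n2                 # ciphertext product = plaintext sum
print(decrypt(c_sum))                  # 42
```

The multiplication of `c1` and `c2` is the operation an untrusted server could perform on behalf of the client.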
17. EG-STC: An Efficient Secure Two-Party Computation Scheme Based on Embedded GPU for Artificial Intelligence Systems
Authors: Zhenjiang Dong, Xin Ge, Yuehua Huang, Jiankuo Dong, Jiang Xu. Computers, Materials & Continua (SCIE, EI), 2024, No. 6, pp. 4021-4044 (24 pages)
This paper presents a comprehensive exploration of the integration of the Internet of Things (IoT), big data analysis, cloud computing, and Artificial Intelligence (AI), which has led to an unprecedented era of connectivity. We delve into the emerging trend of machine learning on embedded devices, which enables tasks in resource-limited environments. However, the widespread adoption of machine learning raises significant privacy concerns, necessitating the development of privacy-preserving techniques. One such technique, secure multi-party computation (MPC), allows collaborative computations without exposing private inputs. Despite its potential, complex protocols and communication interactions hinder performance, especially on resource-constrained devices. Efforts to enhance efficiency have been made, but scalability remains a challenge. Given the success of GPUs in deep learning, leveraging embedded GPUs, such as those offered by NVIDIA, emerges as a promising solution. Therefore, we propose an Embedded GPU-based Secure Two-party Computation (EG-STC) framework for Artificial Intelligence (AI) systems. To the best of our knowledge, this work represents the first endeavor to fully implement machine learning model training based on secure two-party computation on an embedded GPU platform. Our experimental results demonstrate the effectiveness of EG-STC. On an embedded GPU with a power draw of 5 W, our implementation achieved a secure two-party matrix multiplication throughput of 5881.5 kilo-operations per millisecond (kops/ms), with an energy efficiency ratio of 1176.3 kops/ms/W. Furthermore, leveraging our EG-STC framework, we achieved an overall time acceleration ratio of 5-6 times compared to solutions running on server-grade CPUs. Our solution also exhibited a reduced runtime, requiring only 60% to 70% of the runtime of previously best-known methods on the same platform. In summary, our research contributes to the advancement of secure and efficient machine learning implementations on resource-constrained embedded devices, paving the way for broader adoption of AI technologies in various applications.
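The abstract does not detail EG-STC's protocol, but secure two-party multiplication of the kind benchmarked here is commonly built from additive secret sharing with Beaver triples. The sketch below illustrates that standard building block only, not the paper's GPU implementation; the function names (`share`, `beaver_mul`) and the field prime `P` are our own choices, and the triple dealer is simulated in-process rather than run as a separate trusted party.

```python
import random

P = 2**61 - 1  # prime modulus for the share field; choice is illustrative

def share(x):
    """Split x into two additive shares mod P: x = x0 + x1 (mod P)."""
    r = random.randrange(P)
    return r, (x - r) % P

def beaver_mul(x0, x1, y0, y1):
    """Multiply secret-shared x and y using a Beaver triple (a, b, c = a*b).
    Here a trusted dealer is simulated locally; in a real deployment the
    triple shares arrive from an offline preprocessing phase."""
    a, b = random.randrange(P), random.randrange(P)
    c = (a * b) % P
    a0, a1 = share(a)
    b0, b1 = share(b)
    c0, c1 = share(c)
    # The parties open the masked values e = x - a and f = y - b;
    # these reveal nothing about x or y because a, b are uniform.
    e = (x0 - a0 + x1 - a1) % P
    f = (y0 - b0 + y1 - b1) % P
    # Each party computes its share of x*y = c + e*b + f*a + e*f
    z0 = (c0 + e * b0 + f * a0) % P
    z1 = (c1 + e * b1 + f * a1 + e * f) % P
    return z0, z1

x0, x1 = share(6)
y0, y1 = share(7)
z0, z1 = beaver_mul(x0, x1, y0, y1)
product = (z0 + z1) % P  # reconstruct: 6 * 7 = 42
```

Matrix multiplication in this model applies the same identity entrywise to matrix-valued triples, which is what makes it a natural fit for GPU batching.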
Keywords: secure two-party computation; embedded GPU acceleration; privacy-preserving machine learning; edge computing
Secure Computation Efficiency Resource Allocation for Massive MIMO-Enabled Mobile Edge Computing Networks
18
Authors: Sun Gangcan, Sun Jiwei, Hao Wanming, Zhu Zhengyu, Ji Xiang, Zhou Yiqing. China Communications (SCIE, CSCD), 2024, Issue 11, pp. 150-162.
In this article, the secure computation efficiency (SCE) problem is studied in a massive multiple-input multiple-output (mMIMO)-assisted mobile edge computing (MEC) network. We first derive the secure transmission rate based on the mMIMO under imperfect channel state information. Based on this, the SCE maximization problem is formulated by jointly optimizing the local computation frequency, the offloading time, the downloading time, and the transmit power of the users and the base station. Since the formulated problem is difficult to solve directly, we first transform the fractional objective function into subtractive form via the Dinkelbach method. Next, the original problem is transformed into a convex one by applying the successive convex approximation technique, and an iterative algorithm is proposed to obtain the solutions. Finally, simulations are conducted to show that the performance of the proposed schemes is superior to that of the other schemes.
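For readers unfamiliar with the Dinkelbach method mentioned above: it solves a fractional program max f(x)/g(x) by repeatedly solving the subtractive-form subproblem max f(x) - λ·g(x) and updating λ to the ratio at the subproblem's maximizer, stopping when the subproblem optimum reaches zero. The sketch below runs the iteration over a discrete candidate set; the rate and power functions are toy illustrations, not the paper's SCE model.

```python
import math

def dinkelbach(f, g, candidates, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) over a finite candidate set via Dinkelbach
    iterations; g must be positive on all candidates."""
    lam = 0.0
    x = candidates[0]
    for _ in range(max_iter):
        # Solve the subtractive-form subproblem max_x f(x) - lam*g(x)
        x = max(candidates, key=lambda c: f(c) - lam * g(c))
        val = f(x) - lam * g(x)
        if abs(val) < tol:      # F(lam) = 0  <=>  lam is the optimal ratio
            return x, lam
        lam = f(x) / g(x)       # update the ratio estimate
    return x, lam

# Toy "rate per unit energy" trade-off over discrete power levels
powers = [0.1 * k for k in range(1, 51)]
rate = lambda p: math.log2(1 + 10 * p)   # illustrative rate function
cost = lambda p: p + 0.5                 # illustrative power model
p_opt, ratio = dinkelbach(rate, cost, powers)
```

In the article's setting the inner subproblem is not a grid search but a convex program handled by successive convex approximation; the outer λ-update loop is the same.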
Keywords: eavesdropping; massive multiple-input multiple-output; mobile edge computing; partial offloading; secure computation efficiency
Online identification and extraction method of regional large-scale adjustable load-aggregation characteristics
19
Authors: Siwei Li, Liang Yue, Xiangyu Kong, Chengshan Wang. Global Energy Interconnection (EI, CSCD), 2024, Issue 3, pp. 313-323.
This article introduces the concept of load aggregation, which involves a comprehensive analysis of loads to acquire their external characteristics for the purpose of modeling and analyzing power systems. Online identification is a computer-based approach to data collection, processing, and system identification, commonly used for adaptive control and prediction. This paper proposes a method for dynamically aggregating large-scale adjustable loads to support high proportions of new-energy integration, aiming to study the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction methods. The experiment selected 300 central air conditioners as the research subject and analyzed their regulation characteristics, economic efficiency, and comfort. The experimental results show that as the adjustment time of the air conditioners increases from 5 minutes to 35 minutes, the stable adjustment quantity during the adjustment period decreases from 28.46 to 3.57, indicating that air-conditioning loads can be controlled over a long period and have better adjustment effects in the short term. Overall, the experimental results demonstrate that analyzing the aggregation characteristics of regional large-scale adjustable loads using online identification techniques and feature extraction algorithms is effective.
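The abstract does not name the specific online identification algorithm; a common choice for this kind of sample-by-sample model fitting is recursive least squares (RLS), which updates the parameter estimate as each new measurement arrives instead of refitting in batch. The sketch below identifies a hypothetical first-order decay model of the aggregate adjustable quantity, q[t] = a·q[t-1]; the model form, the decay rate, and the synthetic data are illustrative only, not the paper's.

```python
def rls_scalar(data, forgetting=0.98):
    """Online (recursive least squares) identification of a first-order
    model q[t] = a * q[t-1]. `forgetting` < 1 discounts old samples so
    the estimate can track slowly drifting load behavior."""
    a_hat, p = 0.0, 1e6          # parameter estimate and its covariance
    for t in range(1, len(data)):
        phi = data[t - 1]                            # regressor
        k = p * phi / (forgetting + phi * p * phi)   # gain
        a_hat += k * (data[t] - phi * a_hat)         # correct with new sample
        p = (p - k * phi * p) / forgetting           # covariance update
    return a_hat

# Synthetic aggregate adjustable quantity decaying geometrically from 28.46
q = [28.46 * 0.93 ** t for t in range(40)]
a_est = rls_scalar(q)  # recovers the decay rate 0.93 from the stream
```

Because the update cost per sample is constant, the same recursion scales to streaming measurements from hundreds of aggregated air conditioners.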
Keywords: load aggregation; regional large-scale; online identification; feature extraction method
A semantic vector map-based approach for aircraft positioning in GNSS/GPS denied large-scale environment
20
Authors: Chenguang Ouyang, Suxing Hu, Fengqi Long, Shuai Shi, Zhichao Yu, Kaichun Zhao, Zheng You, Junyin Pi, Bowen Xing. Defence Technology (防务技术) (SCIE, EI, CAS, CSCD), 2024, Issue 4, pp. 1-10.
Accurate positioning is one of the essential requirements for numerous applications of remote sensing data, especially in the event of a noisy or unreliable satellite signal. Toward this end, we present a novel framework for aircraft geo-localization over a large range that requires only a downward-facing monocular camera, an altimeter, a compass, and an open-source Vector Map (VMAP). The algorithm combines matching and particle-filter methods. A shape vector and the correlation between two building contour vectors are defined, and a coarse-to-fine building vector matching (CFBVM) method is proposed for the matching stage, in which the original matching results are described by a Gaussian mixture model (GMM). Subsequently, an improved resampling strategy is designed to reduce computing expense with a huge number of initial particles, and a credibility indicator is designed to avoid localization mistakes in the particle-filter stage. An experimental evaluation of the approach based on flight data is provided. On a flight at a height of 0.2 km over a flight distance of 2 km, the aircraft is geo-localized in a reference map of 11,025 km² using 0.09 km² aerial images without any prior information. The absolute localization error is less than 10 m.
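The improved resampling strategy itself is not given in the abstract; as context, the standard low-variance baseline that such strategies refine is systematic resampling, which draws a single uniform offset and then sweeps evenly spaced pointers through the cumulative particle weights. A minimal sketch (function and particle names are our own):

```python
import random

def systematic_resample(particles, weights):
    """Systematic (low-variance) resampling: one uniform draw, then
    n evenly spaced pointers through the cumulative weight profile.
    Returns a new particle set of the same size."""
    n = len(particles)
    step = sum(weights) / n
    u = random.uniform(0.0, step)   # single random offset
    out, cum, i = [], weights[0], 0
    for _ in range(n):
        while cum < u:              # advance to the particle covering u
            i += 1
            cum += weights[i]
        out.append(particles[i])
        u += step                   # next evenly spaced pointer
    return out

# Degenerate toy posterior: almost all weight on "p2"
kept = systematic_resample(["p0", "p1", "p2"], [0.05, 0.05, 0.90])
```

Compared with multinomial resampling, this uses one random number per sweep and bounds each particle's offspring count, which keeps the variance low when the initial particle count is huge, as in the search over a 11,025 km² map.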
Keywords: large-scale positioning; building vector matching; improved particle filter; GPS-denied; vector map