Journal Articles
15,880 articles found
1. A Novel Predictive Model for Edge Computing Resource Scheduling Based on Deep Neural Network
Authors: Ming Gao, Weiwei Cai, Yizhang Jiang, Wenjun Hu, Jian Yao, Pengjiang Qian. 《Computer Modeling in Engineering & Sciences》 (SCIE, EI), 2024, Issue 4, pp. 259-277 (19 pages)
Currently, applications accessing remote computing resources through cloud data centers is the main mode of operation, but this mode greatly increases communication latency and reduces the overall quality of service (QoS) and quality of experience (QoE). Edge computing technology extends cloud service functionality to the edge of the mobile network, closer to the task execution end, and can effectively mitigate the communication latency problem. However, the massive and heterogeneous nature of servers in edge computing systems brings new challenges to task scheduling and resource management, and the booming development of artificial neural networks provides more powerful methods to alleviate this limitation. Therefore, in this paper, we propose a time series forecasting model incorporating Conv1D, LSTM, and GRU for edge computing device resource scheduling, train and test the forecasting model using a small self-built dataset, and achieve competitive experimental results.
Keywords: edge computing; resource scheduling; predictive models
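The forecaster above stacks Conv1D, LSTM, and GRU layers. As a minimal illustration of the recurrent gating such models rely on, here is a single scalar GRU step in plain Python; the weight values and the toy CPU-load series are assumptions for demonstration, not the paper's trained parameters:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def gru_step(x, h, w):
    """One scalar GRU step: gate the previous state h against input x."""
    z = sigmoid(w["wz"] * x + w["uz"] * h)             # update gate
    r = sigmoid(w["wr"] * x + w["ur"] * h)             # reset gate
    h_cand = math.tanh(w["wh"] * x + w["uh"] * r * h)  # candidate state
    return (1.0 - z) * h + z * h_cand                  # blend old and candidate

# Toy weights and a toy load series (not the paper's trained values)
weights = {"wz": 0.8, "uz": 0.2, "wr": 0.5, "ur": 0.1, "wh": 1.0, "uh": 0.4}
h = 0.0
for load in [0.2, 0.5, 0.9, 0.4]:
    h = gru_step(load, h, weights)   # h summarizes the recent load history
```

The update gate decides how much history to keep, which is what lets such recurrent layers track slowly varying resource demand.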
2. A Novel Quantization and Model Compression Approach for Hardware Accelerators in Edge Computing
Authors: Fangzhou He, Ke Ding, Dingjiang Yan, Jie Li, Jiajun Wang, Mingzhe Chen. 《Computers, Materials & Continua》 (SCIE, EI), 2024, Issue 8, pp. 3021-3045 (25 pages)
The massive computational complexity and memory requirements of artificial intelligence models impede their deployability on edge computing devices of the Internet of Things (IoT). While Power-of-Two (PoT) quantization has been proposed to improve the efficiency of edge inference for Deep Neural Networks (DNNs), existing PoT schemes require a huge amount of bit-wise manipulation and have large memory overhead, and their efficiency is bounded by the bottleneck of computation latency and memory footprint. To tackle this challenge, we present an efficient inference approach on the basis of PoT quantization and model compression. An integer-only scalar PoT quantization (IOS-PoT) is designed jointly with a distribution loss regularizer, wherein the regularizer minimizes quantization errors and training disturbances. Additionally, two-stage model compression is developed to effectively reduce the memory requirement and alleviate bandwidth usage in communications of networked heterogeneous learning systems. The product look-up table (P-LUT) inference scheme is leveraged to replace bit-shifting with only indexing and addition operations, achieving low-latency computation and enabling efficient edge accelerators. Finally, comprehensive experiments on Residual Networks (ResNets) and efficient architectures with the Canadian Institute for Advanced Research (CIFAR), ImageNet, and Real-world Affective Faces Database (RAF-DB) datasets indicate that our approach achieves a 2×–10× reduction in both weight size and computation cost in comparison to state-of-the-art methods. A P-LUT accelerator prototype is implemented on the Xilinx KV260 Field Programmable Gate Array (FPGA) platform for accelerating convolution operations, with performance results showing that P-LUT reduces the memory footprint by 1.45×, and achieves more than 3× power efficiency and 2× resource efficiency compared to the conventional bit-shifting scheme.
Keywords: edge computing; model compression; hardware accelerator; power-of-two quantization
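Power-of-Two quantization snaps each weight to a signed power of two so that multiplication reduces to a bit shift (or, in the paper's P-LUT scheme, a table lookup plus addition). A rounding-in-log-domain sketch; the exponent range and bit-width layout here are illustrative assumptions, not the IOS-PoT format:

```python
import math

def pot_quantize(w, exp_bits=4):
    """Snap a weight to a signed power of two by rounding in the log2 domain.

    The exponent is clipped to [-(2**exp_bits - 1), 0]; this layout is an
    illustrative assumption, not the paper's IOS-PoT format.
    """
    if w == 0.0:
        return 0.0
    e = round(math.log2(abs(w)))
    e = max(min(e, 0), -(2 ** exp_bits - 1))
    return math.copysign(2.0 ** e, w)

quantized = [pot_quantize(w) for w in [0.3, -0.7, 0.06, 0.0]]
```

Because every quantized magnitude is 2^e, a hardware multiply by such a weight is just an e-bit shift, which is the efficiency the PoT family exploits.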
3. Energy-optimal DNN model placement in UAV-enabled edge computing networks
Authors: Jianhang Tang, Guoquan Wu, Mohammad Mussadiq Jalalzai, Lin Wang, Bing Zhang, Yi Zhou. 《Digital Communications and Networks》 (SCIE, CSCD), 2024, Issue 4, pp. 827-836 (10 pages)
Unmanned aerial vehicle (UAV)-enabled edge computing is emerging as a potential enabler for the Artificial Intelligence of Things (AIoT) in the forthcoming sixth-generation (6G) communication networks. With the use of flexible UAVs, massive sensing data is gathered and processed promptly regardless of geographical location. Deep neural networks (DNNs) are becoming a driving force in extracting valuable information from sensing data. However, the lightweight servers installed on UAVs cannot meet the extremely high requirements of inference tasks due to the limited battery capacities of UAVs. In this work, we investigate a DNN model placement problem for AIoT applications, where trained DNN models are selected and placed on UAVs to execute inference tasks locally. Since it is impractical to obtain future DNN model request profiles and system operation states in UAV-enabled edge computing, the Lyapunov optimization technique is leveraged for the proposed DNN model placement problem. Based on the observed system state, an advanced online placement (AOP) algorithm is developed to solve the transformed problem in each time slot, which can reduce DNN model transmission delay and disk I/O energy cost simultaneously while keeping the input data queues stable. Finally, extensive simulations are provided to depict the effectiveness of the AOP algorithm. The numerical results demonstrate that the AOP algorithm reduces the model placement cost by 18.14% and the input data queue backlog by 29.89% on average compared with benchmark algorithms.
Keywords: UAV-enabled edge computing; DNN model placement; 6G networks; inference tasks
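Lyapunov-based online placement of this kind reduces to a drift-plus-penalty rule: in each slot, pick the action minimizing V·cost plus the queue-weighted drift, which trades cost against backlog without knowing future arrivals. A minimal single-queue sketch; the arrival process, action set, and V value are illustrative assumptions, not the paper's system model:

```python
import random

def drift_plus_penalty(arrivals, actions, V=5.0):
    """Single-queue drift-plus-penalty: each slot pick the action minimizing
    V*cost - Q*service, then apply the queueing dynamics. Returns backlog trace."""
    Q, trace = 0.0, []
    for a_t in arrivals:
        service, cost = min(actions, key=lambda sc: V * sc[1] - Q * sc[0])
        Q = max(Q - service, 0.0) + a_t   # queue update
        trace.append(Q)
    return trace

random.seed(0)
arrivals = [random.uniform(0.0, 2.0) for _ in range(500)]        # mean rate 1.0
# actions: (service_rate, energy_cost) -- a cheap slow option and a costly fast one
trace = drift_plus_penalty(arrivals, actions=[(0.5, 1.0), (1.5, 3.0)])
```

As the backlog Q grows, the Q-weighted term overrides the cost term and the controller switches to the faster action, which is what keeps the queues stable.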
4. Optimization Techniques for GPU-Based Parallel Programming Models in High-Performance Computing
Authors: Shuntao Tang, Wei Chen. 《信息工程期刊(中英文版)》, 2024, Issue 1, pp. 7-11 (5 pages)
This study embarks on a comprehensive examination of optimization techniques within GPU-based parallel programming models, pivotal for advancing high-performance computing (HPC). Emphasizing the transition of GPUs from graphics-centric processors to versatile computing units, it delves into the nuanced optimization of memory access, thread management, algorithmic design, and data structures. These optimizations are critical for exploiting the parallel processing capabilities of GPUs, addressing both the theoretical frameworks and practical implementations. By integrating advanced strategies such as memory coalescing, dynamic scheduling, and parallel algorithmic transformations, this research aims to significantly elevate computational efficiency and throughput. The findings underscore the potential of optimized GPU programming to revolutionize computational tasks across various domains, highlighting a pathway towards achieving unparalleled processing power and efficiency in HPC environments. The paper not only contributes to the academic discourse on GPU optimization but also provides actionable insights for developers, fostering advancements in computational sciences and technology.
Keywords: optimization techniques; GPU-based parallel programming models; high-performance computing
5. Computing Power Network: A Survey
Authors: Sun Yukun, Lei Bo, Liu Junlin, Huang Haonan, Zhang Xing, Peng Jing, Wang Wenbo. 《China Communications》 (SCIE, CSCD), 2024, Issue 9, pp. 109-145 (37 pages)
With the rapid development of cloud computing, edge computing, and smart devices, computing power resources show a trend of ubiquitous deployment. Traditional network architecture cannot efficiently leverage these distributed computing power resources due to the computing power island effect. To overcome these problems and improve network efficiency, a new network computing paradigm has been proposed: the Computing Power Network (CPN). A computing power network connects ubiquitous and heterogeneous computing power resources through networking to realize flexible computing power scheduling. In this survey, we make an exhaustive review of state-of-the-art research efforts on computing power networks. We first give an overview of the computing power network, including its definition, architecture, and advantages. Next, a comprehensive elaboration of issues in computing power modeling, information awareness and announcement, resource allocation, network forwarding, computing power transaction platforms, and resource orchestration platforms is presented. A computing power network testbed is built and evaluated. Applications and use cases in computing power networks are discussed. Then, the key enabling technologies for computing power networks are introduced. Finally, open challenges and future research directions are presented as well.
Keywords: computing power modeling; computing power network; computing power scheduling; information awareness; network forwarding
6. SCIRD: Revealing Infection of Malicious Software in Edge Computing-Enabled IoT Networks
Authors: Jiehao Ye, Wen Cheng, Xiaolong Liu, Wenyi Zhu, Xuan'ang Wu, Shigen Shen. 《Computers, Materials & Continua》 (SCIE, EI), 2024, Issue 5, pp. 2743-2769 (27 pages)
The Internet of Things (IoT) has characteristics such as node mobility, node heterogeneity, link heterogeneity, and topology heterogeneity. In the face of these characteristics and the explosive growth of IoT nodes, which brings about large-scale data processing requirements, edge computing architecture has become an emerging network architecture to support IoT applications due to its ability to provide powerful computing capabilities and good service functions. However, the defense mechanisms of Edge Computing-enabled IoT Nodes (ECIoTNs) are still weak due to their limited resources, so they are susceptible to malicious software spread, which can compromise data confidentiality and network service availability. Facing this situation, we put forward an epidemiology-based susceptible-curb-infectious-removed-dead (SCIRD) model. Then, we analyze the dynamics of ECIoTNs with different infection levels under different initial conditions to obtain the dynamic differential equations. Additionally, we establish the presence of equilibrium states in the SCIRD model. Furthermore, we conduct an analysis of the model's stability and examine the conditions under which malicious software will either spread or disappear within Edge Computing-enabled IoT (ECIoT) networks. Lastly, we validate the efficacy and superiority of the SCIRD model through MATLAB simulations. These research findings offer a theoretical foundation for suppressing the propagation of malicious software in ECIoT networks. The experimental results indicate that the theoretical SCIRD model has instructive significance, deeply revealing the principles of malicious software propagation in ECIoT networks. This study solves a challenging security problem of ECIoT networks by determining the malicious software propagation threshold, which lays the foundation for building more secure and reliable ECIoT networks.
Keywords: edge computing; Internet of Things; malicious software; propagation model; heterogeneity
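Epidemic compartment models of this kind are systems of coupled ODEs integrated numerically. The sketch below is an SCIRD-shaped forward-Euler integration; the compartment flows and all rate constants are illustrative assumptions, not the paper's fitted equations:

```python
def scird_step(state, dt=0.1, beta=0.4, sigma=0.3, gamma=0.1, delta=0.05):
    """One forward-Euler step of an SCIRD-shaped compartment model.
    Flows and rate constants are illustrative assumptions only."""
    S, C, I, R, D = state
    N = S + C + I + R + D
    new_inf = beta * S * I / N          # susceptible nodes exposed to malware
    dS = -new_inf
    dC = new_inf - sigma * C            # curb stage -> infectious
    dI = sigma * C - (gamma + delta) * I
    dR = gamma * I                      # removed (patched / cleaned) nodes
    dD = delta * I                      # dead (unrecoverable) nodes
    return tuple(v + dt * dv for v, dv in zip(state, (dS, dC, dI, dR, dD)))

state = (990.0, 0.0, 10.0, 0.0, 0.0)    # 1000 nodes, 10 initially infectious
for _ in range(2000):
    state = scird_step(state)
```

Since every outflow from one compartment is an inflow to another, the total population is conserved, which is a useful sanity check on any implementation.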
7. Computational Fluid Dynamics Approach for Predicting Pipeline Response to Various Blast Scenarios: A Numerical Modeling Study
Authors: Farman Saifi, Mohd Javaid, Abid Haleem, S. M. Anas. 《Computer Modeling in Engineering & Sciences》 (SCIE, EI), 2024, Issue 9, pp. 2747-2777 (31 pages)
Recent industrial explosions globally have intensified the focus in mechanical engineering on designing infrastructure systems and networks capable of withstanding blast loading. Initially centered on high-profile facilities such as embassies and petrochemical plants, this concern now extends to a wider array of infrastructures and facilities. Engineers and scholars increasingly prioritize structural safety against explosions, particularly to prevent disproportionate collapse and damage to nearby structures. Urbanization has further amplified the reliance on oil and gas pipelines, making them vital for urban life and prime targets for terrorist activities. Consequently, there is a growing imperative for computational engineering solutions to tackle blast loading on pipelines and mitigate associated risks to avert disasters. In this study, an empty pipe model was successfully validated under contact blast conditions using Abaqus software, a powerful tool in mechanical engineering for simulating blast effects on buried pipelines. Employing a Eulerian-Lagrangian computational fluid dynamics approach, the investigation extended to above-surface and below-surface blasts at standoff distances of 25 and 50 mm. Material descriptions in the numerical model relied on Abaqus's default mechanical models. Comparative analysis revealed varying pipe performance, with deformation decreasing as the explosion-to-pipe distance increased. The explosion's location relative to the pipe surface notably influenced deformation levels, a key finding highlighted in the study. Moreover, quantitative findings indicated varying ratios of plastic dissipation energy (PDE) for different blast scenarios compared to the contact blast (P0). Specifically, P1 (25 mm subsurface blast) and P2 (50 mm subsurface blast) showed approximately 24.07% and 14.77% of P0's PDE, respectively, while P3 (25 mm above-surface blast) and P4 (50 mm above-surface blast) exhibited lower PDE values, accounting for about 18.08% and 9.67% of P0's PDE, respectively. Utilising energy-absorbing materials such as thin coatings of ultra-high-strength concrete, metallic foams, carbon fiber-reinforced polymer wraps, and others on the pipeline to effectively mitigate blast damage is recommended. This research contributes to the advancement of mechanical engineering by providing insights and solutions crucial for enhancing the resilience and safety of underground pipelines in the face of blast events.
Keywords: blast loading; computational fluid dynamics; computer modeling; pipe networks; response prediction; structural safety
8. Numerical Analysis of Bacterial Meningitis Stochastic Delayed Epidemic Model through Computational Methods
Authors: Umar Shafique, Mohamed Mahyoub Al-Shamiri, Ali Raza, Emad Fadhal, Muhammad Rafiq, Nauman Ahmed. 《Computer Modeling in Engineering & Sciences》 (SCIE, EI), 2024, Issue 10, pp. 311-329 (19 pages)
According to the World Health Organization (WHO), meningitis is a severe infection of the meninges, the membranes covering the brain and spinal cord. It is a devastating disease and remains a significant public health challenge. This study investigates a bacterial meningitis model through deterministic and stochastic versions. Four-compartment population dynamics explain the concept, particularly the susceptible, carrier, infected, and recovered populations. The model predicts the nonnegative equilibrium points and reproduction number, i.e., the Meningitis-Free Equilibrium (MFE) and Meningitis-Existing Equilibrium (MEE). For the stochastic version of the existing deterministic model, the two methodologies studied are transition probabilities and non-parametric perturbations. Also, positivity, boundedness, extinction, and disease persistence are studied rigorously with the help of well-known theorems. Standard and nonstandard techniques such as Euler-Maruyama, stochastic Euler, stochastic Runge-Kutta, and stochastic nonstandard finite difference in the sense of delay have been presented for computational analysis of the stochastic model. Unfortunately, standard methods fail to restore the biological properties of the model, so the stochastic nonstandard finite difference approximation is offered as an efficient, low-cost method that is independent of time step size. In addition, the convergence, local stability, and global stability around the equilibria of the nonstandard computational method are studied by assuming the perturbation effect is zero. Simulations and comparisons of the methods are presented to support the theoretical results and for the best visualization of results.
Keywords: bacterial meningitis disease; stochastic delayed model; stability analysis; extinction and persistence; computational methods
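Euler-Maruyama, the first scheme the abstract lists, discretizes an SDE dX = mu(X) dt + sigma(X) dW by adding a Gaussian increment each step. A generic sketch; the logistic drift and noise level in the example SDE are toy assumptions, not the paper's meningitis model:

```python
import math
import random

def euler_maruyama(x0, mu, sigma, T=1.0, n=1000, seed=1):
    """Simulate dX = mu(X) dt + sigma(X) dW on [0, T] with n Euler-Maruyama steps."""
    rng = random.Random(seed)
    dt = T / n
    x, path = x0, [x0]
    for _ in range(n):
        dW = rng.gauss(0.0, math.sqrt(dt))     # Brownian increment ~ N(0, dt)
        x = x + mu(x) * dt + sigma(x) * dW
        path.append(x)
    return path

# Toy prevalence SDE: logistic drift with small multiplicative noise
path = euler_maruyama(0.1, mu=lambda x: 0.5 * x * (1 - x), sigma=lambda x: 0.05 * x)
```

The abstract's point is that such standard schemes can violate positivity or boundedness for large step sizes, which motivates the nonstandard finite-difference alternative.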
9. Computation Tree Logic Model Checking of Multi-Agent Systems Based on Fuzzy Epistemic Interpreted Systems
Authors: Xia Li, Zhanyou Ma, Zhibao Mian, Ziyuan Liu, Ruiqi Huang, Nana He. 《Computers, Materials & Continua》 (SCIE, EI), 2024, Issue 3, pp. 4129-4152 (24 pages)
Model checking is an automated formal verification method used to verify whether epistemic multi-agent systems adhere to property specifications. Although there is an extensive literature on qualitative properties such as safety and liveness, there is still a lack of quantitative and uncertain property verification for these systems. In uncertain environments, agents must make judicious decisions based on subjective epistemic information. To verify epistemic and measurable properties in multi-agent systems, this paper extends fuzzy computation tree logic by introducing epistemic modalities and proposing a new Fuzzy Computation Tree Logic of Knowledge (FCTLK). We represent fuzzy multi-agent systems as distributed knowledge bases with fuzzy epistemic interpreted systems. In addition, we provide a transformation algorithm from fuzzy epistemic interpreted systems to fuzzy Kripke structures, as well as transformation rules from FCTLK formulas to Fuzzy Computation Tree Logic (FCTL) formulas. Accordingly, we transform the FCTLK model checking problem into FCTL model checking. This enables the verification of FCTLK formulas by using the fuzzy model checking algorithm of FCTL without additional computational overhead. Finally, we present correctness proofs and complexity analyses of the proposed algorithms. Additionally, we illustrate the practical application of our approach through an example of a train control system.
Keywords: model checking; multi-agent systems; fuzzy epistemic interpreted systems; fuzzy computation tree logic; transformation algorithm
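In fuzzy model checking over a fuzzy Kripke structure, formulas evaluate to truth degrees in [0, 1] rather than booleans. As one representative operator, here is a sketch of evaluating "EX phi" (there exists a next state satisfying phi); the three-state structure and its degrees are hypothetical, not taken from the paper:

```python
def fuzzy_ex(phi, trans):
    """Fuzzy 'EX phi': truth degree at s is the best one-step reachability,
    i.e. max over successors t of min(transition degree R(s, t), phi(t))."""
    return {s: max(min(deg, phi[t]) for t, deg in succs.items())
            for s, succs in trans.items()}

# A hypothetical three-state fuzzy Kripke structure
trans = {"s0": {"s1": 0.9, "s2": 0.4},
         "s1": {"s1": 1.0},
         "s2": {"s0": 0.7}}
phi = {"s0": 0.2, "s1": 0.8, "s2": 1.0}   # fuzzy truth degrees of phi
v = fuzzy_ex(phi, trans)
```

Replacing max/min with other t-norms gives other fuzzy semantics; the min/max pair used here is the common Zadeh interpretation.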
10. Real-Time Monitoring Method for Cow Rumination Behavior Based on Edge Computing and Improved MobileNet v3
Authors: ZHANG Yu, LI Xiangting, SUN Yalin, XUE Aidi, ZHANG Yi, JIANG Hailong, SHEN Weizheng. 《智慧农业(中英文)》 (CSCD), 2024, Issue 4, pp. 29-41 (13 pages)
[Objective] Real-time monitoring of cow rumination behavior is of paramount importance for promptly obtaining relevant information about cow health and predicting cow diseases. Currently, various strategies have been proposed for monitoring cow rumination behavior, including video surveillance, sound recognition, and sensor monitoring methods. However, the application of edge devices gives rise to the issue of inadequate real-time performance. To reduce the volume of data transmission and the cloud computing workload while achieving real-time monitoring of dairy cow rumination behavior, a real-time monitoring method based on edge computing was proposed for cow rumination behavior. [Methods] Autonomously designed edge devices were utilized to collect and process six-axis acceleration signals from cows in real time. Based on these six-axis data, two distinct strategies, federated edge intelligence and split edge intelligence, were investigated for the real-time recognition of cow rumination behavior. For the recognition method leveraging federated edge intelligence, the CA-MobileNet v3 network was proposed by enhancing the MobileNet v3 network with a collaborative attention mechanism. Additionally, a federated edge intelligence model was designed utilizing the CA-MobileNet v3 network and the FedAvg federated aggregation algorithm. In the study on split edge intelligence, a split edge intelligence model named MobileNet-LSTM was designed by integrating the MobileNet v3 network with a fusion collaborative attention mechanism and the Bi-LSTM network. [Results and Discussions] Through comparative experiments with MobileNet v3 and MobileNet-LSTM, the federated edge intelligence model based on CA-MobileNet v3 achieved an average Precision, Recall, F1-Score, Specificity, and Accuracy of 97.1%, 97.9%, 97.5%, 98.3%, and 98.2%, respectively, yielding the best recognition performance. [Conclusions] This work provides a real-time and effective method for monitoring cow rumination behavior, and the proposed federated edge intelligence model can be applied in practical settings.
Keywords: cow rumination behavior; real-time monitoring; edge computing; improved MobileNet v3; edge intelligence model; Bi-LSTM
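The FedAvg aggregation step the paper relies on is simply a sample-count-weighted average of client model parameters. A minimal sketch with two toy clients (the parameter vectors and sample counts are illustrative, not the paper's models):

```python
def fed_avg(client_weights, client_sizes):
    """FedAvg aggregation: average client parameter vectors weighted by
    each client's local sample count."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [sum(w[j] * n for w, n in zip(client_weights, client_sizes)) / total
            for j in range(dim)]

# Two edge devices with 1 and 3 local samples (toy 2-parameter "models")
global_w = fed_avg([[1.0, 0.0], [3.0, 2.0]], client_sizes=[1, 3])
```

Only these aggregated parameters travel to the server, which is why the federated strategy cuts the data-transmission volume that the abstract targets.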
11. Research on the Assessment System of Computational Mechanics Courses Based on the TOPSIS Entropy Weight Model
Authors: Huijun Ning, Ruhuan Yu, Qianshu Wang, Mingming Lin. 《Journal of Contemporary Educational Research》, 2024, Issue 6, pp. 166-182 (17 pages)
Against the background of assessment and evaluation in computational mechanics courses, this paper constructs a diversified, student-centered course evaluation system that integrates both quantitative and qualitative evaluation methods. The system not only pays attention to students' practical skills and mastery of theoretical knowledge but also places special emphasis on cultivating students' innovative abilities. In order to realize a comprehensive and objective evaluation, an assessment method combining TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) multi-attribute decision analysis with entropy weight theory is adopted, and its validity and practicability are verified through example analysis. This method can not only comprehensively and objectively evaluate students' learning outcomes but also provide a scientific decision-making basis for curriculum teaching reform. The implementation of this diversified course evaluation system can better reflect students' comprehensive abilities and promote the continuous improvement of teaching quality.
Keywords: TOPSIS entropy weight model; computational mechanics; course assessment and evaluation system; assessment model
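The entropy-weight TOPSIS pipeline has two steps: derive objective criterion weights from the information entropy of the score matrix, then rank alternatives by closeness to the ideal solution. A compact sketch assuming all criteria are benefit-type; the 3x3 score matrix is a toy example, not the paper's data:

```python
import math

def entropy_weights(X):
    """Objective criterion weights from information entropy (benefit criteria)."""
    m = len(X)
    k = 1.0 / math.log(m)
    divergences = []
    for col in zip(*X):
        total = sum(col)
        probs = [x / total for x in col]
        e = -k * sum(p * math.log(p) for p in probs if p > 0)  # entropy in [0, 1]
        divergences.append(1.0 - e)             # higher divergence -> more weight
    s = sum(divergences)
    return [d / s for d in divergences]

def topsis(X, w):
    """Closeness of each alternative to the positive ideal solution."""
    norms = [math.sqrt(sum(x * x for x in col)) for col in zip(*X)]
    V = [[w[j] * row[j] / norms[j] for j in range(len(w))] for row in X]
    best = [max(col) for col in zip(*V)]
    worst = [min(col) for col in zip(*V)]
    scores = []
    for v in V:
        dp = math.dist(v, best)    # distance to positive ideal
        dm = math.dist(v, worst)   # distance to negative ideal
        scores.append(dm / (dp + dm))
    return scores

X = [[90, 85, 88], [60, 55, 58], [75, 70, 72]]   # 3 students x 3 criteria (toy)
w = entropy_weights(X)
scores = topsis(X, w)
```

The alternative that dominates every criterion gets closeness 1, the dominated one gets 0, so the scores directly induce the student ranking.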
12. A Novel Parallel Computing Confidentiality Scheme Based on Hindmarsh-Rose Model
Authors: Jawad Ahmad, Mimonah Al Qathrady, Mohammed S. Alshehri, Yazeed Yasin Ghadi, Mujeeb Ur Rehman, Syed Aziz Shah. 《Computers, Materials & Continua》 (SCIE, EI), 2023, Issue 8, pp. 1325-1341 (17 pages)
Due to the inherently insecure nature of the Internet, it is crucial to ensure the secure transmission of image data over this network. Additionally, given the limitations of computers, it becomes even more important to employ efficient and fast image encryption techniques. While 1D chaotic maps offer a practical approach to real-time image encryption, their limited flexibility and increased vulnerability restrict their practical application. In this research, we have utilized a 3D Hindmarsh-Rose model to construct a secure cryptosystem. The randomness of the chaotic map is assessed through standard analysis. The proposed system enhances security by incorporating an increased number of system parameters and a wide range of chaotic parameters, as well as by ensuring a uniform distribution of chaotic signals across the entire value space. Additionally, a fast image encryption technique utilizing the new chaotic system is proposed. The novelty of the approach is confirmed through time complexity analysis. To further strengthen resistance against cryptanalysis and differential attacks, the SHA-256 algorithm is employed for secure key generation. Experimental results across a number of parameters demonstrate the strong cryptographic performance of the proposed image encryption approach, highlighting its exceptional suitability for secure communication. Moreover, the security of the proposed scheme has been compared with state-of-the-art image encryption schemes, and all comparison metrics indicate the superior performance of the proposed scheme.
Keywords: Hindmarsh-Rose model; image encryption; SHA-256; parallel computing
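The 3D Hindmarsh-Rose system that serves as the chaos source is three coupled ODEs. A forward-Euler sketch with the standard textbook parameters (the integration scheme, step size, and initial state are illustrative assumptions; the paper's keystream construction is not reproduced here):

```python
def hindmarsh_rose(steps=20000, dt=0.005, I=3.25):
    """Forward-Euler integration of the 3-D Hindmarsh-Rose neuron model
    with the standard parameter set (bursting regime at I = 3.25)."""
    a, b, c, d, r, s, x_rest = 1.0, 3.0, 1.0, 5.0, 0.006, 4.0, -1.6
    x, y, z = -1.6, 0.0, 0.0
    xs = []
    for _ in range(steps):
        dx = y - a * x**3 + b * x**2 - z + I   # membrane potential
        dy = c - d * x**2 - y                  # fast recovery variable
        dz = r * (s * (x - x_rest) - z)        # slow adaptation current
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xs.append(x)
    return xs

xs = hindmarsh_rose()   # chaotic spiking/bursting trace, usable as a chaos source
```

A cryptosystem would post-process such a trajectory (e.g., quantize and whiten it) before using it as a keystream; the raw trace alone is not cryptographically uniform.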
13. A Dynamic Bayesian-Based Comprehensive Trust Evaluation Model for Dispersed Computing Environment
Authors: Hongwen Hui, Zhengxia Gong, Jianwei An, Jianzhong Qi. 《China Communications》 (SCIE, CSCD), 2023, Issue 2, pp. 278-288 (11 pages)
Dispersed computing is a new resource-centric computing paradigm. Due to its high degree of openness and decentralization, it is vulnerable to attacks, and security issues have become an important challenge hindering its development. Trust evaluation technology is of great significance to the reliable operation and security assurance of dispersed computing networks. In this paper, a dynamic Bayesian-based comprehensive trust evaluation model is proposed for the dispersed computing environment. Specifically, in the calculation of direct trust, a logarithmic decay function and a sliding window are introduced to improve timeliness. In the calculation of indirect trust, a random screening method based on the sine function is designed, which excludes malicious nodes providing false reports as well as collusion attacks by multiple malicious nodes. Finally, the comprehensive trust value is dynamically updated based on historical interactions, current interactions, and momentary changes. Simulation experiments are introduced to verify the performance of the model. Compared with existing models, the proposed trust evaluation model performs better in terms of the detection rate of malicious nodes, the interaction success rate, and the computational cost.
Keywords: dispersed computing; trust evaluation model; malicious node; interaction success rate; detection rate
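Bayesian direct trust of this kind is typically the mean of a Beta posterior over success/failure evidence, with old evidence discounted. The sketch below combines a sliding window with a logarithmic age decay; the specific decay form 1/(1 + ln(1 + age)) and window size are illustrative assumptions, not the paper's functions:

```python
import math

def direct_trust(interactions, now, window=5):
    """Beta-posterior direct trust over a sliding window of the most recent
    interactions, discounted by a logarithmic decay in evidence age."""
    recent = sorted(interactions)[-window:]          # sliding window
    alpha = beta = 0.0
    for t, success in recent:
        weight = 1.0 / (1.0 + math.log1p(now - t))   # older evidence counts less
        if success:
            alpha += weight
        else:
            beta += weight
    return (alpha + 1.0) / (alpha + beta + 2.0)      # mean of Beta(alpha+1, beta+1)

good = direct_trust([(t, True) for t in range(10)], now=10)
mixed = direct_trust([(t, t % 2 == 0) for t in range(10)], now=10)
```

The +1/+2 terms are the uniform Beta(1, 1) prior, so a node with no history starts at a neutral trust of 0.5 rather than at an extreme.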
14. 3D Model Occlusion Culling Optimization Method Based on WebGPU Computing Pipeline
Authors: Liming Ye, Gang Liu, Genshen Chen, Kang Li, Qiyu Chen, Wenyao Fan, Junjie Zhang. 《Computer Systems Science & Engineering》 (SCIE, EI), 2023, Issue 11, pp. 2529-2545 (17 pages)
Nowadays, Web browsers have become an important carrier of 3D model visualization because of their convenience and portability. Large-scale 3D model visualization in Web scenes suffers from slow rendering speed and low FPS (Frames Per Second); occlusion culling, as an important method for rendering optimization, can remove most of the occluded objects and improve rendering efficiency. The traditional occlusion culling algorithm (TOCA) is calculated by traversing all objects in the scene, which involves a large amount of repeated calculation and time consumption. To advance the rendering process and enhance rendering efficiency, this paper proposes occlusion culling with three different optimization methods based on the WebGPU computing pipeline. Firstly, to address the large amount of repeated calculation in TOCA, these units are moved from the CPU to the GPU for parallel computing, thereby accelerating the calculation of Potential Visible Sets (PVS). Then, to address the huge overhead of pipeline creation caused by too many 3D models in a scene, the Breaking Occlusion Culling Algorithm (BOCA) is introduced, which removes some nodes by building a Bounding Volume Hierarchy (BVH) scene tree to reduce the overhead of creating pipelines. After that, the structure of the scene tree is transmitted to the GPU in depth-first traversal order and, finally, the PVS is obtained by parallel computing. In the experiments, 3D geological models at five different scales from 1:5,000 to 1:500,000 are used for testing. The results show that the proposed methods can effectively reduce the time overhead of repeated calculation caused by computing pipeline creation and recursive scene-tree traversal in the occlusion culling algorithm, with a 97% rendering efficiency improvement compared with BOCA, thereby accelerating the rendering process in Web browsers.
Keywords: WebGPU; potential visible set; occlusion culling; computing pipeline; 3D model
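At its core, building a Potential Visible Set means dropping every object that a nearer occluder fully hides. A deliberately simplified CPU sketch using screen-space rectangles with depth (the real method runs per-object tests in parallel on the GPU over a BVH; the rectangle test here is an illustrative stand-in):

```python
def is_occluded(rect, occluders):
    """rect: (xmin, ymin, xmax, ymax, depth) in screen space; smaller depth is nearer.
    Conservatively occluded only if fully covered by a single nearer occluder."""
    x0, y0, x1, y1, d = rect
    return any(ox0 <= x0 and oy0 <= y0 and ox1 >= x1 and oy1 >= y1 and od < d
               for ox0, oy0, ox1, oy1, od in occluders)

def potential_visible_set(objects, occluders):
    """Keep every object that no occluder fully hides."""
    return [name for name, rect in objects.items()
            if not is_occluded(rect, occluders)]

objects = {"hidden_box": (2, 2, 4, 4, 10.0),
           "visible_box": (8, 8, 12, 12, 6.0)}
pvs = potential_visible_set(objects, occluders=[(0, 0, 5, 5, 1.0)])
```

Each object's test is independent of the others, which is exactly why the per-object loop maps so well onto a GPU compute pipeline.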
15. Leveraging Quantum Computing for the Ising Model to Simulate Two Real Systems: Magnetic Materials and Biological Neural Networks (BNNs)
Authors: David L. Cao, Khoi Dinh. 《Journal of Quantum Information Science》, 2023, Issue 3, pp. 138-155 (18 pages)
Quantum computing is a field with increasing relevance as quantum hardware improves and more applications of quantum computing are discovered. In this paper, we demonstrate the feasibility of modeling Ising model Hamiltonians on an IBM quantum computer. We developed quantum circuits to simulate these systems more efficiently for both closed- and open-boundary Ising models, with and without perturbations. We tested these various system geometries in both 1-D and 2-D space to mimic two real systems: magnetic materials and biological neural networks (BNNs). Our quantum model is more efficient than classical computers, which can struggle to simulate large, complex systems of particles.
Keywords: Ising model; magnetic material; biological neural network; quantum computing; International Business Machines (IBM)
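The classical baseline the authors compare against is brute-force evaluation of the Ising energy E = -J·Σ s_i s_{i+1} - h·Σ s_i over all 2^n spin configurations, which is what becomes intractable for large n. A small classical sketch for the 1-D chain (closed or open boundary, matching the two geometries the abstract mentions):

```python
from itertools import product

def ising_energy(spins, J=1.0, h=0.0, closed=True):
    """Energy of a 1-D Ising chain: E = -J * sum(s_i * s_{i+1}) - h * sum(s_i)."""
    n = len(spins)
    pairs = range(n if closed else n - 1)        # ring wraps around; chain does not
    coupling = sum(spins[i] * spins[(i + 1) % n] for i in pairs)
    return -J * coupling - h * sum(spins)

def ground_state(n, J=1.0, h=0.0):
    """Brute-force ground state over all 2^n spin configurations."""
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(s, J, h))
```

With ferromagnetic coupling (J > 0) the aligned configurations minimize the energy; the exponential cost of this enumeration is precisely the bottleneck quantum simulation targets.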
16. Computing of LQR Technique for Nonlinear System Using Local Approximation (Cited: 1)
Authors: Aamir Shahzad, Ali Altalbe. 《Computer Systems Science & Engineering》 (SCIE, EI), 2023, Issue 7, pp. 853-871 (19 pages)
The main idea behind the present research is to design a state-feedback controller for an underactuated nonlinear rotary inverted pendulum module by employing the linear quadratic regulator (LQR) technique using local approximation. The LQR is an excellent method for developing a controller for nonlinear systems. It provides optimal feedback that makes the closed-loop system robust and stable, rejecting external disturbances. A model-based optimal controller for a nonlinear system such as a rotary inverted pendulum had not previously been designed and implemented using the Newton-Euler and Lagrange methods with local approximation; therefore, implementing LQR for an underactuated nonlinear system was vital to designing a stable controller. A mathematical model was developed for the controller design by utilizing the Newton-Euler and Lagrange methods. The nonlinear model was linearized around an equilibrium point, and the linear and nonlinear models were compared to find the range in which their behaviour is similar. The MATLAB lqr function and the system dynamics were used to estimate the controller parameters, and Simulink was used for the performance evaluation of the designed controller. The linear and nonlinear models were simulated along with the designed controller, and simulations were performed under different conditions by varying system variables. The results show that the system is stable and robust enough to act against external disturbances. The controller maintains the rotary inverted pendulum in an upright position and rejects disruptions such as falling under gravitational force or any external disturbance by adjusting the rotation of the horizontal link within a specific range, in both linear and nonlinear environments. The controller has been practically designed and implemented, and the results clearly show that it is robust enough to reject disturbances within milliseconds and keeps the pendulum arm deflection angle at zero degrees.
Keywords: computing; rotary inverted pendulum (RIP); modeling and simulation; linear quadratic regulator (LQR); nonlinear system
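The LQR design step described in this abstract can be sketched in a few lines. The plant below is a generic pendulum linearized about its upright equilibrium, not the paper's actual rotary-pendulum model, and the Q/R weights are illustrative tuning assumptions; the gain is obtained from the continuous-time algebraic Riccati equation via SciPy.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Toy linearization of a pendulum about its upright equilibrium,
# with state x = [theta, theta_dot]. This is an illustrative unstable
# second-order plant, NOT the paper's rotary-pendulum dynamics.
g, l = 9.81, 0.5
A = np.array([[0.0, 1.0],
              [g / l, 0.0]])      # open loop has a pole at +sqrt(g/l)
B = np.array([[0.0],
              [1.0]])

Q = np.diag([10.0, 1.0])          # state weights (tuning assumption)
R = np.array([[1.0]])             # input weight (tuning assumption)

# Solve the continuous-time algebraic Riccati equation and form the
# optimal state-feedback gain for u = -K x.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

poles = np.linalg.eigvals(A - B @ K)
print(poles.real)                 # all negative: closed loop is stable
```

The check on the closed-loop eigenvalues plays the role of the stability verification the paper performs in Simulink.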
Computational Experiments for Complex Social Systems: Experiment Design and Generative Explanation (cited by 2)
17
Authors: Xiao Xue, Deyu Zhou, Xiangning Yu, Gang Wang, Juanjuan Li, Xia Xie, Lizhen Cui, Fei-Yue Wang. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2024, No. 4, pp. 1022-1038 (17 pages)
Powered by advanced information technology, more and more complex systems are exhibiting characteristics of cyber-physical-social systems (CPSS). In this context, the computational experiments method has emerged as a novel approach for the design, analysis, management, control, and integration of CPSS, which can realize the causal analysis of complex systems by means of the "algorithmization" of "counterfactuals". However, because CPSS involve human and social factors (e.g., autonomy, initiative, and sociality), it is difficult for traditional design of experiments (DOE) methods to achieve a generative explanation of system emergence. To address this challenge, this paper proposes an integrated approach to the design of computational experiments, incorporating three key modules: 1) a descriptive module, which determines the influencing factors and response variables of the system by modeling an artificial society; 2) an interpretative module, which selects a factorial experimental design to identify the relationship between influencing factors and macro phenomena; and 3) a predictive module, which builds a meta-model equivalent to the artificial society to explore its operating laws. Finally, a case study of crowd-sourcing platforms is presented to illustrate the application process and effectiveness of the proposed approach, which can reveal the social impact of algorithmic behavior on the "rider race".
Keywords: agent-based modeling; computational experiments; cyber-physical-social systems (CPSS); generative deduction; generative experiments; meta model
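The interpretative module's factorial design amounts to enumerating every combination of influencing-factor levels. A minimal sketch follows; the factor names and levels are invented for illustration (the paper's case study concerns crowd-sourcing platforms), not taken from the paper itself.

```python
from itertools import product

# Hypothetical full-factorial design: each influencing factor gets a
# small set of levels, and one experiment run is scheduled for every
# combination of levels.
factors = {
    "platform_subsidy": [0.0, 0.5, 1.0],   # per-order subsidy level
    "rider_density": ["low", "high"],      # agents per unit area
}
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for run_id, setting in enumerate(design):
    print(run_id, setting)                 # 3 x 2 = 6 experiment runs
```

Each `setting` dict would parameterize one run of the artificial-society simulation, and the resulting (factors, response) pairs feed the predictive module's meta-model.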
MTBAC: A Mutual Trust Based Access Control Model in Cloud Computing (cited by 12)
18
Authors: LIN Guoyuan, WANG Danru, BIE Yuyu, LEI Min. China Communications (SCIE, CSCD), 2014, No. 4, pp. 154-162 (9 pages)
As a new computing mode, cloud computing can provide users with virtualized and scalable web services, which, however, face serious security challenges. Access control is one of the most important measures to ensure the security of cloud computing, but applying a traditional access control model directly to the Cloud cannot solve the uncertainty and vulnerability caused by the open conditions of cloud computing. In a cloud computing environment, data security during interactions between users and the Cloud can be effectively guaranteed only when the security and reliability of both interacting parties are ensured. Therefore, building a mutual trust relationship between users and the cloud platform is the key to implementing new kinds of access control methods in the cloud computing environment. Combining this with Trust Management (TM), a mutual trust based access control (MTBAC) model is proposed in this paper. The MTBAC model takes both the user's behavior trust and the cloud service node's credibility into consideration: trust relationships between users and cloud service nodes are established by a mutual trust mechanism, and the security problems of access control are solved by implementing the MTBAC model in the cloud computing environment. Simulation experiments show that the MTBAC model can guarantee secure interaction between users and cloud service nodes.
Keywords: cloud computing; access control; trust model; mutual trust mechanism; MTBAC
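The abstract gives no concrete formulas for MTBAC, so the following is only a toy sketch of the idea: combine the user's behavior trust with the service node's credibility and grant access when the mutual trust clears a threshold. The aggregation rule, weights, and threshold are all invented for illustration.

```python
# Hypothetical mutual-trust evaluation; not MTBAC's actual equations.

def behavior_trust(scores):
    """User trust as the mean of past interaction scores in [0, 1]."""
    return sum(scores) / len(scores)

def mutual_trust(user_scores, node_credibility, alpha=0.5):
    """Weighted combination of user-side and node-side trust."""
    return alpha * behavior_trust(user_scores) + (1 - alpha) * node_credibility

def grant_access(user_scores, node_credibility, threshold=0.6):
    """Access decision: allow only when mutual trust is high enough."""
    return mutual_trust(user_scores, node_credibility) >= threshold

print(grant_access([0.9, 0.8], 0.7))   # True: both sides are trusted
print(grant_access([0.2, 0.3], 0.4))   # False: low mutual trust
```

The point of the two-sided combination is that neither a well-behaved user on a dubious node nor a credible node serving a misbehaving user should pass on its own.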
Development of multiple soft computing models for estimating organic and inorganic constituents in coal (cited by 7)
19
Authors: M. Onifade, A.I. Lawal, J. Abdulsalam, B. Genc, S. Bada, K.O. Said, A.R. Gbadamosi. International Journal of Mining Science and Technology (SCIE, EI, CAS, CSCD), 2021, No. 3, pp. 483-494 (12 pages)
The distribution of the various organic and inorganic constituents and their influence on the combustion of coal has been comprehensively studied. However, the combustion characteristics of pulverized coal depend not only on rank but also on the composition, distribution, and combination of the macerals. Unlike proximate and ultimate analyses, determining the macerals in coal requires sophisticated microscopic instrumentation and expertise. In this study, an attempt was made to predict the amount of macerals (vitrinite, inertinite, and liptinite) and total mineral matter from Witbank Coalfield samples using a multiple input single output white-box artificial neural network (MISOWB-ANN), gene expression programming (GEP), multiple linear regression (MLR), and multiple nonlinear regression (MNLR). The predictive models obtained from the adopted soft computing methods are contrasted with one another using difference, efficiency, and composite statistical indicators to examine the appropriateness of the models. The MISOWB-ANN provides a more reliable predictive model than the other three, with the lowest difference and the highest efficiency and composite statistical indicators.
Keywords: multiple soft computing models; coal; organic and inorganic constituents
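Of the four models compared, MLR is the simplest to reproduce: an ordinary-least-squares fit of the target against several inputs. The sketch below uses synthetic data and invented feature names (the paper fits real Witbank Coalfield samples and also trains MISOWB-ANN, GEP, and MNLR models, which are not shown here).

```python
import numpy as np

# Minimal MLR sketch: fit a maceral fraction from three assumed
# proximate-analysis features via ordinary least squares on synthetic
# data, then check that the known coefficients are recovered.
rng = np.random.default_rng(42)
X = rng.uniform(0.0, 1.0, size=(60, 3))     # e.g. ash, volatiles, fixed carbon
true_coef, intercept = np.array([2.0, -1.0, 0.5]), 3.0
y = X @ true_coef + intercept + rng.normal(0.0, 0.01, size=60)

A = np.column_stack([X, np.ones(len(X))])   # append an intercept column
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(beta)                                  # close to [2.0, -1.0, 0.5, 3.0]
```

The same fitted-vs-observed residuals would then feed the difference, efficiency, and composite statistical indicators the paper uses for model comparison.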
A Unified Framework of the Cloud Computing Service Model (cited by 2)
20
Authors: Wen-Lung Shiau, Chao-Ming Hsiao. Journal of Electronic Science and Technology (CAS), 2013, No. 2, pp. 150-160 (11 pages)
After a comprehensive literature review and analysis, a unified cloud computing framework is proposed, which comprises MapReduce, a virtual machine, the Hadoop distributed file system (HDFS), HBase, Hadoop, and virtualization. This study also compares Microsoft, Trend Micro, and the proposed unified cloud computing architecture to show that the proposed unified framework of the cloud computing service model is comprehensive and appropriate for the current complexities of businesses. The findings of this study can contribute knowledge that helps academics and practitioners understand, assess, and analyze a cloud computing service application.
Keywords: cloud computing; service model; conceptual framework; evolution; information system