Abstract
Failures of the various resources in a grid are inevitable. To minimize the impact of task execution failures caused by resource failures, a grid task scheduling algorithm should not only minimize task execution time but also account for the risk of tasks failing on resources. This paper proposes the Risk-DLS (Dynamic Level Scheduling) algorithm, which combines a resource risk estimation model with the DLS algorithm. Simulation experiments comparing it with DLS show that, with appropriately chosen parameters, the new algorithm not only minimizes the completion time of DAG-structured applications but also improves the success rate of task execution, effectively reducing the impact of grid-environment uncertainty on task execution.
In grid computing, failures of resources such as machines and network links are inevitable and can adversely affect application execution, so the objective of task scheduling is not only to minimize application running time but also to reduce the risk of execution failures. A new scheduling algorithm named Risk-DLS (Dynamic Level Scheduling) is proposed, which combines a resource risk estimation model with the DLS algorithm. Simulations show that, with reasonably chosen parameters, Risk-DLS outperforms DLS in terms of running time while also reducing the adverse impact of uncertainty in the grid environment.
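The dynamic-level idea the abstract describes can be illustrated with a minimal sketch. Everything concrete here is an assumption for illustration, not the paper's actual model: the toy DAG, the machine speeds and failure rates, the exponential reliability term `success_prob`, and the weight `ALPHA` that trades makespan against risk are all hypothetical.

```python
import math
from functools import lru_cache

# Toy DAG: task -> (work units, set of predecessor tasks). Illustrative only.
DAG = {
    "A": (4, set()),
    "B": (3, {"A"}),
    "C": (2, {"A"}),
    "D": (5, {"B", "C"}),
}

# Machines: name -> (speed, failure rate per time unit). Hypothetical values.
MACHINES = {"m1": (1.0, 0.05), "m2": (2.0, 0.15)}

ALPHA = 1.0  # tunable weight balancing schedule length against failure risk


def exec_time(task, machine):
    work, _ = DAG[task]
    speed, _ = MACHINES[machine]
    return work / speed


@lru_cache(maxsize=None)
def static_level(task):
    # Longest downward path (in work units) from this task to an exit task.
    succs = [t for t, (_, preds) in DAG.items() if task in preds]
    tail = max((static_level(s) for s in succs), default=0)
    return DAG[task][0] + tail


def success_prob(task, machine):
    # Assumed exponential reliability model: P(success) = exp(-rate * runtime).
    _, rate = MACHINES[machine]
    return math.exp(-rate * exec_time(task, machine))


def risk_dls():
    """List-schedule the DAG by a risk-adjusted dynamic level."""
    finish = {}                              # task -> finish time
    free_at = {m: 0.0 for m in MACHINES}     # machine -> earliest free time
    done, order = set(), []
    while len(done) < len(DAG):
        ready = [t for t, (_, preds) in DAG.items()
                 if t not in done and preds <= done]
        best = None
        for t in ready:
            for m in MACHINES:
                est = max([free_at[m]] + [finish[p] for p in DAG[t][1]])
                # Dynamic level = static level - earliest start time,
                # plus a bonus for placements likely to succeed.
                dl = static_level(t) - est + ALPHA * success_prob(t, m)
                if best is None or dl > best[0]:
                    best = (dl, t, m, est)
        _, t, m, est = best
        finish[t] = est + exec_time(t, m)
        free_at[m] = finish[t]
        done.add(t)
        order.append((t, m))
    return order, max(finish.values())
```

With these toy numbers, the risk bonus steers the entry task onto the slower but more reliable `m1`, showing how the reliability term can override a pure makespan criterion.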
Source
《南京邮电大学学报(自然科学版)》
EI
2008, No. 5, pp. 24-29, 35 (7 pages)
Journal of Nanjing University of Posts and Telecommunications:Natural Science Edition
Funding
National Natural Science Foundation of China (60573141, 60773041)
National High Technology Research and Development Program of China (863 Program) (2006AA01Z201, 2006AA01Z439, 2007AA01Z404, 2007AA01Z478)
Jiangsu Provincial High Technology Research Program (BG2006001)
Foundation of the National Key Laboratory of Modern Communications (9140C1105040805)
Jiangsu Provincial Natural Science Foundation (BK2008451)