Abstract
The rapid growth of massive enterprise data places enormous pressure on the storage capacity, processing power, and data transmission bandwidth of disaster recovery systems. To address this problem, this paper designs a data disaster tolerance system based on data de-duplication. The system's performance is tested in terms of I/O throughput, CPU utilization, backup time, response time, and de-duplication ratio. The results indicate that the system has little effect on the response time of the application server and that the de-duplication effect is remarkable.
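The paper's source code is not included in this record. As a minimal illustrative sketch (all names hypothetical, not the authors' implementation), the de-duplication ratio reported in the tests can be understood as the fraction of data chunks eliminated when each fixed-size chunk is fingerprinted by a cryptographic hash and only one copy per unique fingerprint is stored:

```python
import hashlib

def dedup_ratio(data: bytes, chunk_size: int = 4096) -> float:
    """Split data into fixed-size chunks, keep one copy per unique
    SHA-256 fingerprint, and return the fraction of chunks eliminated."""
    seen = set()
    total = unique = 0
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        total += 1
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique += 1
    return 1 - unique / total

# A backup stream where the same 4 KiB block repeats heavily:
stream = (b"A" * 4096) * 9 + b"B" * 4096
print(dedup_ratio(stream))  # 0.8 -> 8 of 10 chunks were duplicates
```

Real de-duplication systems typically use content-defined (variable-size) chunking rather than fixed offsets, so that inserting a few bytes does not shift every subsequent chunk boundary; the fixed-size version above is only the simplest form of the idea.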
Source
《河北理工大学学报(自然科学版)》
CAS
2011, Issue 3, pp. 73-81 (9 pages)
Journal of Hebei Polytechnic University: Natural Science Edition
Keywords
data de-duplication
data disaster tolerance
storage optimization
performance testing