Abstract
Measuring per-flow sizes in high-speed networks faces the challenge of extremely scarce high-speed memory, which makes it difficult to meet the real-time storage demands of massive streaming data. Most existing work relies on memory-sharing techniques so that the designed estimators fit into the scarce high-speed on-chip cache. However, this approach introduces a large amount of noise that is hard to eliminate, leading to low estimation accuracy for medium and small flows. To address this issue, this paper proposes an Adaptive Sketch technique that adaptively adjusts the memory a flow occupies according to its size, and on top of it designs a high-precision, low-memory-cost per-flow size estimator. The Adaptive Sketch uses reversible counters to efficiently filter out the massive number of small noise flows, and further employs sampling counters whose sampling probabilities decrease level by level to adaptively sample and count flows of different sizes, thereby limiting the memory consumed by large flows and achieving low-cost, high-precision per-flow size measurement. Experiments on the real network dataset CAIDA 2019 show that the proposed Adaptive Sketch flow size estimator reduces the average relative error by nearly one order of magnitude compared with existing mechanisms.
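To make the layered sampling idea described above more concrete, the following is a minimal Python sketch of an adaptive sampled counter: a flow is counted exactly while small and, once it grows, is sampled with probabilities that decrease level by level, so large flows consume little extra memory. The class name, capacities, and the geometric probability schedule (p0, ratio, num_levels) are assumptions for illustration only; the actual Adaptive Sketch additionally relies on reversible counters and a shared hash-indexed structure that are not shown here.

```python
import random


class AdaptiveSampledCounter:
    """Toy multi-level sampled counter (illustrative only, not the paper's design).

    Small flows are counted exactly; once the exact range is exhausted,
    further packets go into per-level sampling counters whose sampling
    probabilities decrease level by level, so large flows occupy little
    additional memory while small flows keep full accuracy.
    """

    def __init__(self, exact_capacity=16, level_capacity=16,
                 p0=0.5, ratio=0.5, num_levels=4):
        self.exact_capacity = exact_capacity
        self.level_capacity = level_capacity
        # sampling probability of level l: p0 * ratio**l (decreases per level)
        self.probs = [p0 * ratio ** l for l in range(num_levels)]
        self.exact = 0                      # exact counter for small flows
        self.sampled = [0] * num_levels     # sampled counters, one per level

    def update(self):
        """Record the arrival of one packet belonging to this flow."""
        if self.exact < self.exact_capacity:
            self.exact += 1
            return
        # find the first level that still has room and sample into it
        for level, p in enumerate(self.probs):
            if self.sampled[level] < self.level_capacity:
                if random.random() < p:
                    self.sampled[level] += 1
                return
        # all levels full: the packet is dropped in this toy model

    def estimate(self):
        """Estimate the flow size by rescaling each sampled level by 1 / p."""
        return self.exact + sum(c / p for c, p in zip(self.sampled, self.probs))


# Toy usage: feed 10 000 packets of a single flow and inspect the estimate.
if __name__ == "__main__":
    counter = AdaptiveSampledCounter()
    for _ in range(10_000):
        counter.update()
    print(round(counter.estimate()))   # close to 10 000 on average
```

Rescaling each sampled level by the reciprocal of its sampling probability keeps the estimate roughly unbiased while bounding the per-flow memory footprint, which is the intuition behind the adaptive counting described in the abstract.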
Authors
卜霄菲
黄河
孙玉娥
王兆杰
吴晓灿
Xiaofei BU; He HUANG; Yu-E SUN; Zhaojie WANG; Xiaocan WU (College of Software, Shenyang Normal University, Shenyang 110034, China; School of Computer Science and Technology, Soochow University, Suzhou 215006, China; School of Rail Transportation, Soochow University, Suzhou 215131, China)
Source
《中国科学:信息科学》
CSCD
Peking University Core Journal Index (北大核心)
2024, No. 7, pp. 1677-1691 (15 pages)
Scientia Sinica (Informationis)
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 62332013, 62072322, 62202322, U20A20182).