Abstract: To meet the demand of very long baseline interferometry (VLBI) data acquisition and recording systems for a high-speed, large-capacity first-in first-out (FIFO) buffer, and to address the shortcomings of existing solutions, a high-speed, large-capacity FIFO buffer design based on priority scheduling is proposed. The design partitions synchronous dynamic random access memory (SDRAM) into a ring buffer chain and, following the proposed design guidelines, allocates task workloads and sets the priority criteria appropriately; through the management of task numbers, a standard high-speed, large-capacity asynchronous FIFO buffer is implemented. Performance test results show that the design combines the advantages of the time-division and multi-bank approaches, with small area overhead and strong real-time performance; the maximum sustained read/write speed reaches 680 MB/s, and the capacity utilization is close to 100%.
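The abstract does not include the implementation, which targets SDRAM hardware rather than software. As a rough illustration of the general idea only, the Python sketch below models a FIFO built on a ring of fixed-size segments, with a simple fill-level priority rule arbitrating between pending read and write tasks. All names (RingChainFIFO, SEGMENT_SIZE, the priority criterion itself) are hypothetical and not taken from the paper.

```python
from collections import deque

SEGMENT_SIZE = 4      # words per SDRAM segment (illustrative)
NUM_SEGMENTS = 8      # segments in the ring buffer chain
CAPACITY = SEGMENT_SIZE * NUM_SEGMENTS

class RingChainFIFO:
    """Toy model of an asynchronous FIFO built on a ring of SDRAM-like
    segments. Write and read requests arrive as tasks; a scheduler picks
    the next task by a simple fill-level priority criterion."""

    def __init__(self):
        self.buffer = [None] * CAPACITY
        self.head = 0              # next slot to read
        self.tail = 0              # next slot to write
        self.count = 0             # words currently stored
        self.tasks = deque()       # pending ("write", word) / ("read", None)

    def submit(self, kind, word=None):
        self.tasks.append((kind, word))

    def _priority(self, kind):
        # Fill level drives priority: a nearly full FIFO favours reads,
        # a nearly empty one favours writes (hypothetical criterion).
        fill = self.count / CAPACITY
        return fill if kind == "read" else 1.0 - fill

    def step(self):
        """Execute the highest-priority pending task, if any."""
        if not self.tasks:
            return None
        task = max(self.tasks, key=lambda t: self._priority(t[0]))
        self.tasks.remove(task)
        kind, word = task
        if kind == "write" and self.count < CAPACITY:
            seg = self.tail // SEGMENT_SIZE      # segment being written
            self.buffer[self.tail] = word
            self.tail = (self.tail + 1) % CAPACITY
            self.count += 1
            return ("wrote", word, "segment", seg)
        if kind == "read" and self.count > 0:
            seg = self.head // SEGMENT_SIZE      # segment being read
            word = self.buffer[self.head]
            self.head = (self.head + 1) % CAPACITY
            self.count -= 1
            return ("read", word, "segment", seg)
        return None   # task could not run (FIFO full or empty)

if __name__ == "__main__":
    fifo = RingChainFIFO()
    for i in range(10):
        fifo.submit("write", i)
    for _ in range(5):
        fifo.submit("read")
    while (result := fifo.step()) is not None:
        print(result)
```

In hardware, the same idea would be realized with read/write pointers walking the SDRAM ring and an arbiter selecting the next burst by priority; the software model only shows how segment-based pointers and a fill-level criterion preserve FIFO order.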
Abstract: To reduce network redundancy, in-network caching is considered in many future Internet architectures, such as Information Centric Networking. In an in-network caching system, the item sojourn time under the LRU (Least Recently Used) replacement policy is an important issue for two reasons: firstly, LRU is one of the most commonly used cache replacement policies; secondly, the item sojourn time is positively correlated with the hit probability, so this metric can be useful when designing the caching system. However, to the best of our knowledge, the sojourn time has not been studied theoretically so far. In this paper, we first model the LRU cache policy with a Markov chain. Then an approximate closed-form expression for the expected item sojourn time is derived through the theory of stochastic service systems, expressed as a function of the item request rates and the cache size. Finally, extensive simulation results show that the expression is a good approximation of the item sojourn time.
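The paper's closed-form expression is not reproduced here. As a hedged illustration of the quantity being approximated, the Python sketch below simulates an LRU cache under the independent reference model with Zipf-distributed requests (an assumption not stated in the abstract) and measures each item's sojourn time, i.e. the number of requests between its insertion into the cache and its eviction. All parameter names and values are illustrative.

```python
import random
from itertools import accumulate
from collections import OrderedDict

def zipf_weights(n, alpha=0.8):
    """Unnormalised Zipf popularity weights for items 0..n-1."""
    return [1.0 / (k ** alpha) for k in range(1, n + 1)]

def simulate_lru_sojourn(num_items=1000, cache_size=100,
                         num_requests=100_000, alpha=0.8, seed=1):
    """Simulate an LRU cache under the independent reference model and
    return the mean sojourn time (in requests) per item.

    Sojourn time of an item = number of requests between the moment it
    enters the cache and the moment it is evicted.
    """
    rng = random.Random(seed)
    cum = list(accumulate(zipf_weights(num_items, alpha)))
    cache = OrderedDict()          # key order = recency (LRU at the front)
    entered = {}                   # key -> time it entered the cache
    sojourns = {k: [] for k in range(num_items)}

    for t in range(num_requests):
        item = rng.choices(range(num_items), cum_weights=cum, k=1)[0]
        if item in cache:
            cache.move_to_end(item)                     # hit: refresh recency
        else:
            if len(cache) >= cache_size:
                victim, _ = cache.popitem(last=False)   # evict LRU item
                sojourns[victim].append(t - entered.pop(victim))
            cache[item] = True
            entered[item] = t

    return {k: sum(v) / len(v) for k, v in sojourns.items() if v}

if __name__ == "__main__":
    mean_sojourn = simulate_lru_sojourn()
    # More popular items (lower index) should stay in the cache longer,
    # consistent with sojourn time being positively correlated with hits.
    for k in (0, 10, 100, 500):
        if k in mean_sojourn:
            print(f"item {k}: mean sojourn ~ {mean_sojourn[k]:.1f} requests")
```

Such a simulation gives the empirical baseline against which an approximate expression, stated as a function of the item request rates and the cache size, could be checked.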