Abstract
Big data computing faces three challenges that are beyond the reach of traditional IT technologies: extremely large data volumes, high-throughput service requests, and heterogeneous data types (Volume, Velocity, and Variety). Benefiting from the practical deployments and continuous open-source code contributions of major Internet companies at home and abroad, the Apache Hadoop software stack, which stemmed from Google's GFS and MapReduce, has become a mature technology and the de facto standard for PB-scale data processing. This paper introduces two research efforts on storage and indexing in big data computing systems, RCFile and CCIndex, which effectively address the storage-space problem and the query-performance problem, respectively.
Source
《科研信息化技术与应用》
2012, No. 6, pp. 26-33 (8 pages)
E-science Technology & Application
Funding
National High-tech Research and Development Program of China (863 Program) (2011AA01A203)