Abstract
With the rapid growth of data volume, data analysis tools that run only on a single machine can no longer meet the demand. To address the problem of big data analysis, we designed and implemented Haflow, a component-based big data analysis service platform. Haflow defines its own business flow model and an extensible component interface, and the component interface supports the integration of heterogeneous analysis tools. The system accepts user-defined business flows, translates them into execution flow instances, and submits them to a distributed Hadoop cluster for execution. Haflow is an extensible, distributed, service-oriented big data analysis service platform that supports heterogeneous analysis tools. The platform serves two purposes. First, it encapsulates the work that is unrelated to the analysis business itself and supports a variety of heterogeneous components, which speeds up the development of analysis applications. Second, its back end uses the Hadoop distributed system to run multiple tasks concurrently, which improves the average execution speed of applications.
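The abstract describes two mechanisms: an extensible component interface that heterogeneous tools implement, and a translator that turns a user-defined business flow into an execution flow of jobs for the Hadoop cluster. A minimal sketch of that idea follows; all names (`AnalysisModule`, `FlowTranslator`, the example modules and paths) are hypothetical illustrations, not the paper's published API.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical extensible component interface: each heterogeneous tool
// wraps itself as a module that can emit a runnable job command.
interface AnalysisModule {
    String name();
    // Translate this module's step into a command the cluster can run.
    String toJobCommand(String input, String output);
}

// Example component wrapping a Hive query tool (illustrative only).
class HiveQueryModule implements AnalysisModule {
    public String name() { return "hive-query"; }
    public String toJobCommand(String input, String output) {
        return "hive -f query.hql --input " + input + " --output " + output;
    }
}

// Example component wrapping a plain MapReduce job (illustrative only).
class MapReduceModule implements AnalysisModule {
    public String name() { return "word-count"; }
    public String toJobCommand(String input, String output) {
        return "hadoop jar wordcount.jar " + input + " " + output;
    }
}

public class FlowTranslator {
    // Translate a business flow (an ordered list of modules) into an
    // execution flow: each step's output becomes the next step's input.
    static List<String> translate(List<AnalysisModule> flow, String src) {
        List<String> jobs = new ArrayList<>();
        String input = src;
        for (AnalysisModule m : flow) {
            String output = "/tmp/" + m.name() + ".out";
            jobs.add(m.toJobCommand(input, output));
            input = output;
        }
        return jobs;
    }

    public static void main(String[] args) {
        List<AnalysisModule> flow = new ArrayList<>();
        flow.add(new HiveQueryModule());
        flow.add(new MapReduceModule());
        for (String job : translate(flow, "/data/raw")) {
            System.out.println(job);
        }
    }
}
```

In the real platform such jobs would be submitted to Hadoop for concurrent execution rather than printed; the sketch only shows how one interface can hide heterogeneous tools behind a uniform flow model.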
Source
《计算机科学》 (Computer Science)
CSCD
Peking University Core Journal (北大核心)
2014, No. 9, pp. 75-79 (5 pages)
Funding
National Natural Science Foundation of China (61202065, 61170074)
National 863 Program of China (2012AA011204)
National Key Technology R&D Program of China (2012BAH05F02)
Keywords
Big data
Data analysis
Data mining
Component
Distributed
Service
Platform