Journal Articles
2 articles found
1. Example based painting generation (Cited by: 1)
Authors: GUO Yan-wen, YU Jin-hui, XU Xiao-dong, WANG Jin, PENG Qun-sheng. Journal of Zhejiang University-Science A (Applied Physics & Engineering) (SCIE, EI, CAS, CSCD), 2006, Issue 7, pp. 1152-1159 (8 pages)
We present an approach for generating paintings on photographic images, with the style encoded by example paintings; representative brushes extracted from the examples serve as the painting primitives. Our system first divides the given photographic image into several regions, on which it synthesizes a grounding layer from texture patches extracted from the example paintings. It then paints those regions with brushes stochastically chosen from the brush library, applying further perturbations to brush color and shape. Brush direction is determined by a direction field that is either constructed through a convenient interactive interface or synthesized from the examples. The approach offers flexible, intuitive user control over both the painting process and the style.
Keywords: non-photorealistic rendering (NPR), Van Gogh, painting, grounding, brush
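The painting pipeline the abstract describes can be illustrated with a minimal stroke-rendering sketch. This is not the paper's method: the brush library, grounding layer, and example-derived direction field are simplified here to gradient-oriented line strokes with stochastic color and direction perturbation, and all names and parameters (`paint_strokes`, `brush_len`, `color_jitter`) are invented for illustration.

```python
import numpy as np

def paint_strokes(image, num_strokes=500, brush_len=9, color_jitter=0.1, seed=0):
    """Render a crude stroke-based version of `image` (floats in [0, 1]).

    Each stroke samples a color at a random location, perturbs it, and
    stamps a short line oriented along the local gradient normal -- a
    stand-in for a direction field synthesized from example paintings.
    """
    rng = np.random.default_rng(seed)
    h, w, _ = image.shape
    canvas = image.copy()

    # Direction field: strokes follow edges, i.e. the gradient normal.
    gray = image.mean(axis=2)
    gy, gx = np.gradient(gray)
    angles = np.arctan2(gy, gx) + np.pi / 2

    for _ in range(num_strokes):
        y = int(rng.integers(0, h))
        x = int(rng.integers(0, w))
        # Color perturbation around the underlying pixel.
        color = np.clip(image[y, x] + rng.normal(0, color_jitter, 3), 0, 1)
        # Direction perturbation around the field orientation.
        theta = angles[y, x] + rng.normal(0, 0.1)
        dy, dx = np.sin(theta), np.cos(theta)
        for t in range(-(brush_len // 2), brush_len // 2 + 1):
            py, px = int(y + t * dy), int(x + t * dx)
            if 0 <= py < h and 0 <= px < w:
                canvas[py, px] = color
    return canvas
```

A real system would stamp textured brush bitmaps over a synthesized grounding layer rather than single-pixel lines, but the control flow (region color sample, perturbation, field-guided orientation) mirrors the abstract's description.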
2. ZenLDA: Large-Scale Topic Model Training on Distributed Data-Parallel Platform (Cited by: 1)
Authors: Bo Zhao, Hucheng Zhou, Guoqiang Li, Yihua Huang. Big Data Mining and Analytics, 2018, Issue 1, pp. 57-74 (18 pages)
Topic models such as Latent Dirichlet Allocation (LDA) have recently been widely used in large-scale web mining. Many large-scale LDA training systems have been developed, usually as customized top-to-bottom designs with sophisticated synchronization support. We propose an LDA training system named ZenLDA, which instead follows a generalized design for distributed data-parallel platforms. The novelty of ZenLDA lies in three aspects: (1) it converts the commonly used serial Collapsed Gibbs Sampling (CGS) inference algorithm into a Monte-Carlo Collapsed Bayesian (MCCB) estimation method, which is embarrassingly parallel; (2) it decomposes the LDA inference formula into parts that can be sampled more efficiently, reducing computational complexity; (3) it proposes a distributed LDA training framework that represents the corpus as a directed graph, with the parameters annotated as corresponding vertices, and implements ZenLDA and other well-known inference methods on Spark. Experimental results indicate that MCCB converges to accuracy similar to that of CGS while running much faster. On top of MCCB, the ZenLDA formula decomposition achieved the fastest speed among the inference methods compared. ZenLDA also showed good scalability when handling large-scale topic models on the data-parallel platform. Overall, ZenLDA achieves computing performance comparable to, and sometimes better than, state-of-the-art dedicated systems.
Keywords: Latent Dirichlet Allocation, collapsed Gibbs sampling, Monte-Carlo, graph computing, large-scale machine learning
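For readers unfamiliar with the serial baseline that ZenLDA parallelizes, the following is a minimal single-machine sketch of collapsed Gibbs sampling for LDA. It is not ZenLDA's MCCB estimator or its Spark graph framework; the function name and hyperparameter defaults are illustrative assumptions.

```python
import numpy as np

def gibbs_lda(docs, num_topics, vocab_size, iters=50, alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampling for LDA over `docs` (lists of word ids).

    Returns the document-topic and topic-word count matrices.
    """
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), num_topics))  # doc-topic counts
    nkw = np.zeros((num_topics, vocab_size))  # topic-word counts
    nk = np.zeros(num_topics)                 # topic totals

    # Random initial topic assignment for every token.
    z = [rng.integers(0, num_topics, size=len(d)) for d in docs]
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                # Remove the token's current assignment ("collapse" step).
                k = z[d][i]
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # Full conditional over topics, then resample.
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
                p /= p.sum()
                k = rng.choice(num_topics, p=p)
                z[d][i] = k
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return ndk, nkw
```

The inner loop is inherently serial because each draw depends on the counts updated by the previous one; this sequential dependency is exactly what the paper's embarrassingly parallel MCCB reformulation removes.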