Abstract
Word segmentation is an important application in Python, and many tools implement it, such as jieba, SnowNLP, THULAC, and NLPIR. A word cloud is designed and implemented on top of word segmentation: it highlights the focus of the whole text, reveals key concepts, and can be presented to readers in fun, efficient, and novel ways through different display forms. Taking Chinese word segmentation as an example, this paper describes in detail the design and optimization of a word cloud using the jieba and wordcloud libraries.
Author
XU Bolong (School of Information Engineering, Guangdong Engineering Polytechnic, Guangzhou, China, 510000)
Source
Journal of Fujian Computer, 2019, No. 6, pp. 25-28 (4 pages)