Journal Articles: 4 results found
1. WordNet-based lexical semantic classification for text corpus analysis
Authors: 龙军, 王鲁达, 李祖德, 张祖平, 杨柳. Journal of Central South University (SCIE, EI, CAS, CSCD), 2015, Issue 5, pp. 1833-1840 (8 pages).
Abstract: Many text classification approaches depend on statistical term measures to implement document representation. Such representations ignore the lexical semantic content of terms and the mutual information that can be distilled from it, leading to classification errors. This work proposes a document representation method, a WordNet-based lexical semantic VSM, to solve the problem. Using WordNet, the method constructs a data structure of semantic-element information to characterize lexical semantic content and adjusts EM modeling to disambiguate word stems. In the lexical-semantic space of the corpus, a lexical-semantic eigenvector for document representation is then built by calculating the weight of each synset and is applied to the widely recognized NWKNN algorithm. On the Reuters-21578 corpus and a version of it adjusted by lexical replacement, experimental results show that the lexical-semantic eigenvector outperforms the TF-IDF-based term-statistic eigenvector in F1 measure and in dimensionality. The formation of document representation eigenvectors gives the method broad prospects for classification applications in text corpus analysis.
Keywords: document representation, lexical semantic content, classification, eigenvector
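To make the representation idea concrete, the sketch below shows one way a document could be mapped to a vector of WordNet synset weights using NLTK. It is an illustrative assumption, not the paper's implementation: the weighting here is simple frequency counting, and a first-sense heuristic stands in for the paper's EM-based disambiguation of word stems.

```python
# A minimal sketch (not the authors' implementation) of representing a document
# as a vector of WordNet synset weights, assuming NLTK's WordNet interface.
from collections import Counter

from nltk.corpus import wordnet as wn
from nltk.tokenize import word_tokenize


def synset_vector(text: str) -> Counter:
    """Map each token to a WordNet synset and count synset occurrences."""
    weights = Counter()
    for token in word_tokenize(text.lower()):
        synsets = wn.synsets(token)
        if synsets:
            # Naive disambiguation: take the most frequent sense; the paper
            # instead adjusts EM modeling to disambiguate word stems.
            weights[synsets[0].name()] += 1
    return weights


doc = "Many text classifications depend on statistical term measures."
print(synset_vector(doc))
```

The resulting synset-weight vectors could then be fed to any vector-space classifier in place of TF-IDF term vectors.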
2. The Development of Lexical Semantic Autonomy of English Majors
Author: 崔艳嫣. Chinese Journal of Applied Linguistics, 2014, Issue 2, pp. 231-243, 265 (14 pages).
Abstract: This study investigates the development of lexical semantic autonomy through a semantic-relatedness test administered to 200 English majors, from Year One through Year Four, at a university in Shandong Province. Despite the obvious growth of their receptive vocabulary size, the subjects at all four learning stages had not achieved lexical semantic autonomy, confirming the view that L1 semantic involvement in L2 word processing is a long and constant state of L2 lexical development. Explicit vocabulary teaching complemented by data-driven learning can be adopted to trigger semantic restructuring, overcome semantic fossilization, and promote the development of lexical semantic autonomy in L2 vocabulary acquisition.
Keywords: lexical competence, lexical semantic autonomy, semantic relatedness
3. Measuring code maintainability with deep neural networks
Authors: Yamin HU, Hao JIANG, Zongyao HU. Frontiers of Computer Science (SCIE, EI, CSCD), 2023, Issue 6, pp. 61-75 (15 pages).
Abstract: The maintainability of source code is a key characteristic of software quality. Many approaches have been proposed to quantitatively measure code maintainability. Such approaches rely heavily on code metrics, e.g., the number of Lines of Code and McCabe's Cyclomatic Complexity. The employed code metrics are essentially statistics over code elements, e.g., the numbers of tokens, lines, references, and branch statements. However, natural language in source code, especially identifiers, is rarely exploited by such approaches. As a result, replacing meaningful identifiers with nonsense tokens would not significantly influence their outputs, although the replacement should significantly reduce code maintainability. To this end, this paper proposes a novel approach (called DeepM) that measures code maintainability by exploiting the lexical semantics of text in source code. DeepM leverages deep learning techniques (e.g., LSTM and an attention mechanism) to exploit these lexical semantics. Another key rationale of DeepM is that measuring code maintainability is complex and often far beyond the capabilities of statistics or simple heuristics. Consequently, DeepM uses deep learning to automatically select useful features from complex and lengthy inputs and to construct a complex mapping (rather than simple heuristics) from the input to the output (a code maintainability index). DeepM is evaluated on a manually assessed dataset. The evaluation results suggest that DeepM is accurate, generating the same rankings of code maintainability as experienced programmers on 87.5% of manually ranked pairs of Java classes.
Keywords: code maintainability, lexical semantics, deep learning, neural networks
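As a rough illustration of the kind of model the abstract describes, the PyTorch sketch below wires an LSTM encoder over code tokens to a simple attention-pooling layer and a scoring head. The class name, vocabulary size, dimensions, and scoring head are assumptions for illustration only; they are not DeepM's actual architecture or training setup.

```python
# A minimal PyTorch sketch, not DeepM itself: an LSTM over code tokens with
# attention pooling that produces a single maintainability score per snippet.
import torch
import torch.nn as nn


class MaintainabilityScorer(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)   # scores each token's hidden state
        self.head = nn.Linear(hidden_dim, 1)   # maps pooled vector to an index

    def forward(self, token_ids):                            # (batch, seq_len)
        h, _ = self.lstm(self.embed(token_ids))              # (batch, seq, hidden)
        weights = torch.softmax(self.attn(h), dim=1)         # attention over tokens
        pooled = (weights * h).sum(dim=1)                    # weighted-sum pooling
        return self.head(pooled).squeeze(-1)                 # maintainability score


model = MaintainabilityScorer()
dummy = torch.randint(0, 10_000, (2, 50))   # two fake token-id sequences
print(model(dummy).shape)                   # torch.Size([2])
```

Because the model reads the token sequence itself, replacing meaningful identifiers with nonsense tokens changes its input and hence its score, which is the behavior the paper argues metric-based approaches lack.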
4. Automatic Extraction of Contextual Co-occurrence Chain and Its Relationship with Textual Cohesion (cited by 1)
Author: 孙爱珍. Chinese Journal of Applied Linguistics, 2011, Issue 4, pp. 3-14, 127 (13 pages).
Abstract: Semantic lexical chains have been regarded as important to textual cohesion, although traditionally the classification of these chains has been limited to repetition, synonymy, hyponymy, and collocates. Previous work on automatic extraction of lexical chains has found that contextual synonyms cannot be recognized or extracted automatically. This study used a data-based technique to extract contextually co-occurring lexical chains through thematic lexical items. It found that these contextually co-occurring lexical chains can include both the semantic lexical chains and contextual synonyms. It also found that, when the collocates of the co-occurring lexical items are extracted, these collocates form secondary lexical chains that contribute to textual cohesion. The vertical lexical chains made of contextually co-occurring lexical items and the horizontal chains made of collocational lexical items work together in making the text a coherent whole.
Keywords: semantic lexical chain, contextual co-occurrence chain, automatic extraction, collocation chain
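The short sketch below illustrates one hypothetical way a contextual co-occurrence chain could be pulled from a text: starting from a thematic lexical item, it collects the words that repeatedly co-occur with it across sentences. The function name, sentence-level window, and frequency threshold are assumptions for illustration, not the study's actual data-based procedure.

```python
# A minimal sketch (an assumption, not the study's procedure) of extracting a
# contextual co-occurrence chain around a thematic lexical item.
import re
from collections import Counter


def cooccurrence_chain(sentences, thematic_item, min_count=2):
    """Return words that co-occur with the thematic item in at least min_count sentences."""
    counts = Counter()
    for sentence in sentences:
        tokens = set(re.findall(r"[a-z']+", sentence.lower()))
        if thematic_item in tokens:
            counts.update(tokens - {thematic_item})
    return [word for word, c in counts.most_common() if c >= min_count]


text = [
    "The lexical chain links cohesive items across the text.",
    "Each lexical item in the chain contributes to textual cohesion.",
    "Collocates of the lexical items form secondary chains.",
]
print(cooccurrence_chain(text, "lexical", min_count=2))
```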