Journal Articles
1,604 articles found
1. An E-Chunk-Based Machine Translation Model (cited by 10)
Authors: 李沐, 吕学强, 姚天顺. 《软件学报》 (Journal of Software), EI / CSCD / Peking University Core, 2002, No. 4, pp. 669-676.
Abstract: This paper proposes a multi-engine machine translation model based on E-Chunks. Built on head-driven parsing, the model computes E-Chunk matching costs from lexical-similarity features, constructs an optimal E-Chunk cover bottom-up, and then completes the translation process with E-Chunks as the basic translation units. Preliminary experimental results show that the method is effective for the automatic translation of domain-oriented text.
Keywords: E-chunk; machine translation; lexical similarity computation; information processing
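The bottom-up construction of an optimal chunk cover that the abstract describes can be illustrated with a small dynamic program. This is a hedged sketch, not the paper's algorithm: the chunk inventory, the matching costs, and the single-word fallback cost below are invented for the example (the paper derives costs from lexical-similarity features).

```python
# Illustrative sketch: find a minimum-cost "chunk cover" of a token
# sequence by dynamic programming over all chunk boundaries.
# The chunk inventory and costs are hypothetical.

def best_cover(tokens, chunk_cost, single_cost=2.0):
    """Return (min_cost, chunks) covering tokens left to right."""
    n = len(tokens)
    best = [(0.0, [])]  # best[i] = (cost, chunks) for tokens[:i]
    for i in range(1, n + 1):
        # Fallback: treat tokens[i-1] as a one-word chunk.
        cost, chunks = best[i - 1]
        cand = (cost + single_cost, chunks + [(tokens[i - 1],)])
        # Try every known chunk ending at position i.
        for j in range(i):
            piece = tuple(tokens[j:i])
            if piece in chunk_cost:
                c, ch = best[j]
                if c + chunk_cost[piece] < cand[0]:
                    cand = (c + chunk_cost[piece], ch + [piece])
        best.append(cand)
    return best[n]

chunk_cost = {                      # hypothetical matching costs
    ("machine", "translation"): 0.5,
    ("translation", "model"): 1.5,
    ("model",): 1.0,
}
cost, chunks = best_cover(["machine", "translation", "model"], chunk_cost)
# cheapest cover: ("machine", "translation") + ("model",), total cost 1.5
```

The recurrence considers, at each position, either a known multi-word chunk ending there or a one-word fallback, mirroring a bottom-up optimal-cover construction.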
2. State-Machine-Based Concurrent Parsing of HTTP Chunked Streams (cited by 2)
Authors: 李明哲, 陈君, 王劲林, 陈晓. 《计算机工程》 (Computer Engineering), CAS / CSCD / Peking University Core, 2015, No. 1, pp. 256-260.
Abstract: Some streaming-media servers must parse HTTP chunked-encoded data streams concurrently. A naive static parsing algorithm fits poorly into an efficient, flexible event-driven concurrency model, and it introduces long delays and repeated data copies, incurring high memory and CPU overhead. To address these problems, this paper proposes a parsing strategy based on a finite state machine. One receive operation and one parse operation form a task slice, which suits the event-driven model: each received packet is processed and released immediately, so the whole HTTP message need not be buffered and one memory-copy pass is saved. The finite state machine preserves the parsing state during processing, so parsing resumes correctly after a task slice exits, solving the problem of fields fragmented across packets under the event-driven model. Experimental results show that, compared with the static parsing algorithm, the strategy markedly reduces both parsing time and memory usage.
Keywords: streaming media; HTTP chunked encoding; concurrent parsing; event-driven model; finite state machine; memory copy
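The incremental, state-preserving parsing the abstract describes can be sketched as a minimal chunked-transfer decoder. This is an illustrative sketch, not the paper's implementation: state survives across `feed()` calls, so a chunk header or body split across packets is handled without buffering the whole message, and trailers and strict error handling are omitted.

```python
# Minimal incremental HTTP chunked-transfer decoder (illustrative sketch).
# Parsing state is kept between feed() calls, so fields split across
# received packets resume correctly, as in the paper's task-slice model.

class ChunkedDecoder:
    SIZE, EXT, SIZE_CR, DATA, DATA_CR, DATA_LF = range(6)

    def __init__(self):
        self.state = self.SIZE
        self.size = 0           # bytes remaining in the current chunk
        self.out = bytearray()  # decoded payload so far
        self.done = False

    def feed(self, data: bytes) -> bytes:
        """Consume one received packet; return newly decoded bytes."""
        produced = bytearray()
        i, n = 0, len(data)
        while i < n and not self.done:
            b = data[i]
            if self.state == self.SIZE:
                c = chr(b)
                if c in "0123456789abcdefABCDEF":
                    self.size = self.size * 16 + int(c, 16)
                elif c == ";":
                    self.state = self.EXT       # skip chunk extensions
                elif c == "\r":
                    self.state = self.SIZE_CR
                i += 1
            elif self.state == self.EXT:
                if b == 0x0D:
                    self.state = self.SIZE_CR
                i += 1
            elif self.state == self.SIZE_CR:
                i += 1                          # consume '\n'
                if self.size == 0:
                    self.done = True            # last-chunk marker "0\r\n"
                else:
                    self.state = self.DATA
            elif self.state == self.DATA:
                take = min(self.size, n - i)    # copy what this packet holds
                produced += data[i:i + take]
                self.size -= take
                i += take
                if self.size == 0:
                    self.state = self.DATA_CR
            elif self.state == self.DATA_CR:
                i += 1                          # consume '\r' after data
                self.state = self.DATA_LF
            elif self.state == self.DATA_LF:
                i += 1                          # consume '\n', next chunk
                self.state = self.SIZE
        self.out += produced
        return bytes(produced)

dec = ChunkedDecoder()
part1 = dec.feed(b"4\r\nWi")                     # chunk split mid-body
part2 = dec.feed(b"ki\r\n5\r\nchunk\r\n0\r\n\r\n")
```

Feeding the stream in two packets decodes `b"Wi"` then `b"kichunk"`: each packet is consumed as it arrives and can be released immediately, which is the point of the state-machine approach.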
3. A Link-Grammar-Based Method for Acquiring Bilingual E-Chunks (cited by 3)
Authors: 吕学强, 陈文亮, 姚天顺. 《东北大学学报(自然科学版)》 (Journal of Northeastern University, Natural Science), EI / CAS / CSCD / Peking University Core, 2002, No. 9, pp. 829-832.
Abstract: This paper proposes an extended chunk concept for machine translation. The E-Chunk extends the chunk concept on the basis of semantic uniqueness: it is an unambiguous, machine-translatable unit that is non-ambiguous, recurrent, nestable, and syntactically self-contained. The paper discusses techniques for recognizing English E-Chunks using the connectors of link grammar, and a method for acquiring bilingual E-Chunks. A bilingual E-Chunk base will provide strong support for chunk-based machine translation.
Keywords: link grammar; E-chunk; acquisition method; natural language processing; connectors; bilingual alignment; word sense disambiguation; machine translation
4. An E-Chunk-Based Question Instance Analysis System (cited by 2)
Authors: 骆正华, 樊孝忠, 刘林, 龚永罡. 《北京理工大学学报》 (Transactions of Beijing Institute of Technology), EI / CAS / CSCD / Peking University Core, 2005, No. 1, pp. 63-66.
Abstract: This paper analyzes the structural characteristics of Chinese questions, examines the problems question processing must solve, and proposes a new method that analyzes questions against semantic-chunk instances. A question instance is represented as a vector of semantic chunks; an input question is compared with the instances in a question-instance base, and question similarity is computed on top of the structural similarity of the semantic-chunk vectors. Test results show the method is feasible, reaching a precision of 82.05% and a recall of 91.95%; it offers a useful reference for the design of question-analysis systems and merits further study.
Keywords: semantic chunk; question instance; semantic similarity; HowNet
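The two-stage comparison the abstract outlines (filter by chunk-vector structure, then score chunk-by-chunk similarity) might look like the following hedged sketch. The chunk representation, the types, and the overlap scorer are all hypothetical; the paper uses HowNet-based semantic similarity, for which a crude character-overlap measure stands in here.

```python
# Hypothetical sketch: a question is a vector of (chunk_type, chunk_text)
# pairs. Two questions are compared only when their chunk-type structures
# match; similarity is then the mean per-chunk overlap.

def structure(q):
    return [t for t, _ in q]

def chunk_sim(a, b):
    """Crude character overlap (placeholder for a HowNet-based
    semantic similarity between chunks)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def question_sim(q1, q2):
    if structure(q1) != structure(q2):   # structural filter first
        return 0.0
    sims = [chunk_sim(c1, c2) for (_, c1), (_, c2) in zip(q1, q2)]
    return sum(sims) / len(sims)

q1 = [("WH", "哪里"), ("TOPIC", "北京大学"), ("PRED", "在")]
q2 = [("WH", "哪里"), ("TOPIC", "清华大学"), ("PRED", "在")]
score = question_sim(q1, q2)   # structurally identical, partial overlap
```

A structurally different question scores 0 outright, which mirrors the abstract's "structure first, then similarity" ordering.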
5. Chunk approach for English teaching (cited by 5)
Authors: 张立英, 徐勇. Sino-US English Teaching, 2007, No. 11, pp. 14-18.
Abstract: This paper suggests a chunk approach to solve the plateau problem among advanced English learners. It first discusses the existing problems and then provides a definition of the chunk approach. Drawing on research results in cognitive psychology, it analyses the important role chunks play in language acquisition and production, thus providing a cognitive foundation for implementing the chunk approach in English teaching. The paper also offers a set of classroom activities that other teachers can easily adopt or adapt.
Keywords: chunk; chunk approach; advanced English learners; classroom activities
6. Sentiment Question Answering Based on Chunk-CRF (cited by 1)
Authors: 唐琴, 宋锐, 林鸿飞. 《智能系统学报》 (CAAI Transactions on Intelligent Systems), 2008, No. 6, pp. 504-510.
Abstract: Compared with factoid question answering, opinion or sentiment question answering must consider sentiment-related issues such as opinion holders and sentiment polarity, and its answers are more complex and more scattered. A large number of sentiment questions were collected manually from Baidu Zhidao, and from their characteristics five major sentiment question types were identified statistically. Question classification differs from that of traditional factoid QA: questions cannot be classified by interrogative words alone, since the opinion and the audience's reaction must also be considered. Classification therefore combines a chunk-based CRF model with rules. For answer extraction, chunking results are combined with sentiment polarity, and different methods are applied to different question types. Experimental results demonstrate the effectiveness of the evaluation framework.
Keywords: factoid question answering; sentiment question answering; chunking; HowNet
7. On Applying the Chunking Strategy in Chinese-English Translation (cited by 1)
Authors: 吴越, 郦青. 《山东广播电视大学学报》 (Journal of Shandong Radio and TV University), 2017, No. 2, pp. 49-52.
Abstract: In translation practice, novice and experienced translators alike are familiar with theories such as "faithfulness, expressiveness and elegance", domestication, foreignization, Skopos theory, and the work of Nida, Newmark and Barkhudarov, and often follow them in their work. The chunking strategy, however, is little known and little used. The author argues that a translator should recognize translation as a matter of mediating between cultures: a translator must be able to organize and chunk material so as to match two different cultural patterns, which is one of the most important and most difficult parts of the translation process. Once novice translators form a chunking mindset, the technique is naturally internalized as an unconscious strategy; novice translators should therefore practice chunking skills deliberately.
Keywords: chunking strategy; novice translators; translation process; practical application
8. SECCL-based research on prefabricated chunks in achieving oral English fluency
Authors: 刘春阳, 杨雨时. Sino-US English Teaching, 2010, No. 11, pp. 16-21.
Abstract: Fluency in oral English has always been a goal of Chinese English learners, and language corpora offer great convenience to language research. Prefabricated chunks are a great help for learners in achieving oral English fluency. With the help of computer software, the chunks in SECCL are categorized. The conclusion is that in acquiring chunks, the emphasis should be on content-related chunks, especially those tied to specific topics; one effective way to gain topic-related chunks is to build a topic-related English corpus of native speakers.
Keywords: oral English fluency; prefabricated chunks; English corpus; content-related chunks; specific topic-related chunks
9. A Cognitive Study of Metaphor Chunks in News English
Authors: 栗蔷薇, 吕丽静. 《魅力中国》 (Charming China), 2011, No. 2, pp. 240, 251.
Abstract: This paper aims to demonstrate the pervasiveness of metaphor chunks in news English and to introduce effective ways of understanding them correctly from the perspective of cognitive linguistics. Considering the difficulty of making out the accurate meaning of metaphor chunks in news English, some translation strategies are also proposed, in the hope of benefiting readers in their understanding and appreciation of news English.
Keywords: metaphor; chunk; news English
10. Study on EFL and Prefabricated Chunks
Authors: 杨春宇. 《海外英语》 (Overseas English), 2021, No. 5, pp. 272-273.
Abstract: Language is the most important tool human beings have for engaging with the outside world. To communicate efficiently, people need to maximize the efficiency of language processing to ensure smooth production and understanding of meaning, subtle and complex as that process is. As Widdowson proposed in the 1980s, language knowledge is largely chunk knowledge: language output is a process of retrieving prefabricated chunks and transferring them into output. Based on previously collected data, this paper studies the explicit reproduction and implicit output of English from the perspective of prefabricated chunks, aiming to guide EFL learners in optimizing their output ability.
Keywords: prefabricated chunks; features of chunk use; EFL learners; input; output
11. Multimodal Design of Lexical Chunks Teaching
Authors: 冯俊英. 《海外英语》 (Overseas English), 2018, No. 23, pp. 255-256.
Abstract: Based on the concepts of lexical chunks and multimodal teaching, this paper focuses on the input sources of English vocabulary learning, integrating the advantages of the lexical approach with multimodal teaching. As a new exploration in vocabulary teaching, classroom practice has shown that teachers should make full and reasonable use of various teaching means and resources to achieve multimodal teaching of lexical chunks, which helps students learn vocabulary quickly and effectively and improves their English language competence and performance.
Keywords: lexical chunks; multimodal teaching; teaching design
12. Probability Theory Predicts That Chunking into Groups of Three or Four Items Increases the Short-Term Memory Capacity (cited by 1)
Authors: Motohisa Osaka. Applied Mathematics, 2014, No. 10, pp. 1474-1484.
Abstract: Short-term memory allows individuals to recall stimuli, such as numbers or words, for several seconds to several minutes without rehearsal. Although the capacity of short-term memory is considered to be 7 ± 2 items, it can be increased through a process called chunking. For example, in Japan, 11-digit cellular phone numbers and 10-digit toll-free numbers are chunked into three groups of three or four digits: 090-XXXX-XXXX and 0120-XXX-XXX, respectively. We use probability theory to predict that the most effective chunking involves groups of three or four items, as in phone numbers. A 16-digit credit card number, however, exceeds the capacity of short-term memory even when chunked into groups of four digits, as in XXXX-XXXX-XXXX-XXXX. Based on these data, 16-digit credit card numbers should be sufficient for security purposes.
Keywords: short-term memory; chunking; probabilistic model; credit card number
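The grouping arithmetic behind the abstract's examples can be reproduced with a one-line helper. This is illustrative only: real Japanese phone numbers group as 3-4-4 rather than uniformly, and the helper simply cuts a digit string into groups of at most `k` digits.

```python
# Split a digit string into groups of at most k digits, as in the
# phone-number chunking the abstract describes (uniform grouping only;
# actual numbers use a 3-4-4 pattern).
def chunk_digits(digits, k):
    return [digits[i:i + k] for i in range(0, len(digits), k)]

phone = chunk_digits("09012345678", 4)      # 11 digits -> 3 groups
card = chunk_digits("1234567890123456", 4)  # 16 digits -> 4 groups of 4
```

An 11-digit number collapses to 3 groups, comfortably inside the 7 ± 2 span, while a 16-digit card still occupies 4 four-digit groups, which the paper argues remains hard to hold in short-term memory.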
13. Chunk Parsing and Entity Relation Extracting to Chinese Text by Using Conditional Random Fields Model (cited by 2)
Authors: Junhua Wu, Longxia Liu. Journal of Intelligent Learning Systems and Applications, 2010, No. 3, pp. 139-146.
Abstract: Large amounts of information exist on Web sites and in various digital media, most of it in natural language: easy to browse, but difficult for computers to understand. Chunk parsing and entity relation extraction are important for understanding the semantics of such information in natural language processing. Chunk analysis is a shallow parsing method, and entity relation extraction establishes relationships between entities. Because full syntactic parsing of Chinese text is complex, many researchers are more interested in chunk analysis and relation extraction. The conditional random field (CRF) model is a well-founded probabilistic model for segmenting and labeling sequence data. This paper models the chunking and entity-relation problems in Chinese text; by transforming them into labeling problems, CRFs can be used to realize both chunk analysis and entity relation extraction.
Keywords: information extraction; chunk parsing; entity relation extraction
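The reduction the abstract mentions (transforming chunking into a labeling problem so a sequence model such as a CRF can learn it) is conventionally done with BIO tags. A minimal sketch of that conversion, not the authors' code, with invented example spans:

```python
# Convert chunk spans into per-token BIO labels: the standard reduction
# that turns chunking into the sequence-labeling problem a CRF solves.
def spans_to_bio(n_tokens, spans):
    """spans: list of (start, end, chunk_type), end exclusive."""
    labels = ["O"] * n_tokens
    for start, end, typ in spans:
        labels[start] = "B-" + typ            # chunk-initial token
        for i in range(start + 1, end):
            labels[i] = "I-" + typ            # chunk-internal tokens
    return labels

tokens = ["我", "在", "北京", "读", "书"]
labels = spans_to_bio(len(tokens), [(0, 1, "NP"), (2, 3, "NP"), (3, 5, "VP")])
# -> ["B-NP", "O", "B-NP", "B-VP", "I-VP"]
```

Once every token carries a B-, I-, or O label, training and decoding proceed exactly as for any other sequence-labeling task; recovering chunks from predicted labels is the inverse walk over B/I runs.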
14. A Maximum Entropy Chunking Model with N-Fold Template Correction (cited by 1)
Authors: Sun Guanglu, Guan Yi, Wang Xiaolong. Journal of Electronics (China), 2007, No. 5, pp. 690-695.
Abstract: This letter presents a new chunking method based on a Maximum Entropy (ME) model with an N-fold template correction model. Two types of machine learning models are first described; based on their analysis, a chunking model is proposed that combines the benefits of a conditional probability model and a rule-based model. The selection of features and rule templates in the chunking model is discussed. Experimental results on the CoNLL-2000 corpus show that the approach achieves an impressive F-score of 92.93%; compared with the ME model and the ME Markov model, the new chunking model performs better.
Keywords: chunking; Maximum Entropy (ME) model; template correction; cross-validation
15. Distributed Chunk-Based Optimization for MultiCarrier Ultra-Dense Networks (cited by 2)
Authors: GUO Shaozhen, XING Chengwen, FEI Zesong, ZHOU Gui, YAN Xinge. China Communications, SCIE / CSCD, 2016, No. 1, pp. 80-90.
Abstract: This paper proposes a distributed chunk-based optimization algorithm for resource allocation in broadband ultra-dense small-cell networks, jointly optimizing power and subcarrier allocation. To make the resource allocation suitable for large-scale networks, the optimization problem is first decomposed using an effective decomposition algorithm, optimal condition decomposition (OCD). To reduce implementation complexity, the subcarriers are divided into chunks and allocated chunk by chunk. Simulation results show that the proposed algorithm outperforms a uniform power allocation scheme and the Lagrange relaxation method, striking a balance between complexity and performance in multi-carrier ultra-dense networks.
Keywords: ultra-dense small-cell networks; optimization; chunk; power allocation; subcarrier allocation; distributed resource allocation
16. Lexical Chunks as Scaffolding in College English Teaching
Authors: 蒋素兰. 《海外英语》 (Overseas English), 2012, No. 15, pp. 22-23.
Abstract: Lexical chunks minimize language learners' memorization burden and play an important role in saving language-processing effort, improving learners' language fluency, appropriacy and idiomaticity. Lexical chunks are taken as "scaffolding" in college English teaching to effectively enhance learners' language proficiency.
Keywords: lexical chunks; scaffolding; chunk competence; writing
17. Application of Lexical Chunks in Teaching College English Writing
Authors: 王雪竹. 《海外英语》 (Overseas English), 2015, No. 21, pp. 279-280.
Abstract: Lexical chunks, among the most important units in language use with features of both grammar and vocabulary, and the smallest units of language input, memory, storage and output, have broad application prospects in teaching college English writing. A lexical chunk is semi-refined language that can be stored in memory as a whole and retrieved directly in use. Lexical chunks can effectively improve the processing of language resources, relieve learners' pressure in timed writing, counteract negative transfer from the native language, and improve students' writing fluency, accuracy and vitality.
Keywords: lexical chunks; college English; writing teaching
18. An Analysis on Lexical Chunk and the Teaching Method
Authors: 崔淑慧. 《海外英语》 (Overseas English), 2015, No. 2, pp. 93-94.
Abstract: As a new way to learn English, the lexical chunk is very significant. This thesis discusses problems Chinese English learners meet in learning lexical chunks, including overgeneralization, incorrect analogies, Chinese expressions carried into translation, and improper learning strategies, and introduces some new ways to learn lexical chunks in order to strengthen students' sense of the lexical chunk.
Keywords: lexical chunk
19. MT-Oriented English PoS Tagging and Its Application to Noun Phrase Chunking
Authors: Ma Jianjun, Huang Degen, Liu Haixia, Sheng Wenfeng. China Communications, SCIE / CSCD, 2012, No. 3, pp. 58-67.
Abstract: A hybrid approach to English part-of-speech (PoS) tagging is presented, targeting English-Chinese machine translation in the business domain and demonstrating how an existing tagger can be adapted to learn from a small amount of data and handle unknown words for machine translation. A small 998k-word English annotated corpus in the business domain is built semi-automatically based on a new tagset; the maximum entropy model is adopted, and a rule-based approach is used in post-processing. The tagger is further applied to noun phrase (NP) chunking. Experiments show that the tagger achieves a quite satisfactory accuracy of 98.14%; in the application to NP chunking, it yields a 2.21% increase in F-score over results using the Stanford tagger.
Keywords: English PoS tagging; maximum entropy; rule-based approach; machine translation; NP chunking
20. A Detailed Chunk-Level Performance Study of Web Page Retrieve Latency
Authors: 谢海光, 李翔, 李建华. Journal of Shanghai Jiaotong University (Science), EI, 2005, No. 4, pp. 354-363.
Abstract: Where web latency comes from is a widely discussed question. This paper proposes a novel chunk-level latency dependence model to give a better illustration of web latency. Based on the facts that web content is delivered as a sequence of chunks and that clients care most about whole-page retrieval latency, the paper studies in detail how chunk sequencing and inter-chunk relations affect web retrieval latency. A series of thorough experiments is conducted and the data analyzed; the results are useful for further study of how to reduce web latency.
Keywords: content delivery; retrieval latency; data dependency; chunk