Abstract: This paper suggests a chunk approach to solving the plateau problem among advanced English learners. It first discusses the extant problems and then provides a definition of the chunk approach. Drawing on research results in cognitive psychology, it analyses the important role chunks play in language acquisition and production, thus providing a cognitive foundation for implementing the chunk approach in English teaching. The paper also offers a set of classroom activities that can be easily adopted or adapted by other teachers.
Abstract: Fluency in oral English has always been a goal of Chinese English learners. Language corpora offer great convenience to language research, and prefabricated chunks are a great help to learners in achieving oral English fluency. With the help of computer software, the chunks in SECCL are categorized. The conclusion is that in the process of chunk acquisition, emphasis should be placed on content-related chunks, especially those related to specific topics. One effective way to acquire topic-related chunks is to build a topic-related English corpus of native speakers.
Abstract: This paper aims to demonstrate the pervasiveness of metaphor chunks in News English and to introduce, from the perspective of cognitive linguistics, effective ways of understanding them correctly. Considering the difficulty of making out the accurate meaning of metaphor chunks in News English, some translation strategies are also proposed in the hope that they will benefit readers in their understanding and appreciation of News English.
Abstract: Language is the most important tool for human beings to communicate with the outside world. To improve the efficiency of communication, people need to maximize the efficiency of language processing to ensure the smooth production and understanding of meaning, although this is a subtle and complex process in human communication. As Widdowson proposed in the 1980s, language knowledge is largely chunk knowledge: the process of language output is the process of retrieving prefabricated chunk knowledge and transferring it into output. Based on previously collected data, this paper studies the explicit reproduction and implicit output of English from the perspective of prefabricated chunks, in order to play a guiding role in optimizing the output ability of EFL learners.
Abstract: Based on the concepts of lexical chunks and multimodal teaching, this paper focuses on the input source of English vocabulary learning, integrating the advantages of the Lexical Approach with Multimodal Teaching. As a new exploration of English vocabulary teaching, classroom practice has shown that teachers should make full and reasonable use of various teaching means and resources to achieve multimodal teaching of lexical chunks, which helps students learn vocabulary quickly and effectively and improves their English language competence and performance.
Abstract: Short-term memory allows individuals to recall stimuli, such as numbers or words, for several seconds to several minutes without rehearsal. Although the capacity of short-term memory is considered to be 7 ± 2 items, it can be increased through a process called chunking. For example, in Japan, 11-digit cellular phone numbers and 10-digit toll-free numbers are chunked into three groups of three or four digits: 090-XXXX-XXXX and 0120-XXX-XXX, respectively. We use probability theory to predict that the most effective chunking involves groups of three or four items, as in phone numbers. However, a 16-digit credit card number exceeds the capacity of short-term memory even when chunked into groups of four digits, such as XXXX-XXXX-XXXX-XXXX. Based on these data, 16-digit credit card numbers should be sufficient for security purposes.
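The grouping scheme described in this abstract can be sketched as a small chunking function; the 3-4-4 and 4-3-3 patterns come from the abstract, while the function name and sample digit strings are hypothetical illustrations:

```python
def chunk_digits(digits, group_sizes):
    """Split a digit string into hyphen-separated chunks of the given sizes."""
    chunks, pos = [], 0
    for size in group_sizes:
        chunks.append(digits[pos:pos + size])
        pos += size
    return "-".join(chunks)

# 11-digit Japanese cellular number, chunked 3-4-4
print(chunk_digits("09012345678", [3, 4, 4]))  # 090-1234-5678
# 10-digit toll-free number, chunked 4-3-3
print(chunk_digits("0120123456", [4, 3, 3]))   # 0120-123-456
# 16-digit card number, chunked 4-4-4-4 — still four chunks of four digits
print(chunk_digits("1234567812345678", [4, 4, 4, 4]))
```

Each chunk of three or four digits counts as a single item in short-term memory, which is why the grouped forms are easier to retain than the raw digit strings.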
Abstract: Currently, large amounts of information exist on Web sites and in various digital media. Most of it is in natural language: easy to browse, but difficult for computers to understand. Chunk parsing and entity relation extraction are important tasks for understanding the semantics of such information in natural language processing. Chunk analysis is a shallow parsing method, and entity relation extraction establishes relationships between entities. Because full syntactic parsing of Chinese text is complex, many researchers are more interested in chunk analysis and relation extraction. The conditional random field (CRF) model is a valid probabilistic model for segmenting and labeling sequence data. This paper models the chunking and entity relation problems in Chinese text; by transforming them into labeling problems, CRFs can be used to realize chunk analysis and entity relation extraction.
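To illustrate the labeling formulation (a minimal sketch, not the paper's own code): once a CRF has assigned BIO-style labels to a token sequence, the chunks are recovered by a simple decoding pass. The example tokens and labels below are hypothetical:

```python
def bio_to_chunks(tokens, labels):
    """Decode a BIO-labeled sequence into (chunk_type, token_list) pairs."""
    chunks, current, ctype = [], [], None
    for tok, lab in zip(tokens, labels):
        if lab.startswith("B-"):
            if current:                      # close any open chunk first
                chunks.append((ctype, current))
            current, ctype = [tok], lab[2:]
        elif lab.startswith("I-") and current and lab[2:] == ctype:
            current.append(tok)              # continue the open chunk
        else:                                # "O" or inconsistent I- tag
            if current:
                chunks.append((ctype, current))
            current, ctype = [], None
    if current:
        chunks.append((ctype, current))
    return chunks

tokens = ["He", "reckons", "the", "current", "account", "deficit"]
labels = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP"]
print(bio_to_chunks(tokens, labels))
# [('NP', ['He']), ('VP', ['reckons']), ('NP', ['the', 'current', 'account', 'deficit'])]
```

The CRF itself only has to predict one label per token; this decoding step is what turns a label sequence into chunks or entity mentions.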
Funding: Supported by the National Natural Science Foundation of China (No. 60504021).
Abstract: This letter presents a new chunking method based on a Maximum Entropy (ME) model with an N-fold template correction model. First, two types of machine learning models are described. Based on the analysis of the two models, a chunking model is then proposed that combines the advantages of the conditional probability model and the rule-based model. The selection of features and rule templates in the chunking model is discussed. Experimental results on the CoNLL-2000 corpus show that this approach achieves impressive accuracy in terms of F-score: 92.93%. Compared with the ME model and the ME Markov model, the new chunking model achieves better performance.
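The F-score quoted above is the standard harmonic mean of chunk-level precision and recall; a minimal sketch of the computation (the counts below are illustrative, not the paper's data):

```python
def f_score(num_correct, num_predicted, num_gold, beta=1.0):
    """Chunk-level F-measure from counts of correct, predicted, and gold chunks."""
    precision = num_correct / num_predicted
    recall = num_correct / num_gold
    b2 = beta * beta
    return (1 + b2) * precision * recall / (b2 * precision + recall)

# Illustrative counts only: 930 correct out of 1000 predicted and 1002 gold chunks
print(round(f_score(930, 1000, 1002), 4))
```

With beta = 1 this is the F1 measure used in the CoNLL-2000 chunking evaluation, where a predicted chunk counts as correct only if both its span and its type match the gold chunk exactly.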
Funding: Supported in part by the Beijing Natural Science Foundation (4152047), the 863 Project (No. 2014AA01A701), the 111 Project of China under Grant B14010, and the China Mobile Research Institute under grant [2014]451.
Abstract: In this paper, a distributed chunk-based optimization algorithm is proposed for resource allocation in broadband ultra-dense small cell networks. Based on the proposed algorithm, the power and subcarrier allocation problems are jointly optimized. To make the resource allocation suitable for large-scale networks, the optimization problem is first decomposed using an effective decomposition algorithm named optimal condition decomposition (OCD). Furthermore, to reduce implementation complexity, the subcarriers are divided into chunks and allocated chunk by chunk. Simulation results show that the proposed algorithm outperforms the uniform power allocation scheme and the Lagrange relaxation method, and that it strikes a balance between complexity and performance in multi-carrier ultra-dense networks.
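The chunk-by-chunk idea can be illustrated with a toy greedy allocator — a hypothetical simplification, not the paper's OCD algorithm: subcarriers are grouped into fixed-size chunks, and each chunk is assigned as a whole to the user with the best average channel gain on it, so the allocator makes one decision per chunk instead of one per subcarrier:

```python
def allocate_chunks(gains, chunk_size):
    """gains[u][s] is the channel gain of user u on subcarrier s.
    Returns, for each chunk, the index of the user it is assigned to."""
    num_sub = len(gains[0])
    assignment = []
    for start in range(0, num_sub, chunk_size):
        width = min(chunk_size, num_sub - start)
        # Average gain of each user over this chunk's subcarriers
        avg = [sum(user[start:start + width]) / width for user in gains]
        assignment.append(avg.index(max(avg)))
    return assignment

# Two users, eight subcarriers, chunks of four
gains = [
    [0.9, 0.8, 0.7, 0.9, 0.1, 0.2, 0.1, 0.3],  # user 0: strong on low subcarriers
    [0.2, 0.3, 0.1, 0.2, 0.8, 0.9, 0.7, 0.8],  # user 1: strong on high subcarriers
]
print(allocate_chunks(gains, 4))  # [0, 1]
```

Grouping subcarriers into chunks shrinks the decision space roughly by the chunk size, which is the complexity-versus-performance trade-off the abstract refers to.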
Abstract: Lexical chunks minimize language learners' memorization burden and play an important role in saving language processing effort, thereby improving learners' fluency, appropriacy, and idiomaticity. Lexical chunks are taken as "scaffolding" in college English teaching to effectively enhance learners' language proficiency.
Abstract: Lexical chunks, as units combining features of both grammar and vocabulary in language use, and as the smallest units of language input, memory, storage, and output, have broad application prospects in teaching college English writing. A lexical chunk is semi-refined language that can be stored in memory as a whole and extracted directly in use. Lexical chunks can effectively improve the processing efficiency of language resource information, alleviate learners' timed-writing pressure, overcome the negative transfer of the native language, and improve students' writing fluency, accuracy, and vitality.
Abstract: As a new way to learn English, lexical chunks are very significant. This thesis discusses the problems Chinese English learners meet in learning lexical chunks, including overgeneralization, incorrect analogies, Chinese expressions carried over into translation, and improper learning strategies. It then introduces some new ways to learn lexical chunks in order to strengthen students' sense of the lexical chunk.
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61173100 and the Fundamental Research Funds for the Central Universities under Grant No. GDUT10RW202.
Abstract: A hybrid approach to English part-of-speech (PoS) tagging, targeted at English-Chinese machine translation in the business domain, is presented, demonstrating how an existing tagger can be adapted to learn from a small amount of data and handle unknown words for machine translation. A small 998k-token English annotated corpus in the business domain is built semi-automatically based on a new tagset; the maximum entropy model is adopted, and a rule-based approach is used in post-processing. The tagger is further applied to noun phrase (NP) chunking. Experiments show that the tagger achieves an accuracy of 98.14%, a quite satisfactory result. Applied to NP chunking, the tagger yields a 2.21% increase in F-score compared with the results using the Stanford tagger.
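To show how PoS tags feed into NP chunking (a minimal pattern-based sketch, not the paper's method — the tag-to-symbol mapping and the example sentence are assumptions): a simple NP can be matched as an optional determiner, any number of adjectives, then one or more nouns.

```python
import re

def np_chunks(tagged):
    """Pattern-based NP chunker over (word, PoS) pairs using Penn Treebank tags:
    optional determiner (DT), any adjectives (JJ), one or more nouns (NN/NNS)."""
    # Map each tag to a one-character symbol so a regex can scan the sequence
    tags = "".join({"DT": "D", "JJ": "J", "NN": "N", "NNS": "N"}.get(t, "O")
                   for _, t in tagged)
    return [" ".join(w for w, _ in tagged[m.start():m.end()])
            for m in re.finditer(r"D?J*N+", tags)]

tagged = [("the", "DT"), ("annual", "JJ"), ("report", "NN"),
          ("shows", "VBZ"), ("strong", "JJ"), ("profits", "NNS")]
print(np_chunks(tagged))  # ['the annual report', 'strong profits']
```

Because the chunker consumes nothing but the tag sequence, any improvement in tagging accuracy propagates directly into chunking quality, which is consistent with the F-score gain the abstract reports.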
Abstract: Where web latency comes from is a widely discussed question. In this paper, we propose a novel chunk-level latency dependence model to better illustrate web latency. Based on the facts that web content is delivered as a sequence of chunks and that clients care more about whole-page retrieval latency, this paper carries out a detailed study of how chunk sequences and their relations affect web retrieval latency. A series of thorough experiments is conducted and the resulting data analyzed. The results are useful for further study of how to reduce web latency.