Funding: Supported by the National Basic Research Program (No. 2014CB340503) and the National Natural Science Foundation of China (No. 61472105, 61502120).
Abstract: Combinatory categorial grammar (CCG) supertagging is an important subtask that takes place before full parsing and can benefit many natural language processing (NLP) tasks, such as question answering and machine translation. CCG supertagging can be regarded as a sequence labeling problem in which each word is assigned a CCG lexical category; it remains challenging because the number of CCG supertags that may be associated with each word is large. To address this, recurrent neural networks (RNNs), as powerful sequential models, have recently been applied to CCG supertagging and have achieved good performance. In this paper, a variant of recurrent networks is proposed, based on gated recurrent units (GRUs), which have recently been introduced for some, but not yet all, such tasks; its design makes it much easier to train and to memorize information over long-range dependencies. Experimental results on the CCGBank dataset demonstrate the effectiveness of the proposed method and show that the model achieves accuracy comparable to that of previously proposed models for CCG supertagging.
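For illustration, a minimal sketch of the kind of bidirectional GRU sequence labeler the abstract describes is given below, written in PyTorch. All names, layer sizes, and the supertag inventory size are illustrative assumptions, not the authors' implementation.

    # Sketch of a bidirectional GRU supertagger (illustrative only;
    # hyperparameters and names are assumptions, not the paper's code).
    import torch
    import torch.nn as nn

    class GRUSupertagger(nn.Module):
        def __init__(self, vocab_size, num_supertags,
                     emb_dim=128, hidden_dim=256):
            super().__init__()
            # Map each word index to a dense embedding.
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # A bidirectional GRU reads each sentence left-to-right and
            # right-to-left, giving every position access to long-range
            # context on both sides.
            self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
            # Project each position onto the (large) supertag inventory.
            self.out = nn.Linear(2 * hidden_dim, num_supertags)

        def forward(self, word_ids):
            # word_ids: (batch, seq_len) integer word indices
            h, _ = self.gru(self.embed(word_ids))
            return self.out(h)  # (batch, seq_len, num_supertags)

    # Usage: predict one supertag per word for a batch of sentences.
    model = GRUSupertagger(vocab_size=50_000, num_supertags=425)
    logits = model(torch.randint(0, 50_000, (2, 10)))
    tags = logits.argmax(dim=-1)  # (2, 10) predicted supertag indices

A bidirectional encoder is a standard way to let each tagging decision see both left and right context, which matters for the long-range dependencies the abstract highlights; the final linear layer handles the unusually large label set that makes supertagging harder than ordinary part-of-speech tagging.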