Neural network model for dependency parsing incorporating global vector feature


Bibliographic Details
Main Authors: Hengjun WANG, Nianwen SI, Yulong SONG, Yidong SHAN
Format: Article
Language: Chinese (zho)
Published: Editorial Department of Journal on Communications 2018-02-01
Series: Tongxin xuebao (Journal on Communications)
Subjects:
Online Access:http://www.joconline.com.cn/zh/article/doi/10.11959/j.issn.1000-436x.2018024/
Description
Summary: An LSTM and a piecewise CNN were used to extract word vector features and global vector features, respectively; the two feature sets were then fed into a feed-forward network for training. For model training, a probabilistic training method was adopted. Compared with the original dependency parsing model, the proposed model places more emphasis on global features and uses all potential dependency trees to update the model parameters. Experiments on the Chinese Penn Treebank 5 (CTB5) dataset show that, compared with parsing models using only an LSTM or only a CNN, the proposed model not only retains relatively low model complexity but also achieves higher accuracy.
ISSN: 1000-436X
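The piecewise CNN mentioned in the summary pools convolution outputs over segments of the sentence rather than over the whole sequence, producing a position-aware global feature vector. The sketch below illustrates that pooling step and a feed-forward arc scorer in NumPy; the split positions, feature sizes, and the single-layer scorer are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def piecewise_max_pool(conv_out, p1, p2):
    """Split the convolution output along the time axis at positions
    p1 < p2 (e.g. around the candidate head and dependent words) and
    max-pool each segment separately, then concatenate the results.
    For conv_out of shape (T, F) this yields a vector of size 3*F."""
    pieces = [conv_out[:p1], conv_out[p1:p2], conv_out[p2:]]
    return np.concatenate([p.max(axis=0) for p in pieces])

def score_arc(lstm_feat, global_feat, W, b):
    """Hypothetical feed-forward scorer: concatenate the LSTM word
    features with the piecewise-CNN global features and map them to a
    single arc score through one tanh layer."""
    x = np.concatenate([lstm_feat, global_feat])
    return float(np.tanh(x @ W + b))

# Toy example: 6 time steps, 2 convolution filters, splits at 2 and 4.
conv_out = np.arange(12.0).reshape(6, 2)
g = piecewise_max_pool(conv_out, 2, 4)  # global feature vector, shape (6,)
```

Compared with ordinary global max-pooling, the piecewise variant keeps coarse positional structure (before / between / after the word pair), which is why it is suited to producing the "global vector features" the abstract describes.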