Hand gesture recognition is an important modality for human-machine interaction and is widely used in areas such as health care, smart homes, and virtual reality. Despite many valuable efforts, efficient methods are still lacking for capturing fine-grained hand gestures and for tracking long-distance dependencies in complex gestures. In this paper, we first design a novel data glove with two arm rings and a specially integrated three-dimensional flex sensor to capture fine-grained motion from the full arm and all knuckles. Second, an improved deep feature fusion network is proposed to detect long-distance dependencies in complex hand gestures. To track detailed motion features, a convolutional neural network (CNN) based feature fusion strategy fuses multi-sensor data by extracting both shallow and deep features. Moreover, a residual module is introduced to avoid overfitting and gradient vanishing as the network deepens. Third, a long short-term memory (LSTM) model that takes the fused feature vector as input classifies complex hand motions into their corresponding categories. Comprehensive experiments demonstrate that our work outperforms related algorithms, especially on American Sign Language (99.93% precision) and Chinese Sign Language (96.1% precision).
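The pipeline described above (CNN-based fusion of shallow and deep features, a residual connection, and an LSTM classifier over the fused sequence) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual network: the layer sizes, sensor count, class count, and the exact fusion operator (channel-wise concatenation here) are all assumptions.

```python
import torch
import torch.nn as nn

class FusionLSTM(nn.Module):
    """Hypothetical sketch: CNN shallow/deep feature fusion with a
    residual connection, followed by an LSTM classifier.
    All dimensions below are illustrative, not from the paper."""
    def __init__(self, n_sensors=20, n_classes=26, hidden=64):
        super().__init__()
        # Shallow feature extractor over the sensor channels
        self.shallow = nn.Conv1d(n_sensors, 32, kernel_size=3, padding=1)
        # Deeper convolutional branch wrapped by a residual connection
        self.deep = nn.Sequential(
            nn.Conv1d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1),
        )
        # LSTM consumes the fused (shallow + deep) feature sequence
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden, batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):                   # x: (batch, time, sensors)
        x = x.transpose(1, 2)               # -> (batch, sensors, time)
        s = torch.relu(self.shallow(x))     # shallow features
        d = torch.relu(s + self.deep(s))    # residual module on deep branch
        f = torch.cat([s, d], dim=1)        # fuse shallow and deep features
        out, _ = self.lstm(f.transpose(1, 2))
        return self.fc(out[:, -1])          # classify from last time step

model = FusionLSTM()
logits = model(torch.randn(2, 50, 20))      # 2 sequences, 50 frames, 20 sensors
print(logits.shape)                         # torch.Size([2, 26])
```

The residual connection (`s + self.deep(s)`) lets gradients bypass the deeper convolutions, which is the standard mechanism for mitigating vanishing gradients when the network is deepened, as the abstract notes.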