Abstract:
Deep learning (DL) techniques are now used extensively in the semiconductor industry. At the same time, applying a DL approach to small datasets remains an immense challenge, because generating larger datasets via technology computer-aided design (TCAD) simulation incurs substantial computational time and cost. In this paper, to overcome this issue, a hybrid DL-aided prediction of the electrical characteristics of multichannel devices induced by work function fluctuation (WKF), trained on a smaller dataset, is proposed. For the first time, an amalgamation of two deep learning algorithms (i.e., 1D-CNN and LSTM) is implemented for all four channel configurations (1 to 4 channels) of gate-all-around (GAA) silicon nanosheet and nanofin MOSFETs (NS-FETs and NF-FETs). The proposed joint learning framework combines a one-dimensional convolutional neural network (1D-CNN) with a long short-term memory (LSTM) model. In this architecture, the CNN efficiently extracts features from the input WKF, and the LSTM captures the sequential dependence of those extracted features in the regression data. To illustrate the merit of the proposed approach, a comparative study of the hybrid model against three individual DL models, i.e., 1D-CNN, LSTM, and a baseline multilayer perceptron (MLP), is presented on a small dataset (1100 samples). The results show that 1D-CNN-LSTM achieves superior prediction, with an average root mean square error (RMSE) of 1.7943×10⁻⁷ and an average R² score of 96.18%, in the shortest time span among the four algorithms. Finally, the evaluation and performance results confirm that the hybrid methodology not only accommodates the complexity of both NS- and NF-FETs but also efficiently estimates the characteristics of all four channel configurations with a smaller dataset, a shorter time span, and reduced computational cost.
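To make the hybrid architecture concrete, the following is a minimal sketch of a 1D-CNN-LSTM regressor in TensorFlow/Keras, assuming the WKF input is represented as a one-dimensional sequence. The input length, layer sizes, and training settings here are illustrative assumptions and do not reflect the paper's exact configuration.

```python
# Minimal sketch of a hybrid 1D-CNN-LSTM regressor (TensorFlow/Keras).
# All hyperparameters (input length, filter counts, units) are
# illustrative assumptions, not the paper's exact configuration.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

N_STEPS = 100    # assumed length of the WKF input sequence
N_OUTPUTS = 1    # assumed target: one electrical characteristic per sample

def build_cnn_lstm(n_steps=N_STEPS, n_outputs=N_OUTPUTS):
    model = models.Sequential([
        layers.Input(shape=(n_steps, 1)),                     # WKF profile as a 1-D sequence
        layers.Conv1D(32, kernel_size=3, activation="relu"),  # CNN extracts local WKF features
        layers.MaxPooling1D(pool_size=2),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.LSTM(64),                                      # LSTM models the sequence of extracted features
        layers.Dense(32, activation="relu"),
        layers.Dense(n_outputs),                              # linear output for regression
    ])
    model.compile(
        optimizer="adam",
        loss="mse",
        metrics=[tf.keras.metrics.RootMeanSquaredError()],
    )
    return model

if __name__ == "__main__":
    # Synthetic stand-in for a ~1100-sample TCAD dataset (illustration only).
    X = np.random.rand(1100, N_STEPS, 1).astype("float32")
    y = np.random.rand(1100, N_OUTPUTS).astype("float32")
    model = build_cnn_lstm()
    model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```

In this kind of stack, the convolutional and pooling layers compress the raw WKF sequence into a shorter feature sequence, which the LSTM then summarizes before the dense head produces the regression output; the individual 1D-CNN and LSTM baselines in the comparison would each use only one of these two stages.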