Full text for this publication is not currently held within this repository. Alternative links are provided below where available.
© 2021 Elsevier B.V.
A recurrent neural network (RNN) is a dynamic neural network in which the current output depends on previous outputs. The long short-term memory (LSTM) network has emerged as a high-performance RNN. However, the original LSTM does not consider variable and sample relevance in process modelling. To overcome this problem, this paper proposes a dual-layer attention-based LSTM (DA-LSTM) network to model a fed-batch fermentation process. In the proposed DA-LSTM, an LSTM is used to extract features from the input data and to produce multiple time-series outputs in the hidden layer, an encoder input attention mechanism is used to select the relevant driving series in the input data sequence, and a temporal decoder attention mechanism is used to weight the importance of the encoder hidden states. With this deep architecture for high-level representations, the model can learn very complex dynamic systems. To demonstrate the effectiveness of the proposed method, a comparative study with the original LSTM and a single-attention-based LSTM is carried out. The results show that the proposed method gives better modelling performance than the other methods.
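The abstract describes two attention layers: an input attention that re-weights the driving series fed to the encoder, and a temporal attention that weights the encoder hidden states before decoding. The following is a minimal NumPy sketch of those two mechanisms only, not the authors' implementation; the dimensions, weight matrices (`W_e`, `v`), and random stand-ins for the encoder hidden states are all hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax used for both attention layers.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): n driving series, T time steps, m hidden units.
n, T, m = 4, 10, 8
X = rng.standard_normal((T, n))      # input sequence of n driving series

# --- Encoder input attention: weight each driving series at each step ---
W_e = rng.standard_normal((n, n))    # hypothetical scoring weights
for t in range(T):
    scores = X[t] @ W_e              # one relevance score per driving series
    alpha = softmax(scores)          # attention over the n series, sums to 1
    x_tilde = alpha * X[t]           # re-weighted input the encoder LSTM would consume

# --- Temporal decoder attention: weight the encoder hidden states ---
H = rng.standard_normal((T, m))      # stand-in for T encoder hidden states
v = rng.standard_normal(m)           # hypothetical scoring vector
beta = softmax(H @ v)                # one weight per time step, sums to 1
context = beta @ H                   # context vector passed to the decoder
```

In a full DA-LSTM each `x_tilde` would drive an encoder LSTM cell and the `context` vector would feed the decoder LSTM; here the point is only that both layers reduce to softmax-normalised relevance weights, over variables in the first case and over time steps in the second.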
Author(s): Liu K, Zhang J
Publication type: Book Chapter
Publication status: Published
Book Title: 31st European Symposium on Computer Aided Process Engineering
Print publication date: 25/06/2021
Online publication date: 18/07/2021
Acceptance date: 02/04/2020
Series Title: Computer Aided Chemical Engineering
Publisher: Elsevier B.V.
Place Published: Amsterdam