Assessing the Impact of Minor Modifications on the Interior Structure of GRU: GRU1 and GRU2
Date
2022
Authors
Yigit, Gulsum
Amasyali, Mehmet Fatih
Publisher
Wiley
Open Access Color
Green Open Access
Yes
Publicly Funded
No
Abstract
In this study, two GRU variants, GRU1 and GRU2, are proposed by making simple changes to the internal structure of the standard GRU, one of the most popular RNN variants. Comparative experiments are conducted on four problems: language modeling, question answering, the addition task, and sentiment analysis. In the addition task, curriculum learning and anti-curriculum learning strategies, which extend the training data with examples ordered from easy to hard or from hard to easy, are also compared. The GRU1 and GRU2 variants outperform the standard GRU, and the curriculum learning approach, in which the training data is expanded from easy to difficult, improves performance considerably.
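Since the abstract refers both to the gated structure that GRU1 and GRU2 modify and to the easy-to-hard expansion of the addition-task training data, a minimal sketch may help make those two ideas concrete. The exact internal changes that define GRU1 and GRU2 are not given in this record, so the code below only implements the standard GRU cell update as the baseline and a hypothetical curriculum schedule for the addition task; names such as gru_cell_step, make_addition_example, and curriculum_schedule are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch, assuming only NumPy. Part (a) is one step of a standard
# GRU cell, the baseline whose internal structure GRU1 and GRU2 modify
# (their exact modifications are not specified here). Part (b) is a
# hypothetical easy-to-hard curriculum for the addition task, in which the
# training set is extended stage by stage with longer additions.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell_step(x_t, h_prev, params):
    """One standard GRU step: update gate z, reset gate r, candidate state."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x_t + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x_t + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev) + bh)   # candidate hidden state
    return (1.0 - z) * h_prev + z * h_tilde                # new hidden state

def make_addition_example(n_digits, rng):
    """A hypothetical addition-task example with operands of n_digits digits."""
    a, b = rng.integers(0, 10 ** n_digits, size=2)
    return f"{a}+{b}", str(a + b)

def curriculum_schedule(max_digits, per_stage, rng):
    """Easy-to-hard curriculum: keep the easier (shorter) additions and
    extend the training data with harder (longer) ones at each stage."""
    train = []
    for n_digits in range(1, max_digits + 1):              # easy -> hard
        train += [make_addition_example(n_digits, rng) for _ in range(per_stage)]
        yield list(train)                                  # training data so far

rng = np.random.default_rng(0)

# (a) Shape check for the GRU step (input size 3, hidden size 4).
x_dim, h_dim = 3, 4
params = tuple(
    m for _ in range(3)                                    # z, r, and candidate blocks
    for m in (rng.standard_normal((h_dim, x_dim)),
              rng.standard_normal((h_dim, h_dim)),
              np.zeros(h_dim))
)
h_t = gru_cell_step(rng.standard_normal(x_dim), np.zeros(h_dim), params)
print("h_t shape:", h_t.shape)

# (b) The training set grows from 1-digit to 3-digit additions.
for stage, data in enumerate(curriculum_schedule(3, per_stage=4, rng=rng), start=1):
    print(f"stage {stage}: {len(data)} examples, hardest example: {data[-1][0]}")
```

Under these assumptions, the anti-curriculum strategy described in the abstract would simply iterate the digit lengths in reverse (hard to easy), while a no-curriculum baseline would draw all lengths at once.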
Keywords
curriculum learning, gated recurrent units, recurrent neural networks, Seq2seq, short-term dependency
Fields of Science
0202 electrical engineering, electronic engineering, information engineering; 02 engineering and technology
WoS Q
Q3
Scopus Q
Q2

OpenCitations Citation Count
3
Source
Concurrency and Computation: Practice and Experience
Volume
34
Issue
20
PlumX Metrics
Citations
CrossRef: 4
Scopus: 4
Captures
Mendeley Readers: 5
SCOPUS™ Citations
4
checked on Feb 01, 2026
Web of Science™ Citations
3
checked on Feb 01, 2026