THESIS
2023
1 online resource (xi, 52 pages) : illustrations (some color)
Abstract
In this thesis, we investigate the potential of the Transformer algorithm for short-term end-user
load demand forecasting in power systems. To this end, we compare the performance of the
Transformer model with that of two conventional recurrent neural network models, namely
Long Short-Term Memory and Gated Recurrent Units. Our findings indicate that although the
Transformer model can effectively parallelize the learning process and handle long-sequence
dependencies, it does not outperform conventional recurrent neural network
architectures that have undergone an architectural modification. This study sheds light on the
potential of architectural modification for the Transformer algorithm in the context of time
series applications, specifically in load demand forecasting in power systems. Furthermore, it
emphasizes the potential of the time series transformer approach as a viable alternative for
addressing forecasting challenges in power systems.