Prediction of Lightning Strikes on Electrical Transmission Lines Using Machine Learning Approaches
Keywords:
Transformer model, RNN, LSTM, lightning strike forecasting, transmission tower, time-series prediction

Abstract
This study evaluates and compares the performance of three machine learning models—RNN, LSTM, and Transformer—for forecasting lightning strikes that can disrupt transmission towers along the 150 kV Bukit Asam - Baturaja line. The models were trained and validated on historical lightning data from 2018 to 2024, with the Transformer demonstrating superior predictive accuracy: it achieved an R-squared value of 0.9543, significantly outperforming the RNN and LSTM models. The results indicate that the Transformer's self-attention mechanism effectively captures long-term dependencies and patterns, making it a reliable choice for forecasting. However, the study is limited to this specific region and dataset, highlighting the need for future research to incorporate additional variables, such as meteorological and geographical factors, for improved adaptability. The findings underscore the importance of accurate and efficient forecasting models to support proactive measures and mitigate lightning-induced disturbances on transmission infrastructure, and they point to machine learning as a promising technology for risk mitigation in electrical transmission networks.
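To illustrate the kind of model the study compares, the sketch below shows a minimal Transformer encoder for next-step forecasting over sliding windows of lightning-strike data. This is an assumption-laden illustration, not the authors' implementation: the abstract does not specify the architecture, window length, or input features, so the univariate input, learned positional encoding, and layer sizes here are all hypothetical.

```python
# Minimal sketch of a Transformer time-series forecaster (PyTorch).
# Assumptions (not from the paper): univariate strike counts, a fixed
# sliding window, learned positional encodings, next-step prediction.
import torch
import torch.nn as nn

class TransformerForecaster(nn.Module):
    def __init__(self, window=30, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)                       # scalar -> d_model
        self.pos = nn.Parameter(torch.zeros(window, d_model))    # learned positions
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)  # stacked self-attention
        self.head = nn.Linear(d_model, 1)                        # next-step output

    def forward(self, x):             # x: (batch, window, 1)
        h = self.embed(x) + self.pos  # inject order information
        h = self.encoder(h)           # self-attention spans the whole window,
                                      # which is what captures long-range patterns
        return self.head(h[:, -1])    # forecast from the final time step

# Evaluation with the metric reported in the study (R-squared), e.g.:
# from sklearn.metrics import r2_score
# r2 = r2_score(y_true, y_pred)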
License
Copyright (c) 2025 ITB Graduate School Conference

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
