Submitted:
11 July 2024
Posted:
14 July 2024
Abstract
Keywords:
1. Introduction
2. Background and Related Works
2.1. Time-Series Classification
2.2. Time-Series Clustering
2.3. Time-Series Regression
3. Photometric Data and Data Pre-Processing
A TSER model is represented as a function $f: \mathcal{T} \to \mathbb{R}$, where $\mathcal{T}$ denotes a set of time series. The objective of time-series extrinsic regression is to derive a regression model from a dataset $D = \{(t_i, y_i)\}_{i=1}^{N}$, where each $t_i \in \mathcal{T}$ is a time series and each $y_i \in \mathbb{R}$ is a continuous scalar value.
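As a minimal illustration of this setup, a TSER dataset pairs each time series with a scalar target, and even a simple feature-based linear baseline fits this form. The sketch below uses synthetic data and a least-squares fit; it is illustrative only, not the paper's photometric pipeline or model.

```python
import numpy as np

# Minimal TSER sketch: learn f: T -> R from pairs (t_i, y_i), where t_i is a
# time series and y_i a continuous scalar target (synthetic data for illustration).
rng = np.random.default_rng(0)

N, L = 200, 50                       # hypothetical dataset size / series length
X = rng.normal(size=(N, L))          # N time series of length L
y = X.mean(axis=1) * 2.0 + 0.5       # scalar target derived from each series

# Baseline regressor: summarize each series with simple features,
# then fit a linear model by least squares.
feats = np.column_stack([X.mean(axis=1), X.std(axis=1), np.ones(N)])
coef, *_ = np.linalg.lstsq(feats, y, rcond=None)

pred = feats @ coef
rmse = np.sqrt(np.mean((pred - y) ** 2))
```

Since the synthetic target is exactly linear in the mean feature, the baseline recovers it; real light-curve regression, of course, requires the deep models discussed in Section 4.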
dex; mag;
- Phase Folding: Phase folding transforms observations of a variable star’s brightness over time into a phase-folded light curve by folding them on the star’s known or estimated period, that is, the duration of one complete cycle of variability (for example, one pulsation cycle). Folding aligns multiple cycles of variability so that they overlap, which simplifies the analysis of the star’s variability pattern and allows astronomers to study its periodic behavior and compare observations more effectively.
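The folding step described above reduces to one modular-arithmetic operation. A minimal sketch, where the function name, reference epoch, and example values are illustrative:

```python
import numpy as np

def phase_fold(times, period, t0=0.0):
    """Map observation times onto phases in [0, 1) using the star's period,
    so that all cycles of variability overlap in a single phased light curve."""
    return np.mod((np.asarray(times, dtype=float) - t0) / period, 1.0)

# Hypothetical example: a star with a 0.5-day period observed at arbitrary times.
times = np.array([0.00, 0.25, 0.60, 1.10, 2.30])
phases = phase_fold(times, period=0.5)
# Observations at t = 0.60 and t = 1.10 (one period apart) land on the same phase.
```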
- Phase Alignment: Alignment adjusts multiple observations of a variable star’s light curve to a common reference point. This is particularly important for stars with irregular or asymmetric variability patterns: aligned observations allow more accurate comparison of the shape, timing, and amplitude of the brightness variations, which helps in identifying patterns, detecting periodicity, and studying the underlying physical mechanisms driving the variability. RRab-type stars, for example, have a sawtooth-shaped light curve with a rapid rise and a slow decline; because this shape is strongly asymmetric, phase alignment is particularly important for this kind of star.
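One simple way to implement the alignment described above is to shift the folded phases so that the brightest observation (smallest magnitude) lands at phase zero. The function and values below are an illustrative sketch under that convention, not necessarily the paper's exact procedure:

```python
import numpy as np

def align_to_maximum(phases, mags):
    """Shift folded phases so the brightest point (minimum magnitude) sits at
    phase 0, giving asymmetric RRab light curves a common reference point."""
    shift = phases[np.argmin(mags)]
    return np.mod(phases - shift, 1.0)

# Hypothetical folded curve: the brightest point originally sits at phase 0.3.
phases = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
mags   = np.array([15.2, 14.6, 15.0, 15.3, 15.4])   # smaller value = brighter
aligned = align_to_maximum(phases, mags)
# After alignment, the maximum-brightness observation is at phase 0.
```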
4. Methodology
4.1. Model Selection and Optimization
4.2. Network Architectures
4.2.1. Fully Convolutional Network
4.2.2. InceptionTime
4.2.3. Residual Network
4.2.4. Long Short-Term Memory and Bi-directional Long Short-Term Memory
4.2.5. Gated Recurrent Unit and Bi-directional Gated Recurrent Unit
4.2.6. Convolutional GRU and Convolutional LSTM
5. Results and Discussion
5.1. Experiment Setup
5.2. Results of the Experiments
5.3. Performance Comparison of Predictive Models
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| ESA | European Space Agency |
| HDS | High Dispersion Spectroscopy |
| BP and RP | Blue Photometer and Red Photometer |
| TSER | Time Series Extrinsic Regression |
| TSC | Time Series Classification |
| BJD | Barycentric Julian Day |
| RNN | Recurrent Neural Networks |
| CNN | Convolutional Neural Networks |
| dTCN | dilated Temporal Convolutional Neural Networks |
| tCNN | temporal Convolutional Neural Networks |
| LSTM and BiLSTM | Long Short-Term Memory and Bi-Directional Long Short-Term Memory |
| GRU and BiGRU | Gated Recurrent Unit and Bi-Directional Gated Recurrent Unit |
| FCN | Fully Convolutional Network |
| ReLU | Rectified Linear Unit |
| ResNet | Residual Network |
| SOM | Self-Organizing Maps |
| RF | Random Forest |
| MSE | Mean Squared Error |
| RMSE and wRMSE | Root Mean Squared Error and Weighted Root Mean Squared Error |
| MAE and wMAE | Mean Absolute Error and Weighted Mean Absolute Error |
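The regression metrics abbreviated above can be sketched as follows. The inverse-variance weighting shown here is an illustrative assumption (the abbreviation list only names the metrics, not the paper's exact weighting scheme):

```python
import numpy as np

# Sketch of the error metrics used for the [Fe/H] regression.
def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def wrmse(y_true, y_pred, w):
    return np.sqrt(np.sum(w * (y_true - y_pred) ** 2) / np.sum(w))

def wmae(y_true, y_pred, w):
    return np.sum(w * np.abs(y_true - y_pred)) / np.sum(w)

# Hypothetical [Fe/H] values (dex) with assumed inverse-variance weights.
y_true = np.array([-0.5, -1.0, -1.5])
y_pred = np.array([-0.4, -1.2, -1.4])
sigma  = np.array([0.37, 0.39, 0.40])   # per-star uncertainties
w = 1.0 / sigma ** 2
```

With uniform weights, wRMSE and wMAE reduce to their unweighted counterparts.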
Appendix A
Raw catalog

| Model | R² (train) | R² (val) | wRMSE (train) | wRMSE (val) | wMAE (train) | wMAE (val) | RMSE (train) | RMSE (val) | MAE (train) | MAE (val) |
|---|---|---|---|---|---|---|---|---|---|---|
| FCN | 0.8898 | 0.7188 | 0.1042 | 0.1664 | 0.0786 | 0.1297 | 0.1049 | 0.1669 | 0.0792 | 0.1302 |
| ResNet | 0.954 | 0.7863 | 0.0673 | 0.1451 | 0.0443 | 0.1132 | 0.0676 | 0.1455 | 0.0446 | 0.1135 |
| Inception | 0.966 | 0.7888 | 0.0579 | 0.1442 | 0.037 | 0.1135 | 0.0584 | 0.1447 | 0.0374 | 0.1139 |
| LSTM | -0.0326 | -0.0326 | 0.3168 | 0.3168 | 0.2367 | 0.2367 | 0.3189 | 0.3189 | 0.2384 | 0.2384 |
| BiLSTM | 0.9323 | 0.9230 | 0.0811 | 0.0865 | 0.0619 | 0.0658 | 0.0814 | 0.0868 | 0.0622 | 0.0661 |
| GRU | 0.9358 | 0.9247 | 0.0790 | 0.0856 | 0.0602 | 0.0640 | 0.0792 | 0.0858 | 0.0604 | 0.0642 |
| BiGRU | 0.9407 | 0.9294 | 0.0759 | 0.0828 | 0.0578 | 0.0621 | 0.0762 | 0.0831 | 0.0581 | 0.0624 |
| ConvLSTM | 0.1636 | 0.1570 | 0.2851 | 0.2862 | 0.2006 | 0.2031 | 0.2869 | 0.2883 | 0.2021 | 0.2048 |
| ConvGRU | 0.9225 | 0.8980 | 0.0868 | 0.0996 | 0.0671 | 0.0751 | 0.0871 | 0.0998 | 0.0674 | 0.0754 |

Pre-processed catalog without mean magnitude

| Model | R² (train) | R² (val) | wRMSE (train) | wRMSE (val) | wMAE (train) | wMAE (val) | RMSE (train) | RMSE (val) | MAE (train) | MAE (val) |
|---|---|---|---|---|---|---|---|---|---|---|
| FCN | 0.9303 | 0.9250 | 0.0823 | 0.0854 | 0.0600 | 0.0618 | 0.0824 | 0.0855 | 0.0602 | 0.0620 |
| ResNet | 0.9400 | 0.9247 | 0.0764 | 0.0856 | 0.0561 | 0.0614 | 0.0767 | 0.0857 | 0.0564 | 0.0617 |
| Inception | 0.9449 | 0.9328 | 0.0732 | 0.0808 | 0.0539 | 0.0582 | 0.0735 | 0.0811 | 0.0542 | 0.0585 |
| LSTM | 0.9343 | 0.9254 | 0.0799 | 0.0851 | 0.0594 | 0.0622 | 0.0802 | 0.0854 | 0.0597 | 0.0624 |
| BiLSTM | 0.9333 | 0.9258 | 0.0805 | 0.0850 | 0.0590 | 0.0614 | 0.0807 | 0.0852 | 0.0593 | 0.0617 |
| GRU | 0.9382 | 0.9328 | 0.0775 | 0.0808 | 0.0569 | 0.0587 | 0.0778 | 0.0812 | 0.0572 | 0.0591 |
| BiGRU | 0.9329 | 0.9273 | 0.0808 | 0.0841 | 0.0595 | 0.0613 | 0.0810 | 0.0843 | 0.0597 | 0.0615 |
| ConvLSTM | 0.9439 | 0.9232 | 0.0738 | 0.0864 | 0.0547 | 0.0635 | 0.0742 | 0.0868 | 0.0550 | 0.0638 |
| ConvGRU | 0.9437 | 0.9273 | 0.0739 | 0.0841 | 0.0548 | 0.0610 | 0.0742 | 0.0842 | 0.0551 | 0.0613 |

Pre-processed catalog

| Model | R² (train) | R² (val) | wRMSE (train) | wRMSE (val) | wMAE (train) | wMAE (val) | RMSE (train) | RMSE (val) | MAE (train) | MAE (val) |
|---|---|---|---|---|---|---|---|---|---|---|
| FCN | 0.9375 | 0.9317 | 0.0780 | 0.0815 | 0.0570 | 0.0585 | 0.0780 | 0.0815 | 0.0571 | 0.0587 |
| ResNet | 0.9506 | 0.9303 | 0.0693 | 0.0823 | 0.0522 | 0.0601 | 0.0695 | 0.0823 | 0.0525 | 0.0603 |
| Inception | 0.9508 | 0.9392 | 0.0691 | 0.0769 | 0.0520 | 0.0564 | 0.0693 | 0.0769 | 0.0523 | 0.0567 |
| LSTM | 0.9358 | 0.9301 | 0.0790 | 0.0824 | 0.0597 | 0.0612 | 0.0792 | 0.0825 | 0.0600 | 0.0615 |
| BiLSTM | 0.9396 | 0.9333 | 0.0767 | 0.0805 | 0.0574 | 0.0594 | 0.0769 | 0.0807 | 0.0577 | 0.0597 |
| GRU | 0.9447 | 0.9401 | 0.0733 | 0.0763 | 0.0547 | 0.0563 | 0.0735 | 0.0765 | 0.0549 | 0.0565 |
| BiGRU | 0.9420 | 0.9368 | 0.0751 | 0.0784 | 0.0560 | 0.0575 | 0.0753 | 0.0785 | 0.0562 | 0.0578 |
| ConvLSTM | 0.9400 | 0.9271 | 0.0764 | 0.0842 | 0.0581 | 0.0630 | 0.0767 | 0.0844 | 0.0584 | 0.0633 |
| ConvGRU | 0.9410 | 0.9325 | 0.0757 | 0.0810 | 0.0571 | 0.0597 | 0.0759 | 0.0811 | 0.0573 | 0.0599 |
References
- Smith, H.A. RR Lyrae stars; Vol. 27, Cambridge University Press, 2004.
- Tanakul, N.; Sarajedini, A. RR Lyrae variables in M31 and its satellites: An analysis of the galaxy’s population. Monthly Notices of the Royal Astronomical Society 2018, 478, 4590–4601. [Google Scholar] [CrossRef]
- Clementini, G.; Ripepi, V.; Molinaro, R.; Garofalo, A.; Muraveva, T.; Rimoldini, L.; Guy, L.P.; Fombelle, G.J.D.; Nienartowicz, K.; Marchal, O.; Audard, M.; Holl, B.; Leccia, S.; Marconi, M.; Musella, I.; Mowlavi, N.; Lecoeur-Taibi, I.; Eyer, L.; Ridder, J.D.; Regibo, S.; Sarro, L.M.; Szabados, L.; Evans, D.W.; Riello, M. Gaia Data Release 2: Specific characterisation and validation of all-sky Cepheids and RR Lyrae stars. Astronomy and Astrophysics 2019, 622. [Google Scholar] [CrossRef]
- Dékány, I.; Grebel, E.K. Near-infrared Search for Fundamental-mode RR Lyrae Stars toward the Inner Bulge by Deep Learning. The Astrophysical Journal 2020, 898, 46. [Google Scholar] [CrossRef]
- Bhardwaj, A. RR Lyrae and Type II Cepheid Variables in Globular Clusters: Optical and Infrared Properties, 2022. [CrossRef]
- Jurcsik, J.; Kovács, G. Determination of [Fe/H] from the light curves of RR Lyrae stars. Astronomy and Astrophysics 1996, 312, 111–120. [Google Scholar]
- Layden, A.C. The metallicities and kinematics of RR Lyrae. New observations of local stars, 1994.
- Smolec, R. Metallicity dependence of the Blazhko effect. arXiv preprint astro-ph/0503614, 2005. [CrossRef]
- Ngeow, C.C.; Yu, P.C.; Bellm, E.; Yang, T.C.; Chang, C.K.; Miller, A.; Laher, R.; Surace, J.; Ip, W.H. The Palomar Transient Factory and RR Lyrae: the metallicity–light curve relation based on ab-type RR Lyrae in the Kepler field. The Astrophysical Journal Supplement Series 2016, 227, 30. [Google Scholar] [CrossRef]
- Skowron, D.; Soszyński, I.; Udalski, A.; Szymański, M.; Pietrukowicz, P.; Poleski, R.; Wyrzykowski, Ł.; Ulaczyk, K.; Kozłowski, S.; Skowron, J.; et al. OGLE-ing the Magellanic System: Photometric Metallicity from Fundamental Mode RR Lyrae Stars, 2016. [Google Scholar]
- Mullen, J.P.; Marengo, M.; Martínez-Vázquez, C.E.; Neeley, J.R.; Bono, G.; Dall’Ora, M.; Chaboyer, B.; Thévenin, F.; Braga, V.F.; Crestani, J.; Fabrizio, M.; Fiorentino, G.; Gilligan, C.K.; Monelli, M.; Stetson, P.B. Metallicity of Galactic RR Lyrae from Optical and Infrared Light Curves. I. Period–Fourier–Metallicity Relations for Fundamental-mode RR Lyrae. The Astrophysical Journal 2021, 912, 144. [Google Scholar] [CrossRef]
- Crestani, J.; Fabrizio, M.; Braga, V.F.; Sneden, C.; Preston, G.; Ferraro, I.; Iannicola, G.; Bono, G.; Alves-Brito, A.; Nonino, M.; D’Orazi, V.; Inno, L.; Monelli, M.; Storm, J.; Altavilla, G.; Chaboyer, B.; Dall’Ora, M.; Fiorentino, G.; Gilligan, C.; Grebel, E.K.; Lala, H.; Lemasle, B.; Marengo, M.; Marinoni, S.; Marrese, P.M.; Martínez-Vázquez, C.E.; Matsunaga, N.; Mullen, J.P.; Neeley, J.; Prudil, Z.; da Silva, R.; Stetson, P.B.; Thévenin, F.; Valenti, E.; Walker, A.; Zoccali, M. On the Use of Field RR Lyrae as Galactic Probes. II. A New ΔS Calibration to Estimate Their Metallicity*. The Astrophysical Journal 2021, 908, 20. [Google Scholar] [CrossRef]
- Gilligan, C.K.; Chaboyer, B.; Marengo, M.; Mullen, J.P.; Bono, G.; Braga, V.F.; Crestani, J.; Dall’Ora, M.; Fiorentino, G.; Monelli, M.; et al. Metallicities from high-resolution spectra of 49 RR Lyrae variables. Monthly Notices of the Royal Astronomical Society 2021, 503, 4719–4733. [Google Scholar]
- Vallenari, A.; Brown, A.G.; Prusti, T.; De Bruijne, J.H.; Arenou, F.; Babusiaux, C.; Biermann, M.; Creevey, O.L.; Ducourant, C.; Evans, D.W.; et al. Gaia Data Release 3: Summary of the content and survey properties. Astronomy & Astrophysics 2023, 674, A1. [Google Scholar]
- Clementini, G.; Ripepi, V.; Garofalo, A.; Molinaro, R.; Muraveva, T.; Leccia, S.; Rimoldini, L.; Holl, B.; de Fombelle, G.J.; Sartoretti, P.; et al. Gaia Data Release 3: Specific processing and validation of all-sky RR Lyrae and Cepheid stars. The RR Lyrae sample. Astronomy & Astrophysics 2023, 674, A18. [Google Scholar]
- Naul, B.; Bloom, J.S.; Pérez, F.; Walt, S.V.D. A recurrent neural network for classification of unevenly sampled variable stars. Nature Astronomy 2018, 2, 151–155. [Google Scholar] [CrossRef]
- Aguirre, C.; Pichara, K.; Becker, I. Deep multi-survey classification of variable stars, 2019. [CrossRef]
- Jamal, S.; Bloom, J.S. On Neural Architectures for Astronomical Time-series Classification with Application to Variable Stars. The Astrophysical Journal Supplement Series 2020, 250, 30. [Google Scholar] [CrossRef]
- Kang, Z.; Zhang, Y.; Zhang, J.; Li, C.; Kong, M.; Zhao, Y.; Wu, X.B. Periodic Variable Star Classification with Deep Learning: Handling Data Imbalance in an Ensemble Augmentation Way. Publications of the Astronomical Society of the Pacific 2023, 135. [Google Scholar] [CrossRef]
- Allam, T.; Peloton, J.; McEwen, J.D. The Tiny Time-series Transformer: Low-latency High-throughput Classification of Astronomical Transients using Deep Model Compression, 2023.
- Rebbapragada, U.; Protopapas, P.; Brodley, C.E.; Alcock, C. Finding anomalous periodic time series : An application to catalogs of periodic variable stars. Machine Learning 2009, 74, 281–313. [Google Scholar] [CrossRef]
- Armstrong, D.J.; Kirk, J.; Lam, K.W.; McCormac, J.; Osborn, H.P.; Spake, J.; Walker, S.; Brown, D.J.; Kristiansen, M.H.; Pollacco, D.; West, R.; Wheatley, P.J. K2 variable catalogue - II. Machine learning classification of variable stars and eclipsing binaries in K2 fields 0-4. Monthly Notices of the Royal Astronomical Society 2016, 456, 2260–2272. [Google Scholar] [CrossRef]
- Mackenzie, C.; Pichara, K.; Protopapas, P. Clustering-based feature learning on variable stars. The Astrophysical Journal 2016, 820, 138. [Google Scholar] [CrossRef]
- Valenzuela, L.; Pichara, K. Unsupervised classification of variable stars. Monthly Notices of the Royal Astronomical Society 2018, 474, 3259–3272. [Google Scholar] [CrossRef]
- Sanders, J.L.; Matsunaga, N. Hunting for C-rich long-period variable stars in the Milky Way’s bar-bulge using unsupervised classification of Gaia BP/RP spectra. Monthly Notices of the Royal Astronomical Society 2023, 521, 2745–2764. [Google Scholar] [CrossRef]
- Surana, S.; Wadadekar, Y.; Bait, O.; Bhosale, H. Predicting star formation properties of galaxies using deep learning. Monthly Notices of the Royal Astronomical Society 2021, 493, 4808–4815. [Google Scholar] [CrossRef]
- Noughani, N.G.; Kotulla, R. Chasing Down Variables from a Decade-Long Dataset.
- R., M.F.; Corral, L.J.; Fierro-Santillán, C.R.; Navarro, S.G. O-type Stars Stellar Parameter Estimation Using Recurrent Neural Networks, 2022.
- Dékány, I.; Grebel, E.K. Photometric Metallicity Prediction of Fundamental-mode RR Lyrae Stars in the Gaia Optical and K s Infrared Wave Bands by Deep Learning. The Astrophysical Journal Supplement Series 2022, 261, 33. [Google Scholar] [CrossRef]
- Dékány, I.; Grebel, E.K.; Pojmański, G. Metallicity Estimation of RR Lyrae Stars From Their I-Band Light Curves. The Astrophysical Journal 2021, 920, 33. [Google Scholar] [CrossRef]
- Tan, C.W.; Bergmeir, C.; Petitjean, F.; Webb, G.I. Time series extrinsic regression: Predicting numeric values from time series data. Data Mining and Knowledge Discovery 2021, 35, 1032–1060. [Google Scholar] [CrossRef] [PubMed]
- Layden, A.C. The metallicities and kinematics of RR Lyrae variables, 1: New observations of local stars. The Astronomical Journal 1994, 108, 1016–1041. [Google Scholar] [CrossRef]
- Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 3431–3440. [CrossRef]
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. International conference on machine learning. pmlr, 2015, pp. 448–456.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. Proceedings of the IEEE conference on computer vision and pattern recognition, 2015, pp. 1–9. [CrossRef]
- Ismail Fawaz, H.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.A. Deep learning for time series classification: a review. Data mining and knowledge discovery 2019, 33, 917–963. [Google Scholar] [CrossRef]
- Ismail Fawaz, H.; Lucas, B.; Forestier, G.; Pelletier, C.; Schmidt, D.F.; Weber, J.; Webb, G.I.; Idoumghar, L.; Muller, P.A.; Petitjean, F. Inceptiontime: Finding alexnet for time series classification. Data Mining and Knowledge Discovery 2020, 34, 1936–1962. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. Proceedings of the IEEE conference on computer vision and pattern recognition, 2016, pp. 770–778. [CrossRef]
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural computation 1997, 9, 1735–1780. [Google Scholar] [CrossRef]
- Pascanu, R.; Mikolov, T.; Bengio, Y. On the difficulty of training recurrent neural networks. International conference on machine learning. Pmlr, 2013, pp. 1310–1318. [CrossRef]
- Cho, K.; Van Merriënboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP) 2014. [Google Scholar] [CrossRef]
- Karim, F.; Majumdar, S.; Darabi, H.; Chen, S. LSTM fully convolutional networks for time series classification. IEEE access 2017, 6, 1662–1669. [Google Scholar] [CrossRef]
- Yu, Z.; Liu, G.; Liu, Q.; Deng, J. Spatio-temporal convolutional features with nested LSTM for facial expression recognition. Neurocomputing 2018, 317, 50–57. [Google Scholar] [CrossRef]
- Majd, M.; Safabakhsh, R. Correlational convolutional LSTM for human action recognition. Neurocomputing 2020, 396, 224–229. [Google Scholar] [CrossRef]






| id | source_id | period | AmpG | #epochs | [Fe/H] | [Fe/H] uncertainty |
|---|---|---|---|---|---|---|
| 0 | 5978423987417346304 | 0.415071 | 0.61029154 | 53 | -0.144963 | 0.398111 |
| 1 | 5358310424375618304 | 0.407642 | 0.6174223 | 56 | -0.223005 | 0.391468 |
| 2 | 5341271082206872704 | 0.327778 | 0.7399841 | 53 | 0.087612 | 0.382031 |
| 3 | 5844089608021904768 | 0.459576 | 0.47177884 | 54 | -0.380516 | 0.396500 |
| 4 | 5992931321712867200 | 0.390948 | 0.76943225 | 63 | -0.256892 | 0.391830 |
| ... | ... | ... | ... | ... | ... | ... |
| 5997 | 5917421845281955584 | 0.532958 | 0.9153245 | 52 | -1.507490 | 0.379151 |
| 5998 | 4659766188753815552 | 0.413777 | 0.9984105 | 245 | -0.758832 | 0.373709 |
| 5999 | 5868263951719014528 | 0.365109 | 1.0959375 | 66 | -0.300124 | 0.380579 |
| 6000 | 5963340573264428928 | 0.452752 | 1.0733474 | 64 | -1.079237 | 0.369655 |
| 6001 | 5796804423258834560 | 0.510323 | 1.0356201 | 57 | -1.553741 | 0.372771 |
| Layers | Hyperparameters | Parameters |
|---|---|---|
| input_1 | ... | 0 |
| gru_1 | 20 units | 1440 |
| dropout_1 | rate = 0.2 | 0 |
| gru_2 | 16 units | 1824 |
| dropout_2 | rate = 0.2 | 0 |
| gru_3 | 8 units | 624 |
| dropout_3 | rate = 0.1 | 0 |
| dense | ... | 9 |
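The Parameters column in the table above is consistent with a Keras-style GRU that uses two bias vectors per gate (reset_after=True) and receives 2 input features per time step; that input dimension is an assumption made for this check, not stated in the table itself:

```python
def gru_params(input_dim, units):
    # Keras-style GRU with reset_after=True: 3 gates, each with an input
    # kernel, a recurrent kernel, and two bias vectors.
    return 3 * units * (input_dim + units + 2)

def dense_params(input_dim, units=1):
    # Dense layer: one weight per input plus a bias, per output unit.
    return units * (input_dim + 1)

# Reproduce the Parameters column, assuming 2 input features per time step.
layers = [gru_params(2, 20),    # gru_1
          gru_params(20, 16),   # gru_2
          gru_params(16, 8),    # gru_3
          dense_params(8)]      # dense
```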
| Metric | Split | Proposed best model (GRU) | Dékány best model (BiLSTM) |
|---|---|---|---|
| Dataset (number of RR Lyrae stars) | | 6002 | 4458 |
| R² (regression performance) | training | 0.9447 | 0.96 |
| | validation | 0.9401 | 0.93 |
| wRMSE | training | 0.0733 | 0.10 |
| | validation | 0.0763 | 0.13 |
| wMAE | training | 0.0547 | 0.07 |
| | validation | 0.0563 | 0.10 |
| RMSE | training | 0.0735 | 0.15 |
| | validation | 0.0765 | 0.18 |
| MAE | training | 0.0549 | 0.12 |
| | validation | 0.0565 | 0.13 |
| Architecture | | GRU (3 layers) | BiLSTM (2 layers) |
| Regularization | | L1, L2 | L1 |
| Dropout | | Yes | Yes |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).