Using Machine Learning to Predict the Correlation of Spectra Using SDSS Magnitudes as an Improvement to the Locus Algorithm

Eugene Hickey, Technological University Dublin
Oisin Creaner, Dublin Institute for Advanced Studies
Kevin Nolan, Technological University Dublin
Tom O'Flynn, Technological University Dublin

Document Type: Article

New Astronomy



The Locus Algorithm is a new technique to improve the quality of differential photometry by optimising the choice of reference stars. At the heart of this algorithm is a routine that rates each potential reference star by comparing its SDSS magnitude values to those of the target star. In this way, the difference in wavelength-dependent effects of the Earth's atmospheric scattering between target and reference can be minimised. This paper sets out a new way to estimate the quality of each reference star using machine learning. A random subset of stars from SDSS with spectra was chosen, and for each one a suitable reference star, also with a spectrum, was selected. The correlation between the two spectra in the SDSS r band (between 550 nm and 700 nm) was taken as the gold-standard measure of how well they match for differential photometry. The five SDSS magnitude values for each of these stars were used as predictors. A number of supervised machine learning models were constructed on a training set of the stars, and each was evaluated on a testing set. The model using Support Vector Regression (SVR) had the best performance. It was then tested on a final, hold-out validation set of stars to obtain an unbiased measure of its performance. With an R² of 0.62, the SVR model offers enhanced performance for the Locus Algorithm technique.
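The workflow described in the abstract can be sketched as follows. This is an illustrative example, not the authors' code: the data here are synthetic stand-ins for the five SDSS magnitudes and the spectral-correlation target, and the particular SVR hyperparameters (RBF kernel, C = 10) are assumptions for demonstration only.

```python
# Sketch of the abstract's modelling step: predicting a spectral-correlation
# score from the five SDSS magnitudes (u, g, r, i, z) with Support Vector
# Regression. Synthetic data replaces the real SDSS spectra/magnitudes.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)

# Synthetic stand-in: 1000 "star pairs", each described by five
# magnitude differences between target and candidate reference.
X = rng.normal(0.0, 0.5, size=(1000, 5))
# Fake gold-standard target: correlation decays as the magnitude
# differences grow, plus observational noise.
y = 1.0 - np.linalg.norm(X, axis=1) / 3.0 + rng.normal(0.0, 0.05, 1000)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Standardise the magnitudes, then fit an RBF-kernel SVR.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X_train, y_train)

# Held-out performance, analogous to the paper's validation-set R².
r2 = r2_score(y_test, model.predict(X_test))
print(f"held-out R^2: {r2:.2f}")
```

In practice the predictors would be the catalogued SDSS ugriz magnitudes of each star pair, and the target would be the measured correlation of their spectra over the r band, as described above.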