Files
Download Full Text (295 KB)
Publisher
Technological University Dublin
Description
The location of intersections is an important consideration for vulnerable road users such as People with Blindness or Vision Impairment (PBVI) or children. Route planning applications, however, do not give information about the location of intersections, as this information is not available at scale. In this paper, we propose a deep learning framework to automatically detect the location of intersections from satellite images using convolutional neural networks. For this purpose, we labelled 7,342 Google Maps images from Washington, DC, USA to create a dataset. This dataset covers a region of 58.98 km² and contains 7,548 intersections. We then applied a recent object detection model (EfficientDet) to detect the location of intersections. Experiments based on the road network in Washington, DC, show that the accuracy of our model is within 5 metres for 88.6% of the predicted intersections. Most of our predicted intersection centres (approximately 80%) are within 2 metres of the ground truth centre. Using hybrid images, we obtained an average recall of 76.5% and an average precision of 82.8%, computed over Intersection over Union (IoU) thresholds from 0.5 to 0.95 in steps of 0.05. We have published an automation script to enable other researchers to reproduce our dataset.
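The averaged precision and recall described above follow the COCO-style convention of evaluating at IoU thresholds 0.50, 0.55, …, 0.95 and averaging the results. A minimal sketch of that computation is shown below; the box format, matching strategy, and function names are illustrative assumptions, not the authors' actual evaluation code.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def precision_recall_at(preds, truths, threshold):
    """Greedy one-to-one matching of predicted boxes to ground truth
    at a single IoU threshold; returns (precision, recall)."""
    matched = set()
    tp = 0
    for p in preds:
        for i, t in enumerate(truths):
            if i not in matched and iou(p, t) >= threshold:
                matched.add(i)
                tp += 1
                break
    precision = tp / len(preds) if preds else 0.0
    recall = tp / len(truths) if truths else 0.0
    return precision, recall

def averaged_metrics(preds, truths):
    """Average precision and recall over IoU thresholds 0.50:0.05:0.95."""
    thresholds = [0.5 + 0.05 * k for k in range(10)]
    pairs = [precision_recall_at(preds, truths, t) for t in thresholds]
    avg_p = sum(p for p, _ in pairs) / len(pairs)
    avg_r = sum(r for _, r in pairs) / len(pairs)
    return avg_p, avg_r
```

For example, a prediction exactly overlapping a ground-truth box scores IoU 1.0 and counts as a true positive at every threshold, while a prediction with IoU 0.6 only counts at the 0.50, 0.55, and 0.60 thresholds, which is what drags the averaged figures below the familiar IoU@0.5 numbers.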
Publication Date
2023
Keywords
intersections, vulnerable road users, People with Blindness or Vision Impairment (PBVI), deep learning framework, convolutional neural networks
Disciplines
Computer Sciences
Conference
First Annual Teaching and Research Showcase 2023
DOI
https://doi.org/10.21427/NM1E-9H33
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-Share Alike 4.0 International License.
Recommended Citation
El-taher, F., Miralles, L., Courtney, J., & Mckeever, S. (2023). Detecting Road Intersections from Satellite Images using Convolutional Neural Networks. Technological University Dublin. DOI: 10.21427/NM1E-9H33