DeepCloud: Intra-day satellite prediction of cloudiness using deep learning strategies

Author(s):
Camiruaga, Ignacio ; Herrera, Andrés ; Mozo, Franco
Type:
Undergraduate thesis
Supervisor(s):
Alonso Suárez, Rodrigo ; Castro, Alberto ; Marchesoni, Franco
Version:
Accepted
Abstract:

This project analyzes deep learning techniques applied to satellite-based cloudiness prediction, a vital component of solar forecasting solutions. These techniques learn from a dataset to extrapolate a sequence of images into the future, a process usually called satellite nowcasting. Intra-day image forecasting is addressed up to 5 hours into the future with a 10-minute periodicity. The images come from the GOES-16 geostationary satellite and cover a large portion of southeast South America, including Uruguay, the main region of interest. The deep learning techniques are compared against strong baselines in the field, such as persistence and fine-tuned Cloud Motion Vectors, which were previously analyzed for this region in recent studies. Several state-of-the-art architectures are implemented and evaluated over well-known computer vision metrics as well as forecasting metrics. Our results show the ability of deep learning models to account for complex atmospheric dynamics and make accurate predictions in a short time span. The main contribution is a deep learning model based on the U-Net architecture that outperforms all other state-of-the-art models implemented on this dataset. The new model is presented along with detailed ablation studies and thorough evaluations that shed light on the behavior and the many potential variations of the deep learning solutions.
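The persistence baseline mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's actual evaluation pipeline: the image size, the random placeholder data, and the choice of RMSE as the forecasting metric are assumptions for the example.

```python
import numpy as np

def persistence_forecast(last_image: np.ndarray, horizon: int) -> np.ndarray:
    """Persistence nowcasting: repeat the most recent satellite frame
    unchanged for every future lead time."""
    return np.repeat(last_image[np.newaxis, ...], horizon, axis=0)

def rmse(pred: np.ndarray, target: np.ndarray) -> float:
    """Root-mean-square error, a common image-forecasting metric."""
    return float(np.sqrt(np.mean((pred - target) ** 2)))

# With a 10-minute cadence, a 5-hour horizon corresponds to 30 frames.
horizon = 30
last_image = np.random.rand(64, 64)       # placeholder satellite frame
future = np.random.rand(horizon, 64, 64)  # placeholder ground-truth sequence

forecast = persistence_forecast(last_image, horizon)
print(forecast.shape)          # (30, 64, 64)
print(rmse(forecast, future))  # persistence error against the ground truth
```

Persistence is a deliberately simple reference: any learned model, such as the U-Net variant contributed by the thesis, should beat it by actually modeling cloud motion rather than freezing the last observed frame.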

Year:
2022
Language:
English
Topics:
Solar forecast
U-Net
Deep learning
Satellite images
GOES-16 satellite
Institution:
Universidad de la República
Repository:
COLIBRI
Link(s):
https://hdl.handle.net/20.500.12008/32272
Access level:
Open access