The work in this project is grouped into four work packages as listed below:
1. Data handling and tools
“create an unprecedented information hub on air quality related data, which can then be explored in future projects to improve the understanding of atmospheric composition processes”
Canonical Analysis Workflows – reproducible and reusable analyses of air quality data
The TOAR database and its interfaces
GeoDataServices
2. Interpolation
“address interpolation of air quality data through the unique combination of relevant datasets and through careful choice of the machine learning techniques that will be applied including the optimisation of neural network architectures based on inspection of the learned system state”
How does AI predict ozone based on environmental features?
A benchmark dataset for machine learning on global air quality metrics
Web processing service for ozone flux calculations
Importance of meteorological and spatial patterns for the interpolation of air quality data using deep neural networks
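The work package leaves the concrete machine learning techniques open, but interpolation results are typically judged against classical geostatistical baselines. As a point of reference only (not a method named in this project), here is a minimal inverse-distance-weighting sketch with hypothetical station coordinates and ozone values:

```python
import math

def idw_interpolate(stations, target, power=2.0):
    """Inverse-distance-weighted estimate at `target` from station
    (x, y, value) tuples -- a classical baseline against which
    neural-network interpolation can be compared."""
    num, den = 0.0, 0.0
    for x, y, value in stations:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0.0:
            return value  # target coincides with a station
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

# Hypothetical ozone readings (x, y, ppb) at three stations.
stations = [(0.0, 0.0, 40.0), (1.0, 0.0, 50.0), (0.0, 1.0, 60.0)]
print(idw_interpolate(stations, (0.5, 0.5)))  # all three stations equidistant -> 50.0
```

Deep-learning approaches replace the fixed distance weighting with weights learned from meteorological and spatial context, which is what the study above investigates.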
3. Forecasting
“develop improved methods for air quality predictions with (deep) neural networks, and possibly demonstrate a novel concept based on movie frame prediction methods”
Temperature prediction with Stochastic Adversarial Video Prediction
Near-surface ozone predictions
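The neural-network forecasts developed here are usually benchmarked against simple statistical baselines. As an illustration only (the project does not prescribe this method), a one-step-ahead exponential-smoothing forecast on a hypothetical hourly ozone series looks like:

```python
def exp_smoothing_forecast(series, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing --
    a classical baseline that (deep) neural-network forecasts
    must outperform to be worthwhile."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

ozone = [38.0, 42.0, 41.0, 45.0, 44.0]  # hypothetical hourly ppb values
print(exp_smoothing_forecast(ozone))  # -> 43.375
```

Video-prediction methods extend this idea from a single time series to whole gridded fields, forecasting the next "frame" of an air quality map from the preceding ones.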
4. Quality assurance
“explore novel concepts for robust, automated outlier detection and data screening of air quality measurements with deep neural networks”
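For context, automated data screening is traditionally done with robust statistical tests rather than neural networks. A minimal sketch of such a classical screen, based on the median absolute deviation (robust to the outliers it is meant to find), applied to a hypothetical ozone series:

```python
import statistics

def mad_outliers(values, threshold=3.5):
    """Return indices whose modified z-score exceeds `threshold`.
    Uses the median absolute deviation, a robust spread estimate --
    a classical screen that the deep-learning concepts explored in
    this work package aim to improve on."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return []  # no spread: nothing can be flagged
    return [i for i, v in enumerate(values)
            if abs(0.6745 * (v - med) / mad) > threshold]

readings = [41.0, 40.0, 42.0, 39.0, 120.0, 41.5]  # hypothetical ozone series
print(mad_outliers(readings))  # -> [4], the implausible 120 ppb spike
```

Such fixed-threshold rules miss context-dependent anomalies (e.g. a plausible value recorded under impossible meteorological conditions), which is the gap the neural-network approaches target.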