Multimodal remote sensing for climate monitoring

This project is an interdisciplinary research initiative aimed at bridging the gap between AI and multimodal remote sensing data to advance climate monitoring and the sustainable development of the planet. Under this overall objective, we explore three research directions, described below.

Background

The growing threat of climate change calls for effective strategies to monitor and mitigate events such as glacier melting, rising sea levels, and temperature fluctuations. In December 2020, the EU Council endorsed resolutions urging the use of digital solutions to achieve carbon neutrality by 2050. Against this backdrop, there is a growing focus on leveraging AI and remote sensing techniques for climate crisis management. Existing efforts, however, rely mainly on individual remote sensing data sources, which hinders intelligent analysis and accurate prediction of climate events. Fusing multimodal remote sensing data has therefore become an international scientific frontier and a crucial requirement for tackling climate challenges.

Unified feature representation and fusion for multimodal data

Traditional Earth observation methods rely primarily on single remote sensing data sources, which are often limited in data richness, temporal coverage, and interpretability. To address these limitations, our objective is to integrate multimodal remote sensing data for Earth observation. By combining data from diverse sources, such as hyperspectral (HS), multispectral (MS), Synthetic Aperture Radar (SAR), Light Detection and Ranging (LiDAR), OpenStreetMap (OSM), and thermal infrared imagery, we aim to enhance the interpretability of intricate scenes and phenomena and to reduce ambiguity in the analysis and prediction of surface information; an illustrative fusion sketch is given below.
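To make the idea of unified feature representation and fusion more concrete, the sketch below shows a minimal late-fusion model in PyTorch with separate encoders for two hypothetical modalities (multispectral and SAR patches) whose features are concatenated into a joint representation. The encoder widths, channel counts, patch sizes, and the late-fusion strategy are illustrative assumptions for this sketch, not the architecture used in the project.

```python
# Minimal sketch of late feature fusion for two remote sensing modalities.
# All layer sizes and the fusion strategy are illustrative assumptions.
import torch
import torch.nn as nn


class ModalityEncoder(nn.Module):
    """Small CNN that maps one modality (e.g., MS or SAR patches) to a feature vector."""

    def __init__(self, in_channels: int, feat_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class LateFusionClassifier(nn.Module):
    """Concatenates per-modality features and predicts, e.g., a land-cover class."""

    def __init__(self, ms_channels: int = 10, sar_channels: int = 2,
                 feat_dim: int = 128, num_classes: int = 8):
        super().__init__()
        self.ms_encoder = ModalityEncoder(ms_channels, feat_dim)
        self.sar_encoder = ModalityEncoder(sar_channels, feat_dim)
        self.head = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, ms: torch.Tensor, sar: torch.Tensor) -> torch.Tensor:
        # Late fusion: encode each modality separately, then concatenate.
        fused = torch.cat([self.ms_encoder(ms), self.sar_encoder(sar)], dim=1)
        return self.head(fused)


# Example: a batch of 4 co-registered 32x32 patches from each modality.
model = LateFusionClassifier()
logits = model(torch.randn(4, 10, 32, 32), torch.randn(4, 2, 32, 32))
print(logits.shape)  # torch.Size([4, 8])
```

Late fusion by concatenation is only one option; in practice the joint representation could also be learned with cross-modal attention or shared encoders, depending on how well the modalities are co-registered.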

Deep learning-based Earth system modeling

Earth system modeling plays a pivotal role in understanding the complex interactions among climate patterns, ecosystems, and geophysical phenomena. Existing Earth system modeling methods rely mainly on numerical models built on physical equations. Such models, however, can be inflexible when adapting to rapidly changing conditions or integrating new data sources. To address this challenge, our objective is to develop a new paradigm of deep learning-based modeling that can capture the intricate dynamics and relationships within Earth's multifaceted systems; a minimal example of this idea is sketched below.
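One simple way to realize such data-driven modeling is to train a neural network as a one-step forecaster for a gridded Earth system variable (for example, a surface temperature field), and then roll it out autoregressively for longer horizons. The sketch below illustrates this with a small residual CNN in PyTorch; the grid size, the random stand-in data, and the training loop are assumptions made for illustration, not the project's actual model or data.

```python
# Minimal sketch of a learned one-step forecaster for a gridded Earth system
# variable (e.g., surface temperature). Grid size, data, and hyperparameters
# are illustrative assumptions.
import torch
import torch.nn as nn


class StepForecaster(nn.Module):
    """CNN that maps the state at time t to the state at time t+1 on a lat-lon grid."""

    def __init__(self, channels: int = 1, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, channels, kernel_size=3, padding=1),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # Predict the residual, so the network only learns the change per step.
        return state + self.net(state)


# Toy training step on random tensors standing in for reanalysis snapshots.
model = StepForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
state_t = torch.randn(8, 1, 64, 64)    # batch of gridded fields at time t
state_t1 = torch.randn(8, 1, 64, 64)   # corresponding fields at time t+1

loss = nn.functional.mse_loss(model(state_t), state_t1)
loss.backward()
optimizer.step()

# Longer forecasts can be produced by applying the model autoregressively:
# state -> model(state) -> model(model(state)) -> ...
```

Compared with purely physics-based numerical models, such a learned emulator can be retrained as new observations arrive and can ingest additional input channels (e.g., extra remote sensing modalities) without redesigning the governing equations.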

Methods and use cases for sustainable development

Sustainable development is an urgent global necessity, requiring precise monitoring and informed decision-making to address the intricate challenges posed by climate change. Using AI and remote sensing to monitor and understand the impacts of climate change, and to support sustainable development actions, is among the most promising strategies in this endeavor. While the preceding two research directions focus on feature learning and Earth system modeling with multimodal remote sensing data, this direction develops innovative methods and presents use cases where multimodal remote sensing can play a pivotal role in advancing sustainable development and climate monitoring.

About the project

Project leader: Yonghao Xu

Funding: This project is partially supported by the Zenith research organization of the Faculty of Science and Engineering at Linköping University.