eo-learn makes extraction of valuable information from satellite imagery easy.
The availability of open Earth observation (EO) data through the Copernicus and Landsat programs is an unprecedented resource for many EO applications, ranging from ocean, land-use and land-cover monitoring to disaster control, emergency services and humanitarian relief. Given the large amount of high-spatial-resolution data at high revisit frequency, techniques that can automatically extract complex patterns from such spatio-temporal data are needed.
eo-learn is a collection of open source Python packages developed to seamlessly access and process spatio-temporal image sequences acquired by any satellite fleet in a timely and automatic manner. eo-learn is easy to use, its design is modular, and it encourages collaboration – sharing and reusing specific tasks in typical EO value-extraction workflows, such as cloud masking, image co-registration, feature extraction, classification, etc. Everyone is free to use any of the available tasks and is encouraged to improve them, develop new ones and share them with the rest of the community.
eo-learn makes the extraction of valuable information from satellite imagery as easy as defining a sequence of operations to be performed on that imagery. The image below illustrates a processing chain that maps water in satellite imagery by thresholding the Normalised Difference Water Index in a user-specified region of interest.
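The core of such a water-mapping chain can be sketched in plain NumPy (a minimal illustration of the NDWI thresholding step, not the eo-learn API itself; the toy band values and the 0.0 threshold are assumptions):

```python
import numpy as np

def ndwi_water_mask(green, nir, threshold=0.0):
    """Map water by thresholding the Normalised Difference Water Index.

    NDWI = (green - nir) / (green + nir); water pixels have high NDWI.
    """
    ndwi = (green - nir) / (green + nir + 1e-10)  # epsilon avoids division by zero
    return ndwi > threshold

# Toy 2x2 reflectance patch: top row water-like, bottom row land-like
green = np.array([[0.30, 0.28], [0.10, 0.12]])
nir = np.array([[0.05, 0.06], [0.40, 0.35]])
print(ndwi_water_mask(green, nir))  # water mask: True where NDWI > 0
```

In an eo-learn workflow, a step like this would live inside a reusable task, so the same logic can be applied to any region of interest.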
The eo-learn library acts as a bridge between the Earth observation/remote sensing field and the Python ecosystem for data science and machine learning. The library is written in Python and uses NumPy arrays to store and handle remote sensing data. It aims, on the one hand, to lower the entry barrier to remote sensing for non-experts, and on the other, to bring the state-of-the-art computer vision, machine learning and deep learning tools of the Python ecosystem to remote sensing experts.
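For instance, a stack of acquisitions over time maps naturally to a four-dimensional NumPy array; the shape convention below, (time, height, width, bands), is a common layout for spatio-temporal EO data (a plain-NumPy sketch with simulated data, not eo-learn's own data classes):

```python
import numpy as np

# Simulated time series: 5 acquisitions of a 4x4 patch with 3 spectral bands
rng = np.random.default_rng(0)
timeseries = rng.random((5, 4, 4, 3))  # (time, height, width, bands)

# Per-pixel temporal median: a simple, outlier-robust composite
composite = np.median(timeseries, axis=0)
print(composite.shape)  # (4, 4, 3)
```

Because everything is a NumPy array, standard tools from the Python data science stack (scikit-learn, deep learning frameworks, plotting libraries) can be applied to EO data directly.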
eo-learn is divided into several subpackages according to functionality and external package dependencies. Users therefore do not need to install the entire package, but only the parts they need.
At the moment there are the following subpackages:
eo-learn-core - The main subpackage, which implements the basic building blocks (EOWorkflow) and commonly used functionalities.
eo-learn-coregistration - The subpackage that deals with image co-registration.
eo-learn-features - A collection of utilities for extracting data properties and feature manipulation.
eo-learn-geometry - Geometry subpackage used for geometric transformation and conversion between vector and raster data.
eo-learn-io - Input/output subpackage that deals with obtaining data from Sentinel Hub services or saving and loading data locally.
eo-learn-mask - The subpackage used for masking of data and calculation of cloud masks.
eo-learn-ml-tools - Various tools that can be used before or after the machine learning process.
eo-learn-visualization - Visualization tools for core elements of eo-learn.
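The modular layout above means a workflow pulls in only the dependencies it uses; for example (assuming the subpackages are published under these names on PyPI):

```shell
# Core building blocks only
pip install eo-learn-core

# Add input/output and cloud masking on top
pip install eo-learn-io eo-learn-mask
```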
Examples and introductions to the package can be found here. A large collection of examples is available at the `eo-learn-examples <https://github.com/sentinel-hub/eo-learn-examples>`_ repository. While the examples there are not always up to date, they can be a great source of ideas.
If you would like to contribute to eo-learn, check out our contribution guidelines.
Blog posts and papers
Introducing eo-learn (by Devis Peressutti)
Use eo-learn with AWS SageMaker (by Drew Bollinger)
Tree Cover Prediction with Deep Learning (by Daniel Moraite)
Tracking a rapidly changing planet (by Development Seed)
Land Cover Monitoring System (by Jovan Visnjic and Matej Aleksandrov)
eo-learn Webinar (by Anze Zupanc)
Land Cover Classification (still to come)
Minimum Agriculture Activity (still to come)
Scale-up your eo-learn workflow using Batch Processing API (by Maxim Lamare)
Questions and Issues
You are welcome to send your feedback to the package authors, the EO Research team, through any of the Sentinel Hub communication channels.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreements No. 776115 and No. 101004112.