Commit: move State of the art engineering

robmarkcole committed Oct 9, 2021
1 parent 4fbaafb commit 043f981
Showing 1 changed file with 11 additions and 11 deletions.
22 changes: 11 additions & 11 deletions README.md
@@ -6,9 +6,9 @@ This document lists resources for performing deep learning (DL) on satellite imagery
* [ML best practice](https://github.com/robmarkcole/satellite-image-deep-learning#ml-best-practice)
* [ML metrics](https://github.com/robmarkcole/satellite-image-deep-learning#ml-metrics)
* [Datasets](https://github.com/robmarkcole/satellite-image-deep-learning#datasets)
* [State of the art engineering](https://github.com/robmarkcole/satellite-image-deep-learning#state-of-the-art-engineering)
* [Online platforms for analytics](https://github.com/robmarkcole/satellite-image-deep-learning#online-platforms-for-analytics)
* [Free online compute](https://github.com/robmarkcole/satellite-image-deep-learning#free-online-compute)
* [State of the art engineering](https://github.com/robmarkcole/satellite-image-deep-learning#state-of-the-art-engineering)
* [Cloud providers](https://github.com/robmarkcole/satellite-image-deep-learning#cloud-providers)
* [Deploying models](https://github.com/robmarkcole/satellite-image-deep-learning#deploying-models)
* [Image formats, data management and catalogues](https://github.com/robmarkcole/satellite-image-deep-learning#image-formats-data-management-and-catalogues)
@@ -781,16 +781,6 @@ Since there is a whole community around GEE I will not reproduce it here but lis
* [BlenderGIS](https://github.com/domlysz/BlenderGIS) could be used for synthetic data generation
* [bifrost.ai](https://www.bifrost.ai/) -> simulated data service with geospatial output data formats

# State of the art engineering
* Compute and data storage are moving to the cloud. Read how [Planet](https://cloud.google.com/customers/planet) and [Airbus](https://cloud.google.com/customers/airbus) use the cloud
* Google Earth Engine and Microsoft Planetary Computer are democratising access to massive compute platforms
* No-code platforms and auto-ml are making ML techniques more accessible than ever
* Custom hardware is being developed for rapid training and inferencing with deep learning models, both in the datacenter and at the edge
* Supervised ML methods typically require large annotated datasets, but approaches such as self-supervised and active learning offer alternative pathways
* Traditional data formats aren't designed for processing on the cloud, so new standards are evolving, such as COG (Cloud Optimized GeoTIFF) and STAC (SpatioTemporal Asset Catalog)
* Computer vision traditionally delivered high-performance image processing on the CPU using compiled languages such as C++, as in OpenCV for example. The advent of GPUs is changing the paradigm, with GPU-optimised alternatives such as [Kornia](https://kornia.github.io/) emerging
* Whilst the combination of Python and Keras/TensorFlow/PyTorch is currently preeminent, new Python libraries such as [JAX](https://github.com/google/jax) and alternative languages such as [Julia](https://julialang.org/) are showing serious promise
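The STAC bullet above can be made concrete with a minimal, hypothetical sketch: STAC items are just JSON with a `bbox` and a `datetime` property, so a catalogue search reduces to filtering dicts. The item values below are invented for illustration, and `search` is a toy stand-in for a real STAC client, not its API:

```python
from datetime import date

# Minimal stand-ins for STAC items: JSON-like dicts with a bbox
# [min_lon, min_lat, max_lon, max_lat] and a datetime property.
items = [
    {"id": "S2_tile_A", "bbox": [-0.5, 51.2, 0.3, 51.7],
     "properties": {"datetime": "2021-06-01"}},
    {"id": "S2_tile_B", "bbox": [10.0, 45.0, 11.0, 46.0],
     "properties": {"datetime": "2021-07-15"}},
]

def bbox_intersects(a, b):
    """True if two lon/lat bounding boxes overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def search(items, bbox, start, end):
    """Toy STAC-style search: spatial intersection plus a datetime range."""
    hits = []
    for item in items:
        dt = date.fromisoformat(item["properties"]["datetime"])
        if bbox_intersects(item["bbox"], bbox) and start <= dt <= end:
            hits.append(item["id"])
    return hits

# A London-ish bbox in June 2021 matches only tile A
print(search(items, [-1.0, 51.0, 1.0, 52.0], date(2021, 6, 1), date(2021, 6, 30)))
# → ['S2_tile_A']
```

Because each item carries its own spatial and temporal metadata, the same filter works whether the catalogue holds ten items or ten million — which is what makes STAC suited to cloud-scale archives.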

# Online platforms for analytics
* [This article discusses some of the available platforms](https://medium.com/pangeo/cloud-native-geoprocessing-of-earth-observation-satellite-data-with-pangeo-997692d91ca2)
* [Pangeo](http://pangeo.io/index.html) -> There is no single software package called “pangeo”; rather, the Pangeo project serves as a coordination point between scientists, software, and computing infrastructure. Includes open source resources for parallel processing using Dask and Xarray. Pangeo recently announced their 2.0 goals: pivoting away from directly operating cloud-based JupyterHubs, and towards education and research
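The chunked, parallel processing model that Dask provides (and Pangeo builds on) can be sketched with nothing but the standard library — split the data into chunks, compute partial results in parallel, then reduce. Dask itself adds lazy task graphs, distributed schedulers and array semantics on top of this idea; the function names here are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def chunked(seq, size):
    """Split a flat sequence into fixed-size chunks, Dask-style."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def chunk_stats(chunk):
    """Per-chunk partial results (sum, count) that can later be combined."""
    return sum(chunk), len(chunk)

def parallel_mean(values, chunk_size=1000, workers=4):
    """Map chunk_stats over chunks in parallel, then reduce to a global mean."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(chunk_stats, chunked(values, chunk_size)))
    total = sum(s for s, _ in partials)
    count = sum(n for _, n in partials)
    return total / count

print(parallel_mean(list(range(10_000))))  # → 4999.5
```

The key design point is that each chunk's partial result is small and combinable, so the data never has to fit on one worker — the same property that lets Dask/Xarray process satellite archives larger than memory.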
@@ -820,6 +810,16 @@ A GPU is required for training deep learning models (but not necessarily for inf
* TensorFlow, PyTorch & fastai are available, but you may need to update them
* Advantage that many datasets are already available
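Before relying on a notebook environment's preinstalled frameworks, it is worth checking which ones are actually importable and at what version. This is a generic sketch, not tied to any one platform; the module names are the standard import names for each framework:

```python
import importlib
import importlib.util

def framework_versions(names=("torch", "tensorflow", "fastai")):
    """Return {module_name: version_or_None} for each candidate framework."""
    report = {}
    for name in names:
        if importlib.util.find_spec(name) is None:
            report[name] = None  # not installed in this environment
        else:
            module = importlib.import_module(name)
            report[name] = getattr(module, "__version__", "unknown")
    return report

for name, version in framework_versions().items():
    print(f"{name}: {version or 'not installed'}")
```

If a version is older than you need, upgrade it in the notebook before training rather than discovering an API mismatch mid-run.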

# State of the art engineering
* Compute and data storage are moving to the cloud. Read how [Planet](https://cloud.google.com/customers/planet) and [Airbus](https://cloud.google.com/customers/airbus) use the cloud
* Google Earth Engine and Microsoft Planetary Computer are democratising access to massive compute platforms
* No-code platforms and auto-ml are making ML techniques more accessible than ever
* Custom hardware is being developed for rapid training and inferencing with deep learning models, both in the datacenter and at the edge
* Supervised ML methods typically require large annotated datasets, but approaches such as self-supervised and active learning offer alternative pathways
* Traditional data formats aren't designed for processing on the cloud, so new standards are evolving, such as COG (Cloud Optimized GeoTIFF) and STAC (SpatioTemporal Asset Catalog)
* Computer vision traditionally delivered high-performance image processing on the CPU using compiled languages such as C++, as in OpenCV for example. The advent of GPUs is changing the paradigm, with GPU-optimised alternatives such as [Kornia](https://kornia.github.io/) emerging
* Whilst the combination of Python and Keras/TensorFlow/PyTorch is currently preeminent, new Python libraries such as [JAX](https://github.com/google/jax) and alternative languages such as [Julia](https://julialang.org/) are showing serious promise
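The active-learning bullet above can be illustrated with a minimal sketch: given model-predicted class probabilities for an unlabelled pool of image chips, annotate the ones the model is least sure about first (uncertainty sampling via Shannon entropy). The chip ids and probability values are invented for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy of a class-probability distribution (higher = less certain)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_labelling(pool, budget):
    """Uncertainty sampling: ids of the `budget` most uncertain samples."""
    ranked = sorted(pool.items(), key=lambda kv: entropy(kv[1]), reverse=True)
    return [sample_id for sample_id, _ in ranked[:budget]]

# Predicted class probabilities for three unlabelled image chips (made up)
pool = {
    "chip_001": [0.98, 0.01, 0.01],  # confident -> low annotation priority
    "chip_002": [0.34, 0.33, 0.33],  # near-uniform -> annotate first
    "chip_003": [0.70, 0.20, 0.10],
}
print(select_for_labelling(pool, budget=2))  # → ['chip_002', 'chip_003']
```

In practice the loop repeats: label the selected chips, retrain, re-score the pool — so the annotation budget is spent where the model learns most, rather than on chips it already classifies confidently.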

# Cloud providers
An overview of the most relevant services provided by AWS and Google. Also consider Microsoft Azure, or one of the many smaller but more specialised platforms such as [spell.ml](https://spell.ml/) or [paperspace](https://www.paperspace.com/)
