Awards & Nominations

Team Lilo has received the following awards and nominations. Way to go!

Global Nominee

Lilo - a sustainable solution to macroplastic identification and tracking

High-Level Project Summary

Lilo is an app for tracking and identifying macroplastics along the coasts and in the oceans of Earth. Our Lilo website (https://lilo-plastic-map.netlify.app/) displays plastic bottles on a map; each bottle marks plastic marine debris flagged by our machine learning data pipeline. Lilo processes Sentinel-2 satellite imagery, runs inference on it with our machine learning model, and visualizes the results in near real time on our website. We also plan to build Lilo into a bounty platform: users will be able to claim bounties placed by governments by cleaning up large garbage patches flagged by our machine learning algorithm.

Detailed Project Description

Plastic marine debris is one of the most dangerous threats to sea life and to the general health of Earth’s oceans. To bring about change by educating the public, and to provide a tool for tracking and eliminating this plastic, we created Lilo.


Lilo is an app for tracking and identifying macroplastics along the coasts and in the oceans of Earth. Lilo processes Sentinel-2 satellite imagery, runs inference on it with our machine learning model, and visualizes the results in near real time on our website. We also plan to build Lilo into a bounty platform: users will be able to claim bounties placed by governments by cleaning up large garbage patches flagged by our machine learning algorithm.


At first, Lilo will serve as an educational tool. Lilo’s current dataset includes plastics identified near high-population coastal areas and scans of the infamous garbage patches found in different parts of the ocean. Users can choose the dates and times for which identified plastics are displayed (work in progress), and can click on plastics identified by our machine learning pipeline to get more information on a single plastic cluster.


In the future, Lilo will serve as a tracking and elimination platform, alongside our bounty program. 


This will serve two major purposes:




  1. Users of the Lilo app will be able to click a plastic identifier and see an arrow showing the direction a macroplastic cluster is heading based on ocean currents. This will make it easier for cleanup crews to track large clusters of plastic, fishing nets, and other invasive debris. We plan to partner with scientists, shipping companies, and governments on a bounty program, in which large-scale macroplastic debris clusters are marked on the Lilo map with a special marker. Ships passing by can collect these very large clusters for a reward from our funding partners. A bounty UI example of what we would implement, given more time, is shown below.

  2. Users can view our machine learning model’s detected plastic near high-population coastal areas to see plastic that is at risk of entering the open ocean and deteriorating into microplastics. This means the plastic can be stopped at the source by cleanup crews, citizen scientists, and companies participating in the bounty program.
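The drift arrow described in point 1 can be derived from the OSCAR surface current components. A minimal sketch, assuming the per-detection zonal (u, eastward) and meridional (v, northward) velocities in m/s have already been looked up; the function name is ours:

```python
import math

def drift_heading(u_ms: float, v_ms: float) -> float:
    """Compass heading (degrees clockwise from north) toward which
    debris is expected to drift, from zonal (u, eastward) and
    meridional (v, northward) current components in m/s."""
    # atan2(east, north) gives the bearing of the velocity vector.
    return math.degrees(math.atan2(u_ms, v_ms)) % 360.0

print(drift_heading(0.3, 0.0))   # current due east -> heading of about 90 degrees
print(drift_heading(0.0, -0.2))  # current due south -> about 180 degrees
```

The frontend would then rotate the arrow marker by this heading when the user clicks a plastic identifier.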

Example UI:


Due to the time constraints of the hackathon, we were not able to fully implement the two features planned around the bounty system, but we plan to add them in the future.


How does Lilo work?


Lilo comprises three components: the backend, the machine learning data pipeline, and the frontend website application.


Building on the work of researchers studying marine plastic pollution and detection models, the machine learning pipeline uses the eo-learn Python library to retrieve Sentinel-2 data from the Sentinel Hub web API, analyzes the raw spectral bands with algorithms that differentiate plastics from other materials, and trains a predictive model using Naive Bayes classification. The pipeline then lets us apply the trained model both to near-real-time data from Sentinel Hub and to historical data captured by the satellites. Users select the times and locations they want to view on our website, which presents the model’s results for that region at that time.
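The per-pixel classification step can be sketched with scikit-learn. The random numbers below merely stand in for Sentinel-2 band reflectances (the real features come from eo-learn), and the band count and reflectance ranges are illustrative:

```python
# Toy sketch of the Naive Bayes classification step; synthetic numbers
# stand in for per-pixel Sentinel-2 band reflectances.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)

n_bands = 6                                            # illustrative band count
water = rng.normal(0.05, 0.02, size=(500, n_bands))    # darker water pixels
plastic = rng.normal(0.25, 0.05, size=(500, n_bands))  # brighter debris pixels

X = np.vstack([water, plastic])
y = np.array([0] * 500 + [1] * 500)  # 0 = water, 1 = plastic

model = GaussianNB().fit(X, y)
print(model.score(X, y))  # near-perfect separation on this toy data
```

In the real pipeline the labeled pixels come from the researchers’ annotated scenes rather than synthetic draws, but the fit/predict flow is the same.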

A significant discovery by the researchers was that a normalization procedure using data points around the region of interest can significantly improve the classification model’s ability to differentiate between plastics and water or other debris. This pipeline allows for flexible implementation of the normalization procedure, giving researchers the ability to fine-tune the algorithm for a variety of regions around the world where plastics may be co-located with other debris, and difficult to identify using satellite imagery. 
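The normalization idea can be illustrated with a sliding-window z-score: each pixel is rescaled against the statistics of its neighborhood, so plastic stands out against the local water background rather than absolute reflectance. This is our own illustrative implementation; the researchers’ exact procedure may differ:

```python
import numpy as np

def normalize_local(band: np.ndarray, win: int = 7) -> np.ndarray:
    """Z-score each pixel against a win x win neighborhood.
    Illustrative sketch of region-relative normalization."""
    pad = win // 2
    padded = np.pad(band, pad, mode="reflect")
    out = np.empty_like(band, dtype=float)
    h, w = band.shape
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + win, j:j + win]
            # Small epsilon avoids division by zero over uniform water.
            out[i, j] = (band[i, j] - patch.mean()) / (patch.std() + 1e-9)
    return out

# A single bright pixel on uniform water gets a large local z-score.
scene = np.full((10, 10), 0.05)
scene[5, 5] = 0.3
print(normalize_local(scene)[5, 5])
```

Tuning the window size (and which statistics are used) per region is what gives researchers the flexibility described above.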


What tools, coding languages, hardware/software did you use to develop the project?



We used AWS to host our infrastructure. Our backend is written in Go; the frontend is built with React and Leaflet.


As for our data pipeline: it allows developers to train any scikit-learn model on satellite data compatible with eo-learn, so we hope to take advantage of better models by training and testing them through this pipeline. While Naive Bayes is relatively forgiving with small training datasets, we believe significant improvements could be made with more complex models and more labeled training data. One important note is that the pipeline considers each small unit of satellite data independently; more advanced machine learning could use context from the surrounding region to make better inferences about the location of plastics in the ocean.
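Because every scikit-learn estimator exposes the same fit/predict interface, swapping Naive Bayes for a stronger model is essentially a one-line change. A sketch with synthetic stand-in features (all data here is illustrative):

```python
# Any scikit-learn classifier can be dropped into the pipeline;
# synthetic features stand in for the eo-learn-derived ones.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))              # stand-in spectral features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # stand-in labels

for model in (GaussianNB(),
              RandomForestClassifier(n_estimators=50, random_state=0)):
    model.fit(X, y)
    print(type(model).__name__, model.score(X, y))
```

Only the estimator constructor changes; the retrieval, normalization, and visualization stages are untouched.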




Lilo Design Doc - backend infrastructure



Lilo’s backend infrastructure comprises a single Lambda function behind API Gateway in our AWS account, supported by an EC2 instance.


GetDebrisData() -> returns identified plastic debris in JSON format to the frontend for processing and display on the map




On startup, the Lilo frontend app makes a request to the GetDebrisData API, which returns a large JSON payload with the following structure for the frontend to process and display:


[
  {
    "date": "09-23-21",     // string
    "id": 0,                // number
    "plastic_data": [       // JSON array of plastic lat/long data, stored as a serialized string
      {
        "Latitude": ...,
        "Longitude": ...,
        "Timestamp": ...,
        "Characteristics/Description": ...
      },
      {
        ...
      },
      ...
    ]
  }
]


When the ML inference job runs, it screens detections for false positives against our NASA ocean current dataset and then writes the JSON data into DynamoDB via the Python SDK.
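The write step could look roughly like the sketch below; "plastic-debris" is a hypothetical table name, and the false-positive screening is assumed to have already run:

```python
# Sketch of how the inference job could write a day's detections to
# DynamoDB, keyed by the satellite capture date.
import json

def build_item(date: str, detections: list) -> dict:
    """Key = capture date; value = lat/long data serialized to a string."""
    return {
        "date": date,                           # partition key
        "plastic_data": json.dumps(detections)  # serialized string
    }

def put_detections(date, detections, table_name="plastic-debris"):
    import boto3  # AWS SDK for Python; needs credentials at runtime
    table = boto3.resource("dynamodb").Table(table_name)
    table.put_item(Item=build_item(date, detections))

item = build_item("09-23-21", [{"latitude": 38.0, "longitude": -145.0}])
```

Serializing the detection list keeps each day's data as a single key-value pair, matching the layout described below.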


DynamoDB is a key-value store: the key is the date the satellite images were taken, and the value is the lat/long geospatial locations of the identified plastic. The JSON may also contain region data for the identified plastic.



Space Agency Data

We used NASA’s OSCAR ocean current data (https://podaac.jpl.nasa.gov/dataset/OSCAR_L4_OC_third-deg) as an input to our machine learning data pipeline to identify false positives.


We also wanted to display ocean current velocity vectors on the frontend of the application using this NASA dataset, but did not have time to implement this.

Hackathon Journey

We weren't able to complete all the features we wanted to implement for this hackathon, but we learned a lot. Data on ocean conditions such as currents is very sparse, but we finally found a dataset from NASA that we were able to use as input for our machine learning inference.


I’d like to thank my teammates, who did an amazing job working on everything.

References

NASA Open Dataset: https://podaac.jpl.nasa.gov/dataset/OSCAR_L4_OC_third-deg


Sentinel Hub data: https://apps.sentinel-hub.com/dashboard/#/


https://github.com/bingr001/LiloFrontend


https://github.com/jryebread/DebrisService

Tags

#ocean #debris #plastic #AI #MachineLearning

Global Judging

This project has been submitted for consideration during the Judging process.