Space Debris Danger Zone Mapping

High-Level Project Summary

The main goal was to develop an open-source 3D geospatial app that displays information on known active satellites and debris, using both data mining and ML techniques to provide additional analytics. Among its main features, our project provides a more accurate representation of historical TLE data and a new perspective on debris mapping: density-based debris risk zones, with highlighting of the active satellites that come dangerously close to them. We believe these improvements will open new risk-management opportunities for NASA experts and analysts.

Detailed Project Description

Our project offers the following main features that distinguish it from existing solutions:


1. More accurate historical data:

Publicly available TLE sets and the SGP4 algorithm give adequate results only within roughly a week of a TLE's epoch (both into the future and into the past), due to the chaotic nature of orbital motion. However, existing implementations tend to use the same estimates even for time frames years in the past, producing extremely inaccurate historical data. Our solution was to extract all existing historical epochs from space-track.org and use them to properly initialize satellites/debris in the SGP4 algorithm for any user-specified timestamp. By doing so, we enable experts, analysts, and ordinary users to see historical events more accurately.
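
A minimal sketch of this selection step, using the python-sgp4 library from our references (the `tle_history` input and the `position_at` helper are illustrative names, not the app's actual API):

```python
from sgp4.api import Satrec, jday

def position_at(tle_history, when):
    """Propagate from the TLE whose epoch is closest to `when` (a datetime).

    tle_history: list of (line1, line2) TLE pairs downloaded from
    space-track.org for a single object.
    """
    jd, fr = jday(when.year, when.month, when.day,
                  when.hour, when.minute, when.second)
    records = [Satrec.twoline2rv(l1, l2) for l1, l2 in tle_history]
    # Pick the record whose epoch (jdsatepoch + jdsatepochF, a Julian date)
    # is nearest to the requested timestamp, so SGP4 only has to bridge
    # a short interval instead of months or years.
    best = min(records,
               key=lambda r: abs(r.jdsatepoch + r.jdsatepochF - jd - fr))
    err, position, velocity = best.sgp4(jd, fr)  # TEME frame, km and km/s
    if err != 0:
        raise ValueError(f"SGP4 propagation failed with error code {err}")
    return position, velocity
```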



2. Risk zones:

Our application not only maps existing debris around a 3D model of the Earth like existing solutions do, but also uses data mining techniques, such as kernel density estimation, to identify "zones" of higher debris concentration that increase the chance of collision for space objects (both active satellites/stations and debris). This gives the basis for a more complex analytics platform to help mitigate equipment damage, fight Kessler syndrome, and clear up space.
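
A minimal sketch of the density step with SciPy's Gaussian KDE (the placeholder positions and the 95th-percentile cutoff are illustrative assumptions, not the app's actual parameters):

```python
import numpy as np
from scipy.stats import gaussian_kde

# Placeholder debris positions: a (3, N) array in km; in the app these
# come from SGP4-propagated TLEs.
rng = np.random.default_rng(0)
debris_xyz = rng.normal(scale=7000.0, size=(3, 500))

kde = gaussian_kde(debris_xyz)          # fit a 3D kernel density estimate
density = kde(debris_xyz)               # density evaluated at each object
threshold = np.percentile(density, 95)  # keep only the densest 5%
risk_zone_points = debris_xyz[:, density >= threshold]
print(f"{risk_zone_points.shape[1]} objects lie in high-density risk zones")
```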


3. Endangered active satellites:

As an example of what can be built on top of risk zones, we've added real-time highlighting of active satellites that are in dangerous proximity to high-density risk zones. There are three levels of danger: minimal (these satellites are grey), low-to-medium risk (less than 300 km from the border of the nearest risk zone, highlighted in lime green), and high risk (less than 100 km, crimson red).
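
A minimal sketch of this three-tier rule (the helper name and colour strings are illustrative; the 100 km and 300 km thresholds are the ones described above):

```python
def danger_colour(distance_km):
    """Map a satellite's distance to the nearest risk-zone border
    to the highlight colour used in the app."""
    if distance_km < 100:    # high risk
        return "crimson"
    if distance_km < 300:    # low-to-medium risk
        return "limegreen"
    return "grey"            # minimal risk

assert danger_colour(50) == "crimson"
assert danger_colour(250) == "limegreen"
assert danger_colour(1000) == "grey"
```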


4. Orbit population graphs:


A simple set of analytical charts that give additional, easy-to-access information about debris density by altitude and its standard deviation, as well as a grouped (by satellite/debris type) "orbit population" histogram.
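
An illustrative sketch of such a grouped "orbit population" histogram with matplotlib (the object types and placeholder altitudes are assumptions for demonstration; real values come from the DB):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
groups = {
    "Active satellites": rng.normal(550, 60, 300),   # altitudes, km
    "Cosmos debris": rng.normal(780, 40, 300),
    "Iridium debris": rng.normal(790, 50, 300),
}
# Side-by-side bars per altitude bin, one colour per object type.
plt.hist(list(groups.values()), bins=30, label=list(groups.keys()))
plt.xlabel("Altitude, km")
plt.ylabel("Object count")
plt.legend()
plt.title("Orbit population by object type")
plt.show()
```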

So, our project can be used as a whole application, or taken apart so that only the API scripts for risk-zone calculation or data extraction are reused. The app is open-source and publicly available on GitHub!


The potential future we see for our project:





  • More analytical data could be added so the project becomes an analytical dashboard for space-cleaning missions and scientists. Real-time display and analysis of critically important data and alerts could also be added.
  • Clusters of trash objects with similar, close orbits can be identified. Future space missions could use this information to collect more trash in a single run.
  • Information about an object's dangerousness can be used to decide which trash objects should be removed with higher priority.
  • Improved ML models could help increase prediction accuracy; however, building such models would take more than two hackathon days and would require aerodynamics specialists.
  • Trash objects in lower orbits should be collected or pushed into the atmosphere to burn up first, reducing their risk of collision with other objects.
  • Planning satellite positioning around known trash objects in advance could help avoid collisions between space objects and the growing risk of Kessler syndrome.

Space Agency Data

We used the public TLE (two-line element) data and the SGP (simplified perturbations) family of models provided by the United States Space Surveillance Network (operated by the U.S. Space Force), with element sets produced by NORAD and NASA. We also used space-track.org to obtain the historical data.

From this data we extracted all historical epochs and computed the latitude, longitude, and height of each object at each point in time. This underpinned the development of all the main features, as well as the ML model we used to test our hypotheses.
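
A minimal example of the TLE-to-coordinates step using Skyfield (one of the SGP4 wrappers in our references; assumes Skyfield 1.40+ for `wgs84.geographic_position_of`, and the ISS TLE below is the sample from the Skyfield docs):

```python
from skyfield.api import EarthSatellite, load, wgs84

ts = load.timescale()
line1 = "1 25544U 98067A   14020.93268519  .00009878  00000-0  18200-3 0  5082"
line2 = "2 25544  51.6498 109.4756 0003572  55.9686 274.8005 15.49815350868473"
sat = EarthSatellite(line1, line2, "ISS (ZARYA)", ts)

t = ts.utc(2014, 1, 23, 11, 18, 7)  # a time near the TLE's epoch
geocentric = sat.at(t)              # SGP4 propagation under the hood
pos = wgs84.geographic_position_of(geocentric)
print(pos.latitude.degrees, pos.longitude.degrees, pos.elevation.km)
```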

Hackathon Journey

Day 1

9:00 - Start importing history data to DB.

9:20 - Creating the initial application architecture.

10:25 - Cloned and installed the SpaceBirds library

11:00 - Installing the necessary libraries and creating the environment.

11:30 - Data exploration

11:30 - Finished understanding how the SpaceBirds library works

11:30 - Discussed project's vision and data-science topics with Volodymyr Sofinskyi

11:35 - Added the ability to host the application locally

11:40 - Changed the library from real-time updates to manual updates

12:00 - Discussed project's vision and visualization features with Anna Voropaeva

12:00 - Configuring the Docker and Docker-Compose.

12:20 - Fixed the issue with historical data from www.space-track.org; began parsing the data

12:25 - Developed scripts to convert historical TLEs into .CSV files for filling the DB

12:45 - Began working on density calculations

12:50 - Prepared coloured dots; working on disabling the library's Follow mode by default; planning integration with the backend

13:00 - Analysis of existing approaches for predicting chaotic time series dependencies, selection of neural network architecture

13:00 - Writing an SQL query.

13:30 - Building and training a neural network

13:30 - Deleting duplicate records from the DB.

14:00 - Creating SQL scripts for automatic initialization of the database at startup.

14:00 - Investigation of other approaches to solving the prediction problem

15:00 - Evaluation of the results of building regression, decision trees, random forest, and kNN regression models

15:00 - Creating services and controllers for REST-requests.

15:20 - Finished parsing the TLE data, saved it to CSV, and then transferred it to the DB

15:45 - Based on the results, reprocessing the data

16:00 - Data analysis of existing data with altitude estimations (all satellites; Cosmos, Iridium, and Fengyun debris)

16:10 - Finished with density calculation

16:30 - Getting and analysing the results of the neural network

17:00 - Rebuilding a new neural network architecture

17:30 - Developed an API for the density calculation

18:00 - Standard deviation of altitudes for Cosmos and Iridium debris

18:30 - Analysis of approaches to tuning neural networks

19:00 - Training the network and analysing the first iterations

20:00 - Analysis of results

21:00 - Creation of a report on the results

21:00 - Evaluation of SGP4 predictions

22:00 - Connecting the app to the Python ML application.

22:20 - Adding caching to the app.


Day 2

1:00 - Changed the structure of responses for several services

3:00 - United everything into a single app

6:00 - Testing

8:00 - Implementing minor adjustments and bug fixes

10:00 - Evaluation of the ML model we designed

10:00 - Start working on the project description, presentation

11:50 - Finished project description

References

Data:

https://celestrak.com/

https://www.space-track.org/


SGP4 API:

https://pypi.org/project/sgp4/

https://rhodesmill.org/skyfield/


Visualization library:

https://github.com/WorldWindLabs/SpaceBirds


Languages: Python, Java, JS


Also, we want to thank Volodymyr Sofinskyi and Anna Voropaeva for their mentorship, views, and advice. Special thanks to Roman Malkevych for giving us the opportunity and the inspiration for this project!

Tags

#SpaceDebris #SpaceSafety #NasaSpaceAppChallenge #DebrisMapping #Space #Debris #Software #DataMining #DataScience #MachineLearning

Global Judging

This project has been submitted for consideration during the Judging process.