High-Level Project Summary
Our team proposes an application that enables astronauts to log events during a mission, view the logs of other astronauts, and fork those logs into their own profiles. AI analyzes each log and classifies it by event type, emotions, people involved, and other tags that give insight into the events logged. Astronauts can log by voice, image, or video; Natural Language Processing (NLP) and Computer Vision technologies analyze the logged events and tag them for filtering and analysis. The application also records metadata automatically, such as timestamp, location, and suit status.
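As a minimal sketch of what a log entry with automatically captured metadata could look like, the field names and the `read_suit_status` helper below are illustrative assumptions, not the actual implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

def read_suit_status() -> dict:
    """Hypothetical helper: the real system would query suit sensors here."""
    return {"oxygen_pct": 98.5, "suit_pressure_kpa": 29.6, "integrity": "nominal"}

@dataclass
class LogEntry:
    astronaut_id: str
    media_type: str                           # "text", "voice", "image", or "video"
    content_uri: str                          # pointer to the stored media file
    tags: list = field(default_factory=list)  # filled in later by NLP / Computer Vision
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    location: str = "unknown"                 # e.g. habitat module or GPS fix
    suit_status: dict = field(default_factory=read_suit_status)

# Example: a voice log whose metadata is captured automatically.
entry = LogEntry(astronaut_id="AST-07", media_type="voice",
                 content_uri="logs/evt_0042.wav")
```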
Link to Project "Demo"
Link to Final Project
Detailed Project Description
Detailed Solution:
The application enables people on Earth to view astronauts' logs live, in the form of voice, images, and video captured during the mission. The solution uses voice-print and face-recognition technologies, together with IoT devices mounted on the astronaut's suit, to provide a level of security in which no astronaut can overwrite or delete another astronaut's logs and access is denied to unapproved people.
Astronauts log by text, voice, image, or video; NLP and Computer Vision analyze each log and classify it by event type. Astronauts can also fork other astronauts' logs into their own profiles and approve their own logs at the end of the mission.
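As a minimal sketch of the forking and ownership rules described above, assuming each log record carries an `owner_id` set after voice-print/face-recognition verification (all names here are illustrative, not the project's actual code):

```python
import copy
import uuid

def can_modify(authenticated_id: str, log: dict) -> bool:
    """Only the astronaut who owns a log may overwrite or delete it."""
    return authenticated_id == log["owner_id"]

def fork_log(log: dict, forking_astronaut_id: str) -> dict:
    """Copy another astronaut's log into the forking astronaut's profile.

    The original log is untouched; the fork records its provenance.
    """
    fork = copy.deepcopy(log)
    fork["log_id"] = str(uuid.uuid4())
    fork["owner_id"] = forking_astronaut_id
    fork["forked_from"] = log["log_id"]
    return fork
```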
Our team studied the logs of astronauts and flight controllers from the Apollo 11 and Apollo 13 flights and saw the value of building an application that accepts forms of input other than text, and of using AI to analyze those inputs. We conducted a brief survey of available technologies that could help execute the solution and worked through many suggestions about which features are, or are not, valuable to include in the application.
Our application provides a high level of security through voice-print and face-recognition technologies. It makes use of the astronaut's suit by connecting it to the application for easier logging of events through voice and video during the mission. It also offers emergency logging: a log is captured automatically when the suit detects a dangerous situation, such as damage to the suit, a lack of oxygen, or an abnormal physical status of the astronaut. This log includes only metadata, such as location and timestamp, along with any audio or video recorded in that situation.
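A minimal sketch of how such an emergency trigger might work, assuming the suit exposes periodic telemetry readings; the thresholds and field names below are illustrative assumptions, not real suit specifications:

```python
from datetime import datetime, timezone
from typing import Optional

# Illustrative thresholds -- real values would come from suit specifications.
MIN_OXYGEN_PCT = 19.0
MIN_PRESSURE_KPA = 25.0
MAX_HEART_RATE_BPM = 180

def detect_hazards(reading: dict) -> list:
    """Return the hazards detected in one suit telemetry reading."""
    hazards = []
    if reading["oxygen_pct"] < MIN_OXYGEN_PCT:
        hazards.append("low_oxygen")
    if reading["suit_pressure_kpa"] < MIN_PRESSURE_KPA:
        hazards.append("suit_damage_suspected")
    if reading.get("heart_rate_bpm", 0) > MAX_HEART_RATE_BPM:
        hazards.append("abnormal_vitals")
    return hazards

def emergency_log(reading: dict) -> Optional[dict]:
    """Create a metadata-only emergency log if any hazard is detected."""
    hazards = detect_hazards(reading)
    if not hazards:
        return None
    return {
        "type": "emergency",
        "hazards": hazards,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": reading.get("location", "unknown"),
        # Any audio/video recorded around this moment would be attached here.
    }
```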
We developed the back end of the application, with datasets and APIs to send and receive data about log additions and modifications, as well as about the astronauts themselves. In parallel, we are developing the software that handles all input events from the user, such as text, images, voice, and video, together with a website and an Android application that serve as the solution's interface for astronauts and people on Earth.
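As an illustration of the log-addition and retrieval APIs, here is a minimal sketch using Flask; the routes and payload fields are assumptions for demonstration, not the project's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)
LOGS = {}  # in-memory stand-in for the real database

@app.post("/logs")
def add_log():
    """Accept a new log entry sent from an astronaut's device."""
    payload = request.get_json()
    log_id = str(len(LOGS) + 1)
    LOGS[log_id] = {
        "astronaut_id": payload["astronaut_id"],
        "media_type": payload["media_type"],   # text / voice / image / video
        "content_uri": payload["content_uri"],
        "tags": payload.get("tags", []),       # later enriched by NLP / CV
    }
    return jsonify({"log_id": log_id}), 201

@app.get("/logs/<log_id>")
def get_log(log_id):
    """Let people on Earth retrieve a log during the mission."""
    log = LOGS.get(log_id)
    if log is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(log)
```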
Database Model:
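A minimal sketch of the core tables such a model might contain, based on the features described above; the project's tags mention MySQL, but SQLite is used here only to keep the example self-contained, and all table and column names are assumptions:

```python
import sqlite3

# Illustrative schema only -- the project's actual MySQL model may differ.
SCHEMA = """
CREATE TABLE astronauts (
    astronaut_id TEXT PRIMARY KEY,
    name         TEXT NOT NULL
);
CREATE TABLE logs (
    log_id       TEXT PRIMARY KEY,
    owner_id     TEXT NOT NULL REFERENCES astronauts(astronaut_id),
    media_type   TEXT NOT NULL,        -- text / voice / image / video
    content_uri  TEXT NOT NULL,
    timestamp    TEXT NOT NULL,
    location     TEXT,
    forked_from  TEXT REFERENCES logs(log_id)
);
CREATE TABLE tags (
    log_id TEXT NOT NULL REFERENCES logs(log_id),
    tag    TEXT NOT NULL               -- event type, emotion, person involved...
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(SCHEMA)
```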
Mobile Application:
Web Application:



Space Agency Data
Hackathon Journey
Our Space Apps experience has been interesting, exciting, and informative. We had the chance to learn more about the different techniques used in journaling space flights and the challenges that astronauts and data analysts face in this journaling process, and we learned how to work as a team and distribute the work evenly according to every member's skills. The main thing that inspired us to choose this challenge is that it can be approached with many different tech-related solutions. We designed an integrated framework composed of Python software, a web application, and a mobile application that shares logging data over a local network (which could be the ISS network, if deployed there) and allows data analysts on Earth to analyze the logs, whether text, images, audio, or video, in real time. We resolved our team issues through good time management and dedication, and that was all it took to keep us going.
References
[1]: https://history.nasa.gov/afj/ap13fj/08day3-problem.html
[2]: https://history.nasa.gov/afj/ap11fj/01launch.html
[3]: https://www.nasa.gov/mission_pages/NEEMO/NEEMO12/mission_journal_2.html
Tags
#space #moon #python #flutter #APIs #real-time-data #web #mysql #lunar-surface






