Nearing Completion !!!

Hi Folks,
              The last evaluation of the project is just a couple of days away. I'm finished with the coding part and have started on the documentation of the project. When I look back, the project has helped me learn a lot and has given me a fruitful summer.
              I would like to point out some of the problems that we encountered which were not covered in the last posts, along with possible improvements for the project. One of the problems we faced was the lack of correlation between the FORSPEF and PREDICCS data. The plot below shows the variation of the Mars radiation dosage over a year, and the vertical lines represent the SEP predictions from FORSPEF with probabilities above 0.25.

              The other part is the observation that we made on the data taken from Earth and Mars. The plot below shows the time-varying Earth and Mars radiation dosages. We then ran correlation tests over the data and found that there existed only a weak correlation am…
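The correlation test mentioned above can be sketched as follows. The real dosage series come from the PREDICCS archives, so the arrays here are synthetic placeholders rather than mission data; only the shape of the computation is meant to match.

```python
# Sketch of a correlation check between two dosage time series.
# The values below are synthetic stand-ins, not real PREDICCS data.
import numpy as np

rng = np.random.default_rng(0)
earth_dose = rng.normal(10.0, 1.0, 365)                    # daily Earth dosages (made up)
mars_dose = 0.2 * earth_dose + rng.normal(20.0, 2.0, 365)  # weakly coupled Mars series

# Pearson correlation coefficient between the two series
r = np.corrcoef(earth_dose, mars_dose)[0, 1]
print(f"Pearson r = {r:.2f}")
```

With a coupling this weak, the coefficient stays well below the levels usually read as a strong linear relationship, which mirrors what we saw with the actual data.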

The last phase !!!

Hi folks,
          The last phase is here and the project is nearing completion !!! The last few weeks were mainly about building a web-based GUI for the project, along with some bug fixes here and there. plotly.js was used for plotting the radiation dosages, and a Flask server was used to host the service. I did not have much experience with front-end design, and I did not much like the thought of doing it either, but it turned out to be pretty interesting.
          In the current model I have used the Flask server as a client to the REST API built using Falcon. The remaining time will be dedicated to providing detailed documentation, making the code more readable and optimised, testing the service, and fixing bugs.

The second evaluations

Hi folks,
      It's time for the second evaluations. Through this post I would like to give a brief overview of the first and second phases of the project.
      The first phase was more of an algorithmic phase, building the deterministic model. It was a pure coding phase since we had a clear-cut idea of what to do, and this made the first phase easier compared to the second.
      The second phase was more research-based than a coding one. Since the aim was to make a machine learning model, I needed sufficient domain knowledge to develop a good model. I had done some research during the proposal stage, where I found some of the sources and made a rough plan which I mentioned in the last blog post. But there were a lot of problems encountered, like data scarcity, lack of correlation between data sources, and data being out of sync among the sources. I could solve the last part, the syncing issue, by considering fixed time frames and processing the data.…
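The fixed-time-frame idea can be sketched with pandas: resample every source onto a common grid (daily, say) so that rows from different archives line up. The timestamps and values below are invented for illustration.

```python
# Syncing two sources sampled at different, irregular times by
# averaging all readings that fall inside each fixed daily frame.
# The data below is made up; the real sources are the mission archives.
import numpy as np
import pandas as pd

a = pd.Series(np.arange(6.0),
              index=pd.to_datetime(["2016-01-01 03:00", "2016-01-01 17:00",
                                    "2016-01-02 09:00", "2016-01-03 01:00",
                                    "2016-01-03 13:00", "2016-01-04 06:00"]))
b = pd.Series(np.arange(4.0),
              index=pd.to_datetime(["2016-01-01 12:00", "2016-01-02 12:00",
                                    "2016-01-03 12:00", "2016-01-04 12:00"]))

# One row per day, both sources aligned on the same index
synced = pd.DataFrame({"a": a.resample("1D").mean(),
                       "b": b.resample("1D").mean()})
print(synced)
```

After resampling, both columns share one DatetimeIndex, so downstream correlation tests and model training see properly paired samples.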

The Second phase !!!

Hi folks,
      It's been almost a month since my last post. The evaluations went well, and I have completed the deterministic approach for the radiation forecasting server. The first week after the evaluations went into bug fixes and some changes suggested by the mentors.
      The second phase of the project focused on implementing the forecasting server based on a machine learning approach, and the aim was to integrate both models to give more confidence to the output. Initially we had planned to collect data from FORSPEF, PREDICCS, NOAA and CACTUS, so I started collecting data from the archives of these sources one by one. The aim was to scrape data from 2007 onwards, which was possible in the case of NOAA and CACTUS. For PREDICCS the archive only started from 2012, but I did the scraping and cleaning for it too. FORSPEF, however, had its archive only from 2015. I searched for SEP event frequency to check whether there was ample data for the project and then found out from this

Three weeks and counting... !!!

Hi folks,
This post will be an update on the work done up to my first evaluation. The project is going as planned, and it has been a great experience so far. The learning curve has been steep, and I would like to thank my mentors Hitesh and Giovanni for their help and guidance.
Week one of the project was mainly data scraping for the deterministic and machine learning approaches. I chose Scrapy to build the spiders, since the framework is really simple to use and provides a lot of features that make web scraping efficient. The yield command in Scrapy, used where one might expect print, made me curious, and I came across the wonderful concept of generators. Generators are useful in memory-intensive operations when there is no need to access all the elements of a large list at the same time. Scrapy uses generators for keeping the scraped data. You can learn more about generators and the yield statement from here.( )
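A small illustration of the idea: a function with yield hands back one item per iteration instead of building the whole list up front, which is exactly how a Scrapy parse callback emits scraped items. The lines and field names below are made up for the example.

```python
# A generator parses items lazily: nothing past the first row has
# been processed when we ask for only the first result.
def parse_rows(lines):
    """Lazily turn raw CSV-style lines into dicts, one per iteration."""
    for line in lines:
        date, dose = line.split(",")
        yield {"date": date, "dose": float(dose)}

rows = parse_rows(["2016-06-01,10.4", "2016-06-02,11.1"])
first = next(rows)     # only the first line has been parsed so far
print(first["dose"])   # -> 10.4
```

For a huge archive this means the memory footprint stays at one item at a time, no matter how many rows the source has.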
For the databas…

Brief of my GSoC project

Hi folks,

It's been a while since my last post. I was caught up with GSoC and some exams. I'll use this post to give a description of what my project is and what I did during the last month as a GSoCer. Before moving on to my project details, I would like to give a brief introduction to the MARS CITY project by the Italian Mars Society.
            A human mission to Mars has been the subject of science fiction, engineering, and scientific proposals since the 19th century. Eventually, humans will most likely journey to Mars. Getting astronauts to the Martian surface and returning them safely to Earth, however, is an extremely difficult engineering challenge. A thorough understanding of the Martian environment is critical to the safe operation of equipment and to human health. The MARS CITY project aims to create a simulator for a human crew on Mars. Simulators also provide an ideal platform for conducting research in psychology, physiology, medicine, mission operations, human factors and h…