Elections Analyzer 2018 – Week 5 Premortem

OK, last week wasn’t a very productive week, so this week we will focus all our energy on finishing a real, working data mining demo: a cron job running in the background that extracts the tweets from the table, plus a demo of the process for getting the sentiment analysis.

This week it will be important for me to finish all the database connections and implement the overall architecture of the software so my team can continue doing what they’re doing.

I will work on the cron job first.
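
As a sketch of where I’m headed, here’s roughly what the script behind the cron job could look like. This assumes pymysql for the database and TextBlob for the sentiment scores; the raw_tweet table, its columns, and the credentials are placeholders until our schema is final:

```python
# sentiment_job.py -- rough sketch of the script the cron job would run.
# Table/column names and credentials are placeholders, not our final schema.
import pymysql
from textblob import TextBlob

def run():
    conn = pymysql.connect(host="localhost", user="elections",
                           password="secret", database="elections")
    try:
        with conn.cursor() as cur:
            # Pick up tweets that have not been scored yet.
            cur.execute("SELECT id, text FROM raw_tweet WHERE polarity IS NULL")
            for tweet_id, text in cur.fetchall():
                # TextBlob polarity goes from -1.0 (negative) to 1.0 (positive).
                score = TextBlob(text).sentiment.polarity
                cur.execute("UPDATE raw_tweet SET polarity = %s WHERE id = %s",
                            (score, tweet_id))
        conn.commit()
    finally:
        conn.close()

if __name__ == "__main__":
    run()
```

The cron side is then just one crontab line, something like `*/5 * * * * python3 sentiment_job.py`, to score new tweets every five minutes.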

Elections Analyzer 2018 – Week 4 Premortem

This week I’m motivated to finish the database connection, implement the raw_tweet storage, and begin analyzing those raw tweets.
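
To make that concrete, here’s a rough sketch of what the raw_tweet table and the insert path might look like, again assuming pymysql; the exact columns are still a guess:

```python
# Rough sketch of the raw_tweet table, assuming MySQL through pymysql.
# The columns are my current guess at what the analysis will need.
import pymysql

CREATE_RAW_TWEET = """
CREATE TABLE IF NOT EXISTS raw_tweet (
    id BIGINT PRIMARY KEY,       -- Twitter's own tweet id
    text VARCHAR(280) NOT NULL,  -- raw tweet body
    created_at DATETIME NOT NULL,
    candidate VARCHAR(64) NULL,  -- filled in by the matching step
    polarity FLOAT NULL          -- filled in by the sentiment step
)
"""

def save_raw_tweet(conn, tweet):
    """Insert one mined tweet, skipping ids we already stored."""
    with conn.cursor() as cur:
        cur.execute(
            "INSERT IGNORE INTO raw_tweet (id, text, created_at) "
            "VALUES (%s, %s, %s)",
            (tweet["id"], tweet["text"], tweet["created_at"]),
        )
    conn.commit()
```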

I think this week is mostly about continuing to implement the same things as last week before the presentation, so in this blog post I’m going to introduce you to our team.

Elections Analyzer 2018 – Week 3 Postmortem

This week we worked on putting everything together for the presentation with the teacher. Mike developed a basic database connection that let us take the tweets Alex and I were mining and store them in a MySQL table. That table will be the bridge for later analysis: cleaning the information, running the sentiment analysis, and determining which candidate each tweet belongs to.
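
One simple way the “which candidate” step could work is keyword matching. A tiny sketch; the handles and hashtags here are made up for illustration, not our actual matching rules:

```python
# Hypothetical keyword lists; the real matching rules are still undecided.
CANDIDATE_KEYWORDS = {
    "candidate_a": ["@candidate_a", "#teama"],
    "candidate_b": ["@candidate_b", "#teamb"],
}

def match_candidate(text):
    """Return the candidate a tweet mentions, or None if it is ambiguous."""
    text = text.lower()
    matches = [name for name, keywords in CANDIDATE_KEYWORDS.items()
               if any(kw in text for kw in keywords)]
    # Drop tweets that mention no candidate or more than one.
    return matches[0] if len(matches) == 1 else None
```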

Alfonso worked on implementing unit tests for some basic arithmetic functions. We are still learning about unit testing, so it will take us some time to adapt to testing our functionality that way.
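
For anyone curious, this is the general shape of those tests with Python’s built-in unittest module; the function and cases here are illustrative, not Alfonso’s actual code:

```python
import unittest

def add(a, b):
    """A trivial arithmetic function, just to have something to test."""
    return a + b

class TestArithmetic(unittest.TestCase):
    def test_add_positive(self):
        self.assertEqual(add(2, 3), 5)

    def test_add_negative(self):
        self.assertEqual(add(-2, -3), -5)

if __name__ == "__main__":
    unittest.main()
```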

Elections Analyzer 2018 – Week 3 Premortem

Over the weekend I worked on the project’s Makefile so everybody can install the dependencies. The DevOps side is fully done, with targets to test, install, and run the project in developer mode and in production.

This was hard for me because I needed to be sure that it worked on both Unix and Windows systems, so I tested it on macOS, Ubuntu, and Windows.

There’s still more work to do this week. Everything is already installed, but for the first delivery we plan to finish integrating the API with the streaming tweets and saving them in a temporary local database, so we can be sure that later on we will be able to process the data and store it in whatever service the other team decides.

So the personal task I want to tackle this week is a clean connection to the API, with a good Python interface that handles the data correctly and asynchronously.
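
To give an idea of what I mean, here’s a minimal sketch assuming tweepy (3.x) for the streaming API and SQLite as the temporary local database; the keys, track terms, and table name are all placeholders:

```python
import sqlite3
import tweepy

class TweetSaver(tweepy.StreamListener):
    """Persist each incoming tweet into a throwaway SQLite file."""

    def __init__(self, db_path="tweets.tmp.db"):
        super().__init__()
        # The stream runs on a background thread, so allow this connection
        # to be used from a thread other than the one that created it.
        self.conn = sqlite3.connect(db_path, check_same_thread=False)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS raw_tweet (id INTEGER PRIMARY KEY, text TEXT)"
        )

    def on_status(self, status):
        self.conn.execute(
            "INSERT OR IGNORE INTO raw_tweet (id, text) VALUES (?, ?)",
            (status.id, status.text),
        )
        self.conn.commit()

    def on_error(self, status_code):
        # Returning False on a rate-limit error disconnects instead of retrying.
        return status_code != 420

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")
stream = tweepy.Stream(auth=auth, listener=TweetSaver())
# is_async=True keeps the stream on a background thread
# (older tweepy releases called this parameter `async`).
stream.filter(track=["elections", "2018"], is_async=True)
```

A local SQLite file keeps things simple for the first delivery, since the other team hasn’t picked the final storage service yet.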