IS480 Team wiki: 2011T1 TheTalkies Midterm
Revision as of 17:09, 25 September 2011
- 1 Project Progress Summary
- 2 Project Management
- 3 Technical Complexity
- 4 Quality of product
- 5 Reflection
Project Progress Summary
- Between the acceptance presentation (10 Aug 2011) and the midterm presentation (26 Sep 2011), 3 iterations were completed.
- During iteration 3, our team completed the Trend page, in which users can see the top 5 trends in the SMU area. We also completed the School of Economics/School of Social Sciences (SESS) model.
- During iteration 4, we completed the LostnFound page, a variant of the Trend page. Here, a user can report what he has found and search for items lost in school from our application, using location key tags such as #SISL1, #SESSL2, #LIBRARYL3, etc. We also completed the Library model.
- During iteration 5, we completed the Tweet/Discuss and Report functions, with which both Twitter and non-Twitter users can discuss and report on the Trend page and the LostnFound page. In the meantime, we completed the School of Business (SOB) model.
- We finished all the features planned for these 3 iterations by the end of iteration 5, even though during the iterations we faced some delays and fell up to a week behind schedule; the causes are detailed in Project Highlights.
- For the time being, we are in the middle of the 6th iteration. With iteration 5 finished, we have completed most of our main features; in the following iterations we will implement back-end features such as statistics collection, and fine-tune the app to improve its performance and user-friendliness. We are confident that the features planned for after the mid-term presentation can be done in time if all team members keep putting in the same effort as they are now.
Unexpected things happened along the way:
- The Germanium API was updated from 1.3 to 1.5, and our client wants us to use the more up-to-date version of the API, i.e. 1.5.
- There was some delay in the schedule, and the team put in extra effort to catch up. The delay arose mainly because almost all our team members were involved in a cultural show organised by one of our CCAs during the first 3 weeks of the school term. Although we anticipated a tight schedule beforehand, members were busier than expected and had to commit more effort to organising the event, which reduced both individual and team efficiency. We fell up to one week (7 days) behind the planned schedule in iteration 3. In the subsequent iteration we tried to catch up, but the gap was too wide to close entirely; we did manage to narrow it, and the delay was reduced to 5 days, recorded as a further delay in iteration 4. The major catch-up happened in iteration 5, once the event was over and our class timetables were less occupied. We maintained the momentum and determination to complete the features on time in this iteration, and the stronger commitment from every team member allowed us to finish all the features planned for iteration 5 as well as the leftover tasks from previous iterations. After iteration 5, our team was back on track in terms of time.
This is our team's status on the features listed on the main wiki page.
|Feature||Status||Confidence Level (0-1)||Comment|
|3D Model of school buildings||80%||1||Models of LKCSB, SIS, SESS and Library done. SOA pending.|
|Timeline||90%||1||Fully functional. Some buttons to be added according to the new UI.|
|Trends||100%||1||Implemented and final.|
|LostnFound||90%||1||Implemented. UAT result pending.|
|Tweet/Discuss/Report||90%||1||Implemented. UAT result pending.|
|Statistics Collection and Error Logging||0%||0.9||Need to check server resources.|
|Friend Stream||0%||0.9||Need to research the Twitter API.|
Project Schedule (Plan Vs Actual):
Our team's project scope has remained constant from the beginning to the present stage. So far we have implemented features according to plan; in iteration 6 we will implement the Logging and Statistics function and build the School of Accountancy (SOA) model as planned. In iteration 7, we will implement the UI revamp and the Friend Stream function.
|3 (11 Aug - 24 Aug 2011)||Modelling SESS||11 Aug - 19 Aug 2011||11 Aug - 22 Aug 2011||Aung and Soe Thet were responsible for building the School of Social Sciences model; delayed by 3 days.|
|Database||11 Aug - 19 Aug 2011||11 Aug - 19 Aug 2011||Thandar and Khaing came up with the database structure and created the database.|
|Server Side (Trend Page)||11 Aug - 24 Aug 2011||11 Aug - 31 Aug 2011||Phyo and Thandar faced unexpected difficulties, such as learning cron jobs and changing the database structure to update trend keywords.|
|Client Side (Trend Page)||20 Aug - 24 Aug 2011||11 Aug - 29 Aug 2011||Soe Thet and Khaing had to wait for the server side because of its delay.|
|Germanium (Trend Page)||11 Aug - 24 Aug 2011||11 Aug - 31 Aug 2011||Moe Hein and Aung implemented the code connecting the 3D model and the Trend page.|
|Mock Up||11 Aug - 24 Aug 2011||11 Aug - 24 Aug 2011||Moe Hein and Aung created full mock-up pages for wireframing as the sponsor requested.|
|Test Plan||11 Aug - 17 Aug 2011||11 Aug - 17 Aug 2011||Khaing and Aung created the test plan for unit testing.|
|Misc||11 Aug - 24 Aug 2011||11 Aug - 24 Aug 2011||Khaing and Aung improved the wiki based on the Acceptance Presentation feedback.|
|4 (25 Aug - 7 Sep 2011)||Leftover from iteration #3||25 Aug - 31 Aug 2011||25 Aug - 1 Sep 2011||Project Manager Soe Thet and the whole team were busy with the Myanmar Festival Gala on 3rd September, so we were delayed further.|
|Server Side (Trend Page)||31 Aug - 7 Sep 2011||1 Sep - 9 Sep 2011||Phyo and Soe Thet finalised the server-side code and tested the cron job and its results.|
|Client Side (Lost and Found)||31 Aug - 7 Sep 2011||1 Sep - 10 Sep 2011||Aung and Soe Thet implemented the time filter in the Lost and Found page based on the Trend feature.|
|3D Model of Library||31 Aug - 7 Sep 2011||1 Sep - 11 Sep 2011||Khaing and Aung were responsible for drawing the Library model.|
|Germanium (API Update)||31 Aug - 7 Sep 2011||1 Sep - 11 Sep 2011||Because of the Germanium API changes, Moe Hein and Thandar adjusted the existing code and implemented the Accordion Effect in the BBL Tree Object.|
|5 (8 Sep - 21 Sep 2011)||Leftover from iteration #4||8 Sep - 13 Sep 2011||8 Sep - 13 Sep 2011||Project Manager Soe Thet and the whole team finished implementing and testing the features from the previous iteration.|
|Tweet/ Report/ Discuss||13 Sep - 21 Sep 2011||13 Sep - 21 Sep 2011||Aung and Phyo finished the tweet/report and discussion feature in the Trend page and the LostnFound page.|
|SOB 3D Model||13 Sep - 21 Sep 2011||13 Sep - 21 Sep 2011||Khaing was responsible for building the School of Business (SOB) model.|
|Germanium (Level, Placemark Position)||13 Sep - 21 Sep 2011||13 Sep - 21 Sep 2011||Thandar set the placemark positions on the Campus Ground and Level views according to the position #(hash tag).|
|UI Improvement||13 Sep - 21 Sep 2011||13 Sep - 21 Sep 2011||Moe Hein was responsible for user interface improvements and design changes for UAT1 and the mid-term presentation.|
Summary of analysis for the metrics collected. [To be done later]
|No||Risk||Likelihood of Occurrence||Impact on Project||Mitigation Strategy|
|1.1 Underestimation of the time taken for a task||Low (Previously Medium)||High||
|2.1 Unfamiliarity with twitter and Germanium APIs||Low (Previously Medium)||Low (Previously Medium)||
|2.2 Integration difficulties with Germanium and twitter (Placemarks)||Low (Previously Medium)||Low (Previously High)||
|3.1 Performance slowing down due to 3D loading and delays in thread pulling from twitter||High||Medium||
|3.2 Twitter server downtime||Medium||High||
Twitter can show live updates over time. The Twitter API enables many developers to build creative applications around Twitter, and the Germanium API enables web browsers to load 3D models through the Germanium plugin. We therefore built an application that shows tweets in a 3D model using the Germanium API. Although the individual technologies are simple on their own, combining them creates a lot of challenges and complexity.
The following are the technical complexities of our project in order of highest complexity first.
1. Page Refresh is not desired
2. Dynamic placemark
3. Real time update
Twitter can show live updates: whenever a user tweets, the tweet shows up in the timeline very quickly. We have to provide that live, real-time update behaviour in our application as well. Given that the tweets are linked to placemarks and change dynamically over time, updating the placemarks dynamically is not an easy task.
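One common way to update placemarks without redrawing everything is to diff each poll result against the previous one, so that only placemarks that actually changed are touched. The sketch below is illustrative only; the id-to-payload dictionaries are an assumption for the example, not our actual data model:

```python
def placemark_diff(previous, current):
    """Compare two poll results, each a dict mapping placemark id to its
    tweet payload, and return which placemark ids to add, remove, or
    update in the 3D view."""
    added = [pid for pid in current if pid not in previous]
    removed = [pid for pid in previous if pid not in current]
    updated = [pid for pid in current
               if pid in previous and current[pid] != previous[pid]]
    return added, removed, updated
```

With this shape, each 15-minute poll only has to apply a small set of changes to the 3D scene instead of rebuilding all placemarks.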
4. Categorising tweets with locations
We pull tweets around SMU using SMU's latitude and longitude. We also want to categorise the tweets by building, so that each tweet can be shown on the specific building it refers to.
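As a sketch, categorisation can key off the location tags already used in LostnFound (#SISL1, #SESSL2, etc.). The building codes and tag pattern below are assumptions for illustration, not our exact implementation:

```python
import re

# Assumed location-tag convention: #<building code>L<level>, e.g. #SISL1.
LOCATION_TAG = re.compile(
    r"#(SIS|SESS|SOB|SOA|LKCSB|LIBRARY)L(\d+)", re.IGNORECASE
)

def categorise(tweet_text):
    """Return (building, level) from the first location tag in the tweet,
    or None if no recognised tag is present."""
    match = LOCATION_TAG.search(tweet_text)
    if match is None:
        return None
    return match.group(1).upper(), int(match.group(2))
```

Tweets without a recognised tag would fall back to the campus-wide view rather than a specific building.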
5. Getting trends different from twitter trends
To decide the trends around SMU, we need to extract trends that are different from the normal trends on Twitter. We pull the tweets every 15 minutes, then use our own algorithm to count them and extract the trends. Currently, we use 10 tweets in 1 hour as the threshold for a topic to become a trend around SMU in our application. For example, if users have tweeted 9 times about #FYP within the last hour, it will not show up in the trends list yet; once the server picks up the 10th tweet about #FYP around SMU, the topic is promoted to a trend in our application. When another user then clicks on #FYP in our application, he should see the tweets around and in SMU with the #FYP hashtag. The server we were provided cannot run cron scripts, so we had to explore alternatives, such as Google App Engine, to substitute for cron jobs. Fortunately, one of the services suggested by our sponsor satisfies our need.
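The counting step can be sketched as follows. This is illustrative Python rather than our actual server code, and the (hashtag, timestamp) input shape is an assumption:

```python
from collections import Counter
from datetime import datetime, timedelta

TREND_THRESHOLD = 10          # tweets per hour before a topic becomes a trend
WINDOW = timedelta(hours=1)   # sliding window the threshold applies to

def extract_trends(tweets, now):
    """Count hashtag occurrences within the last hour and return the
    topics that meet the threshold, most frequent first.

    `tweets` is a list of (hashtag, timestamp) pairs accumulated by
    the 15-minute polling job.
    """
    counts = Counter(tag for tag, ts in tweets if now - ts <= WINDOW)
    return [tag for tag, n in counts.most_common() if n >= TREND_THRESHOLD]
```

Keeping the threshold and window as named constants is what lets them move into the properties file described under Quality of product.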
6. Building 3D model
Building 3D models is very challenging, to put it mildly. Two of our team members had to spend about 2 months learning how to create 3D models in 3ds Max. Although there are many tutorials and examples on 3D modelling, we needed to start from the basics, such as drawing lines and circles, so it was not an easy journey for us. However, once one developer knows how to create a model from the floor plan and has learned the texturing techniques, it is easy for other developers to learn from that developer and help with the modelling. That, in turn, brings up another challenge: sharing the same texture files and IDs. More than 2 developers create the models, so all of them need to keep the texture files and IDs in sync for the final models to be consistent in the browser. For example, when the first developer assigns texture ID 4 to glass walls, the other developers have to follow the same ID and procedure. The result looks easy and nice for end users, yet developing and sharing among the developers is not an easy task.
Quality of product
To provide flexibility and maintainability in our application, we planned to have a properties file in which we can modify different configurations of the application without having to look into the code.
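As a sketch of the idea only (our application is hosted on PHP Fog, so this Python/INI example is purely illustrative; the section and key names are assumptions):

```python
from configparser import ConfigParser

# Illustrative contents of the properties file; in the real application this
# would live on disk next to the code, so values such as the polling interval
# and the trend threshold can be changed without touching the code.
PROPERTIES = """
[app]
poll_interval_minutes = 15
trend_threshold = 10
"""

config = ConfigParser()
config.read_string(PROPERTIES)

poll_interval = config.getint("app", "poll_interval_minutes")
trend_threshold = config.getint("app", "trend_threshold")
```

Any tunable mentioned elsewhere on this page (polling interval, trend threshold) is a natural candidate for such a file.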
|Project Management||Minutes||Sponsor weeks - 2, 4, 6; Supervisor weeks - 2, 4, 6|
|Features List, Planned Schedule and Actual Schedule||Use Cases and Schedules|
|Testing||UAT 1 test plan and test cases||UAT 1 Test Plan|
Our application can be accessed at http://3dsocializer.phpfogapp.com/
The First UAT (24-25 Sept)
Total Number of UAT Testers
Frequent Twitter User
UAT Survey Results
Benjamin Gan Reflection:
The team is uncooperative and did not follow my instructions.