IS480 Team wiki: 2016T1 Charlies Angels FinalWiki





Project Progress Summary


Charlies Angels Final Project Summary.png


Deployed Project | Final Slides | Poster | Pitch Video

*Note: Access to the Poster and Pitch Video requires an SMU Gmail login to view.

Project Highlights

Project Challenges

Project Achievements

Project Management

Project Status

Charlies Angels Final Project Status.png

Project Schedule (Plan vs. Actual)


Team Charlies Angels is pleased to announce that there have been no major changes to the Project Schedule since our Mid-Terms presentation. Two small changes were made. First, our UT date has been confirmed and shifted to 16 Nov, so that our functionalities are thoroughly tested before we gather users [Investors in the Market] to test the application; as these users are professionals in the field, we wanted the product to be of the highest possible quality before rolling it out. Second, our Final Presentation date has been confirmed as 30 November, to ensure that our sponsor can attend. Overall, the team is highly confident of completing the project successfully, and will prepare proper documentation detailing the handover process to the sponsor.


Planned Project Schedule

Charlies Angels Project Schedule MidTerms.png


Actual Project Schedule

Charlies Angels Project Schedule Finals.png

Project Metrics

For more information on how the team tabulates the score for each component, please refer to the following page on our Wiki: Charlies Angels Metrics
Team Charlies Angels is pleased to share that, on the whole, there have been no major lapses in our project metrics, and the team has managed to steer the project in the right direction despite the high technical and business complexity faced over the course of the project.


Charlies-Angels-Metrics.png

Project Risks

Due to the nature of our application, Team Charlies Angels has been proactively monitoring and mitigating risks. We would like to highlight the top two risks our project has faced since the Mid-Terms presentation; the details are found in the table below:

Charlies Angels Risk Matrix.png

Current Risks that may affect the team:

S/N: 9
Risk Type: Technology Risk
Risk Description: The recent acquisition of Yahoo by Verizon poses an imminent threat to the continued existence of the Yahoo API that the team uses to retrieve its financial information.
Likelihood: Probable
Severity: Critical
Category: Unacceptable
Mitigation: The team has identified that Yahoo would close the service only in 2017. However, in the event that the service is closed during development, the purchase of a paid API becomes absolutely necessary. Should this purchase be made, the team would propose dropping all functions scheduled thereafter and focusing on the fetching of data, as this is essentially what makes the application.

A list of websites has been provided to the client as alternative sources for the various financial data:
  1. http://www.indexq.org/index/FSSTI.php
  2. http://www.tradingeconomics.com/singapore/stock-market
  3. http://www.tradingeconomics.com/analytics/api.aspx?source=footer
  4. http://www.shareinvestor.com/fundamental/factsheet.html?counter=D05.SI
  5. http://www.shareinvestor.com/membership/plans_overview.html
  6. https://www.google.com/finance?q=SGX%3AD05&ei=X_wFWKmtMsOFuASk-o3IAQ
  7. http://www.marketwatch.com/investing/Stock/D05?countrycode=SG
  8. https://sg.uobkayhian.com/page/UOBKH/STI_ComponentStk.jsp
  9. http://www.marketwatch.com/investing/index/sti?CountryCode=sg
  10. http://www.sharesinv.com/prices/index-sti/
  11. http://www.tradingeconomics.com/singapore/government-bond-yield
  12. http://sg.morningstar.com/ap/quicktake/returns.aspx?PerformanceId=0P00006PNA&activetab=TotalReturn

S/N: 10
Risk Type: Technology Risk
Risk Description: As the application relies on web crawling, it depends heavily on the page patterns used by the host(s) we crawl data from.
Likelihood: Probable
Severity: Critical
Category: Unacceptable
Mitigation: The team has prepared a two-step mitigation plan. The first step is to find an alternative website that displays the essential information we require and to crawl from there. Should that fail, we would consult our client and purchase a paid API for financial data. Should this purchase be made, the team would propose dropping all functions scheduled thereafter and focusing on the fetching of data, as this is essentially what makes the application. (A minimal fallback sketch is shown after this table.)
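To make the fallback order described in the mitigation above concrete, here is a minimal sketch of how the escalation could be wired: try the Yahoo-based fetch first, fall back to an alternate free source, and surface a clear signal when only the paid-API option would remain. The function names, the injected fetchers, and the exception are hypothetical, and Python is used purely for illustration, as the project's implementation language is not stated on this wiki.

# Hypothetical mitigation sketch for Risks 9 and 10: try the Yahoo-based
# fetch first, fall back to crawling an alternate source, and surface a
# clear signal when only the paid-API option remains. The function names
# and injected fetchers are placeholders, not the team's actual code.

class AllSourcesDownError(Exception):
    """Raised when every free data source has failed for a symbol."""

def get_quote(symbol, fetch_yahoo, fetch_alternates):
    """Try each free source in order; escalate if all of them fail.

    fetch_yahoo / fetch_alternates are injected callables that return a
    price or raise; injecting them keeps the fallback order easy to test.
    """
    for fetch in (fetch_yahoo, fetch_alternates):
        try:
            return fetch(symbol)
        except Exception:
            continue  # move on to the next free source
    # Last resort per the mitigation plan: escalate, since purchasing a
    # paid API is a business decision rather than a code path.
    raise AllSourcesDownError(
        f"No free source could serve {symbol}; consider the paid-API fallback"
    )

In practice, fetch_yahoo and fetch_alternates would wrap the Yahoo call and a crawl of one of the twelve alternate sites listed above; keeping them injectable makes the fallback order straightforward to test.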

Technical Complexity

Business Domain Complexities

Rank: 1
Complexity: Determining Risk Measurements of Stocks
Reason: Lack of technical know-how, coupled with the fact that most typical risk measurements require analyst inputs such as Expected Returns; we were required to calculate them without such fields.
How the team overcame it: Discussed and validated with our sponsor how to determine the risk measurements for stocks.

Rank: 2
Complexity: Calculation of the Sharpe Ratio
Reason: Online sources provide Sharpe Ratio calculations, but they rely on an 'Expected Return' field; in our case, the sponsor required us to calculate the ratio without it.
How the team overcame it: Discussed and validated with our sponsor a Sharpe Ratio formula that does not use the 'Expected Return' field (an illustrative calculation follows below).
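The exact formula validated with the sponsor is not reproduced on this wiki, but a common way to compute a Sharpe Ratio without an analyst-supplied 'Expected Return' is the ex-post form, which substitutes the historical mean return for the expected return. The sketch below is a minimal illustration under assumed parameters (a 2% annual risk-free rate and 252 trading days); the function name, inputs, and the use of Python are illustrative only.

import numpy as np

def sharpe_ratio(daily_prices, risk_free_rate_annual=0.02, trading_days=252):
    """Ex-post Sharpe Ratio from historical prices.

    Hypothetical sketch: the historical mean return stands in for the
    analyst-supplied 'Expected Return' that the sponsor asked us to avoid.
    The risk-free rate and trading-day count are illustrative assumptions.
    """
    prices = np.asarray(daily_prices, dtype=float)
    daily_returns = np.diff(prices) / prices[:-1]       # simple daily returns
    daily_rf = risk_free_rate_annual / trading_days     # de-annualised risk-free rate
    excess = daily_returns - daily_rf                   # excess return over risk-free
    # Annualised Sharpe: mean excess return / volatility, scaled by sqrt(252)
    return np.sqrt(trading_days) * excess.mean() / daily_returns.std(ddof=1)

# Example usage with made-up closing prices
print(round(sharpe_ratio([3.10, 3.12, 3.08, 3.15, 3.20, 3.18, 3.25]), 2))

The example prices are made up; with real data, the stock's daily closing prices over the chosen window would be passed in.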

Front-End Technical Complexities

Rank: 1
Complexity: New UI frameworks such as React, Redux and Grommet
Reason: This was the team's first exposure to these frameworks.
How the team overcame it: As these frameworks are new, some parts of them are incomplete or broken, so the team had to submit several bug reports to the framework developers.

Backend Technical Complexities

Rank: 1
Complexity: Addition of a new backend crawler [Secondary Failover Capabilities]
Reason: Due to the volatile nature of the websites we crawl from, the team must ensure that the financial data feed does not go down, as this would render the website unusable. This is highly complex because of the large number of financial variables that must be crawled, either on real-time demand or as daily-changing values.
How the team overcame it: Worked closely with the Business Analyst to ensure that the websites crawled expose tags identified by IDs rather than an absolute element such as a td in a table, giving a minimum layer of certainty for the data being crawled (see the sketch below).
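As an illustration of the ID-based crawling with failover described above, here is a minimal sketch that selects a value by a stable element id rather than an absolute element such as a td, and falls over to a secondary source when the primary fails. The source URLs, the element id, and the use of Python with requests and BeautifulSoup are assumptions for illustration; the team's actual crawler is not shown on this wiki.

# Hypothetical failover crawler sketch. The source URLs, the element id,
# and the parsing logic are placeholders; the team's actual crawler and
# implementation language are not shown on this wiki.
import requests
from bs4 import BeautifulSoup

# Primary source first, secondary (failover) source afterwards, in the
# spirit of the alternate-source list under Project Risks.
SOURCES = [
    ("primary",   "https://example.com/quote/D05.SI"),
    ("secondary", "https://example.org/stocks/D05"),
]

def fetch_last_price(sources=SOURCES, element_id="last-price"):
    """Return (source_name, price) from the first source that responds."""
    for name, url in sources:
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            soup = BeautifulSoup(resp.text, "html.parser")
            # Select by a stable id rather than an absolute element such as
            # the n-th <td> of a table, which breaks when the layout changes.
            node = soup.find(id=element_id)
            if node and node.get_text(strip=True):
                return name, float(node.get_text(strip=True).replace(",", ""))
        except (requests.RequestException, ValueError):
            continue  # fall through to the next (failover) source
    raise RuntimeError("All crawl sources failed for id=%r" % element_id)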

Changes in Scope

Quality of product

Project Deliverables

Deployment

Testing

Reflection

Team Reflection

Individual Reflections

Supervisor Comments