
IS480 Team wiki: 2013T2 SkyTeam Midterm Product Quality




Project Deliverables

Project Aspect       | Document Type               | Document Link
Project Management   | Minutes                     | Internal Meetings, Supervisor Meetings, Sponsor Meetings
Project Management   | Metrics                     | Bug Metrics, Schedule Metrics
Project Requirements | Prototypes                  | Lo-fi, Hi-fi
Project Requirements | Business Requirements       | [Description]
Design               | Use Case Diagram            | [Use Case]
Design               | System Architecture Diagram | [System Architecture]
Testing              | Test Plan                   | [Plan]
Testing              | User Testing                | [Testing]

Technical Complexity

RISK CALCULATION

There are two input parameters in the risk calculation: hazard level and vulnerability index. The overall risk of a specific building is the product of the two:

Total risk = V × H

where:
V = total vulnerability index of the building
H = hazard level of the region where the building is located
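The formula above can be sketched as a one-line function; the parameter names are illustrative, not the team's actual code.

```python
def total_risk(vulnerability_index: float, hazard_level: float) -> float:
    """Overall risk of a building: Total risk = V * H."""
    return vulnerability_index * hazard_level

# Example: a building with vulnerability index 2.0 in a region
# with hazard level 3.0 has a total risk of 6.0.
print(total_risk(2.0, 3.0))
```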

1.1 Hazard
Hazard level represents the probability that an unexpected event occurs within the region where the targeted building is located. Within the scope of this project, three types of hazard were taken into consideration: flood, fire, and earthquake. These values are provided by the sponsor in the form of a hazard map.

ST System1.jpg

1.2 Vulnerability
Vulnerability index depends on five characteristics of a building: building type, building age, foundation type, masonry type, and roof material. These values are provided by the end users (insurance companies) when they upload their data directly on the website.
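A minimal sketch of scoring the five characteristics into a single index. The per-characteristic scores and categories below are hypothetical placeholders; the real scoring scheme comes from the sponsor and is not described in this wiki.

```python
# Hypothetical scores per characteristic value (NOT the sponsor's actual scheme).
SCORES = {
    "building_type":   {"residential": 0.2, "industrial": 0.5},
    "building_age":    {"pre-1980": 0.6, "post-1980": 0.3},
    "foundation_type": {"shallow": 0.5, "deep": 0.2},
    "masonry_type":    {"unreinforced": 0.7, "reinforced": 0.3},
    "roof_material":   {"tile": 0.4, "metal": 0.2},
}

def vulnerability_index(building: dict) -> float:
    """Sum the scores of the building's five characteristics."""
    return sum(SCORES[attr][value] for attr, value in building.items())

building = {
    "building_type": "residential",
    "building_age": "pre-1980",
    "foundation_type": "deep",
    "masonry_type": "reinforced",
    "roof_material": "metal",
}
print(round(vulnerability_index(building), 2))  # 0.2+0.6+0.2+0.3+0.2 = 1.5
```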

ST System2.jpg

HISTORICAL ANALYSIS

As part of the application's requirements, we needed to implement historical analysis of the hazard levels of different regions in Malaysia across the years. Specifically, the application user can view the year-to-year movement of the three kinds of hazard risk. Our team decided to use the Google Visualization API to develop our motion chart. During this process we had difficulty integrating the API's methods with our real data, which is fetched dynamically from Google Fusion Tables in the form of KML. To resolve this, we had to study the library and API in depth so that we could adapt our own data to the format the chart expects.
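The data-adaptation step described above can be sketched as flattening KML placemarks into tabular rows that a motion chart can consume. The placemark schema below (region name plus year and hazard values in ExtendedData) is a guess at the shape of such a feed, not the team's actual Fusion Tables export.

```python
import xml.etree.ElementTree as ET

# Illustrative KML fragment; the real data comes from Google Fusion Tables.
KML = """<?xml version="1.0"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <Placemark>
      <name>Selangor</name>
      <ExtendedData>
        <Data name="year"><value>2012</value></Data>
        <Data name="flood"><value>0.7</value></Data>
      </ExtendedData>
    </Placemark>
  </Document>
</kml>"""

NS = {"kml": "http://www.opengis.net/kml/2.2"}

def placemark_rows(kml_text):
    """Flatten KML placemarks into (region, year, hazard, level) tuples
    suitable for feeding a motion chart's data table."""
    root = ET.fromstring(kml_text)
    rows = []
    for pm in root.iter("{http://www.opengis.net/kml/2.2}Placemark"):
        region = pm.find("kml:name", NS).text
        data = {d.get("name"): d.find("kml:value", NS).text
                for d in pm.iter("{http://www.opengis.net/kml/2.2}Data")}
        year = int(data.pop("year"))
        for hazard, level in data.items():
            rows.append((region, year, hazard, float(level)))
    return rows

print(placemark_rows(KML))  # [('Selangor', 2012, 'flood', 0.7)]
```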