
IS480 Team wiki: 2016T1 IPMAN Midterm Wiki


Latest revision as of 00:50, 17 November 2016

Team IPMAN Logo 1200x960.png



IPMAN MidTermWiki Title.png

Project Progress Summary


IPMAN Dashboard MidTerm.png

IPMAN MidTermSlides Icon.png IPMAN MidTermDeployedSite Icon.png

Note: We have deployed live on Hook Coffee's existing application. However, as Team IPMAN has signed an NDA with Hook Coffee, the above link to the deployed site is only accessible to Hook Coffee's existing administrators. To view the application, we have created a staging server; its credentials are available in the Deployment section below.


Project Highlights [in chronological order]

  • Sprint 3: Implemented Facebook Login; due to unfamiliarity with it, the team took 2 weeks to learn and implement this function
  • Sprint 5: Integrated with the QR codes implemented by our sponsors so that labels are pre-populated once a code is scanned
  • Sprint 6: Conducted our first UAT with our sponsors and packers to test the process order and customer management modules, and worked on performance and design improvements
  • Sprint 7: Decided to shift several tasks from subsequent sprints because the sponsor requested greater focus on the smart marketing and dashboard graph modules; dropped the Facebook bot and recommendation engine (planned for Sprint 12) so that we could spend more time on these two modules
  • Sprint 9: Conducted our second UAT with our sponsors and their business development in-charge, gathering feedback mainly on our smart marketing module
  • Sprint 10: Integrated Intercom with our system


Project Management


Project Status

IPMAN Progress Overview Midterm.png


IPMAN Project Scope Status.png


Project Schedule (Plan vs. Actual)

Several changes were made to the project schedule because our sponsors requested greater emphasis on the Smart Marketing and Dashboard Graph modules. IPMAN therefore dropped several tasks scheduled in subsequent sprints to focus on these changes (as reflected in the actual schedule). The team's progress is well-paced, and we are optimistic about the remaining schedule.

Planned Project Schedule

IPMAN Timeline MidTerm Planned.png


Actual Project Schedule

IPMAN Timeline MidTerm Actual.png


Project Metrics

Team IPMAN has applied the following 4 metrics to enable greater efficiency and effectiveness in project planning. Of the 4, the team introduced a new metric after Acceptance: Mean Time to Recover (MTTR). It is adapted from Mean Time to Restore Service (MTRS), a metric from the Information Technology Infrastructure Library (ITIL). MTTR measures the time the team takes to resolve issues and changes suggested by our sponsors and supervisor, on a per-sprint basis.

Below are the graphical representations of the various metrics that IPMAN has applied:

1. Sprint Velocity

IPMAN MidTerms PM SprintVelocity.png


Sprint velocity is a key metric that measures the amount of work Team IPMAN completed during each sprint. Velocity is calculated at the end of each sprint by summing the points of completed user stories; only stories completed by the end of the iteration are counted. As seen from the Sprint Velocity chart above, from Acceptance onwards (between Sprints 7 and 8), the team has completed all planned story points.
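As a minimal sketch, velocity as described above reduces to summing points over completed stories only. The story data here is illustrative, not the team's actual backlog:

```python
# Sketch: sprint velocity = sum of points from stories completed by sprint end.
# Stories are (points, completed) pairs; the numbers below are made up.

def sprint_velocity(stories):
    """Sum story points of stories marked done at the end of the sprint."""
    return sum(points for points, done in stories if done)

sprint = [(5, True), (8, False), (3, True)]  # the 8-point story slipped
print(sprint_velocity(sprint))  # 8
```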

2. Scrum Burndown Chart
The Scrum Burndown Chart is a visual measurement tool to display the amount of work completed each day by Team IPMAN against the ideal projected rate of completion for the current sprint.

Formula
Ideal: Total planned story points to be completed over number of days in a sprint
Actual: Actual story points remaining after completion of story points each day in a sprint
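The two formulas above can be sketched as follows; the sprint length and daily completions are illustrative numbers, not taken from the team's charts:

```python
# Sketch: the ideal and actual burndown lines for one sprint.

def ideal_burndown(total_points, days):
    """Remaining points each day if work burns down at a constant rate."""
    return [total_points - total_points * d / days for d in range(days + 1)]

def actual_burndown(total_points, completed_per_day):
    """Remaining points after subtracting the points completed each day."""
    remaining, line = total_points, [total_points]
    for done in completed_per_day:
        remaining -= done
        line.append(remaining)
    return line

print(ideal_burndown(20, 10)[0], ideal_burndown(20, 10)[-1])  # 20.0 0.0
print(actual_burndown(20, [0, 2, 3, 0, 5, 1, 2, 3, 2, 2]))
# [20, 20, 18, 15, 15, 10, 9, 7, 4, 2, 0]
```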

The team would like to highlight several key points for the following sprints after Acceptance, as shown below:

IPMAN PM BurndownChart 7.png


Team IPMAN did not complete one story (8 points), 'Edit Customer's Preference at the View Customer's Order Page', because the sponsors only got back to us with the requirement for the preference list on 18 August, very close to the end of the sprint. The back-end for this function was completed on 19 August, which left insufficient time to complete the front-end feature; two days had initially been planned for the front-end (the edit customer's preference page).

IPMAN PM BurndownChart 9.png


Team IPMAN managed to complete all the story points in Sprint 9; however, they were all completed in the last four days of the sprint. The team conducted UAT 2 in the first week of Sprint 9 and also implemented the changes arising from the UAT feedback. During this sprint, Team IPMAN built a dashboard summary; after the entire dashboard was completed, testing was carried out to ensure that all the data retrieved was correct. Hence, most of the story points were cleared on the last day of the sprint.

IPMAN PM BurndownChart 8.png IPMAN PM BurndownChart 10.png


Team IPMAN completed the story points at a good pace for Sprints 8 and 10. Sprint 10's story points were completed 2 days before the end of the sprint.

3. Bug Metrics

IPMAN PM BugMetrics.png



4. Mean Time to Recover (MTTR)
Formula
(Hours spent on analysing the issue + Hours spent to implement the changes) / Number of issues in the sprint

IPMAN PM MTTR Midterm.png


S/N | Sprint | Issue / Change | Description | Suggested by | Analysis (hours) | Implementation (hours) | Total hours to recover
8 | 10 | Make changes to the explanation of HookCoffee's business process | Supervisor suggested that the team paint a detailed scenario for the audience so that it is more realistic | Supervisor | 1 | 1 | 2
9 | 10 | Make changes to the technical complexity with examples so that the audience can follow through | Supervisor suggested we include screenshots of the various pages and keep our explanation simple | Supervisor | 0.5 | 1 | 1.5
10 | 10 | Choose a few highlights to present for Project Management | Supervisor mentioned that we do not have time to present everything, so we should focus on only 2-3 main points | Supervisor | 1 | 1 | 2
11 | 10 | Sponsor requested a pop-up modal to appear after scanning the QR code for a 'special gift' | The pop-up modal only appeared if the process code was entered manually instead of via the QR scanner; the QR code field was not taken into account | Supervisor | 0.25 | 1 | 1.25
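As a worked example of the formula, the sprint 10 rows in the table above give an MTTR of 1.6875 hours per issue:

```python
# Sketch: MTTR for one sprint, per the formula above.
# Each issue is (analysis_hours, implementation_hours), taken from the
# sprint 10 entries in the MTTR table.

def mttr(issues):
    """(total analysis hours + total implementation hours) / number of issues."""
    total_hours = sum(analysis + implement for analysis, implement in issues)
    return total_hours / len(issues)

sprint_10 = [(1, 1), (0.5, 1), (1, 1), (0.25, 1)]
print(mttr(sprint_10))  # 1.6875
```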

For further details on other sprints, refer to the MTTR log on the team's metrics page.


Project Risks

IPMAN RiskManagement Chart MidTerm.png

From Acceptance till Mid-Terms, we would like to highlight that our main concerns were 1) Technical Risk and 2) Stakeholder Management Risk, as described below:

S/N | Risk Type | Risk Event | Likelihood | Impact Level | Category | Strategy Adopted | Actions
9 | Technical Risk | Team IPMAN was unable to deploy as soon as bugs were resolved. Whenever the sponsors reported bugs, the team resolved them immediately, but could not deploy the fixes because the team was still building the features in the current sprint and would only deploy the application at the end of the sprint, when all features were completed. | High | High | A | Mitigate | Team IPMAN created a separate deployment branch, so there are now 2 branches: one for deployment and one for working on the current features. Whenever the sponsor raises a bug, the team can now resolve it and deploy immediately, without waiting for the features to be completed at the end of the sprint.
10 | Stakeholder Management Risk | Team IPMAN had to manage not only the sponsors but also the developer team that the sponsors engaged. For instance, the freelance developers used Intercom to track customers' interactions on the application, and this feature affected the Smart Marketing features we built. | High | High | A | Mitigate | Team IPMAN communicated with the sponsors early to find out more about the Intercom implementation, met the sponsor regularly to check on Intercom's progress, and followed a communication protocol between the sponsors, the developer team and Team IPMAN, using Trello and Telegram for communications and updates.



Change Management

Below is the change log for Sprints 8 to 10 (after Acceptance):

IPMAN MidTerm ChangeLog.png


Technical Complexity

1. Comparison Using Hashes

HookCoffee is a subscription service that ships coffee to its users, so addresses and labels have to be printed for each order. However, generating these labels/addresses tends to be slow, on the order of minutes (and grows worse as the number of labels increases). As you can see, the previous system was not designed to handle a large number of orders (it had multiple print-label buttons). So, how do we make it fast?

We experimented with different libraries (pdfrw and PyPDF2) and found that the slowness was inherent to PDF generation itself, so we traded space for time, i.e. we generate the labels as soon as the orders are created, well before the orders are processed.

For this, we implemented a task scheduler, Celery, to create tasks that pre-generate the labels, and we deployed the task scheduler on their live server. However, another issue arose: orders can change at any time, and in many different ways. There are too many endpoints through which an order can be changed (e.g. the customer's name, the customer's address, the order details, directly through the database, the Django Admin, etc.). The problem is further exacerbated by foreign key dependencies, so any changes in these tables would also have to be reflected in the pre-generated labels. The order table is shown below:

IPMAN TC ComparisonwHash Code.png


In such a complex situation, how do we ensure the pre-generated labels are up to date?

The solution: Hashing

We convert each order to a string and pass it to a hash function, MD5, which generates a fixed-length digest that is effectively unique to the order (since it is derived from every detail in the order). We are also able to attach relevant metadata.

IPMAN TC ComparisonwHash Code metadata.png


For example, if a customer ordered Guji Liya today, it would produce an order label with the file name

<order_id>_99464c1699fd5bc192b88a46fafcd329.pdf.


Let's assume Guji Liya is no longer in stock, and the packers at HookCoffee therefore changed the order to NeverEverLand.

IPMAN TC ComparisonwHash Code example.png


The system hashes the order again (now f7e3f23927d8e2921e52aefce5dbf544), finds a mismatch between the hashes, and hence generates a new label for the order.
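The staleness check can be sketched as follows. The field names and digest values here are illustrative, not Hook Coffee's actual schema:

```python
# Sketch of the hash comparison: an order is serialized to a string, MD5-hashed,
# and the digest is embedded in the label file name. A changed order produces a
# different digest, which signals that the pre-generated label is stale.
import hashlib

def order_digest(order):
    """Serialize the order deterministically and return its MD5 hex digest."""
    serialized = "|".join(f"{key}={order[key]}" for key in sorted(order))
    return hashlib.md5(serialized.encode("utf-8")).hexdigest()

def label_filename(order):
    return f"{order['id']}_{order_digest(order)}.pdf"

order = {"id": 42, "customer": "Alice", "coffee": "Guji Liya"}
cached = label_filename(order)           # name of the pre-generated label

order["coffee"] = "NeverEverLand"        # the packers swap the coffee
stale = label_filename(order) != cached  # digest mismatch -> regenerate label
print(stale)  # True
```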


2. Fortnightly Live Deployment

We scheduled live deployments at the end of every sprint (each lasting two weeks) to ensure that our sponsor gets new features and bug fixes frequently. Since we are working alongside other freelance developers (who are building different features) on the same application, ensuring a bug-free deployment can become complicated, so communication with our sponsor and the freelance developers, as well as handling of the code, must be carefully managed. We have also managed to integrate a task scheduler middleware (Celery) on top of the architecture.



Quality of Product

1. Software Architecture Redesign

Code written by the previous freelancers was tightly coupled and not reusable, making it cumbersome to reuse the same logic at other endpoints. Minor changes to the code also frequently broke other features.

The diagram below depicts the architecture prior to the introduction of KOPIKaki.

IPMAN TC MVC Before.png


The code snippet below depicts an example whereby the request is used at multiple points, which makes it tightly coupled.

IPMAN TC MVC Before Code.png



To prevent the continuation of these issues, we built a React frontend that makes AJAX calls to RESTful Django endpoints.

IPMAN TC MVC After.png


These endpoints are modularized and reusable, allowing different 'views' to call the same endpoints, thereby reducing lines of code and improving maintainability. The decoupling also allows our sponsor to potentially create new views (e.g. iOS, Android) without rewriting backend code. This sets up a good architecture for future freelancers to build upon.

Below is an example of how it has been modularized:

IPMAN TC MVC After methods.png


To help facilitate this, we are also actively maintaining a set of API documentation that we can hand over.

IPMAN TC MVC RestEndpoint.png


IPMAN TC MVC RestEndpoint orders.png
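The decoupling idea above can be sketched in a framework-free way: business logic lives in one reusable function, and thin endpoint wrappers (plain functions standing in for Django views here) merely serialize its result. The names and data are illustrative, not the actual KOPIKaki code:

```python
# Sketch: one reusable query function serving any number of 'views'.
import json

ORDERS = [
    {"id": 1, "status": "shipped"},
    {"id": 2, "status": "pending"},
]

def get_orders(status=None):
    """Single source of truth: every view calls this instead of duplicating queries."""
    if status is None:
        return ORDERS
    return [order for order in ORDERS if order["status"] == status]

def orders_endpoint(params):
    """A REST-style endpoint: any frontend (React, iOS, Android) gets the same JSON."""
    return json.dumps(get_orders(params.get("status")))

print(orders_endpoint({"status": "pending"}))  # [{"id": 2, "status": "pending"}]
```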


2. Use of Adapter Pattern

In order to communicate with Mailchimp services from our web application, we have to make calls to Mailchimp's RESTful API. However, we do not want to call these APIs directly from our controllers, as API calls that fetch similar data would then be repeated.

How do we ensure that our main application can reuse the code that calls these RESTful services, as well as protect our main application from changes to Mailchimp’s API?

We solve this by using the Adapter Pattern. The high-level overview is shown below:

IPMAN TC Adapter.png


Using the above as an example, we delegate the responsibility of calling MailChimp's RESTful API to the module mailchimp_api.py. In doing so, we centralize all of the Mailchimp-related code: if Mailchimp changes their API interface (e.g. adds a new field), only one module (the adapter, mailchimp_api.py) is affected, and we would only have to make changes there. Below is an example of the code from the mailchimp_api.py module.

IPMAN TC Adapter PseudoCode.png



When the actual call is made from the browser to our application, the resulting call graph is shown below. Because all the methods in our app call Mailchimp's RESTful services through the mailchimp_api.py module, we reduce change effort by reducing coupling: the only module impacted by a change in Mailchimp's API would be mailchimp_api.py. This can be seen from a modified call graph generated using pycallgraph:

IPMAN TC Adapter CallGraph.png
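The adapter idea can be sketched as below. The endpoint path, field names, and the injected `http_get` transport are illustrative assumptions for the sketch, not Mailchimp's actual interface or the team's real mailchimp_api.py:

```python
# Sketch of the Adapter pattern: all Mailchimp-specific details are confined to
# one adapter; the rest of the app talks to the adapter, never to the raw API.

class MailchimpAdapter:
    def __init__(self, http_get):
        # http_get(path) -> dict; injected so the app (and tests) do not
        # depend on the real network layer.
        self._get = http_get

    def campaign_open_rate(self, campaign_id):
        # If Mailchimp renames a field, only this method needs to change.
        report = self._get(f"/reports/{campaign_id}")
        return report["opens"] / report["emails_sent"]

# A stub transport standing in for the real HTTP client:
fake_api = {"/reports/abc123": {"opens": 40, "emails_sent": 200}}
adapter = MailchimpAdapter(lambda path: fake_api[path])
print(adapter.campaign_open_rate("abc123"))  # 0.2
```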



3. Decorator Pattern: Capturing MailChimp Statistics

Hookcoffee extensively uses email marketing to push its products. However, while Mailchimp can track who has opened an email and who has clicked through for any given campaign, the prospective customer's activities on the website are invisible to the owners.

Mailchimp provides an eCommerce tracking option that lets web developers record the revenue generated from their campaigns, but it fails to provide granularity in the sales funnel. For instance, did the customer register their interest on the site? Did they make a purchase, or engage with the campaign in some other positive way (such as switching to a different type of coffee)?

Previously, what happened at that level was invisible to HookCoffee. By building on top of the eCommerce function and interacting with Mailchimp's RESTful APIs, we are able to determine who the user is and which campaign they came from. We also record in the local database what the user did AFTER they clicked through the email campaign.

However, an issue arose when we found that Mailchimp stores the critical click-through details as parameters on the request itself, as shown below.

IPMAN TC MailChimpStats Request.png


This meant that, for every endpoint, we have to capture these parameters and store them in the session, as shown below.

IPMAN TC MailChimpStats Flow.png


The naïve method is to go into every single URL view endpoint and add a method to capture such data, but this proved impractical due to the large number of possible endpoints. Furthermore, as the project went on, we could not guarantee that all these endpoints would keep capturing the data, since other freelancers are also working on the same website.

We used the Decorator pattern to handle this. A decorator adds functionality without affecting current implementations; think of it as a plugin for a web browser.

IPMAN TC MailChimpStats Decorator.png


A plugin exists to bind decorators to a urls.py file; however, there was no indication of how to do it within the same file. We dug deeper and realized that urls.py is automatically exported as a list in Django. We now store all URL patterns as a list of unwrapped (undecorated) URL patterns, then hook our own decorator (processMailchimpParams, which takes a request, stores the Mailchimp campaign id, and resolves the email address if it exists) onto the list of unwrapped URL patterns, causing it to execute for every endpoint (with the exception of static pages, e.g. images). This gives an easy-to-use solution that is easily extensible by other developers on the team, as modifications are made in the same file.

IPMAN TC MailChimpStats URLPatterns.png
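The wrapping step can be sketched without Django: here `urlpatterns` is a plain list of (path, view) pairs and the request is a dict, and the `mc_cid` parameter name is an illustrative assumption, not necessarily the exact parameter the team captured:

```python
# Sketch: decorating every URL pattern in one pass, so the capture logic
# runs for every endpoint without editing each view.

def process_mailchimp_params(view):
    def wrapped(request):
        # Capture the Mailchimp click-through parameter into the session
        # before the view runs.
        if "mc_cid" in request["params"]:
            request["session"]["mc_cid"] = request["params"]["mc_cid"]
        return view(request)
    return wrapped

def home(request):
    return "home"

def shop(request):
    return "shop"

# Unwrapped patterns first, then one pass to decorate them all:
urlpatterns = [("/", home), ("/shop", shop)]
urlpatterns = [(path, process_mailchimp_params(view)) for path, view in urlpatterns]

request = {"params": {"mc_cid": "abc123"}, "session": {}}
print(urlpatterns[1][1](request))    # shop
print(request["session"]["mc_cid"])  # abc123
```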





Intermediate Deliverables

  • Project Management: Project Schedule, User Stories, Minutes, Metrics, Risk Management, Change Management
  • Project Overview: Project Overview, Team's Motivation, Project Scope
  • Project Documentation: Personas & Scenarios, Diagrams, Technologies Implemented, Low & Mid Fidelity Prototypes
  • Testing: Testing Documentation


Deployment

Note: Facebook login will not work, as it is tied to the hostname hookcoffee.com.sg.
To view the application, visit: http://128.199.107.26/manager
Username: temp@supertemp.temp
Password: supersecurepassword


Testing

Internal Testing

For easier reference to the testing done on our system, the team has come up with the step-by-step process shown below:

IPMAN InternalTesting Process.png

For a more detailed explanation on Team IPMAN's internal testing methodology, view our testing page as linked in the Intermediate Deliverables component.


User Acceptance Test 1 & 2

Team IPMAN has conducted 2 user tests, which allowed us to better manage sponsor expectations and improve the usability of our application interface.

IPMAN Midterm Testing Overview.png

For a more detailed version of Team IPMAN's user acceptance test results, access them here:

IPMAN UserTesting1 Icon.png     IPMAN UserTesting2 Icon.png



Reflection


Team Reflection

This journey has proven to be an enriching learning experience for Team IPMAN. We learnt the importance of good stakeholder management, which allows us to react better to unforeseen circumstances. We have also learnt that communication, adaptability and active team participation accelerate the process of mitigating the situations at hand.

Sponsors' Testimonial (Ernest & Faye)

IPMAN HookCoffee Ernest.png IPMAN HookCoffee Faye.png

"Team IPMAN produces work of great quality, values constant communication and provides regular updates (every 2 weeks) on the progress of product development. The team is dedicated, possess strong technical capabilities and is willing and flexible with requests to improve the user experience design on the product."

Individual Reflections

IPMAN MidTerm Reflections.png


