
IS480 Team wiki: 2016T1 Folium Midterm Wiki

Team Folium Logo1.jpg



Project Progress Summary

Midterm Presentation Slides
Deployed Site
Team Folium Deployment.jpg



Deployment Progress

  • Current Iteration: Iteration 10
  • Iteration period: 25 September 2016 to 9 October 2016
  • Major milestone: Midterm Presentation

Project Highlights

  • The new-generation system is built on the legacy system. The team is optimizing the old system's grading and system architecture and developing new features with greater capabilities for system users.
  • The system has gone live and has been deployed 3 times during the semester. Over this period, students have submitted 4 lab assignments to the server, and the system has processed 3,177 submissions.
  • The team has ensured system stability, reliability and usability by conducting 5 tests: 2 sponsor user tests, 2 student user tests and 1 live test.
  • The team has created a feedback channel with live system users to enhance the system.

Project Management

Project Status

Folium progress for midterm.jpg

Folium MIdterm Status.PNG

Planned vs Actual Scope

Planned: Project Scope Acceptance.jpg
Actual: TeamFoliumProject Scope.jpg

Major changes made

# | Changes | Iteration | Module | Description | Reason | Status
1 | Allow students to submit files concurrently (+) | 7 | Submission Management | In the previous version, students could not submit a second solution while an existing solution was still in the submission queue awaiting grading. In the current version, students can submit multiple files to the queue: for example, if lab2a.rb is still in the queue, a student can submit lab2b.rb and see both submissions in the queue. | Supervisor suggested adding this capability. | Finished; sponsors notified.
2 | Allow admin to manage grading-machine IP addresses and set the database username & password in the admin interface (+) | 8 | Administration Module | Allows the admin to add grading machines and set the database username and password from the admin interface. | Supervisor requested managing the database account and grading machines in the admin interface. | Finished; sponsors notified.
3 | Enable the grading machine to auto-generate and send an email to the admin in case of unforeseen circumstances (+) | 9 | Submission Management | When the grader exhibits undesired behavior, the admin is notified with an auto-generated email. | Sponsor wants to be notified when a grader shows any undesired behavior. | Development in progress.
4 | Optimize the scoreboard ranking algorithm (+) | 9 | Ranking Module | Implement a new scoreboard ranking algorithm that lets the admin rank by total score, computed from time, quality score and score index. | Sponsor suggested ranking students' submissions based on different quality-score and time-score index values. | Development in progress.
5 | Retrieve historical data (-) | 9 | Administration Module | Allows the admin to retrieve historical submission data, including the username, question and submission file of a specific submission. | This feature is similar to the earlier feature "Create central repository for all submissions"; after discussing with sponsors, we dropped it. | Settled with sponsors.
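Change #4's ranking could be sketched as a comparator over a weighted total score. The record fields, the weighted-sum formula and the index weights below are assumptions made only for illustration; the wiki states just that the total is computed from time, quality score and score index.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Hypothetical submission record; the field names are assumptions.
class RankedSubmission {
    final String student;
    final double qualityScore;   // grader-assigned quality score
    final double timeScore;      // score derived from submission time
    RankedSubmission(String student, double qualityScore, double timeScore) {
        this.student = student;
        this.qualityScore = qualityScore;
        this.timeScore = timeScore;
    }
    // Weighted sum; in the real system the index weights are admin-set,
    // here they are passed in only for the sketch.
    double totalScore(double qualityIndex, double timeIndex) {
        return qualityScore * qualityIndex + timeScore * timeIndex;
    }
}

public class Scoreboard {
    // Rank submissions by descending total score.
    public static List<RankedSubmission> rank(List<RankedSubmission> subs,
                                              double qualityIndex, double timeIndex) {
        List<RankedSubmission> sorted = new ArrayList<>(subs);
        sorted.sort(Comparator.comparingDouble(
                (RankedSubmission s) -> s.totalScore(qualityIndex, timeIndex)).reversed());
        return sorted;
    }
}
```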

Planned vs Actual Project Schedule

Folium MidTerm Planned Project Schedule copy.jpg


Folium Mid Term Actual Project Schedule copy.jpg

Project Metrics

Schedule Metrics

View our Schedule Metrics Here!

Folium Task Metric Score.png

Schedule Metrics Highlights
Iteration | Planned Tasks | Actual Tasks | Schedule Metric Score | Action | Status
3 | 4 | 3 | 75% | Estimates were generally accurate and on track. The iteration was slightly delayed because the team did not fully understand the feature related to the command-line prompt. After further study, it turned out not to be a bug that needed fixing, and sponsors were updated with the result. Follow-up action: the feature was pushed to the next iteration but removed afterwards, so no buffer day was used. | Completed
5 | 3 | 2 | 67% | Estimates were optimistic about the test-case feature; the iteration was delayed because its complexity was underestimated. The team spent time studying the grading architecture and the principle behind the feature. Follow-up action: the feature was extended into the next iteration to finish; buffer days were used. | Completed
6 | 4 | 3 | 75% | The same test-case feature again failed to finish and dragged on until iteration 7. Follow-up action: another developer was assigned to pair-program and speed up development; the team finished the feature in iteration 7 with no buffer day spent. | Completed

Bug Metrics

View our Bug Metrics here!

Bug Count
TeamFolium Bug Count.png
Bug Score
TeamFolium Bug Score.png

Project Risks

Risk Type | Risk Event | Likelihood | Impact | Mitigation Strategy | Status
Sponsor Management | Sponsors may change functional requirements, which may affect our overall schedule. | Medium | High | Adjust our timeline and requirements accordingly; perform regression testing. | Ongoing
Project Management | We collect students' feedback; if we encounter any critical bug, we must stop ongoing work and debug so that students can use the system smoothly. | Medium | High | Decide which part to focus on and debug first; ensure students' feedback is responded to within 24 hours. | Ongoing
Technical | The team has no experience with the new technologies that will be used in the project. | High | Medium | Dedicate time and effort to research. | Ongoing
Technical | The system crashes or files are not backed up. | Low | High | Ensure all machines have the latest version; always update code to the latest versions locally and in the repository. | Ongoing
Project Management | The team needs to reshuffle project features based on sponsors' requests after the Acceptance milestone. | High | High | Reschedule the project features and reallocate developers to each task. | Overcome

Technical Complexity

System Architecture

System Architecture Red.png

Explanation
1. Users submit files via the web server, which uploads and saves the submissions to the database.
2. The grading application checks the database every 30 seconds for a new submission:

<code>
Executors.newSingleThreadScheduledExecutor().scheduleAtFixedRate(waitForSubmission, 0, 30, TimeUnit.SECONDS);
</code>

3. If the grading application finds a new submission, it evaluates and grades it, then saves the results to the database.
4. The web server retrieves the results from the database and displays them to the users.
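The steps above can be sketched as the polling task that the scheduler runs every 30 seconds. The SubmissionDao and Grader interfaces below are stand-ins invented for this sketch, not the team's actual classes.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class GraderLoop {
    // Hypothetical data-access and grading abstractions (assumptions).
    interface SubmissionDao {
        String fetchNewSubmission();              // null when the queue is empty
        void saveResult(String submission, int score);
    }
    interface Grader {
        int grade(String submission);             // evaluate and score a file
    }

    // The task the scheduler runs: one poll of the database.
    public static Runnable waitForSubmission(SubmissionDao dao, Grader grader) {
        return () -> {
            // Step 2: check the database for a new submission.
            String submission = dao.fetchNewSubmission();
            if (submission != null) {
                // Step 3: grade it and persist the result for the web server.
                int score = grader.grade(submission);
                dao.saveResult(submission, score);
            }
        };
    }

    public static ScheduledExecutorService start(SubmissionDao dao, Grader grader) {
        ScheduledExecutorService exec = Executors.newSingleThreadScheduledExecutor();
        // Same schedule as the snippet above: poll every 30 seconds.
        exec.scheduleAtFixedRate(waitForSubmission(dao, grader), 0, 30, TimeUnit.SECONDS);
        return exec;
    }
}
```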

Grading Process

Grading Process.png

Optimizing Grading Application

1. Make Grading Application Independent of Tomcat
The grading application is independent of the client side, so there is no need to host it on Tomcat; we converted it into a command-line Java application.
2. Notify Admin by Email when Grading App Shuts Down Accidentally

<code>
Runtime.getRuntime().addShutdownHook(new Thread() {
      @Override
      public void run() {
            // Stop polling for new submissions before exiting.
            QueueManager.getInstance().cancelWaitForSubmission();
            System.out.println();
            System.out.println("Status: Sending Notification Email.");
            // Alert the admin that the grader is going down.
            MailDAO.getInstance().mailShutDownNotification();
            System.out.println("Status: Grader has been shut down.");
      }
});
</code>

Since the admin would otherwise have no way of knowing when the grading application shuts down accidentally, we improved it to notify the admin by email when it is shutting down. We achieve this by attaching a shutdown hook to the JVM, which provides a graceful exit.
3. Load Testing Using Visual Studio
Before deploying the system, we performed load testing with Visual Studio to verify the stability and capacity of the grading application.
We simulated 186 users submitting at the same time, and the system handled all submissions.
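The team ran the load test in Visual Studio; purely as an illustration of the scenario, a synchronized burst of concurrent submissions can also be sketched in Java. The submit handler here is a stand-in, not the real upload endpoint.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class LoadSim {
    // Fire `users` simulated submissions at the same instant and count how
    // many the handler accepts.
    public static int simulate(int users, Runnable submit) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        CountDownLatch startGate = new CountDownLatch(1);
        CountDownLatch done = new CountDownLatch(users);
        AtomicInteger handled = new AtomicInteger();
        for (int i = 0; i < users; i++) {
            pool.submit(() -> {
                try {
                    startGate.await();          // all threads release together
                    submit.run();
                    handled.incrementAndGet();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                } finally {
                    done.countDown();
                }
            });
        }
        startGate.countDown();                  // start the burst
        done.await();
        pool.shutdown();
        return handled.get();
    }
}
```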

Security

1. Start of the Grading Application
We enhanced the security of the grading application by requiring the admin to key in the correct database username and password when starting it. If the JDBC driver cannot obtain a connection to the database with the entered username and password, the grading application will not start. This is secure because the correct username and password are neither hard-coded nor stored locally with the grading application. In addition, even if the username and password are correct, the grading application will not start if the IP address of the grader machine is not registered.
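A minimal sketch of such a startup check, assuming a plain JDBC connection test; the JDBC URL and class names are placeholders, not the team's actual code.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class GraderStartup {
    // Refuse to start unless the admin-supplied credentials open a real
    // database connection. Nothing is hard-coded or stored locally.
    public static boolean canStart(String jdbcUrl, String username, String password) {
        try (Connection conn = DriverManager.getConnection(jdbcUrl, username, password)) {
            return conn.isValid(2);   // 2-second validity check
        } catch (SQLException e) {
            // Wrong credentials or unreachable database: do not start.
            return false;
        }
    }
}
```

The IP-registration check described above would be a second gate after this one, comparing the local machine's address against the registered grader list in the database.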
2. Prevent Users from Submitting after the Deadline via HTTP POST Requests

Disable Submission.png
<code>
// Reject the upload if the current time is past the deadline.
if (current.compareTo(deadline) > 0) {
      sendError(request, response, "Submission is not allowed at the moment. Please contact help desk.");
      return;
}
</code>

After the deadline, the "Choose Files" button is disabled so users cannot upload submissions via the interface. However, users may still try to submit via a raw HTTP POST request. To prevent this, we compare the current time with the deadline while processing the HTTP request; if it is past the deadline, the web application sends an HTTP response with an error message.

New Technologies

Learn & Use JavaScript & jQuery

NewTech.png
JQ.png

Flexibility

Configurable Settings via XML & Properties File
1. IP Address & URL of Database
2. System Email Address & Password
3. System Email Content
4. Admin Email Address
5. Path of Main, Grader & Required Folder
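Settings like these are typically read with java.util.Properties. A minimal sketch follows; the key names are assumptions that mirror the list above, not the team's actual configuration keys.

```java
import java.io.IOException;
import java.io.Reader;
import java.util.Properties;

public class GraderConfig {
    // Load externalized settings so the grader can be reconfigured
    // (database URL, email account, folder paths) without recompiling.
    public static Properties load(Reader source) throws IOException {
        Properties props = new Properties();
        props.load(source);
        return props;
    }
}
```

A sample file for this sketch might contain lines such as `db.url=...`, `admin.email=...` and `grader.folder=...`, retrieved afterwards with `props.getProperty("db.url")`.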

Quality Of Product

Immediate Deliverables

Stage Specification Modules
Project Management Meeting Minutes Internal, Supervisor & Sponsor Meeting Minutes
Project Schedule Project Schedule
Metrics Project Metrics
Risk Management Risk Management
Change Management Change Management
Requirements Project Scope Project Scope
Analysis Use Case Use Case
System Architecture System Architecture
Technology Interactions Technology Interactions
Operated Technologies Operated Technologies
Design Prototypes Design & Prototypes
Testing User Test Plan & Results User Test Plan & Results

Deployment

View our Deployment here!

Deployed Site


Team Folium Deployment.jpg






Objective

  • A new version of the server is deployed each time before a new lab is released.
  • The new version includes features finished under the iteration schedule and fixes for bugs reported by students via the feedback channel.

Action Taken

  • Deploy the new WAR file to the server
  • Publish an announcement to notify students of the changes

Feedback Channel

  • teamis103@gmail.com
# | Date & Time | Deployed for lab assignment
1 | 22 AUG 2016, 11:30pm | Lab 1
2 | 28 AUG 2016, 11:30pm | Lab 2
3 | 10 SEP 2016, 11:30pm | Lab 3 & Lab 4

Testing

# | Test | Level | Total Users | Objectives
1 | User Test | Sponsor Level | 2
  • Gather the sponsor's feedback on the heuristics and design of the existing functions of the current application.
  • Identify any usability issues that persist in the application.
  • Improve our application based on the results.
  • Manage expectations.
2 | User Test | Sponsor Level | 2
  • Gather the sponsor's feedback on the heuristics and design of the existing functions of the current application.
  • Identify any usability issues that persist in the application.
  • Improve our application based on the results.
  • Manage expectations.
3 | User Test | Student Level | 14
  • Users should be able to complete task #1 without guidance from a test facilitator.
  • Users should find the system useful for code evaluation in terms of quality and time taken.
  • Users should find the functionality sufficient for the task.
  • Participants should find it easy to submit assignments and manage their submissions.
4 | Live Test | Student Level | 187
  • Maintain the system's stability and flexibility in processing students' submissions after go-live.
  • Test corner cases that could give rise to undesired system behavior and compromise user experience.
  • Gather actual users' feedback on system design and functionality.
5 | User Test | Student Level | 170
  • Assess the overall practicality of the functions currently implemented in the system, based on their usage frequency.
  • Assess the clarity of the system and error messages currently displayed in getting users to explore the system.
  • Gather actual users' (key stakeholders') overall feedback on system design and functionality.

For test results, access them here:

SUT 1.jpg SUT 2.jpg SUT 3 New.jpg SLT 1 New Latest.jpg SUT-4.jpg


Reflection

Team Reflection

  • Quality assurance is an essential aspect of the software development process.
  • Readily available technical support and continuous system maintenance after go-live are essential to enhancing customer experience.
  • Critical stakeholders' feedback is an effective way to improve system performance and usability.

Individual Reflection

Folium mid term learning outcome.jpg