ANLY482 AY2017-18T2 Group14 Final



Conclusion & Learning Points

Although datasets may come from the same source, the underlying data can still differ slightly from one another. When performing data standardization and cleaning, it is unwise to assume that a dataset from a familiar source is already clean and free of problems. In data analytics work, there is still a need to explore and fully understand the data.
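The report does not document the exact checks we ran, but a minimal sketch of this idea in Python, using pandas and hypothetical file and column names, would be to profile each file before assuming the files share the same structure:

import pandas as pd

# Hypothetical filenames used for illustration only.
files = ["shipments_jan.csv", "shipments_feb.csv", "shipments_mar.csv"]

# Profile each file instead of assuming a common schema.
profiles = {}
for path in files:
    df = pd.read_csv(path)
    profiles[path] = {
        "columns": list(df.columns),
        "dtypes": df.dtypes.astype(str).to_dict(),
        "null_counts": df.isna().sum().to_dict(),
        "row_count": len(df),
    }

# Flag files whose columns differ from the first file, so that
# standardization rules are written per file rather than assumed.
baseline = profiles[files[0]]["columns"]
for path, info in profiles.items():
    if info["columns"] != baseline:
        print(f"{path}: column mismatch -> {info['columns']}")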
From this project, we learnt about the complexity of handling large datasets (standardization, insertion, storage, database performance, retrieval and cleaning). This is the first project in which we handled datasets of this scale; in previous projects we had never encountered a dataset larger than 2 GB. This required us not only to get the database, report or dashboard to produce the intended results, but also to ensure that it runs within a reasonable amount of time.
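As a rough illustration of the kind of loading and performance work this involved (not the project's actual pipeline, and with a hypothetical connection string, file name and column), a large CSV can be inserted in chunks so memory stays bounded, and the columns a dashboard filters on can be indexed so retrieval stays fast:

import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical connection string, file and table names for illustration.
engine = create_engine("postgresql://user:password@localhost:5432/analytics")

# Insert the CSV in 100,000-row chunks so the whole file never sits in memory.
for chunk in pd.read_csv("transactions.csv", chunksize=100_000):
    chunk.to_sql("transactions", engine, if_exists="append", index=False)

# Index the column the dashboard filters on, so queries return
# within a reasonable amount of time.
with engine.begin() as conn:
    conn.execute(text(
        "CREATE INDEX IF NOT EXISTS idx_transactions_date "
        "ON transactions (transaction_date)"
    ))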
Over the course of the project, we were confronted with the reality of communication gaps between IT and business users, as we sometimes struggled to convey the value of our work to our sponsor, who comes from a non-technical background. We realised that without the ability to translate technical information into terms non-technical personnel can understand, all the work done would be in vain. As Analytics graduates in training, we need to demonstrate our multidisciplinary understanding and apply it to the work we perform.