Two Eyes One Pizza Data
Preliminary Data Observations
These are the datasets we are using for data transformation:
Dataset | Rationale
---|---
Administrative Boundaries, Taiwan | SHP files of the administrative boundaries of Taiwan (county, town, village); used as a reference to digitise the IFC branch trade areas
Branch locations of IFC, Taiwan | The geographical information of each individual branch; the main target of our project. Branches (FID) that we are looking into: 5, 162, 195, 7, 57, 25, 144, 129, 54, 81, 69, 132, 53, 122
Points of Interest (POI), Taiwan | Each individual point of interest in Taiwan (e.g. ATMs, amusement parks, banks); used as features for analysis with regard to each branch
PowerPoint slides of trade areas for each branch (PPT slides) | Each individual hand-drawn trade area for each branch in Taiwan; used as features for analysis with regard to each branch
Outlets Monthly Sales Data | The monthly sales information of each individual branch; studied together with the profile of each branch to identify patterns (e.g. top and bottom performers)
Steps for data transformation:
- Align to Geometry and Coordinate Reference System for all files: EPSG:3828
- Extracting each POI from the different shp files
- Formation of polygons for each branch to align to ppt slides' trade area
- Creation of competitors' POI
- Aggregation of count of each POI to each trade area
- Investigation on areas not covered by trade area
- Data pre-processing for Sales Data
- Aggregating Sales Data with each branch
- Regression Analysis
1. Geometry and Coordinate Reference System
Every QGIS file created has to be aligned to the same coordinate reference system used by Taiwan maps, both for accuracy and for consistent unit measurement.
Information on the EPSG:3828 reference system: https://epsg.io/3828
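A minimal sketch of how a layer can be reprojected to EPSG:3828 from the QGIS Python Console, assuming QGIS 3's native:reprojectlayer algorithm (the input and output paths are placeholders, not our actual files):

```python
# Run inside the QGIS Python Console.
# Reprojects one layer to EPSG:3828 so that every file shares the same CRS.
import processing

result = processing.run("native:reprojectlayer", {
    "INPUT": "C:/Project/data/poi_atm.shp",        # placeholder input path
    "TARGET_CRS": "EPSG:3828",                     # CRS required by the project
    "OUTPUT": "C:/Project/data/poi_atm_3828.shp",  # placeholder output path
})
print(result["OUTPUT"])
```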
2. Extracting each POI from the different shp files
We are given the following POIs to extract:
- ATM
- Bank
- Bar or Pub
- Bookstore
- Bowling Centre
- Bus Station
- Business Facility
- Cinema
- Clothing Store
- Coffee Shop
- Commuter Rail Station
- Consumer Electronics Store
- Convenience Store
- Department Store
- Government Office
- Grocery Store
- Higher Education
- Hospital
- Hotel
- Medical Service
- Pharmacy
- Residential Area/ Building
- Restaurant
- School
- Shopping
- Sports Centre
- Sports Complex
- Train Station
These POIs have to be extracted by finding their respective Facility Type in the shp files provided.
For example, Facility Type of 9583 was filtered using the filter function in QGIS and exported as a layer to the geopackage.
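The same filter-and-export step can also be scripted from the QGIS Python Console. Below is a minimal sketch; the field name FAC_TYPE, the paths and the layer name are assumptions, not the actual attribute schema:

```python
# Run inside the QGIS Python Console.
# Filters a POI shapefile on a Facility Type code and saves the result
# as a named layer inside a GeoPackage.
from qgis.core import QgsVectorLayer, QgsVectorFileWriter

poi = QgsVectorLayer("C:/Project/data/poi.shp", "poi", "ogr")  # placeholder path
poi.setSubsetString('"FAC_TYPE" = 9583')                       # placeholder field name

options = QgsVectorFileWriter.SaveVectorOptions()
options.driverName = "GPKG"
options.layerName = "poi_9583"                                 # placeholder layer name
options.actionOnExistingFile = QgsVectorFileWriter.CreateOrOverwriteLayer

QgsVectorFileWriter.writeAsVectorFormat(
    poi, "C:/Project/data/GeoPackage/Taiwan_POI.gpkg", options)
```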
3. Formation of polygons for each branch to align to ppt slides' trade area
As the trade areas are predefined in the PPT slides, we had to manually digitise the trade area for each of the outlets. QGIS tools were used in the process: Split Features, Merge Features and our artistic skills. Moreover, we added the store codes and area codes given in the PPT slides so that we can easily identify these polygons.
Below are the generated polygons for all of our trade areas.
4. Creation of competitors' POI
Five competitors were identified: Dominos Pizza, Napolean Pizza, McDonalds, KFC and MosBurger. These data points are crucial in our analysis as they may have a correlation with the sales data. Hence, these competitors' branch locations have to be extracted from "Restaurant 5800".
After extraction, they are exported as layers in the geopackage to be used for aggregation.
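A hedged sketch of how this name-based extraction could be scripted with QGIS's extract-by-expression algorithm; the input path, the name field POI_NAME and the spellings in the filter are placeholders:

```python
# Run inside the QGIS Python Console.
# Extracts competitor branches from the "Restaurant 5800" layer by matching names.
import processing

competitors = ["Domino", "Napolean", "McDonald", "KFC", "MosBurger"]  # placeholder spellings
# Builds: "POI_NAME" LIKE '%Domino%' OR "POI_NAME" LIKE '%McDonald%' OR ...
expression = " OR ".join(f"\"POI_NAME\" LIKE '%{c}%'" for c in competitors)

processing.run("native:extractbyexpression", {
    "INPUT": "C:/Project/data/restaurant_5800.shp",  # placeholder path
    "EXPRESSION": expression,
    "OUTPUT": "C:/Project/data/competitors.gpkg",    # exported for aggregation
})
```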
5. Aggregation of count of each POI to each trade area (Script)
It is an exhausting task to aggregate each of the POIs into each trade area manually using the "Count Points in Polygon" tool.
The batch processing tool was tried, but it is unable to append each newly created result into an existing GeoPackage. Hence, a Python script was written and run in QGIS's Python Console. It utilised "qgis:countpointsinpolygon", an in-built QGIS processing algorithm.
Learn more about the function here: https://docs.qgis.org/2.8/en/docs/user_manual/processing_algs/qgis/vector_analysis_tools/countpointsinpolygon.html
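A minimal sketch of such a loop, assuming the trade-area polygons and the POI layers are already loaded in the project (the layer names and count-field naming are placeholders; newer QGIS versions may expose the algorithm as native:countpointsinpolygon):

```python
# Run inside the QGIS Python Console.
# For every POI layer, counts how many points fall inside each trade-area
# polygon, producing one temporary result layer per POI type.
import processing
from qgis.core import QgsProject

trade_areas = QgsProject.instance().mapLayersByName("trade_areas")[0]  # placeholder name
poi_layers = ["ATM", "Bank", "Convenience Store"]                      # subset for illustration

for name in poi_layers:
    points = QgsProject.instance().mapLayersByName(name)[0]
    result = processing.run("qgis:countpointsinpolygon", {
        "POLYGONS": trade_areas,
        "POINTS": points,
        "FIELD": name + "_count",   # column that will hold the count
        "OUTPUT": "TEMPORARY_OUTPUT",
    })
    QgsProject.instance().addMapLayer(result["OUTPUT"])
```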
After the aggregation of POIs into each trade area, below is the screenshot of the columns for a particular trade area.
Since each aggregated POI result sits in its own GeoPackage, we need a way to merge these GeoPackages into one. Thankfully, we have the OSGeo4W Shell, which allows us to run command-line tools to merge them.
First, open the OSGeo4W Shell and cd to the directory that has the master geopackage (Taiwan_stores0x.gpkg) we want to merge into.
cd C:\Users\.......\Project\data\GeoPackage
As the layer in each GeoPackage is named Taiwan_stores37, we rename each one to its respective area. The test folder contains all the GeoPackages we want to merge.
for %f in (.\test\*.gpkg) do ogrinfo %f -sql "ALTER TABLE Taiwan_stores37 RENAME TO %~nf"
Lastly, the command below appends and merges all the GeoPackages in the test folder into Taiwan_stores0x.gpkg.
for %f in (.\test\*.gpkg) do ogr2ogr -f GPKG -append .\Taiwan_stores0x.gpkg %f
6. Investigation on areas not covered by trade area
As we observe the generated trade areas, it seems that there are some areas that are not covered by any of the branches. Hence, we decided to analyse these areas.
In the above figure, the uncovered area is 大安森林公园 (Daan Forest Park), which is a park. Hence, it is understandable that such a location is not covered by any delivery area.
In the above figure, we do see buildings that are not in any trade area. Therefore, this uncovered area may have been omitted and should be served by one of the branches.
7. Extracting Sales Data
We obtained raw sales data generated from the company's sales system. It was provided to us in a report format that was not directly importable into QGIS. There were data for over 200 outlets, so processing them manually one by one would have been impossible. Therefore, we wrote two scripts to extract the key information from each report and automated the output for the whole directory.
Although the reports were not directly importable into QGIS, each report followed a pattern that allowed us to easily parse out the key information and produce two different types of output: Daily Sales Data and Regional Sales Data for a Year.
Both scripts were written in Python. For the Daily Sales Data, we found that every valid row has a date in its leftmost column, so we made use of this pattern to export the daily sales data into a separate CSV. In a similar second script, we identified that once a certain keyword is encountered in the first column, every row after it consists of Regional Sales data, and we made use of that fact to produce a different output.
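A minimal sketch of these two rules combined in one function; the delimiter, date format and the regional marker keyword are assumptions, not the actual report layout:

```python
# Splits one sales report into a daily CSV and a regional CSV.
# Date format, delimiter and REGIONAL_MARKER are assumptions.
import csv
from datetime import datetime

REGIONAL_MARKER = "Region"  # hypothetical keyword that opens the regional block

def is_date(value, fmt="%Y/%m/%d"):
    """Return True if the first-column value parses as a date."""
    try:
        datetime.strptime(value.strip(), fmt)
        return True
    except ValueError:
        return False

def split_report(report_path, daily_csv, regional_csv):
    """Write date-prefixed rows to daily_csv and every row after the
    regional marker to regional_csv."""
    in_regional_block = False
    with open(report_path, newline="", encoding="utf-8") as src, \
         open(daily_csv, "w", newline="", encoding="utf-8") as daily, \
         open(regional_csv, "w", newline="", encoding="utf-8") as regional:
        daily_writer = csv.writer(daily)
        regional_writer = csv.writer(regional)
        for row in csv.reader(src):
            if not row:
                continue
            if row[0].strip() == REGIONAL_MARKER:
                in_regional_block = True
                continue
            if in_regional_block:
                regional_writer.writerow(row)
            elif is_date(row[0]):
                daily_writer.writerow(row)
```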
A helper method is then called in both scripts so that every report in the directory is processed and written out as a separate CSV in another directory. This allowed us to easily distribute the Regional Sales Data to the other groups working on different regions.
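A hedged sketch of such a helper, reusing the split_report function from the previous sketch (directory names and the report file extension are placeholders):

```python
# Processes every report in an input directory and writes the two CSV
# outputs per report into a separate output directory.
import os

def process_all_reports(input_dir, output_dir):
    os.makedirs(output_dir, exist_ok=True)
    for filename in os.listdir(input_dir):
        if not filename.lower().endswith(".csv"):  # assumed report extension
            continue
        stem = os.path.splitext(filename)[0]
        split_report(
            os.path.join(input_dir, filename),
            os.path.join(output_dir, stem + "_daily.csv"),
            os.path.join(output_dir, stem + "_regional.csv"),
        )

process_all_reports("reports", "reports_output")  # placeholder directories
```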
Comments
Feel free to leave comments / suggestions!