AY1516 T2 Team AP Analysis PostInterimPlan


Revision as of 17:49, 9 April 2016





Facebook Graph API (Post Interim Plan)

Apart from analysing Twitter, one of SGAG's popular social networks, we plan to leverage the Facebook Graph API. Drawing on our experience with the Twitter API, we will crawl Facebook data in a similar fashion: crawling, retrieving, and aggregating post-level Facebook data. We hope this process can yield conclusive results about SGAG's social network (likes, shares, etc.) on Facebook.

Approach (Post Interim Plan)

Step 1 — Expected result: Collect all post data
  • Get all posts on the SGAG page, preferably every post from 2 years ago to date
  • For each post, record the 'Like' count, and once Graph API 2.6 is released, record the other Facebook reactions as well

Step 2 — Expected result: All user objects for each like, for every post
  • For each follower, explore whether users' privacy settings will limit our ability to further classify these user objects
  • Analyse the 'Like' count per post and, if possible, chart the likers' social network on a post-level basis
  • Manually categorise posts, as we did for Twitter, and possibly prune them to further refine insights

Step 3 — Expected result: Comment-level data per post and share counts on a user level
  • Analyse posts by their comments
  • Are highly commented posts also popular?
  • Analyse users who actually share SGAG's posts, and their connections with other users

Data Retrieval

Constructing the graph from scratch involved Python code to retrieve posts from SGAG's Facebook account dating back 10 months. This meant connecting to the Facebook Graph API programmatically to produce a CSV file with the following structure:

Within the List of Likers and List of Commenters columns, user IDs are separated by semicolons and tagged to their post.

Post ID | List of Likers | List of Commenters
378167172198277_1187053787976274 | 10206930900524483;1042647259126948;10204920589409318; ... | 10153979571077290;955321504523847;1701864973403904; ...
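The retrieval loop can be sketched in Python. This is a minimal illustration, not the team's actual script: the Graph API version, function names, and the token/page-ID parameters are assumptions, and the real crawler additionally pages through each post's likes and comments.

```python
import json
import urllib.parse
import urllib.request

GRAPH_URL = "https://graph.facebook.com/v2.5"  # API version assumed for early 2016

def fetch_posts(page_id, access_token, limit=100):
    """Yield post objects from /{page-id}/posts, following paging.next cursors."""
    query = urllib.parse.urlencode({"access_token": access_token, "limit": limit})
    url = "%s/%s/posts?%s" % (GRAPH_URL, page_id, query)
    while url:
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        yield from data.get("data", [])
        url = data.get("paging", {}).get("next")  # absent on the last page

def format_row(post_id, liker_ids, commenter_ids):
    """One CSV row: post ID, then semicolon-joined liker and commenter IDs."""
    return [post_id, ";".join(liker_ids), ";".join(commenter_ids)]

# Row layout matching the table above (IDs shortened for illustration):
row = format_row("378167172198277_1187053787976274",
                 ["10206930900524483", "1042647259126948"],
                 ["10153979571077290"])
```

Each yielded post would be passed through `format_row` and appended to the CSV with the standard-library `csv` writer.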

After crawling the Facebook API for ~4.5 hours, the result was 1,600+ posts dating back 10 months, with a CSV file size of ~38 MB. The entire code can be viewed here: https://drive.google.com/a/smu.edu.sg/file/d/0B4ESKidr4zkINlNQaWxlZXByZFU/view?usp=sharing

[Figure: Code snippet of likers & commenters retrieval]
[Figure: Code snippet of conversion of CSV into GraphML format]

Subsequently, we wanted to visualise the data in Gephi. Additional Python code was therefore used to read the CSV file row by row, attaching each post ID to its likers and commenters, so that we could construct a .graphml-formatted graph file, which Gephi can read. The entire code can be viewed here: https://drive.google.com/a/smu.edu.sg/file/d/0B4ESKidr4zkIdWJNdGJWV3BVVVU/view?usp=sharing
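The CSV-to-GraphML conversion can be sketched as follows. This is an illustrative stand-in for the team's script (the function name is ours): it builds a bipartite post-user graph with the standard-library XML writer, linking each post node to every user who liked or commented on it.

```python
import xml.etree.ElementTree as ET

def rows_to_graphml(rows):
    """Build a GraphML document from (post_id, likers_str, commenters_str) rows,
    where the ID lists are semicolon-separated as in the crawled CSV."""
    root = ET.Element("graphml", xmlns="http://graphml.graphdrawing.org/xmlns")
    graph = ET.SubElement(root, "graph", edgedefault="undirected")
    seen, edge_id = set(), 0
    for post_id, likers_str, commenters_str in rows:
        # Split the semicolon lists, dropping empty fragments
        users = [u for u in (likers_str + ";" + commenters_str).split(";") if u]
        for node in [post_id] + users:
            if node not in seen:  # emit each node element only once
                seen.add(node)
                ET.SubElement(graph, "node", id=node)
        for user in users:  # one edge per like or comment
            ET.SubElement(graph, "edge", id="e%d" % edge_id,
                          source=post_id, target=user)
            edge_id += 1
    return ET.tostring(root, encoding="unicode")
```

A user who both liked and commented on the same post yields two parallel edges here; depending on the analysis, those could instead be merged into one weighted edge.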

The resultant file (~211 MB) is uploaded here for reference: https://drive.google.com/a/smu.edu.sg/file/d/0B4ESKidr4zkIbnhPTWlnSU5sUms/view?usp=sharing

Gephi Analysis