Do You Care About My Privacy? Communicating Privacy Policies Between Facebook Friends Sebastian Banescu [email protected]
Simona Posea [email protected]
Andrei Calin [email protected]
7 November 2010
Abstract

As a consequence of the overwhelming usage of Facebook, protecting user privacy has become a challenging task. In this paper we present a novel approach to improving Facebook user privacy in the form of a tool. The main contributions of this tool, the Personal Chief Security Officer (pCSO), are: (i) explaining to users the risks caused by using Facebook, (ii) defining more precise privacy policies to reduce these risks, and (iii) communicating these policies between users to increase awareness and create clear social norms for acceptable behavior.

1 Introduction

As the popularity of social networks increases, protecting user data becomes an increasingly difficult problem. The virtual and real lives of consumers are becoming more and more interconnected, and the aggregation of these two dimensions can generate several risks that affect the user in many ways. Problems at the workplace or harassment are just a few examples of the consequences that becoming part of a social network can have. Claiming over 500 million members1, Facebook is the most popular social network to date, and the platform has therefore drawn the attention of numerous researchers in the field of security. This paper focuses on analyzing its privacy problems and developing a solution that enhances privacy for Facebook users in a novel way, complementary to other solutions. It ensures that users are up to date with the privacy policies of their friends, and of others who want to become their friends. Although there exist tools that scan a profile and help the user keep it secure from an individual perspective, these tools ignore that privacy violations are to a large extent caused by other users.

In Section 2, we review existing solutions to the privacy problem in social networks. Section 3 explains our analysis of the risks that a consumer faces when she joins a social network. In Section 4 we present the design of a tool that takes this analysis into account. Section 5 shows the results generated by the tool for a set of users. The final section shows how we intend to develop the solution further.

2 Related work

Several aspects of social network privacy have been discussed by researchers, who offer different analyses of and solutions to Facebook security problems. During the last few years, various security gaps have been exploited, resulting in attacks that range from theft of private data2 and financial attacks to intimidation, harassment, or even criminal acts3. One of the main privacy problems is that the Facebook Corporation can view and use information related to any user. To avoid this confidentiality issue, Luo et al. developed a technical solution entitled FaceCloak, which stores encrypted user data on a third-party server. Another problem is Facebook's highly complicated privacy policies: users of the social network do not understand how and why they should set their privacy settings, which can lead to a high degree of disclosure of user data. Lipford et al. proposed a solution based on structuring one's privacy-settings interface according to the information available to a particular audience (search, network, friend, or self).
1 http://www.nytimes.com/2010/07/22/technology/22facebook.html
2 http://www.tvsa.co.za/default.asp?blogname=FranklySpeaking&articleid=14854
3 http://edition.cnn.com/2010/WORLD/americas/08/24/colombia.facebook.killings
3 Problem analysis

In this section we present a different analysis of Facebook privacy problems. Our approach focuses on exchanging privacy goals between Facebook users in order to clearly specify what they should expect from each other. According to Schneier's taxonomy, the solution described in this paper addresses privacy issues related to what other people post about you (incidental data), as well as disclosed and entrusted data. More specifically, it handles problems caused both by oneself and by others.

In social networks, users do not consider security and privacy among their main goals. The average consumer is not aware of the risks assumed when deciding to become part of a social network. Attacks originate from different types of risks, each having the solutions presented in Table 1.

Table 1: Risks and Solutions

  Type of Risk            Type of Solution
  individual user risks   advisor oracle
  social risks            establishing social norms

3.1 Individual user risks

Individual user risks are threats that users cause themselves and that can affect only them. This category of risks can be mitigated by the individual users themselves.

To illustrate such an attack, consider the following scenario: suppose a user (Alice) updates her status to reveal her current whereabouts. Her action may lead to burglary, because everyone with access to her wall can see when she is not home. As another example, Alice posts several of her likes and interests, from which others may infer her sexual orientation. This inferred information may be unacceptable to her current or future employers. Limiting these risks requires Alice to alter her behavior.

3.2 Social risks

Recently, there have been changes in the privacy settings pages of Facebook5 which facilitate user privacy configuration. By choosing to share all information with friends only, a user may consider that she has attained perfect privacy. However, most users are not aware of the threats their friends pose to them, and friends' privacy goals can often differ from one another. In this section we discuss the risks that have the most significant impact on Facebook users.

3.2.1 Posting sensitive information

In Social Network Sites (SNS) each user can be tagged by her friends in different pictures or videos, which can sometimes put her in an awkward position. Although Facebook allows users to remove unwanted tags, in most cases the person who is tagged is not online to remove the tag immediately. This can have dire consequences, such as job loss, divorce, or conflicts between friends.

Individuals are not the only ones who can be affected by social networks. By revealing secret information, insiders can cause significant financial losses to companies or public institutions. Moreover, military operations can be canceled for the same reason; a similar case was recorded at the beginning of 2010 in Israel6.

5 http://www.readwriteweb.com/archives/the_3_facebook_settings_every_user_should_check_now.php
6 http://www.jpost.com/Israel/Article.aspx?id=170156
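A privacy goal exchanged between users can be thought of as a small record attached to its owner. The following sketch is our own illustration of such a record; the field names are hypothetical and not the pCSO's actual format:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyGoal:
    """One privacy goal a user asks her friends to respect."""
    owner: str               # Facebook user who states the goal
    statement: str           # human-readable goal text
    machine_checkable: bool  # can violations be detected automatically?

# Example: a goal Alice might state about photo tagging.
goal = PrivacyGoal(owner="Alice",
                   statement="Do not tag me in photos",
                   machine_checkable=True)
```

A record like this is small enough to be attached to a friend request and stored per friendship.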
Zhou and Pei have shown that it is possible to infer someone's identity in a social network from her friend list, even if direct information (name, photo, address, email, etc.) is not explicitly specified. Therefore, even if a user hides all information posted on her profile, most of it can be inferred by inspecting her friend list.

Figure 1: Policy Diagram (relating Alice's desired states, the Facebook built-in policies of Alice, and the policies specified in the pCSO by Alice and her friends)
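To see why a friend list is identifying, consider matching an anonymized friend list against known public ones. The toy sketch below uses Jaccard set similarity; it is our own illustration, not Zhou and Pei's actual algorithm:

```python
def jaccard(a: set[str], b: set[str]) -> float:
    """Similarity between two friend lists as |A ∩ B| / |A ∪ B|."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

def best_match(anonymous: set[str], public_lists: dict[str, set[str]]) -> str:
    """Return the known user whose public friend list is most similar."""
    return max(public_lists,
               key=lambda name: jaccard(anonymous, public_lists[name]))
```

Even partial overlap between friend lists often suffices to re-identify a profile that hides all of its direct information.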
3.3 Conclusion of analysis
[Figure 2 content — tree levels: type of data, inferences from user data, type of attacks, impact on user. Data items include: phone number/email address, likes & interests, sex/birthdate, education & work, social security number. Example attacks include: using embarrassing pictures, finding a controversial opinion, finding a personal schedule, calling on behalf of the employer, sending email on behalf of the user.]
Figure 2: Classification Tree presenting Risks and Threats on Facebook

The classification tree is based on our literature review and is organized into four levels which represent different stages of possible attacks. The topmost level lists the basic information items that can be extracted from different social networks. The second level presents how this information can be combined and used to initiate the attacks described on level three. The bottom level shows the impact of attacks. Notice that an attack may have several consequences.
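One path through the four levels can be traced for the Alice example from Section 3: her likes and interests (data) allow an inference about her orientation, which enables an employer-related attack with an impact on her career. A minimal sketch of such a path; the concrete strings are our own illustration, taken loosely from the figure:

```python
# The four levels of the classification tree, top to bottom.
LEVELS = ("type of data", "inference", "type of attack", "impact on user")

# One illustrative attack path, following the Alice example.
alice_path = {
    "type of data": "likes & interests",
    "inference": "sexual orientation",
    "type of attack": "find controversial opinion",
    "impact on user": "problems with current or future employers",
}

def describe(path: dict[str, str]) -> str:
    """Render an attack path level by level, top to bottom."""
    return " -> ".join(path[level] for level in LEVELS)
```

Walking a path in this way makes explicit how one extracted data item propagates into a concrete impact.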
Figure 3: Policy Exchange Protocol (steps: sends friend request; accept mutual exchange; confirm friend request (optional))
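A hedged sketch of how the exchange in Figure 3 could proceed between two pCSO clients. The message names follow the figure; the state handling is our own illustration, not the tool's actual implementation:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    REQUEST_SENT = auto()  # friend request sent, policies attached
    EXCHANGED = auto()     # both sides accepted the mutual policy exchange
    CONFIRMED = auto()     # friend request confirmed (optional final step)

class PolicyExchange:
    """Tracks one policy exchange between a requester and a recipient."""

    def __init__(self) -> None:
        self.state = State.IDLE

    def send_friend_request(self) -> None:
        assert self.state is State.IDLE
        self.state = State.REQUEST_SENT

    def accept_mutual_exchange(self) -> None:
        assert self.state is State.REQUEST_SENT
        self.state = State.EXCHANGED

    def confirm_friend_request(self) -> None:
        # Optional step: the policy exchange is already complete without it.
        assert self.state is State.EXCHANGED
        self.state = State.CONFIRMED
```

Keeping the confirmation optional mirrors the figure: the policies are exchanged as soon as both sides accept, independently of whether the friendship itself is confirmed.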
Automatic enforcement is not feasible for every goal. As an example, consider the goal "Do not post any pictures that my employer would disapprove of". In such cases manual enforcement is the only viable alternative, i.e. the user detects any privacy violation and acts accordingly. Nevertheless, some goals are automatically detectable, for instance "Do not tag me in photos" or "Do not post information that could allow people to infer I am not home during a fixed period". The former can easily be detected by checking the Photos tab of one's Facebook profile. The latter can be checked, with a certain degree of error, by natural language processing.
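As a rough illustration of the second kind of check, a keyword heuristic can flag wall posts that may reveal the author is away from home. This sketch and its phrase list are our own and deliberately simplistic; real natural language processing would be needed for acceptable accuracy:

```python
# Phrases that often indicate the author is not at home.
ABSENCE_HINTS = ("on vacation", "out of town", "at the airport",
                 "away until", "on holiday", "leaving for")

def may_reveal_absence(post: str) -> bool:
    """Return True if the post likely discloses the author's absence."""
    text = post.lower()
    return any(hint in text for hint in ABSENCE_HINTS)
```

A pCSO client could run such a check on a draft post and warn the author, or a friend, before publication.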
Figure 4: Integration Scheme (components include the Shared Risk Repository and facilities to Add/Review/Modify Security Experts)

Figure 5: Outgoing Requests

Figure 6: Main Menu & Incoming Requests

5 Initial testing and results

The application was recommended as a privacy-enhancing tool to 46 users chosen from the developers' Facebook friend lists, with different ages, sexes, and professional backgrounds. Of these 46 users, 21 agreed to test the application and give feedback. 57.14% of the test subjects were female. 61.9% of the testers had a technical background, 19% had business studies, and the rest had other professions or did not specify one. 76.19% were aged between 18 and 26; the others were between 26 and 42. The survey asked the following questions: "Do you find the application easy to use?", "Do you consider the application helpful for enhancing your privacy?", "Would you recommend the application to your friends?" and "Will you continue using the tool?". The answers are shown in the chart in Figure 7.
Figure 7: Survey Results (answer distributions for "easy to use", "helpful for privacy", and "continue to use")

6 Conclusions and future work

Facebook privacy is not improved easily, both because of the large number of risks that one is exposed to and because the desired level of privacy may differ between users. Thus, a complex and flexible solution is required. Moreover, users need to become aware of the fact that risks can be generated both by their own actions and by the actions of friends. They need to learn that privacy is not only their goal but also that of others.

In this paper we presented a novel approach to enhancing Facebook user privacy, in which both individual risks and social risks are taken into consideration. This approach can help millions of users to clearly set norms of acceptable behavior. Applying these norms leads to a high level of risk avoidance, because user interaction is performed with respect to the users' goals. According to our initial testing results, the application proves to be useful for improving the privacy level of users.

As further development, we intend to offer users the possibility to create fine-grained policy goals that apply only to certain groups of users. To facilitate the enforcement of privacy statements, we will automate the detection of goal violations wherever possible. Future work also includes designing a secure architecture for our application. This would include securing the communication between clients and the server using transport layer security. Encryption of user information stored inside the database would require storing keys on a separate server.

References

David Bednall, Alan Hirst, Marie Ashwin, Orhan Icoz, Bertil Hulten, and Timothy Bednall. Social networking, social harassment and social policy. 2009.

Lujun Fang and Kristen LeFevre. Privacy wizards for social networking sites. In WWW '10: Proceedings of the 19th International Conference on World Wide Web, pages 351–360, New York, NY, USA, 2010. ACM.

Carolyn Y. Johnson. Project 'Gaydar': An MIT experiment raises new questions about online privacy. September 2009.

Heather R. Lipford, Andrew Besmer, and Jason Watson. Understanding privacy settings in Facebook with an audience view. In Proceedings of UPSEC '08 (Usability, Psychology and Security), 2008.

Luo et al. FaceCloak: An Architecture for User Privacy on Social Networking Sites, volume 3, Los Alamitos, CA, USA, August 2009. IEEE Computer Society.

Alan Mislove, Bimal Viswanath, Krishna P. Gummadi, and Peter Druschel. You are who you know: inferring user profiles in online social networks. In WSDM '10: Proceedings of the Third ACM International Conference on Web Search and Data Mining, pages 251–260, New York, NY, USA, 2010. ACM.

Bruce Schneier. A taxonomy of social networking data. IEEE Security and Privacy, 8:88, 2010.

A. Van Cleeff. A risk management process for consumers: The next step in information security. In Proceedings of the 2010 Workshop on New Security Paradigms. ACM, 2010, in press.

Bin Zhou and Jian Pei. Preserving privacy in social networks against neighborhood attacks. In ICDE '08: Proceedings of the 2008 IEEE 24th International Conference on Data Engineering, pages 506–515, Washington, DC, USA, 2008. IEEE Computer Society.