Do You Care About My Privacy? Communicating Privacy Policies Between Facebook Friends

Sebastian Banescu [email protected]
Simona Posea [email protected]
Andrei Calin [email protected]

7 November 2010


Abstract

As a consequence of the overwhelming usage of Facebook, protecting user privacy has become a challenging task. In this paper we present a novel approach to improving Facebook user privacy in the form of a tool. The main contributions of this tool, the Personal Chief Security Officer, are: (i) explaining to users the risks caused by using Facebook, (ii) defining more precise privacy policies to reduce these risks, and (iii) communicating these policies between users to increase awareness and create clear social norms for acceptable behavior.

1 Introduction

As the popularity of social networks increases, protecting user data becomes an increasingly difficult problem. The virtual and real lives of consumers are becoming more and more interconnected, and the aggregation of these two dimensions generates several risks that can affect the user in many ways. Problems at the workplace or harassment [2] are just a few examples of the consequences that becoming part of a social network can have. Claiming over 500 million members1, Facebook appears to be the most popular social network to date, and the platform has therefore drawn the attention of numerous researchers in the field of security. This paper focuses on analyzing these security problems and developing a solution that enhances privacy for Facebook users in a novel way, complementary to other solutions. It ensures that users are up to date with the privacy policies of their friends, and of others who want to become their friends. Although there exist tools that scan and help the user to maintain a secure profile from an individual perspective, these tools ignore that privacy violations are to a large extent caused by other users.

In Section 2, we review existing solutions to the privacy issue in social networks. Section 3 explains our analysis of the risks that a consumer faces when she joins a social network. In Section 4 we present the design of a tool that takes this analysis into account. Section 5 shows the results generated by the tool for a set of users. The final section shows how we intend to develop the solution further.

2 Related work

Several aspects of social network privacy have been discussed by researchers, who offer different analyses of and solutions to Facebook security problems. During the last few years, different security gaps have been exploited and have resulted in attacks, ranging from private data being stolen2 and financial attacks to intimidation, harassment, or even criminal acts3. One of the main privacy problems is that the Facebook corporation is able to view and use information related to any user. To avoid this confidentiality issue, Luo et al. [1] have developed a technical solution entitled FaceCloak, which stores encrypted user data on a third-party server. Another problem is the highly complicated privacy policies of Facebook: users of the social network do not understand how and why they should set their privacy settings, which can lead to excessive disclosure of user data. Lipford et al. [5] propose a solution that structures the privacy settings interface according to the information available to a particular audience (search, network, friend, or self).

The same aspect is addressed by Reclaim Privacy4, a privacy scanner that automatically inspects your Facebook privacy settings and highlights those that are risky from the point of view of information leakage. Although it is user friendly and requires only a few clicks, this tool does no more than highlight the privacy settings that are loose. Another issue with Facebook privacy settings is that achieving the preferred privacy policy is usually time consuming, because a user would have to manually create lists of friends and associate corresponding sharing levels. To overcome this, Fang and LeFevre [3] have developed a wizard that infers a user's sharing preferences from communities extracted from her neighborhood. To differentiate privacy issues, Bruce Schneier [7] gives a taxonomy of social networking data; different approaches to solving the privacy problems in social networks handle different types of data from this taxonomy. The approaches presented above handle privacy issues related to data you give to the social networking site (service data) and data that you post on your own pages (disclosed data) or on the pages of other users (entrusted data).

1 technology/22facebook.html
2 blogname=FranklySpeaking&articleid=14854
3 americas/08/24/colombia.facebook.killings
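To make the FaceCloak idea described above concrete, the following sketch is our own toy illustration, not the authors' implementation: a sensitive profile field is encrypted and kept on a hypothetical third-party store, while only a random-looking placeholder is published on the social network. The keyed-hash XOR stream below merely stands in for the real cipher used by FaceCloak.

```python
import hashlib
import secrets

# Hypothetical third-party server: placeholder id -> (nonce, ciphertext).
THIRD_PARTY_STORE = {}


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy deterministic keystream built from SHA-256 (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def publish(profile_field: str, key: bytes) -> str:
    """Encrypt a profile field, store it off-site, and return the fake
    placeholder value that would be posted on the social network."""
    nonce = secrets.token_bytes(16)
    data = profile_field.encode()
    cipher = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    fake_id = secrets.token_hex(8)  # random placeholder shown publicly
    THIRD_PARTY_STORE[fake_id] = (nonce, cipher)
    return fake_id


def retrieve(fake_id: str, key: bytes) -> str:
    """What a friend holding the shared key does when rendering the profile."""
    nonce, cipher = THIRD_PARTY_STORE[fake_id]
    return bytes(a ^ b for a, b in zip(cipher, _keystream(key, nonce, len(cipher)))).decode()
```

Only friends who hold the shared key can map the placeholder back to the real value; the social networking site itself sees nothing but random tokens.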

3 Problem analysis

In this section we present a different analysis of Facebook privacy problems. Our approach focuses on exchanging privacy goals between Facebook users in order to clearly specify what they should expect from each other. According to Schneier's taxonomy, the solution described in this paper addresses privacy issues related to what other people post about you (incidental data), as well as disclosed and entrusted data. More specifically, it handles problems caused both by oneself and by others.

In social networks, users do not consider security and privacy among their main goals. The average consumer is not aware of the risks assumed when deciding to become part of a social network. Attacks originate from different types of risks, each having the solutions presented in Table 1.

Type of Risk               Type of Solution
individual user risks      advisor oracle
social risks               establishing social norms

Table 1: Risks and Solutions

3.1 Individual user risks

Single user risks are threats that individual users cause and that can affect only them. This category of risks can be mitigated by individual users themselves. To illustrate such an attack, consider the following scenario: suppose a user (Alice) updates her status to reveal her current whereabouts. Her action may lead to burglary, because everyone with access to her wall can see when she is not home. In another example, Alice posts several of her likes and interests, from which others may infer her sexual orientation [4]. This inferred information may be unacceptable to her current or future employers. Limiting these risks requires Alice to alter her behavior.

3.2 Social risks

Recently, there have been changes in the privacy settings pages of Facebook5 which facilitate user privacy configuration. By choosing to share all information with friends only, a user may consider that she has attained perfect privacy. However, most users are not aware of the threats their friends pose to them, and friends' privacy goals can often differ from one another. In this section we discuss the risks that have the most significant impact on Facebook users.

3.2.1 Posting sensitive information

In Social Network Sites (SNS) each user can be tagged by her friends in different pictures or videos, which can sometimes put her in an awkward position. Although Facebook allows users to remove unwanted tags, in most cases the one who is tagged is not online to remove the tag immediately. This can have dire consequences, such as job loss, divorce or conflicts between friends.

Individuals are not the only ones who can be affected by social networks. By revealing secret information, insiders can cause significant financial losses to companies or public institutions. Moreover, military operations can be canceled for the same reason. Such a case was recorded at the beginning of 2010 in Israel6, when a soldier made public on Facebook the location and time of a national military operation.

Similar to tagging, posts on a user's profile by her friends can reveal information that the target would have wanted to keep private from others who have access to her profile.

3.2.2 Inferring private information

Some users may think that if they make all their attributes accessible only to friends, they attain a high level of privacy. However, it has been shown that even if only 20% of the people in one's friend list provide public access to their attributes (i.e. hometown, schools, employer, current city, relationship status, sexual orientation, political views, etc.), the hidden attributes of the target user can be inferred with 80% accuracy [6].

3.2.3 Social engineering

Because the default settings for the majority of attributes are visible to "Friends of Friends", it is enough for a single friend of the target to be manipulated into adding an attacker as a friend in order for most of the target's private information to become visible to that attacker. Furthermore, if several of one's friends accept the attacker's friend request, then the attacker can send the target a friend request with a high probability of it being accepted. Such an experiment7 was carried out by a college student who gained 75 thousand friends on Facebook after creating a script that sent 250,000 friend requests.

3.2.4 Inferring identity

Zhou and Pei [9] have shown that it is possible to infer someone's identity in a social network from her friend list, even if direct information (name, photo, address, email, etc.) is not explicitly specified. Therefore, even if a user hides all information posted on her profile, most of it can be inferred by inspecting her friend list.

3.3 Conclusion of analysis

Taking all these risks into consideration, one notices that Facebook security and privacy is not only a matter of individual users: it can only be improved by establishing and enforcing social norms. The next section presents a solution to this problem.

4 Solution: personal Chief Security Officer

The main difference between our approach and others is the possibility for users to interact with each other through the exchange of privacy policies. In a privacy policy a user specifies what she expects from others and what others can expect from her. This mechanism establishes public norms, which is relevant because people find it difficult to adhere to their own principles. The concept of communicating one's security policy has been presented by Van Cleeff [8].

This approach has been integrated in a Facebook application entitled personal Chief Security Officer (pCSO)8. Note that the current solution has been designed to enforce a declarative security policy based on reciprocal user trust. It acts as a privacy advisor and includes a mechanism for communicating one's settings to others. Figure 1 explains the aims of our tool graphically. It shows the potential state space of Facebook relevant for a user (Alice). Out of all possible states, only a small part is desirable, and the Facebook built-in policies (for the user and her friends) can only partly account for these desired states. The arrows show the role of our tool: helping the user shift to the desired states, with the help of her friends and of herself.

Figure 1: Policy Diagram (within the entire Facebook state space relevant for Alice, the diagram shows Alice's desired states, the Facebook built-in policies of Alice and of Alice's friends, and the policies specified in the pCSO by Alice and her friends)

4.1 Risk reasoning

Figure 2 shows a schematic view of how the tool reasons about risks; based on this reasoning, it presents advice to users while they are defining their privacy policy.

4 05/facebook-transparency-tool/
5 the_3_facebook_settings_every_user_should_check_now.php
6 aspx?id=170156
7 stories/2005/09/01/a-new-kind-of-fame/



Figure 2: Classification Tree presenting Risks and Threats on Facebook (levels, top to bottom: type of data — phone number/email address, likes & interests, sex/birthdate, education & work; inferences from user data — using embarrassing pictures, finding a controversial opinion, finding a personal schedule, deriving a social security number; type of attacks — calling on behalf of the employer, harassment calls, sending email on behalf of the user, ID theft, publishing opinions; impact on the user — job loss, criminal attack, family conflicts, ID theft, financial loss, physical harm, law suits)

The classification is based on our literature review and is organized on four levels which represent different stages of possible attacks. The topmost level lists the basic information items that can be extracted from different social networks. The second level presents how this information can be combined and used to initiate the attacks described on level three. The bottom level shows the impact of the attacks; notice that an attack may have several consequences.
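The four-level tree can be encoded as a set of edge maps that an advisor module could traverse from a data item down to its possible impacts. The sketch below is our own rendering; the specific edges are illustrative examples in the spirit of the figure, not a copy of it.

```python
# Illustrative edges for the four-level classification tree.
# Level 1 (data item) -> level 2 (inference from user data).
DATA_TO_INFERENCE = {
    "likes & interests": ["controversial opinion"],
    "phone number/email": ["social security number"],
    "education & work": ["personal schedule"],
}

# Level 2 (inference) -> level 3 (type of attack).
INFERENCE_TO_ATTACK = {
    "controversial opinion": ["publish opinions"],
    "social security number": ["ID theft"],
    "personal schedule": ["harassment calls"],
}

# Level 3 (attack) -> level 4 (impact on the user).
ATTACK_TO_IMPACT = {
    "publish opinions": ["job loss", "family conflicts"],
    "ID theft": ["financial loss", "law suit"],
    "harassment calls": ["physical harm"],
}


def impacts_of(data_item: str) -> set:
    """Walk data -> inference -> attack -> impact and collect the
    possible consequences of exposing one data item."""
    impacts = set()
    for inference in DATA_TO_INFERENCE.get(data_item, []):
        for attack in INFERENCE_TO_ATTACK.get(inference, []):
            impacts |= set(ATTACK_TO_IMPACT.get(attack, []))
    return impacts
```

An advisor built on such maps can justify its warnings by listing the full path (inference and attack) that links a shared attribute to each impact.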



4.2 Policy exchange protocol

A simple scenario is presented in Figure 3, where the two actors involved are called Alice and Bob. Initially, Alice wants to become Bob's friend and sends him a request. Before accepting her as a friend, Bob can ask Alice for a mutual exchange of privacy policies. If Alice accepts, Bob can decide on the friend request according to his privacy goals. If Alice's privacy policy introduces unwanted risks for Bob, he may send her a request specifying the minimal changes she has to perform. At this point Alice may choose to decline Bob's request, or to change her security settings and then notify Bob. If this new policy satisfies Bob's privacy goals, he confirms the friend request; otherwise he ignores it. Note that not all privacy policy requests have to be accepted, i.e. one can also reject such a request for particular reasons.

We also considered a protocol in which the privacy policies would not be mutually exchanged. If Alice accepted to send her privacy policy, Bob would be able to see it, but she would not be able to see his; in order to see his privacy policy she would have to send a separate request, which he would have to accept. This prolonged interaction would have been cumbersome for users.

Figure 3: Policy Exchange Protocol (messages: sends friend request; request privacy policy exchange; accept mutual exchange; confirm friend request (optional))
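The exchange above can be sketched as a small state machine. This is our own formalization of the protocol; the state and message names are assumptions chosen for illustration, not identifiers from the pCSO implementation.

```python
from enum import Enum, auto


class State(Enum):
    FRIEND_REQUESTED = auto()           # Alice has sent Bob a friend request
    POLICY_EXCHANGE_REQUESTED = auto()  # Bob asked for a mutual policy exchange
    POLICIES_EXCHANGED = auto()         # both policies are visible to both sides
    CHANGES_REQUESTED = auto()          # Bob asked Alice for minimal changes
    FRIENDS = auto()                    # friend request confirmed
    REJECTED = auto()                   # exchange declined or request ignored


# Allowed transitions, keyed by (current state, message).
TRANSITIONS = {
    (State.FRIEND_REQUESTED, "request_policy_exchange"): State.POLICY_EXCHANGE_REQUESTED,
    (State.POLICY_EXCHANGE_REQUESTED, "accept_mutual_exchange"): State.POLICIES_EXCHANGED,
    (State.POLICY_EXCHANGE_REQUESTED, "decline"): State.REJECTED,
    (State.POLICIES_EXCHANGED, "confirm_friend_request"): State.FRIENDS,
    (State.POLICIES_EXCHANGED, "request_policy_changes"): State.CHANGES_REQUESTED,
    (State.POLICIES_EXCHANGED, "ignore"): State.REJECTED,
    (State.CHANGES_REQUESTED, "notify_settings_changed"): State.POLICIES_EXCHANGED,
    (State.CHANGES_REQUESTED, "decline"): State.REJECTED,
}


def step(state: State, message: str) -> State:
    """Advance the protocol by one message, rejecting illegal moves."""
    try:
        return TRANSITIONS[(state, message)]
    except KeyError:
        raise ValueError(f"message {message!r} not allowed in state {state}")
```

Encoding the protocol this way makes the negotiation loop explicit: a change request returns the conversation to the policies-exchanged state, from which Bob can either confirm, ask again, or ignore.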


4.3 Policy Enforcement

Automatic enforcement is not feasible for every goal. As an example, consider the setting: "Do not post any pictures that my employer would disapprove of". In such cases, manual enforcement is the only viable alternative, i.e. the user detects any privacy violation and acts accordingly. Nevertheless, violations of some goals are detectable automatically, for instance "Do not tag me in photos" or "Do not post information that could allow people to infer I am not home during a fixed period". The former can easily be detected by checking the Photos tab of one's Facebook profile; the latter can be checked, with a certain degree of error, by natural language processing.
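A minimal sketch of what detection of these two goals could look like follows. The data shapes and helper names here are hypothetical; the real tool would obtain photo tags and status updates through the Facebook API, and the keyword heuristic is a crude stand-in for the error-prone language processing mentioned above.

```python
def find_tag_violations(photos, protected_user_id):
    """Return the photos in which the protected user is tagged, i.e.
    violations of the goal "Do not tag me in photos". `photos` is assumed
    to be a list of dicts with a "tags" list of user ids."""
    return [p for p in photos if protected_user_id in p.get("tags", [])]


# Phrases that may hint the author is away from home (illustrative list).
AWAY_PHRASES = ("on vacation", "out of town", "at the airport", "away until")


def may_reveal_absence(status_text):
    """Crude keyword check standing in for a real NLP detector of the goal
    "Do not post information that could reveal I am not home"."""
    text = status_text.lower()
    return any(phrase in text for phrase in AWAY_PHRASES)
```

In practice the first check is reliable (tags are structured data), while the second illustrates why the paper concedes that such goals can only be checked "with a certain degree of error".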



The solution is implemented using the Facebook API for PHP and JavaScript. The server-side scripts run on an Apache HTTP server hosted on a Debian Linux machine, and the MySQL relational database management system is used for data storage. The only sensitive data stored are user e-mail addresses and privacy policies. The database schema also includes privacy goals, threats and mitigations, together with examples.
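The stored entities just described might be laid out roughly as follows. This is a hypothetical sketch, not the published pCSO schema; SQLite is used here only to keep the example self-contained and runnable, whereas the deployed system uses MySQL.

```python
import sqlite3

# Hypothetical storage shape for the entities named in the text:
# users (with their e-mail address), privacy goals with threat/mitigation/
# example information for the advisor, and each user's selected goals.
SCHEMA = """
CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    email TEXT NOT NULL              -- sensitive field stored besides policies
);
CREATE TABLE privacy_goals (
    id INTEGER PRIMARY KEY,
    description TEXT NOT NULL,       -- e.g. "Do not tag me in photos"
    threat TEXT,                     -- associated threat
    mitigation TEXT,                 -- suggested mitigation
    example TEXT                     -- suggestive example shown by the advisor
);
CREATE TABLE policies (              -- a user's policy = her selected goals
    user_id INTEGER REFERENCES users(id),
    goal_id INTEGER REFERENCES privacy_goals(id),
    PRIMARY KEY (user_id, goal_id)
);
"""


def open_store(path=":memory:"):
    """Create a fresh store with the sketched schema."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```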

Figure 4: Integration Scheme (users such as Bob interact with the pCSO through its user interface; the pCSO server communicates with Facebook and with the Shared Risk Repository, to which security experts can add entries and which they can review and modify)

The Shared Risk Repository presented in Figure 4 contains the privacy goals without any association with users. It can be viewed by users, who may also select goals suitable for their own privacy. Security experts may add goals, and other users may recommend goals that are not yet included in the repository. As opposed to the current privacy settings page provided by Facebook, one can also define privacy goals that specify the expectations one has of other users. Moreover, our solution guides the user in the process of choosing her sharing preferences and goals. This is performed by an advisor module integrated into our application, which offers information regarding threats, their impact and ways to mitigate them, along with suggestive examples.

The user can view the status of her requests (accepted, rejected, pending) on the Outgoing Requests page presented in Figure 5. For those users who accepted the request, their privacy policies are visible on the View Privacy Policy page of the application. Moreover, one can accept or reject requests sent by other users by accessing the Received Requests page presented in Figure 6.

Figure 5: Outgoing Requests

Figure 6: Main Menu & Incoming Requests

5 Initial testing and results

The application was recommended as a privacy enhancing tool to 46 users chosen from the developers' Facebook friend lists, with different ages, sexes and professional backgrounds. Of these 46 users, 21 agreed to test the application and give feedback. 57.14% of the test subjects were female; 61.9% of the testers had a technical background, 19% had business studies, and the rest had other or unspecified professions. 76.19% were aged between 18 and 26, the others between 26 and 42. The survey contained the following questions: Do you find the application easy to use?, Do you consider the application helpful for enhancing your privacy?, Would you recommend the application to your friends? and Will you continue using the tool?. The answers are shown in the chart in Figure 7.







Figure 7: Survey Results (percentage of positive answers to each of the four questions: easy to use, helpful for privacy, would recommend, continue to use)

6 Conclusions and future work

Facebook privacy is not improved easily, both because of the large number of risks that one is exposed to and because the desired level of privacy may differ between users. Thus, a complex and flexible solution is required. Moreover, users need to become aware of the fact that risks can be generated both by their own actions and by the actions of friends. They need to learn that privacy is not only their goal but also that of others.

In this paper we presented a novel approach for enhancing Facebook user privacy, in which both individual risks and social risks are taken into consideration. This approach can help millions of users to clearly set norms of acceptable behavior. Applying these norms leads to a high level of risk avoidance, because user interaction is performed with respect to the users' goals. According to our initial testing results, the application proves useful for improving the privacy level of users.

As further development, we intend to offer users the possibility to create fine-grained policy goals that apply only to certain groups of users. In order to facilitate the enforcement of privacy statements, we will automate the detection of goal violations wherever possible. Future work also includes designing a secure architecture for our application: securing the communication between clients and the server using transport layer security, and encrypting the user information stored in the database, which would require storing keys on a separate server.

References

[1] Wanying Luo, Qi Xie, and Urs Hengartner. FaceCloak: an architecture for user privacy on social networking sites. In Proceedings of the 2009 International Conference on Computational Science and Engineering, volume 3, Los Alamitos, CA, USA, August 2009. IEEE Computer Society.

[2] David Bednall, Alan Hirst, Marie Ashwin, Orhan Icoz, Bertil Hulten, and Timothy Bednall. Social networking, social harassment and social policy. 2009.

[3] Lujun Fang and Kristen LeFevre. Privacy wizards for social networking sites. In WWW '10: Proceedings of the 19th International Conference on World Wide Web, pages 351–360, New York, NY, USA, 2010. ACM.

[4] Carolyn Y. Johnson. Project 'Gaydar': an MIT experiment raises new questions about online privacy. The Boston Globe, September 2009.

[5] Heather R. Lipford, Andrew Besmer, and Jason Watson. Understanding privacy settings in Facebook with an audience view. In Proceedings of UPSEC '08 (Usability, Psychology and Security), 2008.

[6] Alan Mislove, Bimal Viswanath, Krishna P. Gummadi, and Peter Druschel. You are who you know: inferring user profiles in online social networks. In WSDM '10: Proceedings of the Third ACM International Conference on Web Search and Data Mining, pages 251–260, New York, NY, USA, 2010. ACM.

[7] Bruce Schneier. A taxonomy of social networking data. IEEE Security and Privacy, 8:88, 2010.

[8] A. Van Cleeff. A risk management process for consumers: the next step in information security. In Proceedings of the 2010 Workshop on New Security Paradigms. ACM, 2010. In press.

[9] Bin Zhou and Jian Pei. Preserving privacy in social networks against neighborhood attacks. In ICDE '08: Proceedings of the 2008 IEEE 24th International Conference on Data Engineering, pages 506–515, Washington, DC, USA, 2008. IEEE Computer Society.