Security Challenges for Reputation Mechanisms using Online Social Networks

Tad Hogg
Hewlett-Packard Laboratories
1501 Page Mill Road, Palo Alto, CA
[email protected]

ABSTRACT


Reputation mechanisms are a key component of e-commerce, particularly for peer-to-peer transactions. The ratings provided by previous users can help people identify others who are likely to be reliable and encourage honest behavior among participants. However, the anonymity and ease of creating new online identities can potentially allow users to subvert the mechanism. Online social networks offer one way to address this concern by embedding participants in a network of links, allowing users to personalize recommendations via people they trust. However, using these networks raises additional concerns for network integrity and users’ privacy.

Categories and Subject Descriptors
J.4 [Social and Behavioral Sciences]: Economics; K.4.4 [Computers and Society]: Electronic Commerce—security

General Terms
Economics

Keywords
reputation, social networks, user-generated content

1. REPUTATION MECHANISMS

Economic transactions often rely on trust. For instance, sellers usually know more than buyers about the items or services offered and could misrepresent them. Without trust, mutually beneficial transactions may not take place, resulting in economic loss [10]. Fortunately, with repeated transactions, reputations [20] can encourage truthful behavior, and they have been applied to e-commerce in the form of ratings, such as those used by eBay. These online reputation systems typically include simple numerical scores, which are easy to process automatically, as well as richer feedback such as text comments on user experiences [11] and user profiles or photos, which affect perceived honesty [9].

However, reputation mechanisms can be manipulated by collusion among friends or the creation of false identities [29], particularly in the context of small vendors in different legal jurisdictions who cannot rely on the branding associated with large, well-known companies. Friends can give each other mutually high ratings in spite of poor actual performance, distorting the reported reputation values. Even the perception of such distortion, whether or not it occurs, can significantly reduce the usefulness of the reputation system. In some cases, characteristics of the community in which the transactions take place can help address this problem. For example, when users collaborate to create public content, as on Wikipedia, the community response to edits can form the basis of a robust reputation mechanism [4]. Another possible approach arises from the growing availability of explicitly specified social networks in online communities. These consist of links among individuals indicating various social relationships. Services building online social networks, such as Facebook and LinkedIn, have rapidly acquired millions of users and assist them in forming new social or business contacts through those they already have. These existing contacts are either specified manually or gathered automatically, e.g., from email, instant messaging, links on web home pages [1], patterns of activity recorded by sensors [17, 26], or the web of trust for decentralized cryptographic keys [13].

2. USING SOCIAL NETWORKS

Social networks can be useful for reputation mechanisms in two ways [16, 32, 34]: as an automated aid that infers reputation from a user's position in the network, and as a filtering tool for users' ratings. The remainder of this section describes these approaches; the thresholds and weights they use can be computed efficiently from the network structure.

2.1 Ratings Based on Network Structure

The first approach uses an individual's position in a social network to compute an implicit reputation [35, 28, 8], without requiring explicit effort on the part of users to rate one another after each transaction. This approach is useful when social connectivity correlates with honest fulfillment of transactions. Automated management of reputation ratings, both for service quality and reliability of rating others, can also help produce reliable reputations [36].

One way to assign reputation based on social network structure considers each link in the network as an implicit recommendation for the person linked to [14].


Alternatively, weights can be added to the links by allowing users to privately rate their contacts on characteristics such as trustworthiness. One can then apply a PageRank algorithm [25] to assign reputations to individuals. Because PageRank is based on the global structure of the network, it is more difficult to spoof than local network properties: it is not enough to have just anyone recommend you; the recommenders must themselves have high reputation.
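As an illustrative sketch (not the paper's implementation), the following Python code assigns such reputations by power iteration over privately assigned trust weights. The example network, the weights, and the damping factor are assumptions chosen for demonstration.

```python
# Sketch: PageRank-style reputation over a weighted trust network.
# The graph and parameters below are illustrative assumptions.

def pagerank_reputation(links, damping=0.85, iters=50):
    """links: dict mapping user -> {contact: private trust weight}."""
    users = set(links) | {v for ws in links.values() for v in ws}
    n = len(users)
    rep = {u: 1.0 / n for u in users}
    for _ in range(iters):
        nxt = {u: (1.0 - damping) / n for u in users}
        for u, ws in links.items():
            total = sum(ws.values())
            if total == 0:
                continue
            for v, w in ws.items():
                # A recommendation counts in proportion to the
                # recommender's own reputation, which is what makes
                # the global score harder to spoof than link counts.
                nxt[v] += damping * rep[u] * (w / total)
        # users with no outgoing links spread their weight uniformly
        dangling = sum(rep[u] for u in users if not links.get(u))
        for v in users:
            nxt[v] += damping * dangling / n
        rep = nxt
    return rep

# hypothetical private trust ratings among three users
links = {"alice": {"bob": 1.0, "carol": 0.5},
         "bob":   {"carol": 1.0},
         "carol": {"alice": 1.0}}
print(pagerank_reputation(links))
```

In this toy run a user gains reputation only to the extent that well-reputed users direct trust weight toward them, mirroring the spoofing resistance described above.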

2.2 Filtering Explicit Ratings

A second approach uses ratings produced explicitly by users based on past transactions, filtered through the social network. For example, the Sporas reputation mechanism [38] reduces the benefit of collusion by only considering a single rating from any other person, no matter how much experience, i.e., number of transactions, they may have. More generally, one can filter based on the receiver's preferences and the position of evaluators in the social network. The Regret [31] system uses ratings from the receiver's social group, and the social group of the individual being rated.

There are many other ways to use the social network. For example, a user may be encouraged to see a single person giving repeat business to a vendor, but only if that person and the vendor are not too close in the social network. Others may choose to rely more on the number of different raters, from mutually distant parts of the social network. Or one may give higher weights to ratings from one's own friends. Allowing users to select among various filters gives flexibility in using social networks, and this diversity could complicate attempts to distort the reported reputations.

Even the partial knowledge of social networks provided by online systems can help evaluate and combine various recommendations. For instance, given a vendor to evaluate and a social network structure involving both the vendor and the person requesting the reputation information, we can restrict recommendations to come from those far from the vendor in the social network. In this way one excludes the vendor's immediate social circle, which may be biased in the target's favor. Conversely, if one trusts people to reveal their information, one could ask for ratings from the vendor's contacts since they likely have better information about the vendor than others (e.g., knowledge of the vendor's behavior in other business transactions beyond those known to the online rating system). This latter case can rely on a secondary reputation mechanism of people rating the helpfulness of recommendations from others about a vendor, in addition to the primary reputation measure of the vendor.
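As a minimal sketch of such filtering, the following code excludes ratings from anyone within two hops of the vendor and, following the Sporas idea of limiting each person's influence, counts at most one rating per rater. The network, the ratings, and the distance threshold are illustrative assumptions, not a prescribed design.

```python
# Sketch: filter explicit ratings by social-network distance.
from collections import deque

def hops_from(network, source, limit):
    """Breadth-first distances from source, up to `limit` hops."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        if dist[u] == limit:
            continue
        for v in network.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def filtered_reputation(network, ratings, vendor, min_distance=3):
    """Average the scores of raters at least min_distance from the
    vendor, keeping only each rater's first rating (Sporas-style)."""
    near = hops_from(network, vendor, min_distance - 1)
    seen, kept = set(), []
    for rater, score in ratings:
        if rater in near or rater in seen:  # too close, or repeat rater
            continue
        seen.add(rater)
        kept.append(score)
    return sum(kept) / len(kept) if kept else None

# hypothetical undirected contact network and transaction ratings
network = {"vendor": ["ann"], "ann": ["vendor", "bea"],
           "bea": ["ann", "cal"], "cal": ["bea"]}
ratings = [("ann", 5), ("bea", 5), ("cal", 2), ("cal", 1)]
print(filtered_reputation(network, ratings, "vendor"))  # 2.0: only cal's first rating survives
```

Swapping in a different filter, e.g., weighting ratings by distance instead of excluding them, or restricting to the receiver's own friends, gives the user-selectable diversity of filters described above.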

3. BENEFITS AND CHALLENGES OF USING SOCIAL NETWORKS

Using social network structure with reputation makes it more difficult to spoof the system by creating false identities or colluding in small groups. False identities would either give themselves away by connecting to their old friends, or remain disconnected, in which case they will have poor social ranking. Moreover, large-scale analysis of social networks can uncover some forms of group collusion. For example, web pages colluding to alter their search engine ranking by linking to one another can be identified and removed if they all have a similar number of links [15], and similar techniques can help with other online graphs such as social networks [37]. Alternately, collusion could alter the relative abundance of motifs (small subgraphs), arousing suspicion if it differs significantly from that of social networks in general [21].

If reputation scores are filtered by excluding explicit ratings from friends and friends of friends of the person being evaluated, altering reputation scores requires collusion outside of one's identified social circle, which is more difficult than collusion among people who already know each other. For example, one could try to thwart the reputation mechanism by deliberately removing links to hide collusion among friends. The temptation for this distortion is at least partly countered by loss of the benefits of a social network to its users from accurate links, e.g., obtaining business referrals or having a high social ranking and enhancing the trust of potential customers. Thus a design question is how to arrange for a large loss of the non-reputation benefits of the network to any user who hides or misrepresents connections, compared to the potential gain from hiding a collusion among friends to spoof the reputation mechanism.

To illustrate the benefit of using social networks, consider two friends declining to reveal the link between themselves in order to subvert reputation filtering based on network distance. Fig. 1 shows an example based on the BuddyZoo network from 2004 (http://buddyzoo.com), consisting of 140,181 users of the AOL instant messenger service who submitted their buddy lists to the BuddyZoo service. BuddyZoo is a subset of users from a large social network volunteering information about their social contacts. The resulting network includes links only between users who explicitly registered with BuddyZoo, with an average of 9.5 links per user.

[Figure 1: An attempt to hide a friend relationship by friends deliberately avoiding entering their link in a network, shown here as part of the BuddyZoo network. Numbers represent users. The two users in the thick boxes attempt to hide their relationship (dashed thick line) but are still linked through several friends-of-friends (thick solid lines).]

In this network, 9% of the users have only a single connection, and would disconnect themselves from the network if they were to remove that link. Of the remaining pairs of users, only 19% could remove their direct link and be at least distance 3 from each other, while all others would remain friends of friends. The average number of friends shared by two friends in BuddyZoo was four, which means that an average of four additional participants would have to be persuaded to collude to spoof a reputation filtering mechanism that excludes ratings from friends of friends, as well as friends, of the person being rated. This means that the majority of friends could not effectively conceal their connection because they share friends in common.

Another example is the friends social network from Essembly (http://essembly.com), a political discussion community [3]. The network as of December 2006 had 4873 users, with 5.5 links per user on average, and 40% of users had only one link. Of the remaining pairs, only 10% would be at least distance 3 from each other if they removed their link. For Essembly, friends had five mutual friends, on average. These examples illustrate the robustness of social networks against alteration, and rely on the high clustering of links in social networks [24].
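The friends-of-friends effect behind these statistics is easy to check directly: two colluders who hide their direct link remain at distance 2 whenever they share at least one contact. A minimal sketch, using a small hypothetical network rather than the BuddyZoo data:

```python
# Sketch: can two users hide their relationship from a distance-based
# filter by deleting their direct link? Only if they share no contacts.

def still_friends_of_friends(network, a, b):
    """Mutual contacts of a and b, ignoring any direct link."""
    return (set(network.get(a, ())) & set(network.get(b, ()))) - {a, b}

# hypothetical network: u1 and u2 have already deleted their direct link
network = {"u1": {"u3", "u4"},
           "u2": {"u3", "u4"},
           "u3": {"u1", "u2"},
           "u4": {"u1", "u2"}}
witnesses = still_friends_of_friends(network, "u1", "u2")
# u3 and u4 still place u1 and u2 at distance 2, so a filter excluding
# friends of friends is unaffected; hiding the collusion would also
# require persuading both shared friends to drop their links.
print(witnesses)  # {'u3', 'u4'}
```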

On the other hand, the incompleteness of social network data limits rating accuracy. That is, the available online networks are only a partial representation of the actual social links among people [27]. Nodes with no links might be fake identities without real social ties, or they may simply be individuals who prefer not to reveal their social links. One would like to separate false identities from real users by providing incentives for real users to reveal their social ties. An obvious incentive is that appearing in the network is likely to enhance the trust of potential customers. Moreover, if social network filtering is used as part of the search criterion to find highly rated vendors, those who choose not to participate in the social network would never appear in the search results.

Using online social networks for reputation raises privacy concerns, which could inhibit participation (for networks relying on intentional enrollment) or lead to legal barriers to collecting the information (for networks produced as side effects of other activities). An important security challenge is how to reduce these concerns. One approach is using a trusted third party to handle the network data and the associated computations, revealing only the final reputation scores. Alternatively, for those concerned that a third party may eventually misuse or be compelled to reveal the network, decentralized secure computation [12] could produce the aggregate filtered reputation values without any single party having access to the full social network or all the ratings, though such techniques incur substantial computational cost. This security issue is particularly relevant for combining social networks from different providers (e.g., Facebook and Digg), since anonymous users can often be inferred from the topology of the network [23]. NodeRank [28] is a decentralized algorithm similar to PageRank that can assign reputations using a social network. Alternately, one can propagate reputation ratings along the social network, where each agent receives information about potential targets through referral chains [38, 36]. Cryptographic techniques can further improve the privacy of economic mechanisms by allowing precise control over the distribution of information among participants without requiring a trusted intermediary [18, 22]. While these techniques may be helpful, it remains to be seen how they interact with the network structure, continual changes in the network as new users join, and the need to assure users that the reported ratings are reliable.
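As one illustration of this decentralized direction, the sketch below uses additive secret sharing, a standard building block of secure multi-party computation [12]: each rater splits a rating into random shares held by different servers, so an aggregate can be reconstructed while no single server sees any individual rating. The three-server setup, the modulus, and the ratings are assumptions for demonstration, not the paper's protocol.

```python
# Sketch: privacy-preserving aggregation via additive secret sharing.
import random

P = 2**31 - 1  # modulus, assumed large enough for any summed ratings

def share(rating, n_servers=3):
    """Split one rating into n random shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_servers - 1)]
    shares.append((rating - sum(shares)) % P)
    return shares

def aggregate(all_shares):
    """Each server sums only its own column of shares; publishing the
    column totals reveals the overall sum but no individual rating."""
    n_servers = len(all_shares[0])
    totals = [sum(s[i] for s in all_shares) % P for i in range(n_servers)]
    return sum(totals) % P

ratings = [5, 4, 2, 5]  # individual ratings, never sent in the clear
total = aggregate([share(r) for r in ratings])
print(total, total / len(ratings))  # 16 and the average rating 4.0
```

Combining such aggregation with the network-distance filters above is more involved, since the filter itself depends on the (private) network structure; that interaction is part of the open challenge noted here.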
In summary, online social networks could improve the effectiveness of reputation ratings by making it more difficult to alter the ratings via collusion. Security techniques and incentive designs to encourage truthful revelation are key to ensuring accurate use of the network while respecting privacy.

Experimental evaluations [7, 2, 19, 6, 30, 5] of how people use network-based reputation mechanisms under various incentives for misrepresenting links could identify likely areas where security could improve the mechanism in practice.

More broadly, social network information could affect more complex mechanisms, such as markets for reputation in which participants can sell their identity to others [33]. Such markets provide incentives for maintaining reputations even when participants decide to leave their business. Sales of such "brands" involve changes in the owner's position in the social network, and raise the questions of how to securely update reputation methods that use the network and how the potential to misrepresent such updates could alter the mechanism's incentives.

4. ACKNOWLEDGEMENTS

I thank L. Adamic for helpful discussions.

5. REFERENCES

[1] L. A. Adamic and E. Adar. Friends and neighbors on the web. Social Networks, 25(3), 2003.
[2] G. E. Bolton, E. Katok, and A. Ockenfels. How effective are online reputation mechanisms? Management Science, 50:1587–1602, 2002.
[3] M. J. Brzozowski, T. Hogg, and G. Szabo. Friends and foes: Ideological social networking. In Proc. of the SIGCHI Conference on Human Factors in Computing (CHI2008), pages 817–820, NY, 2008. ACM Press.
[4] K. Chatterjee, L. de Alfaro, and I. Pye. Robust content-driven reputation. In D. Balfanz and J. Staddon, editors, Proc. of the First ACM Workshop on Security and Artificial Intelligence (AISec08), pages 33–42, NY, 2008. ACM.
[5] K.-Y. Chen, S. Golder, T. Hogg, and C. Zenteno. How do people respond to reputation: Ostracize, price discriminate or punish? In V. Padmanabhan and F. E. Bustamante, editors, Proc. of the 2nd Intl. Workshop on Hot Topics in Web Systems and Technologies, pages 31–36. IEEE, 2008.
[6] K.-Y. Chen and T. Hogg. Experimental evaluation of an eBay-style self-reporting reputation mechanism. In X. Deng and Y. Ye, editors, Proc. of the Workshop on Internet and Network Economics (WINE2005), pages 434–443. Springer, 2005.
[7] D. V. Dejong, R. Forsythe, and R. J. Lundholm. Ripoffs, lemons and reputation formation in agency relationships: A laboratory market study. J. of Finance, XL:809–820, 1985.
[8] C. Dellarocas. The digitization of word-of-mouth: Promise and challenges of online reputation systems. Management Science, 49(10):1407–1424, 2003.
[9] J. Duarte, S. Siegel, and L. A. Young. Trust and credit. Working Paper ssrn.com/abstract=1343275, SSRN, Feb. 2009.
[10] F. Fukuyama. Trust: The Social Virtues and the Creation of Prosperity. Free Press, 1996.
[11] A. Ghose, P. G. Ipeirotis, and A. Sundararajan. Reputation premiums in electronic peer-to-peer markets: Analyzing textual feedback and network structure. In Proceedings of the 2005 ACM SIGCOMM Workshop on Economics of Peer-to-Peer Systems, pages 150–154, NY, 2005. ACM.


[12] O. Goldreich. Secure multi-party computation. Working draft, version 1.1, 1998. Available at philby.ucsd.edu/cryptolib/books.html.
[13] X. Guardiola et al. Macro- and micro-structure of trust networks. arxiv.org preprint cond-mat/0206240, June 2002.
[14] R. Guha, R. Kumar, P. Raghavan, and A. Tomkins. Propagation of trust and distrust. In Proc. of the 13th Intl. World Wide Web Conf. (WWW2004), pages 403–412, New York, 2004. ACM.
[15] M. R. Henzinger, R. Motwani, and C. Silverstein. Challenges in web search engines. ACM SIGIR Forum, 36:11–22, 2002.
[16] T. Hogg and L. Adamic. Enhancing reputation mechanisms via online social networks. In Proc. of the 5th ACM Conference on Electronic Commerce (EC'04), pages 236–237. ACM Press, 2004.
[17] T. Hogg and B. A. Huberman. Using sensor networks to uncover social network structure. In G. Sukhatme et al., editors, Proc. of the AAAI-04 Sensor Networks Workshop, pages 63–64, 2004.
[18] B. A. Huberman, M. Franklin, and T. Hogg. Enhancing privacy and trust in electronic communities. In Proc. of the ACM Conference on Electronic Commerce (EC99), pages 78–86, NY, 1999. ACM Press.
[19] C. Keser. Experimental games for the design of reputation management systems. IBM Systems Journal, 42(3):498–506, 2003.
[20] D. B. Klein, editor. Reputation: Studies in the Voluntary Elicitation of Good Conduct. Univ. of Michigan Press, Ann Arbor, 1997.
[21] R. Milo, S. Shen-Orr, S. Itzkovitz, N. Kashtan, D. Chklovskii, and U. Alon. Network motifs: Simple building blocks of complex networks. Science, 298(5594):824–827, 2002.
[22] M. Naor, B. Pinkas, and R. Sumner. Privacy preserving auctions and mechanism design. In Proc. of the ACM Conference on Electronic Commerce (EC99), pages 129–139, NY, 1999. ACM Press.
[23] A. Narayanan and V. Shmatikov. De-anonymizing social networks. In Proc. of the 30th IEEE Symposium on Security and Privacy, 2009.
[24] M. E. J. Newman. The structure and function of complex networks. SIAM Review, 45(2):167–256, 2003.
[25] L. Page, S. Brin, R. Motwani, and T. Winograd. The PageRank citation ranking: Bringing order to the web. Technical report, Stanford Digital Library Technologies Project, 1998.
[26] A. S. Pentland. Automatic mapping and modeling of human networks. Physica A, 378:59–67, 2007.
[27] A. S. Pentland. Honest Signals: How They Shape Our World. Bradford Books, 2008.
[28] J. M. Pujol, R. Sanguesa, and J. Delgado. Extracting reputation in multiagent systems by means of social network topology. In AAMAS, 2002.
[29] P. Resnick, K. Kuwabara, R. Zeckhauser, and E. Friedman. Reputation systems. Communications of the ACM, 43:45–48, 2000.
[30] P. Resnick, R. Zeckhauser, J. Swanson, and K. Lockwood. The value of reputation on eBay: A controlled experiment. Experimental Economics, 9:79–101, 2006.
[31] J. Sabater and C. Sierra. Social regret, a reputation model based on social relations. SIGecom Exch., 3(1):44–56, 2002.
[32] G. Swamynathan et al. Do social networks improve e-commerce? A study on social marketplaces. In Proceedings of the First Workshop on Online Social Networks, pages 1–6, New York, 2008. ACM.
[33] S. Tadelis. The market for reputations as an incentive mechanism. SIEPR Policy Paper 01-001, Stanford, 2001.
[34] J.-C. Wang and C.-C. Chiu. Recommending trusted online auction sellers using social network analysis. Expert Systems with Applications, 34:1666–1679, 2008.
[35] S. Wasserman and K. Faust. Social Network Analysis. Cambridge University Press, Cambridge, 1994.
[36] B. Yu and M. P. Singh. A social mechanism of reputation management in electronic communities. In Cooperative Information Agents, pages 154–165, 2000.
[37] H. Yu et al. SybilGuard: Defending against Sybil attacks via social networks. IEEE/ACM Transactions on Networking, 16:576–589, 2008.
[38] G. Zacharia, A. Moukas, and P. Maes. Collaborative reputation mechanisms in electronic marketplaces. In Proc. of the 32nd Hawaii Intl. Conf. on System Sciences (HICSS), 1999.

